this post was submitted on 01 Jul 2023

Programming


I had a bunch of personal scripts to manage my music database. Maybe 10-ish scripts, a few hundred lines long at most, nothing too big. A while ago I wrapped them into a big Emacs org file for literate programming and tangling, so I could easily edit them in one place. I backed them up to at least three servers, both locally and in another building. I also have Cronopete running (a Linux implementation of macOS Time Machine), so everything is safe, right? Right?!

I didn't need the scripts for 3 months or so, but today I wanted to use them and couldn't find them anywhere. Not on any backup server, not on the Cronopete drive. The only explanation I can think of is that I must have saved that org file on the backup server and then backed up over it (and Cronopete never pulled it, because it of course doesn't look at the backup server). I will have to rewrite those scripts from scratch. FML.

top 16 comments
[–] thequickben 42 points 1 year ago (3 children)

I’ve made too many mistakes like this, so I check anything important into git. GitLab is easy to run locally.
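For small personal script collections you don't even need a server; a minimal sketch (paths and names are made up for illustration):

```shell
# Put a scripts directory under plain git -- no GitLab required.
rm -rf /tmp/music-scripts
mkdir -p /tmp/music-scripts
cd /tmp/music-scripts
git init -q
# Local identity just for this repo (use your real one).
git config user.email demo@example.com
git config user.name Demo
printf '#!/bin/sh\necho organizing albums\n' > organize.sh
git add organize.sh
git commit -qm "Track music scripts"
```

From there, any second machine or mounted drive can act as a remote, so the history exists in more than one place.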

[–] tietze111@feddit.de 22 points 1 year ago (1 children)

If it is not in git, it is not safe. Learned that the hard way as well... I guess we all do at some point.

[–] Dunstabzugshaubitze@feddit.de 22 points 1 year ago (1 children)

I guess some lessons need to be learned through pain.

  • Committing regularly.
  • Following the branch rules.
  • Writing tests.
  • Writing tests that test the desired, not the current, behaviour.
  • Refactoring your code.
  • Not refactoring code you don't understand or have tests for.
  • Actually reading code before merging a PR.
  • Not pulling in 23 unmaintained libraries to solve a simple problem.
  • Keeping your dependencies up to date.
  • That dirty hack will make your life harder.

Yes, all of those hurt. They sometimes still do; most of us are not machines that turn caffeine into code, and we are never as clever as we think we are.

[–] steph@lemmy.clueware.org 3 points 1 year ago (1 children)

On a side note, w.r.t. keeping dependencies up to date, have a look at Renovate (renovatebot). It creates a merge request for each and every dependency update, which triggers a build to check that everything is OK.
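For reference, a minimal `renovate.json` dropped in the repo root is enough to get started; this sketch assumes the recommended preset from the Renovate docs:

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"]
}
```

The preset controls grouping, scheduling and rate limits; everything can be overridden per repo.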

[–] Dunstabzugshaubitze@feddit.de 1 points 1 year ago (1 children)

Oh, that seems nice. I'll bring it up at work, we have some projects that could use this.

Can it run against a plain git repository, or does it only work with the APIs of GitHub, Bitbucket and co?

[–] steph@lemmy.clueware.org 1 points 1 year ago

You decide which repos you want managed: there's an onboarding option that creates an issue explaining how to have the tool onboard the repo, and the tool itself has a filter if you want a "whitelist" approach.

The docs list GitHub, GitLab, Bitbucket, Gitea and some Azure and AWS solutions. The runner is only available on GitLab, though.

There's also a "freemium" solution, but I couldn't get it to work, and the runner is working fine anyway.

[–] jsveiga@vlemmy.net 11 points 1 year ago

If you'd run GitLab locally just for yourself, it's easier to simply create a network-shared directory and use it as a git repository. Git on your local machines can push/pull/clone to/from a directory (local or remote) just like to/from a git server.
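A sketch of the directory-remote pattern; here `/tmp` stands in for an NFS/SMB mount or backup drive:

```shell
# A bare repo on any reachable path works as a remote -- no server process.
rm -rf /tmp/central.git /tmp/work
git init -q --bare /tmp/central.git
git clone -q /tmp/central.git /tmp/work
cd /tmp/work
git config user.email demo@example.com
git config user.name Demo
echo "backup me" > notes.txt
git add notes.txt
git commit -qm "First push to a directory remote"
git push -q origin HEAD
```

The same works over the network with an `ssh://user@host/path` or `user@host:path` URL, with no extra software on the remote side beyond git itself.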

[–] DeltaWhy 7 points 1 year ago (1 children)

GitLab is pretty resource-heavy; if you want to self-host something, I prefer Gitea. Very easy to set up: doesn't require Docker, just a single binary.

[–] donio 9 points 1 year ago* (last edited 1 year ago) (1 children)

Why bother with either of those for private personal repos though? Why not just regular remote repos over ssh?

[–] DeltaWhy 3 points 1 year ago

That's also an option - I've used gitolite before to set that up. In my case though I wanted to mirror repos from gitlab.com and github, and I might want to hook up CI and webhooks later on.

[–] StringPotatoTheory 11 points 1 year ago

damn that's really rough. do you know how to use git? might be helpful to have your scripts on a private gitlab or github project just in case

[–] pcouy@lemmy.pierre-couy.fr 4 points 1 year ago

I recently lost my whole home dir by bind-mounting it into a chroot while tinkering with some package-building stuff. While checking for reproducibility, I ran a command that basically sudo rm -rf'd the chroot, with my home dir mounted inside.

That was two weeks ago, and I'm still working on recovering some of my most valuable ugly scripts that I never properly backed up.
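One guard worth knowing for exactly this failure mode is GNU rm's `--one-file-system` flag, which refuses to descend into anything mounted inside the tree being deleted (so a bind-mounted home would survive). A harmless demo with a made-up path and no mounts involved:

```shell
# Fake chroot with nothing mounted inside, so the whole tree is removed;
# with a real bind mount inside, --one-file-system would skip it.
mkdir -p /tmp/fake-chroot/home/user
touch /tmp/fake-chroot/home/user/precious.sh
rm -rf --one-file-system /tmp/fake-chroot
```

It only helps when the dangerous path crosses a filesystem boundary, but bind mounts in chroots are precisely that case.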

[–] anon@lemm.ee 4 points 1 year ago

Backups need to be automatic and opt-out; you need to back up stuff you don't know you need to back up, or you will lose data.

[–] heartlessevil@lemmy.one 2 points 1 year ago

At least it was just scripts and not important pictures or rare albums or anything

[–] edent@lemmy.one 1 points 1 year ago

If it helps, I can highly recommend Beets as a music manager for Linux. http://beets.readthedocs.io/

A great command-line tool for organising all your MP3s.
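For context, beets is driven by a small YAML config (by default `~/.config/beets/config.yaml`); a minimal sketch, with paths chosen for illustration:

```yaml
directory: ~/music            # where organised files end up
library: ~/music/library.db   # beets' metadata database
import:
  move: yes                   # move files into place instead of copying
plugins: fetchart lyrics      # optional plugins shipped with beets
```

After that, `beet import ~/incoming` tags and files new music against MusicBrainz data.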

[–] PoisonedPrisonPanda@discuss.tchncs.de 1 points 1 year ago* (last edited 1 year ago)

I fucked up my Syncthing configuration when I had to hard-reset my phone. 6 months of pictures gone. I was so angry at myself.

edit: of course I didn't back up my Signal backup because "I hAvE EvErYThinG iN SynCtHinG"
