So, I have about 12 GB of storage and I want to fill it with files that need archiving.

My criteria are:

At risk of becoming lost

Usable without specific hardware (needing a software emulator is fine)

Willing to break my rules if I'm given better ideas

top 11 comments
[–] ReversalHatchery 4 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Oh my sweet summer child, 12 GB is not a lot! :)

Well, for one, you can start saving webpages you found helpful, maybe your useful-links collection or bookmarks, if you have any of those. I would recommend using Firefox and the SingleFile addon, or the WebScrapBook addon. Feel free to look into their settings, but don't let it overwhelm you; if you need to, take it in smaller pieces, there's no shame in it.
Since this is mostly text, it shouldn't take up much space quickly, and it's also very efficiently compressible, for example with 7zip.
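For example, something like this rough sketch (the folder name is just a placeholder) would pack a folder of saved pages into one compressed archive using Python's built-in xz/LZMA support, which is the same compression family 7-Zip defaults to:

```python
# Rough sketch: pack a folder of saved pages into one .tar.xz archive.
# xz is LZMA-based, the same family 7-Zip uses by default, so plain
# HTML/text should shrink a lot. "saved_pages" is a placeholder name.
import tarfile
from pathlib import Path

src = Path("saved_pages")          # folder with SingleFile/WebScrapBook output
dst = Path("saved_pages.tar.xz")

with tarfile.open(dst, "w:xz") as archive:
    archive.add(src, arcname=src.name)

before = sum(f.stat().st_size for f in src.rglob("*") if f.is_file())
print(f"{before} bytes -> {dst.stat().st_size} bytes")
```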

If you often watch videos, on YouTube or elsewhere, and you find something useful or otherwise worth preserving (entertainment is also a valid reason), you can grab it too. Have a look at yt-dlp; it's very versatile, very configurable, and not only for YouTube.
But this will easily take up a lot of space: videos are huge and can't really be compressed further losslessly.
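yt-dlp is usually run from the command line, but it also has a Python API; here's a minimal sketch (the URL and the options below are just placeholder assumptions to show the idea, not recommended settings):

```python
# Minimal sketch of yt-dlp's Python API; the URL is a placeholder.
from yt_dlp import YoutubeDL

ydl_opts = {
    "format": "bestvideo+bestaudio/best",      # best quality (merging needs ffmpeg)
    "outtmpl": "%(title)s [%(id)s].%(ext)s",   # readable, unique file names
}

with YoutubeDL(ydl_opts) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=EXAMPLE"])
```

The plain command-line invocation works just as well if you don't want to script it.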

Other than that, have a look at the DataHoarder community: !datahoarder@lemmy.ml (I hope the link works). For even more, you can check the datahoarder and opendirectories subreddits through libreddit/redlib.

[–] sleepybisexual 1 points 3 weeks ago (1 children)

Well, I might have another 64 GB; I'm just looking for something to supplement my ROM archive.

[–] ReversalHatchery 1 points 3 weeks ago (1 children)

What kind of devices do you have? Do you have a desktop computer?

[–] sleepybisexual 1 points 3 weeks ago (1 children)

I have an Android device and a laptop running Linux

[–] ReversalHatchery 1 points 2 weeks ago (1 children)

I would definitely recommend using the laptop for this.

[–] sleepybisexual 1 points 2 weeks ago

Yea, that's what I'm using; Android isn't capable of what I'd need in this context.

[–] some_guy@lemmy.sdf.org 3 points 3 weeks ago* (last edited 3 weeks ago)

Whoops, you said 12 GB. I read it as TB. The original suggestion would not fit. My bad.

[–] ReversalHatchery 3 points 3 weeks ago

Oh, and if you're interested in archiving, definitely check out the Archive Team!

They are always running archiving projects, they even participate in preserving Reddit content, and they have a connection with archive.org and the Wayback Machine.
They maintain a virtual machine image that you can run at home, even on a simpler PC, to help with their projects. It does not consume much storage, only some network bandwidth. It's basically a distributed archiving tool: a lot of people running it download all kinds of data for the selected project (good for performance and to avoid restrictions) and upload it to AT for preservation.

[–] Gamers_mate 1 points 3 weeks ago (1 children)

12 GB is small, but that just means you might want to opt for links you find useful, like ReversalHatchery said. You could probably store a lot of fanfiction and tutorials if it's just text. I have had an idea of creating a special storage code that can store large amounts of information in a smaller file size, but I am bad at maths and programming and have no idea what I am doing, so it might take a while.

[–] Ava 3 points 3 weeks ago

In general, this is definitely an area where the best approach is to find an existing tool for what you need and use that. Especially for text data, compression is a well-studied field, and there are plenty of public (and open-source, if that's a requirement) tools that will do a fantastic job at reducing size. Rolling your own is likely to result in significantly worse compression ratios, and if you make an error, your data could be irreparably destroyed, which you won't know until you try to access it later.

If your data is incredibly specific you might be able to do better, but it's usually best to ignore that sort of optimization until you actually need it.
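As an illustration of how far the standard tooling already gets you, and of catching that "you won't know until later" failure mode early, here's a small sketch using only Python's standard library (the file name is hypothetical): it compresses a text file and immediately verifies the round trip with a checksum.

```python
# Sketch: compress with the standard library's lzma (the algorithm behind .xz),
# then verify the round trip so corruption is caught now, not years later.
# The file name is a placeholder.
import hashlib
import lzma
from pathlib import Path

src = Path("fanfic_collection.txt")      # hypothetical text file to archive
dst = src.with_suffix(src.suffix + ".xz")

original = src.read_bytes()
dst.write_bytes(lzma.compress(original, preset=9))

restored = lzma.decompress(dst.read_bytes())
assert hashlib.sha256(restored).digest() == hashlib.sha256(original).digest()
print(f"{len(original)} bytes -> {dst.stat().st_size} bytes, verified OK")
```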

[–] ExperimentalGuy@programming.dev 1 points 3 weeks ago

The Internet Archive has a lot of torrents that need seeding, if you're up for something like that. I think they're still down, but once they're back up, it's pretty easy to help out with that.