this post was submitted on 19 Oct 2023

Hardware
[–] ALostInquirer@lemm.ee 24 points 1 year ago* (last edited 1 year ago) (10 children)

From the article:

As intriguing as the idea is, we have to admit it smacks of a publicity stunt more than an earnest act of preservation. Even if the data is secure, are the robots the new points of failure? What’s to protect them from fires, floods, EMPs, and all the other threats? What about the readers, which are delicate lasers driven by algorithms? In all likelihood, any explorers in the year 12,000 that might stumble onto the remains of the Global Music Vault would just display it in a museum as a collection of crystal coasters.

I was asking myself similar questions, along with even more basic ones like, "What if future computer systems simply aren't compatible with the old filesystems, and so report nothing present on the storage media (assuming it's even recognized as storage media to test)?" It's the deeply fascinating problem all long-term information storage and transmission faces: future comprehensibility.

[–] cheery_coffee@lemmy.ca 4 points 1 year ago (1 children)

Realistically I think this will only be used for short-term (sub-100-year) storage, or for archives in continuous use, like a microfiche archive.

There are quite a few use cases where a government or company might be obligated to keep data for long periods.

I’m curious about the 10,000-year claim: does that apply to the full plate, or is it the average time to failure per some unit of data?

[–] MxM111@kbin.social 2 points 1 year ago

Since I'm sure an error-correcting code is used, it's one and the same: the plate as a whole survives as long as the error rate per unit of data stays within what the code can correct.
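To illustrate the point (the article doesn't say which code the vault actually uses, so this is just a sketch with the simplest possible scheme, a 3x repetition code): scattered per-bit failures don't lose data until a whole group of copies fails together, so the effective lifetime of the full archive tracks the code's correction capacity, not the raw per-bit failure rate.

```python
import random

COPIES = 3  # assumed redundancy factor for this sketch

def encode(bits, copies=COPIES):
    # Repetition code: store each bit `copies` times.
    return [b for b in bits for _ in range(copies)]

def decode(stored, copies=COPIES):
    # Majority vote per group recovers the original bit as long as
    # fewer than half the copies in that group have flipped.
    out = []
    for i in range(0, len(stored), copies):
        group = stored[i:i + copies]
        out.append(1 if sum(group) > copies // 2 else 0)
    return out

random.seed(0)
data = [random.randint(0, 1) for _ in range(1000)]
stored = encode(data)

# Simulate isolated media defects: flip ~5% of stored bits at random.
corrupted = [b ^ 1 if random.random() < 0.05 else b for b in stored]

recovered = decode(corrupted)
residual = sum(d != r for d, r in zip(data, recovered))
print(f"raw stored bits flipped: ~5%, residual data-bit errors: {residual}/1000")
```

With a 5% raw flip rate, a data bit is lost only when 2 of its 3 copies flip (probability about 0.7%), so almost all of the corruption is absorbed. Real media would use something far stronger (e.g. Reed-Solomon), but the equivalence the comment points at is the same.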
