How to design computer systems that keep data alive for 100 years?

I am just curious about how digital data can be kept alive for a hundred years. Obviously, no single chip or computer system has kept data alive in memory for 100 years so far, since the oldest computers are only about 50 years old. Yet a single human brain is capable of keeping data for 100 years, and of transmitting it to your children and other people (and even to computers).

You read a lot about AI taking over humankind, yet this simple memory problem, as far as I know, has no answer. How vulnerable are memory chips and computer systems to powerful magnetic fields, to data manipulation by terrorists or hackers, or to huge solar flares? How can data redundancy (also implemented in the human brain) help with this?

The human brain seems to handle these scenarios rather easily (though some diseases like Alzheimer's attack data / memory centers in the brain), but what about computers? I guess you could design computers in such a way that they withstand most attempts (accidental or not) at memory erasure or alteration. Maybe it is cost-prohibitive today but feasible in the near future? So far we have never faced widespread, massive data loss, data failure, or data alteration, but if such extreme events occur once every 100 years, it is a concern, and it gives data stored in the human brain at least this one advantage over digital data.
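To make the redundancy idea concrete, here is a minimal sketch (not from the original post) of the simplest scheme: keep several independent copies of the data, each with a cryptographic checksum, and recover by returning any copy whose checksum still verifies. The function names and replica count are illustrative assumptions; real archival systems use far more sophisticated erasure coding across media and sites.

```python
import hashlib

def store_with_replicas(data: bytes, n_replicas: int = 3):
    """Keep n independent copies, each paired with a SHA-256 checksum."""
    checksum = hashlib.sha256(data).hexdigest()
    return [(bytearray(data), checksum) for _ in range(n_replicas)]

def recover(replicas):
    """Return the first copy whose checksum still verifies; None if all are corrupted."""
    for copy, checksum in replicas:
        if hashlib.sha256(bytes(copy)).hexdigest() == checksum:
            return bytes(copy)
    return None

# Simulate a bit flip in one replica (e.g. from magnetic interference or a solar flare)
replicas = store_with_replicas(b"keep this for 100 years")
replicas[0][0][0] ^= 0xFF       # corrupt the first copy
print(recover(replicas))        # the data survives via the intact replicas
```

The checksum is what distinguishes silent corruption from valid data; without it, a majority vote across replicas would be needed instead. Surviving a century is then a matter of periodically verifying the checksums and re-copying to fresh media before too many replicas fail at once.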

What do you think?

Replies to This Discussion

See the Long Now Foundation (http://longnow.org/), aka the 10,000-year clock, which we were part of for years and which examined related questions.

© 2019   Data Science Central ®