We live in a highly computerized society (think the Internet of Things): not only do electronic devices abound, you probably have a computer on your person right now. Think of how many people in the world have a smartphone (surprisingly, I don’t – I’m just cheap). According to Experts Exchange, the guidance systems used in the Apollo missions had only twice the computing capability of the old Nintendo Entertainment System (or Famicom, for you video game hipsters out there). That’s right – with half the RAM that NASA used to guide Apollo 11 to the moon, I’ve been saving the Kingdom of Hyrule.
My old Sega Genesis (CPU Speed: 7.6 MHz, RAM: 72 KB) would have blown away the Apollo guidance systems. No contest.
All this computerization requires a lot of resources, especially silicon to make chips. There may be a lot of it around, but we may need more, given the sheer amount of data generated by all those computers. Here’s a nifty infographic, and that’s just from the Internet in one minute. What about office Intranets and other private networks? What about data that isn’t connected at all, like USB drives? Also (and if you were born in the Nintendo generation, please brace yourself), some of our old video games are developing amnesia. Hyrule will forget the name of its greatest hero (I pretty much have, but it’s probably inappropriate anyway). And that doesn’t even include the metrics for Pokémon Go. That data can become staggering in a very short amount of time. So, what can we do? Scientists propose using genetic material.
They’re serious – memory devices running on DNA rather than silicon wafers. Why? Think about it: our biology teachers (especially the better ones, and you likely remember who they are) often tell us that DNA is data that the cell reads and carries out its instructions. Without oversimplifying it, that’s pretty much what it is. We can apply that thinking to computer data as well; our human genome is pretty much the data of what we are, and it can be recorded and documented. See the featured image? I had to crop it, but the entire human genome is recorded in those books, and depending on who you ask, it can easily fit into a flash drive. So it really is possible to work with DNA as a form of data storage. But how?
Data in our computers is a long series of binary code. Every time you see a long string of ones and zeroes in science fiction, that’s binary. What our computers do – and they’ve been getting faster and faster at it; remember the infographic above – is write, transmit, and read data in binary. There are plenty of explanations of why computers use binary that we won’t explore here (but you could, if you’re so inclined), but let me demonstrate what happens, using one of my favorite quotes from a book:
01001001 00100000 01100001 01101100 01110111 01100001 01111001 01110011 00100000 01110100 01101000 01101111 01110101 01100111 01101000 01110100 00100000 01110100 01101000 01100101 01110010 01100101 00100000 01110111 01100001 01110011 00100000 01110011 01101111 01101101 01100101 01110100 01101000 01101001 01101110 01100111 00100000 01100110 01110101 01101110 01100100 01100001 01101101 01100101 01101110 01110100 01100001 01101100 01101100 01111001 00100000 01110111 01110010 01101111 01101110 01100111 00100000 01110111 01101001 01110100 01101000 00100000 01110100 01101000 01100101 00100000 01110101 01101110 01101001 01110110 01100101 01110010 01110011 01100101
Ha-ha! Oh, my sides. Wait – you didn’t get it? Of course you didn’t – you weren’t trained to read binary (me neither). This is how computers read my favorite quote, which is from The Restaurant at the End of the Universe by Douglas Adams (1980). For those of us who prefer to read our novels instead of mathing them out, here’s how the above binary string translates, according to Roubaix Interactive:
“I always thought there was something fundamentally wrong with the universe.” (note: the binary code did not include punctuation marks)
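If you’d rather not trust an online translator, the translation is mechanical enough to sketch in a few lines of Python. The quote and the eight-bits-per-character scheme come straight from the binary string above; everything else is just the standard library:

```python
# Sketch: how text becomes the binary string above, and back again.
quote = "I always thought there was something fundamentally wrong with the universe"

# Each character becomes its ASCII code, written as exactly 8 binary digits.
binary = " ".join(format(ord(ch), "08b") for ch in quote)
print(binary[:26])  # 01001001 00100000 01100001 ...

# Reading it back: split on spaces, turn each 8-bit chunk into a character.
decoded = "".join(chr(int(chunk, 2)) for chunk in binary.split())
print(decoded)  # the quote, restored
```

Run it and you’ll see the string starts with the same 01001001 as the one above – that’s the capital “I”.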
A computer that runs on genetics wouldn’t run on binary; it would run on base 4, or quaternary. Binary is base 2 because its values are built from 0 and 1. Western (and, as far as I know, a lot of Eastern) counting systems are base 10, or decimal, because all values are built from 0 through 9. So base 4 would use – you guessed it – 0, 1, 2, and 3. Why bring this up? Because our genes are long sequences of only four molecules: adenine (A), guanine (G), cytosine (C), and thymine (T):
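To make the base business concrete, here’s a minimal sketch that writes one number – the ASCII code for “I”, 73 – in all three bases. Python has no built-in base-4 formatter, so the little helper function is my own:

```python
# Sketch: the same number written in three different bases.
n = ord("I")  # 73, the ASCII code for "I"

def to_base4(value):
    """Write a non-negative integer using only the digits 0-3."""
    if value == 0:
        return "0"
    digits = []
    while value:
        digits.append(str(value % 4))
        value //= 4
    return "".join(reversed(digits))

print(format(n, "b"))  # binary:     1001001
print(to_base4(n))     # quaternary: 1021
print(n)               # decimal:    73
```

Same value, three alphabets – binary needs seven digits, quaternary only four, decimal two.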
That’s right – just like all English words are just sequences of 26 letters and all music is just sequences of 12 tones, all genetics is based on the order of four molecules (chew on that). So what would my favorite quote look like? Well, if we’re going to run it through a “genetic” computer, it would probably have to be translated into quaternary. I had to do this the long way using some basic tools around the Internet (there’s no direct English-to-quaternary translator yet), but here’s what I get:
With a little genetic magic (OK, I used Microsoft Word, you got me, leave me alone), we can translate that into genetic code by using adenine for 0, guanine for 1, cytosine for 2, and thymine for 3:
There we go – one sentence, expressed as a strand of DNA. Keen, huh?
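If you’d rather skip the Microsoft Word step, the whole pipeline – text to ASCII to base-4 digits to DNA letters – can be sketched in Python. The digit-to-base mapping (0=A, 1=G, 2=C, 3=T) is the one from the paragraph above; the helper names are my own:

```python
# Sketch: text -> ASCII -> base-4 digits -> DNA letters, and back.
DIGIT_TO_BASE = {"0": "A", "1": "G", "2": "C", "3": "T"}
BASE_TO_DIGIT = {b: d for d, b in DIGIT_TO_BASE.items()}

def char_to_quaternary(ch):
    """One character as exactly four base-4 digits (4**4 == 256 covers a byte)."""
    value = ord(ch)
    digits = ""
    for _ in range(4):
        digits = str(value % 4) + digits
        value //= 4
    return digits

def text_to_dna(text):
    return "".join(DIGIT_TO_BASE[d] for ch in text for d in char_to_quaternary(ch))

def dna_to_text(dna):
    chars = []
    for i in range(0, len(dna), 4):
        digits = "".join(BASE_TO_DIGIT[b] for b in dna[i:i + 4])
        chars.append(chr(int(digits, 4)))
    return "".join(chars)

quote = "I always thought there was something fundamentally wrong with the universe"
strand = text_to_dna(quote)
print(strand[:12])                 # GACGACAAGCAG -- "I a" as DNA
print(dna_to_text(strand) == quote)  # True: the round trip works
```

Each character costs four “bases”, versus eight binary digits – one practical reason quaternary storage is so compact.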
Could genetic data storage be a viable alternative to silicon-based devices? There’s a lot going for it – it requires very little power, it can keep itself intact for a very long time, and you can store literally any kind of data you please. But can it be retrieved and read back into some form we can understand, like a human language? What about rewriting data? It’s easy enough to write it the first time, but what if one has to rewrite something? It’s not just a matter of ‘flipping switches’ on or off (which is most of what binary really is); rewriting genetic data means essentially telling your data to mutate. Then it gets complicated. Computer engineering may have to take on a completely different structure, including the math, logic, and programming languages that go into putting together the programs and apps that support and irritate our ever-so-digitized lives.
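The switch-versus-mutation difference is easy to see in code. A binary rewrite is one XOR; a base-4 rewrite has to pick one of three possible substitutions for the position. This is just an illustrative sketch, not how any real DNA-storage system does its edits:

```python
# Binary: rewriting one unit of data is flipping a single switch.
byte = 0b01001001              # "I"
flipped = byte ^ 0b00000001    # XOR flips the lowest bit
print(format(flipped, "08b"))  # 01001000, which is "H"

# Base 4 / DNA: no single "flip" exists -- a rewrite means choosing one of
# three possible substitutions for the base at that position.
strand = list("GACG")          # "I", in the article's DNA encoding
strand[3] = "A"                # a point "mutation" at position 3
print("".join(strand))         # GACA
```

One bit has exactly one alternative; one base has three – and that extra choice is part of why rewriting gets complicated.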
Before anyone gets into a snit: relax, they’re not going to upload long genetic ‘text’ into your body. After all, who knows what it would actually do to the human body? I don’t want to know what happens if someone uploads anything from the Cthulhu mythos or the works of H. R. Giger into their genome. Should you try it, though, please let us know here at sci.casual. We may need to get in touch.
Featured image source: Wikimedia Commons (English Wikipedia, CC BY 3.0, user: Russ London)