By Uwe Kemmer
Data is constantly being generated in all areas of our lives. Worldwide, the volume of data generated each year has entered the zettabyte age (1 zettabyte = 1 billion terabytes). Processing and storing this flood of data poses major challenges for companies – especially in data-intensive fields such as artificial intelligence (AI), smart video and analytics. The gap between the volume of data companies can economically and efficiently store and the volume being generated is growing ever wider.
So where should all that data go? Cold storage, i.e. long-term archiving, is becoming increasingly important. A significant portion of the data generated is unstructured – video surveillance footage, sensor data, images and more – and could hold value for future AI or analytics applications.
Cold storage is ideal for data that is not in active use and is only retrieved when needed. There is a reason it is the fastest-growing segment in data storage: by 2025, 60-80 percent of all digital data could reside in archives. Most importantly, the method is cost-effective compared to primary storage.
Cold-storage archives are typically kept on magnetic tape or hard disk drives (HDDs). These generally cost less, offer higher capacity, and retain data longer than flash storage devices (such as SSDs). The most important question in deciding whether a company should rely on cold storage is how often, and how quickly, the data needs to be accessed.
Today, next-generation HDDs play a crucial role in cold storage solutions. They improve access to archives and, ideally, reduce total cost of ownership. Higher areal densities along with mechanical and material innovations help ensure this.
But the architecture of data storage is also constantly being optimised.
DNA storage: reality instead of science fiction
For archiving over a long period of time – 100 years or more – new solutions and innovations are required.
DNA storage, the storage of digital data based on the molecular structure of DNA, is proving particularly promising. What sounds like science fiction is already feasible today and could have a major impact on the future of data storage. The idea is decades old, but it was only the enormous progress in genetic engineering and sequencing in recent years that made DNA storage practical.
DNA molecules consist of a chain of nucleotides, each of which contains one of four bases: adenine, thymine, guanine, and cytosine (abbreviated as ATGC). For data storage, the binary code containing the information simply needs to be encoded as a sequence of ATGC. Artificial DNA is then synthesised from this template, which can later be sequenced and translated back. So instead of (electro)magnetic or optical methods, DNA storage uses a chemical method to store data.
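The mapping described above can be sketched in a few lines. This is a hypothetical, minimal 2-bits-per-base scheme for illustration only; real DNA storage codecs add error correction and avoid problematic patterns such as long runs of the same base.

```python
# Hypothetical binary-to-DNA mapping: 2 bits per base, so one byte
# becomes four bases. Illustrative only; not a production codec.
BITS_TO_BASE = {"00": "A", "01": "T", "10": "G", "11": "C"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Translate each byte into four DNA bases (2 bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> bytes:
    """Reverse the mapping: every four bases become one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")
print(strand)                      # the synthetic-DNA "template"
assert decode(strand) == b"DNA"    # sequencing and retranslation
```

In a real pipeline the encoded sequence would drive DNA synthesis, and a sequencer would later read the bases back for decoding.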
The advantages are considerable. While digital media offer a shelf life measured in decades, DNA-based archives enable storage for thousands of years. DNA itself does not change – readability will not be threatened by outdated formats or missing readers, even in the distant future.
The storage density of DNA storage is also unparalleled. A tiny sphere – the synthetic DNA is stored in silica capsules to protect it from moisture, for example – can hold billions of gigabytes of data. That exceeds the capacity of even the most modern tape cartridge by a factor of 100,000.
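A quick back-of-envelope check puts that claim in perspective. The figures below are illustrative assumptions, not vendor specifications: "billions of gigabytes" is taken as roughly one exabyte, and a modern tape cartridge as roughly 18 TB native capacity.

```python
# Back-of-envelope density comparison (assumed, illustrative figures).
capsule_capacity_gb = 1e9   # "billions of gigabytes" ~ 1 exabyte
tape_capacity_gb = 18e3     # assume ~18 TB per modern tape cartridge

ratio = capsule_capacity_gb / tape_capacity_gb
print(f"capsule holds roughly {ratio:,.0f}x one tape cartridge")
```

Depending on the exact assumptions, the ratio lands in the tens of thousands to around 100,000 – the order of magnitude the article cites.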
(Uwe Kemmer, Director of EMEA Field Engineering, Western Digital).