
What is Cold Data? And Why You Need a Cold Storage Strategy.

ActiveScale

Join Quantum on October 5 for a major announcement on the Future of Cold Storage: Register Here

If you have followed International Data Corp (IDC), the well-known technology analyst firm, over the last few years, you may be familiar with their work on the Global Datasphere, a measure and forecast of the amount of new data created and stored across the globe annually. Their latest analysis reports that over 64 zettabytes of data were created in 2020. That’s about 32 trillion two-hour movies, or three stacks of DVDs (without jewel cases) reaching to the sun.

IDC expects continued growth compounding at 19% a year into the foreseeable future. Data created over the next three years will amount to more than all the data created over the past 30 years; three times more data will be created over the next five years than was created in the past five years.  
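These scale claims can be sanity-checked with quick arithmetic. The 2 GB size for a two-hour movie below is an illustrative assumption, not an IDC figure:

```python
# Back-of-the-envelope check of the IDC figures quoted above.
ZB = 1e21  # bytes in a zettabyte

created_2020 = 64 * ZB
movie_bytes = 2e9  # assume a two-hour movie is roughly 2 GB

movies = created_2020 / movie_bytes
print(f"{movies:.1e} movies")  # on the order of 32 trillion

# Compounding 19% a year from the 2020 baseline:
cagr = 0.19
created_2025 = created_2020 * (1 + cagr) ** 5
print(f"~{created_2025 / ZB:.0f} ZB created in 2025")  # roughly 2.4x 2020
```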

Remarkably, only 2% of that data is being stored for future use and analysis. Stored data, too, is expected to continue growing, particularly because every forward-looking organization recognizes the value of data as the vehicle of digital transformation. Data drives the opportunity to create value, to invent new revenue streams, and to set and validate strategic direction.

Which brings us to the topic of cold data…  

What is Cold Data? 

Production workloads naturally access lots of data. If you think of data as having a lifecycle, ‘hot’ data is actively being used and requires high-performance access, while ‘warm’ data is still accessed frequently over a given timeframe. Cold data is inactive data that is accessed infrequently or never. Industry analysts project that 60% of all stored data is cold data.1


Increasingly, cold data is being preserved not because an organization is required to save the data, but because the data is being recognized as having inherent and potential value. 

Classically, cold data was limited to data that was preserved to meet regulatory or in-house compliance policies that require retention for some number of years. The data often was simply written to tape media, taken offline, and moved to a storage facility for the rare event that the data would ever need to be accessed again – not so anymore.  

Why You Need a Cold Storage Strategy 

Our whole orientation toward cold data is changing: on the one hand, its value is being recognized; on the other, its enormity and growth are becoming overwhelming. With the digitization of everything – the incessant data collection of sensors, the volume of video and imagery sources, plus the data-intensive requirements and periodic recalibration of data analysis, artificial intelligence, and deep learning workloads – the amount of cold data that must be stored is going to grow, and its application and use cannot and will not remain dormant.

Key Considerations for a More Focused Cold Storage IT Strategy  

As cold data grows, it requires a more focused IT strategy and approach to meet the infrastructure requirements. Key considerations include: 

Budget and technology limitations. While data storage requirements continue to rise, IT budgets cannot keep pace. Moreover, even as solid-state drives (SSDs) approach the cost and capacity characteristics of today’s hard disk drives over the next several years, HDD density growth is flattening, leaving no option for a lower-cost tier of random-access storage (which has been the most effective strategy to maintain accessibility while reducing cost). DNA-based storage, while it shows promise, is many years from widespread adoption and commercialization, and initially will serve only as the coldest of cold archives (i.e., relatively simple to encode, but hard to read). The largest cloud providers, for example, have already discovered that they need to rely on slower, lower-cost media to meet these constraints.

Forever archiving. Virtually all our knowledge, in whatever form and whatever topic, has been digitized. More and more, data needs to be preserved for its historic relevance (we’ll want to look at this again) and its future value (we’ll want to analyze this again). Research data, medical records, media content, genomic data, and AI/ML modeling data are all obvious candidates that must be preserved for years and decades. Data is information and information is data. With continuous advancement of deep learning algorithms, we will continue to expand our vehicles to mine value and innovate new uses. Cold storage needs to be secure, durable, and self-healing for decades. 

Online, unfettered access for enrichment, value creation, and innovation. Extracting value from growing data stores becomes even more problematic. To extract value, you need to know what data you have and have meaningful ways to organize and find relevant subsets. We need more data (the metadata) about the data to maintain and grow its relevance. This requires analysis and enrichment of the data itself, both upfront and over time for continued enhancement. To do so, the data must remain easily accessible for the long term. Currently, organizations are challenged to meet these goals with in-house platforms due to cost constraints. For organizations reliant on public cloud storage, data sovereignty and control become major issues; accessibility is further hampered by access and storage charges that spike when cold data is retrieved.

As we look to the future, the demands of cold data growth will ultimately give rise to new storage and service solutions.

The Future of Cold Storage Roundtable – October 5, 2021 

Register here for our October 5 (8:30 AM PT) roundtable on the Future of Cold Storage, and receive the IDC InfoBrief, “Data Deluge: Why Every Enterprise Needs a Cold Storage Strategy.”

______________ 

1 Note that, while there is a bit of a preoccupation with the amount of stored data that is cold, in fact 99% of all data is cold. Since 98% of data never gets saved at all, that data is guaranteed never to be accessed again – so, by our definition, it too is cold. That’s a lot of lost opportunity.

The post What is Cold Data? And Why You Need a Cold Storage Strategy. appeared first on Quantum Blog .

To view our Partner blog, click here

Announcing Availability of LTO-9 Tape Drives in Enterprise and Midrange Tape Systems

Archive

Today, Quantum announces the availability of LTO-9 tape drives in its flagship enterprise and midrange tape systems, the Scalar i6000 and Scalar i6. LTO-9 tape drives are now also available for purchase in Scalar i500 tape systems. LTO-9 drives support both LTO-8 and LTO-9 media, with each LTO-9 tape holding 18 TB of native capacity and up to 45 TB of compressed capacity. The drives also deliver improved throughput of up to 400 MB/s native.
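To put those specs in context, a bit of quick arithmetic (the fill time assumes sustained native throughput with no compression):

```python
import math

# LTO-9 figures from the announcement above.
NATIVE_TB = 18       # native capacity per cartridge
COMPRESSED_TB = 45   # compressed capacity per cartridge
NATIVE_MB_S = 400    # native drive throughput

# Implied compression ratio: 45 / 18 = 2.5:1
ratio = COMPRESSED_TB / NATIVE_TB

# Time to fill one cartridge at full native speed: ~12.5 hours
hours_to_fill = (NATIVE_TB * 1e12) / (NATIVE_MB_S * 1e6) / 3600

def cartridges_for(petabytes: float, compression: float = 1.0) -> int:
    """Cartridges needed to hold a dataset at a given compression ratio."""
    return math.ceil(petabytes * 1000 / (NATIVE_TB * compression))

print(ratio, hours_to_fill, cartridges_for(1))  # 2.5, 12.5, 56
```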

Tape in the 2021 Environment

So, why is tape needed in your environment in 2021? The simple answer is ransomware and the growth of unstructured data. Ransomware has evolved from an unintelligent attack aimed at disrupting your business into a relentless, targeted, sophisticated, AI-driven machine that seeks to encrypt and delete your backups to extract hundreds of thousands of dollars in ransom. To meet the heavy demands of our current cyber climate, and of video and other forms of unstructured data, tape provides the assurance that a copy of your data will be available when needed. Tape inherently air-gaps data and is the lowest-cost solution for your long-term ‘cold’ storage. An offline copy is critical as an insurance policy should your online copies be compromised. Today, no technology can stop ransomware from attacking your organization, but we can help you prevent it from gaining a foothold. We know your reputation, your customers, and their trust are important to you.

Data Growth and Unrelenting Cyber Threats

We understand data is invaluable, and we see its exponential climb to stratospheric proportions. But how are you going to store the massive amounts of data being created, when it must be stored and protected (sometimes forever) from cyber and other threats without breaking the bank? Quantum LTO-9 tape is designed to meet the demands of the modern data center, from storing edge and AI-driven analytical data to media and entertainment content and long-term cold storage. LTO protects your data from the onslaught of ransomware and future-proofs your growing deep archives. LTO tape has a strong history of protecting data, continues to show a solid roadmap to greater capacities and performance, and can be a strong partner in your DR strategy.

The future of LTO looks a lot denser as innovation continues to deliver a cost-efficient, cyber-resilient alternative to other media. For example, tape has major cost advantages over HDD in terms of $/GB. Like disk, tape can increase capacity through areal density; unlike disk, which has reached its maximum platter count, tape can also increase its length. Add its inherent ability to form a natural airgap and the capacities it can store, and LTO tape addresses the need for an economical offline solution.
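As a concrete illustration of the $/GB gap, here is a minimal sketch. The prices below are assumed round numbers for illustration only, not vendor quotes, and exclude drives, libraries, and power:

```python
# Illustrative media-only $/GB comparison; prices are assumptions.
TAPE_PRICE, TAPE_TB = 100.0, 18   # one LTO-9 cartridge (assumed price)
HDD_PRICE, HDD_TB = 350.0, 18     # one enterprise HDD (assumed price)

tape_per_gb = TAPE_PRICE / (TAPE_TB * 1000)  # ~$0.0056/GB
hdd_per_gb = HDD_PRICE / (HDD_TB * 1000)     # ~$0.0194/GB

print(f"tape ${tape_per_gb:.4f}/GB vs HDD ${hdd_per_gb:.4f}/GB "
      f"({hdd_per_gb / tape_per_gb:.1f}x)")  # media-only gap of ~3.5x
```

Under these assumptions the gap is about 3.5x on media alone; real totals depend on drive, library, and operating costs.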

Implement LTO-9 into Your Long-Term Storage Strategy

To learn how you can implement Quantum LTO-9 tape technology for your long-term storage or if you’re on a journey to implement cyber resilient solutions, please check out how we help here .


Quantum Leads End-user Survey for Secondary Storage

ActiveScale


In the most recent end-user survey from Coldago Research, Quantum leads all vendors as the top brand for secondary storage. Coldago’s April 2021 report surveyed over 2,000 end-users across SMBs and enterprises about their brand preferences for a wide range of storage types and applications.


With the growth in secondary storage, including object storage, Quantum offers a wide range of products to serve our customers’ needs. These market segments are poised for growth. According to IDC, “Object storage accounts for the largest part of the public cloud storage market and continued to experience significant growth (31% in 2019).”[1] With that level of growth, customers need a wide portfolio of secondary storage products.

End User Survey, Top US/EMEA brands for Secondary Storage

“Coldago Research unveils its end-users survey 2021 addressing perceptions, opinions, needs and priorities. We also asked users to rank vendors in various categories, and for secondary storage, Quantum arrives 1st both in USA and Europe, confirming the corporate and product strategy initiated a few years ago with new products, technology developments, acquisitions and talent recruitments that started to pay off”, said Philippe Nicolas, analyst at Coldago Research. “As a pioneer of secondary storage, Quantum continues to offer large, diverse and broad offerings targeting from small configurations to very large-scale environments, whatever the technology, storage media, architecture and access methods, making their solutions a real universal, unified and ubiquitous approach”, concludes Nicolas.

At Quantum, we’re pleased with the recognition that IT professionals have given us in this survey. Over the last few years, we’ve made numerous improvements across our entire portfolio. Here’s a quick summary:

  • ActiveScale object storage – Multiple new releases that provide object locking, small object aggregation, and advanced troubleshooting.  New hardware included the P100E3 3-node platform for smaller customers and the new X200 platform for more performance and scale. 
  • Scalar tape – Introduction of multi-factor authentication builds upon the value of Active Vault, a feature available only in Quantum’s Scalar tape libraries to secure critical off-line data against ransomware attacks. The new feature makes unauthorized access extremely difficult, and underscores Quantum’s commitment to innovation for preserving and protecting valuable customer data.
  • DXi backup appliances – New features include Secure Snapshots and others developed in collaboration with Veeam, including multi-tiered backup and replication to Quantum DXi appliances, across sites, to the cloud, and to Scalar tape libraries.

“The Coldago report demonstrates strong customer satisfaction across the full line of Quantum’s secondary storage products,” said Bruno Hald, Quantum’s GM for Secondary Storage. “We are a customer-driven organization and have delivered a series of innovations for a full range of customers, from SMBs to larger enterprises to our hyperscale customers. We look forward to providing them with additional products and services to improve their bottom line.”


[1] Source:  IDC report, Cold Storage Services in the Public Cloud IaaS Market, #WC2020087, Sept 1, 2020

Learn more at https://www.coldago.com/ and read the source report: https://www.coldago.net/eusurvey


Meet the New ActiveScale: Dynamic Capacity for Dynamic Growth – Take Your Data Storage from Terabytes to Exabytes

ActiveScale

Our world is being digitized at a dizzying rate. Unstructured data – entertainment video, surveillance footage, weather and geospatial data – is growing at up to 60% per year, and it is projected to comprise 80% of all data on the planet by 2025. Today, the amount stored is already approaching 150 zettabytes (IDC, 2020) while growing by the minute. This staggering growth is coming from a variety of sources, including video production, satellite imagery, telemetry, web-scale applications, gene sequencing, and IoT sensors, just to name a few.

Unstructured data sets are huge, often reaching petabytes or even exabytes in size – beyond what classic NAS and SAN storage architectures can store and manage. And today’s data not only needs to be stored and protected; it also needs to be accessible by the people who need it, when they need it. Scientists, creative professionals, and business analysts all need immediate, ongoing access to their data to gain new insights and monetize its underlying value.

UNLIMITED SCALABILITY

On June 24, Quantum made a significant announcement to address the demanding requirements of unstructured data as we unveiled the latest generation of ActiveScale object storage software and systems – namely, ActiveScale 6.0 software and the new ActiveScale X200 platform. Together, they provide unlimited scalability in terms of capacity, object count, and performance, without sacrificing ActiveScale’s unequalled data durability, system availability, and data security.

NO REBALANCING REQUIRED

The ActiveScale 6.0 scale-out software architecture is key to achieving robust scalability. With ActiveScale, you can seamlessly expand your compute, networking, and storage resources to easily address increasing workloads. ActiveScale Dynamic Data Placement (DDP) incorporates advanced erasure coding, data encryption, and real-time heuristics to optimally and securely place data across the entire ActiveScale cluster. DDP not only guards against data loss from component and site failures, but also optimizes total system throughput across the cluster. DDP essentially eliminates the intrusive data rebalancing that plagues many other object storage systems and inhibits their scalability in real-life scenarios.
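The erasure-coding principle can be illustrated with a toy example. The sketch below tolerates the loss of any one shard using simple XOR parity; production systems like ActiveScale use far stronger Reed-Solomon-style codes that survive multiple component and site failures, so this is only the underlying idea, not Quantum’s actual algorithm:

```python
from functools import reduce

def xor_all(chunks):
    """Bytewise XOR of a list of equal-length byte strings."""
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks))

def encode(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard."""
    data += b"\x00" * (-len(data) % k)  # pad to a multiple of k
    n = len(data) // k
    shards = [data[i * n:(i + 1) * n] for i in range(k)]
    return shards + [xor_all(shards)]   # last shard is the parity

def rebuild(shards):
    """Recover the single missing shard (marked None) from the survivors."""
    return xor_all([s for s in shards if s is not None])

shards = encode(b"cold data, safely spread out", k=4)
lost = shards[2]
shards[2] = None                 # simulate a failed disk or node
assert rebuild(shards) == lost   # the lost shard is reconstructed
```

Because the parity shard is the XOR of the data shards, XOR-ing any four of the five shards yields the fifth, so any single failure is recoverable without a full copy of the data.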

HIGH DENSITY INFRASTRUCTURE

The ActiveScale X200 object storage platform is a new converged system. It combines the latest generation of high-density, high-capacity servers with the latest ActiveScale software advancements. When compared with our previous generation X100, a 3-module X200 cluster fits in an equivalent data center rack, providing:

  • 78% greater storage density (14.6 PB raw)
  • Up to 7X greater sustained throughput (51 GB/sec)
  • 6X more objects (30B)

Consolidating this much capacity and computing power in the same floor space significantly reduces operational costs while delivering greater access and performance. Plus, by seamlessly adding X200 modules, capacity, performance, and object count can be expanded as data grows. (Current customers also benefit: existing ActiveScale clusters can be expanded with the new X200 platform!)
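For the curious, the comparison figures above imply roughly these prior-generation per-rack numbers. These are back-calculated from the stated multipliers, not published X100 specifications:

```python
# Derive approximate X100 per-rack figures from the X200 comparison above.
x200_raw_pb, x200_throughput_gbs, x200_objects_b = 14.6, 51, 30

x100_raw_pb = x200_raw_pb / 1.78                # "78% greater density" -> ~8.2 PB
x100_throughput_gbs = x200_throughput_gbs / 7   # "up to 7X throughput" -> ~7.3 GB/s
x100_objects_b = x200_objects_b / 6             # "6X more objects"     -> 5 billion

print(round(x100_raw_pb, 1), round(x100_throughput_gbs, 1), x100_objects_b)
```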

THE ACTIVESCALE PORTFOLIO

To put this in perspective with our expanded portfolio, ActiveScale can now be deployed at as small as a few hundred terabytes and as large as 100 exabytes and beyond, all within a single, easy-to-use system.


In other exciting news, in addition to ActiveScale X200 being available through Quantum and our stellar partner network, ActiveScale 6.0 software is now available through Supermicro and their resellers when bundled with qualified Supermicro servers. To learn more, click here.

MEET THE NEW ACTIVESCALE

Learn more about ActiveScale 6.0 and its rich feature set, plus the new X200 platform.


Meet the New StorNext: Virtualized. Containerized. Cloud-Ready.

cloud

On April 27, we made a major announcement – a new software-defined architecture for StorNext and a new line of appliances in the Quantum H4000 Series. Although much of the focus might be on the new line of appliances, the real magic is the software architectural work that we’ve done under the covers. 

In this new software-defined and virtualized architecture, StorNext 7 file services, data services, and block services run virtually on a single platform. StorNext 7 has been virtualized, and major architectural components, such as the management layer and APIs, have been containerized in a Docker environment. In short, it is a totally redefined, modern, cloud-ready software architecture.


This represents a major architectural step toward enabling public- and hybrid-cloud deployments of the StorNext 7 file system and will make it possible to use StorNext on additional cloud and hardware platforms in the future.  

Easy to Deploy and Use in More Places than Ever

What does this mean for our customers? It means that StorNext is now virtualized, and easier to deploy and use in more places than ever before, whether it’s at the edge, on set, in the field, or anywhere you want to stand up a complete, collaborative environment or extend your production workflow. This new software delivers greater efficiency, more flexibility, and a dramatically simplified user experience. Explore the new Quantum H4000 here .

Join Us for a Live Quantum H4000 Tour

Join us on May 18 at 10 AM PDT for a live tour of the new StorNext 7 software-defined architecture and the simplified, easy-to-use interface. We’ll show you how to deploy the H4000 with StorNext in minutes, so you can easily harness the power of data to accelerate your workflows and production pipelines.

Register Now


Announcing the H4000 with StorNext 7 – Fast, Collaborative Production Just Got Easier

Block Storage

If the new pace of production and the sheer sprawl of content and unstructured data you need to manage, organize, and protect seems overwhelming, you’ll be glad to know that big help – in a small package – is on the way.

The Full Capabilities of the StorNext Platform – Now Fit in Only 2U

Today, we’re pleased to announce the newest, fastest StorNext ever on a converged appliance from Quantum. Together, StorNext 7 and the new H4000 appliance deliver a complete StorNext environment that fits in only 2 rack units. And when you can place the full capabilities of StorNext, the world’s fastest collaborative shared-file platform, in more places than ever before, it will change how and where you think about your critical workflows.

Take Your Solutions to the Edge – Move More of the Studio to the Set 

By converging all the capabilities of StorNext, the world’s fastest collaborative shared-file platform, into a single appliance, you can get racked and running in minutes – whether you’re adopting StorNext for the first time or quickly adding more power and capability to your existing environment:

  • Bringing Your Entire Team Together Just Got Easier: Make the leap to a unified, end-to-end workflow for a new level of efficiency and faster time to delivery. 
  • Adding New Teams and Projects Anywhere Just Got Easier: Extend your existing StorNext environment quickly to a new office or for a new project without complex infrastructure buildout. 
  • Analysis and Insight at the Edge Just Got Easier: Move high-performance analysis and critical solutions closer to the point of capture for faster insight and action.
  • Bringing Post to Set Just Got Easier: Bring post-production to the set – from ingest, to pre-visualization, to rough cuts.  

The new H4000 truly lets StorNext 7 – and your workflow – fly:

  • Ultrafast end-to-end architecture on PCIe 4 and the latest multi-core processors. 
  • New unified interface to manage the entire StorNext environment and the H4000.
  • Converged architecture to deliver all of the capabilities and services of StorNext. 
  • Flexible storage options to deliver 12 or 24 bays of storage and up to 160 TB or 307 TB. 
  • Flexible connection options to deliver your optimum mix of network and connection technologies including 100 Gb Ethernet and 32 Gb Fibre Channel.  

Join Our Live Talk on May 18! 

We’ll be taking you through all the features and all the ways you can get the most from StorNext 7 on the H4000 – join us here .

Until then, you can read more H4000 Series details here or contact your Quantum representative or Value-Added Reseller to review your workflow.
