
What is Cold Data? And Why You Need a Cold Storage Strategy.

ActiveScale


If you have followed International Data Corporation (IDC), the well-known technology analyst firm, over the last few years, you may be familiar with their work on the Global Datasphere, a measure and forecast of the amount of new data created and stored across the globe annually. Their latest analysis reports that over 64 zettabytes of data were created in 2020. That’s roughly 32 trillion two-hour movies (at about 2 GB per movie), or enough DVDs (without jewel cases) to form three stacks reaching the sun.

IDC expects continued growth compounding at 19% a year into the foreseeable future. Data created over the next three years will amount to more than all the data created over the past 30 years; three times more data will be created over the next five years than was created in the past five years.  

Remarkably, only 2% of that data is being stored for future use and analysis. Stored data, too, is expected to continue growing, particularly because every forward-looking organization recognizes the value of data as the vehicle of digital transformation. Data is driving the opportunity to create value, to invent new revenue streams, and to set and validate strategic direction.

Which brings us to the topic of cold data…  

What is Cold Data? 

Production workloads naturally access lots of data. If you think of data as having a lifecycle, ‘hot’ data is actively in use and requires high-performance access, while ‘warm’ data is still frequently accessed over a given timeframe. Cold data is inactive data that is accessed infrequently or never. Industry analysts project that 60% of all stored data is cold data.1
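To make the lifecycle concrete, here is a minimal Python sketch of how a tiering policy might classify files by last-access age; the 30-day and 180-day thresholds, and the use of file access time, are illustrative assumptions rather than any standard:

import os
import time

HOT_DAYS = 30      # assumed threshold for 'hot' data
WARM_DAYS = 180    # assumed threshold for 'warm' data

def classify(path: str, now: float = None) -> str:
    # Classify a file as hot, warm, or cold by its last-access age.
    now = time.time() if now is None else now
    age_days = (now - os.stat(path).st_atime) / 86400
    if age_days <= HOT_DAYS:
        return "hot"     # actively used; needs high-performance access
    if age_days <= WARM_DAYS:
        return "warm"    # still accessed within the recent timeframe
    return "cold"        # inactive; a candidate for low-cost storage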


Increasingly, cold data is being preserved not because an organization is required to save the data, but because the data is being recognized as having inherent and potential value. 

Classically, cold data was limited to data that was preserved to meet regulatory or in-house compliance policies that require retention for some number of years. The data often was simply written to tape media, taken offline, and moved to a storage facility for the rare event that the data would ever need to be accessed again – not so anymore.  

Why You Need a Cold Storage Strategy 

Our whole orientation toward cold data is changing: on the one hand, its value is being recognized; on the other, its enormity and growth are becoming overwhelming. With the digitization of everything, the incessant data collection of sensors, the volume of video and imagery sources, plus the data-intensive requirements and periodic recalibration of data analysis, artificial intelligence, and deep learning workloads, the amount of cold data that must be stored is going to grow, and its application and use cannot and will not remain dormant.

Key Considerations for a More Focused Cold Storage IT Strategy  

As cold data grows, it requires a more focused IT strategy and approach to meet the infrastructure requirements. Key considerations include: 

Budget and technology limitations. Whereas data storage requirements continue to rise, IT budgets cannot keep pace. Moreover, while solid-state drives (SSDs) will approach the cost and capacity characteristics of today’s hard disk drives over the next several years, HDD density growth is flattening, leaving no option for a lower-cost tier of random-access storage (which has been the most effective strategy for maintaining accessibility while reducing cost). DNA-based storage, while promising, is many years from commercialization and widespread adoption, and initially will serve only the coldest of cold archives (i.e., relatively simple to encode, but hard to read). The largest cloud providers, for example, have already discovered that they need to rely on slower, lower-cost media to meet these constraints.

Forever archiving. Virtually all our knowledge, in whatever form and on whatever topic, has been digitized. More and more, data needs to be preserved for its historic relevance (we’ll want to look at this again) and its future value (we’ll want to analyze this again). Research data, medical records, media content, genomic data, and AI/ML modeling data are all obvious candidates that must be preserved for years and decades. Data is information, and information is data. With the continuous advancement of deep learning algorithms, we will continue to expand our vehicles to mine value and innovate new uses. Cold storage needs to be secure, durable, and self-healing for decades.

Online, unfettered access for enrichment, value creation, and innovation. Extracting value from growing data stores becomes ever more challenging. To extract value, you need to know what data you have and have meaningful ways to organize and find relevant subsets. We need more data about the data (the metadata) to maintain and grow its relevance. This requires analysis and enrichment of the data itself, both up front and over time, for continued enhancement. To do so, the data must remain easily accessible for the long term. Currently, organizations are challenged to meet these goals with in-house platforms due to cost constraints. When they rely instead on public cloud storage, data sovereignty and control become major issues, and accessibility is hampered by access and storage charges that spike when cold data is retrieved.

As we look to the future, the demands of cold data growth will ultimately give rise to new storage and service solutions.

The Future of Cold Storage Roundtable – October 5, 2021 

Register here for our October 5 roundtable (8:30 AM PT) on the Future of Cold Storage, and receive the IDC InfoBrief, “Data Deluge: Why Every Enterprise Needs a Cold Storage Strategy.”

______________ 

1 Note that, while there is a bit of a preoccupation with how much stored data is cold, in fact 99% of all data is cold. Since 98% of data never gets saved at all, that data is guaranteed never to be accessed again, so, by our definition, it too is cold. That’s a lot of lost opportunity.



What is Object Storage and What to Look for in an Object Storage System

ActiveScale

Unstructured data is growing at up to 60% per year and will make up 80% of all data on the planet by 2025. For this reason, object storage solutions are entering a new era of mainstream adoption and use. Object storage allows you to easily store large amounts of unstructured data in a highly scalable and cost-effective way, making it a logical choice for backup and archiving, data analytics, and large, active unstructured data sets.

What Are Objects?

In its simplest form, object storage is a storage system in which data elements are stored in repositories called objects. Objects can be of variable size and are defined not only by their content but also by their metadata and a unique identifier. Metadata can include simple things like creation date and time, but also unique characteristics of the data and searchable summary information. In contrast to accessing a file by name and location within a file system, an object is accessed by a unique identifier that is created when the object is stored and is used to retrieve the object later.
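As a toy illustration of that contrast (not any particular product’s API), the sketch below models an object store in a few lines of Python: each object pairs its content with user metadata and is retrieved by a system-generated identifier rather than a path:

import uuid

class ToyObjectStore:
    # A toy in-memory object store; all names here are illustrative.
    def __init__(self):
        self._objects = {}

    def put(self, data: bytes, metadata: dict) -> str:
        object_id = str(uuid.uuid4())         # unique identifier, not a path
        self._objects[object_id] = (data, metadata)
        return object_id                      # caller keeps this to retrieve the object

    def get(self, object_id: str):
        return self._objects[object_id]       # returns (data, metadata)

store = ToyObjectStore()
oid = store.put(b"raw camera footage", {"created": "2021-09-28", "codec": "ProRes"})
data, meta = store.get(oid)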

Built for Scale

A key element of a well-designed object storage system is its ability to scale gracefully, growing from small, terabyte-scale configurations up to massive scale, supporting billions of objects and hundreds of petabytes of capacity. To prevent bottlenecks, object storage requires a ‘scale-out’ software architecture, which provides an expandable flat namespace that can address billions of objects and allows computing, networking, and storage resources to expand as the system’s capacity grows, maintaining performance.

Always Available Access

Object stores are accessed over a standard Ethernet network in one of two ways: through API calls or HTTP/HTTPS requests, most commonly through software in the form of web-based applications, packaged and open-source software solutions, and custom-developed in-house applications and scripts. Today, most object stores are accessed via the AWS S3 API, a de facto standard command set. Most notably, the S3 protocol runs securely over Ethernet, making it easy to integrate object storage into your computing environment and to access your data across varying distances, over private networks, WANs, and the Internet.
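For example, here is a minimal boto3 sketch of S3-style access; the endpoint URL, credentials, bucket, and key are placeholders standing in for any S3-compatible object store:

import boto3

# Placeholder endpoint and credentials for an S3-compatible object store.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Store an object along with searchable user metadata...
with open("sample-001.bam", "rb") as f:
    s3.put_object(
        Bucket="research-archive",
        Key="genomics/sample-001.bam",
        Body=f,
        Metadata={"project": "cohort-7", "instrument": "sequencer-3"},
    )

# ...then retrieve it later by bucket and key over HTTP/HTTPS.
obj = s3.get_object(Bucket="research-archive", Key="genomics/sample-001.bam")
payload = obj["Body"].read()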

Durability, Security, and Availability

Another key attribute of object storage systems is the emphasis their design places on data durability, security, and availability. With so much data being stored, it’s important that the system and data are always accessible, and that data is securely stored and confidently retained over long periods of time, for years and sometimes decades.

Object storage systems use erasure coding, a data protection technique superior to classic RAID and storage replication technologies. Erasure coding breaks an object up into many pieces, called shards, then calculates and maintains multiple parity shards for the data; data shards and parity shards are then distributed across the underlying hardware infrastructure. This ensures continuity of data and data access in the case of data loss, component failure, or a site disaster.
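To illustrate the idea only, the Python sketch below uses a single XOR parity shard, so any one lost data shard can be rebuilt from the survivors; production systems use Reed-Solomon-style codes with multiple parity shards spread across drives, nodes, and sites:

from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data: bytes, k: int):
    # Split data into k equal-size shards plus one XOR parity shard.
    size = -(-len(data) // k)   # ceiling division
    shards = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = reduce(xor_bytes, shards)
    return shards, parity

def rebuild(shards, parity, lost: int) -> bytes:
    # Recover one lost shard by XORing the survivors with the parity.
    survivors = [s for i, s in enumerate(shards) if i != lost]
    return reduce(xor_bytes, survivors + [parity])

shards, parity = encode(b"object payload to protect", k=4)
assert rebuild(shards, parity, lost=2) == shards[2]   # shard 2 recovered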

Not All Object Storage Systems Are the Same

While object storage systems have a lot in common, it is important to recognize that not all object storage systems are the same. For many systems, the concept of scale extends only to the terabyte range, and reliability and availability capabilities don’t meet the needs of massive enterprise deployments.

Performance at scale. While scalability is a key attribute promoted by many vendors, implementations can suffer as the system scales, triggering painful, performance-crushing data rebalancing across system resources. This forces customers to manage a fleet of object storage systems despite their original goal of a single, simple, scalable solution.

Recommendation: Seek real-world references from customers running single-namespace instances that scale to at least a petabyte, and higher still if your capacity requirements extend into tens or hundreds of petabytes.

Efficiency. A well-designed system should be manageable at any scale, from terabytes to exabytes, by a single administrator on a part-time basis.

Recommendation: Go beyond the demo to truly understand the system’s design and the management toolset available. Bypass downloadable, UI-centric demoware and consider a proof-of-concept deployment to validate the system design and its applicability to your use case.

Proactive data protection. With the amount of data you are storing, its security, availability, and protection against failure are paramount.

Recommendation: Look for advanced proactive data integrity verification to guarantee data consistency over the long term and always-on, cloud-based monitoring to support your efforts.

An everlasting archive. Most object storage systems naturally grow over time and evolve to support additional use cases. The object storage solution you deploy is a key strategic asset for both mining and preserving your organization’s data assets for the long term.

Recommendation: Consider your object storage vendor as a strategic partner working with you to protect and extract value from your digital assets. Value vendor longevity and understand how well systems preserve your data and maintain data accessibility when transitioning to new generations of software and hardware.

About Quantum ActiveScale

Quantum ActiveScale is an enterprise-class object storage solution designed from the ground up for data durability, security, and availability of massive unstructured data sets, with use cases ranging from terabytes to exabytes in scale. With S3 compatibility and an optimized two-layer storage architecture, ActiveScale simply and cost-effectively scales performance and capacity without bound in support of analytical workloads, active archiving, long-term retention, and cold data storage. Learn more at: www.quantum.com/object-storage


Quantum Leads End-user Survey for Secondary Storage

ActiveScale


In the most recent end-user survey from Coldago Research, Quantum leads all vendors as the top brand for secondary storage. Coldago’s April 2021 report surveyed over 2,000 end users across SMBs and enterprises about their brand preferences on a wide range of storage types and applications. In secondary storage, Quantum is the leading brand.


With the growth in secondary storage, including object storage, Quantum offers a wide range of products to serve our customers’ needs. These market segments are poised for growth. According to IDC, “Object storage accounts for the largest part of the public cloud storage market and continued to experience significant growth (31% in 2019).”[1] With that level of growth, customers need a broad portfolio of secondary storage products.

End User Survey, Top US/EMEA brands for Secondary Storage

“Coldago Research unveils its end-user survey 2021, addressing perceptions, opinions, needs, and priorities. We also asked users to rank vendors in various categories, and for secondary storage, Quantum arrives 1st in both the USA and Europe, confirming the corporate and product strategy initiated a few years ago with new products, technology developments, acquisitions, and talent recruitment that has started to pay off,” said Philippe Nicolas, analyst at Coldago Research. “As a pioneer of secondary storage, Quantum continues to offer a large, diverse, and broad set of offerings, targeting everything from small configurations to very large-scale environments, whatever the technology, storage media, architecture, or access method, making their solutions a truly universal, unified, and ubiquitous approach,” concluded Nicolas.

At Quantum, we’re pleased with the recognition IT professionals have given us in this survey. Over the last few years, we’ve made numerous improvements across our entire portfolio. Here’s a quick summary:

  • ActiveScale object storage – Multiple new releases that provide object locking, small-object aggregation, and advanced troubleshooting. New hardware included the P100E3 3-node platform for smaller customers and the new X200 platform for greater performance and scale.
  • Scalar tape – Introduction of multi-factor authentication builds upon the value of Active Vault, a feature available only in Quantum’s Scalar tape libraries, to secure critical offline data against ransomware attacks. The new feature makes unauthorized access extremely difficult and underscores Quantum’s commitment to innovation in preserving and protecting valuable customer data.
  • DXi backup appliances – New features include Secure Snapshots and others delivered in collaboration with Veeam, including multi-tiered backup and replication to Quantum DXi appliances, across sites, to the cloud, and to Scalar tape libraries.

“The Coldago report demonstrates strong customer satisfaction across the full line of Quantum’s secondary storage products,” said Bruno Hald, Quantum’s GM for Secondary Storage. “We are a customer-driven organization and have delivered a series of innovations for a full range of customers, from SMBs to larger enterprises to our hyperscale customers. We look forward to providing them with additional products and services to improve their bottom line.”


[1] Source: IDC report, Cold Storage Services in the Public Cloud IaaS Market, #WC2020087, Sept 1, 2020

Learn more at https://www.coldago.com/ and read the source report: https://www.coldago.net/eusurvey


A Trifecta of Veeam-Ready Solutions

ActiveScale

They say the most interesting things in life come in sets of three. Certainly, some sets are more interesting and simpler than others, such as live, love, laugh; rock, paper, scissors; or my favorite, the good, the bad, and the ugly. While not super philosophical, they are easy concepts to remember and, in some cases, simplify our daily lives. When we think of things that need simplifying in a data-centric environment, we think of data management, data security, and data protection. Quantum has made a commitment to provide leading solutions that simplify; this is clearly apparent as we continue to work with Veeam, offering solutions for data management, security, and protection. Quantum has been a long-standing partner of Veeam and has been certified as Veeam Ready, a program that ensures compliance with functionality and performance standards put forth by Veeam.

As the evolution of Big Data continues, the three core concerns are protecting, securing, and managing data. The industry has done a decent job of providing data protection. Data management is an emerging market segment focused on unstructured data. Recent hacking and ransomware attacks have brought data security concerns to the forefront of people’s minds. Quantum has been a Veeam partner for a long time; since Veeam 9.5, Quantum DXi deduplication appliances and award-winning Scalar tape libraries have been certified with Veeam. Quantum’s most recent addition to the portfolio, ActiveScale Object Storage, has also been Veeam Ready (v10) certified. Here is the full portfolio that is Veeam 10 Ready certified:

Veeam Integrated DXi Deduplication Appliance

DXi backup appliances offer cost-effective, fast backups and restores for mission-critical production data. Our customers love the DXi for many reasons, but the feedback we often hear is that DXi replication starts just after the first file is backed up, without waiting for the Veeam task to finish, so the period of risk – the gap between when a backup completes and a copy is safely off site – is incredibly short, unlike some competitors’ solutions that start replication only when the Veeam task is done. In v10, the Fast Clone feature improves synthetic full performance by referencing existing data blocks on volumes instead of copying data blocks between files, copying blocks only when files are modified. We have seen up to a 15x speed improvement in our test labs and customer deployments, in some cases reducing synthetic full creation time from hours to minutes.

Veeam Ready Scalar Tape Library

Quantum Scalar tape is the solution for compliance, long-term retention requirements, and ransomware protection. The Quantum Scalar library is the only tape system on the market that uses Active Vault, a hands-off approach to managing media between an active partition and a vaulted partition that is completely disconnected from the network. This enables our customers to fully air-gap their data and thwart the effects of ransomware and other malware.

Veeam Ready Object Immutability – ActiveScale Object Storage

ActiveScale is Quantum’s object storage system, providing a data-forever storage infrastructure. The ActiveScale system architecture accommodates storing a primary copy of multi-petabyte-scale datasets with high durability, and facilitates long-term data retention with minimal administrative overhead. With Veeam Ready Object Immutability certification, Quantum solutions give clients the ability to use our appliances efficiently through the Veeam GUI.

For more information on testing criteria, or for detailed information about how Quantum’s portfolio can help you meet your data protection requirements, visit our Quantum & Veeam web page or talk to one of our solution representatives.


VirtualQ I Transform Recap: Helping you Unlock the Value of Unstructured Data

ActiveScale

As we head into the holiday season, it’s a good time to recap some of the major things we’ve introduced this year. In many ways, our recent VirtualQ I Transform event was the culmination of a year’s worth of effort in bringing new products and solutions to market.

In November, our VirtualQ I Transform event focused on how unstructured data growth, and unlocking the business value in this data, is transforming how our customers think about data. This event was the culmination of our launch efforts for the expanded portfolio we announced on November 10, focused on managing, storing, and protecting unstructured data across its lifecycle. You can access all of the sessions and watch at your convenience here.

Most Popular Topics During VirtualQ I Transform

Based on the hundreds of attendees across the many sessions hosted over two days (one European event and one for the Americas), here are some of the topics that resonated most and generated the most interest:

Unstructured Data Sprawl: Many of our customers are outgrowing their capacity and looking for new ways to manage ‘unstructured data sprawl.’ We are helping these customers gain better visibility into what data they have and where it lives, and automating the movement of data between edge, core, and cloud environments. The new automated data movement we’ve added to StorNext 7 and ATFS can help in this regard, and these were two of the key things we introduced.

ATFS Storage Platform: There were many questions about our new ATFS storage platform, an easy-to-use, cost-effective NAS platform with integrated data classification. This data classification is a major advancement in giving customers visibility into the file data they are storing. With simple queries, customers can gain new insights into the makeup of their file data and where it lives, then quickly use those insights to drive actions that can be enforced with data placement policies, data protection policies, and storage quotas.

ATFS Trial Download: As we mentioned during the event, anyone interested in learning more about the ATFS platform, and in starting to classify the file data you have today, can download a full version of ATFS here. It only takes a few minutes, and you can start to see the impact this platform can make on your NAS infrastructure.

Ransomware Protection: There was also a lot of interest in new ways and best practices to protect against ransomware, and in data security in general. This continues to be top of mind for many of our customers, and we now have options to protect against ransomware regardless of your technology choice and regardless of the application you are using. We’ve introduced Object Lock in our ActiveScale object store software, we are adding a similar snapshot retention lock feature to our DXi backup appliances, and tape remains the most secure storage, with innate air-gapped protection, Active Vault software to automatically move tapes to an in-library vault, and security capabilities like AES 256-bit encryption and multi-factor authentication.
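For the Object Lock case, here is a hedged boto3 sketch of how an immutable backup copy is typically written through the standard S3 API; the endpoint, bucket, and key are placeholders, the bucket must be created with Object Lock enabled, and product documentation should be consulted for ActiveScale specifics:

import boto3
from datetime import datetime, timedelta, timezone

# Placeholder endpoint for an S3-compatible store with Object Lock enabled.
s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")

# Write a backup object that cannot be altered or deleted until the
# retention date passes; COMPLIANCE mode cannot be shortened or removed.
with open("backup-2020-12-01.vbk", "rb") as f:
    s3.put_object(
        Bucket="backup-repo",
        Key="veeam/backup-2020-12-01.vbk",
        Body=f,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
    )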

“Forever” Data Archives: Lastly, there was continued interest in the emerging realization that much of the valuable data being created and analyzed will need to be kept forever in a private cloud. Our solutions for this “forever” data archive continue to draw interest, from object storage software with erasure coding, to RAIL tape, to ways to index and search the content of these archives. Look for some exciting announcements in this area in the coming weeks and months. You can check out the “Forever Archive” white paper here.

Looking Ahead

We are continuing to innovate and transform our own business with a focus on solving our customers’ most pressing business challenges in the areas of video, digital images, and other forms of unstructured data, and we can’t wait for 2021! 

Watch VirtualQ I Transform Sessions On-Demand

If you missed the event, watch replays at your own pace to learn more about managing, storing, and protecting unstructured data across its lifecycle.

Watch Sessions On-Demand Now!


Join us at Tech Field Day!

ActiveScale

Quantum is pleased to present at the upcoming Tech Field Day, held virtually on December 9th.

Tech Field Day is an event unlike any other in the industry. At each Tech Field Day, a group of technical influencers and experts, called delegates, joins leading IT vendors to explore products and their technical underpinnings. There are no sales pitches or marketing presentations at Tech Field Day events. Instead, vendors provide a deep dive into their product architectures and give the delegates the opportunity to probe deeper via live questions that represent the perspectives of technical end users. These events are open for anyone to watch live, and all the presentations are made public shortly after the event.

At the next event, Quantum will share another significant step in our vision to lead in video and unstructured data solutions, one that represents a key shift in focus from ‘storing’ data to ‘managing’ data. This event is a great way to learn about the underlying technology that differentiates ActiveScale from other products.

Follow along live via Twitter @TechFieldDay or online during the time slots below.

Time (PT) – Session – Speaker
  • 8:00 AM – Quantum Company Overview – Ed Fiore, General Manager, Primary Storage
  • 8:20 AM – Object Storage and ActiveScale Introduction – Thomas Demoor, ActiveScale Lead Architect
  • 9:00 AM – ActiveScale Layered Architecture – Thomas Demoor, ActiveScale Lead Architect
  • 9:20 AM – ActiveScale Product Summary – Thomas Demoor, ActiveScale Lead Architect
  • 9:30 AM – ActiveScale Product Demonstration – Sherman Schorzman, ActiveScale Technical Marketing Engineer
