
Beefing Up the Mid-Range – Introducing the ActiveScale P200 Platform


With the addition of the new mid-range ActiveScale P200, we have completed the platform refresh across the entire ActiveScale portfolio, providing significantly enhanced performance and density across all price points. To meet the increasing need for object storage, ActiveScale cost-effectively scales from terabytes to exabytes and is the only object storage platform architected for both active and cold data.

The ActiveScale P200 Object Storage Platform

The P200 – Start Small, Grow Big

The P200 is our new mid-range solution, based on a converged architecture that consolidates compute, networking, and storage into a simple 3U building block for easy deployment and expansion. A cluster grows by adding more P200s, scaling capacity, performance, and object count without limit, so you can start small and grow big. Even as a 3U, 3-node configuration, the P200 packs a lot of power into a dense package. Compared to the ActiveScale P100, on a per-rack basis, the P200 provides:

  • 1.9X more capacity
  • 2.4X greater PUT performance
  • 2.7X greater GET performance
  • 6.25X more objects

In a 36U rack, a 12-node P200 scale-out cluster packs a lot of punch:

  • 7.8 petabytes of Active Class storage
  • 15 billion objects
  • Measured performance up to 37.8 GB/sec GETs and 31.9 GB/sec PUTs

The P200 replaces the P100 in the ActiveScale product line. But rest assured, no man (or node) gets left behind. Existing P100 customers can protect their investment by expanding their existing P100 clusters with P200 platforms, maximizing what they have already deployed while capitalizing on the P200's new levels of capability and density.

The Foundation for the Next Evolution in Object Storage

In addition to being a powerful platform for any object storage need, the P200 is an ideal choice upon which to build your unstructured data management strategy. Data is widely recognized today as a strategic asset. Entire industries are digitally transforming around the opportunity to extract significant value from a wide variety of digital data sources. Whether you are producing petabytes of data through gene sequencing, satellites, user activity, or market data, the need to store, analyze, protect, and archive more and more data grows unabated.

As the only object storage platform that supports both Active and Cold Storage Classes, the P200 allows you to keep more data at lower cost and access it whenever you need to, without expensive fees or complex recovery procedures.
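For a sense of what this looks like in practice, here is a minimal sketch of writing active and cold objects through an S3-compatible API. The endpoint, bucket, credentials, and the use of the GLACIER storage-class value are illustrative assumptions; consult the ActiveScale documentation for the storage-class names and restore workflow your cluster actually exposes.

```python
# Minimal sketch: writing active and cold objects via an S3-compatible API.
# The endpoint, bucket, credentials, and GLACIER storage-class value are
# assumptions for illustration only.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://activescale.example.com",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Active data: a normal PUT lands in the default (active) storage class.
s3.put_object(Bucket="research-data", Key="runs/run-001.parquet", Body=b"...")

# Cold data: request the archival class at write time (assumed here to map
# to ActiveScale Cold Storage on this deployment).
s3.put_object(
    Bucket="research-data",
    Key="archive/2018/run-001.parquet",
    Body=b"...",
    StorageClass="GLACIER",
)

# Bringing a cold object back when it is needed again.
s3.restore_object(
    Bucket="research-data",
    Key="archive/2018/run-001.parquet",
    RestoreRequest={"Days": 7},
)
```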

ActiveScale P200 3GEO with Active and Cold Data

Self-protecting and disaster-tolerant expansion at up to 80% less cost using ActiveScale Cold Storage

The requirements for the next generation of massive data stores are not only that they be self-protecting and disaster-tolerant, but also that they scale cost-effectively. P200 meets this need as a powerful disaster-tolerant solution through its ability to deploy across multiple data centers with full redundancy and non-stop access even in the case of a site failure. Using three P200s, a petabyte scale cluster can be geospread across three data centers.

As data needs grow, those requirements can be met by adding more P200s for active data. Moreover, as content ages, many organizations are best served by adding lower-cost cold data resources by deploying ActiveScale Cold Storage, based on Two-Dimensional Erasure Coding (2D EC) and the Quantum RAIL architecture. By adding just one tape library at each site, long-term cold storage costs can be reduced by up to 80%.
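To see where savings of that magnitude can come from, here is a rough back-of-the-envelope sketch comparing three-way replication against an erasure code on cheaper media. The 12-of-18 code geometry and the per-terabyte prices are illustrative assumptions, not Quantum's actual 2D EC layout or pricing.

```python
# Back-of-the-envelope sketch: raw-capacity overhead of a hypothetical
# k-of-n erasure code spread across three sites versus three full replicas.
# Parameters and prices are illustrative assumptions only.
def raw_capacity_tb(usable_tb: float, data_shards: int, total_shards: int) -> float:
    """Raw capacity needed to store `usable_tb` with a k-of-n erasure code."""
    return usable_tb * total_shards / data_shards

usable = 1000.0  # 1 PB of usable cold data

replicated = usable * 3                           # one full copy per site
erasure_coded = raw_capacity_tb(usable, 12, 18)   # e.g. 12 data + 6 parity shards

print(f"3x replication : {replicated:,.0f} TB raw")
print(f"12-of-18 EC    : {erasure_coded:,.0f} TB raw")

# If the cold tier also moves from disk to tape, the lower $/TB compounds
# with the lower overhead (illustrative prices only).
disk_cost_per_tb, tape_cost_per_tb = 20.0, 5.0
print(f"Replicated disk: ${replicated * disk_cost_per_tb:,.0f}")
print(f"EC on tape     : ${erasure_coded * tape_cost_per_tb:,.0f}")
```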

A Powerful Platform Portfolio

The addition of the ActiveScale P200 platform completes an aggressive plan to upgrade the entire ActiveScale portfolio around a common converged architecture. With scalability from terabytes to exabytes, ActiveScale meets needs ranging from simple, secure, protected data platforms for backup, recovery, and content management in small commercial enterprises to the unstructured data demands of the most complex IT environments. ActiveScale provides scalable, cost-effective, no-compromise solutions to meet the growing demand to extract more value from these ever-growing data stores.

ActiveScale Object Storage Platform Portfolio


For more information, you can download the ActiveScale datasheet here.


Meet Me in St. Louis – For an HPC, AI, and Analytics Extravaganza


Here at Quantum, we are excited to finally be on the road again. Next stop: St. Louis! Next week, SC21, the International Conference for High Performance Computing, Networking, Storage and Analysis, is being held at America's Center in downtown St. Louis.


A HISTORY OF INNOVATION

This event has a rich history of showcasing technical innovation in high performance computing (HPC) infrastructure and applications from across a broad spectrum of universities, supercomputing centers, national labs, and industrial research centers.

For Quantum, a couple of themes particularly resonate:

  • Managing data growth. The ongoing challenge of ingesting, analyzing, and managing massive data sets across their data lifecycle.
  • Cold data that’s not so cold. The desire to save more data from more sources, plus the need for easy and affordable access to that data now, and long into the future, for continued and periodic computational analysis and recalibration of deep learning models.

At the show, we will be highlighting our recent updates to the ActiveScale object storage platform, the industry-leading security features of our Scalar tape archive solutions, and StorNext NVMe, RDMA, and tiered storage technologies. Most of all, we are excited to promote the success of our customers building state-of-the-art capabilities with Quantum products and solutions (for example, Genomics England).

PLUS, SOME FUN AND GAMES


We are also here to enjoy some fun and games. In addition to sponsoring the Petaflop, we are delighted to be co-hosting a social get-together with our tech partner, Weka, to connect with our customers and the HPC community at large, and to celebrate our high-performance, massive-scale joint storage solutions for HPC, genomics, and AI. You don't need to be Irish to get in (;o)), but you do need to stop by the Quantum booth #918 or Weka booth #927 to claim your ticket!

We hope to see you there!


What is Cold Data? And Why You Need a Cold Storage Strategy.


Join Quantum on October 5 for a major announcement on the Future of Cold Storage: Register Here

If you have followed International Data Corporation (IDC), the well-known technology analyst firm, over the last few years, you may be familiar with their work on the Global Datasphere, a measure and forecast of the amount of new data created and stored across the globe annually. Their latest analysis reports that over 64 zettabytes of data were created in 2020. That's roughly 32 trillion two-hour movies, or three stacks of DVDs (without jewel cases) reaching all the way to the sun.

IDC expects continued growth compounding at 19% a year into the foreseeable future. Data created over the next three years will amount to more than all the data created over the past 30 years; three times more data will be created over the next five years than was created in the past five years.  
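As a quick sanity check on those figures, the snippet below simply compounds the 64 ZB created in 2020 at the quoted ~19% annual growth rate; it is arithmetic on the numbers in the text, not an independent forecast.

```python
# Compound the 2020 figure forward at the ~19% annual growth rate quoted
# by IDC. Illustrative arithmetic only.
zb_created = 64.0   # ZB of new data created in 2020
growth = 0.19       # ~19% compound annual growth

for year in range(2021, 2026):
    zb_created *= 1 + growth
    print(f"{year}: ~{zb_created:.0f} ZB of new data created")
```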

Remarkably, only 2% of that data is being stored for future use and analysis. Expectations are that stored data, too, will continue to grow, particularly because every forward-looking organization is recognizing the value of data as the vehicle of digital transformation. Data is driving the opportunity to create value, to invent new revenue streams, and to set and validate strategic direction.

Which brings us to the topic of cold data…  

What is Cold Data? 

Production workloads naturally access lots of data. If you think of data as having a lifecycle, 'hot' data is data that is actively being used and requires high-performance access, while 'warm' data is still accessed frequently within a given timeframe. Cold data is inactive data that is accessed infrequently or never. Industry analysts project that 60% of all stored data is cold data.1
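A data lifecycle policy often reduces to rules along these lines. Below is an illustrative sketch that classifies an object's temperature from its last-access time; the 30-day and 180-day thresholds are assumptions for illustration, since real policies are workload-specific.

```python
# Illustrative age-based data-temperature policy: classify objects as hot,
# warm, or cold from their last-access time. The 30-day and 180-day
# thresholds are assumptions for illustration only.
from datetime import datetime, timedelta, timezone
from typing import Optional

def temperature(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    now = now or datetime.now(timezone.utc)
    age = now - last_accessed
    if age <= timedelta(days=30):
        return "hot"
    if age <= timedelta(days=180):
        return "warm"
    return "cold"

# An object last read 400 days ago is a candidate for the cold tier.
print(temperature(datetime.now(timezone.utc) - timedelta(days=400)))  # -> cold
```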


Increasingly, cold data is being preserved not because an organization is required to save the data, but because the data is being recognized as having inherent and potential value. 

Classically, cold data was limited to data that was preserved to meet regulatory or in-house compliance policies that require retention for some number of years. The data often was simply written to tape media, taken offline, and moved to a storage facility for the rare event that the data would ever need to be accessed again – not so anymore.  

Why You Need a Cold Storage Strategy 

So, our whole orientation toward cold data is changing: on the one hand, its value is finally being recognized; on the other, its sheer size and growth are becoming overwhelming. With the digitization of everything, the incessant data collection of sensors, the volume of video and imagery sources, plus the data-intensive requirements and periodic recalibration of data analysis, artificial intelligence, and deep learning workloads, the amount of cold data that must be stored is going to grow, and its application and use cannot and will not remain dormant.

Key Considerations for a More Focused Cold Storage IT Strategy  

As cold data grows, it requires a more focused IT strategy and approach to meet the infrastructure requirements. Key considerations include: 

Budget and technology limitations. Whereas data storage requirements continue to rise, IT budgets cannot keep pace. Moreover, while solid-state drives (SSDs) will approach the cost and capacity characteristics of today's hard disk drives over the next several years, HDD density growth is flattening, leaving no lower-cost tier of random-access storage (which has been the most effective strategy for maintaining accessibility while reducing cost). DNA-based storage, while promising, is many years from commercialization and widespread adoption, and will initially serve only as the coldest of cold archives (i.e., relatively simple to encode, but hard to read). The largest cloud providers, for example, have already discovered that they need to rely on slower, lower-cost media to meet these constraints.

Forever archiving. Virtually all our knowledge, in whatever form and whatever topic, has been digitized. More and more, data needs to be preserved for its historic relevance (we’ll want to look at this again) and its future value (we’ll want to analyze this again). Research data, medical records, media content, genomic data, and AI/ML modeling data are all obvious candidates that must be preserved for years and decades. Data is information and information is data. With continuous advancement of deep learning algorithms, we will continue to expand our vehicles to mine value and innovate new uses. Cold storage needs to be secure, durable, and self-healing for decades. 

Online, unfettered access for enrichment, value creation, and innovation. Extracting value from growing data stores becomes even more problematic. To extract value, you need to know what data you have and have meaningful ways to organize and find relevant subsets. We need more data about the data (the metadata) to maintain and grow its relevance. This requires analysis and enrichment of the data itself, both up front and over time, for continued enhancement. To do so, the data must remain easily accessible for the long term. Currently, organizations are challenged to meet these goals with in-house platforms due to cost constraints. When they rely on public cloud storage instead, data sovereignty and control become major issues, and accessibility is hampered by access and storage charges that spike whenever cold data is retrieved.
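One common way to keep that descriptive metadata alongside the data itself is S3 object tagging, sketched below. The bucket, key, tag names, and endpoint are hypothetical placeholders, and credentials are assumed to come from the environment; this is an illustration of the general pattern, not a prescribed ActiveScale workflow.

```python
# Sketch: enriching a stored object with searchable metadata via S3 object
# tagging, so relevant subsets can be found later without moving the data.
# Endpoint, bucket, key, and tag names are hypothetical; credentials are
# assumed to come from the environment.
import boto3

s3 = boto3.client("s3", endpoint_url="https://activescale.example.com")

s3.put_object_tagging(
    Bucket="research-data",
    Key="archive/2018/run-001.parquet",
    Tagging={
        "TagSet": [
            {"Key": "project", "Value": "genome-cohort-12"},
            {"Key": "instrument", "Value": "sequencer-a"},
            {"Key": "review-after", "Value": "2026-01"},
        ]
    },
)

# The tags travel with the object and can later drive lifecycle, search,
# or enrichment workflows.
tags = s3.get_object_tagging(Bucket="research-data", Key="archive/2018/run-001.parquet")
print(tags["TagSet"])
```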

As we look forward to the future, the demands of cold data growth will ultimately give rise to new storage and service solutions.  

The Future of Cold Storage Roundtable – October 5, 2021 

Register here for our October 5 (8:30 AM PT) roundtable on the Future of Cold Storage, and receive the IDC InfoBrief, “Data Deluge: Why Every Enterprise Needs a Cold Storage Strategy.”

______________ 

1 Note that, while there is a bit of a preoccupation with how much stored data is cold, in fact 99% of all data is cold. Since 98% of data never gets saved at all, that data is guaranteed never to be accessed again, so by our definition it too is cold. That's a lot of lost opportunity.
