
AWS Outposts Meets ActiveScale Object Storage


Today is another great day for ActiveScale Object Storage. We are pleased to announce that ActiveScale is now AWS Outposts Ready. This designation, part of the AWS Service Ready program, validates ActiveScale object storage systems in AWS Outposts deployments to provide Amazon Simple Storage Service (S3)-compliant services for AWS services running within an AWS Outposts environment.
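To make “S3-compliant” concrete: an application written against the standard AWS SDK can simply be pointed at an on-premises endpoint. Below is a minimal sketch using boto3; the endpoint URL, credentials, and bucket name are hypothetical placeholders, not actual ActiveScale values.

```python
# Minimal sketch: a standard AWS SDK (boto3) client pointed at an
# S3-compatible on-premises endpoint. The endpoint URL, credentials,
# and bucket name are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal",  # hypothetical endpoint
    aws_access_key_id="EXAMPLE_ACCESS_KEY",
    aws_secret_access_key="EXAMPLE_SECRET_KEY",
)

# The application makes the same calls it would make against AWS S3.
s3.put_object(Bucket="research-data", Key="sample.txt", Body=b"hello")
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```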

WHAT AND WHY AWS OUTPOSTS? 

AWS Outposts, first introduced in 2019, delivers AWS infrastructure, AWS services, APIs, and tools within your own data center. In effect, the AWS cloud comes to you, through a fully configured, rack-based system environment deployed in your own data center, co-location facility, or hosted IT environment. Consistent with AWS services in the AWS cloud, and in line with a trend across multiple platform vendors today, AWS Outposts is sold and managed by AWS as a fully managed service.

AWS Outposts provides a hybrid cloud solution that is fully compatible with, and integrates into, your IT resources deployed in the AWS cloud. As an on-premises solution, AWS Outposts is particularly well suited for workloads that require:

  • low-latency access to on-premises systems,
  • local data processing due to dependencies on in-house resources or other constraints, or
  • data residency in a specific geography or security domain.

WHY ACTIVESCALE AND QUANTUM OBJECT STORAGE SERVICES?

ActiveScale is an ideal solution for AWS Outposts, providing industry-leading scalability, data durability, and total cost of ownership (TCO), enabled by its unique ability to cost-effectively manage both active and cold data. ActiveScale Dynamic Data Placement (DDP) incorporates advanced erasure coding and real-time algorithms to securely and optimally place data across the entire ActiveScale system. DDP not only guards against data loss from component and site failures, but also optimizes system performance.
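As general background (a simplified illustration of the erasure coding concept, not Quantum’s proprietary DDP algorithm), the sketch below shows the simplest possible scheme: k data shards plus one XOR parity shard, which allows any single lost shard to be rebuilt. Production systems use stronger codes that tolerate multiple simultaneous failures.

```python
# Simplified illustration of erasure coding: k data shards plus one XOR
# parity shard survive the loss of any single shard. This is NOT
# Quantum's DDP implementation; real systems use stronger codes
# (e.g., Reed-Solomon) that tolerate multiple failures.
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data, k):
    """Split data into k equal shards and append one XOR parity shard."""
    shard_len = -(-len(data) // k)  # ceiling division
    padded = data.ljust(shard_len * k, b"\0")
    shards = [padded[i * shard_len:(i + 1) * shard_len] for i in range(k)]
    return shards + [reduce(xor_bytes, shards)]

def recover(shards):
    """Rebuild the single missing shard (None) by XOR-ing the survivors."""
    missing = shards.index(None)
    rebuilt = reduce(xor_bytes, [s for s in shards if s is not None])
    return shards[:missing] + [rebuilt] + shards[missing + 1:]

shards = encode(b"object payload bytes", k=4)
shards[2] = None                       # simulate a failed drive or site
restored = recover(shards)
assert b"".join(restored[:4]).rstrip(b"\0") == b"object payload bytes"
```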

THE EMERGENCE OF THE AGILE DATA CENTER 

The increasing adoption of AWS Outposts, and the recent designation of ActiveScale as an AWS Outposts Ready solution, reflect growing interest among enterprises and cloud service providers alike in adopting hybrid cloud and as-a-Service data center strategies as they evolve their infrastructures. Large organizations are choosing as-a-Service solutions for their financial flexibility, agility, and simple consumption models. As another data point, analyst firm IDC predicts that Data-Center-Infrastructure-as-a-Service (DCIaaS) revenues will grow from $138M in 2020 to a whopping $14B in 2025, a growth rate of 151.8% annually.¹ Service offerings from major IT system suppliers, such as HPE GreenLake and Dell Technologies Cloud Infrastructure-as-a-Service, also reflect this trend.

The growing demand for on-premises data center services is consistent with Quantum’s own aggressive development of Quantum-as-a-Service solutions, for example, our recent introduction of Quantum Object Storage Services (QOSS). Based on ActiveScale platforms, Quantum Object Storage Services provide affordable, scalable, on-prem private cloud storage services for both active and archived data sets. In this deployment scenario, Quantum owns, installs, manages, maintains, and monitors your private ActiveScale infrastructure, delivered as-a-Service and backed by Quantum expertise, service delivery tools, and AIOps-driven analytics and monitoring. Scalable from petabytes to exabytes, QOSS meets the growing demands of large enterprises, government agencies, cloud service providers, and research organizations for more dynamic and flexible IT consumption models.



What Matters in Storage


The past few decades have brought a series of innovations to the storage industry, such as tiering, snapshots, deduplication, replication, iSCSI, SSD, NVMe, and a few others, but the storage service itself hasn’t changed. This might sound shocking; let me explain by first establishing a definition of a storage service.

A storage service is the combined delivery of three interdependent services: capacity, performance (defined by throughput, IOPS, and latency), and availability, or up-time. The storage service hasn’t changed; applications’ demand for storage services has changed only in degree, while what constitutes a storage service has stayed the same. Though this definition omits most of the innovations we have touted over the past decades, it doesn’t imply that those innovations don’t matter when evaluating solutions. So, what does drive the adoption of innovation? There is only one answer: cost.
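To make this definition concrete, the three dimensions can be written down as a plain data type (purely illustrative, not any vendor’s API):

```python
# Purely illustrative: the three interdependent dimensions of a storage
# service as defined above, expressed as a plain data type.
from dataclasses import dataclass

@dataclass
class StorageService:
    capacity_tb: float       # capacity: how much data can be stored
    throughput_mbps: float   # performance: sustained throughput
    iops: int                # performance: I/O operations per second
    latency_ms: float        # performance: response time
    availability: float      # availability: up-time, e.g. 0.99999

# Example profile an application might demand of its storage service
svc = StorageService(500.0, 2000.0, 100_000, 1.0, 0.99999)
```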


The cost of delivering the required storage service is what has driven storage innovation. Here are some examples:

  • Deduplication reduces the amount of actual data being stored, thus reducing the capacity required to support an application’s needs. If storage capacity costs $1 and deduplication reduces the need by 50%, then even if the cost of deduplication is $0.20, the net savings of $0.30 would justify adoption (the arithmetic is sketched after this list). If there were no cost advantage to deduplication, would it matter whether the system had it or not?
  • Tiering is the ability to move data across media performance tiers, allowing data to consume storage aligned with its performance needs. Tiering data to a lower-performance, less expensive tier, where storage costs 3x less, results in significant cost efficiencies.
  • iSCSI as an access protocol has replaced Fibre Channel (FC) in many data centers. Though iSCSI’s performance was initially viewed as lower than FC’s, deploying an iSCSI network was significantly less expensive. Many applications were fine with iSCSI performance, and adoption of the protocol skyrocketed, bringing extensive cost savings.
  • MLC, TLC, and QLC NAND deliver higher density with each iteration, and higher-density NAND offers a lower cost per GB of capacity. Since NAND delivers better performance per GB than HDD, the savings are amplified at higher densities.
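As noted in the list above, the deduplication and tiering figures can be reproduced with a few lines of arithmetic. This is a toy model built on the post’s illustrative numbers; the 70% tiered fraction is an assumed example, not a figure from the post.

```python
# Toy cost model reproducing the examples above. The dollar figures are
# this post's illustrative numbers; the 70% tiered fraction is assumed.

def dedup_net_savings(capacity_cost, reduction, feature_cost):
    """Net savings = capacity cost avoided minus the feature's own cost."""
    return capacity_cost * reduction - feature_cost

# Deduplication: $1 of capacity, 50% reduction, $0.20 feature cost
print(round(dedup_net_savings(1.00, 0.50, 0.20), 2))   # 0.3 -> $0.30 net savings

def tiering_savings(capacity_cost, fraction_tiered, tier_cost_ratio):
    """Savings from moving a fraction of data to a tier costing 1/ratio as much."""
    return capacity_cost * fraction_tiered * (1 - 1 / tier_cost_ratio)

# Tiering: move an assumed 70% of data to a tier that costs 3x less
print(round(tiering_savings(1.00, 0.70, 3), 3))        # 0.467 -> ~$0.47 saved per $1
```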

There are many other examples where storage innovations targeted improvements in performance, capacity density and utilization, up-time, and access in order to drive down the overall cost of storage services. Evaluating these features by how they affect the core storage service per unit of cost will help align total cost of ownership with business value.


This Week at the Q


1. This week we published a new customer case study on the Max Planck Society. Scientists from across the globe work and research at Max Planck Institutes in fields spanning the natural sciences, life sciences, social sciences, and the humanities. Max Planck Institutes are involved in more than 4,500 cooperation projects in over 100 countries.

The Joint Network Center (GNZ) at the Fritz Haber Institute (FHI) provides technology services for Max Planck Society. To better protect large volumes of scientific data, the GNZ needed to accelerate their storage backup process and improve their archiving. Read this new case study to learn how they were able to shrink their backup window, reduce costs, help ensure the long-term integrity of their data, and gain scalability to support fast-growing scientific data volumes.


2. We also announced the release of new DXi software with advanced features and capabilities for DXi backup appliances, significantly improving backup and restore performance and enabling remote system monitoring through cloud-based analytics.

IDC’s Phil Goodwin, Research Director, Infrastructure Systems, Platforms and Technologies Group, commented: "Traditional backup and recovery software remains strong, even as organizations add cloud-based backup to their data protection strategies. The combination of a Quantum DXi appliance and Veeam V10 shows great potential for enterprises seeking to protect their vital hybrid-cloud environments. Faster synthetic full backups enabled by this release means that organizations can backup more often, thereby reducing the risk of data loss and delivering a better RPO."

Read more about this announcement in this Enterprise Talk press coverage.


3. Check out this new blog by Eric Bassier, Quantum’s Sr. Director of Product Marketing, "Are You Experiencing Unstructured Data Sprawl?" He outlines how "unstructured data sprawl" has become an emerging problem for many CIOs and large enterprises, and discusses the need to classify data, work with it when you need it, and store it safely when you are not actively working with it.


4. Speaking of unstructured data sprawl, the need for an archive strategy has never been greater. Check out this in-depth video featuring industry expert Fred Moore of Horison Information Strategies with Dave Vellante on theCUBE, covering the topic of "Reinventing Archives."


5. This new SearchStorage article, “On-premises STaaS Shifts Storage Buying to Opex Model,” is a great overview of the growing field of STaaS, and discusses the benefits of a pay-as-you-go subscription model, the flexibility STaaS offers, and the simplified management for customers.

Leave a comment if there are topics you’d like to see added to our weekly top 5 happenings!

Natasha
