
Ransomware Proof Your Data: New Ransom Block and Logical Tape Blocking Features



In recent alarming news, we’ve learned that U.S. Federal agencies issued a joint cybersecurity advisory over the ‘BlackMatter’ ransomware group targeting critical infrastructure entities, including two U.S. food and agriculture sector organizations. What’s even more alarming is that many organizations are still not prepared to defend their most valuable asset – their data. For those who don’t know, ‘BlackMatter’ seems to have evolved from ‘DarkSide,’ a ransomware gang that shuttered operations after the Colonial Pipeline ransomware attack.  

A Multi-Layer Strategy 

The U.S. Government, in the wake of the Colonial Pipeline and SolarWinds cyberattacks, issued an executive order to bolster U.S. cybersecurity defenses to withstand such threats. There is a clear sense of urgency that a multi-layered strategy must be implemented, and not just in government; it will reach deep into the private sector as well. A multi-layered data protection strategy is a topic I’ve been pushing for quite some time now: different technologies (cloud, disk, tape) are needed to lay out multiple defenses to combat ransomware and defend data.

Introducing Scalar Ransom Block 

For the reasons mentioned above, Quantum has hardened its Scalar tape libraries and introduced Scalar Ransom Block, an industry-first solution built on a simple, unique concept: create a physical barrier between the data stored on tapes and the network-connected robotic tape system. This feature ensures data stored on tape is completely offline and secure, providing an ultimate level of data security for cyber-resilient archives; after all, the endgame is to be cyber resilient.


Logical Tape Blocking 

In addition, we also introduced Logical Tape Blocking, a software-based solution that enables administrators to use software commands to prevent tapes from being loaded or unloaded. This reduces the risk window from the time a magazine waits to be filled to the time it can be ejected. Both features are part of Quantum’s comprehensive Scalar Security Framework, making Scalar tape systems some of the most secure tape storage systems on the market. Employing Scalar Ransom Block and Logical Tape Blocking as a deterrent to ransomware, alongside front-end defenses in your network, makes for a sound strategy for creating a cyber-resilient infrastructure.

Solutions like these bring a high-value proposition with a bonus: data is kept offline and managed automatically, yet remains available with a push and a click. Compare that to keeping long-term data 100% on always-connected spinning disk, which costs hundreds of thousands more, or to handling tapes by hand, which invites human error. Quantum has a secure and reliable solution to help you keep your data secure and available when you need it; Scalar Ransom Block is available only in Scalar tape libraries.

Make sure you check out the press release to learn more about our recent announcement and stop by our new Scalar Tape Libraries page, where you will find in-depth information related to Scalar Ransom Block, Logical Tape Blocking, and our Scalar Security Framework.


What is Cold Data? And Why You Need a Cold Storage Strategy.


Join Quantum  on October 5 for a major announcement on the Future of Cold Storage: Register Here  

If you have followed International Data Corp (IDC), the well-known technology analyst firm, over the last few years, you may be familiar with their work on the Global Datasphere, a measure and forecast of the amount of new data created and stored across the globe annually. Their latest analysis reports that over 64 zettabytes of data were created in 2020. That’s about 32 trillion two-hour movies; stored on DVDs (without jewel cases), that’s three stacks of discs reaching to the sun.
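
As a quick back-of-envelope check on that comparison, here is the arithmetic; a minimal sketch, and the ~2 GB figure for a two-hour movie is our own assumption:

```python
# Rough arithmetic behind the "32 trillion movies" comparison.
ZETTABYTE = 10**21                  # bytes
datasphere = 64 * ZETTABYTE         # IDC's estimate of data created in 2020

movie_bytes = 2 * 10**9             # assume ~2 GB per two-hour movie
movies = datasphere / movie_bytes
print(f"{movies:.1e} two-hour movies")   # ~3.2e13, i.e. about 32 trillion
```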

IDC expects continued growth compounding at 19% a year into the foreseeable future. Data created over the next three years will amount to more than all the data created over the past 30 years; three times more data will be created over the next five years than was created in the past five years.  

Remarkably, only 2% of that data is being stored for future use and analysis. Expectations are that stored data, too, will continue to grow, particularly because every forward-looking organization is recognizing the value of data as the vehicle of digital transformation. Data is driving the opportunity to create value, to invent new revenue streams, and to set and validate strategic direction.

Which brings us to the topic of cold data…  

What is Cold Data? 

Production workloads naturally access lots of data. If you think of data as having a lifecycle, ‘hot’ data is actively in use and requires high-performance access, while ‘warm’ data is still frequently accessed over a given timeframe. Cold data is inactive data that is infrequently or never accessed. Industry analysts project that 60% of all stored data is cold data.1
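
As a minimal sketch of how that lifecycle view might be applied in practice, the snippet below classifies data by last-access age; the 30- and 180-day thresholds are illustrative assumptions, since real policies vary by workload:

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds; actual hot/warm/cold boundaries vary by workload.
HOT_WINDOW = timedelta(days=30)
WARM_WINDOW = timedelta(days=180)

def classify(last_access: datetime) -> str:
    age = datetime.now(timezone.utc) - last_access
    if age <= HOT_WINDOW:
        return "hot"    # actively used; needs high-performance access
    if age <= WARM_WINDOW:
        return "warm"   # still accessed within the working timeframe
    return "cold"       # inactive; a candidate for archive storage

print(classify(datetime.now(timezone.utc) - timedelta(days=400)))  # -> cold
```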


Increasingly, cold data is being preserved not because an organization is required to save the data, but because the data is being recognized as having inherent and potential value. 

Classically, cold data was limited to data that was preserved to meet regulatory or in-house compliance policies that require retention for some number of years. The data often was simply written to tape media, taken offline, and moved to a storage facility for the rare event that the data would ever need to be accessed again – not so anymore.  

Why You Need a Cold Storage Strategy 

So, our whole orientation toward cold data is changing: on the one hand, its value is being recognized; on the other, its enormity and growth are becoming overwhelming. With the digitization of everything, the incessant data collection of sensors, the volume of video and imagery sources, plus the data-intensive requirements and periodic recalibration of data analysis, artificial intelligence, and deep learning workloads, the amount of cold data that must be stored is going to grow, and its application and use cannot and will not remain dormant.

Key Considerations for a More Focused Cold Storage IT Strategy  

As cold data grows, it requires a more focused IT strategy and approach to meet the infrastructure requirements. Key considerations include: 

Budget and technology limitations. Whereas data storage requirements continue to rise, IT budgets cannot keep pace. Moreover, while solid-state drives (SSDs) will approach the cost and capacity characteristics of today’s hard disk drives over the next several years, HDD density growth is flattening, leaving no option for a lower-cost tier of random-access storage (which has been the most effective strategy for maintaining accessibility while reducing cost). DNA-based storage, while it shows promise, is many years from widespread adoption and commercialization, and will initially serve only as the coldest of cold archives (i.e., relatively simple to encode, but hard to read). The largest cloud providers, for example, have already discovered that they need to rely on slower, lower-cost media to meet these constraints.

Forever archiving. Virtually all our knowledge, in whatever form and whatever topic, has been digitized. More and more, data needs to be preserved for its historic relevance (we’ll want to look at this again) and its future value (we’ll want to analyze this again). Research data, medical records, media content, genomic data, and AI/ML modeling data are all obvious candidates that must be preserved for years and decades. Data is information and information is data. With continuous advancement of deep learning algorithms, we will continue to expand our vehicles to mine value and innovate new uses. Cold storage needs to be secure, durable, and self-healing for decades. 

Online, unfettered access for enrichment, value creation, and innovation. Extracting value from growing data stores becomes even more problematic. To extract value, you need to know what data you have and have meaningful ways to organize and find relevant subsets. We need more data about the data (the metadata) to maintain and grow its relevance. This requires analysis and enrichment of the data itself, both upfront and over time, for continued enhancement. To do so, the data must remain easily accessible for the long term. Currently, organizations are challenged to meet these goals with in-house platforms due to cost constraints. For those reliant on public cloud storage, data sovereignty and control become major issues; plus, accessibility is hampered by access and storage charges that spike when cold data is retrieved.

As we look to the future, the demands of cold data growth will ultimately give rise to new storage and service solutions.

The Future of Cold Storage Roundtable – October 5, 2021 

Register here  for our October 5th (8:30AM PT) roundtable on the Future of Cold Storage, and receive the IDC InfoBrief, “Data Deluge: Why Every Enterprise Needs a Cold Storage Strategy.”  

______________ 

1 Note that, whereas there is a bit of a preoccupation with how much stored data is cold, in fact 99% of all data is cold. Since 98% of data never gets saved at all, that data is guaranteed never to be accessed again, so by our definition it, too, is cold. That’s a lot of lost opportunity.


Announcing Availability of LTO-9 Tape Drives in Enterprise and Midrange Tape Systems


Today, Quantum announces the availability of LTO-9 tape drives in its flagship enterprise and midrange tape systems, the Scalar i6000 and Scalar i6. LTO-9 tape drives are now also available for purchase in Scalar i500 tape systems. LTO-9 drives support both LTO-8 and LTO-9 media, with each LTO-9 tape holding 18 TB of native capacity and up to 45 TB of compressed capacity. The drives also improve performance, with native transfer rates of up to 400 MB/sec.
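
To put those figures in perspective, here is the simple arithmetic for filling one cartridge at the native rate; a sketch for uncompressed data, since real-world throughput depends on the workload:

```python
# Time to fill a single LTO-9 cartridge at the native transfer rate.
native_capacity = 18 * 10**12   # 18 TB in bytes
native_rate = 400 * 10**6       # 400 MB/sec in bytes per second

hours = native_capacity / native_rate / 3600
print(f"{hours:.1f} hours to fill one cartridge")   # -> 12.5 hours
```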

Tape in the 2021 Environment

So, why is tape needed in your environment in 2021? The simple answer is ransomware and the growth of unstructured data. Ransomware has evolved from an unsophisticated attack aimed at disrupting your business into a relentless, targeted, AI-driven machine that seeks to encrypt and delete your backups to extract hundreds of thousands in ransom. To meet the heavy demands of our current cyber climate, and of video and other forms of unstructured data, tape provides the assurance that a copy of your data will be available when needed. Tape inherently air-gaps data and is the lowest-cost solution for your long-term ‘cold’ storage. An offline copy is critical as an insurance policy should your online copies be compromised. Today, no available technology can stop ransomware from attacking your organization, but we can help you prevent it from gaining a foothold. We know your reputation, your customers, and their trust are important to you.

Data Growth and Unrelenting Cyber Threats

We understand data is invaluable, and we see its exponential climb to stratospheric proportions. But how are you going to store the massive amounts of data being created, when it must be stored and protected (sometimes forever) from cyber threats and other dangers without breaking the bank? Quantum LTO-9 tape is designed to meet the demands of the modern data center, from storing edge and AI-driven analytical data to media and entertainment content and long-term cold storage. LTO protects your data from the onslaught of ransomware and future-proofs your growing deep archives. LTO tape has a strong history of protecting data, continues to show a solid roadmap to greater capacities and performance, and can be a strong partner in your DR strategy.

The future of LTO looks a lot denser as innovation continues to deliver a cost-efficient, cyber-resilient alternative to other solutions. For example, tape has major cost advantages over HDD in terms of $/GB. Like disk, tape can increase capacity through areal density; unlike disk, which has reached its maximum platter count, tape can also increase its length. Add its inherent ability to form a natural air gap for securing your data and the capacities it can store, and LTO tape addresses the need for an economical offline solution.
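
As a rough illustration of the $/GB argument, the sketch below compares media-only costs; both prices are assumptions for the example, not quoted figures, and exclude drives, libraries, power, and cooling:

```python
# Illustrative media-only $/GB comparison (prices are assumed, not quotes).
tape_cartridge_price = 150.00   # assumed price of one 18 TB LTO-9 cartridge
hdd_price = 400.00              # assumed price of one 18 TB enterprise HDD
capacity_gb = 18_000            # 18 TB native, for both media

print(f"tape: ${tape_cartridge_price / capacity_gb:.4f}/GB")  # ~$0.0083/GB
print(f"HDD:  ${hdd_price / capacity_gb:.4f}/GB")             # ~$0.0222/GB
```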

Implement LTO-9 into Your Long-Term Storage Strategy

To learn how you can implement Quantum LTO-9 tape technology for your long-term storage, or if you’re on a journey to implement cyber-resilient solutions, please check out how we help here.


What is Object Storage and What to Look for in an Object Storage System


Unstructured data is growing at up to 60% per year and will make up 80% of all data on the planet by 2025. For this reason, object storage solutions are entering a new era of mainstream adoption and use. Object storage allows you to easily store large amounts of unstructured data in a highly scalable and cost-effective way, making it a logical choice for backup and archiving, data analytics, and large, active unstructured data sets.

What Are Objects?

In its simplest form, object storage is a storage system in which data elements are stored in repositories called objects. Objects can be of variable size and are defined not only by their content, but also by their metadata and a unique identifier. Metadata can include simple things like creation date and time, but also unique characteristics of the data and searchable summary information. In contrast to accessing a file by name and location within a file system, an object is accessed by a unique identifier assigned when the object is created and used to retrieve it later.
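
A minimal sketch of the concept: an object bundles content, metadata, and a system-assigned identifier, and the store itself is a flat namespace keyed by that identifier (all names here are illustrative):

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class StoredObject:
    data: bytes                                    # the content, any size
    metadata: dict = field(default_factory=dict)   # creation time, tags, summary info
    object_id: str = field(default_factory=lambda: str(uuid.uuid4()))

store = {}   # a flat namespace: unique identifier -> object, no directories

obj = StoredObject(b"...video frames...", {"created": "2021-09-28", "camera": "cam-07"})
store[obj.object_id] = obj            # store under the unique identifier
retrieved = store[obj.object_id]      # retrieve later with the same identifier
```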

Built for Scale

A key element of a well-designed object storage system is its ability to scale gracefully, growing from small terabyte-scale configurations up to massive scale, supporting billions of objects and hundreds of petabytes of capacity. To prevent bottlenecks, object storage requires a ‘scale-out’ software architecture, which provides an expandable flat namespace addressing billions of objects and allows computing, networking, and storage resources to expand as the system’s capacity grows, maintaining performance.

Always Available Access

Object stores are accessed over a standard Ethernet network in one of two ways, through API calls or HTTP/HTTPS requests, most commonly via software in the form of web-based applications, packaged and open-source software solutions, and custom-developed in-house applications and scripts. Today, most object stores are accessed via AWS S3, the de facto standard command set. Most notably, the S3 protocol runs securely over Ethernet, making it easy to integrate object storage into your computing environment and to access your data from varying distances, across private networks, WANs, and the Internet.
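
Because access is through the S3 command set over HTTP(S), a few lines of standard tooling suffice to store and retrieve an object. Here is a sketch using the boto3 library; the endpoint URL, bucket, and credentials are placeholders for your own deployment:

```python
import boto3

# Point the standard AWS SDK at any S3-compatible object store.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",   # placeholder endpoint
    aws_access_key_id="ACCESS_KEY",                   # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

# Store an object, then retrieve it by bucket and key.
s3.put_object(Bucket="archive", Key="projects/film-scan-001.mov", Body=b"...")
response = s3.get_object(Bucket="archive", Key="projects/film-scan-001.mov")
data = response["Body"].read()
```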

Durability, Security, and Availability

Another key attribute of object storage is the design emphasis on data durability, security, and availability. With so much data being stored, it’s important that the system and data are always accessible, and that data is securely stored and confidently retained over long periods, for years and sometimes decades.

Object storage systems use erasure coding, a data protection technique superior to classic RAID and storage replication technologies. Erasure coding breaks an object into many pieces, called shards, then calculates and maintains multiple parity shards for them; the data shards and parity shards are then distributed across the underlying hardware infrastructure. This ensures continuity of data and data access in the case of data loss, component failure, or a site disaster.
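
To make the idea concrete, here is a toy single-parity scheme in the spirit of erasure coding: the object is split into k data shards plus one XOR parity shard, and any single lost shard can be rebuilt from the survivors. Production systems use stronger codes (such as Reed-Solomon) that tolerate multiple simultaneous failures:

```python
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_with_parity(obj: bytes, k: int) -> list:
    """Split obj into k equal data shards plus one XOR parity shard."""
    obj = obj.ljust(-(-len(obj) // k) * k, b"\0")   # pad to a multiple of k
    size = len(obj) // k
    shards = [obj[i * size:(i + 1) * size] for i in range(k)]
    shards.append(reduce(xor, shards))              # parity = XOR of all data shards
    return shards

def rebuild(shards: list, lost: int) -> bytes:
    """Recover any single lost shard by XOR-ing the surviving shards."""
    return reduce(xor, (s for i, s in enumerate(shards) if i != lost))

shards = split_with_parity(b"object payload to protect", k=4)
assert rebuild(shards, lost=2) == shards[2]   # a lost shard is recoverable
```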

Not All Object Storage Systems Are the Same

While object storage systems have a lot in common, it is also important to recognize that not all object storage systems are the same. In many systems, the concept of scale extends only to the terabyte range, and reliability and availability capabilities don’t meet the needs of massive enterprise deployments.

Performance at scale. Whereas scalability is a key attribute promoted by many vendors, implementations can suffer as the system scales, triggering painful, performance-crushing data rebalancing across system resources. This forces customers to manage a fleet of object storage systems despite their original goal of a simple, scalable solution.

Recommendation: Seek real-world references from customers running single-namespace instances of at least a petabyte, and even larger if your capacity requirements extend into tens or hundreds of petabytes.

Efficiency. A well-designed system should be manageable at any scale, from terabytes to exabytes, by a single administrator on a part-time basis.

Recommendation: Go beyond the demo to truly understand the system’s design and the management toolset available. Bypass the downloadable, UI-centric demoware and consider a proof-of-concept deployment to validate the system design and its applicability to your use case.

Proactive data protection. Given the amount of data you are storing, its security, availability, and protection against failure are paramount.

Recommendation: Look for advanced proactive data integrity verification to guarantee data consistency over the long term and always-on, cloud-based monitoring to support your efforts.

An everlasting archive. Most object storage systems naturally grow over time and evolve to support additional use cases. The object storage solution you deploy is a key strategic asset for both mining and preserving your organization’s data assets for the long term.

Recommendation: Consider your object storage vendor as a strategic partner working with you to protect and extract value from your digital assets. Value vendor longevity and understand how well systems preserve your data and maintain data accessibility when transitioning to new generations of software and hardware.

About Quantum ActiveScale

Quantum ActiveScale is an enterprise-class object storage solution designed from the ground up for data durability, security, and availability of massive unstructured data sets, serving use cases from terabytes to exabytes in scale. With S3 compatibility and an optimized two-layer storage architecture, ActiveScale simply and cost-effectively scales performance and capacity without bound in support of analytical workloads, active archiving, long-term retention, and cold data storage. Learn more at: www.quantum.com/object-storage


Quantum Leads End-user Survey for Secondary Storage


In the most recent end-user survey from Coldago Research, Quantum leads all vendors as the top brand for secondary storage. Coldago’s April 2021 report surveyed over 2,000 end users across SMBs and enterprises about their brand preferences for a wide range of storage types and applications.


With the growth in secondary storage, including object storage, Quantum offers a wide range of products to serve our customers’ needs. These market segments are poised for growth: according to IDC, “Object storage accounts for the largest part of the public cloud storage market and continued to experience significant growth (31% in 2019).”[1] With that level of growth, customers need a broad portfolio of secondary storage products.

End User Survey, Top US/EMEA brands for Secondary Storage

“Coldago Research unveils its end-users survey 2021 addressing perceptions, opinions, needs and priorities. We also asked users to rank vendors in various categories, and for secondary storage, Quantum arrives 1st both in the USA and Europe, confirming the corporate and product strategy initiated a few years ago with new products, technology developments, acquisitions and talent recruitment that has started to pay off,” said Philippe Nicolas, analyst at Coldago Research. “As a pioneer of secondary storage, Quantum continues to offer a large, diverse and broad portfolio, targeting everything from small configurations to very large-scale environments, whatever the technology, storage media, architecture and access method, making their solutions a truly universal, unified and ubiquitous approach,” concludes Nicolas.

At Quantum, we’re pleased with the recognition IT professionals have given us in this survey. Over the last few years, we’ve made numerous improvements to our entire portfolio. Here’s just a quick summary:

  • ActiveScale object storage – Multiple new releases that provide object locking, small object aggregation, and advanced troubleshooting.  New hardware included the P100E3 3-node platform for smaller customers and the new X200 platform for more performance and scale. 
  • Scalar tape – Introduction of multi-factor authentication builds upon the value of Active Vault, a feature available only in Quantum’s Scalar tape libraries to secure critical off-line data against ransomware attacks. The new feature makes unauthorized access extremely difficult, and underscores Quantum’s commitment to innovation for preserving and protecting valuable customer data.
  • DXi backup appliances – New features include Secure Snapshots and others developed in collaboration with Veeam, including multi-tiered backup and replication to Quantum DXi appliances, across sites, to the cloud, and to Scalar tape libraries.

“The Coldago report demonstrates strong customer satisfaction across the full line of Quantum’s secondary storage products,” said Bruno Hald, Quantum’s GM for Secondary Storage. “We are a customer-driven organization and have delivered a series of innovations for a full range of customers, from SMBs to larger enterprises to our hyperscale customers. We look forward to providing them with additional products and services to improve their bottom line.”


[1] Source:  IDC report, Cold Storage Services in the Public Cloud IaaS Market, #WC2020087, Sept 1, 2020

Learn more at https://www.coldago.com/ and read the source report: https://www.coldago.net/eusurvey


VirtualQ I Transform Recap: Helping you Unlock the Value of Unstructured Data


As we head into the holiday season, it’s a good time to recap some of the major things we’ve introduced this year. In many ways, our recent VirtualQ I Transform event was the culmination of a year’s worth of effort in bringing new products and solutions to market.

In November, our VirtualQ I Transform event focused on how unstructured data growth, and unlocking the business value in this data, is transforming how our customers think about data. The event was the culmination of our launch efforts for the expanded portfolio we announced on November 10, focused on managing, storing, and protecting unstructured data across its lifecycle. You can access all of the sessions and watch at your convenience here.

Most Popular Topics During VirtualQ I Transform

Based on the hundreds of attendees across many different sessions hosted over two days (one European event, and one event for the Americas), here are some of the topics that resonated most and generated the most interest:

Unstructured Data Sprawl: Many of our customers are outgrowing their capacity and looking at new ways to manage ‘unstructured data sprawl.’ We are helping these customers gain better visibility into what data they have and where it lives, and automating ways to move data between edge, core, and cloud environments. The new automated data movement capabilities we’ve added to StorNext 7 and ATFS can help in this regard, and they were two of the key things we introduced.

ATFS Storage Platform: There were many questions about our new ATFS storage platform, an easy-to-use, cost-effective NAS platform with integrated data classification. This data classification is a major advancement in giving customers visibility into the file data they are storing. With simple queries, customers can gain new insights into the makeup of their file data and where it lives, then quickly use those insights to drive actions that can be enforced with data placement policies, data protection policies, and storage quotas.

ATFS Trial Download: As we mentioned during our event, if you’re interested in learning more about the ATFS platform and want to start classifying the file data you have today, download a full version of ATFS here. It only takes a few minutes, and you can start to see the impact this platform can make on your NAS infrastructure.

Ransomware Protection: There was also a lot of interest in new ways and best practices to protect against ransomware, and in data security in general. This continues to be top of mind for many of our customers, and we now have options to protect against ransomware regardless of your technology choice and the application you are using. We’ve introduced Object Lock in our ActiveScale object store software; we are adding a similar snapshot retention-lock feature to our DXi backup appliances; and tape remains the most secure storage, with innate air-gapped protection, Active Vault software to automatically move tapes to an in-library vault, and security capabilities like AES 256-bit encryption and multi-factor authentication.
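
As an illustration of the Object Lock approach, the sketch below writes a backup object under a retention lock via the standard S3 API, so it cannot be overwritten or deleted until the retain-until date passes; the endpoint, bucket, and key are placeholders, and the bucket must have been created with Object Lock enabled:

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")  # placeholder

# Write a restore point under a compliance-mode retention lock:
# the object is immutable until the retain-until date.
s3.put_object(
    Bucket="backups",                    # bucket created with Object Lock enabled
    Key="veeam/restore-point-0042",      # illustrative key
    Body=b"...backup data...",
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=90),
)
```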

“Forever” Data Archives: Lastly, there was continued interest in the emerging realization that much of this valuable data being created and analyzed will need to be kept forever in a private cloud. Our solutions for this “forever” data archive continue to drive interest, from object storage software with erasure coding, to RAIL tape, to ways to index and search the content in these archives. Look for some exciting announcements in this area in the coming weeks and months. You can check out the “Forever Archive” white paper here.

Looking Ahead

We are continuing to innovate and transform our own business with a focus on solving our customers’ most pressing business challenges in the areas of video, digital images, and other forms of unstructured data, and we can’t wait for 2021! 

Watch VirtualQ I Transform Sessions On-Demand

If you missed the event, watch replays at your own pace to learn more about managing, storing, and protecting unstructured data across its lifecycle.

