
Volume, Speed, Location: A Reference Architecture to Resolve Autonomous Vehicle Data-Management Challenges


After decades of research, the last few years have finally seen Artificial Intelligence (AI) and Machine Learning (ML) move from academic work to technology that is feasible for enterprise and even industrial production deployments, capable of addressing many problems that previously seemed impossible to solve. The availability of vast quantities of data and the exponential growth of computing resources enabled this fundamental transition.

We don’t program computers anymore… we train them! 

The development of ML models that predict future parameters or recognize objects relies on the availability and accessibility of well-prepared structured and unstructured data. The size of the training and verification datasets directly influences the quality of the resulting ML model. In reality, the availability of a large quantity of high-quality data is the key factor in making most ML technology work. The data must be collected, prepared, and made accessible to the compute infrastructure, and it has to be archived for future use. Each of these stages places significant, but different, demands on the storage infrastructure. In many cases, each data processing stage is performed at multiple locations. Typically, the data is first generated and collected at the edge of the network and subsequently pushed to distributed datacenters or to the cloud for processing and long-term storage. The exact infrastructure architecture and workflow are defined by the specific use case, which makes flexibility and modularity vital for any solution.

One of the most extensive and highly visible AI/ML use cases is Autonomous Vehicle (AV) development. Here, the initial data is collected by multiple cameras, LiDAR, and radar sensors installed on test vehicles deployed in regions where the AVs will eventually operate. Each vehicle collects terabytes of data per hour, which must be stored reliably and offloaded to a datacenter at a later time. The data collected at the datacenter quickly grows to tens and even hundreds of petabytes. Data is collected at multiple locations and processed locally or aggregated either in a primary datacenter or in the public cloud. This process is extremely time-consuming and expensive, making efficient storage and data management solutions critical for the business.

A new reference architecture to cover all phases of the complex AV development process 

To address all of these challenges for AV developers, and drawing on decades of unstructured data management experience and industry-leading technology, Quantum developed a new ADAS/AV data management reference architecture that delivers unmatched flexibility and performance. It describes the only single-vendor solution on the market that covers all phases of the complex AV development process, including in-vehicle data collection, data preparation, ML model training, system simulation, hardware-in-the-loop (HiL) testing, long-term storage, and archiving. The solution includes the new Quantum R6000 in-vehicle, high-performance, high-capacity, ruggedized storage appliance, which is deployed in the trunk of the test car and collects multistream data at over 10 GB/s. Its compact size and removable storage canister with up to 120 TB of reliable storage make it ideal for any data load in any environment.

The R6000 allows fast offload of the collected data to the datacenter infrastructure, where it is placed under the management of the Quantum StorNext File System, the world’s fastest file system for video workloads. StorNext manages the data stored on industry-leading NVMe, HDD, object storage, and tape appliances. It controls multiple storage tiers and automatically places data on the tier and at the location that provide the necessary performance and capacity at the best price. Both the capacity and the performance of each storage tier can scale essentially without limit.

Modularity and open interfaces are among the core principles followed during the design of the Quantum reference architecture. Each of the components supporting the storage tiers is optional and interchangeable, and all major industry standards for interoperability are supported. This allows easy integration of existing datacenter storage infrastructure or of cloud instances from AWS, GCP, or Azure.

A blueprint for AV and AI/ML development organizations 

The new Quantum reference architecture provides a blueprint for AV and other industrial AI/ML development organizations of any size, at any stage of development, to start and grow on a solid technology base with the support of one of the leading data management companies. Tapping into the decades of knowledge Quantum provides, together with the highest-performing, most reliable technology on the market, gives a massive advantage in the race to develop the next generation of autonomous vehicles and industrial robots.

To learn more about Quantum’s ADAS and mobility solutions, visit our autonomous vehicles page.

To learn more about the Quantum R-Series Edge Storage range, visit the R-Series product page.



What is Object Storage and What to Look for in an Object Storage System


Unstructured data is growing at up to 60% per year and will make up 80% of all data on the planet by 2025. For this reason, object storage solutions are entering a new era of mainstream adoption and use. Object storage allows you to easily store large amounts of unstructured data in a highly scalable and cost-effective way, making it a logical choice for backup and archiving, data analytics, and large, active unstructured data sets.

What Are Objects?

In its simplest form, object storage is a storage system in which data elements are stored in repositories called objects. Objects can be of variable size and are defined not only by their content, but also by their metadata and a unique identifier. Metadata can include simple attributes such as creation date and time, as well as unique characteristics of the data and searchable summary information. In contrast to accessing a file by its name and location within a file system, an object is accessed by a unique identifier that is created when the object is stored and is used to retrieve the object later.
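The sketch below is purely illustrative and not any vendor’s implementation: it shows how an object bundles content, metadata, and a generated unique identifier, and how the store retrieves it by that identifier rather than by a path in a directory tree.

```python
import datetime
import uuid

object_store = {}  # unique identifier -> object record

def put(content: bytes, metadata: dict) -> str:
    """Store content plus metadata and return the generated unique identifier."""
    object_id = str(uuid.uuid4())
    object_store[object_id] = {
        "content": content,
        "metadata": {
            "created": datetime.datetime.utcnow().isoformat(),
            **metadata,  # searchable, user-defined attributes
        },
    }
    return object_id

def get(object_id: str) -> dict:
    """Retrieve an object by its unique identifier, not by a file path."""
    return object_store[object_id]

oid = put(b"raw sensor capture ...", {"sensor": "front-camera", "scene": "night-rain"})
print(get(oid)["metadata"])
```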

Built for Scale

A key element of a well-designed object storage system is its ability to scale gracefully, growing from small terabyte-scale configurations up to massive scale, supporting billions of objects and hundreds of petabytes of capacity. To prevent bottlenecks, object storage requires a ‘scale-out’ software architecture. This provides an expandable flat namespace able to address billions of objects, and allows computing, networking, and storage resources to expand as the system’s capacity grows in order to maintain performance.

Always Available Access

Object stores are accessed over a standard Ethernet network in one of two ways, through API calls or HTTP/HTTPS requests, most commonly through software in the form of web-based applications, packaged and open-source software solutions, and custom-developed in-house applications and scripts. Today, most object stores are accessed via the AWS S3 API, a de facto standard command set. Most notably, the S3 protocol runs securely over Ethernet, making it easy to integrate object storage into your computing environment and to access your data from varying distances, across private networks, WANs, and the Internet.
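As a hedged sketch of what such access looks like in practice, the snippet below stores and retrieves an object through the S3 API using the boto3 library; the endpoint URL, bucket name, and credentials are hypothetical placeholders, and any S3-compatible object store reachable over the network would be addressed the same way.

```python
import boto3

# Connect to an S3-compatible object store over the network
# (endpoint and credentials below are placeholders, not real values).
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Store an object: the key acts as the unique identifier used to retrieve it later.
s3.put_object(
    Bucket="research-data",
    Key="2021/05/sensor-run-0042.json",
    Body=b'{"frames": 1200, "camera": "front-left"}',
    Metadata={"project": "adas", "collected": "2021-05-01"},  # user-defined metadata
)

# Retrieve the same object by its identifier; the metadata comes back with it.
obj = s3.get_object(Bucket="research-data", Key="2021/05/sensor-run-0042.json")
print(obj["Metadata"], obj["Body"].read())
```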

Durability, Security, and Availability

Another key attribute of object storage is the emphasis its design places on data durability, security, and availability. With so much data being stored, it is important that the system and data are always accessible, and that data is securely stored and confidently retained over long periods of time, for years and sometimes decades.

Object storage systems use erasure coding, a data protection technique superior to classic RAID and storage replication technologies. Erasure coding breaks an object up into many pieces, called shards, then calculates and maintains parity shards for them; the data shards and parity shards are then distributed across the underlying hardware infrastructure. This ensures continuity of data and data access in the case of data loss, component failure, or a site disaster.
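To make the idea concrete, here is a deliberately simplified sketch that uses a single XOR parity shard to rebuild one lost data shard; real object stores use Reed-Solomon-style codes with several parity shards spread across drives, nodes, and sites, so they can tolerate multiple simultaneous failures.

```python
def split_into_shards(data: bytes, k: int) -> list:
    """Split data into k equal-size data shards (zero-padded at the end)."""
    shard_len = -(-len(data) // k)  # ceiling division
    padded = data.ljust(shard_len * k, b"\x00")
    return [padded[i * shard_len:(i + 1) * shard_len] for i in range(k)]

def xor_parity(shards: list) -> bytes:
    """Compute one parity shard as the byte-wise XOR of all data shards."""
    parity = bytearray(len(shards[0]))
    for shard in shards:
        for i, byte in enumerate(shard):
            parity[i] ^= byte
    return bytes(parity)

def reconstruct(shards: list, parity: bytes) -> list:
    """Rebuild a single missing data shard (marked None) from the parity shard."""
    missing = shards.index(None)
    rebuilt = bytearray(parity)
    for idx, shard in enumerate(shards):
        if idx != missing:
            for i, byte in enumerate(shard):
                rebuilt[i] ^= byte
    shards[missing] = bytes(rebuilt)
    return shards

data_shards = split_into_shards(b"an object distributed across many disks", k=4)
parity_shard = xor_parity(data_shards)
data_shards[2] = None                       # simulate a failed disk or node
recovered = reconstruct(data_shards, parity_shard)
print(b"".join(recovered).rstrip(b"\x00"))  # the original content is recovered
```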

Not All Object Storage Systems Are the Same

While object storage systems have a lot in common, it is important to recognize that not all object storage systems are the same. In many systems, the concept of scale only extends to the terabyte range, and the reliability and availability capabilities do not meet the needs of massive enterprise deployments.

Performance at scale. Whereas scalability is a key attribute promoted by many vendors, implementations can suffer as the system scales, causing painful, performance-crushing data rebalancing across system resources. This forces customers to manage a fleet of object storage systems despite their original goal of a simple, scalable solution.

Recommendation: Seek real-world references from customers that are running single-namespace instances of at least a petabyte, and even larger if your capacity requirements extend into tens or hundreds of petabytes.

Efficiency. A well-designed system should be manageable at any scale, from terabytes to exabytes, by a single administrator on a part-time basis.

Recommendation: Go beyond the demo to truly understand the system’s design and the management toolset available. Skip the downloadable, UI-centric demoware and consider a ‘proof-of-concept’ deployment to validate the system design and its applicability to your use case.

Proactive data protection. Given the amount of data you are storing, its security, availability, and protection against failure are paramount.

Recommendation: Look for advanced proactive data integrity verification to guarantee data consistency over the long term and always-on, cloud-based monitoring to support your efforts.

An everlasting archive. Most object storage systems naturally grow over time and evolve to support additional use cases. The object storage solution you deploy is a key strategic asset for both mining and preserving your organization’s data assets for the long term.

Recommendation: Consider your object storage vendor as a strategic partner working with you to protect and extract value from your digital assets. Value vendor longevity and understand how well systems preserve your data and maintain data accessibility when transitioning to new generations of software and hardware.

About Quantum ActiveScale

Quantum ActiveScale is an enterprise-class object storage solution designed from the ground up for data durability, security, and availability of massive unstructured data sets, supporting use cases ranging from terabytes to exabytes in scale. With S3 compatibility and an optimized two-layer storage architecture, ActiveScale simply and cost-effectively scales performance and capacity without bound in support of analytical workloads, active archiving, long-term retention, and cold data storage. Learn more at: www.quantum.com/object-storage


Quantum Leads End-user Survey for Secondary Storage



In the most recent end-user survey from Coldago Research, Quantum leads all vendors as the top brand for secondary storage. Coldago’s April 2021 report surveyed over 2,000 end users across SMBs and enterprises about their brand preferences for a wide range of storage types and applications, and in secondary storage, Quantum is the leading brand.


With the growth in secondary storage, including object storage, Quantum offers a wide range of products to serve our customers’ needs. These market segments are poised for growth. According to IDC, “Object storage accounts for the largest part of the public cloud storage market and continued to experience significant growth (31% in 2019).”[1] With that level of growth, customers need a wide portfolio of secondary storage products.

End User Survey, Top US/EMEA brands for Secondary Storage

“Coldago Research unveils its end-user survey 2021 addressing perceptions, opinions, needs, and priorities. We also asked users to rank vendors in various categories, and for secondary storage, Quantum arrives 1st both in the USA and Europe, confirming the corporate and product strategy initiated a few years ago with new products, technology developments, acquisitions, and talent recruitment that has started to pay off,” said Philippe Nicolas, analyst at Coldago Research. “As a pioneer of secondary storage, Quantum continues to offer large, diverse, and broad offerings, from small configurations to very large-scale environments, whatever the technology, storage media, architecture, and access methods, making their solutions a truly universal, unified, and ubiquitous approach,” concludes Nicolas.

At Quantum, we’re pleased with the recognition that IT professionals have given us in this survey. Over the last few years, we’ve made numerous improvements to our entire portfolio. Here’s just a quick summary:

  • ActiveScale object storage – Multiple new releases that provide object locking, small object aggregation, and advanced troubleshooting.  New hardware included the P100E3 3-node platform for smaller customers and the new X200 platform for more performance and scale. 
  • Scalar tape – Introduction of multi-factor authentication builds upon the value of Active Vault, a feature available only in Quantum’s Scalar tape libraries to secure critical off-line data against ransomware attacks. The new feature makes unauthorized access extremely difficult, and underscores Quantum’s commitment to innovation for preserving and protecting valuable customer data.
  • DXi backup appliances – New features include Secure Snapshots and others developed in collaboration with Veeam, including multi-tiered backup and replication to Quantum DXi appliances, across sites, to the cloud, and to Scalar tape libraries.

“The Coldago report demonstrates strong customer satisfaction across the full line of Quantum’s secondary storage products,” said Bruno Hald, Quantum’s GM for Secondary Storage. “We are a customer-driven organization and have delivered a series of innovations for a full range of customers, from SMBs to larger enterprises to our hyperscale customers. We look forward to providing them with additional products and services to improve their bottom line.”


[1] Source:  IDC report, Cold Storage Services in the Public Cloud IaaS Market, #WC2020087, Sept 1, 2020

Learn more at https://www.coldago.com/ and read the source report: https://www.coldago.net/eusurvey


Announcing the H4000 with StorNext 7 – Fast, Collaborative Production Just Got Easier


If the new pace of production and the sheer sprawl of content and unstructured data that you need to manage, organize, and protect seem overwhelming, you’ll be glad to know that big help, in a small package, is on the way.

The Full Capabilities of the StorNext Platform Now Fit in Only 2U

Today, we’re pleased to announce the newest, fastest StorNext ever on a converged appliance from Quantum. StorNext 7 on the new H4000 appliance delivers a complete StorNext environment that fits in only 2 rack units. And when you can place the full capabilities of StorNext, the world’s fastest collaborative shared-file platform, in more places than ever before, it will change how and where you think about your critical workflows.

Take Your Solutions to the Edge – Move More of the Studio to the Set 

By converging all of the capabilities of StorNext, the world’s fastest collaborative shared-file platform, into a single appliance, you can get racked and running in minutes, whether you’re adopting StorNext for the first time or quickly adding more power and capability to your existing environment:

  • Bringing Your Entire Team Together Just Got Easier: Make the leap to a unified, end-to-end workflow for a new level of efficiency and faster time to delivery. 
  • Adding New Teams and Projects Anywhere Just Got Easier: Extend your existing StorNext environment quickly to a new office or for a new project without complex infrastructure buildout. 
  • Analysis and Insight at the Edge Just Got Easier: Move high-performance analysis and critical solutions closer to the point of capture for faster insight and action.
  • Bringing Post to Set Just Got Easier: Bring post-production to the set – from ingest, to pre-visualization, to rough cuts.  

The new H4000 truly lets StorNext 7, and your workflow, fly:

  • Ultrafast end-to-end architecture on PCIe 4 and the latest multi-core processors. 
  • New unified interface to manage the entire StorNext environment and the H4000.
  • Converged architecture to deliver all of the capabilities and services of StorNext. 
  • Flexible storage options to deliver 12 or 24 bays of storage and up to 160 TB or 307 TB. 
  • Flexible connection options to deliver your optimum mix of network and connection technologies including 100 Gb Ethernet and 32 Gb Fibre Channel.  

Join Our Live Talk on May 18! 

We’ll be taking you through all the features and all the ways you can get the most from StorNext 7 on the H4000 – join us here.

Until then, you can read more about the H4000 Series here, or contact your Quantum representative or Value-Added Reseller to review your workflow.


A Trifecta of Veeam-Ready Solutions


They say the most interesting things in life come in sets of threes. Certainly, some sets are more interesting and simpler than others, such as live, love, laugh, or rock, paper, scissors, or my favorite, the good, the bad, and the ugly. While not super philosophical, they are easy concepts to remember and, in some cases, simplify our daily lives. When we think of things that we need to simplify in a data centric environment, we think of data management, data security, and data protection. Quantum has made a commitment to provide leading solutions that simplify; this is clearly apparent as we continue to work with Veeam offering solutions around data management, security, and protection. Quantum has been a long-standing partner of Veeam and has been certified as Veeam Ready, a program that ensures compliance with functionality and performance standards put forth by Veeam.

As the evolution of Big Data continues, the three core concerns today are protecting, securing, and managing data. The industry has done a decent job of providing data protection. Data management is an emerging market segment focused on unstructured data. Recent hacking and ransomware attacks have brought concerns around data security to the forefront of people’s minds. Quantum has been a Veeam partner for a long time; since Veeam 9.5, Quantum DXi deduplication appliances and award-winning Scalar tape libraries have been certified with Veeam. Quantum’s most recent addition to the portfolio, ActiveScale Object Storage, has also been Veeam Ready (v10) certified. Here is the full portfolio that is Veeam 10 Ready certified:

Veeam Integrated DXi Deduplication Appliance

DXi backup appliances offer cost-effective, fast backups and restores for mission-critical production data. Our customers love the DXi for many reasons, but the feedback we often hear is that DXi replication starts just after the first file is backed up, without waiting for the end of the Veeam task. As a result, the period of risk (the gap between when a backup is done and a copy is safely off site) is incredibly short, unlike some competitors’ solutions that start replication only when the Veeam task is done. In v10, the Fast Clone feature improves synthetic full performance by referencing existing data blocks on volumes instead of copying data blocks between files, copying blocks only when files are modified. We have seen up to a 15x speed improvement in our test labs and customer deployments, in some cases reducing creation time from hours to minutes.

Veeam Ready Scalar Tape Library

The Quantum Scalar tape solution addresses compliance, long-term retention requirements, and ransomware protection. The Quantum Scalar library is the only tape system on the market that offers Active Vault, a hands-off approach to managing media between an active and a vaulted partition that is completely disconnected from the network. This enables our customers to fully air-gap their data and thwart the effects of ransomware or other malware.

Veeam Ready Object Immutability – ActiveScale Object Storage

ActiveScale is Quantum’s object storage system that provides a data-forever storage infrastructure. The ActiveScale system architecture accommodates storing primary copies of multi-petabyte-scale datasets with high durability and facilitates long-term data retention with minimal administrative overhead. With a Veeam Ready Object Immutability certification, Quantum solutions offer clients the ability to leverage our appliances efficiently through the Veeam GUI.

For more information on testing criteria, or for detailed information about how Quantum’s portfolio can help you meet your data protection requirements, visit our Quantum & Veeam web page or talk to one of our solution representatives.


VirtualQ I Transform Recap: Helping you Unlock the Value of Unstructured Data


As we head into the holiday season, it’s a good time to recap some of the major things we’ve introduced this year, and in many ways our recent VirtualQ I Transform event was the culmination of a year’s worth of effort in bringing new products and solutions to market.

In November, our VirtualQ I Transform event focused on how unstructured data growth, and unlocking the business value in this data, is transforming how our customers think about data. This event was the culmination of the launch of the expanded portfolio we announced on November 10, focused on managing, storing, and protecting unstructured data across its lifecycle. You can access all of the sessions and watch them at your convenience here.

Most Popular Topics During VirtualQ I Transform

Based on the hundreds of attendees across many different sessions hosted over two days (one European event, and one event for the Americas), here are some of the topics that resonated most and generated the most interest:

Unstructured Data Sprawl: Many of our customers are outgrowing their capacity and looking at new ways to manage ‘unstructured data sprawl.’ We are helping these customers gain better visibility into what data they have and where it lives, and automating ways to move data between edge, core, and cloud environments. The new automated data movement we’ve added to StorNext 7 and ATFS can help in this regard, and both were among the key things we introduced.

ATFS Storage Platform: There were many questions about our new ATFS storage platform, an easy-to-use, cost-effective NAS platform with integrated data classification. This data classification is a major advancement in giving customers visibility into the file data they are storing. With simple queries, customers can gain new insights into the make-up of their file data and where it lives, and then quickly use those insights to drive actions that can be enforced with data placement policies, data protection policies, and storage quotas.

ATFS Trial Download: As we mentioned during our event, anyone interested in learning more about the ATFS platform, and in starting to classify the file data you have today, can download a full version of ATFS here. It only takes a few minutes, and you can start to see the impact this platform can make on your NAS infrastructure.

Ransomware Protection: There was also a lot of interest in new ways and best practices to protect against ransomware, and in the security of data in general. This continues to be top of mind for many of our customers, and we now have options to protect against ransomware regardless of your technology choice and regardless of the application you are using. We’ve introduced Object Lock to our ActiveScale object store software, we are adding a similar snapshot retention lock feature to our DXi backup appliances, and tape remains the most secure storage with innate air-gapped protection, Active Vault software to automatically move tapes to an in-library vault, and security capabilities like AES 256-bit encryption and multi-factor authentication.
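As a rough illustration of how an application uses Object Lock over the S3 API, the hedged sketch below writes a backup object with a compliance-mode retention date using boto3; the endpoint, bucket name, object key, and credentials are hypothetical placeholders, not actual ActiveScale or Veeam values.

```python
import datetime
import boto3

# Connect to an S3-compatible object store with Object Lock support
# (endpoint and credentials are placeholders).
s3 = boto3.client(
    "s3",
    endpoint_url="https://activescale.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Object Lock must be enabled when the bucket is created.
s3.create_bucket(Bucket="backup-repository", ObjectLockEnabledForBucket=True)

# Store a backup object that cannot be deleted or overwritten until the
# retention date passes, even by an administrator (COMPLIANCE mode).
with open("backup-2021-05-01.vbk", "rb") as backup_file:
    s3.put_object(
        Bucket="backup-repository",
        Key="backup-2021-05-01.vbk",
        Body=backup_file,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.datetime(
            2022, 5, 1, tzinfo=datetime.timezone.utc
        ),
    )
```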

“Forever” Data Archives: Lastly, there was continued interest in the emerging realization that much of this valuable data being created and analyzed will need to be kept forever in a private cloud. Our solutions for this “forever” data archive continue to drive interest, from object storage software with erasure coding, to RAIL tape, to ways to index and search the content in these archives. Look for some exciting announcements in this area in the coming weeks and months. You can check out the “Forever Archive” white paper here.

Looking Ahead

We are continuing to innovate and transform our own business with a focus on solving our customers’ most pressing business challenges in the areas of video, digital images, and other forms of unstructured data, and we can’t wait for 2021! 

Watch VirtualQ I Transform Sessions On-Demand

If you missed the event, watch replays at your own pace to learn more about managing, storing, and protecting unstructured data across its lifecycle.

Watch Sessions On-Demand Now!
