
Maximizing the value of AI deployments requires rethinking storage strategy

For AI and machine learning technologies to do their important work, they need data – lots of data. The same goes for big data analytics. The amount of data created and replicated worldwide was expected to exceed 70 zettabytes last year (for context, a zettabyte is a 1 followed by 21 zeros) – and that number continues to skyrocket.

The information and insights contained in this data can be extremely valuable, often capturing unique and irreplaceable events. That means the data must be stored securely, which is driving demand for storage capacity.

But while data generation is expected to grow at a global compound annual rate of 23% through 2025, according to IDC analysts, that pace exceeds the 19.2% growth in global storage capacity. How can companies tackle this conundrum? It requires a new approach to storage – one that is secure, reliable, and infinitely scalable. Enter fast object storage.

Let’s look at three data-intensive use cases that could benefit from such an approach.

Digging into digital pathology

Digital pathologists work with massive amounts of data. A single whole-slide image can be a gigabyte or more in size, and that single slide, once analyzed, can generate a massive amount of derived data. The field has long struggled to make better use of these huge volumes – to automatically detect pathologies in tissue samples, perform remote diagnostics, and more. But current storage paradigms limit what is possible: images at diagnostically useful resolution are too large to store economically.

Fast object storage, however, enables new capabilities – image banks that can serve as a key training resource, and the use of space-filling curves to name, store, and retrieve multi-resolution images in an object store. It also allows for extensible and flexible metadata markup, making this information easier to find and understand.
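To make the space-filling-curve idea concrete, here is a minimal sketch of how tile coordinates from a multi-resolution slide image might be mapped to object keys using a Z-order (Morton) curve. The key layout, the slide identifier, and the tile-addressing scheme are illustrative assumptions, not a specific product's API:

```python
# Minimal sketch: naming whole-slide image tiles in an object store with a
# Z-order (Morton) space-filling curve, so that tiles that are near each
# other in 2-D space end up near each other in key order.

def morton_encode(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y into a single Z-order code."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # even bits come from x
        code |= ((y >> i) & 1) << (2 * i + 1)   # odd bits come from y
    return code

def tile_key(slide_id: str, level: int, x: int, y: int) -> str:
    """Build an object key with one prefix per slide and resolution level.
    Because the Morton code preserves spatial locality, listing a key range
    retrieves a contiguous region of the slide at that resolution."""
    return f"slides/{slide_id}/level{level:02d}/{morton_encode(x, y):010d}.tiff"

# Example: the key for tile (x=3, y=5) at zoom level 2 of a hypothetical slide
print(tile_key("case-0042", 2, 3, 5))  # slides/case-0042/level02/0000000039.tiff
```

The payoff of this naming scheme is that fetching all tiles in a viewer's window becomes a small number of range listings rather than thousands of scattered lookups.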

The time-sensitive travel industry

After two years of closures and movement restrictions, the travel industry is working hard to return to pre-pandemic levels. That recovery will depend on a better way to apply and use data.

Consider the value of knowing where most of the world's air travelers plan to go next – or even where they are headed tomorrow. That kind of insight would be huge for a travel agency. But sorting through the volumes of data is a massive undertaking: the industry generates around a petabyte of data every day, and some of it is duplicated across aggregator sites like Kayak. The data is time-sensitive, so travel companies need to quickly discover which of it is meaningful. They need a way to manage this level of scale more efficiently.
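As a rough sketch of the kind of filtering this implies – with the event fields, the one-hour freshness window, and the fingerprinting scheme all assumed for illustration – a first pass might deduplicate search events arriving from multiple aggregators and discard anything too stale to act on:

```python
import hashlib
import time

# Illustrative assumption: events carry origin, destination, depart_date,
# and a Unix timestamp. One hour is an arbitrary freshness cutoff.
FRESHNESS_WINDOW_S = 3600

def event_fingerprint(event: dict) -> str:
    """Hash the fields that identify a duplicate search, regardless of
    which aggregator site the event arrived from."""
    basis = f"{event['origin']}|{event['destination']}|{event['depart_date']}"
    return hashlib.sha256(basis.encode()).hexdigest()

def fresh_unique_events(events: list[dict]) -> list[dict]:
    """Drop duplicates and stale events in a single pass."""
    seen: set[str] = set()
    now = time.time()
    kept = []
    for event in events:
        fp = event_fingerprint(event)
        if fp in seen or now - event["timestamp"] > FRESHNESS_WINDOW_S:
            continue
        seen.add(fp)
        kept.append(event)
    return kept
```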

The safety-driven automotive industry

One use case that grabs some of the biggest headlines is autonomous vehicles. Today's cars are computers on wheels, and the industry has worked hard on assistive features like lane keeping, collision avoidance, and the like. These features rely on sensors, all of which produce large amounts of data. And that is before counting the data involved in developing, testing, and verifying self-driving algorithms.

To get the most out of this stored data, the automotive industry needs a more efficient way to analyze it – to understand incidents where something went wrong, organize sensor outputs into test cases, and replay those cases against new versions of the algorithms. Old storage approaches can't keep up at this scale.
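A minimal sketch of that replay workflow might look like the following. The frame format, the `detect_obstacles` function, and the idea of gating a build on a case library are all hypothetical, shown only to illustrate turning recorded sensor output into regression tests:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    incident_id: str
    frames: list[dict]               # recorded sensor frames around the incident
    expected_detections: list[str]   # what the algorithm should have seen

def replay(case: TestCase, detect_obstacles) -> bool:
    """Run a candidate algorithm over the recorded frames and check that
    every expected detection appears at least once."""
    found: set[str] = set()
    for frame in case.frames:
        found.update(detect_obstacles(frame))
    return set(case.expected_detections) <= found

# Hypothetical usage: load incident frames from the object store, then gate
# a new algorithm build on the full library of recorded cases.
# passed = all(replay(case, new_model.detect) for case in case_library)
```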

A new framework for AI workloads

These three use cases alone underscore the importance of being able to aggregate and orchestrate large amounts of data for AI/ML workloads. Today's datasets often scale to several petabytes, with performance demands that can saturate the entire infrastructure. To get the most out of this data, businesses must overcome storage bottlenecks and capacity limitations.

AI/ML and deep learning (DL) workloads require a new storage framework, one that keeps data flowing through the pipeline with both excellent raw I/O performance and the ability to scale capacity. Storage infrastructure must keep pace with increasingly demanding requirements at every stage of the AI/ML/DL pipeline. This calls for a fast object storage solution designed specifically for speed and unlimited scalability.
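One common way pipelines extract that raw I/O performance is by overlapping many object reads instead of fetching serially. The sketch below assumes an S3-compatible endpoint; the bucket name, key manifest, and parallelism level are illustrative, not prescriptive:

```python
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.client("s3")  # works against any S3-compatible object store

def fetch(key: str) -> bytes:
    """Read one training object; many of these run in flight at once."""
    response = s3.get_object(Bucket="training-data", Key=key)
    return response["Body"].read()

def stream_objects(keys: list[str], parallelism: int = 32):
    """Yield object payloads as they arrive, hiding per-request network
    latency behind many concurrent GETs so the GPUs stay fed."""
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        yield from pool.map(fetch, keys)

# Hypothetical usage, with manifest_keys listing the training objects:
# for payload in stream_objects(manifest_keys):
#     feed_to_training(payload)
```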

Realize business value

Use cases continue to emerge from the inherent possibilities of AI and ML – use cases that can change not just the way business is done, but everyday life as well. Some have already been tested and proven beneficial. Enterprise AI initiatives, however, typically involve large datasets and storage solutions that cannot handle them. Without a solution to the storage problem, the automotive, healthcare, and other industries cannot deliver the innovations they are working on. Fast object storage helps businesses shoulder the burden of big data retention so they can extract the insights they need from their troves of data and realize real business value.


Written by Brad King, CTO of Scality.
