Hot Topic: Leonovus Exec On The Challenges Of ‘Cold Data’
Joseph F. Kovar
While the vast majority of data is considered cold data that is seldom if ever accessed after 30 days, it still needs to be managed according to security, compliance, and corporate mandate requirements, all of which inject complexity into the process.
The vast majority of business data is seldom if ever accessed 30 days or more after it is created, setting up a challenge for businesses to better manage that data by moving it where costs are lower while still being able to access it if needed.
That's the word from Tim Bramble, vice president of product management at Leonovus, the Ottawa, Ontario-based developer of technology for discovering and managing data across multiple media types.
Bramble, speaking before an audience of MSPs at this week's NexGen 2019 conference in Anaheim, Calif., cited IDC studies that showed that, while total data stored in 2018 amounted to about 33 zettabytes, by 2025 that number will blossom to 175 zettabytes.
NexGen is produced by The Channel Company, which is also the parent company of CRN.
About 80 percent of that data is unstructured, and that portion is growing at a compound annual growth rate of 30 percent to 60 percent, Bramble said.
From 75 percent to 90 percent of data is what Bramble called cold data, or data that is rarely or never accessed after it has been stored for a certain amount of time.
Data cools quickly after 30 days, Bramble said. "By the time the month ends, that data is warm," he said. "Within 90 days, it gets cold. I've seen reports that only 5 percent of cold data is ever accessed again."
There are several reasons why cold data is kept, Bramble said. Some companies are data hoarders who just automatically keep everything, some do so for regulatory compliance reasons, and some do so because of corporate mandates. He cited as an example of the latter seismic data, which may not be active but might be reused for analysis in the future.
There are several reasons why cold data does not get managed properly by classifying it as cold and then moving it to lower-cost storage media, Bramble said.
The biggest reason is the complexity of managing cold data, followed by the three-year to five-year technology refresh cycles and the difficulty in predicting what kind of storage capacity is needed in the future.
"If we resign ourselves to the fact that we can't classify it, we need a better way to store it," he said.
Bramble introduced his company's Leonovus Smart Filer, a software-based tool for managing cold data. Leonovus Smart Filer analyzes file storage to determine which data is cold, and automates the offloading of that data to the cloud or other low-cost media based on the last-accessed date and the file type. It supports multiple clouds, enables continued access to the data, and does not require any changes to end-user or application behavior, he said.
Smart Filer works by crawling the data, reporting on which data is hot and which is not, and allowing clients to set policies such as migrating data that has not been accessed in 90 days, Bramble said.
As data is moved to cold data storage, the software leaves a link on the original storage media so that the user can access it as if it were stored locally but with slightly more latency, Bramble said. "The cloud providers are getting really good at performance," he said.
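The pattern Bramble describes — classify files by last-accessed date, move cold ones to cheaper storage, and leave a link behind so existing paths still resolve — can be sketched in a few lines of Python. This is purely an illustrative sketch of the general tiering technique, not Leonovus' implementation; the function name, the 90-day policy constant, and the use of a local "archive" directory standing in for cloud storage are all assumptions for the example.

```python
import shutil
import time
from pathlib import Path

COLD_AGE_DAYS = 90  # example policy: migrate files untouched for 90 days


def migrate_cold_files(source_dir, archive_dir, cold_age_days=COLD_AGE_DAYS):
    """Move files not accessed within `cold_age_days` to `archive_dir`,
    leaving a symlink behind so existing paths keep working (with the
    extra latency of the colder tier)."""
    cutoff = time.time() - cold_age_days * 86400
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    migrated = []
    # Materialize the listing first so moving files doesn't disturb the walk.
    for path in list(Path(source_dir).rglob("*")):
        if not path.is_file() or path.is_symlink():
            continue  # skip directories and already-migrated stubs
        if path.stat().st_atime < cutoff:  # last access predates the cutoff
            target = archive / path.name  # sketch only: assumes unique names
            shutil.move(str(path), str(target))
            path.symlink_to(target)  # stub link at the original location
            migrated.append(path)
    return migrated
```

In a real deployment the "archive" side would be a cloud object store rather than a local directory, and the stub would be managed by the filing software rather than a bare symlink, but the classify-move-link flow is the same.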
Alan Kasmaki, CEO and founder of BizConnectors, a Newport Beach, Calif.-based solution provider, told CRN that it is important for clients to understand the concept of cold data, especially when only 10 percent or 15 percent of stored data will likely ever be accessed after 30 days of being stored.
"Of course our customers have cold data," Kasmaki told CRN. "Who doesn't?"
For BizConnectors' clients, cold data is typically retained for security reasons, Kasmaki said. "While the cloud has advantages as a place to store cold data, customers still like to keep it in-house," he said. "We have to convince them to see the advantages of better managing it."