There’s no shortage of research on data and the cloud these days, but what should we make of all these surveys? We gathered some of the latest data points and drew conclusions to help IT and data storage leaders tackle difficult storage and unstructured data management decisions in the coming year.
Welcome to the data economy
Data runs our world, delivered by apps, devices, websites and snappy digital processes. But is it really working? Consider for a moment how much unstructured data we are generating and then compare that to how much is actually being used for analytics and decision-making — which by some accounts is less than 5%.
By now, we’re all familiar with the oft-cited statistic from IDC Research that data volumes will grow from more than 70ZB currently to 175ZB by 2025. Keep in mind: this is just three years from today. Data generated at the edge is a notable contributing factor. In 2022, the market for the Internet of Things is expected to grow 18% to 14.4 billion active connections and by 2025, IoT Analytics predicts there will be 27 billion connected IoT devices.
You can’t see data piling up like you might see garbage in a landfill.
Data growth is a strangely invisible phenomenon, yet when it comes to the impact on IT budgets and staff, the pain is real every day. Moreover, the lack of visibility into unstructured data, combined with the complexity of managing and analyzing it, is a lost opportunity.
Let’s take a look at some of the salient data points on data and data storage for some context:
According to research by Dell, 43% of IT decision makers fear their IT infrastructure won’t be able to handle future data demands. Naturally, this concern is driving data migrations to the cloud. Cybersecurity Ventures predicts data stored in the cloud will reach 100ZB by 2025, or 50 percent of the world’s data at that time, up from approximately 25 percent in 2015. With so many business applications now running in the cloud, thanks to SaaS maturity, moving unstructured data out of corporate data centers and into the cloud is the next logical step in enterprise cloud computing.
Of course, a cloud data migration is not a foolproof cost-saving strategy, given the many variables at play — from selecting the right cloud storage to continually moving data to the right place, avoiding excess egress fees and getting rid of cloud resources when they are no longer in use.
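To make the cost variables above concrete, here is a minimal back-of-the-envelope model comparing keeping everything on a hot tier versus tiering most data to cheaper storage. All prices, volumes and recall rates are hypothetical placeholders, not quotes from any cloud provider, and real bills include many more line items (requests, retrieval fees, minimum storage durations).

```python
# Illustrative-only cost model for a tiering decision.
# Every number below is a made-up placeholder, not a real provider's price.

def monthly_cost(tb_stored, price_per_tb, tb_egressed=0.0, egress_per_tb=0.0):
    """Monthly cost of one storage tier: capacity charge plus egress charge."""
    return tb_stored * price_per_tb + tb_egressed * egress_per_tb

# Scenario: 500 TB of unstructured data.
hot_tier = monthly_cost(500, 23.0)            # everything on hot storage at $23/TB/mo

tiered = (monthly_cost(100, 23.0)             # 100 TB of active data stays hot
          + monthly_cost(400, 4.0,            # 400 TB of cold data at $4/TB/mo
                         tb_egressed=8.0,     # ~2% of cold data recalled per month
                         egress_per_tb=90.0)) # egress/retrieval at $90/TB

savings = hot_tier - tiered
print(f"hot only: ${hot_tier:,.0f}/mo, tiered: ${tiered:,.0f}/mo, "
      f"savings: ${savings:,.0f}/mo")
```

Even with a generous egress assumption baked in, the tiered scenario comes out well ahead here, which is why getting the recall-rate estimate right matters more than the headline per-TB price.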
Respondents to the Flexera State of the Cloud 2021 Report estimate that their organizations waste 30 percent of cloud spend. Unsurprisingly, the Flexera study reports that optimizing the existing use of cloud (cost savings) is the top initiative for the fifth year in a row, followed by migrating more workloads to cloud and better financial reporting on cloud costs.
Equally troubling is the lack of visibility as data growth explodes. Less than half of respondents (45%) have a strong understanding of data generated outside their own team, according to IDC’s “Data Literacy: A Foundation for Succeeding in a Data-Driven World.”
This lack of data visibility is not trivial, and it can hamper many initiatives, from security and compliance to decision-making, customer experience and innovation. A recent Accenture study revealed that 68% of companies are unable to realize tangible and valuable benefits from their data. Considering how much time and money are spent managing and storing all of this data, that is a poor ROI.
Yet the organizations that can harness data, manage it appropriately across silos and understand it at a granular level are best positioned to leverage it for success. Organizations that take a data-driven approach to decision-making grow more than 30% annually, according to Forrester.
Ten years ago, the cost and complexity of running big data analytics was a nonstarter for many organizations. But that is changing quickly now that cloud-based AI and ML technologies, along with data lakes and data lakehouses, have matured. The availability of more automated, scalable and affordable analytics platforms lowers the barrier to entry and raises the overall odds of success for enterprises taking the leap into big data. No-code AI is no joke.
Here’s our analysis of what this research means for IT leaders striving (or being pressured from above) to monetize data and support departments in data-driven initiatives:
- IT organizations don’t have the infrastructure to keep up with the data deluge — they’ve got data swamps and “dark data” silos plus out-of-sight storage technology bills.
- Cloud data migrations are a smart idea to cope with strained data center storage capacity — yet there is far too much waste and not enough intelligence on best practices.
- Leading organizations want to leverage data to disrupt their markets and drive customer loyalty, yet despite the plethora of advanced AI and ML services and technologies for rapid analysis and processing, data is stagnant and merely clogging the drain.
- If you’re managing data like you always have, it’s time to rethink that approach. With continual analytics on your data, you can understand it better rather than treating it all the same. You can also find and move the right data to modern big data tools and analytics services, the tools that managers and researchers crave.
Enter Komprise. Komprise was founded to help organizations take control of their unstructured data to both save money and make money.
Komprise Intelligent Data Management delivers:
- Continual analysis of NAS and object data across on-premises, edge and cloud storage, so you can move data to the right place at the right time for maximum data storage cost savings.
- Cost modeling so that IT can compare the data storage cost savings of moving data to different tiers of storage.
- Smarter cloud data management: intelligent tiering of cold data to low-cost secondary storage by policy, retaining active or “hot” data on primary, high-performing storage in the data center or the cloud.
- Smart Data Workflows and the Global File Index create automated workflows for all the steps required to find the right data across your storage assets, tag and enrich the data, and send it to external tools for analysis.
- Zero user disruption, full file fidelity and direct, native access to moved data.
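The tiering-by-policy idea in the list above can be sketched in a few lines. This is a simplified illustration using a single last-access-age rule; the record fields, function names and the 365-day threshold are hypothetical examples, not Komprise’s actual API, policy engine or defaults.

```python
# Minimal sketch of policy-based cold-data tiering using a simple
# last-access-age rule. All names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    size_bytes: int
    days_since_access: int

def plan_tiering(files, cold_after_days=365):
    """Split files into (keep_hot, move_cold) by last-access age."""
    hot = [f for f in files if f.days_since_access < cold_after_days]
    cold = [f for f in files if f.days_since_access >= cold_after_days]
    return hot, cold

files = [
    FileRecord("/proj/model.ckpt", 2_000_000_000, 12),
    FileRecord("/proj/2019/raw.tar", 50_000_000_000, 900),
    FileRecord("/proj/2020/logs.gz", 7_000_000_000, 400),
]
hot, cold = plan_tiering(files)
cold_bytes = sum(f.size_bytes for f in cold)
print(f"keep hot: {len(hot)} file(s), tier off: {len(cold)} file(s) "
      f"({cold_bytes / 1e9:.0f} GB)")
```

A real system layers on much more (multiple policies per share, file-type and owner filters, transparent recall of moved files), but the core decision is this kind of metadata-driven split, applied continuously rather than once.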