At the midway point of 2021, it’s a time of reckoning for human progress. The Covid-19 pandemic has abated dramatically in some countries and regions, while dangerous variants persist in Asia, Latin America, Africa and parts of Europe. There is still troubling uncertainty for business and personal life, yet human inventions march on.
Elon Musk’s SpaceX venture announced plans earlier this year to send the first-ever all-civilian crew into orbit before year’s end. Other technology advancements across personal gadgets and enterprise computing have progressed without delay in the past 18 months; we’d all have had a far worse time without connectivity and our personal devices during the pandemic. In a recent Future article, Marc Andreessen talked about the impact of technology on how organizations fared during 2020: “Much of the economy kept operating, and in fact many parts of the economy started operating even better under lockdown than before.”
This leads to the first of the headlines for our industry update today:
Selling the code that changed it all.
Can you even remember the time before the invention of the Internet? It’s fuzzy or non-existent, depending upon your age. This is why Sotheby’s late-June auction of the early World Wide Web source code, which fetched $5.4 million with fees, seems paltry when you consider the unquantifiable impact of the internet on our world today. To compare, a 500-year-old painting of Christ believed to have been painted by Leonardo da Vinci sold at Christie’s in New York for a record $450 million in 2017.
Now let’s take a look at the latest happenings in the world of data management and storage:
It’s the data, stupid.
This article in Datanami explores the emerging big data conundrum: the problem is no longer so much the technology, as machine learning and AI applications, and the infrastructure to run them, have advanced greatly.
Nowadays, what CIOs and chief data officers need to focus on is how exactly to sort, manage and leverage massively large and complex volumes of data: “The average number of data sources used by organizations is 27, with a high of 90, according to a recent study by Precisely.”
About 75% of the chief data officers (CDOs) surveyed said that dealing with multiple data sources and complex data formats is very or quite challenging, the author reports. Another survey cited in Datanami found that 96% of data professionals are at or over capacity. Automated tools for security, privacy, backups, governance, cleansing and analytics will be part of the answer to help data management and data science teams get ahead.
Back it up to the cloud.
This TechTarget article discusses the increasingly common use case of NAS backups to the cloud, with some advice on different ways to go about it. The author also goes into a few of the drawbacks of file backups, including data residency. When a customer can’t be certain that a cloud service provider won’t store [their data] within restricted areas, they will revert to controlling the storage themselves, which often means going back to tape or setting up a private cloud, according to Fred Chagnon, principal research director at Info-Tech Research Group, who was interviewed for the article.
Chagnon’s second warning is about data egress costs, saying: “It’s like a toll highway for your data; it’s free on the way in, but you pay on the way out.”
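Chagnon’s toll-highway analogy is easy to put in concrete terms. As a rough sketch (the per-GB rates below are illustrative assumptions, not any provider’s actual pricing):

```python
# Illustrating the "toll highway" pricing model for cloud data transfer:
# moving data in is typically free, while reading it back out is billed per GB.
# Both rates here are assumed for illustration only.

INGRESS_PER_GB = 0.00   # most providers do not charge for data moving in
EGRESS_PER_GB = 0.09    # assumed per-GB egress rate

def transfer_cost(gb_moved_in: float, gb_read_back: float) -> float:
    """Estimate transfer fees for a simple backup-and-restore cycle."""
    return gb_moved_in * INGRESS_PER_GB + gb_read_back * EGRESS_PER_GB

# Backing up 10 TB incurs no transfer fee, but restoring
# (or simply re-reading) that same 10 TB generates a bill.
print(f"backup:  ${transfer_cost(10_000, 0):,.2f}")       # $0.00
print(f"restore: ${transfer_cost(10_000, 10_000):,.2f}")  # $900.00
```

The takeaway is that the bill scales with how often data is read back, which is why architectures that avoid repeated egress matter.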
Read how Komprise mitigates those cloud egress fees by enabling direct access to data stored in the cloud.
Speaking of backups, here’s a new book.
Blocks and Files penned a review of “Modern Data Protection: Ensuring Recoverability of All Modern Workloads,” by W. Curtis Preston, aka Mr. Backup. The book aims to explain what has become an increasingly twisted area of data management, as summarized by the reporter: “The complicating rot started with the public cloud, extended to SaaS application providers like Salesforce, and then went bananas with containerised applications and backup-as-a-service. All of a sudden the backup sources multiplied and the backup targets did likewise.” Never a dull moment in storageland!
Need a quote for your cloud move?
SearchCIO drew upon the advice of John Burke from Nemertes Research to help IT leaders understand the costs of cloud migrations, which include preparing for the migration, the costs of cloud migration itself and the post-migration operating costs of the migrated workloads. There is quite a bit to consider here, from staffing, training and new tools to disaster recovery and the need to maintain dual environments for an indefinite period of time. Chew on this: “On average, enterprises spend about 12% more to run a workload in IaaS than they do running it in their own data center. That average encompasses both workloads on which they manage to achieve great savings and those on which costs double or triple when compared to the cost of providing the service themselves. The more work that is lifted and shifted without modification, the more likely it is that the CIO will have to budget for cost increases.”
Moving to the cloud is still the right tactic in many cases and for many reasons, despite the cost uncertainties. IT leaders will need to balance organizational gains against potential cost increases in some areas to achieve the long-term goals made possible by moving away from capital-intensive, non-agile ways of managing IT in the on-prem world. It’s wise to deeply analyze which cloud migration options can incur more costs than others. Read more here.
The latest from Komprise
In June, Komprise released the latest version of its platform, which focused on global management with multisite controls. Read all about Komprise Intelligent Management 4.0 here in the blog. As part of our coverage for the release, The Next Platform interviewed our COO and president Krishna Subramanian, who delivered a high-level overview of the data management challenges we are working to meet on behalf of our customers. Krishna was also honored by the Silicon Valley Business Journal in June, as a Top Woman of Influence in Silicon Valley. See how else the media is covering us on our News Page.
On July 22, we’ll be meeting with data storage experts from AWS and Pfizer to learn how Komprise helped Pfizer stop 20 years of increasing storage costs and leverage the data tiered to AWS for research, all without changing how users and applications access their files. Register here.
Also this month, get ready for another fast and fun demo with Komprise product experts, who will deliver a short demo of selecting and running data migrations and show how you can achieve 7-25x faster performance with your migrations. Register here.
You can peruse other technical sessions on our Events page, including “Cloud Data Migration and Cloud Data Tiering: Know Your Choices,” and “Cloud-to-Cloud Data Replication.”