Get Enterprise Data to the Cloud Without Losing Your Mind


It was just a few years ago, post-Covid, that cloud repatriations were the talk of the town. But the cloud is hot again, and we can thank widespread cost-savings mandates and AI acceleration for that.

  • Recent research from Flexera found that only 21% of cloud workloads were repatriated in the last year, and that organizations will spend 28% more on the cloud over the next year.
  • Gartner backed up the cloud’s popularity with its prediction that spending on public cloud services will reach $723.4 billion this year, up from $595.7 billion in 2024.

Yet according to Flexera, the reputable chronicler of cloud trends for more than a decade, organizations are exceeding their cloud budgets by 17%. Overspending isn’t surprising, given the ongoing complexity in executing large-scale cloud data migrations. Komprise Field CTO Benjamin Henry wrote about this topic recently in CloudTweaks.

How important are cloud data migrations for enterprises today?

Benjamin Henry: These days, with so much data being generated across the enterprise by countless apps and systems, IT leaders need to adopt new storage infrastructure technologies frequently for cost, security, governance, performance and AI needs. Hybrid cloud is a common enterprise architecture for meeting these diverse needs while retaining flexibility, but it also drives up complexity. Cloud migrations are at the center of many IT strategies today, but they can also result in performance bottlenecks, data loss and delays, compliance concerns and poor ROI.

What makes migrating unstructured data such a pain in the neck?

BH: Unlike structured data that fits neatly into databases, unstructured data lacks a consistent schema, is spread across many locations and different environments, and is used by various departments for diverse purposes. This makes large-scale migrations more than just a “lift and shift” operation.

• Massive file counts and large volumes of small files can overwhelm traditional scanning and indexing processes, causing unexpected delays (see the scanning sketch after this list).
• Network interruptions, file locks, or system errors can derail transfers and result in data loss.
• Legacy tools or free utilities often fail to scale beyond a few hundred terabytes.
• Limited insight into data usage means that IT may misplace cold data in expensive cloud storage or, conversely, store frequently accessed data in slower tiers.
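To make the file-count problem concrete, here is a minimal sketch of a parallel pre-migration scan using only the Python standard library. The mount point and worker count are hypothetical, and files sitting directly under the root are ignored for brevity; the point is simply that walking directory trees in parallel, rather than serially, is what keeps an inventory of millions of small files from stalling.

```python
# Minimal sketch: parallel directory scan to gauge file counts and total size
# before a migration. The root path and worker count are hypothetical.
import os
from concurrent.futures import ThreadPoolExecutor

def scan_tree(top):
    """Count files and bytes under one top-level directory, iteratively."""
    files, size = 0, 0
    stack = [top]
    while stack:
        path = stack.pop()
        try:
            with os.scandir(path) as entries:
                for entry in entries:
                    if entry.is_dir(follow_symlinks=False):
                        stack.append(entry.path)
                    elif entry.is_file(follow_symlinks=False):
                        files += 1
                        size += entry.stat(follow_symlinks=False).st_size
        except OSError:
            pass  # unreadable directories are skipped rather than fatal
    return files, size

ROOT = "/mnt/nas_share"  # hypothetical source share
top_dirs = [e.path for e in os.scandir(ROOT) if e.is_dir(follow_symlinks=False)]

# Scan each top-level directory on its own thread (files directly under
# ROOT are ignored here to keep the sketch short).
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(scan_tree, top_dirs))

total_files = sum(f for f, _ in results)
total_bytes = sum(s for _, s in results)
print(f"{total_files:,} files, {total_bytes / 1e12:.2f} TB")
```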

What’s the first step IT leaders should take?

BH: In large, distributed organizations, data is often spread across silos, legacy storage, cloud storage, and even servers under desks or in closets, which makes full visibility elusive. Run a full discovery to understand what data you have, where it lives, who owns it, how often it’s accessed, and what types of files you’re dealing with. That helps determine what needs to move, what should be archived, and what can be deleted. It’s also the foundation for setting priorities and building a phased migration plan.
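As one illustration, a discovery pass can start as simply as the sketch below, which walks a hypothetical mount point with the Python standard library and writes a per-file inventory (size, owner, last access, extension) to a CSV. The path, output file, and field choices are assumptions for illustration; purpose-built tools add share discovery, ownership mapping and scale, but the inventory they produce answers the same questions.

```python
# Discovery sketch: build a per-file inventory (size, owner, last access,
# extension) as a CSV that later tiering decisions can use.
# The mount point and output path are hypothetical placeholders.
import csv
import os
import pwd  # POSIX-only; owner lookup works differently on Windows
from datetime import datetime, timezone

ROOT = "/mnt/legacy_filer"   # hypothetical source to inventory
OUT = "inventory.csv"

with open(OUT, "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["path", "bytes", "owner", "last_access", "extension"])
    for dirpath, _, filenames in os.walk(ROOT):
        for name in filenames:
            full = os.path.join(dirpath, name)
            try:
                st = os.stat(full, follow_symlinks=False)
            except OSError:
                continue  # skip files that vanish or are unreadable
            try:
                owner = pwd.getpwuid(st.st_uid).pw_name
            except KeyError:
                owner = str(st.st_uid)  # uid with no local account mapping
            atime = datetime.fromtimestamp(st.st_atime, tz=timezone.utc).date()
            writer.writerow([full, st.st_size, owner, atime.isoformat(),
                             os.path.splitext(name)[1].lower()])
```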

What are some risks that companies overlook?

BH: The loss or corruption of metadata, file permissions, timestamps and access control lists is common; those attributes are often stripped or mishandled by basic tools. Another is underestimating the load on bandwidth, conflicts with network and security configurations, and the time required to move billions of small files. Even something as simple as cut-over timing can cause issues: if users are still modifying files during the switch, you’ll have versioning conflicts or data loss.
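To see how easily timestamps get lost, here is a small illustrative sketch comparing a naive copy with a metadata-preserving one in Python’s standard library; the file paths are hypothetical. Note that even shutil.copy2 does not carry POSIX ACLs or Windows security descriptors, which is exactly the gap purpose-built migration tools have to close.

```python
# Sketch: why naive copies lose metadata. shutil.copy() transfers contents
# and permission bits only; shutil.copy2() also preserves timestamps (and
# extended attributes where the platform allows). ACLs and Windows security
# descriptors are NOT covered by either and need OS-specific handling.
import os
import shutil

SRC = "/mnt/source/report.xlsx"            # hypothetical source file
DST_NAIVE = "/mnt/target/report_naive.xlsx"
DST_FULL = "/mnt/target/report_preserved.xlsx"

shutil.copy(SRC, DST_NAIVE)   # destination's modification time becomes "now"
shutil.copy2(SRC, DST_FULL)   # mtime/atime copied over from the source

for label, path in [("source", SRC), ("naive", DST_NAIVE), ("preserved", DST_FULL)]:
    st = os.stat(path)
    print(f"{label:>9}: mode={oct(st.st_mode)} mtime={st.st_mtime:.0f}")
```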

What features should your migration software include?

BH: You want more than a copy-and-move tool. You need the ability to scan and index all unstructured data and then categorize it based on access frequency (hot, warm, or cold), file types, data growth patterns, departmental ownership, and sensitivity (e.g., PII or regulated content).

Look for:

• The ability to detect and confine sensitive data prior to a migration.
• File-level tiering to right-size your migration and save on storage costs.
• Full preservation of permissions and metadata.
• The ability for users and applications to access data during a migration.
• Included tools to proactively identify potential bottlenecks and other issues that derail migrations.

A successful cloud migration avoids costly delays and data loss, while ensuring that your data is placed in the right storage tier the first time around.
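As a rough illustration of the hot/warm/cold split, the sketch below buckets files from a discovery inventory (the hypothetical inventory.csv from the earlier sketch) by last-access age. The 30- and 180-day thresholds are illustrative assumptions, not recommendations.

```python
# Illustrative sketch: bucket files from a discovery inventory into
# hot/warm/cold tiers by last-access age. Thresholds are assumptions.
import csv
from collections import Counter
from datetime import date

def tier(last_access: date, today: date) -> str:
    age_days = (today - last_access).days
    if age_days <= 30:
        return "hot"
    if age_days <= 180:
        return "warm"
    return "cold"

today = date.today()
counts, sizes = Counter(), Counter()
with open("inventory.csv", newline="") as fh:      # from the discovery sketch
    for row in csv.DictReader(fh):
        t = tier(date.fromisoformat(row["last_access"]), today)
        counts[t] += 1
        sizes[t] += int(row["bytes"])

for t in ("hot", "warm", "cold"):
    print(f"{t}: {counts[t]:,} files, {sizes[t] / 1e9:.1f} GB")
```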

Are there tools built specifically for this kind of migration?

BH: Komprise Elastic Data Migration is an enterprise-grade migration solution that includes deep analytics so you can plan migrations intelligently. The solution runs highly parallel transfers and uses dedicated WAN channels to avoid issues and delays when transferring large volumes of small files. It automatically preserves permissions and metadata tags, maps shares and retains data integrity. That kind of specialized tooling reduces manual work and is up to 25 times faster than legacy tools like Robocopy or Rsync. Read the latest update on Elastic Data Migration.


What other best practices can you share to maximize the ROI of cloud data migrations?

BH: Data migration isn’t a one-time event. You want tools that deliver ongoing data lifecycle management, tiering aging data from high-cost hot storage to cooler, more affordable tiers as it becomes less relevant. By continually optimizing data placement based on current use, you get more predictable ROI and can better leverage AI tools in the cloud with cloud-native data sets. With policy-based automated tiering, you can set it and forget it.
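The shape of such a policy is simple to express. The sketch below applies an age-based rule over a hypothetical hot-storage mount and only reports what it would demote; the tier names and thresholds are assumptions, and a production tool would actually move the data and preserve its metadata along the way.

```python
# Sketch of a policy-based tiering rule: files untouched for more than
# N days are candidates to move from hot storage to a cheaper tier.
# Tier names, thresholds and the mount point are hypothetical; this
# only reports candidates, it does not move any data.
import os
import time

POLICY = [
    (365, "archive-tier"),   # untouched > 1 year  -> archive
    (90,  "cool-tier"),      # untouched > 90 days -> cool storage
]
ROOT = "/mnt/hot_tier"       # hypothetical hot-storage mount

now = time.time()
for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        full = os.path.join(dirpath, name)
        try:
            age_days = (now - os.stat(full).st_atime) / 86400
        except OSError:
            continue
        for threshold, target in POLICY:
            if age_days > threshold:
                print(f"would move {full} -> {target} ({age_days:.0f} days idle)")
                break
```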

Getting Started with Komprise: