TechKrunch: Using Komprise Data Analytics and Data Modeling

In this series of posts I’ll provide a summary of our TechKrunch sessions, which are designed to be informal, interactive sessions with our technical gurus on the topics that matter most to our customers, partners, and prospects.

In this session Randy Hopkins, VP of Systems Engineering at Komprise, and Glenn Speer, Senior Systems Engineer, discuss and demonstrate how we take an analytics-first approach to provide cold data visibility, so you can quickly understand what you have and make better decisions about how to offload expensive network attached storage (NAS) data.

Glenn jumps right into the Komprise demonstration, which starts with an out-of-the-box visualization of your hot and cold data.

At a glance you can see that 50% of your data is cold – in other words, it hasn’t been accessed in over a year. You can easily change policies to move data, leave links behind, allow users to transparently access data that hasn’t been touched in a year or two, and so on. The focus of this view is overall cost savings and setting global policies. But for most enterprise organizations, making global data movement changes requires special permissions and is no easy task. With Komprise you can get much more granular, quickly, for a fast return. For example, say you have arrays that are getting full and you need to move data out fast. With Komprise you can find that data and move it without setting global policies.
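The core idea behind this view, classifying data as cold by its last-access time, can be sketched in a few lines. To be clear, this is not how Komprise works internally; it is a minimal illustration of atime-based cold data discovery, and the NAS mount path is hypothetical.

```python
import os
import time

COLD_THRESHOLD_SECONDS = 365 * 24 * 60 * 60  # one year

def find_cold_files(root, threshold=COLD_THRESHOLD_SECONDS):
    """Walk a directory tree and return (path, size) pairs for files
    whose last-access time is older than the threshold."""
    cutoff = time.time() - threshold
    cold = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if st.st_atime < cutoff:
                cold.append((path, st.st_size))
    return cold

if __name__ == "__main__":
    # "/mnt/nas/share" is a made-up example mount point
    results = find_cold_files("/mnt/nas/share")
    total = sum(size for _path, size in results)
    print(f"{len(results)} cold files, {total / 1e9:.1f} GB reclaimable")
```

Note that this sketch depends on the filesystem actually tracking access times; many NAS volumes are mounted with atime updates disabled for performance, which is one reason a product tracks access history itself.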

In the Usage tab, you see many more attributes of the data beyond age – location, top shares, top directories, ownership, file types, etc.

So, what do you do next? Deep Analytics. Quickly create a custom query to dig into the issue you’ve identified – for example, an accumulation of log data.

From there you can dig into where the data resides and take action.
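The pattern here, filtering an indexed view of your files by attributes like extension and age and then acting on the results, can be sketched generically. This is not the Komprise query interface; `FileRecord` and `query` are hypothetical names used purely for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FileRecord:
    path: str
    size: int
    last_access: datetime

def query(index, extension=None, older_than_days=None):
    """Filter an in-memory file index the way a saved custom query
    might: by file extension and/or minimum age since last access."""
    results = index
    if extension is not None:
        results = [r for r in results if r.path.endswith(extension)]
    if older_than_days is not None:
        cutoff = datetime.now() - timedelta(days=older_than_days)
        results = [r for r in results if r.last_access < cutoff]
    return results
```

For example, `query(index, extension=".log", older_than_days=365)` would surface year-old log files as candidates to archive or delete.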

In the next demonstration, Glenn starts with a visualization of space consumed by your top shares. You can delete it. You can archive it. There are lots of options with Intelligent Data Management from Komprise.

He then looks at space consumed by top owners and discusses the common challenge of Zombie Data – data owned by users who are no longer with your organization. You can quickly archive it and free up space right away.
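The generic idea behind a top-owners view, and behind spotting zombie data, is to aggregate file sizes by owner and flag owners that no longer resolve to a known user. The sketch below is an assumption-laden illustration for a Unix system, not Komprise's implementation: it treats a UID with no passwd entry as a zombie candidate.

```python
import os
import pwd
from collections import defaultdict

def space_by_owner(root):
    """Sum file sizes per owner name. UIDs with no passwd entry
    (e.g. departed users whose accounts were removed) are flagged
    as zombie candidates."""
    usage = defaultdict(int)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # unreadable or removed mid-walk; skip
            try:
                owner = pwd.getpwuid(st.st_uid).pw_name
            except KeyError:
                owner = f"zombie:uid={st.st_uid}"  # no such user anymore
            usage[owner] += st.st_size
    return dict(usage)
```

In a real enterprise you would check owners against a directory service such as Active Directory rather than the local passwd database, but the aggregation logic is the same.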

Another common example is virtual machine images that are no longer in use. The screenshot below happens to show hot VM data, but that’s not usually the case. Someone used an OVA to stand up a lot of systems, left the OVA out on the NAS device, and forgot about it. Sound familiar?

And finally, Glenn was asked about simply being able to see how much of your data is hot – say you’re thinking of moving to an all-flash storage platform. When you’re buying a brand new flash array, you don’t want to put expensive cold data on it. Many of our customers use Komprise to migrate just the hot data to their new all-flash array, with links back to anything cold. In a future TechKrunch session the gurus promise to cover tiering.

You can check out the video below on our YouTube channel (be sure to like and subscribe) or on BrightTalk. Next week’s session is How to Access Archived Data in the Cloud.

Next Steps: