

Executive Q&A: The Critical Role of Reporting in Trimming Storage Costs

In good times and tough times, IT managers are always looking for ways to trim costs. With ever-growing data volumes, managing data costs is critical. We spoke with Komprise COO and President Krishna Subramanian about how to get better control of these costs.

Data managers are always looking for ways to reduce costs, especially for data stored in the cloud. What metrics can help them control storage costs most effectively? We asked Krishna Subramanian, COO and president of Komprise, for her suggestions.


Upside: How have storage reporting and analysis needs changed over the last few years with cloud, AI, etc.?

Krishna Subramanian: Storage metrics used to be very standard and limited, primarily related to hardware performance. These measures included latency, IOPS, and network throughput; uptime and downtime per year; RTO (recovery time objective) and RPO (recovery point objective); and average time to perform a backup.

Today, metrics have changed to become more data-centric because, with cloud and hybrid infrastructure, data outlives any particular storage technology. Most enterprise organizations have several storage and backup technologies in place across their environment and petabytes of unstructured data. It’s important to mobilize and manage that data to optimize data protection, cost savings, and potential value.

On top of that, storage managers need to know more about the data itself (such as file types and sizes, owners, and any other identifying metadata). This knowledge helps data stakeholders across the business find the data they need, at the time they need it, to fulfill analytics initiatives -- especially AI and ML, which require massive amounts of the right kind of data.
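
[To make this concrete, here is a minimal Python sketch, not drawn from the interview, of the kind of data-centric inventory being described: it walks a file tree and summarizes files by extension and bytes by owner. The share path is hypothetical and the UID lookup is Unix-specific; commercial data management tools do this at petabyte scale across heterogeneous storage, so treat this as illustrative only.]

```python
import os
import pwd  # Unix-only: resolves numeric UIDs to user names
from collections import Counter, defaultdict

def inventory(root):
    """Walk a directory tree; summarize files by extension and bytes by owner."""
    by_ext = Counter()
    bytes_by_owner = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            by_ext[os.path.splitext(name)[1].lower() or "<none>"] += 1
            try:
                owner = pwd.getpwuid(st.st_uid).pw_name
            except KeyError:
                owner = str(st.st_uid)  # unknown UID: a hint of orphaned data
            bytes_by_owner[owner] += st.st_size
    return by_ext, bytes_by_owner

counts, usage = inventory("/data/shared")  # hypothetical share path
for owner, total in sorted(usage.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{owner}: {total / 1e9:.1f} GB")
```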

What are the top reports or metrics that data storage people need today to help keep up with these trends?

I’d say there are seven important metrics you need to know to manage storage effectively.

  • Storage costs for chargeback and showback: This information helps IT align with departmental power users and data owners to manage data more cost-effectively.

  • Data growth rates and top data owners: This gives insights into which groups and projects are growing data the fastest so IT can plan appropriately and introduce new policies when needed to automate data movement to archival storage as it ages.

  • Orphaned data and duplicate data: These data sets should be deleted altogether rather than consuming expensive storage space.

  • Access patterns: Understanding when data was last accessed and how many people are accessing it can help inform the appropriate data storage strategy. [See the sketch following this list.]

  • PII and other sensitive data: The ability to search for data tagged as PII or with financial system file extensions can ensure that those data sets are secured and stored properly for data protection and compliance.

  • Sustainability: Measuring how much duplicate data has been reduced and how much data is stored on inefficient legacy storage solutions can help lower data center energy costs and meet sustainability mandates.

  • Cost savings: How much can we save with an effective data management strategy? How can we track ongoing savings once we implement it? This ongoing visibility is critical as companies find they are spending 30% to 50% of their IT budgets on data storage and data management.
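
[The following is a minimal sketch, not from the interview, of the access-pattern and cost-savings metrics above: it buckets files by last-access age and estimates the monthly savings of tiering cold data to archival storage. The scan path, one-year cold threshold, and per-GB prices are illustrative assumptions, and atime can be unreliable on volumes mounted with noatime.]

```python
import os
import time

# Assumed list prices in $/GB-month; real rates vary by provider and tier.
HOT_COST, ARCHIVE_COST = 0.023, 0.004
COLD_AFTER_DAYS = 365  # treat files untouched for a year as cold

def access_report(root, now=None):
    """Bucket files by last-access age and estimate tiering savings."""
    now = now or time.time()
    hot_bytes = cold_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip unreadable or vanished files
            if (now - st.st_atime) / 86400 > COLD_AFTER_DAYS:
                cold_bytes += st.st_size
            else:
                hot_bytes += st.st_size
    cold_gb = cold_bytes / 1e9
    monthly_savings = cold_gb * (HOT_COST - ARCHIVE_COST)
    return hot_bytes / 1e9, cold_gb, monthly_savings

hot, cold, savings = access_report("/data/projects")  # hypothetical path
print(f"hot: {hot:.0f} GB, cold: {cold:.0f} GB, "
      f"est. savings if tiered: ${savings:,.2f}/month")
```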

How can storage people get this data? What metrics are available from their storage solutions and from reports they already get with their cloud subscription? What additional data is provided by commercial solutions?

Storage solutions often report on the efficiency and performance of the storage system, and they provide some visibility into the data that sits within the vendor’s storage. However, that visibility is often limited to what is on that storage vendor’s environment. Most organizations are heterogeneous and have a mix of storage vendor solutions both in their data centers and in the cloud.

Cloud providers give visibility into costs within the cloud and opportunities for savings. Data management solutions are technology-agnostic and show metrics on data usage, data growth, departmental showback, data costs, orphaned data, and data governance across the entire IT infrastructure. Therefore, storage people should use storage- and cloud-specific reports to optimize performance of these environments and commercial data management solutions to get overall visibility and metrics on their data.
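
[On the cloud side, some of this built-in visibility is a single API call away. The sketch below, assuming AWS as an example, reads the daily BucketSizeBytes metric that S3 publishes to CloudWatch for a bucket’s Standard storage class. The bucket name and region are placeholders, credentials are assumed to be configured, and a cross-vendor view would still require a data management layer.]

```python
from datetime import datetime, timedelta
import boto3  # assumes AWS credentials are configured in the environment

def bucket_size_gb(bucket, region="us-east-1"):
    """Fetch the latest daily BucketSizeBytes metric S3 publishes to CloudWatch."""
    cw = boto3.client("cloudwatch", region_name=region)
    resp = cw.get_metric_statistics(
        Namespace="AWS/S3",
        MetricName="BucketSizeBytes",
        Dimensions=[
            {"Name": "BucketName", "Value": bucket},
            {"Name": "StorageType", "Value": "StandardStorage"},
        ],
        StartTime=datetime.utcnow() - timedelta(days=2),
        EndTime=datetime.utcnow(),
        Period=86400,           # S3 reports this metric once per day
        Statistics=["Average"],
    )
    points = sorted(resp["Datapoints"], key=lambda p: p["Timestamp"])
    return points[-1]["Average"] / 1e9 if points else 0.0

print(f"{bucket_size_gb('example-bucket'):.1f} GB")  # hypothetical bucket
```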

Why is it important to be able to share these reports with people outside of storage? What do business managers or executives really need and want to know?

Departmental IT owners/managers need to understand their data profiles, usage patterns, and costs so they are informed when working with central IT to determine what additional capacity they need and which data sets can be archived or deleted to save money. IT and business executives need to understand top-line metrics about cost and data growth to help guide IT budget and infrastructure planning strategies along with data analytics initiatives.
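
[At its simplest, the departmental showback mentioned here is usage multiplied by a blended rate, as in the sketch below. The figures are hypothetical; real chargeback models typically distinguish storage tiers, snapshots, and backup copies.]

```python
# Hypothetical per-department usage (GB), e.g., from an inventory scan,
# and an assumed blended $/GB-month rate.
USAGE_GB = {"engineering": 42_000, "marketing": 8_500, "finance": 3_200}
RATE_PER_GB_MONTH = 0.02

def showback(usage, rate):
    """Produce a simple monthly showback line per department."""
    return {dept: gb * rate for dept, gb in usage.items()}

for dept, cost in sorted(showback(USAGE_GB, RATE_PER_GB_MONTH).items(),
                         key=lambda kv: -kv[1]):
    print(f"{dept:>12}: ${cost:,.2f}/month")
```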

Data storage is no longer a back-office activity but central to the organization regarding productivity, risk and security management, competition, and customer acquisition and retention. We are seeing the role of storage admins evolving beyond hardware to delivering data services and cost optimization strategies to the business. This is possible with a mature, unstructured data management strategy.

What are the greatest issues for IT and storage managers when there is a lack of visibility or metrics on their data? By “lack of visibility,” do you mean the inability to share these metrics or the inability to get them at all?

The answer is both. First you need the right tools to get the metrics. Second, you need to be able to easily share reports with key stakeholders in IT and the business to help align viewpoints. This will help you save more and ensure that data lives in the right place at the right time for different use cases and requirements. Insights about data in storage are key to reducing the costs of managing petabytes of unstructured data, increasing compliance and security, and uncovering new value from hidden or underused data assets.

How does metadata play into all this? Why is it important?

Metadata brings structure to unstructured data and delivers more information and context about data to guide cost savings and compliance decisions. Metadata can also help data owners and stakeholders find key data sets faster and move them to the right location for projects such as AI and ML in the cloud. This capability grows if you can easily enrich metadata with custom tags that further classify data sets, such as demographics or project name. Global search, data workflows, enabling new applications, feeding AI training models, and harnessing data value are some of the benefits of leveraging metadata.
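
[As an illustration of tag-based enrichment and search, here is a toy in-memory index, a sketch rather than any vendor’s implementation, that attaches custom tags such as a project name or a PII flag to file paths and supports all-tags search. The paths and tags are hypothetical.]

```python
from collections import defaultdict

class TagIndex:
    """Toy metadata-enrichment index: attach custom tags to paths, then search."""
    def __init__(self):
        self._tags_by_path = defaultdict(set)
        self._paths_by_tag = defaultdict(set)

    def enrich(self, path, *tags):
        """Attach one or more custom tags to a path."""
        for tag in tags:
            self._tags_by_path[path].add(tag)
            self._paths_by_tag[tag].add(path)

    def search(self, *tags):
        """Return paths carrying all of the given tags."""
        sets = [self._paths_by_tag[t] for t in tags]
        return set.intersection(*sets) if sets else set()

idx = TagIndex()
# Hypothetical enrichment: project and sensitivity tags added by data owners.
idx.enrich("/data/trials/cohort1.csv", "project:apollo", "pii")
idx.enrich("/data/trials/cohort2.csv", "project:apollo")
print(idx.search("project:apollo", "pii"))  # -> {'/data/trials/cohort1.csv'}
```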

[Editor's note: Krishna Subramanian is COO, president, and co-founder of Komprise. In her career, Subramanian has built three successful venture-backed IT businesses and was named a “2021 Top 100 Women of Influence” by Silicon Valley Business Journal. You can reach the author via email, Twitter, or LinkedIn.]
