Building and Protecting the Value of Data Assets in 2024
Data has never been more important. Here are three ways to increase the value of those assets in the New Year.
- By David Stodder
- December 20, 2023
The data landscape is changing fast, opening enormous opportunities for insight, innovation -- and risk. Data’s importance to decision-making and data product development makes it imperative for organizations to treat data as a strategic, business-critical asset. Here are three trending practices and technology developments critical to increasing the value of data assets and activating analytics and data sharing in 2024.
Trend #1: Organizations will focus on new technology and practices for modernizing data integration
Data integration processes, including data pipelines, are a broad category of practices and technologies that bring raw data from sources and refine it into valuable, integrated data assets. The expanding data landscape is putting pressure on organizations to solve data integration problems that hinder decision-makers from gaining actionable insights and making faster decisions.
In 2024, we will see an acceleration in the use of AI-infused automation to overcome common data integration problems such as poor data quality, slow performance, and users’ inability to gain single views of relevant data. Organizations will also pursue alternatives to heavy data movement, replication, and copying, which are often the cause of data latency, higher costs, and complexity. Addressing these problems is a driver behind the trend toward consolidation into unified data platforms such as data lakehouses. These can reduce the need for data movement, such as between data transformation staging areas and target platforms.
In other cases, organizations are setting up data virtualization layers and developing data fabric architectures. Growth in hybrid multicloud data environments is accelerating interest in data virtualization and data fabrics as organizations evaluate how to establish a virtual, location-independent layer above multiple data platforms. These solutions rely on semantic richness based on metadata and additional data intelligence, often managed by a data catalog.
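The idea of a virtual, location-independent layer can be illustrated with a toy sketch. Everything here is hypothetical: the dataset names, the backends, and the `read` function are invented for illustration, standing in for a real virtualization layer that would resolve logical names through catalog metadata.

```python
# Toy data-virtualization layer: callers request a logical dataset
# name; a catalog-style mapping hides which physical backend serves it.
BACKENDS = {
    "orders": lambda: [{"order_id": 1, "amount": 120.0}],  # e.g., cloud warehouse
    "clicks": lambda: [{"page": "/", "ts": "2023-12-20"}],  # e.g., data lake
}

def read(dataset: str) -> list:
    """Resolve a logical dataset name to its backend and fetch rows."""
    try:
        return BACKENDS[dataset]()
    except KeyError:
        raise LookupError(f"unknown dataset: {dataset}")

print(read("orders"))
```

Because consumers depend only on logical names, the physical location of a dataset can change (or be consolidated) without breaking downstream queries, which is the core promise of a data fabric's semantic layer.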
Finally, APIs are becoming more popular for connecting data-driven applications and enabling user data sharing and interaction. However, if not developed according to standards, APIs can repeat the difficulties of legacy “spaghetti code,” enmeshing users in technical issues and limiting functionality. Standardization of API connectivity will be an important trend in 2024.
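One common standardization tactic is a uniform response envelope so that every data API returns the same metadata shape. The sketch below is an assumption-laden illustration, not a named standard: the `DataResponse` fields and the `orders_endpoint` function are invented here to show how a consistent contract keeps clients from writing endpoint-specific parsing code (the "spaghetti" problem).

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Hypothetical standard envelope: every endpoint returns the same
# metadata fields, so clients parse one shape instead of many.
@dataclass
class DataResponse:
    resource: str            # logical dataset name, e.g., "orders"
    records: list            # the payload rows
    schema_version: str = "1.0"
    generated_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def orders_endpoint() -> dict:
    """Return sample orders wrapped in the standard envelope."""
    rows = [{"order_id": 101, "amount": 49.90}]
    return asdict(DataResponse(resource="orders", records=rows))

resp = orders_endpoint()
print(resp["resource"], len(resp["records"]))
```

In practice this role is often filled by an OpenAPI specification shared across teams; the point is that the contract, not each endpoint author, decides the shape.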
Trend #2: Users will gain more integrated access to historical and real-time data
Traditionally, legacy managed reporting has limited users to slices of historical data. To understand what has occurred and to compare different time periods, users typically must wait until the data is transformed and lands in the data warehouse. Organizations today want to use the power of scalable cloud data platforms to modernize transactional BI reporting and analytics by reducing data latency and enabling analytics on the combination of historical and real-time data.
Modern cloud data management platforms let users take advantage of scalable processing and faster networks to gain near-real-time or true real-time data refreshes along with broader access to diverse contextual data. Newer technologies will enable analysts to query transactional data for real-time visibility into current orders while simultaneously making comparisons against large volumes of historical data.
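The "current orders plus historical comparison" pattern can be shown in miniature with a single SQL statement. This is a toy sketch using SQLite and invented sample data; on a cloud platform the same single-query idea would run against far larger, continuously refreshed tables.

```python
import sqlite3

# One table holding both "real-time" rows (today's open orders) and
# historical rows; one query answers both questions at once.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, day TEXT, status TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        (1, "2023-11-01", "shipped", 120.0),  # historical
        (2, "2023-12-01", "shipped", 80.0),   # historical
        (3, "2023-12-20", "open", 150.0),     # "today"
        (4, "2023-12-20", "open", 60.0),      # "today"
    ],
)

row = con.execute(
    """
    SELECT
      SUM(CASE WHEN day = '2023-12-20' AND status = 'open'
               THEN amount ELSE 0 END) AS open_order_value_today,
      AVG(CASE WHEN day < '2023-12-20' THEN amount END) AS historical_avg
    FROM orders
    """
).fetchone()
print(row)  # (210.0, 100.0): open value today vs. historical average
```

The alternative, maintaining a separate operational data store synced by a pipeline, is exactly what integrated platforms aim to make unnecessary.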
By providing combined access, organizations can avoid setting up a separate operational data store, which many traditionally use to offer snapshots of the most recently updated data. That approach usually means configuring a separate platform, much like an ETL staging area, as well as maintaining a pipeline to keep transactional and analytical data in sync. Evolving cloud data platforms that manage historical and real-time data streams in an integrated fashion will allow organizations to simplify data management and open up the potential for more timely analytics.
Trend #3: Technologies will enable data governance to be more automated and agile
Data governance is essential: direct relationships with customers require organizations to store, share, and analyze customer and consumer data, which brings with it the need to adhere to regulations. Data governance policies, rules, and processes must also protect financial data, intellectual property, and other types of sensitive data. Good practices for data governance can fall by the wayside as organizations focus on business expansion and competing in the marketplace, but that neglect eventually creates trouble, including higher costs.
AI-driven automation is modernizing how organizations practice data governance. With automated data intelligence, organizations can establish effective data governance without requiring users to learn the technical complexities of applying data governance rules in data pipelines, data sharing, and application development. Modern technologies have internal, AI-driven augmentation that provides users with automatic recommendations and constraints. In 2024, we will see growth in the use of AI systems that interpret user patterns to proactively suggest trusted data sets and provide data intelligence relevant to their requirements.
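A minimal sketch of what "applying governance rules without users learning them" can look like in a pipeline step: here, a hypothetical catalog tags columns, and a generic step masks anything tagged as PII. The tag names, `mask` rule, and `apply_governance` function are all assumptions invented for illustration, not a real product's behavior.

```python
# Assumed catalog metadata: column-name -> governance tag.
CATALOG_TAGS = {"email": "pii", "amount": "financial"}

def mask(value: str) -> str:
    """Crude illustrative masking: keep the first character only."""
    return value[0] + "***" if value else value

def apply_governance(row: dict) -> dict:
    """Pipeline step: mask columns the catalog tags as PII."""
    out = {}
    for col, val in row.items():
        if CATALOG_TAGS.get(col) == "pii":
            out[col] = mask(str(val))
        else:
            out[col] = val
    return out

print(apply_governance({"email": "ana@example.com", "amount": 42}))
# {'email': 'a***', 'amount': 42}
```

The pipeline author never writes masking logic; the rule lives with the metadata, which is the automation the trend describes.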
Data stewardship plays an essential role in data governance, but traditional data stewardship is heavily manual. Data stewardship can be inconsistent across functions; locating and hiring stewards adds administrative overhead. More modern, automated, and user-friendly data intelligence solutions reduce dependence on manual data stewardship and can help lower administrative overhead.
Evolving technologies in 2024 will enable organizations to govern data effectively largely with automated data intelligence tools and systems. For data stewards, modern data intelligence tools relieve them of manual chores so they can concentrate more on helping users with “human” issues, such as data literacy and using data productively in the context of their roles and responsibilities.
In Closing: Don’t Neglect Data Literacy
With the expansion in data democratization and self-service functionality, traditional data cultures are changing dramatically. Capabilities that once were only available to technically proficient business analysts, data analysts, and IT developers are now in the hands of data-hungry business users. Generative AI adoption will only intensify this hunger.
However, tools alone do not produce insight; human factors matter. Organizations should not overlook the importance of data literacy: enhancing training and mentorship to increase users’ proficiency in understanding what data means and their ability to communicate and share analytics insights. A second goal of data literacy is to increase users’ accountability for how they collect, integrate, prepare, and protect data. Thus, in closing, along with technological progress, 2024 needs to be a year in which organizations make progress in human dimensions such as data literacy.
David Stodder is director of TDWI Research for business intelligence. He focuses on providing research-based insight and best practices for organizations implementing BI, analytics, performance management, data discovery, data visualization, and related technologies and methods. He is the author of TDWI Best Practices Reports on mobile BI and customer analytics in the age of social media, as well as TDWI Checklist Reports on data discovery and information management. He has chaired TDWI conferences on BI agility and big data analytics. Stodder has provided thought leadership on BI, information management, and IT management for over two decades. He has served as vice president and research director with Ventana Research, and he was the founding chief editor of Intelligent Enterprise, where he served as editorial director for nine years.