

Speeding Time to Value: How a Common Data Model Allows You to Do More with Less

Extracting insight from the masses of data inundating organizations is easier with a common data model. Here's an overview of its benefits.

Regardless of industry or department, the objective for most business users today is strikingly the same. Nearly everyone wants to exploit data to optimize and improve business outcomes, whether that involves increasing revenues, decreasing costs, or achieving other goals.

With the large quantities of data now available -- most of it semistructured or unstructured -- enterprise users frequently encounter three distinct challenges when mining data for business value. They must determine:

  • Is there any meaningful insight?
  • If so, where is it so the business can regularly leverage it?
  • Is the right data getting captured for these insights?

Most organizations answer these questions with approaches that vary by department or, even worse, by individual user. As a result, finding insights is extremely ad hoc and often not repeatable; consequently, it is difficult to sustain the consistent delivery of defined, mission-critical outcomes.

Even traditional tools such as CRM and ERP don't necessarily solve these problems. Although these tools offer established mechanisms the business can deploy easily and at affordable prices, producing insights requires highly specific processes to yield relevant results. Typically, every company uses these tools differently, essentially reinventing the proverbial wheel.

A better approach to extracting insight from the masses of data inundating organizations is a common data model. Equipped with a set of reusable concepts, entities, and processes, this method streamlines analytics pipelines and makes clear what data is necessary -- and where insights lie -- to enhance business performance and produce tangible results.

Predefined, Uniform Modeling

A predefined common data model maximizes the data deployed for (and intelligence gained from) analytics in a number of ways. Its chief benefit is the ability to standardize data across variation, most notably differences in structure. Solutions employing this approach work on unstructured, semistructured, and structured data. This standardization benefit extends to any other distinctions between data assets at the point of origination. Implicit in this harmonization of data assets across the enterprise is a unification of the language or meaning of data. These semantics are largely defined in relation to industries such as pharmaceuticals or finance.
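
To make this concrete, here is a rough sketch of what such a model's reusable entities can look like, expressed in Python. The entity and field names are illustrative assumptions for a healthcare scenario, not the schema of any particular product:

```python
# Hypothetical common-model entities for a healthcare use case.
# Every source -- relational rows, JSON documents, text extracts -- is
# normalized into these shared shapes before analytics runs against them.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Patient:
    patient_id: str
    name: str


@dataclass
class Provider:
    provider_id: str
    name: str


@dataclass
class Prescription:
    patient_id: str
    provider_id: str
    drug_name: str
    collected_on: Optional[date]   # when the patient picked up the prescription
    response: Optional[str]        # observed response to the medication
```

Because the entities are defined once, every downstream analysis can rely on the same field names and meanings regardless of which system the data originally came from.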

Moreover, these models provide a core set of reusable entities and concepts that are necessary to deliver meaningful analytics. In this way, the common data model approach is similar to that of ERP and CRM, yet ultimately superior, because it's applicable to any type of data. As a result, business users can employ data across the enterprise for their specific objectives, such as analyzing clinical trial data and other appropriate sources to reduce the time spent bringing new drugs to market. Furthermore, they can accomplish this or any other task in a predictable, easily reusable way.

Codeless Data Engineering

There's also a self-service aspect to leveraging industry-relevant, prebuilt common data models that is integral to accelerating the business processes for finding insights. End users can draw on data across all their systems -- even with the best-of-breed approach most organizations favor today -- and engineer that data for analytics without writing any code. The codeless engineering capabilities afforded by a common data model substantially reduce the cost and time of hiring and retaining the expensive data scientists and engineers who are typically responsible for integrating the disparate data behind the most meaningful predictive analytics.
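
As an illustration of what "codeless" can mean in practice, the mapping from source systems to the common model is often captured as declarative configuration that a generic engine applies. The format below is a hypothetical sketch, not any vendor's actual syntax:

```python
# Hypothetical declarative mapping: a user fills in configuration describing
# how source fields correspond to common-model fields; a generic engine
# applies it, so no per-source transformation code is written.
MAPPING = {
    "crm_contacts": {   # a source system or table
        "target_entity": "Patient",
        "fields": {"contact_id": "patient_id", "full_name": "name"},
    },
    "pharmacy_feed": {
        "target_entity": "Prescription",
        "fields": {"rx_drug": "drug_name", "pickup_dt": "collected_on"},
    },
}


def apply_mapping(source_name: str, record: dict) -> dict:
    """Rename a source record's fields to common-model fields per MAPPING."""
    spec = MAPPING[source_name]
    mapped = {target: record[src]
              for src, target in spec["fields"].items() if src in record}
    mapped["_entity"] = spec["target_entity"]
    return mapped


print(apply_mapping("pharmacy_feed", {"rx_drug": "Drug A", "pickup_dt": "2020-03-01"}))
# {'drug_name': 'Drug A', 'collected_on': '2020-03-01', '_entity': 'Prescription'}
```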

AI techniques involving knowledge graphs, machine learning, and natural language processing can extract relevant concepts from data and map them to the target model. These technologies learn over time which concepts are most helpful for specific use cases. They're also responsible for the heavy lifting of actually moving this information to the predefined target. By automating most of the effort needed to assemble disparate data into a uniform model, AI greatly reduces time to action, letting the business focus on capitalizing on analytics results instead of laboring to produce them.
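
The concept extraction itself is where the AI earns its keep. As a toy stand-in for the knowledge-graph and NLP machinery described above (the synonym table and scoring function here are illustrative assumptions, not a production technique), mapping can be thought of as scoring how closely each source field corresponds to a target concept:

```python
# Toy illustration of concept mapping: score source field names against known
# synonyms for each target concept. Real systems use NLP, knowledge graphs,
# and feedback loops rather than a hard-coded synonym table.
from difflib import SequenceMatcher

CONCEPT_SYNONYMS = {
    "patient_name": ["patient", "pt_name", "member_name"],
    "provider_name": ["provider", "physician", "prescriber"],
    "collected_on": ["pickup_date", "dispensed_date", "fill_date"],
}


def best_concept(source_field: str):
    """Return (concept, score) for the target concept that best matches the field."""
    def score(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    return max(
        ((concept, max(score(source_field, syn) for syn in synonyms))
         for concept, synonyms in CONCEPT_SYNONYMS.items()),
        key=lambda pair: pair[1],
    )


print(best_concept("prescriber_nm"))   # -> ('provider_name', <similarity score>)
```

A real implementation would also learn from user corrections, which is the "improve over time" behavior described above.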

An Analytics Foundation

The business value derived from creating a common data model includes several different dimensions. First, it enables users to get a single view of different data resources assembled for a particular use case. Next, it reduces the time dedicated to data preparation by leveraging one data model instead of building different models for each application or use case. Most importantly, this method delivers a foundation upon which to translate data into profitable action for end users.

AI's repeated mappings reduce the time spent getting the relevant concepts from diverse data into a single model that provides the basis for subsequent analytics. As is the case with any ERP or CRM system, you can also customize these procedures for individual use cases.

Even so, you're still using the same model rather than rebuilding multiple models every time business requirements or source data change. With this approach, organizations get a fundamental set of well-understood concepts that provide the insights they need. For example, when analyzing the effectiveness of medication on patients, the common data model method can quickly extract attributes such as patient name, provider name, when patients collected their prescriptions, and how they responded to the medications. These basic concepts are swiftly identified and extracted and are readily reusable for any analytics pertaining to these healthcare needs.
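
A brief sketch of what that reuse looks like once records from any source have been mapped into the shared shape (the field names and sample data are illustrative):

```python
# Once every source is mapped to the same fields, an analysis such as
# "how did patients respond to each medication?" becomes one reusable
# aggregation over the common model (illustrative data only).
from collections import defaultdict

prescriptions = [
    {"patient_name": "A. Jones", "provider_name": "Dr. Lee",
     "drug_name": "Drug X", "collected_on": "2020-01-05", "response": "improved"},
    {"patient_name": "B. Smith", "provider_name": "Dr. Lee",
     "drug_name": "Drug X", "collected_on": "2020-01-09", "response": "no change"},
    {"patient_name": "C. Wu", "provider_name": "Dr. Patel",
     "drug_name": "Drug Y", "collected_on": "2020-02-02", "response": "improved"},
]


def response_rate_by_drug(records):
    """Share of prescriptions whose recorded response was 'improved', per drug."""
    totals, improved = defaultdict(int), defaultdict(int)
    for rec in records:
        totals[rec["drug_name"]] += 1
        improved[rec["drug_name"]] += rec["response"] == "improved"
    return {drug: improved[drug] / totals[drug] for drug in totals}


print(response_rate_by_drug(prescriptions))
# {'Drug X': 0.5, 'Drug Y': 1.0}
```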

Perfecting Business Insights

The customizable characteristics of a common data model are among its most exciting and useful features for the business. By quickly identifying entities from different types of data, the shared model approach enables end users to understand critical patterns in how their units are actually exploiting insights from their data. This information tells them what intelligence is being gleaned from their sources and provides a blueprint for other data types that could be captured to enhance it. This knowledge is pivotal for planning and refining analytics throughout the enterprise -- as well as for understanding just where your organization stands today.

About the Author

Digvijay "DV" Lamba is the founder and CEO of Lore IO. Prior to founding the company, DV was a distinguished architect at Walmart Labs and held senior leadership positions at Kosmix, acquired by Walmart, and Andale. You can reach the author via Linkedin.

