Accelerating M&A with Modern Data Management
We discuss rapid pre-merger analytics and post-merger integration in the cloud.
By Ramon Chen (VP Marketing, Reltio) and Neil Cowburn (CEO, iMiDiA)
According to Ernst & Young (EY), the last few years have seen a large increase in the number of mergers and acquisitions, a trend the firm predicts will continue. Because companies must maintain leadership products in so many new areas, acquisitions are increasingly viewed as a necessary activity rather than an occasional event. A recent survey by KPMG found that 75 percent of U.S. businesses expected to undertake two or more acquisitions in the current year. For such a strategy of regular M&A to be effective, however, acquirers must become agile at pre-merger analysis and efficient at post-merger integration.
One of the most challenging obstacles to both activities is the inability to get clean, reliable, relevant data in a timely fashion from the IT systems of both parties -- much less analyze it within the legal and time constraints of the pending transaction.
A groundbreaking new way to bring together critical data from both parties in a secure and controlled environment is to use a cloud-based modern data management platform built upon a big data foundation. Key to this model is the use of graph data technologies similar to those employed by LinkedIn, Google, and Facebook that enable efficient analysis regardless of format or origin. A hybrid of columnar and graph technology provides far greater flexibility than traditional, relational row-and-column databases. This flexibility makes it possible to quickly reveal business relationships and correlations across disparate datasets that are crucial to projecting the benefits and costs of an M&A transaction.
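The graph approach described here can be sketched with a simple adjacency list. The entities, edge labels, and the `neighbors_of_type` helper below are illustrative assumptions, not any vendor's API; the point is that once records from both parties become nodes and edges, cross-dataset relationships surface through plain traversal:

```python
# Minimal sketch: records from two companies represented as one graph,
# so cross-dataset relationships surface through simple traversal.
# All entity names here are invented for illustration.

from collections import defaultdict

edges = defaultdict(set)  # adjacency list: node -> connected nodes

def link(a, b):
    edges[a].add(b)
    edges[b].add(a)

# Company A's CRM and company B's order system, loaded as edges.
link("customer:acme", "product:A-widget")
link("customer:acme", "region:gulf-coast")
link("customer:acme", "product:B-widget")  # same customer appears in B's data
link("customer:zenith", "product:B-widget")

def neighbors_of_type(node, prefix):
    """Follow edges one hop and keep only nodes of a given type."""
    return {n for n in edges[node] if n.startswith(prefix)}

# Which customers have relationships into BOTH companies' product lines?
shared = {c for c in list(edges)
          if c.startswith("customer:")
          and neighbors_of_type(c, "product:A-")
          and neighbors_of_type(c, "product:B-")}
print(shared)  # customers connected to products from both parties
```

A dedicated graph store would of course scale this far beyond an in-memory dictionary, but the query shape (start at a node, walk typed edges) is the same regardless of the records' original format.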
In this design, granular security and visibility controls allow each company to have its own cloud workspace; information is combined into a "clean room cloud" for auditors to assess synergies and overlaps. To facilitate the convergence of data, seamless master data management (MDM) built into the cloud platform is used to clean, enhance, deduplicate, and uncover relationships across hundreds to thousands of data sets and attributes.
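As a rough illustration of the cleanse/deduplicate step, the following sketch normalizes hypothetical customer records, derives a match key, and applies a simple survivorship rule (prefer non-empty values, with earlier golden values winning on conflict). The field names and matching rules are assumptions for illustration, not the platform's actual logic:

```python
# Minimal MDM-style sketch: cleanse, match on a derived key, and merge
# duplicates into a "golden" record. Field names are illustrative.

import re

def normalize(record):
    """Cleanse: lowercase the name, strip punctuation and legal suffixes."""
    name = re.sub(r"[^\w\s]", "", record["name"].lower())
    name = re.sub(r"\b(inc|llc|corp|ltd)\b", "", name).strip()
    return {**record, "match_key": name}

def merge(records):
    """Deduplicate on match_key; survivorship keeps non-empty attributes."""
    golden = {}
    for rec in map(normalize, records):
        key = rec["match_key"]
        best = golden.get(key, {})
        # Later records fill gaps; existing non-empty values are kept.
        golden[key] = {**{k: v for k, v in rec.items() if v},
                       **{k: v for k, v in best.items() if v}}
    return list(golden.values())

crm = [
    {"name": "Acme, Inc.", "phone": "", "city": "Houston"},
    {"name": "ACME Inc",   "phone": "555-0100", "city": ""},
]
print(merge(crm))  # one golden record with both phone and city filled in
```

Real MDM adds fuzzy matching, confidence scoring, and unmerge support, but the cleanse-match-survive cycle above is the core of how two conflicting records become one reliable one.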
In a post-merger scenario, the consolidated data forms the basis for the deployment of new data-driven enterprise applications, and is pushed back down to data warehouses and legacy systems in the operational divisions of the new merged company.
A modern data management platform also provides compliance and governance features through deep auditability: the history of every change to every attribute in the combined repository can be inspected as of any point in time, showing how the data has grown and evolved.
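A minimal sketch of this kind of attribute-level audit history, assuming a hypothetical append-only log and record schema (not the platform's actual storage model): every change is appended rather than overwritten, so any record's state can be reconstructed as of any timestamp.

```python
# Minimal sketch: attribute-level auditability via an append-only log.
# The (timestamp, record_id, attribute, value) schema is an assumption.

from datetime import datetime, timezone

history = []  # append-only audit log; nothing is ever overwritten

def set_attribute(record_id, attribute, value, ts=None):
    ts = ts or datetime.now(timezone.utc)
    history.append((ts, record_id, attribute, value))

def as_of(record_id, when):
    """Rebuild a record's state from changes made at or before `when`."""
    state = {}
    for ts, rid, attr, value in sorted(history):
        if rid == record_id and ts <= when:
            state[attr] = value
    return state

t1 = datetime(2015, 1, 1, tzinfo=timezone.utc)
t2 = datetime(2016, 1, 1, tzinfo=timezone.utc)
set_attribute("cust-42", "status", "prospect", t1)
set_attribute("cust-42", "status", "customer", t2)
print(as_of("cust-42", t1))  # older state: {'status': 'prospect'}
```

Because the log is append-only, auditors can answer not just "what is this value?" but "what was it at the time of the decision?", which is what M&A compliance review requires.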
This article examines the components of a modern data management platform in greater depth with special emphasis on how they accelerate pre-merger analysis and post-merger integration.
The Data Challenge of M&A
As acquisitions have become a standard practice for enhancing competitive advantage in the marketplace, organizations have been forced to become familiar with the core data challenges of evaluating and later executing an M&A transaction. Few companies have become proficient enough at acquisitions that they can do them efficiently and consistently to realize the benefits and savings of the original vision. For most companies, each M&A cycle from pre-merger through post-merger is replete with one-off activities based on the unique ways in which organizations store their data.
Pre-Merger
Prior to the merger, the parties enter a complex due-diligence phase in which data integration is planned out across multiple internal organizations to enable both parties to analyze the effects of the proposed transaction. Despite attentive planning, the parties often find that they must deal with highly malformed databases that reflect legacy events, such as previous regulatory constraints, earlier acquisitions, and so forth. There are inevitably duplicate databases with conflicting records.
Unfortunately, even companies that use enterprise-grade applications and databases are prone to maintaining parallel data stores that reflect usage by different departments -- hence, they are incomplete, out of sync, and frequently out of date.
Finally, data kept in spreadsheets and other non-standard database formats must be included in the due diligence.
These familiar data quality issues present a two-fold challenge: consolidating databases and cleaning the data records -- two practices at the heart of MDM -- so that customer lists, dealers/vendors, and products can be compared and the value and extent of overlap between the businesses can be analyzed. Even though companies have tried to use MDM solutions to address these two challenges, the heavy resource requirements and cost of legacy, on-premises MDM offerings make them less than ideal.
Because of the special demands of M&A, there is an additional challenge: the resulting clean data must be made available in a tightly controlled environment, with ironclad security, strict data-access enforcement, and full auditability.
Until now, there has been no data management platform or application that met the needs of complex M&A analysis, so parties have often manually extracted data into spreadsheets -- a laborious, error-prone path that provides no form of validation, security, or auditability.
Post-Merger
Presuming the merger is green-lighted, many companies end up discarding the data they've prepared for the due-diligence stage, and thereby throw away a significant investment in time and resources. This lack of reuse occurs because the information is mainly kept in spreadsheets that reflect a snapshot of the data at a point in time and are not kept up to date. Therefore, they cannot be readily used for any purpose other than the pre-merger activities.
Because speed of integration is key to realizing the benefits of any successful acquisition, the loss of the cleaned, integrated data is a costly missed opportunity that resets the company's data integration work to zero.
Should the merger not be consummated, both parties have a significant sunk cost in any hardware infrastructure procured.
Lack of Deep Analytics
The work of M&A is typically performed under hectic conditions and stringent deadlines. The data analysis, therefore, is often limited to what is needed to make a go/no-go decision and to estimate costs and challenges that might arise post-merger. Questions typically fall into narrow, predictable ranges -- searches for synergy and overlap. Rarely is there time for pure exploratory analysis that searches for correlations and unexpected relationships in the company's data.
For example, two companies might have complementary products and discover during due diligence that they have 60 percent overlap across two product lines. Further analysis might reveal that along the Gulf Coast the overlap is only 20 percent, but it would take still deeper analysis to see that almost all of that 20 percent have not upgraded their products from company A, while the majority of those same customers have regularly upgraded company B's product. This might lead to analysis of distributors in the region, with comparison of the sales activity of other distributors during the same time period. Such analysis can greatly fill out the picture of an acquisition, provide deeper insights into the issues and opportunities, and form the basis of a clearer and more effective post-merger road map.
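A drill-down of this kind is, at bottom, set arithmetic. The customer and region data below are invented solely to show the shape of the overall-versus-regional calculation; the percentages are illustrative, not the figures from the example above:

```python
# Minimal sketch: overall customer overlap vs. overlap within one region.
# All customer names, regions, and figures are invented for illustration.

company_a = {("acme", "gulf-coast"), ("orbit", "gulf-coast"),
             ("zenith", "midwest")}
company_b = {("acme", "gulf-coast"), ("nova", "gulf-coast"),
             ("zenith", "midwest")}

def overlap_pct(a, b):
    """Customers present in both sets, as a share of the combined base."""
    return 100 * len(a & b) / len(a | b)

overall = overlap_pct(company_a, company_b)

# Drill down: restrict both sides to one region, then recompute.
gulf_a = {c for c in company_a if c[1] == "gulf-coast"}
gulf_b = {c for c in company_b if c[1] == "gulf-coast"}
regional = overlap_pct(gulf_a, gulf_b)

print(f"overall: {overall:.0f}%  gulf coast: {regional:.0f}%")
```

The value of clean, consolidated data is exactly that filters like the regional one above can be applied ad hoc, without another round of manual spreadsheet preparation.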
For this to be possible, the auditors need to have the capabilities for deep analysis, as well as access to clean data. For many auditors using spreadsheets only, it's difficult to bring all this information together in the M&A clean room and guarantee its reliability. As a result, the M&A auditors who don't have access to modern data management tools often must get by with only basic analysis and estimates in their forecasts and recommendations.
Enter Modern Data Management
As the challenges of M&A frustrate companies seeking to move forward quickly, new solutions are emerging from modern data management, which combines the rigor of MDM in the cloud with big data infrastructure to correlate data and uncover hidden relationships. Separate private workspaces, known as tenants, are made available to each company; they meet the required security standards and include data management processes and analysis tools that create minimal disruption for IT. The added benefit is that there is no upfront hardware or software investment, so pre-merger activities can begin immediately. Should the merger not succeed for any reason, the outlay by either party is minimized, and all work done for each company can be retained for its own use moving forward.
In this new approach, the data from company A is loaded into a secure tenant in the cloud; likewise, datasets from company B are loaded into their own isolated, secure tenant. Each company then cleans its own data: enriching it from public and commercial third-party databases, deduplicating it, and normalizing it. The resulting clean, reliable data is then automatically loaded into a third clean-room tenant. This tenant, secured with tight access controls and complete auditing capability, is where the auditors do their principal work of comparison, analysis, and assessment.
This approach provides a single point of data governance with a 360-degree view of the data. It fosters reliable analysis by analysts who can explore the data sets and see relationships and correlations that might not be evident through traditional, manual, labor-intensive processing.
Post-merger, cleaned reliable data can be pushed to the designated operational applications of the merged company and used to create new data-driven applications. It also provides the reliable, single view from which to execute retirement of legacy systems and to drive operational efficiency across combined functions. Interestingly, the same capabilities can also be used "in reverse" to support analysis for efficient and compliant divestitures or breaking up a large company into independent entities.
Although legacy on-premises master data management has been dismissed as being too slow, high cost, and complex to be used in M&A, a modern data management platform brings together big data scalability, infinite data variety, analytics, full security and auditability, and new ways to correlate relationships that enable the data to be redeployed in new contexts and in enterprise data-driven applications -- all in the cloud.
Elements of a Modern Data Management Platform
The age of big data technologies built on nearly unlimited storage and processing power has substantially changed the way companies analyze their data. Despite this, obtaining reliable data through MDM has lagged the leading edge, leaving frontline business users who hope to be more data-driven saddled with legacy, process-driven applications. Forward-looking businesses seeking to close this gap with a modern data management platform consider the following key elements important:
- Cloud-based: A multi-tenant, horizontally scalable, and always-available solution.
- Hybrid columnar and graph-based repository: Providing the ability to store and uncover relationships between fields, perform semantic searches, and support powerful contextual analysis.
- Contextual analysis: Enabling the construction of powerful visual hierarchical and network views of relationships, with simple-to-use tools.
- MDM: Using built-in rules-based processes to cleanse, match, merge, and unmerge data using internal and external sources; ideally supporting full data stewardship as well as collaborative curation, tagging, and cataloging of data sets.
- Data-as-a-service: Pre-mapped integration with public, commercial, third-party, and social data sets, with on-demand search and access.
- Auditability: Fine-grained audit tracking with historical views of every change to every attribute.
- Agility and flexibility: Ability to view the data from any perspective, add and augment data from new sources containing both structured and unstructured data, at unlimited scale.
Taken together, these attributes of a modern data management platform provide an integrated solution that delivers agility and economy in M&A while positioning the merged company to be immediately effective.
Ramon Chen is the VP of marketing at Reltio. Ramon has been involved with the development and launch of big data databases, enterprise applications, development tools, and master data management tools for more than 20 years. He's currently using Reltio's own data-driven applications in house for more effective sales, marketing, and a laser focus on customer experience and satisfaction. You can contact the author at [email protected].
Neil Cowburn is the CEO of iMiDiA. Neil is a modern data management and M&A technology expert. He has been the driving force on countless data-driven projects in the food distribution, retail, and consumer packaged goods industries for companies such as Sysco, Kraft Foods, Mondelez, Supervalu, and Ecolab. You can contact the author at [email protected].