
TDWI Blog

Successful Application and Data Migrations and Consolidations

Minimizing Risk with the Best Practices for Data Management
By Philip Russom, TDWI Research Director for Data Management

I recently broadcast a really interesting Webinar with Rob Myers – a technical delivery manager at Informatica – talking about the many critical success factors in projects that migrate or consolidate applications and data. Long story short, we concluded that the many risks and problems associated with migrations and consolidations can be minimized or avoided by following best practices in data management and other IT disciplines. Please allow me to share some of the points Rob and I discussed:

There are many business and technology reasons for migrating and consolidating applications and data.
  • Mergers and Acquisitions (M&As) – Two firms involved in a M&A don’t just merge companies; they also merge applications and data, since these are required for operating the modern business in a unified manner. For example, cross-selling between the customer bases of the two firms is a common business goal in a merger, and this is best done with merged and consolidated customer data.
  • Reorganizations (reorgs) – Some reorgs restructure departments and business units, which in turn can require the restructuring of applications and data. 
  • Redundant Applications – For example, many firms have multiple applications for customer relationship management (CRM) and sales force automation (SFA), as the result of M&As or departmental IT budgets. These are common targets for migration and consolidation, because they work against valuable business goals, such as the single view of the customer and multi-channel customer marketing. In these cases, it’s best to migrate required data, archive the rest of the data, and retire legacy or redundant applications.
  • Technology Modernization – These projects range from upgrades of packaged applications and database management systems to the replacement of old platforms with new ones.
  • All the above, repeatedly – In other words, data or app migrations and consolidations are not one-off projects. New projects pop up regularly, so users are better off in the long run if they staff, tool, and develop these projects with the future in mind.
Migration and consolidation projects affect more than applications and data:
  • Business Processes – The purpose of enterprise software is to automate business processes, to give the organization greater efficiency, speed, accuracy, customer service, and so on. Hence, migrating software is tantamount to migrating business processes, and a successful project executes without disrupting business processes.
  • Users of applications and data – These vary from individual people to whole departments and sometimes beyond the enterprise to customers and partners. A successful project defines steps for switching over users without disrupting their work.
Application or data migrations and consolidations are inherently risky. This is due to their large size and complexity, the numerous processes and people affected, the cost of the technology, and (even greater) the cost of failing to serve the business on time and on budget. If you succeed, you’re a hero or heroine. If you fail, the ramifications are dire for you personally and for the organization you work for.

Succeed with app/data migrations and consolidations. Success comes from combining the best practices of data management, solution development, and project management. Here are some of the critical success factors Rob and I discussed in the Webinar:
  • Go into the project with your eyes wide open – Realize there’s no simple “forklift” of data, logic, and users from one system to the next, because application logic and data structures often need substantial improvements to be fit for a new purpose on a new platform. Communicate the inherent complexities and risks, in a factual and positive manner, without sounding like a “naysayer.”
  • Create a multi-phased plan for the project – Avoid a risky “big bang” approach by breaking the project into manageable steps. Pre-plan by exploring and profiling data extensively (see the profiling sketch after this list). Follow a develop-test-deploy methodology. Coordinate with multi-phased plans from outside your data team, including those for applications, process, and people migration. Expect that old and new platforms must run concurrently for a while, as data, processes, and users are migrated in orderly groups.
  • Use vendor tools – Programming (or hand coding) is inherently non-productive as the primary development method for either applications or data management solutions. Furthermore, vendor tools enable functions that are key to migrations, such as data profiling, develop-test-deploy methods, full-featured interfaces to all sources and targets, collaboration for multi-functional teams, repeatability across multiple projects, and so on.
  • Template-ize your project and staff for repeatability – In many organizations, migrations and consolidations recur regularly. Put extra work into projects, so their components are easily reused, thereby assuring consistent data standards, better governance, and productivity boosts over time.
  • Staff each migration or consolidation project with diverse people – Be sure that multiple IT disciplines are represented, especially those for apps, data, and hardware. You also need line-of-business staff to coordinate processes and people. Consider staff augmentation via consultants and system integrators.
  • Build a data management competency center or similar team structure – From one center, you can staff data migrations and consolidations, as well as related work for data warehousing, integration, quality, database administration, and so on.
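Here is a minimal illustration of the data-profiling step mentioned above, sketched in Python with pandas. This is a hedged example, not a vendor tool or a prescribed method; the file name and the choice of metrics are hypothetical assumptions.

```python
# Minimal pre-migration data-profiling sketch (illustrative only).
# Assumes a hypothetical CSV extract of a legacy table; adjust the path to your source.
import pandas as pd

def profile_table(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize each column: type, completeness, cardinality, and sample values."""
    rows = []
    for col in df.columns:
        series = df[col]
        rows.append({
            "column": col,
            "dtype": str(series.dtype),
            "null_pct": round(series.isna().mean() * 100, 2),  # completeness
            "distinct": series.nunique(dropna=True),            # cardinality
            "samples": series.dropna().unique()[:3].tolist(),   # example values
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    legacy = pd.read_csv("legacy_crm_customers.csv")  # hypothetical extract
    print(profile_table(legacy).to_string(index=False))
```

Profiles like this, produced early, help size the cleansing and mapping work before the migration phases are locked into a schedule.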
If you’d like to hear more of my discussion with Informatica’s Rob Myers, please replay the Webinar from the Informatica archive.

Posted by Philip Russom, Ph.D. on March 11, 2015


Achieving Analytics Maturity: 3 Tips from the Experts

By Fern Halper, TDWI Research Director for Advanced Analytics

What does it take to achieve analytics maturity? Earlier this week, Dave Stodder and I hosted a webcast with a panel of vendor experts from Cloudera, MicroStrategy, and Tableau. These three companies are all sponsors of the Analytics Maturity Model, an analytics assessment tool that measures where your organization stands relative to its peers in terms of analytics maturity.

There were many good points made during the discussion. A few particularly caught my attention, because they focused on the organizational aspects of analytics maturity, which are often the most daunting.

Crawl, walk, run: TJ Laher, from Cloudera, pointed out that their customers often crawl, walk, and then run with analytics. I’ve said before that there is no silver bullet for analytics. TJ stressed the need for organizations to have a clear vision of strategic objectives and to start off with some early projects that might take place over a six-month time frame. He spoke about going deep with the use cases that you have and then becoming more advanced, such as bringing in new data types. Cloudera has observed that success in these early projects often helps to facilitate the walking and then ultimately the running (i.e., becoming more sophisticated) with analytics.

Short-term victories have long-term implications: Vijay Anand from MicroStrategy also touched upon the idea of early wins and pointed out that these can have long-term implications. He noted that it is important to think about these early victories in terms of what is down the road. For instance, say the business implements a quick BI solution. That’s great. However, business and IT need to work together to build a certified environment to avoid conflicting and non-standardized information. It is important to think it through.

IT builds the car and the business drives it: Ian Coe, from Tableau, also talked about IT and the business working together. He said that organizations achieve success and become mature when teams work together collaboratively on a number of prototypes using an agile approach. Tableau believes that the ideal model for empowering users involves a self-service BI approach. Business people are responsible for doing analysis. IT is responsible for managing and securing data. This elevates IT from the role of dashboard factory to architect and steward of the company’s assets. IT can work in quick cycles to give the business what it needs and check in with business users regularly.

Of course, each expert came to the discussion table with their own point of view. And, these are just some of the insights that the panel provided. The Webinar is available on demand at tdwi.org. I encourage you to listen to it and, of course, take the assessment!

Posted by Fern Halper, Ph.D. on February 6, 2015


Great Data for Great Analytics

Evolving Best Practices for Data Management

By Philip Russom, TDWI Research Director for Data Management

I recently broadcast a really interesting Webinar with David Lyle, a vice president of product strategy at Informatica Corporation. David and I had a “fireside chat” where we discussed one of the most pressing questions in data management today, namely: How can we prepare great data for great analytics, while still leveraging older best practices in data management? Please allow me to summarize our discussion.

Both old and new requirements are driving organizations toward analytics. David and I started the Webinar by talking about prominent trends:

  • Wringing value from big data: The consensus today says that advanced analytics is the primary path to business value from big data and other types of new data, such as data from sensors, devices, machinery, logs, and social media.
  • Getting more value from traditional enterprise data: Analytics continues to reveal customer segments, sales opportunities, and threats related to risk, fraud, and security.
  • Competing on analytics: The modern business is run by the numbers, not just gut feel, to study markets, refine differentiation, and identify competitive advantages.

The rise of analytics is a bit confusing for some data people. As experienced data professionals do more work with advanced forms of analytics (enabled by data mining, clustering, text mining, statistical analysis, etc.), they can’t help but notice that the requirements for preparing analytic data are similar to, but different from, those of their other projects, such as ETL for a data warehouse that feeds standard reports.

Analytics and reporting are two different practices. In the Webinar, David and I talked about how the two involve pretty much the same data management practices, but in different orders and priorities:

  • Reporting is mostly about entities and facts you know well, represented by highly polished data that you know well. Squeaky clean report data demands elaborate data processing (for ETL, quality, metadata, master data, and so on). This is especially true of reports that demand numeric precision (about financials or inventory) or will be published outside the organization (regulatory or partner reports).
  • Advanced analytics, in general, enables the discovery of facts you didn’t know, based on the exploration and analysis of data that’s probably new to you. Preparing raw source data for analytics is relatively simple, though it often happens at high levels of scale. With big data and other new data, preparation may be as simple as collocating large data sets on Hadoop or another platform suited to data exploration. When using modern tools, users can further prepare the data as they explore it, by profiling, modeling, aggregating, and standardizing data on the fly (see the sketch after this list).
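As an illustration of that on-the-fly preparation, here is a minimal PySpark sketch: read raw files as they sit on Hadoop, profile them quickly, and standardize and aggregate only what the analysis needs. This is a hedged sketch, not a specific vendor workflow; the HDFS path and column names are hypothetical assumptions.

```python
# Minimal schema-on-read sketch: explore and lightly prepare raw data as you go
# (illustrative only; the HDFS path and column names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("exploratory-prep").getOrCreate()

# Read the raw, collocated source files as-is, with no upfront ETL.
raw = spark.read.option("header", True).csv("hdfs:///landing/web_clickstream/")

# Profile on the fly: null rate per column for a quick feel of data quality.
raw.select([F.mean(F.col(c).isNull().cast("int")).alias(c) for c in raw.columns]).show()

# Standardize and aggregate only what the exploration needs.
prepared = (raw
    .withColumn("country", F.upper(F.trim(F.col("country"))))   # light standardization
    .groupBy("country", "product_id")
    .agg(F.count("*").alias("clicks")))                          # aggregate for analysis

prepared.show(20)
```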

Operationalizing analytics brings reporting and analysis together in a unified process. For example, once an epiphany is discovered through analytics (e.g., the root cause of a new form of customer churn), that discovery should become a repeatable BI deliverable (e.g., metrics and KPIs that enable managers to track the new form of churn in dashboards). In these situations, the best practices of data management apply to a lesser degree (perhaps on the fly) during the early analytic steps of the process, but then are applied fully during the operationalization steps.

Architectural ramifications ensue from the growing diversity of data and workloads for analytics, reporting, multi-structured data, real time, and so on. For example, modern data warehouse environments (DWEs) include multiple tools and data platforms, from traditional relational databases to appliances and columnar databases to Hadoop and other NoSQL platforms. Some are on premises and others are on clouds. On the downside, this results in high complexity, with data strewn across multiple platforms. On the upside, users get great data for great analytics by moving data to a platform within the DWE that’s optimized for a particular data type, analytic workload, price point, or data management best practice.

For example, a number of data architecture use cases have emerged successfully in recent years, largely to assure great data for great analytics:

  • Leveraging new data warehouse platform types gives analytics the high performance it needs. Toward this end, TDWI has seen many users successfully adopt new platforms based on appliances, columnar data stores, and a variety of in-memory functions.
  • Offloading data and its processing to Hadoop frees up capacity on EDWs. It also gives unstructured and multi-structured data types a platform that is better suited to their management and processing, all at a favorable cost point (see the sketch after this list).
  • Virtualizing data assets yields greater agility and simpler data management. Multi-platform data architectures too often entail a lot of data movement among the platforms. But this can be mitigated by federated and virtual data management practices, as well as by emerging practices for data lakes and enterprise data hubs.
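As a minimal sketch of the Hadoop-offloading pattern above, the following PySpark job copies a cold history table from a relational EDW to Parquet files on Hadoop. It is illustrative only; the JDBC URL, credentials, table name, partition column, and target path are hypothetical assumptions, not a reference to any particular product.

```python
# Minimal EDW-offload sketch: copy a cold history table to Hadoop as Parquet
# (illustrative only; connection details and names below are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("edw-offload").getOrCreate()

# Pull the candidate table from the warehouse over JDBC.
history = (spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://edw-host:5432/warehouse")
    .option("dbtable", "sales.order_history")
    .option("user", "etl_user")
    .option("password", "change-me")
    .load())

# Land it on Hadoop in a columnar format, partitioned for later processing.
(history.write
    .mode("overwrite")
    .partitionBy("order_year")      # hypothetical partition column
    .parquet("hdfs:///archive/sales/order_history/"))
```

Once the history lands on Hadoop, downstream processing can run there, and the warehouse keeps only the hot, frequently queried data.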

If you’d like to hear more of my discussion with Informatica’s David Lyle, please replay the Webinar from the Informatica archive.

Posted by Philip Russom, Ph.D. on February 2, 2015


Next-Generation Analytics: Four Findings from TDWI’s Latest Best Practices Report

I recently completed TDWI’s latest Best Practices Report: Next Generation Analytics and Platforms for Business Success. Although the phrase "next-generation analytics and platforms" can evoke images of machine learning, big data, Hadoop, and the Internet of Things (IoT), most organizations are somewhere in between the technology vision and today’s reality of BI and dashboards. For some organizations, next generation can simply mean pushing past reports and dashboards to more advanced forms, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis. The market is on the cusp of moving forward.

What are some of the newer next-generation steps that companies are taking to move ahead?

  • Moving to predictive analytics. Predictive analytics is a statistical or data mining technique that can be used on both structured and unstructured data to determine outcomes such as whether a customer will "leave or stay" or "buy or not buy." Predictive analytics models provide probabilities of certain outcomes (see the sketch after this list). Popular use cases include churn analysis, fraud analysis, and predictive maintenance. Predictive analytics is gaining momentum and the market is primed for growth, if users stick to their plans and if they can be successful with the technology. In our survey, 39% of respondents stated they are using predictive analytics today, and an additional 46% are planning to use it in the next few years. Often organizations move in fits and starts when it comes to more advanced analytics, but predictive analytics along with other techniques such as geospatial analytics, text analytics, social media analytics, and stream mining are gaining interest in the market.
  • Adding disparate data to the mix. Currently, 94% of respondents stated they are using structured data for analytics, and 68% are enriching this structured data with demographic data for analysis. However, companies are also getting interested in other kinds of data. Sources such as internal text data (today 27%), external Web data (today 29%), and external social media data (today 19%) are set to double or even triple in use for analysis over the next three years. Likewise, while IoT data is used by fewer than 20% of respondents today, another 34% are expecting to use it in the next three years. Real-time streaming data, which goes hand in hand with IoT data, is also set to grow in use (today 18%).
  • Operationalizing and embedding analytics. Operationalizing refers to making analytics part of a business process; i.e., deploying analytics into production. In this way, the output of analytics can be acted upon. Operationalizing occurs in different ways. It may be as simple as manually routing all claims that seem to have a high probability of fraud to a special investigation unit, or it might be as complex as embedding analytics in a system that automatically takes action based on the analytics. The market is still relatively new to this concept. Twenty-five percent have not operationalized their analytics, and another 15% stated they operationalize using manual approaches. Less than 10% embed analytics in system processes to operationalize it.
  • Investing in skills. Respondents cited the lack of skilled personnel as a top challenge for next-generation analytics. To overcome this challenge, some respondents talked about hiring fewer but more skilled personnel such as data analysts and data scientists. Others talked about training from within because current employees understand the business. Our survey revealed that many organizations are doing both. Additionally, some organizations are building competency centers where they can train from within. Where funding is limited, organizations are engaging in self-study.
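To make the predictive analytics and operationalization points above concrete, here is a minimal churn-scoring sketch with scikit-learn. It is a hedged illustration, not the report's methodology; the synthetic training data, feature meanings, 0.7 threshold, and routing rule are hypothetical assumptions.

```python
# Minimal predictive-analytics sketch: score churn probabilities, then operationalize
# by routing high-risk customers to a retention queue (illustrative only; the
# synthetic data, features, and 0.7 threshold are hypothetical assumptions).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Stand-in training data: three numeric features and a churned/not-churned label.
X = rng.normal(size=(1000, 3))
y = (X[:, 1] - X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Predictive analytics: a probability of the outcome, not just a yes/no label.
churn_prob = model.predict_proba(X_test)[:, 1]

# Operationalizing: act on the score inside a business process.
for customer_id, p in enumerate(churn_prob[:10]):
    if p > 0.7:  # hypothetical business threshold
        print(f"customer {customer_id}: churn risk {p:.2f} -> route to retention team")
```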

These are only a few of the findings in this Best Practices Report.

Download the Report

Posted by Fern Halper, Ph.D. on December 18, 2014


3 Reasons You Should Take the New TDWI Analytics Maturity Assessment

Analytics is hot: many organizations realize that it can provide an important competitive advantage. If your company wants to build an “analytics culture” where data analysis plays an essential role, your first step is to determine the maturity of your organization's analytics. To help organizations measure their progress in their analytics efforts, we recently developed the TDWI Analytics Maturity Model and Assessment, which provides a quick way for you to compare your progress to that of other companies.

Take the assessment and you’ll get the big picture of your current analytics program, where it needs to go, and where to concentrate your efforts to achieve your goals and gain value from your analytics. Download the guide to help analyze your scores.

So, why a maturity model for analytics?

  1. It provides a framework.

    Numerous studies indicate that acting on analytics has a top- and bottom-line impact. Yet some companies don't know where to start, and others don't know what to do next.

    Our analytics model provides a framework across five dimensions that are critical for analytics deployments: organization, infrastructure, data management, analytics, and governance.

    The model helps create structure around an analytics program and determine where to start. It also helps identify and define the organization’s goals around the program and creates a process to communicate that vision across the entire organization.

  2. It provides guidance.

    The model can provide guidance for companies at any stage of their analytics journey by helping them understand best practices used by companies more mature in their deployments.

    A maturity model provides a methodology to measure and monitor the state of the program and the effort needed to complete the current stage, as well as steps to move to the next stage of maturity. It serves as a kind of odometer for measuring and managing the pace of progress and adoption of an analytics program within the company.

  3. It provides a benchmark.

    Organizations want to understand how their analytics deployments compare to those of their peers so they can provide best-in-class insight and support. 

    We have created an online assessment that consists of 35 questions across the dimensions mentioned above. At the end of the assessment, you will get a score that lets you know how mature your current analytics deployment is. You will also receive your score in each of the dimensions, along with the average scores of others in your industry or company size. This is a great way to benchmark your analytics progress!

We invite you to read the benchmark guide and take the assessment!

Posted by Fern Halper, Ph.D. on November 6, 2014


3 Interesting Results from the Big Data Maturity Assessment

Almost a year has passed since the launch of the TDWI Big Data Maturity Model and assessment tool, which I co-authored with Krish Krishnan. To date, more than 600 respondents have participated in the assessment.

We asked questions in five categories relevant to big data:

  1. Organization: To what extent do your organizational strategy, culture, leadership, and funding support a successful big data program? What value does your company place on analytics?
  2. Infrastructure: How advanced and coherent is your architecture in support of a big data initiative? To what extent does your infrastructure support all parts of the company and potential users? How effective is your big data development approach? What technologies are in place to support a big data initiative, and how are they integrated into your existing environment?
  3. Data Management: How extensive is the variety, volume, and velocity of data used for big data analytics, and how does your company manage its big data in support of analytics? (This includes data quality and processing as well as data integration and storage issues.)
  4. Analytics: How advanced is your company in its use of big data analytics? (This includes the kinds of analytics utilized, how the analytics are delivered in the organization, and the skills to make analytics happen.)
  5. Governance: How coherent is your company’s data governance strategy in support of its big data analytics program?

Respondents answered 75 questions across these categories and were given a score for each category. Scores correlated with stages of maturity, including nascent, pre-adoption, early adoption, corporate adoption, and mature/visionary. (Get more information about the Big Data Maturity Model by downloading the guide.)

So what are we seeing? Where are organizations in terms of big data maturity?

In a word: early—at least for organizations that took this assessment. The majority self-reported that they had nothing in place for big data or were just in the experimentation phase.

When we averaged scores across all five dimensions, the mean scores put respondents between the pre-adoption and early adoption stages—when organizations are thinking about big data and may have some proof of concepts going.

However, here are three results worth noting:

  1. Respondents are not organized to execute. Only a small percentage of respondents are organized to execute on big data initiatives. About 25% of respondents had a road map or a strategy in place for big data. In addition, about one-quarter had some sort of funding process in place for dealing with big data projects. Although organizations scored higher on "soft" questions such as whether they thought they had an analytics culture in place, scores still put many in the pre-adoption phase.
  2. The data warehouse is cited most often as the infrastructure for big data. We asked respondents what kind of infrastructure they had in place for big data. A sign of maturity for big data is to take a hybrid ecosystem approach. In other words, organizations often have a data warehouse (or marts) in place, but supplement it with other tools for other jobs. Hadoop or an analytics platform might work in conjunction with the warehouse, for instance. Some organizations might use some infrastructure in the cloud. About one-third of respondents stated that their data warehouse was their core technology for big data. Another one-third stated they didn’t have a big data infrastructure. The rest had some combination of technologies in place, but they were often siloed.
  3. More advanced analytics are happening on big data, but in pockets. On the analytics front, organizations were often collecting data they weren’t analyzing. About half of the respondents stated that they were performing advanced analytics (i.e., predictive analytics or other advanced techniques), but it was happening in pockets. It was also a bit unclear whether they were analyzing big data as part of their advanced analytics efforts. Many respondents were still trying to put their big data teams together. Few had a center of excellence (COE), where ideas are shared, governance exists, and training takes place.

I’ll continue to share interesting and notable results from the Big Data Maturity Model Assessment. In the meantime, if you haven’t taken the assessment, I encourage you to check it out here.

What’s next? TDWI set to launch new analytics maturity model in late 2014

TDWI is excited to announce our plans to launch a new assessment, which we're calling the Analytics Maturity Model. Like the Big Data Maturity Model, this model has an assessment associated with it. Unlike the Big Data Maturity Model, this model focuses on analytics maturity across the spectrum of BI and analytics techniques and infrastructures. I’ll be writing more about this maturity model in the coming weeks. Stay tuned!

Posted by Fern Halper, Ph.D. on October 16, 2014