TDWI Blog

Comprehensive and Agile End-to-End Data Management

The trend toward integrated platforms of multiple tools and functions enables broader designs and practices that satisfy new requirements.

By Philip Russom, Senior Research Director for Data Management, TDWI

Earlier this week, I spoke in a webinar run by Informatica Corporation and moderated by Informatica’s Roger Nolan. I talked about trends in user practices and vendor tools that are leading us toward what I call end-to-end (E2E) data management (DM). My talk was based on three assumptions:

  1. Data is diversifying into many structures from new and diverse sources.
  2. Business wants to diversify analytics and other data-driven practices.
  3. End-to-end data management can cope with the diversification of data, analytics, and business requirements in a comprehensive and agile manner.

In our webinar, we answered a number of questions pertinent to comprehensive and agile end-to-end data management. Allow me to summarize some of the answers for you:

What is end-to-end (E2E) data management (DM)?

End-to-end data management is one way to adapt to data’s new requirements. In this context, “end-to-end” has multiple meanings:

End-to-end DM functions. Today’s diverse data needs diverse functions for data integration, quality, profiling, event processing, replication, data sync, MDM, and more.

End-to-end tool platform. Diverse DM functions (and their user best practices) must be enabled by a portfolio of many tools, which are unified in a single integrated platform.

End-to-end agility. With a rich set of DM functions in one integrated toolset, developers can quickly onboard data, profile it, and iteratively prototype, in the spirit of today’s agile methods.

End-to-end DM solutions. With multiple tools integrated in one platform, users can design single solutions that bring to bear multiple DM disciplines.

End-to-end range of use cases. With a feature-rich tool platform and equally diverse user skills, organizations can build solutions for diverse use cases, including data warehousing, analytics, data migrations, and data sync across applications.

End-to-end data governance. When all or most DM functions flow through one platform, governance, stewardship, compliance, and data standards are greatly simplified.

End-to-end enterprise scope. End-to-end DM draws a big picture that enables the design and maintenance of enterprise-scope data architecture and DM infrastructure.

What is the point of E2E DM?

End-to-end (E2E) data management (DM) is all about being comprehensive and agile:

  • Comprehensive -- All data management functions are integrated for development and deployment, with extras for diverse data structures and business-to-DM collaboration.
  • Agile -- Developers can quickly onboard diverse data and profile it, and business and technical people can iteratively prototype and collaborate, in today’s agile spirit.

What’s an integrated tool platform? What’s it for?

An integrated platform supports many DM tool types, but with tight integration across them. The end-to-end functionality seen in an integrated DM platform typically has a data integration and/or data quality tool at its core, with additional tools for master data management, metadata management, stewardship, changed data capture, replication, event processing, data exchange, data profiling, and so on.

An integrated platform supports modern DM architectures. For example, the old way of architecting a DM solution is to create a plague of small jobs, then integrate and deploy them via scheduling. The new way (which requires an integrated toolset) architects fewer but more complex solutions, where a single data flow calls many different tools and DM functions in a controlled and feature-rich fashion.
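
To make the contrast concrete, here is a minimal, vendor-neutral sketch of the new style in Python: one controlled data flow that calls profiling, quality, and master data functions in sequence, instead of dozens of separately scheduled jobs. The function names and the tiny dataset are hypothetical illustrations, not any product’s API.

```python
# A minimal sketch of one end-to-end data flow that invokes several DM
# functions in a controlled sequence. All names here are hypothetical.

def profile(records):
    """Gather simple column statistics before transforming anything."""
    stats = {}
    for rec in records:
        for field, value in rec.items():
            col = stats.setdefault(field, {"count": 0, "nulls": 0})
            col["count"] += 1
            if value in (None, ""):
                col["nulls"] += 1
    return stats

def cleanse(records):
    """A basic data quality step: trim whitespace, normalize case."""
    return [{k: v.strip().lower() if isinstance(v, str) else v
             for k, v in rec.items()} for rec in records]

def match_master(records, master_index):
    """A stand-in for an MDM match step: attach a master record ID."""
    for rec in records:
        rec["master_id"] = master_index.get(rec.get("email"))
    return records

def run_flow(source, master_index, target):
    """One data flow instead of dozens of scheduled jobs."""
    print("profile:", profile(source))                          # profiling
    target.extend(match_master(cleanse(source), master_index))  # quality + MDM + load

warehouse = []
run_flow([{"email": " Ann@Example.com ", "name": "Ann Lee"}],
         {"ann@example.com": 42}, warehouse)
print(warehouse)
```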

An integrated tool platform supports many, diverse use cases. Furthermore, the multiple integrated tools of the end-to-end platform support the agile reuse of people, skills, and development artifacts across use cases. Important use cases include: data warehousing, analytics, application modernization, data migration, complete customer views, right-time data, and real-time data warehousing.

How does an integrated toolset empower agile methods?

Support for multiple data disciplines in one integrated toolset means that developers can design one data flow (instead of dozens of jobs) that includes operations for integration, quality, master data, federation, and more.

The reuse of development artifacts is far more likely with one integrated toolset than with tools from multiple vendors.

Daily collaboration between a business subject-matter expert and a technical developer is the hallmark of agile development; an integrated DM platform supports this.

Feature-rich metadata management propels the collaboration of a business person (acting as a data steward) and a data management professional, and it also enables self-service access to data.

Self-service data access and data prep presented in a visual environment (as seen in mature integrated toolsets) can likewise propel the early prototyping and iterative development assumed of agile methods.

Automated testing and data validation can accelerate development. Manual testing distracts from the true mission, which is to build custom DM solutions that support the business.
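
As a sketch of what automated validation can look like, the checks below run as plain asserts or under pytest; the dataset and rules are hypothetical, and a mature DM platform would generate such checks rather than require hand-coding them.

```python
# A minimal sketch of automated data validation. The records and the
# three rules are illustrative assumptions, not a product's test suite.

records = [
    {"customer_id": 1, "email": "ann@example.com", "balance": 120.0},
    {"customer_id": 2, "email": "bob@example.com", "balance": 0.0},
]

def test_no_duplicate_keys():
    ids = [r["customer_id"] for r in records]
    assert len(ids) == len(set(ids)), "duplicate customer_id found"

def test_required_fields_present():
    for r in records:
        assert r.get("email"), f"missing email for customer {r['customer_id']}"

def test_balances_non_negative():
    assert all(r["balance"] >= 0 for r in records), "negative balance found"

if __name__ == "__main__":
    test_no_duplicate_keys()
    test_required_fields_present()
    test_balances_non_negative()
    print("all validation checks passed")
```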

Develop once, deploy at any latency. Reuse development artifacts, but deploy them at the speed required by specific business processes, whether batch, trickle feed, or real time.
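
A small sketch of the develop-once idea, under the assumption that the reusable artifact is a single transform function: the same transform is deployed by a batch runner and by a per-event handler for real time.

```python
# Sketch: one reusable transformation, deployed at two latencies.
# The transform and its field names are illustrative assumptions.

def transform(rec):
    """The development artifact that gets reused across deployments."""
    return {**rec, "amount_usd": round(rec["amount"] * rec["fx_rate"], 2)}

def run_batch(records):
    """Nightly batch deployment: transform a whole extract at once."""
    return [transform(r) for r in records]

def on_event(rec, sink):
    """Real-time deployment: apply the same transform per arriving event."""
    sink.append(transform(rec))

print(run_batch([{"amount": 100, "fx_rate": 1.1}]))   # batch
stream_sink = []
on_event({"amount": 5, "fx_rate": 0.9}, stream_sink)  # real time / trickle
print(stream_sink)
```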

Reinventing the wheel bogs down development. Mature integrated toolsets include rich libraries of pre-built interfaces, mappings, and templates that plug and play to boost developer productivity and agility.

What’s the role of self-service in agile development methods?

Self-service data access for business users. For example, think of a business person who also serves as a data steward and therefore needs to browse data. Or consider a business analyst who is capable of ad hoc queries, when given the right tools.

Data prep for business users, analytics, and agility. Users want to work fast and independently – at the speed of thought – without the need for time-consuming data management development. To enable this new best practice, the tools and platforms that support self-service data access now also support data prep, which is a form of data integration, but trimmed down for reasons of agility, usability, and performance.

Self-service and data prep for technical users. For example, self-service data exploration can be a prelude to the detailed data profiling of new data. As another example, the modern, agile approach to requirements gathering involves a business person (perhaps a steward) and a data professional, working side-by-side to explore data and decide how best to get business value from the data.

What’s the role of metadata in self-service and agile functionality?

We need complete, trusted metadata to accomplish anything in DM, and DM isn’t agile when development time is burned up creating metadata. Hence, a comprehensive E2E DM platform must support multiple forms of metadata:

  • Technical metadata – documents properties of data for integrity purposes. Required for computerized processes and their interfaces.
  • Business metadata – describes data in ways biz people understand. Absolutely required for self-service data access, team collaboration, and development agility.
  • Operational metadata – records access by users and apps. Provides an audit trail for assuring compliance, privacy, security, and governance relative to data.
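
As an illustration of how the three forms differ, here is a sketch that models each as a simple record type; the field names are assumptions for the example, not a standard metadata schema.

```python
# Sketch: the three forms of metadata modeled as simple record types.
# Field names are assumptions for this example, not a standard schema.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TechnicalMetadata:
    column: str
    data_type: str          # properties needed for data integrity
    nullable: bool
    source_system: str

@dataclass
class BusinessMetadata:
    column: str
    business_name: str      # the term a business person would recognize
    definition: str
    steward: str            # who answers for this data

@dataclass
class OperationalMetadata:
    column: str
    accessed_by: str        # user or application, for the audit trail
    operation: str = "read"
    accessed_at: datetime = field(default_factory=datetime.now)

tech = TechnicalMetadata("cust_nm", "VARCHAR(80)", False, "CRM")
biz = BusinessMetadata("cust_nm", "Customer Name",
                       "Legal name of the customer of record", "J. Doe")
ops = OperationalMetadata("cust_nm", accessed_by="reporting_app")
```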

If you’d like to hear more, please click here to replay the Informatica Webinar.

Posted on June 30, 2016


Data Warehouse Modernization: An Overview in 30 Tweets

By Philip Russom, Senior Research Director for Data Management, TDWI

To help you better understand what data warehouse (DW) modernization is, what variations it takes, who’s doing it, and why, I’d like to share with you the series of 30 tweets I recently issued on the topic. I think you’ll find the tweets interesting, because they provide an overview of data warehouse modernization in a form that’s compact, yet amazingly comprehensive.

Each tweet below is a short sound bite or stat bite drawn from the recent TDWI report “Data Warehouse Modernization in the Age of Big Data and Analytics,” which I researched and wrote. Many of the tweets focus on a statistic cited in the report, while other tweets are definitions stated in the report.

I left in the arcane acronyms, abbreviations, and incomplete sentences typical of tweets, because I think that all of you already know them or can figure them out. Even so, I deleted a few tiny URLs, hashtags, and repetitive phrases. I issued the tweets in groups on related topics, so I’ve added some headings to this blog to show that organization. Otherwise, these are raw tweets. Enjoy!

Introduction to Data Warehouse Modernization
1. #DataWarehouse #Modernization ranges widely: upgrades; new subject areas; more platforms etc.
2. #DataWarehouse #Modernization is real. 76% of DWs are evolving dramatically or moderately.
3. 89% of #TDWI survey respondents say #DataWarehouse #Modernization is opp for innovation.

State of Data Warehouse Modernization
4. 91% of users surveyed find #DataWarehouse #Modernization extremely or moderately important.
5. Half of users surveyed say #DataWarehouse is up-to-date. Other half is behind. Both need modernizing.
6. 88% of users surveyed say #DataWarehouse still relevant to how mgt runs biz.

Drivers of Data Warehouse Modernization
7. #DWE #Modernization drivers = aligning DW w/biz; scaling to #BigData; new analytic apps; new tools & data types.
8. #DataWarehouse #Modernization fixes problems w/ DW focus, design, architecture, platform.
9. Modernize DW to leverage new types of data (unstruc,sensors,GPS) & tools (#Hadoop,CEP, cloud,SaaS).

Types of Data Warehouse Modernization
10. Continuous #Modernization is about regular recurring updates & extensions of a #DataWarehouse.
11. Disruptive #DataWarehouse #Modernization is about rip-&-replace of major datasets, platforms, tools.
12. Optimization #Modernization is about remodeling data, interfaces, processing for DW performance.

Benefits and Barriers for Data Warehouse Modernization
13. Leading beneficiaries of DW #Modernization = analytics; biz mgt; #RealTime operations.
14. Leading barriers to DW mod = problems w/ governance, staffing, funding, designs & platforms.
15. #Modernization also needed for systems DW integrates with = reporting, #analytics, #DataIntegration.

Trends in Data Warehouse Modernization
16. No.1 #Modernization trend is toward #DataWarehouse Environments (#DWEs) with multiple standalone data platforms.
17. Improving DW system arch (adding/replacing data platforms) is most common DW #modernization.
18. Platforms added to #DWE are based on column, appliance, event proc, adv’d analytics, #Hadoop.

User Plans for Data Warehouse Modernization
19. Half of org’s surveyed plan to leave current DW platform in place & add complementary platforms.
20. Half of org’s surveyed plan to rip out current DW platform & replace it within 3 to 4 years.
21. Very few users surveyed lack a plan or strategy for #DataWarehouse #Modernization.

Data Warehouse Modernization’s Effect on Architecture
22. #Modernization has reduced the number of single-DBMS-instance #DataWarehouses. Down to 19%.
23. Multi-platform #DataWarehouse Environment (#DWE) is norm for DW sys arch; 34% today.
24. Extreme #DataWarehouse Environment (#DWE) with LOTS of platforms will become sys arch norm in 3 yrs.

Hadoop’s Role in Data Warehouse Modernization, Part 1
25. #Hadoop is often deployed to modernize a DW or #DWE. Orgs w/#Hadoop in #DWE will double in 3yrs.
26. For early adaptors, #Hadoop is DW/#DWE complement, not replacement.
27. #Modernization via #Hadoop helps address “exotic” data: non-relational, unstruc, social, sensors.

Hadoop’s Role in Data Warehouse Modernization, Part 2
28. Modern DW of future will still have relational DBMS at core. But probably integrate w/#Hadoop too.
29. #Hadoop’s relational functions will improve greatly; more likely as DW replacement in 3 to 5 yrs.
30. A few users surveyed think #Hadoop will grow larger than DW but not replace it. 2% now; 14% in 3yrs.

Want to learn more about Data Warehouse Modernization?
For a more detailed discussion – in a traditional publication! – get the TDWI Best Practices Report, titled “Data Warehouse Modernization in the Age of Big Data and Analytics,” which is available in a PDF file via a free download.

You can also register for and replay the TDWI Webinar, where I discussed the findings of the TDWI report.

Posted on June 8, 2016


Highlights from Informatica World 2016

Bigger than ever, with more user speakers and an impressive executive vision for product R&D

By Philip Russom, Senior Research Director for Data Management, TDWI

I just spent three days attending and speaking at Informatica World 2016 in San Francisco’s Moscone Center. Compared to previous years, this year’s event was bigger than ever, with over three thousand people in attendance and five or more simultaneous break-out tracks.

The change this year that I like most is the increased number of user case study speakers – almost double last year! To be honest, that’s my favorite part of any event, although I also like hearing executives explain their product vision and direction. With that in mind, allow me to share some highlights in those two areas, based on sessions I was able to attend at Informatica World 2016.

User Case Studies

I had the honor of sharing the stage with data integration veteran Tom Kato of Republic Services. Based on my research at TDWI, I talked about users’ trends toward integrated platforms that include tools for many data disciplines from a single vendor, as opposed to silo’d tools from multiple vendors. Tom talked about how an integrated tool strategy has played out successfully for his team at Republic Services. By adopting a comprehensive end-to-end toolset from Informatica, it was easier for them to design a comprehensive data architecture, with information lifecycle management that extends from data creation to purge.

I heard great tips by a speaker from Siemens about how their data lake is successful due to policies governing who can put data in the lake, what kind of data is allowed, and how the data is tagged and cataloged. “We saved six to twelve months by using simple flat schema in the data lake,” he said. “Eventually, we’ll add virtual dimensional models to some parts of the data lake to make it more like a data warehouse.”

A speaker from Harvard Business Publishing described a three-year migration and consolidation project, where they moved dozens of applications and datasets to clouds, both on premises and off-site (including AWS). They feel that Informatica Cloud and PowerCenter helped them move to clouds very quickly, which reduced the time that old and new systems ran concurrently with synchronization, which in turn reduced the costs and risks of migration.

Red Hat’s data warehouse architect explained his strategy for data warehouse modernization, based on modern data platforms, hybrid mixtures of clouds, complete views of customers, virtual technologies, and agile methods. Among those, clouds are the secret sauce – including Informatica Cloud, AWS, Redshift, and EC2 – because they provide the elasticity and performance Red Hat needs for the variety of analytic, reporting, and virtual workloads they run.

A dynamic duo from Verizon’s data warehouse team laid out their methods for success with clickstream analytics. They follow Gartner’s Bimodal IT approach, where old and new systems coexist and integrate. New tools capture and process clickstreams, and these are correlated with historic data in the older data warehouse. This is enabled by a hybrid architecture that integrates a mature Teradata implementation and a new Hadoop cluster, via data integration infrastructure by Informatica.

Another dynamic duo explained why and how they use Informatica Data Integration Hub (or simply DI Hub). “As a best practice, a data integration hub should connect four key entities,” said one of the Humana reps. “Those are source applications, publications of data, people who subscribe to the data, and a catalog of topics represented in the data.” Humana chose Informatica DI Hub because it suits their intended best practice, plus it supports additional requirements for a data fabric, virtual views, a canonical model, data audit, and self-service.

Executive Vision for Product R&D

The general sessions mostly featured keynote addresses by executives from Informatica and leading partner firms. For example, Informatica’s CEO Anil Chakravarthy discussed how Informatica technology is supporting Data 3.0, an emerging shift in data’s sources, types, technical management, and business use.

All the executive speakers were good, but I got the most out of the talk by Amit Walia, Informatica’s Chief Product Officer. It was like drinking from the proverbial fire hose. Walia announced one new product, release, or capability after another, including new releases of Informatica Cloud, Big Data Management, Data Integration Hub, and Master Data Management (with a cloud edition). Platform realignments are seen in Informatica Intelligent Data Platform (with Hadoop as a compute engine, controlled by a new Smart Executor) and Informatica Intelligent Streaming (based on Hadoop, Spark, Kafka, and Blaze); these reveal a deep commitment to modern open source software (OSS) in Informatica’s tool development strategy. One of Walia’s biggest announcements was the new Live Data Map, which will provide a large-scale framework for complex, multi-platform data integration, as is increasingly the case with modern data ecosystems.

That’s just a sample of what Amit Walia rolled out, and yet it’s a tsunami of new products and releases. So, what’s up with that? Well, to me it means that the acquisition of Informatica last year (which made it a private company) gave Informatica back the mojo that made it famous, namely a zeal and deep financial commitment to product research and development (R&D). Informatica already has a broad and comprehensive integrated platform, which addresses just about anything you’d do in traditional data management. But, with the old mojo for R&D back, I think we’ll soon see that portfolio broaden and deepen to address new requirements around big data, machine data, analytics, IoT, cloud, mobile, social media, hubs, open source, and security.

Informatica customers have always been the sort to keep growing into more data disciplines, more data types and sources, and the business value supported by those. In the near future, those users will have even more options and possibilities to grow into.

Further Learning

To get a feel for Informatica World 2016, start with a one-minute overview video.

However, I strongly recommend that you “drink from the fire hose” by hearing Amit Walia’s 40-minute keynote, which includes his amazing catalog of new products and releases.

You might also go to www.YouTube.com and search for “Informatica World 2016,” where you’ll find many useful speeches and sessions that you can replay. For something uplifting, search for Jessica Jackley’s keynote about micro loans in the third world.

Posted on May 31, 2016


Modernizing Business-to-Business Data Exchange

Keep pace with evolving data and data management technologies, plus the evolving ecosystem of firms with whom you do business.

By Philip Russom, TDWI Research Director for Data Management

Earlier this week, I spoke in a webinar run by Informatica Corporation, along with Informatica’s Daniel Rezac and Alan Lundberg. Dan, Alan, and I talked about trends and directions in a very interesting data management discipline, namely business-to-business (B2B) data exchange (DE). Like all data management disciplines, B2B DE is modernizing to keep pace with evolving data types, data platforms, and data management practices, as well as evolving ways that businesses leverage exchanged data to onboard new partners and clients, build up accounts, improve operational efficiency, and analyze supply quality, partner profitability, procurement costs, and so on.

In our webinar, we answered a number of questions pertinent to the modernization of B2B DE. Allow me to summarize those for you:

What is business-to-business (B2B) data exchange (DE)?

It is the exchange of data among operational processes and their applications, whether in one enterprise or across multiple ones. A common example would be a manufacturing firm and the ecosystem of supplier and distributor companies around it. In such examples, many enterprises are involved. However, large firms with multiple, independent business units often practice B2B DE as part of their inter-unit communications within a single enterprise. Hence, B2B DE scales up to global partner ecosystems, but it also scales down to multiple business units of the same enterprise.

B2B DE integrates data across two or more businesses, whether internal or external. But it also integrates an ecosystem of organizations as it integrates data. Therefore, B2B DE is a kind of multi-organizational collaboration. And the collaboration is enabled by the transfer of datasets, documents, and files that are high quality, trusted, and standardized. Hence, there’s more than data flowing through B2B data exchange infrastructure. Your business flows through it, as well.

What are common industries and use cases for B2B DE?

The business ecosystems enabled by B2B DE are often industry specific, as with a manufacturer and its suppliers. The manufacturing ecosystem becomes quite complex, when we consider that it can include several manufacturers (who may work together on complex products, like automobiles) and that many suppliers are also manufacturers. Then there are financiers, insurers, contractors, consultants, distributors, shippers, and so on. The data and documents shared via B2B DE are key to establishing these diverse business relationships, then growing and competing within the business ecosystem.

The retail ecosystem is equally complex. A retailer does daily business with wholesalers and distributors and may also buy goods directly from manufacturers. All these partners may also work with other retailers. A solid hub for B2B DE can provide communications and integration infrastructure for all.

Other examples of modern business practices relying on B2B DE include subrogation in insurance, trade exchanges in various industries, and the electronic medical record, HL7 standards, and payer activities in healthcare.

Why is B2B DE important?

In the industries and use cases referenced above, much of the business is flowing through B2B DE; therefore users should lavish upon it ample resources and modernization. Furthermore, B2B DE involves numerous technical interfaces, but it also is a metaphorical interface to the companies with whom you need to do business.

What’s the state of B2B DE?

There are two main problems with the current state:

B2B DE is still low-tech or no-tech in many firms. It involves paper, faxes, FedEx packages, poorly structured flat files, and ancient interfaces such as electronic data interchange (EDI) and file transfer protocol (FTP). These are all useful, but they should not be the primary media. Instead, a modern B2B DE solution is online and synchronous, ideally operating in real time or close to it, while handling a wide range of data and document formats. Without these modern abilities, B2B relationships are slow to onboard and inflexible over time.

B2B DE is still too silo’d. Whether packaged or home-grown, applications for supply chain and procurement are usually designed to be silos, with little or no interaction with other apps. One way to modernize these apps is to deploy a fully functional data integration (DI) infrastructure that integrates data from supply chain, procurement, and related apps with other enterprise applications, whether for operations or analytics. With a DI foundation, modernized B2B DE can contribute information to other apps (for a more complete view of partners, supplies, etc.) and analytic data (for insights into B2B relationships and activities).

What’s driving users to modernize B2B DE?

Business ecosystems create different kinds of “peer pressure.” For example, if your partners and clients are modernizing, you must too, so you can keep doing business with them and grow their accounts. Likewise, if competitors in the ecosystem are modernizing, you must too, to prevent them from stealing your business. Similarly, data standards and technical platforms for communicating data and documents evolve over time. To continue to be a “player” in an ecosystem, you must modernize to keep pace with the evolution.

Cost is also an important driver. This is why many firms are scaling down their dependence on expensive EDI-based legacy applications and the value-added networks (VANs) they often require. The consensus says that systems built around XML, JSON, and other modern standards are more feature-rich, more agile, and easier to integrate with the enterprise.

Note that some time-sensitive business practices aren’t possible without B2B DE operating in near-real time, such as just-in-time inventory in the retail industry and outsourced material management in manufacturing. For this reason, the goal of many modernizations is to add more real-time functions to a B2B DE solution.

Self-service is a driver, too. Business people who are domain experts in supply chain, procurement, material management, manufacturing, etc. need self-service access, so they can browse orders, negotiations, shipments, build plans, and more, as represented in B2B documents and data. Those documents and datasets are infamous for data quality problems, noncompliance with standards, and other issues demanding human intervention; so domain experts need to remediate, onboard, and route them in a self-service fashion.

Why are data standards and translations key to success with B2B DE?

The way your organization models data is probably quite different from how your partners and clients do it. For this reason, B2B DE is regularly accomplished via an exchange data model and/or document type. Many of these are industry specific, as with SWIFT for financials and HL7 for healthcare. Many are “de jure” in that they are adjudicated by a standards body, such as the American National Standards Institute (ANSI) or the International Organization for Standardization (ISO). However, it’s equally common that partners come together and design their own ad hoc standards.

With all that in mind, your platform for B2B DE should support as many de jure standards as possible, out of the box. But it must also have a development environment where you can implement ad hoc standards. In addition, translating between multiple standards can be a critical success factor; so your platform should include several pre-built translators, as well as development tools for creating ad hoc translations.
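
A minimal sketch of an ad hoc translation, assuming a partner flat-file record and an agreed field mapping; unmapped fields are set aside for a steward to review rather than silently dropped. All names here are hypothetical.

```python
# Sketch of an ad hoc standards translation: a partner's flat-file
# record mapped to a canonical internal model. All names hypothetical.

PARTNER_TO_CANONICAL = {
    "PO_NUM": "purchase_order_id",
    "SHIP_DT": "ship_date",
    "QTY": "quantity",
}

def translate(partner_record, mapping):
    """Rename fields per the agreed mapping; set aside anything unmapped."""
    canonical, unmapped = {}, {}
    for name, value in partner_record.items():
        if name in mapping:
            canonical[mapping[name]] = value
        else:
            unmapped[name] = value   # route to a data steward for review
    canonical["_unmapped"] = unmapped
    return canonical

doc = translate({"PO_NUM": "4711", "SHIP_DT": "2016-04-29",
                 "QTY": "12", "X9": "?"}, PARTNER_TO_CANONICAL)
print(doc)
```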

What are some best practices and critical success factors for B2B DE?

  • Business-to-business data exchange is critical to your business. So give it ample business and technical resources, and modernize it to remain competitive in your business ecosystem.
  • Remember that B2B DE is not just about you. Balance the requirements of clients, partners, competitors, and (lastly) your organization.
  • Poll the ecosystem you operate in to keep up with its changes. As partners, clients, and competitors adopt new standards and tools, consider doing the same.
  • Mix old and new B2B technologies and practices. Older low-tech and EDI-based systems will linger. But you should still build new solutions on more modern platforms and data standards. The catch is to integrate old and new, so you support all parties, regardless of the vintage of tech they require.
  • Build a business case for B2B data exchange. To get support for modernization, identify a high-value use case (e.g., enterprise integration, real time, pressure from partners and competition), and find a business sponsor who also sees the value.

If you’d like to hear more of my discussion with Informatica’s Daniel Rezac and Alan Lundberg, you can replay the Informatica Webinar.

Posted on April 29, 2016


Modernizing Data Integration and Data Warehousing with Data Hubs

As data and its management continue to evolve, users should consider a variety of modernization strategies, including data hubs.

By Philip Russom, TDWI Research Director for Data Management

This week, I spoke in a webinar run by Informatica Corporation, sharing the stage with Informatica’s Scott Hedrick. Scott and I had an interactive conversation where we discussed modernization trends and options, as faced today by data management professionals and the organizations they serve. Since data hubs are a common strategy for capturing modern data and for modernizing data integration architectures, we included a special focus on hubs in our conversation. We also drilled into how modern hubs can boost a variety of use cases in analytics and in data integration across operational applications.

Scott and I organized the webinar around a series of questions. Please allow me to summarize the webinar by posing the questions with brief answers:

What is data management modernization?

It’s the improvement of tools, platforms, and solutions for data integration and other data management disciplines, plus the modernization of both technical and business users’ skills for working with data. Modernization is usually selective, in that it may focus on server upgrades, new datasets, new data types, or how all the aforementioned satisfy new data-driven business requirements for new analytics, complete views, and integrating data across multiple operational applications.

What trends in data management drive modernization?

Just about everything in and around data management is evolving. Data itself is evolving into more massive volumes of greater structural diversity, coming from more sources than ever and generated faster and more frequently than ever. The way we capture and manage data is likewise evolving, with new data platforms (appliances, columnar databases, Hadoop, etc.) and new techniques (data exploration, discovery, prep, lakes, etc.). Businesses are evolving, too, as they seek greater business value and organizational advantage from growing and diversifying data – often through analytics.

What is the business value of modernizing data management?

A survey run by TDWI in late 2015 asked users to identify the top benefits of modernizing data. In priority order, they noted improvements in analytics, decision making (both strategic and operational), real-time reporting and analytics, operational efficiency, agile tech and nimble business, competitive advantage, new business requirements, and complete views of customers and other important business entities.

What are common challenges to modernizing data management?

The TDWI survey mentioned above uncovered the following challenges (in priority order): poor stewardship or governance, poor quality data or metadata, inadequate staffing or skills, funding or sponsorship, and the growing complexity of data management architectures.

What are the best practices for modernizing data management?

First and foremost, ensure that the modernization of data management aligns with the stated goals of the organization, which in turn assures sponsorship and a return on the investment. Replace, update, or redesign one component of data management infrastructure at a time to avoid a risky big-bang project. Don’t forget to modernize your people by training them in new skills and officially supporting new competencies on your development team. Modernization may lead you to embrace best practices that are new to you. Common ones today include: agile development, lightweight data prep, right-time data movement, multiple ingestion techniques, non-traditional data, and new data platform types.

As a special case, TDWI sees various types of data hubs playing substantial roles in data management modernization, because they can support a wide range of datasets (from landing to complete views to analytics) and do so with better and easier data governance, audit trail, and collaboration. Plus, modernizing your data management infrastructure by adding a data hub is an incremental improvement, instead of a risky, disruptive rip-and-replace project.

What’s driving users toward the use of modern data hubs?

Data integration based on a data hub eliminates two of the biggest problems in data management design and development: point-to-point interfaces (which limit reuse and standards, and are nearly impossible to maintain or optimize) and traditional waterfall or similar development methods (which take months to complete and are difficult to keep aligned with business goals).

What functions and benefits should users expect from a vendor-built data hub?

Vendor-built data hubs support advanced functions that are impossible for most user organizations to build themselves. These functions include: controlled and governable publish and subscribe methods; the orchestration of workflows and data flows across multiple systems; easy-to-use GUIs and wizards that enable self-service data access; and visibility and collaboration for both technical and business people across a range of data.
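
To illustrate the publish-and-subscribe method in miniature, the sketch below shows a toy hub where publishers push a dataset to a topic, subscribers receive it, and every hand-off is logged as an audit trail. It is an illustration of the pattern, not any vendor hub’s actual API.

```python
# A minimal publish/subscribe hub: sources publish once, consumers
# subscribe by topic, and each hand-off is recorded for governance.
from collections import defaultdict

class DataHub:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> delivery callbacks
        self.audit_log = []                   # operational metadata

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, dataset, publisher):
        self.audit_log.append((publisher, topic, len(dataset)))
        for deliver in self.subscribers[topic]:
            deliver(dataset)

hub = DataHub()
hub.subscribe("customers", lambda ds: print("warehouse got", len(ds), "rows"))
hub.subscribe("customers", lambda ds: print("CRM sync got", len(ds), "rows"))
hub.publish("customers", [{"id": 1}, {"id": 2}], publisher="billing_app")
print(hub.audit_log)
```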

Data hubs are great for analytics. But what about data hubs for operational applications and their data?

Instead of consolidating large operational applications in a multi-month or multi-year project, some users integrate and modernize them quickly at the data level via a shared data hub, perhaps on a cloud. For organizations with multiple customer-facing applications for customer relationship management (CRM) and salesforce automation (SFA), a data hub can provide a single, trusted version of customer data, which is replicated and synchronized across all these applications. A data hub also adds functions that users of operational applications can use to extend their jobs, namely self-service data access and collaboration over operational data.

What does a truly modern data hub offer as storage options?

Almost all home-grown data hubs and most vendor-built hubs are based on one brand of relational database management system, despite the fact that data’s schema, formats, models, structures, and file types are diversifying aggressively. A modern data hub must support relational databases (because these continue to be vital for data management), but also support newer databases, file systems, and – very importantly – Hadoop.

If you’d like to hear more of my discussion with Informatica’s Scott Hedrick, please click here to replay the Informatica Webinar.

Posted on March 29, 2016


Top Twelve Priorities for Data Warehouse Modernization

By Philip Russom, Senior Research Director for Data Management, TDWI

No matter the vintage or sophistication of your organization’s data warehouse (DW) and the environment around it, it probably needs to be modernized in one or more ways. That’s because DWs and requirements for them continue to evolve. Many users need to get caught up by realigning the DW environment with new business requirements and technology challenges. Once caught up, they need a strategy for continuous modernization.

To help you organize your modernization efforts, here’s a list of the top twelve priorities for data warehouse modernization, including a few comments about why these are important. Think of the priorities as recommendations, requirements, or rules that can guide user organizations into successful strategies for implementing a modernization project.

1. Embrace change. Data warehouse modernization is real; a recent TDWI survey says that 76% of DWs are evolving moderately or dramatically. Given the rampant amount of change in markets and individual businesses, it’s unlikely the status quo will serve you and your organization for much longer. Besides, change is an opportunity for improvement, as long as you manage it with specific directions in mind.

2. Make realignment with business goals your top priority. This is the leading driver according to a recent TDWI survey. Learn the goals of the business and collaborate with business and technical people to determine how business goals map to technology and data. Then base your modernizations on the requirements thus defined. If alignment is achieved, the whole business will modernize, not just the warehouse. And that’s the real point.

3. Make DW capacity a high priority on the technology side. The second most pressing driver is greater capacity for growing data, users, and reports. This is no surprise given the explosive growth of traditional enterprise data and new big data. 3-10TB is today’s norm for DW data volume in the average-size organization; however, the norm will soon become 10-100TB as DW programs graduate from lesser data volumes to greater ones. These are known capacity goals for successful DWs, so keep them in mind when planning capacity modernization.

4. Make analytics a priority, too. One third of DW professionals modernize for better and newer analytics. That’s a technology challenge for the warehouse, since diverse analytic techniques have diverse data preparation requirements, and they don’t all fit the traditional warehouse. Therefore, additional data platforms and tools that complement older ones may be in order. Keep in mind that analytics is what business users want; your pristine data and elegant architecture won’t mean much if modernization fails to deliver relevant analytics.

5. Don’t forget the related systems and disciplines that also need modernization. Top priorities are analytics, reporting, and data integration, followed by development methods and team characteristics. Align the modernization of the DW, so it can ably provision the data in a manner that these other disciplines require for their success.

6. Don’t be seduced by new, shiny objects. There are lots of new and cool technologies and tools available today, and many get evaluated for DW modernization. Before adopting one, be sure it goes beyond the bling to satisfy real-world requirements in a performant and cost-effective manner.

7. Assume that you’ll need multiple manifestations of modernization. To get the desired results, you should consider multiple modernization strategies, but try not to execute them all at once, in a big bang.

8. Be familiar with today’s tools and techniques for the modern data warehouse environment (DWE). Extending the number and type of standalone platforms within a DWE is one of the strongest trends in data warehouse modernization, because it adds value in the form of additional platforms, without ripping out or replacing established platforms.

9. Adjust the large-scale architecture of your DWE. The rise of the multi-platform DWE is forcing the modernization of system architectures. For most situations, you will keep and improve your centralized, relational DW. But you should expect to complement it with other platforms, then migrate data and balance workloads among platforms. This requires you to rework the large-scale architecture, which determines how diverse platforms integrate and interoperate, plus which data goes where and how data should flow among platforms.

10. Reevaluate your DW platform. The condition of your data is important, but it’s all for naught if the platform can’t capture, manage, and deliver data with speed, scale, and broad functionality at a reasonable cost. Replacing a DW platform is disruptive and expensive for a business. Therefore, consider leaving your existing DW platform in place, but update it and complement it with other systems. Even so, grossly deficient or outmoded platforms should be replaced.

11. Consider Hadoop for various roles in the DWE. Hadoop’s massive and cheap storage offloads older systems by taking responsibility for data staging, ELT pushdown, and the archiving of detailed source data (retained for advanced analytics). Hadoop also serves as a massively parallel execution engine for a wide variety of set-based and algorithmic analytic methods. Conventional wisdom says Hadoop usually complements a DW without replacing it. That’s what early adopters do with Hadoop in DWEs today. And the number of organizations integrating Hadoop with a DW continues to increase.

12. Develop plans and recurring cycles for DW modernization. Most DW teams have settled on a quarterly schedule for updating DWs. This applies to tasks of many sizes; well-contained phases of some modernization projects may fit this scheme, as well. However, large-scale modernizations typically need their own plan. The more disruptive a modernization (such as rip-and-replace), the more critical to success is the multi-phase plan (sometimes the multi-year plan). Modernization affects business users and their processes; for minimal disruption, business managers should be involved in developing and executing modernization plans.

ANNOUNCEMENT

To learn more about modernizing data warehouses and related IT systems, attend my TDWI webinar Data Warehouse Modernization in the Age of Big Data Analytics, coming up on April 14, 2016. Register online for the webinar: http://bit.ly/DWMod16

This webinar will quantify trends in data warehouse modernization and catalog technologies that are relevant. It will also document strategies and user best practices for organizing modernization projects. The goal is to help DW professionals and their business counterparts plan the next generation of their data warehouse, in alignment with business goals.

Posted on March 24, 2016