
Question and Answer with the Experts

A business intelligence or data warehouse implementation can be a formidable undertaking. In these pages, leading business intelligence and data warehousing solution providers share their answers to the questions they hear often from industry professionals. Mark Hammond, an independent consultant, provides his analyst viewpoint to each Q&A.

arcplan, Inc.

“Process-centric BI” and “operational BI” are both hot topics these days. Since operations are made up of processes, is there a difference?

Yes, and the difference lies in the level of the activity. With operational BI, vendors are extending BI from the strategic and tactical levels into the operational level, where BI has not been used—until now. Process-centric BI, on the other hand, does not fit neatly within a particular level. Instead, it follows business processes, and the underlying data, through different levels. In this way, process-centric BI can be strategic, tactical, and operational, all in the same application.

Analyst Viewpoint

Another distinction is that operational BI is usually most relevant to hands-on systems such as call centers and business performance management (BPM) or business activity monitoring (BAM), whereas process-centric BI lends itself to intelligently automating such processes as supply chain and pricing based on business rules. Distinctions aside, both operational and process-centric BI share the same objective of reducing data latency from weeks or days to hours or minutes to enable more precise business execution. The service-oriented architecture offers IT organizations a way to non-invasively embed analytics in both operational applications and processes to minimize impact on business-critical systems.

Business Objects

What components make up an EIM technical framework?

Enterprise information management (EIM) is the strategy, practices, and technologies needed to deliver a comprehensive approach to managing data. While many of the components that make up the technical framework of EIM are not new, the category itself and the ability to deliver an integrated approach is new to the market. Data integration (ETL), data quality, metadata management, and data federation are the core components companies implement within an EIM strategy. While organizations may not need to adopt all components or may stagger their implementation, it is essential that the solutions that make up the EIM framework support and integrate easily with each other.

Analyst Viewpoint

If source data is poorly integrated, inconsistent, and unreliable, a glitzy BI front end can amount to little more than whipped cream on a pile of garbage. Most organizations have experience with the technical components of an EIM framework (data integration, federation, quality, and metadata management), and have ad hoc solutions across their information infrastructure. The EIM challenge is to avoid the pile-of-garbage BI data syndrome by integrating these technologies into an EIM framework (driven by strategic objectives and defined practices) to provide a sound information infrastructure upon which BI can thrive. For some EIM adopters, a key choice will be whether to deploy best-of-breed technologies or an integrated solution from a single vendor.

DataFlux Corporation

What are the advantages of implementing service-oriented architecture (SOA) in a data quality program?

Many businesses acknowledge the importance of having a unified view of company data across multiple enterprise applications. Data quality assures that data is consistent, accurate, and reliable, and service-oriented architecture (SOA) guarantees the data’s integrity as it arrives from multiple sources. SOA provides a rapid time-to-value, allowing you to create a business rule for data quality once—and replicate it across any application. SOA allows business users to more effectively enforce data standards, creating an enterprisewide standard for data quality. Data quality management via Web services in an SOA approach is the best method to ensure overall data integrity.
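
The "define once, reuse everywhere" claim is easiest to see in code. Below is a minimal sketch, in Python, of a data quality rule exposed as a web service so that any application can call the same rule instead of re-implementing it locally; the rules and field names are hypothetical illustrations, not any vendor's implementation.

```python
import json
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

def validate_record(record: dict) -> dict:
    """Apply centrally defined data quality rules to one record."""
    errors = []
    # Hypothetical rule 1: postal code must be exactly five digits.
    if not re.fullmatch(r"\d{5}", record.get("postal_code", "")):
        errors.append("postal_code must be five digits")
    # Hypothetical rule 2: e-mail must contain exactly one '@'.
    if record.get("email", "").count("@") != 1:
        errors.append("email is malformed")
    return {"valid": not errors, "errors": errors}

class QualityService(BaseHTTPRequestHandler):
    """Expose the rule set at a single HTTP endpoint."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        result = validate_record(json.loads(body))
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), QualityService).serve_forever()
```

Any consuming application (ERP, CRM, an ETL job) simply POSTs a record to the endpoint and receives a pass/fail verdict, so a rule change in one place takes effect everywhere.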

Analyst Viewpoint

Despite widespread recognition of issues with poor data quality, “dirty data” remains one of the most pernicious problems in enterprise data management. Attempts to improve data quality with isolated or departmental solutions have in many cases resulted in additional siloed solutions that fail to address the issue on a cross-enterprise scale. SOA offers organizations an opportunity to transparently integrate data quality services to cleanse, update, and propagate accurate data across loosely coupled heterogeneous systems for enhanced business value. Because SOA typically involves architectural changes at a foundational level, organizations embarking on an SOA-based data quality initiative will often be best served by an incremental approach.

DATAllegro, Inc.

How do you handle a mixed workload of short- and long-running queries, all competing for system resources?

Current RDBMS platforms allow multiple concurrent connections and queries to be submitted by independent users. The result is a mixture of short-running queries and long-running, process-intensive queries. Without a priority management system, as the number of queries increases, the performance for each individual query degrades almost linearly.

Standard RDBMS vendors address this issue by assigning higher priority to specific user accounts or connections, resulting in administrative overhead and dissatisfied end users. The most advanced DW appliances provide automatic workload management without complicated configuration. Resource priority is automatically and continuously reassigned based on real-time system throughput. The resulting system automatically matches performance to user expectations, even under high load conditions.
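
The idea of continuously reassigned priority can be sketched in a few lines. The policy below (demoting queries as their elapsed runtime grows, so short queries are never starved behind long ones) is a hypothetical illustration in Python, not any appliance vendor's actual algorithm.

```python
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Query:
    priority: float                 # lower value = runs sooner
    sql: str = field(compare=False)
    started: float = field(compare=False, default_factory=time.monotonic)

class WorkloadManager:
    """Toy scheduler: priority derives from observed runtime, not accounts."""
    def __init__(self):
        self._queue: list[Query] = []

    def submit(self, sql: str) -> None:
        heapq.heappush(self._queue, Query(priority=0.0, sql=sql))

    def rebalance(self) -> None:
        # Demote queries in proportion to the runtime they have already
        # consumed, so fresh short queries jump ahead of long scans.
        now = time.monotonic()
        for q in self._queue:
            q.priority = now - q.started
        heapq.heapify(self._queue)

    def next_slice(self) -> Query | None:
        # A real scheduler would requeue the query after its time slice.
        self.rebalance()
        return heapq.heappop(self._queue) if self._queue else None

wm = WorkloadManager()
wm.submit("SELECT region, SUM(amount) FROM sales GROUP BY region")  # heavy scan
wm.submit("SELECT name FROM customers WHERE id = 42")               # short lookup
q = wm.next_slice()  # the least-aged query gets the next slice
```

The point is only that priority is computed from observed behavior rather than fixed per user account or connection.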

Analyst Viewpoint

Many factors must be taken into account when engineering query prioritization and optimization mechanisms, including I/O, CPU speed, memory, and bandwidth. Ultimately, the most important factor is the business priority associated with each query. Today’s configuration tools help simplify management of complex query environments with finely grained controls that may be tied to user role, time of day, and other criteria. Slick graphical displays and outlier alerts help administrators monitor the systems. Organizations will benefit from both automated workload management and periodic reviews of query prioritization and performance. In addition, diligent, trial-and-error application of performance tuning techniques with built-in or third-party tools or data accelerators can pay significant dividends.

ESRI

We’ve seen compelling demonstrations of geographic information systems (GIS), but aren’t they difficult to learn and implement?

Today’s server-based geographic information systems can be used through the interface provided in your business intelligence applications. All of the leading BI vendors, or their partners, have built in the ability to link to GIS analysis and mapping capabilities. In many cases, your BI application will guide you through the process of linking your data to commercially available GIS feature data (maps). The most commonly used GIS feature data is included with the GIS application, or it can be accessed as a Web service.

Analyst Viewpoint

We’ve seen compelling demonstrations of geographic information systems (GIS), but that, as they say, was then. This is now. Over the past several years, GIS and business intelligence vendors have made significant strides in putting GIS client and Web-based applications on the BI map with ease-of-use enhancements aimed at business users, workflow automation, and standard protocols that enable integration with RDBMS and business applications. Organizations that wish to exploit the power of GIS in such areas as marketing, supply chain, or finance can test the waters with a low-end GIS product (such as Microsoft MapPoint) before moving up to the greater breadth and functionality of GIS specialists such as ESRI.

FAST

How can you ensure that the data used to make business decisions is reliable?

In order to establish a single version of the truth on which to base corporate decisions, you must prevent errors introduced in transactional systems from propagating to the aggregated information residing in your data warehouse.

This requires operational data cleansing beyond what is attainable using standard ETL and database matching functionality. The supreme scalability and fuzzy matching algorithms available through enterprise search technology allow users for the first time to create systems that maintain data consistency and cleanliness through familiar ETL GUI environments.

Once the data is cleansed through this technology, it can then be written back for use in database transactional systems and applications, improving their accuracy and efficiency.
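
As an illustration of fuzzy matching, the sketch below uses the Python standard library's difflib in place of a commercial search engine's matching algorithms; the sample records and the similarity threshold are illustrative assumptions only.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 score for how alike two normalized strings are."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(records: list[str], threshold: float = 0.85):
    """Flag record pairs similar enough to be the same real-world entity."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i], records[j]) >= threshold:
                pairs.append((records[i], records[j]))
    return pairs

# A misspelled customer name that exact matching would miss:
print(find_duplicates(["John Smith", "Jonh Smith", "Jane Doe"]))
# [('John Smith', 'Jonh Smith')]
```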

Analyst Viewpoint

Organizations are increasingly deploying data profiling tools to attack the data quality problem at its source, in transactional systems. Data profiling is a sound first step in any data quality initiative and can provide greater confidence in data integrity before the information is channeled to a data cleansing application. In a data profiling initiative, IT professionals can conduct a fairly granular analysis of the content, quality, and structure of data in transactional systems (or in target data warehouses or business applications). Properly executed, data profiling can identify and correct systemic problems in source data, better prepare the information for such data cleansing techniques as algorithmic fuzzy matching, and help ensure the reliability of business data.
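
A minimal profiling pass might compute per-column completeness and cardinality, as in the sketch below (pandas is an assumed tool here; any profiling product reports similar measures against real source tables).

```python
import pandas as pd

# A toy extract standing in for a transactional source table.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, None],
    "state":       ["NY", "ny", "CA", "TX"],
})

profile = pd.DataFrame({
    "dtype":    df.dtypes.astype(str),
    "null_pct": df.isna().mean() * 100,  # completeness
    "distinct": df.nunique(),            # cardinality
})
print(profile)
# Even this tiny profile surfaces the missing customer_id; scanning
# distinct values would also catch the inconsistent 'NY' / 'ny' coding.
```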

Hyperion Solutions Corporation

Why is having a unified business performance management system critical to the success of today’s enterprise?

The move to unified business performance management is driven by the need to progress from operating tactically to taking a more strategic perspective. It is no longer sufficient to simply report results that are not tied to strategic objectives or business plans. Enterprises need to link their strategic objectives with operational goals. To implement this linkage, businesses must report and analyze financial and non-financial information in a coordinated fashion. This allows them to monitor and compare performance to plans, adjust plans to respond to changes in the business landscape, and perform advanced planning by modeling potential scenarios. A unified BPM system enables this movement to strategic analysis by integrating a BI platform with a suite of financial applications to deliver tightly integrated capabilities to the business user.

Analyst Viewpoint

Unification is the distinguishing characteristic of a truly mature BPM implementation. To varying degrees, many organizations have implemented BPM systems in what amounts to a tactical fashion, often in support of finance or sales functions. A unified BPM deployment will cover all aspects of operations, such as manufacturing, logistics, customer service, and marketing. Only a comprehensive BPM system fully enables an organization to look under rocks that may conceal critical weaknesses in operational performance. Most organizations will evolve only gradually toward unified BPM because the transition usually involves reengineering at an architectural level, but they stand to reap the benefits of improved visibility and the ability to execute on strategic objectives.

Informatica Corporation

How do we manage the political minefields that commonly arise when establishing a data governance program?

Certain organizational principles are common across successful data governance programs:

  • Strong sponsorship from a senior business executive.
  • Rigor in defining roles and assigning specific responsibilities to individuals to enforce accountability.
  • Equal business and IT involvement, which means that the business “owns” the data and takes the lead in defining policies, and IT partners with the business to help implement the supporting data integration technology.
  • Establishment of an integration competency center (ICC) to promote reuse, share best practices, and establish common processes and standards for data integration.

Analyst Viewpoint

It’s seldom easy to reduce political friction, whether it exists among business units or between business and IT. In the latter case, data governance organizations should consider borrowing from a BI systems management approach that puts a premium on individuals with both business acumen and technical skills. Cross-pollinating business and IT skills, so that IT personnel have a better appreciation of the business side and business users develop basic technical skills in such areas as data definitions, can help defuse political landmines and better align business and IT. An ICC, headed by a senior executive functioning as a data governance “czar,” offers an effective platform for the sharing of business and IT skills.

MicroStrategy

What are the most effective approaches to self-service BI?

Every day, business people are challenged to uncover information to make timely, informed decisions. While the IT department can provide assistance, the most expedient option is self-service BI.

BI self-service is evolving rapidly. One model offers “what you see is what you get” (WYSIWYG) design interfaces over the Web. With WYSIWYG, business users design and refine reports using familiar skills, similar to PowerPoint and Excel.

The latest model of BI self-service does not require users to design reports. Instead, users surf the data warehouse using “drill-anywhere” capabilities. As users intuitively click on rows, columns, or cells of a report, ROLAP-based BI systems create new SQL and reports. The data can be shared or saved for future use.
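
The sketch below suggests how a single click might be translated into new SQL; the fact table and column names are hypothetical, and production ROLAP engines generate far richer SQL from metadata about the dimensional model.

```python
def drill_down_sql(measure: str, from_level: str, to_level: str,
                   clicked_value: str) -> str:
    """Build the query behind a click on one cell of a report."""
    # Illustration only: real code would use parameterized queries
    # rather than interpolating values into the SQL string.
    return (
        f"SELECT {to_level}, SUM({measure}) AS total\n"
        f"FROM sales_fact\n"
        f"WHERE {from_level} = '{clicked_value}'\n"
        f"GROUP BY {to_level}"
    )

# The user clicks the 'Northeast' row of a revenue-by-region report:
print(drill_down_sql("revenue", "region", "state", "Northeast"))
```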

Analyst Viewpoint

Business-side engagement and training are key first steps that can help self-service BI live up to its potential. Detailed input from business requirements analysts is essential to ensuring that a self-service BI solution meets the needs of business analysts. Training should not be a mere checklist item, but a prime opportunity for IT to sell business users on the value of self-service BI and gather their feedback. As experience has shown, force-feeding an unfamiliar BI tool can result in user dissatisfaction and subpar adoption. In the simplest terms, business users need to find self-service BI rewarding—even fun—for it to succeed. Business input, collaborative training, and user-friendly WYSIWYG tools are all ingredients for successful self-service.

Netezza Corporation

Power and cooling costs are escalating. What data warehouse options can improve data center efficiency?

Netezza data warehouse appliances were developed with data center efficiency in mind. In one appliance rack, over 100 intelligent storage nodes provide MPP performance by attaching a CPU to each drive. Yet each node consumes less than 30 watts, so the power and cooling requirements for a full-rack system are about 4,000 watts and 12,000 BTU/hour. Compare this with alternative systems, including blade servers, which can consume as much as 600 watts per blade. A rack of 32 blades would require over 19,000 watts and 64,000 BTU/hour of cooling. The efficient design of an architecturally integrated data warehouse appliance offers an excellent solution to address escalating power and cooling costs.
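
As a rough check on this arithmetic, the standard conversion of about 3.412 BTU per hour per watt can be applied to the wattages quoted above; the quoted cooling figures land in the expected ballpark.

```python
BTU_PER_WATT_HOUR = 3.412    # standard watts-to-BTU/hour conversion

appliance_watts = 4_000      # full-rack appliance, as quoted above
blade_rack_watts = 32 * 600  # 32 blades at up to 600 W each

for label, watts in [("appliance rack", appliance_watts),
                     ("32-blade rack", blade_rack_watts)]:
    print(f"{label}: {watts:,} W ≈ {watts * BTU_PER_WATT_HOUR:,.0f} BTU/hr")
# appliance rack: 4,000 W ≈ 13,648 BTU/hr   (quoted: ~12,000)
# 32-blade rack: 19,200 W ≈ 65,510 BTU/hr   (quoted: ~64,000)
```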

Analyst Viewpoint

Energy efficiency is fast becoming a key consideration for data warehousing and IT as a whole. In fact, by a 417-4 vote in July 2006, the U.S. House of Representatives approved legislation urging IT managers to “give high priority to energy efficiency” when purchasing servers. The good news is that suppliers of data warehouse appliances and conventional servers are making significant strides in engineering energy-efficient systems. Data warehouse administrators will find a number of intriguing options in the marketplace. And with energy prices increasing and fuel supplies dwindling, a transition toward “eco-responsible computing” can reduce energy costs by thousands or even millions of dollars a year for large organizations.

Oracle

How is the role of BI evolving in an enterprise?

Enterprises today are looking at BI as a solution to enable intelligent interactions and business process optimization across their entire organization. As a result, the role of BI is evolving to enable pervasive use across the enterprise, empowering users with relevant, actionable, up-to-the-moment business insight that is embedded into users’ daily activities. Look for BI vendors who can be your strategic partners and help you gain the fastest time to value with a comprehensive, best-of-breed solution encompassing data warehousing, BI platforms, and BI applications.

Analyst Viewpoint

Several overarching trends mark the evolution of enterprise BI. 1) BI is gradually transitioning from standalone ad hoc systems toward an enterprisewide discipline, including “operational BI,” that provides visibility across multiple business units and geographic locations. 2) Real-time BI is being enabled with on-demand and federated data integration systems that can feed updated data to BI systems. 3) Organizations are paying greater attention to data at a foundational level, with master data management and data quality initiatives that support BI’s ideal of a single version of the truth. 4) BI is slowly expanding beyond conventional data sources to enable analysis of unstructured data in Web pages, e-mails, word processing documents, and more.

Sybase, Inc.

Why purchase and manage separate servers, storage, databases, and tools to support a data warehouse when I could purchase an appliance for my business intelligence initiative?

Most organizations are struggling with the total cost of ownership (TCO) of their existing BI platforms. And, while the appliance approach to data warehousing sounds good and will meet the needs of a select group of organizations, most companies find the appliance is not flexible enough, does not leverage their existing systems and hardware, and will ultimately cost more to own over the program’s lifetime. For many organizations, a better choice is to integrate their existing applications while leveraging their current hardware to build out a complete BI stack for data warehousing.

Analyst Viewpoint

There’s no question that the appliance phenomenon has prompted a reevaluation of the data warehousing price/performance equation, particularly as DW organizations continue to grow data volumes and expand user populations. Appliances, now available at less than $20,000 per usable terabyte, are an intriguing option for many organizations. None of this, however, spells doom for the conventional best-of-breed approach to DW. Most adopters have deployed appliances to support “large data marts” as they size up price/performance, flexibility, and long-term ROI. Ultimately, of course, data warehousing success doesn’t depend on hardware and software alone—but on the skill and productivity of DW developers and administrators. One can, after all, crash a tricycle.

Syncsort Incorporated

My company sells products over the Web, generating high volumes of transaction data. One record per transaction lists the time of sale, dollar amount of the sale, and so on. How would I determine statistics such as the hourly minimum, maximum, and average sales?

Statistical analysis of transaction data can be simplified by using a data management solution that provides minimum, maximum, and average operators, and allows grouping on individual components of the date/time fields. In this example, the aggregate task option would be chosen and the transactions would be grouped by the hour component of the time-of-sale field. Then the minimum, maximum, and average of the sales amount field would be retained. Advanced aggregation functions allow for great flexibility and speed in crunching business analytics, yielding more precise information more quickly.
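
As a concrete example, the same hourly grouping takes only a few lines with pandas (an assumption for illustration; the equivalent is available in SQL or in any data management tool with aggregate operators).

```python
import pandas as pd

sales = pd.DataFrame({
    "sold_at": pd.to_datetime([
        "2006-07-01 09:15", "2006-07-01 09:48",
        "2006-07-01 10:05", "2006-07-01 10:30",
    ]),
    "amount": [19.99, 125.00, 42.50, 7.25],
})

# Group on the hour component of the time-of-sale field, then keep
# the minimum, maximum, and average of the sales amount.
hourly = sales.groupby(sales["sold_at"].dt.hour)["amount"].agg(
    ["min", "max", "mean"])
print(hourly)
#            min     max    mean
# sold_at
# 9        19.99  125.00  72.495
# 10        7.25   42.50  24.875
```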

Analyst Viewpoint

A decade ago, it might have taken a couple of analysts two days of spreadsheet number-crunching to generate such multi-source aggregations. Today’s data manipulation tools support a broad range of data types and offer fairly advanced functionality for filtering, joins, summarization, and generation of user-defined statistics. High-performance parallel processing and deployability in distributed network, mainframe, UNIX, and Windows environments have helped these tools earn a place in the enterprise data management hierarchy. The scope and complexity of an organization’s need to generate certain statistics usually determines whether an Excel client is adequate or a more costly enterprise solution is warranted.

XLCubed, Ltd.

How can I avoid creating “spreadmarts” with Excel?

Spreadmarts are typically a by-product of a failed BI solution, not a failure of Excel. Analysts will create and use spreadmarts if their current solution lacks data or power. Two root causes of spreadmarts are 1) poor business models that lack data and 2) inferior analysis tools. Both will result in analysts inevitably exporting data to Excel—preferring its power—and being forced to acquire data from other sources and store it themselves. When the business models are accurate and Excel is connected directly and dynamically to the source data, spreadmarts will not be created.
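
The “connected directly and dynamically” point can be sketched as follows: the workbook is refreshed from the governed source on demand rather than maintained as a hand-copied extract. The database and table names below are hypothetical, and a real deployment would use a BI add-in rather than a script.

```python
import sqlite3
import pandas as pd

def refresh_analysis_sheet(db_path: str, workbook: str) -> None:
    """Re-pull governed source data into the analyst's workbook."""
    conn = sqlite3.connect(db_path)
    df = pd.read_sql("SELECT region, revenue FROM sales_summary", conn)
    conn.close()
    # Writing .xlsx output requires the openpyxl package.
    df.to_excel(workbook, sheet_name="source_data", index=False)

# Hypothetical example: assumes warehouse.db contains a sales_summary table.
refresh_analysis_sheet("warehouse.db", "regional_analysis.xlsx")
```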

Analyst Viewpoint

TDWI research has found that organizations have on average 28.5 spreadmarts that they want to consolidate. While consolidation would help transition toward a “single version of the truth,” it’s often impractical—and counterproductive—to force analysts to abandon Excel for a standard BI tool. For many organizations, the ideal solution is to enrich and integrate Excel in an enterprise information architecture. Sophisticated Excel add-ons enable organizations to hard-wire Excel into server-based data sources to 1) minimize spreadmart proliferation, 2) improve enterprise data consistency, and 3) allow users to continue using their preferred tool while providing them with the richer ad hoc analysis and reporting available in third-party Excel add-ons.

