
TDWI Conference Recap: Hyper BI Isn't Hyperbole

TDWI's recent World Conference made the case that change is inherent in BI and data warehousing. BI pros are going to have to accommodate event processing. It's all part of the ongoing transition to hyper business intelligence.

The theme of TDWI's recent World Conference in Washington, D.C., was performance management, but a sub-theme emerged: the promise -- and the perils -- of hyper business intelligence (BI).

From keynote presentations by industry luminaries Marc Demarest and Nancy Wilson that dealt explicitly with BI transformations to educational tracks that grappled with the nuts and bolts of next-gen BI and performance management infrastructure, radical change was in the air.

We've heard talk of radical BI change before, of course, but the conference's educational content gave form and substance to the keynote message. Demarest, a principal with consultancy Noumenal Inc., kicked things off, serving up a keynote address that grappled with the challenges that complex event processing (CEP) presents both to BI professionals and to existing BI architectures.

Ready or not, he said, BI pros are going to have to accommodate event processing, both in its more basic forms (e.g., as application events transmitted over a messaging bus) and in its more sophisticated variants -- such as streaming time-series or event-oriented data generated by thousands of networked devices, often at a fraction-of-a-second pace that puts the "real" in real-time.

Although existing BI and DW architectures can accommodate simple event processing requirements, CEP -- and the hyper-connected streaming networks of tomorrow -- calls for a rethinking of the BI and data warehouse (DW) status quo. Demarest described it as a kind of double-edged sword for BI and DW professionals: on the one hand, the hyper-connected enterprise of tomorrow will give BI pride of place at the very heart of the next-gen operational infrastructure; on the other hand, BI and DW pros will have to come to grips with a technology model -- exemplified by CEP -- to which they've thus far given relatively short shrift.
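To make the contrast concrete, here is a minimal sketch of event-at-a-time processing -- a sliding one-second window over simulated device readings, written in plain Python. The device name, threshold, and event shape are invented for illustration; a real CEP engine would express the window declaratively and sustain far higher volumes.

    import time
    from collections import defaultdict, deque

    # Event-at-a-time processing: each reading is handled as it arrives,
    # rather than being landed in a warehouse and queried later.
    WINDOW_SECONDS = 1.0  # decision horizon measured in seconds, not batch cycles
    THRESHOLD = 3         # readings per window that trigger an alert

    windows = defaultdict(deque)  # device_id -> recent event timestamps

    def on_event(device_id, timestamp, value):
        """Process one streaming reading inline; no batch, no landing zone."""
        window = windows[device_id]
        window.append(timestamp)
        # Evict readings that have slid out of the one-second window.
        while window and timestamp - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= THRESHOLD:
            print(f"ALERT: {device_id} emitted {len(window)} readings "
                  f"within {WINDOW_SECONDS}s (latest value: {value})")

    # Simulated feed standing in for a messaging bus or device network.
    for i in range(5):
        on_event("sensor-42", time.time(), value=i)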

It was a theme that data warehousing expert Claudia Imhoff revisited in a course that she presented with DW veteran -- and perspicacious industry-watcher -- Colin White, dubbed "The Why and How of Operational Business Intelligence."

As shops transition to operational business intelligence (OBI), the traditional DW architecture is going to have to change, said Imhoff. More precisely, BI and DW professionals will need to buttress their traditional DW architectures with next-gen complements.

"You're going to see ... at some point, traditional BI architectures stick their tongues out and go, 'I can't go any faster. I'm done," Imhoff told attendees. "At some point, we all have to recognize that traditional BI has gone as fast as it can, and if I want to go faster, I'm going to have to [change my architecture]."

CEP, Imhoff conceded, is "an entire world of analytics that we [BI professionals] didn't know anything about."

Citing the inevitability of time-sensitive complex events (for applications such as fraud detection), Imhoff said that the next-gen analytic architecture will be hybridized, yoking traditional DW platforms to CEP-oriented offerings from both traditional enterprise application integration (EAI) vendors and upstart players.
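What might such event-driven fraud logic look like? Below is a hedged sketch of one classic CEP pattern -- correlating two events on the same key within a time window -- again in plain Python. The rule, card numbers, and threshold are invented for illustration; a production CEP engine would state the pattern declaratively.

    IMPOSSIBLE_TRAVEL_SECONDS = 300  # two cities within five minutes

    last_seen = {}  # card_id -> (timestamp, city) of the previous transaction

    def on_transaction(card_id, timestamp, city):
        """Correlate this event with the card's previous one, inline."""
        prev = last_seen.get(card_id)
        if prev:
            prev_ts, prev_city = prev
            if city != prev_city and timestamp - prev_ts < IMPOSSIBLE_TRAVEL_SECONDS:
                print(f"FRAUD? card {card_id}: {prev_city} then {city}, "
                      f"{timestamp - prev_ts:.0f}s apart")
        last_seen[card_id] = (timestamp, city)

    # Events are evaluated as they arrive -- no warehouse round-trip.
    on_transaction("4111-xxxx", 0.0, "Washington")
    on_transaction("4111-xxxx", 120.0, "Tokyo")  # fires the alert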

CEP, she concluded, is a prerequisite for OBI. "If you're going down the road of operational business intelligence, at some point, you're going to bring in a CEP technology. The more we drive toward a real-time analytic environment, the more it will become event-driven and process-centric. That's when we realize that the traditional data warehouse-driven business intelligence environment might not be suitable for particular types of analytics." This doesn't mean that shops must jettison their DW infrastructures, she emphasized.

"In the [corporate information factory] or in anybody's BI architecture, there are two key components," Imhoff said. "The first is the data warehouse; the second is the operational data store. Both of them have a big role [to play] in operational BI." In the world of OBI, Imhoff said, the role of the data warehouse or data mart will remain the same: an authoritative source for historical information.

"Depending on how fast we can make our little ETL process spin, we can get snapshots in every 15 minutes, or maybe it's every hour, or maybe it's once a day," Imhoff continued. "For operational business intelligence, data integration can be sped up so that we can do intra-day analytics. ... The data warehouse may not be suitable, though, for event analytics. You can do it, but the data at that point is somewhat old. It may be too expensive, [or] the volumes are too high. ... [At any rate,] there are technological and business reasons why it doesn't make sense."

As Colin White, president of Database Associates Inc., put it, "When we get down to trying to make decisions in sub-seconds or a few seconds, we have to approach this from a different angle and a whole bunch of new analytical techniques ... will enable us to do that." Traditional data warehouse architectures tend to run out of gas once you try to push them faster than intra-day refresh periods, White argued.

"Often the technologies you use for strategic and tactical business intelligence no longer work [for operational BI]. You must look at a different architecture or a different approach to break this 24-hour-period," he said.

Imhoff stressed that the DW and data mart systems of today aren't going anywhere.

Even so, gearing up for OBI is very different from doing traditional BI and DW. "Consider the age and fragility of your business intelligence systems. Can I embed BI -- or is it going to cause [the infrastructure] to fall apart?" she urged.

"Do I have newer systems where I can actually have some kind of service-oriented architecture, [or where I can] call some kind of service or embed some kind of service into the operational environment itself?"

In dozens of classes, industry experts discussed similar scenarios. In his Advanced Analytics seminar, for example, Mark Madsen, a veteran DW architect and a principal with consultancy Third Nature Inc., talked about the transition from what he called "scarce data" to "lots of data." Needless to say, Madsen noted, the advent of lots-of-data requires a rethinking -- if not a reimagining -- of the BI and DW status quo.

"We've inverted the market from scarce data to lots of data and that requires new techniques," he observed. "You have to start finding ways to determine which things are meaningful and which aren't, so you need new techniques. You need statistics, you need time series analysis." You need, Madsen concluded, a bevy of complicated -- and (from the perspective of non-statisticians) sometimes indecipherable -- analytic technologies.

Next-gen analytics will consume data from both CEP and non-traditional (i.e., unstructured or semi-structured) sources. "Analytics is now actually coming from ... two other very important environments, but they're outside the data warehouse," Madsen said. "For [someone] entrenched in traditional business intelligence, [this is] like rubbing a cat the wrong way: there are analytics outside of the traditional data warehouse that are equally valid; ... they just aren't generated from the data warehouse. Yes, traditional data warehouse[-driven] analytics are absolutely necessary, but keep your eye on [event analytics] and even unstructured or content analytics, too."

Imhoff summed up the next-gen lay of the land best.

"At some point, the data warehouse has gone as far as it can go. Let it go. Let it do what it was meant to do. Don't make it do what it wasn't meant to do," she counseled attendees. "[A]t some point, we have to recognize, it's gone as far as it can go, let me change technology."
