Reflections on the practice of business intelligence.
by Wayne Eckerson. Eckerson is the author of many in-depth reports and TDWI’s BI Maturity Model, a columnist for several business and technology magazines, a noted speaker and blogger, and the author of the best-selling book Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005).
Kalido last week announced an all-in-one analytic system—from hardware to software to services—targeted at the pharmaceutical and insurance industries. This newfangled packaged analytics application got me thinking about the long quest by BI vendors to become “the SAP of BI.” Let me explain.
SAP became a $16 billion company by offering packaged transactional applications to replace legions of outdated, non-integrated, homegrown applications. For the past 15 years, many BI vendors have believed that a similar opportunity exists to sell packaged analytical applications to replace antiquated, homegrown reporting systems or consolidate a myriad of legacy BI tools and reports. The idea of racking up billions selling cookie-cutter applications has an irresistible appeal.
Posted on September 14, 2009
If you think that semantics is a huge problem for data warehousing initiatives—and it is—it’s an even bigger problem for our industry at large.
Take the word analytics, for example. It’s a popular term right now: reporting is passé; everyone wants to do “analytics.” And like most popular terms, it’s been completely bastardized. This is largely because vendors want to exploit any term that is popular and bend it to match their current or future product portfolios. But it’s also because we’re too complacent, uninformed, or busy to quibble with exact meanings—that is, until we have to plunk down cold, hard cash or risk our reputation on a new product. Then we care about semantics.
Posted on September 10, 2009
Once compared to David facing Goliath, database vendor Netezza yesterday traded in its wooden slingshot for steel blades and armor as it both celebrated its victories to date and geared up to fight bigger adversaries.
With nearly 300 customers and a new “commodity-based” architecture, Netezza is the clear leader in the emerging analytic database market that it has evangelized since 2002. It celebrated its stature and market position with a high-energy, one-day event in Boston that kicked off a worldwide roadshow.
Posted on September 3, 2009
I took a briefing yesterday with SenSage, a company I didn’t know. Consequently, I assumed SenSage was another startup pitching the latest, greatest technology, but I was wrong. Although SenSage is the newest entrant in the boisterous analytic database market, it is not a new company: it’s a tried and true player that offers an MPP-based columnar database that has hundreds of customers and strong partnerships with EMC, Hewlett-Packard, SAP, and McAfee.
So, why haven’t I heard about SenSage?
Six years ago, SenSage decided to apply its technology to a narrow market rather than offer a general-purpose analytical engine. It chose the security and compliance market, packaging its high-performance database as a solution for meeting emerging “log management” requirements. New regulations, such as SOX and HIPAA, require organizations to store (i.e., archive) transactional log data to support compliance auditing, forensic investigation, and root cause analysis. By wrapping its database with customized ETL and reporting/analysis applications tailored to log management requirements, SenSage offers customers better performance, faster time to market, and lower cost than rivals such as IBM, Teradata, and Oracle, which do not have dedicated log management solutions. Rather than archiving data to offline storage, SenSage makes the archived data available online so users can query and analyze it in real time.
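To make the value of keeping archived log data queryable concrete, here is a minimal Python sketch of the kind of forensic query such a store supports. The record layout and field names are purely illustrative assumptions on my part, not SenSage’s actual schema or API:

```python
from datetime import datetime

# Toy log-event records of the sort a log-management store would archive.
# Field names ("time", "user", "action", "status") are illustrative only.
events = [
    {"time": "2009-08-26T09:00:00", "user": "alice",   "action": "login", "status": "ok"},
    {"time": "2009-08-26T09:01:10", "user": "mallory", "action": "login", "status": "fail"},
    {"time": "2009-08-26T09:01:15", "user": "mallory", "action": "login", "status": "fail"},
    {"time": "2009-08-26T09:05:00", "user": "bob",     "action": "read",  "status": "ok"},
]

def failed_logins(events, since):
    """Return failed login attempts at or after `since` -- a typical
    forensic question asked of archived event data."""
    cutoff = datetime.fromisoformat(since)
    return [
        e for e in events
        if e["action"] == "login"
        and e["status"] == "fail"
        and datetime.fromisoformat(e["time"]) >= cutoff
    ]

suspects = failed_logins(events, "2009-08-26T09:00:00")
print([e["user"] for e in suspects])  # ['mallory', 'mallory']
```

The point of keeping this data online rather than on tape is precisely that an auditor can run a filter like this on demand instead of waiting for a restore.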
So far, so good. But what makes SenSage think it can break through the noise in the current analytic database market?
SenSage’s differentiator is that it supports high-volume “event” data, which is why it calls its product the SenSage Event Data Warehouse. By events, SenSage means high volumes of source system transactions that it captures from log files or directly from the source applications. It has built a good business capturing systems log data for auditing and compliance purposes, but it also boasts several telecommunications companies that use the product to capture and analyze call detail records. Other types of event data that SenSage wants to target include point-of-sale data, RFID data, Web traffic, and email.
I still have a lot to learn about SenSage before I can make an accurate assessment of its capabilities and prospects. But it has a track record, which is more than most analytic database vendors can claim. So it has my attention!
Posted on August 26, 2009
How do you deliver dashboards that end users will adopt and use? That was the gist of the insightful and humorous presentation titled “Dashboards to Die For” delivered by John Rome, Associate Vice President in the Technology Office at Arizona State University (ASU), at TDWI’s BI Executive Summit in San Diego earlier this month.
ASU’s dashboard project started with a memo from the University’s president to the CIO, Adrian Sannier, Rome’s boss, that said “Adrian, learn all you can about Dashboards and then see me.” (See figure 1.) With a data warehouse already in place, the dashboards would be instrumental in driving usage to higher levels, making BI pervasive, according to Rome.
Posted on August 24, 2009
The biggest mistake I see companies make when purchasing BI tools is straddling the middle. In their quest to standardize on a single BI tool for reporting and analysis to achieve cost savings and supplier simplification, they end up meeting the needs of no one. Typically, the tool is too powerful for casual users and not powerful enough for power users. Consequently, everybody loses and the BI program doesn’t gain traction within the user community.
Posted on August 14, 2009
More than 70 business intelligence directors and sponsors gathered in San Diego this week for TDWI’s semi-annual BI Executive Summit. The executives were enlightened by a mix of case studies (NetApp, Dell, Arizona State University, RBC Wealth Management, and ShareThis) and educational sessions on key areas of interest (BI in the cloud, change management, cost estimating for BI projects, Hadoop, visualization, BI mashups, social intelligence, pervasive BI, and the future of BI).
The attendees also voted on technologies and programs that will be of key interest to them in the next three years. Topping the technology chart were predictive and in-database analytics, dashboards, visualization, the cloud, and operational BI. On the process side, key initiatives will be data quality and governance, business process management, and BI competency centers.
Posted on August 7, 2009
Yesterday, I had doubts about the value of driving from Boston to New York (eight hours roundtrip) to attend a short IBM briefing on Smart Analytics, but thankfully IBM didn’t disappoint, at least in the end.
The Non Announcement. The briefing consisted of two announcements and one non-announcement.
The non-announcement was that IBM acquired leading analytics vendor SPSS for $1.2 billion. Oddly, the acquisition wasn’t the focus of the Smart Analytics briefing I was attending, as I had assumed once I saw the press release. In fact, as I learned later, it was a coincidence that the SPSS announcement occurred on the same day as the Smart Analytics briefing. This was reinforced by the fact that the IBM software executives (Steve Mills and Ambuj Goyal) didn’t say much about the acquisition other than that it would “embed the technology across our platform.” What I find strange about that statement is that IBM had a great data mining product called Intelligent Miner, which it discontinued as a standalone product several years ago, embedding its functionality inside DB2 and other applications. So they basically bought what they already had, or still have. Odd.
Posted on July 29, 2009
“The enterprise software market is breaking down,” proclaimed Mark Madsen at a meeting of TDWI’s Boston Chapter yesterday. “And this opens the door for open source software.”
Madsen said the business model for enterprise software vendors has shifted from selling licenses to selling maintenance and support. Maintenance fees, he said, now comprise 45% of revenues and the lion’s share of profitability. This is largely because the software market has matured and consolidated, leaving customers hostage to a few big companies, Madsen said.
Posted on July 22, 2009
I recently reviewed the course materials for a class titled “A Step by Step Guide to Enterprise Data Governance” taught by Mike Ferguson at the TDWI Munich conference in June. Mike did a tremendous job covering the full scope of the data governance topic.
Mike defines enterprise data governance as “the set of processes by which structured and unstructured data assets are formally managed and protected by people and technology to guarantee commonly understood trusted and secure data throughout the enterprise.”
Posted on July 14, 2009