Trend Lines: Kognitio Recalibrates
Kognitio spent the last half-decade trying to break into the U.S. market. Now, officials claim, the market is coming to Kognitio.
- By Stephen Swoyer
- May 22, 2012
Analytic database stalwart Kognitio spent the last half-decade trying to break into the U.S. market. Now, officials claim, the market -- in the U.S., the EU, and elsewhere -- is coming to Kognitio.
It all has to do with the ascendancy of in-memory database technologies, of which Kognitio is a prime example. In-memory is nothing new, of course; Kognitio, for example, has a 25-year-old pedigree. More recently, SAP AG started marketing its own in-memory database, HANA, which marries massively parallel processing (MPP)-style scale-out performance with non-uniform memory access (NUMA) scale-up capability. The prominence of SAP's bid with HANA is what Kognitio means when it says that the market is coming -- or has come -- to it.
"With the acceptance now in the market for the idea of in-memory anlaytics, [there are] two reasons for that: [first,] SAP spent hundreds of millions of dollars convincing people that it's the right way to do analytics, and the other thing is the [comparatively inexpensive] cost of memory [capacity] and CPU power," says Roger Gaskell, Kognitio CTO. "Today we're buying [systems] with a quarter of a terabyte of memory in a single server for a few thousand dollars."
Kognitio's WX2 database was designed as an in-memory data store -- 25 years ago, in fact. Although Kognitio (then known as White Cross) achieved some success in the 1990s riding the first wave of white box data warehouse (DW) systems, its market traction wasn't necessarily a function of its in-memory architecture. After all, the cost of RAM in the 1990s was orders of magnitude greater than it is in the 2010s. Nowadays, RAM is comparatively inexpensive.
"Those two things have got us to the point right now where we can really just talk about our in-memory capabilities. People have come to us specifically in the last year [or] six months to talk about in-memory as opposed to just data warehousing. By far the biggest area where people have come to [us to] talk ... is [as an] in-memory [data store] on top of Hadoop," Gaskell says.
Make that three reasons why Kognitio says the market is coming -- if not rushing -- in its direction: the low cost of compute resources, the new chic of in-memory, and the tsunami of hype that characterizes all things Big Data. There's a fourth reason, too: Kognitio spent a good chunk of the last half-decade trying to differentiate itself from an increasingly noisy pack of analytic DBMS competitors. During an interview at this month's TDWI World Conference in Chicago, the company's U.S. PR coordinator, Steve Friedberg, said that the aggressiveness of Kognitio's new in-memory messaging marks a "sea change shift" in Kognitio's marketing.
Kognitio, Friedberg suggested, now has an "American edge to it."
CTO Gaskell, who's been with Kognitio almost from the beginning, has no problem channeling this "edge," American or otherwise. "The market has caught up with us in that the traditional method of sucking data into Excel and building OLAP cubes is highly data-intensive. Where I think the market is going ... is being index-less, not needing aggregates, being able to accommodate multiple schemas, large sets of data, scale up, scale out," he argues. "We're the perfect analytical accelerator to Hadoop, or to an enterprise data warehouse, or to what have you. It seems to me that this is the way the market is moving in terms of the next two to five years."
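To make Gaskell's index-less pitch concrete, here is a minimal Python sketch -- invented data and invented function names, not Kognitio's implementation -- of why an in-memory store can skip indexes and precomputed aggregates: an ad hoc GROUP BY is just a scan of RAM, split across workers and merged, the MPP scale-out pattern in miniature.

```python
# Sketch: when the working set fits in RAM, an ad hoc aggregate is
# just a parallel scan -- no indexes or precomputed OLAP cubes needed.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

# Hypothetical fact table held entirely in memory.
sales = [
    {"region": "EMEA", "product": "A", "amount": 120.0},
    {"region": "US",   "product": "B", "amount": 75.5},
    {"region": "US",   "product": "A", "amount": 30.0},
    # ... millions of rows in a real system ...
]

def scan_partition(rows):
    """Aggregate one partition of the in-memory table."""
    totals = defaultdict(float)
    for row in rows:
        totals[(row["region"], row["product"])] += row["amount"]
    return totals

def adhoc_group_by(rows, workers=4):
    """Split the scan across workers and merge the partial results --
    the scale-out aggregation pattern in miniature."""
    chunks = [rows[i::workers] for i in range(workers)]
    merged = defaultdict(float)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(scan_partition, chunks):
            for key, value in partial.items():
                merged[key] += value
    return dict(merged)

print(adhoc_group_by(sales))
```

Because every query is a fresh scan, any schema or grouping works on arrival -- which is the practical meaning of "not needing aggregates."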
Kognitio earlier this month announced a partnership with Hadoop specialist Hortonworks. (Hortonworks is a story unto itself: in the last six months, it's signed prominent accords with several DW mainstays, including Kognitio and Teradata Corp.) Gaskell says at least one customer reference is running Kognitio on top of Hadoop. In such an arrangement, he argues, a mature DW platform like Kognitio can compensate for Hadoop's shortcomings.
"They've found there's a lot of power and strengths in Hadoop to do things around data loading, doing reporting, creating transformations, doing standard reports, but where they've found difficulty is giving users ad hoc interactive [query capabilities]. It just isn't something that you can do naturally using Hadoop."