Advanced Analytics Catalyzes Spending
Vendors have taken to advanced analytics like sharks to a feeding frenzy. What's whetting their appetite is companies' need for big data and IoT services -- along with the analytics know-how to maximize the value of these technologies.
- By Steve Swoyer
- July 5, 2016
In its new Worldwide Semiannual Big Data and Analytics Spending Guide, International Data Corp. (IDC) makes a startling prediction. By 2019, it forecasts, worldwide sales of big data and business analytics technologies -- including services -- will reach $187 billion. That's an increase of 53 percent over today's tally, which IDC pegs at (a not-at-all slouchy) $122 billion.
By far the biggest engine for big data and analytics sales will be services, according to the forecast. The market watcher says services-related revenues will account for over half (55 percent) of all spending over the next four years. Of this share, demand for IT-related services will far outstrip demand for business-related services, IDC says, putting the ratio at 3:1.
This shouldn't be surprising. Vendors have taken to big data, the Internet of Things (IoT), and analytics like sharks to a feeding frenzy -- or a presidential candidate to Twitter. In just the last six months, for example, Cisco Systems, Hewlett-Packard Enterprise, IBM, Microsoft, Oracle, SAP, SAS Institute, and Teradata all announced ambitious IoT-related analytics efforts.
What's whetting their appetite is the palpable need among companies -- especially large organizations -- for big data and IoT services, along with the advanced analytics know-how to maximize the value of their investments in these technologies.
IDC forecasts that by 2019, large organizations (those with 500 or more employees) will account for three quarters ($140 billion) of all big data and analytics spending. From a vendor's perspective, that's a simply irresistible opportunity.
It would be inaccurate, however, to describe this growing services spend as wasteful for the organizations doing the spending.
Advanced analytics is a hard problem that can't be willed or wished away by means of automation, artificial intelligence (AI), and machine learning (ML) -- or via a black-box technology offering. ML, AI, decision automation, and other analytics technologies require highly specialized expertise: business and/or domain-specific expertise, on the one hand; data engineering, analytics, and even project-specific technology expertise on the other.
Take streaming analytics, for example. There are no fewer than half a dozen open source streaming ingest -- or combined streaming ingest and streaming analytics -- projects, with names such as Storm, Flink, Flume, Kafka, Spark Streaming, Apex, and Heron. The first six are sponsored by the Apache Software Foundation; Heron is a Twitter project.
Which is right for your company? How do you make this decision? Let's look at some recent history.
Four years ago, Storm was the streaming ingest/analytics technology of choice. Then the market settled on Kafka (a distributed publish-and-subscribe message log) as the gold standard -- for streaming ingest, at least. Kafka was often paired with Storm, too. More recently, Spark Streaming (which processes streams as microbatches), Apache Apex, and Twitter's Heron emerged to address the streaming-analytics use case.
For some applications, a combination of these technologies (e.g., Flume, Kafka, and Spark Streaming; or Kafka, Storm, and HBase) might be ideal. For others, no existing technology will suit, which is why Twitter developed Heron from scratch to replace its Storm-based streaming setup.
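To make one of those pairings concrete, here's a minimal sketch of Kafka feeding Spark Streaming, written against the Spark 1.x/2.x-era PySpark Kafka API. The broker address, topic name, and alert threshold are hypothetical placeholders, not a recommendation.

```python
# Minimal sketch: Kafka handles streaming ingest, Spark Streaming handles
# the in-flight analytics. Broker, topic, and threshold are hypothetical.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils  # Spark 1.x/2.x API

sc = SparkContext(appName="streaming-sketch")
ssc = StreamingContext(sc, batchDuration=5)  # 5-second microbatches

# Consume a hypothetical "sensor-readings" topic directly from Kafka.
stream = KafkaUtils.createDirectStream(
    ssc,
    topics=["sensor-readings"],
    kafkaParams={"metadata.broker.list": "broker1:9092"})

# Records arrive as (key, value) string pairs; treat each value as a numeric
# reading and flag anything above an arbitrary threshold.
readings = stream.map(lambda kv: float(kv[1]))
alerts = readings.filter(lambda reading: reading > 90.0)
alerts.pprint()  # print the flagged readings for each microbatch

ssc.start()
ssc.awaitTermination()
```

Swap Storm or Flink into the analytics tier and the code changes, but the architectural pattern doesn't: a durable ingest log in front of a stream processor.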
What matters is that these technologies are evolving so rapidly that projects will often leapfrog one another -- or lose momentum as open source developers (and the larger market) shift to competing projects. Unlike Twitter, the vast majority of organizations have neither the resources nor the capacity to roll their own streaming ingest and in-flight analytics technologies. Vanishingly few companies do.
Basically all organizations that aren't named Facebook, Google, or Twitter are going to need help identifying ideal candidate use cases for advanced analytics. They're going to need help analyzing said use cases and breaking them down into discrete (doable or addressable) problems. They're going to need help developing and enumerating the elements of these problem areas.
You'd better believe they're going to need help mapping these elements to solutions: help selecting the right mix of technologies, help integrating these technologies into existing processes, help creating entirely new processes to support them.
They're also going to need help developing, implementing, and maintaining edge-analytics solutions. Analytics isn't one-and-done: the accuracy or precision of almost all analytics models will drift over time as the data they score diverges from the data they were trained on.
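What does "maintaining" look like in practice? Here's a hedged sketch, assuming a scikit-learn-style model and a periodic supply of freshly labeled data: measure accuracy on the new data and flag the model for retraining once it drifts more than some tolerance below its deployment-time baseline. The tolerance and the function itself are illustrative, not prescriptive.

```python
# Sketch of drift monitoring: re-score a deployed model on freshly labeled
# data and flag it for retraining when accuracy decays past a tolerance.
# The tolerance value and function name are hypothetical.
from sklearn.metrics import accuracy_score

TOLERANCE = 0.05  # allowable drop from the accuracy measured at deployment


def needs_retraining(model, fresh_features, fresh_labels, baseline_accuracy):
    """Return True if accuracy on fresh data has drifted more than
    TOLERANCE below the model's deployment-time baseline."""
    current = accuracy_score(fresh_labels, model.predict(fresh_features))
    return (baseline_accuracy - current) > TOLERANCE
```

Run on a schedule against production data, a check like this is part of what ongoing maintenance actually means, and it's exactly the kind of plumbing most organizations will pay someone else to build.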
The successful use of analytics will also transform the status quo. When a business turns the analytics lens on itself -- using analytics to monitor, simplify, automate, and optimize -- that business will change as a result. Analytics is likewise cumulative: not only does analytics success fuel additional analytics demand, but discrete insights (such as a pattern suggestive of impending failure in an upstream device) are most valuable when they're combined with other analytics insights.
The transition to advanced analytics is going to be enormously expensive for companies, but its effects will likely be enormously valuable, although probably not quite as valuable as services vendors are promising.
About the Author
Stephen Swoyer is a technology writer with 20 years of experience. His writing has focused on business intelligence, data warehousing, and analytics for almost 15 years. Swoyer has an abiding interest in tech, but he’s particularly intrigued by the thorny people and process problems technology vendors never, ever want to talk about. You can contact him at [email protected].