Analysis: Teradata Tackles Complexity with New IoT Analytics Team
The human expertise to identify high-value opportunities for using IoT analytics is in short supply. This is why Teradata has formed its new “SpecialOps” squad for IoT analytics.
- By Steve Swoyer
- April 20, 2016
Teradata this week announced a new dedicated unit for the Internet of Things (IoT): Global IoT Analytics.
The new unit, consisting of data scientists, data engineers, and software developers, is indeed a global one, with locations in the U.S., U.K., and India. It’s organized under the auspices of Teradata Labs. What are we to make of Teradata’s new “SpecialOps” squad -- its term, not ours -- for IoT?
On the one hand, there’s the buzzword factor: IoT is one of the hottest things going right now, with arguably more cachet than big data, the Heavyweight Champion of Hype.
On the other hand, there’s the complexity factor: making use of IoT is hard. Very hard. The challenge it poses isn’t primarily technological: it’s analytic. The technologies that enable IoT analytics -- for ingesting, parsing, and analyzing IoT data in real time; storing and managing data at massive scale; and establishing relationships between IoT signalers and data from other sources, including the data warehouse (DW) -- are not new. If making use of IoT were simply a matter of collecting, storing, and processing data, it would be easy.
What’s scarce is human expertise: the skill and know-how to put technologies together, to identify high-value opportunities for using them, and, especially, to research and develop advanced analytical use cases for IoT data. The promise of IoT analytics lies in just this, however. The combination of data from IoT signalers with data from other sources creates a more detailed and contextual representation of reality -- one that has more explanatory, predictive, and prescriptive power. Teradata, following industry luminary Tom Davenport, calls this the “analytics of things.”
Late last year, Teradata announced Listener, a software product that captures and persists streaming events into a target repository. (Teradata officials say Listener is slated for general availability this month or next.) In speaking about Listener at Teradata’s 2015 Partners Conference in Anaheim, officials made it clear that technology was just one piece of the overall IoT-analytic puzzle -- and a relatively minor piece, at that.
“A lot of the value that's come from the Internet of Things has been in that point-to-point communication between devices. [This] represents a huge opportunity in terms of the ability to capture that data and perform analytics on it,” Chad Meley, vice president of marketing with Teradata, told analysts at Partners Conference. “There still is a barrier to this ... going mainstream in terms of the talent pool and some of the challenges associated with just the mix and match [of different technologies] to pull together insights [from IoT data].”
Teradata put together its new IoT Analytics unit by shuffling existing employees and recruiting new talent, says technical marketing specialist Dan Graham. Although he declined to offer specifics about the overall size of the unit, Graham says Teradata has assigned “dozens” of data scientists, data engineers, and software developers to each of the three global locations. He echoes Meley’s emphasis on the importance of human talent and skill.
“Generally speaking, we know how to clean the data, we know how to parse the data, we know how to analyze the data. That part is in the bag if you’ve got a data warehouse already,” he says.
Ah, yes: the data warehouse -- that relic of the early days of data warehousing, right? Not so, says Graham. He cites the work of Harvard Business School economist Michael Porter, who wrote of the “exponentially expanding opportunities for new functionality, far greater reliability, [and] much higher product utilization” that are made possible by IoT-related analytics.
The “exponential” aspect of Porter’s claim presupposes that data from connected IoT signalers can be contextualized with data from other sources. Imagine a situation in which streaming analysis of several IoT sensors points to an upstream problem with a separate, non-IoT-enabled component or device. (This, too, is an interesting analytics problem: the anomaly is discovered only by combining and contextualizing data from multiple IoT signalers.) The obvious solution is to replace that device. But what is the window for doing so? Does the company have the necessary part? If so, where is it? What are the logistics of getting it out to the remote site and getting a skilled technician out there at the same time? What is the predicted timeframe before the device fails? What is the cost of downtime? To what extent should the company expedite this replacement?
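To make the scenario concrete, here is a minimal sketch of that cross-sensor contextualization. The sensor names, thresholds, and readings are illustrative assumptions, not anything Teradata described: no single sensor trips its own alarm, but all of them sitting near their limits at once suggests a shared upstream cause.

```python
# Hypothetical example: no single sensor reading is out of range, but the
# combined, contextualized view reveals a likely upstream anomaly.
# Sensor names, thresholds, and readings are illustrative assumptions.

# Per-sensor alarm limits (each reading below stays within its limit).
LIMITS = {"pump_inlet_temp": 90.0, "pump_outlet_temp": 110.0, "line_pressure": 60.0}

readings = {"pump_inlet_temp": 85.0, "pump_outlet_temp": 105.0, "line_pressure": 55.0}

def single_sensor_alarms(readings, limits):
    """Classic per-device monitoring: flag any reading over its own limit."""
    return [s for s, v in readings.items() if v > limits[s]]

def cross_sensor_anomaly(readings, limits, margin=0.9):
    """Contextual check: every sensor running near its limit at the same
    time hints at a shared upstream cause, even with no individual alarm."""
    return all(readings[s] > margin * limits[s] for s in limits)

print(single_sensor_alarms(readings, LIMITS))  # -> [] (no device-level alarms)
print(cross_sensor_anomaly(readings, LIMITS))  # -> True (upstream problem likely)
```

Device-by-device monitoring sees nothing; only the combined view raises the flag -- which is the point of contextualizing IoT signals with one another.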
This information is already in the company’s operational systems. For most Teradata customers, it’s already in the Teradata data warehouse, which means the necessary supply-chain and logistical analytics are already there, too. The tricky part is designing and instantiating data flows to consolidate and prepare all of this data, mixing streaming data with data from operational systems or the data warehouse, data from geographic information systems, and so on. The even trickier part is designing and training the analytics and predictive models that make sense of it. The trickiest part of all, of course, is the time-consuming background work that the data scientist or statistician must perform in exploring and experimenting with the data, identifying use cases like this one.
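The decision step Graham describes -- joining a streaming alert with parts inventory and downtime costs already sitting in the warehouse -- might be sketched like this. All table contents, field names, and cost figures are hypothetical; in practice these would be queries against operational systems and the Teradata warehouse, not in-memory dictionaries.

```python
# Hypothetical sketch: once an upstream device is flagged, join the alert
# with supply-chain data already in the warehouse to rank the response.
# Tables, field names, and cost figures are illustrative assumptions.

# Rows as they might come back from warehouse queries.
parts_inventory = {"valve-17": {"on_hand": 2, "depot": "Dallas"}}
downtime_cost_per_hour = {"site-042": 12_000.0}

def replacement_plan(device, site, hours_to_failure):
    """Decide how urgently to replace a flagged device."""
    part = parts_inventory.get(device)
    if part is None or part["on_hand"] == 0:
        return "order part; schedule emergency maintenance window"
    # Rough exposure: cost per hour of downtime times the predicted window.
    exposure = downtime_cost_per_hour[site] * hours_to_failure
    if exposure > 100_000.0:
        return f"expedite {device} from {part['depot']} with technician"
    return f"ship {device} from {part['depot']} on next scheduled run"

print(replacement_plan("valve-17", "site-042", 36.0))
```

The logic is trivial; the hard part, as the article argues, is the background work of discovering that this join matters and training the predictive model that supplies `hours_to_failure`.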
“If you don’t have a data warehouse, you can’t do more than half of your analytics,” Graham argues. “Which customers will be impacted by an outage? Who are the highest-value customers [among them]? If you don’t have the customer data, you can’t answer that.”
Stephen Swoyer is a technology writer with 20 years of experience. His writing has focused on business intelligence, data warehousing, and analytics for almost 15 years. Swoyer has an abiding interest in tech, but he’s particularly intrigued by the thorny people and process problems technology vendors never, ever want to talk about. You can contact him at email@example.com.