5 Minutes with an Analyst: Mike Driscoll of Metamarkets
Mike Driscoll of Metamarkets explains where data analytics and data science are headed, the tools and techniques that have great potential, and the beauty of tables in displaying analytics results.
- By James E. Powell
- December 14, 2016
Mike Driscoll is the CEO of Metamarkets, a San Francisco-based company providing interactive analytics for programmatic marketing. Driscoll founded Metamarkets in 2010 after spending more than a decade developing data analytics solutions for online retail, life sciences, digital media, insurance, and banking. He spoke to Upside about the role of an analyst.
Where is data analytics/data science headed in the next few years?
In the next few years I see three key themes: real-time streaming, verticalization, and actuation.
First, we're seeing this shift in the classes of data that data scientists and analysts work with, moving from static data sets to real-time data streams.
Second, I think data analytics and BI tools are increasingly becoming specific to verticals rather than being general purpose. Driving a car or deciding where a medical laser should be aimed is a harder problem than generic image analysis and requires a deeper understanding of the workflow behind the problem.
Third, we're moving from a world where data science was just about analysis to actually connecting analysis with action. Sensors are not valuable unless you have actuators that they can be attached to. Data scientists are getting more involved in connecting insights to actions.
Is there a tool or technique that isn't popular today but has a lot of potential? Why?
One of the more interesting areas of innovation in big data analysis these days is statistical approximation. The first era of big data and MapReduce algorithms focused on getting exact sums, counts, and averages at scale. That was powerful, but we are now seeing more frequent use of a class of algorithms that allow tradeoffs between processing cost and processing precision. That mirrors how natural systems often deal with data at scale. The eye doesn't retain 100 percent of every photon that hits its sensor -- likewise, a lot of data processing and big data stacks need not retain 100 percent of the data they come across to get to approximate insights quickly.
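To make that cost-versus-precision tradeoff concrete, here is a minimal sketch (not a technique Driscoll or Metamarkets describes using) of a k-minimum-values estimator, which approximates a distinct count from a fixed-size set of hashes instead of retaining every record:

```python
import hashlib
import heapq

def kmv_distinct_estimate(items, k=256):
    """K-minimum-values sketch: estimate the number of distinct items
    while keeping only the k smallest normalized hash values."""
    heap = []      # min-heap of negated hashes, so heap[0] is -(k-th smallest hash)
    kept = set()   # the k candidate hashes currently retained
    for item in items:
        # Hash each item to a pseudo-uniform value in [0, 1).
        h = int(hashlib.md5(str(item).encode()).hexdigest(), 16) / float(2 ** 128)
        if h in kept:
            continue
        if len(heap) < k:
            heapq.heappush(heap, -h)
            kept.add(h)
        elif h < -heap[0]:
            # New hash is smaller than the largest retained one: swap them.
            kept.discard(-heapq.heappushpop(heap, -h))
            kept.add(h)
    if len(heap) < k:
        return len(heap)            # fewer than k distinct values seen: count is exact
    return int((k - 1) / -heap[0])  # standard KMV estimator

# Example: estimate a million distinct IDs from only 256 retained hashes.
print(kmv_distinct_estimate((f"user-{i}" for i in range(1_000_000)), k=256))
```

With 256 hashes of state, the estimate typically lands within roughly 5 to 10 percent of the true count -- the kind of fast, cheap, approximate answer Driscoll is describing.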
What's your favorite part about being an analyst/data scientist? Your least favorite part?
The most fun part of being a data analyst is the art of mapping numeric data to visual insights. There's a lot of art involved in how to map a time series into colors and axes and shapes -- that's where analysts can be artists. The least fun part, unsurprisingly, is data "janitoring" and preparation -- in most cases you spend 80 percent of your time acquiring, structuring, and preparing the data and only 20 percent of your time performing analysis. We're trying to change that here at Metamarkets and make it easier to have data available at your fingertips.
Are you working on anything interesting right now? If not, what's your dream project?
I believe the most powerful data visualization is the humble table, and I'm passionate about finding ways to take the tabular display of information into the modern era. That's something I'm exploring, and I'm also excited about all the work we're currently doing at Metamarkets with many of the leading companies across the programmatic marketing landscape to provide them with better ways to turn mountains of auction data into interactive visualizations quickly.
If you could go back in time, what's the one thing you would tell yourself as a new analyst/data scientist?
As a data analyst, you're often working through a series of transformations with your data sets. The most important thing on the path to getting insights is to properly manage that transformation process. The only way to maintain sanity on that path is to automate your work so that you know how you got through each step. Data analysts who don't organize their transformation pipelines often end up unable to repeat their analyses, so the advice I would give myself is the same advice often given to traditional scientists: make your experiments repeatable!
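As one hypothetical illustration of that advice (the step names and file path below are invented for the example, not Driscoll's own workflow), a repeatable pipeline can be as simple as an ordered list of named transformation functions driven by a single script:

```python
import csv

# Each step is a small named function, so the whole analysis can be
# re-run from the raw file at any time and every result is reproducible.

def load(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def drop_missing(rows):
    return [r for r in rows if r.get("revenue") not in (None, "", "NA")]

def cast_types(rows):
    return [{**r, "revenue": float(r["revenue"])} for r in rows]

def summarize(rows):
    return {"rows": len(rows), "total_revenue": sum(r["revenue"] for r in rows)}

PIPELINE = [drop_missing, cast_types, summarize]

def run(path):
    data = load(path)
    for step in PIPELINE:
        print(f"running step: {step.__name__}")  # records how each result was reached
        data = step(data)
    return data

if __name__ == "__main__":
    print(run("raw_orders.csv"))  # hypothetical input file
```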
About the Author
James E. Powell is the editorial director of TDWI publications, including research reports, the Business Intelligence Journal, and the Upside newsletter.