
Machine Learning: From "In Vogue" to "In Production"

What is happening now in machine learning is very much like the homebrew computer movement from a half-century ago.

Can you name a technology that almost all of us have been using for 30 years that is paradoxically now considered to be the Next Big Thing?

That would be machine learning. It has been powering things (such as credit card fraud detection) since the late 1980s, about the same time banks started widespread use of neural networks for signature and amount verification on checks.

Machine learning has been around a long time, even in very widely used applications. That said, there has been a massive technological revolution over the last 15 years.

This upheaval is normally described in recent news and marketing copy as revolutionary because tasks that appeared impossibly hard (such as playing Go, recognizing a wide range of images, or translating text in video on the fly) have suddenly yielded to modern methods, tantalizing us with the promise that stunning new products are just around the corner.

The Real Change Isn't What You Think

In some sense, however, the important change is a shift that takes machine learning from something usable only in a few niche applications -- supported by an army of Ph.D.-qualified mathematicians and programmers -- into something that can turn a few weekends of effort by an enthusiastic developer into an impressive project. If you happen to have that army of savants, all well and good, but the real news is not what you can do with such an army. The real news is what you can do without one.

Just recently, academic and industrial researchers have started to accompany the publication of their results with working models and the code used to build them. Interestingly, it is often possible to start with these models and tweak them just a bit to perform a new task, typically using only a fraction of a percent of the original data and compute time for the retuning. You can take an image-classification program originally trained to recognize images in any of 1,000 categories -- a job that required tens of millions of images and thousands of hours of high-performance computer time -- and rejigger it to distinguish chickens from blue jays with a few thousand sample images and a few minutes to a few hours on your laptop. Deep learning has, in a few important areas, turned into cheap learning.
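To make that concrete, here is a minimal sketch of that kind of retuning using TensorFlow's Keras API (the article doesn't name a framework, so this is one plausible choice). The folder path and the two bird classes are hypothetical stand-ins for whatever labeled photos you happen to have.

```python
# A minimal transfer-learning sketch (assumes TensorFlow 2.x with Keras and a
# local folder of labeled images, e.g. birds/chickens and birds/blue_jays).
import tensorflow as tf

# Load labeled images from a directory; subfolder names become class labels.
# "birds/" is a hypothetical path -- point it at your own photos.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "birds/", image_size=(224, 224), batch_size=32)

# Start from a network already trained on ImageNet's 1,000 categories,
# drop its final classification layer, and freeze everything else.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

# Bolt a tiny new classifier head on top -- this is the only part we train.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(2, activation="softmax")(x)  # chicken vs. blue jay
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# A few thousand images and a handful of epochs on a laptop is typically enough.
model.fit(train_ds, epochs=5)
```

The expensive part -- learning general visual features from millions of images -- is inherited from the pretrained base; only the small head on top is trained from scratch, which is why the retuning is so cheap.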

Over the last year or two, this change has resulted in an explosion of hobby-level projects where deep learning was used for all kinds of fantastically fun -- but practically pretty much useless -- purposes. As fanciful and even downright silly as these projects have been, they have had the very practical and important effect of building a reservoir of machine learning expertise among developers with a wide array of domain knowledge.

Coming Soon

Those developers who have been building machines to kick blue jays out of a chicken coop, play video games, sort Legos, or track their cat's activities will inevitably branch out soon to solve problems they see in their everyday work. It will be a short walk from building a system for the joy of messing about to building systems that solve real problems.

What is happening now in machine learning is very much like the homebrew computer movement from a half-century ago. The first efforts resulted in systems that only a hacker could love, but before long we had the Apple II and then the Macintosh. What started as a burst of creative energy changed the world.

We stand on the verge of the same level of change.

About the Author

Ted Dunning is chief applications architect at MapR Technologies and a board member for the Apache Software Foundation. He is a PMC member and committer of the Apache Mahout, Apache Zookeeper, and Apache Drill projects and a mentor for several incubator projects. He was chief architect behind the MusicMatch (now Yahoo Music) and Veoh recommendation systems and built fraud detection systems for ID Analytics (LifeLock). He has a Ph.D. in computing science from the University of Sheffield and 24 issued patents to date. He has co-authored a number of books on big data topics including several published by O’Reilly related to machine learning. Find him on Twitter as @ted_dunning.
