Ten Mistakes to Avoid in Predictive Analytics

Ten Mistakes to Avoid In Predictive Analytics: Exclusive Bonus Content for The Modeling Agency Subscribers

We'd like to extend a special welcome to TMA's newsletter subscribers! Below, you'll find an excerpt from our Members-only Ten Mistakes to Avoid series: Ten Mistakes to Avoid In Predictive Analytics. In addition, we've provided an exclusive "Bonus Mistake" not included in the printed publication.

The Ten Mistakes to Avoid series, published quarterly, addresses the 10 most common mistakes managers and teams make and gives you inside knowledge on how to avoid these common pitfalls. This quarter's Ten Mistakes to Avoid is a guide to successfully implementing predictive analytics—while avoiding common errors.

If you are interested in reading the full version of Ten Mistakes to Avoid In Predictive Analytics, we invite you to become a TDWI Member. Membership comes with a wide range of benefits, including a comprehensive selection of industry research, news, and information; access to all of TDWI's current and archived research and publications in a password-protected area of the TDWI Web site; and discounts to TDWI conferences and seminars.

Thank you for considering Membership with TDWI! Please send us your questions and feedback.

Ten Mistakes to Avoid In Predictive Analytics

Mistake One: Failure to Be Driven by Return on Investment

A predictive analytics project is an investment of time, energy, and resources in the development of a mathematical model used for making decisions about allocating organizational resources in a particular functional area.

Far too many organizations undertake development work as a research effort without a clear understanding of how the project will benefit the organization. There are few other areas of operations where such expenditures would be permitted.

It is not uncommon for large organizations to spend tens of thousands to millions of dollars on a predictive analytics project. Project opportunities should be evaluated and prioritized based on their expected returns.

Strong arguments can be made for low-risk and high-return-on-investment (ROI) projects. However, many organizations commit resources at a level that makes high ROI virtually impossible, or they develop project designs that are relatively high risk.

Solid predictive analytics opportunities can be identified in many functional areas. They are exemplified by a well-defined business decision process, where a relatively small enhancement offers significant financial benefit.

Most organizations find that they are better served by a larger number of predictive analytics projects, where each project has a smaller scope, can be completed in less time, and requires a smaller investment. Prime opportunities for predictive analytics projects are in areas where the decision process is well understood and being performed by multiple individuals within an organization.

Many businesses find that there are several good ways of making decisions, and the domain experts in their organizations are already making decisions in a way that is successful.

Initial predictive analytics projects often achieve high ROI by synthesizing the multiple decisions being performed by these domain experts, and by developing a single best-practices model.

These projects are easy to evaluate because they involve decisions the organization already understands well. The business objectives are generally well developed, and access to the required data is defined.

Beginning with such projects offers an organization the opportunity to explore predictive analytics in a business environment where the technology can be applied and evaluated based on well-defined business objectives. It is not uncommon for organizations to achieve an ROI of 1,000 percent or higher on these types of projects.

Bonus Mistake: Insufficient Validation of Models

In human behavior modeling, there is no right answer. Our models are an attempt to identify subgroups of the population that display a behavior of interest at a rate that differs from the “in-general” perspective. This allows an organization to adapt its resource allocation strategies in a way that devotes more resources to the groups that provide the most benefit and reduces the resources allocated to the groups that have a negative impact on performance.
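To make this concrete, here is a minimal sketch (not from the publication) of comparing each subgroup's response rate against the overall baseline, a ratio often called lift. The DataFrame, column names, and segment labels below are illustrative assumptions, not the author's data.

import pandas as pd

# Illustrative data: one row per customer, with a binary column marking
# whether the behavior of interest (e.g., a response to an offer) occurred.
df = pd.DataFrame({
    "segment":   ["A", "A", "B", "B", "B", "C", "C", "C", "C", "C"],
    "responded": [1,    0,   1,   1,   0,   0,   0,   1,   0,   0],
})

baseline_rate = df["responded"].mean()               # the "in-general" rate
segment_rates = df.groupby("segment")["responded"].mean()
lift = segment_rates / baseline_rate                 # >1 means the group over-performs

print(f"Baseline rate: {baseline_rate:.2f}")
print(lift.round(2))

Groups with lift well above 1 are candidates for additional resources; groups well below 1 are candidates for reduced allocation.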

This effort is made more difficult by the fact that behavior changes over time. Many project teams rely on highly sophisticated techniques and algorithms, devoting the bulk of their data to model development and leaving only the remainder for testing the developed model.

The single most important criterion for evaluating a model is how it performs in a live decision-making environment, measured by the organization’s real performance metrics. Successful project teams typically reduce the data and effort allocated specifically to model development and place a higher priority on repeated validation of a model’s performance before implementing it.
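As a minimal sketch of this priority, assuming scikit-learn and an illustrative dataset, repeated train/validation splits can be used to check whether a model's performance holds up across multiple samples before it is considered for live use. The estimator, metric, and split parameters below are assumptions for illustration, not the author's prescription.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.metrics import roc_auc_score

# Illustrative data standing in for the organization's own records.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

# Repeated stratified splits: each repetition reserves 40 percent of the data
# for validation rather than using nearly everything for model building.
splitter = StratifiedShuffleSplit(n_splits=10, test_size=0.4, random_state=0)
scores = []
for train_idx, valid_idx in splitter.split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(roc_auc_score(y[valid_idx],
                                model.predict_proba(X[valid_idx])[:, 1]))

# Scores that vary widely across repetitions are a warning that the model's
# analytic performance may not replicate in a live environment.
print(f"AUC mean {np.mean(scores):.3f}, std {np.std(scores):.3f}")

In practice, the validation metric should be the organization's real performance measure rather than a purely analytic score such as AUC.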

It is far better to abandon a particular model than to deploy one that performs well from an analytic perspective but cannot replicate that performance in a live environment.

About the Author

Thomas A. “Tony” Rathburn has guided commercial and government clients internationally in the implementation of predictive analytics solutions since the mid-1980s. As a senior consultant with The Modeling Agency, Tony delivers custom workshops and collaborates on a wide range of assignments. He is a regular presenter in the predictive analytics track at TDWI World Conferences, and hosts a popular Webinar: “Data Mining: Failure to Launch.” Contact him at [email protected].

Interested in reading the full publication? Become a Member today!
