Everyday Analytics: Predicting the Oscars, Racism in Algorithms
Analytics isn't only for highly technical subjects. Read these two articles to learn how tech firms used data to try to predict the outcome of the Academy Awards, and how using predictive analytics in the courts is raising serious ethical questions.
- By Lindsay Stares
- June 16, 2016
Data and the Oscars: How Accurate Are Predictions?
When it comes to the Academy Awards, we can all say who we think should win, but some people predict who is going to win. Some of those people are film industry insiders, some are bookies taking bets on the outcome, and recently, some are tech firms.
This past year, two firms tried to use data analytics to predict the Best Picture winner. Neither firm got the answer right, and neither did the human approach used by bookies. (Both groups thought The Revenant would win the top prize.) Read the BBC's coverage for details, including the types of data the firms considered and the approaches they used.
The Moral Dimension: Big Data and the Courts
Enterprises use big data to find trends, identify customer needs, and optimize business processes. Errors or unseen biases in the data or the model can quietly undermine those results.
What happens when the accuracy of a data algorithm could impact the rest of someone's life?
Court systems in many states have begun using predictive algorithms that combine numerous data points to rate the likelihood that a person will reoffend, and some courts even use these scores in sentencing. An investigation by ProPublica found that something in the data or the algorithm may be biased: it rates black defendants as higher risk, even compared to white defendants with prior offenses.
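One way an audit like ProPublica's can surface this kind of bias is by comparing error rates across groups: among people who did not reoffend, how often was each group still labeled high risk? Below is a minimal sketch of that idea, using invented records and a hypothetical `false_positive_rate` helper; it is an illustration of the general technique, not ProPublica's actual methodology or data.

```python
# Invented records for illustration only:
# (group, labeled_high_risk, reoffended)
records = [
    ("A", True,  False),
    ("A", True,  True),
    ("A", False, False),
    ("A", True,  False),
    ("B", True,  True),
    ("B", False, False),
    ("B", False, False),
    ("B", True,  False),
]

def false_positive_rate(rows):
    """Share of non-reoffenders who were still labeled high risk."""
    non_reoffenders = [r for r in rows if not r[2]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r[1])
    return flagged / len(non_reoffenders)

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    print(group, round(false_positive_rate(rows), 2))
```

With this toy data, group A's non-reoffenders are flagged high risk twice as often as group B's (0.67 vs. 0.33), even though the tool never sees group membership directly; a gap like that is the kind of signal an audit looks for.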
Read their in-depth report for more about the algorithm, the data ProPublica used to judge the risk assessment tool, and the impact this use of analytics is having.
Lindsay Stares is a production editor at TDWI. You can contact her at firstname.lastname@example.org.