Just because we have amassed a huge amount of data doesn’t mean that we really understand what it is telling us. To move from raw data to actionable information, we frequently must use algorithmic techniques. However, the ever-growing range of available algorithms and the confusing landscape of algorithmic technologies can make it hard to select, apply, and integrate algorithmic intelligence into your data analysis pipeline.
This is the second part of a two-part course; the two parts can be taken together as a full-day course, or each can be taken on its own. In this second part we look at the technical implementation details of integrating algorithms and work together through six practical applications of the algorithms discussed in Part 1. If you didn't attend Part 1, you can still dive right in, using R, Python, AWS, and APIs to experiment with correlation analysis, principal component analysis, clustering, prediction, and anomaly detection.
You Will Learn
- The difference between data, information, and knowledge, and how to improve visualizations with algorithms that move from data to knowledge.
- How algorithms can make the difference between a data-driven message that falls flat and one that drives stakeholders to action.
- How to run correlation analyses and visualize the results in ways that both explore and explain.
- How to perform principal component analysis in R, how to interpret the results, and when the technique is useful.
- How to perform a clustering analysis and how to integrate the results into business analysis.
- How to predict outcomes by using decision trees in Python, and the difference between unsupervised and supervised machine learning.
- Important differences between client-side and server-side operations, as related to algorithmic implementation architectures, and pros and cons of each.
- How to leverage the growing world of APIs by using AWS, Postman, and API algorithm providers.
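To give a flavor of the hands-on exercises, here is a minimal sketch of a correlation analysis in Python. The data is invented for illustration (it is not course material), and the course itself covers this in both R and Python:

```python
import numpy as np

# Illustrative data (not from the course): hours of study vs. exam score
hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
score = np.array([52.0, 55.0, 61.0, 64.0, 70.0, 75.0])

# Pearson correlation: covariance scaled by the two standard deviations
r = np.corrcoef(hours, score)[0, 1]
print(f"Pearson r = {r:.3f}")  # strongly positive for this made-up data
```

Visualizing such results (for example, as a correlation matrix heatmap) is where the course's "explore and explain" distinction comes in.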
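The course demonstrates principal component analysis in R; an equivalent sketch in NumPy, on synthetic data chosen so that one direction dominates the variance, looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative 2-D data: the second column is mostly a scaled copy of the first,
# so nearly all of the variance lies along a single direction
x = rng.normal(size=200)
data = np.column_stack([x, 0.5 * x + rng.normal(scale=0.1, size=200)])

# PCA via eigendecomposition of the covariance matrix of the centered data
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
explained = eigvals[::-1] / eigvals.sum()   # variance explained, largest first
print("variance explained per component:", np.round(explained, 3))
```

When the first component explains most of the variance, as here, the data can be summarized in fewer dimensions with little loss: the situation in which PCA is most useful.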
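Clustering is unsupervised: the algorithm groups observations without being given labels. A minimal sketch using k-means, assuming scikit-learn (the course does not prescribe a specific library here) and two well-separated synthetic blobs:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two illustrative, well-separated blobs of 50 points each
blob_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))
blob_b = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))
points = np.vstack([blob_a, blob_b])

# Unsupervised learning: k-means groups the points without any labels
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print("cluster sizes:", np.bincount(km.labels_))  # each blob forms one cluster
```

In a business setting the resulting cluster labels become a new attribute (for example, a customer segment) that feeds back into the analysis.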
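Prediction with decision trees, by contrast, is supervised: the model learns from labeled examples and is evaluated on held-out data. A sketch in Python, assuming scikit-learn and its bundled iris dataset (both are assumptions for illustration, not prescribed by the course):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Supervised learning: features X come with known labels y
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# A shallow tree keeps the learned rules interpretable
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)
print(f"held-out accuracy: {tree.score(X_test, y_test):.2f}")
```
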
Who Should Attend
- Data analysts, business analysts, business intelligence professionals, analytics professionals, data scientists, and data visualization practitioners
- Developers or architects responsible for integrating disparate technologies
- Anyone responsible for finding and communicating knowledge derived from data