
Perspective: What Makes "Analytics" Analytics?

As we contemplate a future in which machines produce analytics insights, remember that no combination of technologies and methods can replace the human capacity for curiosity and wonder.

Increasingly, we use the term "analytics" to describe just about everything, from business intelligence (BI) reports, charts, and other not-quite-analytical artifacts to the most advanced data visualizations.

However, a BI report is not (necessarily) a piece of analysis. An ad hoc query tool by itself is not analytical, and neither is a self-service data visualization tool such as Tableau.

What makes these things "analytical" is their creation and use by human beings.

We could split hairs here -- especially as we contemplate a future in which "intelligent" machines are tasked with the production of new analytics insights. However, someone has to study, decompose, and frame the initial problem. Someone has to identify that problem's constitutive elements. Someone has to enumerate and interpret the relations between these elements. Someone has to instantiate this interpretation in software or firmware, e.g., in an analytics model.

Before any of this happens, someone has to speculate, inquire, and hypothesize. Someone has to be curious -- to wonder.

That someone is a human being.

This will continue to be the case until machines somehow develop (or, what is more likely, have developed for them) a capacity for curiosity and wonder, both of which are arguably necessary for self-awareness and abstract thought. Until then, even the most sophisticated ensemble models, ensemble methods, or function-specific artificial intelligences (AIs) cannot reveal, let alone "discover," anything that isn't built into their understanding of the world they purport to model.

So what do we really mean by the term "analytics"?

Often, we mean something quite mundane. "Analytics" describes the practice of putting pieces of information together into new and different combinations -- data models, multidimensional models, predictive models, algorithms, etc. -- such that they approximate a richer, more revealing, more actionable world.

Analytics can take the form of a basic business fact -- sales of this product in this store in this region for this period, for example. A historical analysis could ask how sales of this product at this time compare with sales at the same time last year or five years ago.
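To make this concrete, both the basic fact and the historical comparison are simple operations over a fact table. The sketch below is purely illustrative: the table and its columns (product, store, region, year, sales) are hypothetical, and pandas is just one convenient way to express the query.

```python
import pandas as pd

# Hypothetical sales facts: one row per product/store/region/year.
sales = pd.DataFrame({
    "product": ["P1", "P1", "P1"],
    "store":   ["S9", "S9", "S9"],
    "region":  ["NE", "NE", "NE"],
    "year":    [2015, 2016, 2017],
    "sales":   [120_000.0, 132_000.0, 118_800.0],
})

# Basic business fact: sales of this product in this store for this period.
fact = sales.query("product == 'P1' and store == 'S9' and year == 2017")
print(fact)

# Historical analysis: how do this year's sales compare with prior years?
sales = sales.sort_values("year")
sales["yoy_change"] = sales["sales"].pct_change()  # e.g., -0.10 for 2017
print(sales)
```

Neither step is "analytical" on its own; the analysis lies in a person deciding that this comparison, over these dimensions, is worth making.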

More advanced types of analytics are far from mundane, however. They use statistical techniques to generate an output -- a prediction, a prescription, a simple correlation -- that's a function of one or more input variables.
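In code, that description is nearly literal. The sketch below, which assumes scikit-learn and fabricated data purely for illustration, fits a model whose output (a predicted sales figure) is a function of two input variables (spend and price).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Fabricated inputs: advertising spend and average price; output: unit sales.
X = np.array([[1.0, 9.99], [2.0, 9.79], [3.0, 8.99], [4.0, 8.29]])
y = np.array([110.0, 128.0, 161.0, 190.0])

model = LinearRegression().fit(X, y)

# The output is literally a function of the input variables:
# prediction = intercept + coef[0] * spend + coef[1] * price
print(model.predict([[2.5, 9.25]]))
```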

Some analytics technologies -- such as machine learning and AI -- also have a limited capacity to "learn" or adapt as real-world conditions change. They're able to do this by making predictions or inferences based on the information available to them. They cannot "think," "know," or "imagine." Instead, they work programmatically, in accordance with the logic and rules that are built into their enabling models. Their sophistication and power can be enhanced by combining them with other models or technologies.
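What "learn" means here is narrower than the word suggests: the model updates its parameters as new observations arrive, strictly according to a built-in update rule. A minimal sketch, assuming scikit-learn's incremental SGDRegressor as a stand-in for any online learner:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
model = SGDRegressor(learning_rate="constant", eta0=0.01)

# A stream of observations; the true relationship shifts partway through.
for step in range(200):
    slope = 2.0 if step < 100 else 3.0           # real-world conditions change
    x = rng.uniform(0.0, 1.0, size=(1, 1))
    y = slope * x.ravel() + rng.normal(0.0, 0.05, size=1)
    model.partial_fit(x, y)                      # programmatic "adaptation"

print(model.coef_)  # drifts toward the new slope, by rule, not by insight
```

The model tracks the change because its update rule happens to accommodate it, not because it noticed that anything changed.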

Nothing Will Come of Nothing

No combination of analytics technologies and methods can approximate the human capacity for curiosity and wonder. Historian of ideas Lorraine Daston, among others, has written persuasively about the role curiosity and wonder play in the production of knowledge.

Analytics technologies lack these capacities. They likewise lack the related capacities of imagination and speculation, which are the products of human self-consciousness and thought.

Analytics is constrained at all times by the parameters that determine the makeup of its world. It cannot "think" outside of this world. If logic for associativity, for referentiality, or for a kind of (primitive) conceptual abstraction isn't built into an analytical model -- or enabled by means of other technologies, such as a graph or text-analytics engine -- it cannot accommodate change or discontinuity. Furthermore, if real-world conditions change drastically, even the most accurate, reliable, or sophisticated ensemble of technologies will cease to be useful.

The accuracy of analytics tends to diminish over time as a function of continuous change. As real-world conditions diverge from the parameters instantiated in an analytics model, the model has less predictive or revelatory power.

We can use different combinations of heuristics, decision rules, and machine learning, along with other methods, to control for divergence, to some extent. Think of this as managing for known-known and known-unknown change. We cannot design our analytics technologies to discover, analyze, and (most important) creatively adapt to radical -- i.e., unknown-unknown -- change, however.
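One hypothetical decision rule of this kind, sketched below, monitors a model's recent prediction error against its historical baseline and flags the model for retraining when the two diverge. It is a generic heuristic for known-unknown change, not any particular vendor's method, and by construction it cannot recognize a change it was never told to measure.

```python
import numpy as np

def error_drift_alarm(errors, window=50, threshold=2.0):
    """Flag divergence when recent error drifts above its historical baseline.

    A crude control for known-unknown change: it can detect that accuracy
    is decaying, but it cannot explain why or invent a better model.
    """
    errors = np.asarray(errors, dtype=float)
    if errors.size < 2 * window:
        return False  # not enough history to judge
    baseline, recent = errors[:-window], errors[-window:]
    # Trigger when the recent mean error exceeds the baseline mean by
    # `threshold` baseline standard deviations.
    return recent.mean() > baseline.mean() + threshold * baseline.std()
```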

Barring the emergence of something like artificial general intelligence, we aren't likely to produce analytics models that have this capacity. It's inescapably a problem for human wonder, analysis, and creative transformation.

Remember, analytics has only an a priori understanding of the world -- knowledge independent of experience. If an object of knowledge isn't already "there" in the analytics itself -- i.e., instantiated in the logic that models, parses, transforms, processes, and/or combines data objects, or measured by these data objects themselves -- it can't be discovered or mined. The conditions for its possibility don't yet exist.

The point is that they aren't going to exist unless a human being identifies them.

Analytics isn't magical. It isn't mystical, miraculous, or oracular -- or a god from the machine. It is the product of human imagination and human labor. To this end, speculation, inquiry, research, and analysis are critical components of the analytics development process.

Most important, however, are the naive human feelings of curiosity and wonder: the capacity to be amazed and baffled, uncertain and afraid, hopeful and optimistic. Those of us who have used data visualization technology to reveal hitherto unknown -- and undreamt-of -- patterns, phenomena, and, yes, anomalies have a sense of what this means.

In the latter case, the discovery of a phenomenon or anomaly isn't the end -- it's just a beginning. It's a prelude to a process of speculation, inquiry, research, testing, and analysis.

The feelings and attributes I'm describing will not easily be taken up by machines. There's every reason to doubt that an AI -- self-aware or no -- could experience or know something like curiosity or wonder. As Daston and co-author Katharine Park write in their award-winning Wonders and the Order of Nature, wonder -- not just curiosity, but optimistic wonder, fearful wonder, wonder in the face of the strange, uncanny, or anomalous -- is a reliable engine for discovering and producing knowledge.

"[W]onder and wonders hovered at the edges of scientific inquiry," they write. "[T]hey defined those edges, both objectively and subjectively. Wonders as objects marked the outermost limits of the natural. Wonder as a passion registered the line between the known and the unknown."

About the Author

Stephen Swoyer is a technology writer with 20 years of experience. His writing has focused on business intelligence, data warehousing, and analytics for almost 15 years. Swoyer has an abiding interest in tech, but he’s particularly intrigued by the thorny people and process problems technology vendors never, ever want to talk about. You can contact him at [email protected].

