Putting People at the Forefront of Technology

Whether in business projects or public endeavors, the belief that technology is always the answer leads to poor design and failure to meet the needs of real people.

Technochauvinism. An evocative word but the antithesis of where I believe technology should be taking business and society. It is a trap that IT project owners and managers -- especially in analytics and AI projects -- would do well to sidestep.

Meredith Broussard, a data journalist and associate professor at New York University, introduced the word in her 2018 book, "Artificial Unintelligence: How Computers Misunderstand the World." Simply put, technochauvinism is the belief that technology -- and, usually, technology alone -- is the solution to every challenge we encounter, whether in the physical world or in the fields of human or social endeavor.

The Free Dictionary defines chauvinism as a "[p]rejudiced belief in the superiority of one's own gender, group, or kind." Broussard adopts all these senses of the word, including making a strong case for addressing male chauvinism in the technology industry. My focus here is not on gender but on a widespread tendency (let's just call it forgetfulness) among technologists to neglect involving "ordinary people" early and deeply in the definition and design of IT solutions, which, as a result, fail to deliver expected benefits.

A Database Based on the Wrong Data

Broussard offers a range of examples of the inappropriate application of technology to real-world problems. Although the book’s title focuses on artificial intelligence, she also illustrates how technochauvinism can infect projects involving simpler technology.

Broussard cites the case of a centralized database of textbooks purchased for all schools in the Philadelphia school district. The system was implemented to ensure textbook availability for students as well as to enable volume purchasing and manage distribution -- all legitimate objectives for a database-based system.

It failed miserably at that first and most important objective. Superficially, this was because it depended on teachers and staff in schools for ongoing data entry of stock on hand, annual changes in textbook needs, book deliveries and, most challenging of all, the regular "borrowing" of books by students. With little training available, limited incentive, and deepening budget cuts, the database never accurately reflected the books available to students, particularly in poorer schools.

More fundamentally, the "people space" of the system was never properly conceived or architected. In my Business unIntelligence architecture, the people space represents the personal, organizational, and social needs of and constraints on all the people who use and/or benefit from the system. In this example, the primary stakeholders are the students and teachers for whose benefit textbooks are supplied. Secondary stakeholders include administrative staff of the district and taxpayers who fund the program.

In my opinion, the project started from a known and well-understood technology solution -- a database for inventory management -- rather than from the needs of the stakeholders. This solution met administrative needs, but it failed the primary stakeholders. Although a textbook inventory is very likely a component of a solution for these stakeholders' needs, proper attention to the project's people space would have identified additional and very different solution components and priorities. A real solution would likely have focused on tracking the location and use of individual copies of books physically in a school, as opposed to the number of copies of different titles in stock, some of which might still be sitting in their delivery boxes or "resting" in students' homes.

Note the difference: the number of books in stock is simple data and good for administrative needs, while tracking locations and use of individual books is more akin to information, more challenging to capture and track, and better suited to the real-world needs of ordinary people.
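To make the distinction concrete, here is a minimal, hypothetical sketch in Python -- the class and field names are my own illustration, not a description of the district's actual system. It contrasts the administrative view (how many copies of a title were bought) with a people-space view (where each physical copy is and who holds it), which is what a teacher needs to answer the question "can I hand a usable copy to this student today?"

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class CopyStatus(Enum):
    ON_SHELF = "on shelf"                 # available in the school's book room
    IN_CLASSROOM = "in classroom"         # assigned to a teacher's room
    WITH_STUDENT = "with student"         # "borrowed" and possibly resting at home
    IN_DELIVERY_BOX = "in delivery box"   # received but never unpacked


@dataclass
class TitleStock:
    """Administrative view: how many copies of a title were purchased."""
    title: str
    copies_purchased: int


@dataclass
class BookCopy:
    """People-space view: where one physical copy is and who has it."""
    copy_id: str
    title: str
    school: str
    status: CopyStatus
    holder: Optional[str] = None  # teacher or student currently holding it


def copies_usable_in_school(copies: List[BookCopy], title: str, school: str) -> int:
    """Count copies a teacher could actually hand to a student today."""
    usable = {CopyStatus.ON_SHELF, CopyStatus.IN_CLASSROOM}
    return sum(
        1 for c in copies
        if c.title == title and c.school == school and c.status in usable
    )


# The aggregate record says 120 copies exist; the per-copy view may show
# that most are still boxed or sitting in students' homes.
stock = TitleStock(title="Algebra I", copies_purchased=120)
copies = [
    BookCopy("A1-001", "Algebra I", "Lincoln High", CopyStatus.IN_DELIVERY_BOX),
    BookCopy("A1-002", "Algebra I", "Lincoln High", CopyStatus.WITH_STUDENT, "student 4711"),
    BookCopy("A1-003", "Algebra I", "Lincoln High", CopyStatus.ON_SHELF),
]
print(stock.copies_purchased)                                        # 120
print(copies_usable_in_school(copies, "Algebra I", "Lincoln High"))  # 1
```

The aggregate count is not wrong; it is simply the wrong unit of record for the primary stakeholders, and no amount of database tuning fixes that.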

The Unreasonable Effectiveness of Data

In a 2009 paper, "The Unreasonable Effectiveness of Data," Google researchers Alon Halevy, Peter Norvig, and Fernando Pereira explain how simple formulas -- such as F = ma or E = mc² -- often describe the physical world, whereas economists and social scientists must depend on statistical methods for problems that involve human behavior. "We should stop acting as if our goal is to author extremely elegant theories," they suggest, "and instead embrace complexity and make use of the best ally we have: the unreasonable effectiveness of data."

In those early days of big data, researchers at Google and other companies with enormous data sets, working on speech recognition, translation, and similar projects, were still enthralled by the unexpected answers and possibilities hidden in such data. Unfortunately -- as has become clear since then -- the unquestioning application of statistics and data to human and social (including business) systems can also create ethical problems for a wide variety of reasons, including hidden biases in the data or the project team and untested assumptions about cause and effect. These problems lead to discrimination against individuals and groups, and to faulty decisions that often benefit the privileged few at the expense of society as a whole.

Despite increasing evidence of such problems, well documented by Broussard and others, software vendors and IT project managers often operate from an unconscious technochauvinism as they eagerly pursue inappropriate or poorly considered business and government projects, particularly in analytics and AI.

Power to the People

The solution lies in a return to the old discipline of "requirements gathering" but with a new emphasis on the wider and more diverse set of stakeholders for modern IT projects. Customer segmentation projects, for example, must also consider the impact on different demographic customer groups to eliminate socially biased outcomes. Repurposing existing Internet of Things data, as is regularly seen in AI projects, requires careful consideration of why the data was originally collected to avoid unintended consequences or false conclusions.
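As one concrete illustration of that first point, here is a minimal sketch in Python -- not a complete fairness audit. It compares how often each demographic group lands in the segment that receives a favorable offer and flags large gaps for human review. The records, group labels, and the four-fifths-style threshold are illustrative assumptions, not a prescription.

```python
from collections import defaultdict

# Hypothetical output of a customer segmentation: each customer carries a
# demographic group label and a flag for whether the model placed them in
# the segment that receives the favorable offer. All values are made up.
customers = [
    {"group": "A", "favorable_segment": True},
    {"group": "A", "favorable_segment": True},
    {"group": "A", "favorable_segment": False},
    {"group": "B", "favorable_segment": True},
    {"group": "B", "favorable_segment": False},
    {"group": "B", "favorable_segment": False},
]


def favorable_rate_by_group(records):
    """Share of each demographic group assigned to the favorable segment."""
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        favorable[r["group"]] += int(r["favorable_segment"])
    return {g: favorable[g] / totals[g] for g in totals}


def flag_disparities(rates, threshold=0.8):
    """Flag groups whose rate falls below `threshold` of the best-served
    group's rate -- a simple four-fifths style screen, nothing more."""
    best = max(rates.values())
    return [g for g, rate in rates.items() if best > 0 and rate / best < threshold]


rates = favorable_rate_by_group(customers)
print(rates)                    # e.g. {'A': 0.67, 'B': 0.33}
print(flag_disparities(rates))  # ['B'] -- a gap worth investigating
```

Such a check is cheap to run during requirements and design, and its real value is the conversation it forces with the stakeholders it concerns.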

Technochauvinism aside, technology can assist in this modern requirements process. Used with care, social media and collaborative tools can surface unexpected needs or concerns from smaller or marginalized groups of stakeholders. Combining contrasting technologies, such as SQL and NoSQL data stores, offers additional agility in project scoping. However, the best way for project leaders and managers to meet real people's needs is to listen carefully, hear fully, and regularly examine the project team for unseen assumptions and biases.

About the Author

Dr. Barry Devlin is among the foremost authorities on business insight and one of the founders of data warehousing in 1988. With over 40 years of IT experience, including 20 years with IBM as a Distinguished Engineer, he is a widely respected analyst, consultant, lecturer, and author of "Data Warehouse: From Architecture to Implementation" and "Business unIntelligence: Insight and Innovation beyond Analytics and Big Data," as well as numerous white papers. As founder and principal of 9sight Consulting, Devlin develops new architectural models and provides international, strategic thought leadership from Cornwall. His latest book, "Cloud Data Warehousing, Volume I: Architecting Data Warehouse, Lakehouse, Mesh, and Fabric," is now available.

