I love books that make you think. We The People, by Kathy Rondon, leads us down the path of rethinking what we do with data and, for most of us, makes us realize that we often fail to consider ethics when using data to make business decisions. To quote Kathy, “Just because we can do something, does that mean we should do it?”
This book covers the data ethics landscape for both the public and private sectors. Kathy lays out potential problems in our collection, sharing, use, and retention of data; how those problems can negatively impact more than just an individual person’s privacy; and the questions we should be asking of ourselves and our data practices so that we and our fellow citizens can live our lives freely and fairly. Using the goals of the Preamble to the U.S. Constitution to frame the conversation, you will understand how leveraging big data can promote democratic values, and how it can undermine them when ethical considerations take a back seat to unfettered data collection and use.
Kathy’s writing style is engaging yet concise. Here is an excerpt, for example, from the book’s introduction (used with permission from Technics Publications):
In 2016, HBO premiered the series Westworld. Taking place in a not-too-distant future, the story follows a theme park populated by Artificial Intelligence (AI)-enabled robotic “hosts.” Many of the theme park’s human customers take the opportunity to do things they would not do in the “real world,” and the ethical argument that takes place early in the series is one of questioning where consciousness begins and whether depraved actions toward the “not human” are really depraved at all, or just letting off steam. By season three of Westworld, though, the real point has become apparent: the theme park revenue was not the point at all, but rather the collection of enormous amounts of data about attendees, the aggregation of that data with other data from vast repositories, and the use of that data to control very real lives in the very real world. In a seminal scene in Westworld season three, a character, becoming aware of the scope of the data collection and its use society-wide, says, “So, it tells them who I am?” The chilling response is, “It’s not about who you are, Caleb. It’s about who they’ll let you become.”
Dystopian? Maybe not. In 2020, The Washington Post reported on “surveillance scores” developed by data collection and analysis companies, which operate almost entirely without regulation, and how those scores can have dire consequences. The impact of credit rating companies like Equifax, TransUnion, and Experian is well known, but these companies are only the tip of the iceberg.
CoreLogic, a California-based data analytics and business intelligence company, uses big data collection and analysis to provide landlords with scores that indicate not only the risk of a tenant failing to pay rent on time, but also the tenant’s ability to absorb a rent increase. HireVue, a Utah-based company that provides a hiring software platform, also generates an “employability” score using large caches of proxy data on applicants.
Examples abound across the economy. These scores result in automated decisions that are not transparent to the scored subjects and that, in most cases, cannot be questioned or challenged. This is a problem because data and its use are not, ipso facto, impartial and unbiased. And in a democratic society, treating the authority of “data” as inexorable undermines democracy itself. Democratic societies need an ethical framework for data collection, use, sharing, and retention—a framework that actively informs a society-wide legal and regulatory regime.
If you were to do a general internet search for the term “data ethics,” you would find several different definitions and approaches…