Data is Risky Business: I’m a Doctor, Jim …

In Star Trek, DeForest Kelley used to exclaim "I'm a Doctor, Jim, not an engineer," or something similar, whenever Captain Kirk asked him to do anything outside the scope of his medical practice. On TV, this was kind of funny. However, this past month, Science Magazine carried a story about a team of academic researchers in data science who have developed an algorithmic model to identify gang-related crime. A paper describing this research was recently presented at the Artificial Intelligence, Ethics, and Society (AIES) conference in Louisiana.

In the Q&A after the presentation, the researchers were unable to address key questions about the quality of the training data, the risk of algorithmic bias, the implications of incorrect decisions mislabelling people, or the potential for the algorithms to be misused by gangs to predict when police might conduct raids.

The response of one of the researchers was: “I’m just an engineer.”

This is tellingly indicative of the Ethics of the Individual and the Ethics of the Organisation. It also highlights an issue I have raised in keynote presentations: the need to mature how we think about, and teach, ethics in information management. The issue comes in two parts.

Part 1: Academic research in analytics is often able to bypass rigorous ethical review because it uses information that is already available to the public or is already a matter of public record. So long as we are not actually experimenting directly on people, so long as we are not directly asking people about data relating to them, or so long as we are using data that is "public," the ethical bar for research is often lower. The outputs of that academic research (which might not have gone through a rigorous ethics review) are algorithmic models that are often adopted and adapted by organizations for social or commercial purposes. And, more often than not, we don't subject the processing and the application of those algorithms to ethical scrutiny, often because there is a presumption that the academics will have done the ethical legwork when doing their research.

Part 2: When presented with potential ethical issues arising from the application of analytics capabilities, information management professionals often fail to recognize them or to appreciate the impacts of the tools and technologies they are implementing. The ethical constructs in our organizations often do not provide any constraining situational modifiers to ensure that individuals will act ethically or recognize the potential ethical impacts of the tools they are developing.

Together, these factors can create an environment where an organization will do something phenomenally unethical as part of the development of new products or services. Another story this past month that highlights this is the disclosure that Facebook had to "nudge" children to use its Messenger Kids tool during the research phase for the rollout of a messaging application targeted at children. Children didn't want to engage with the application, so additional child-focused functions had to be introduced to entice them to use it. From an ethical perspective, this suggests that steps were taken to ensure that the outcome of the R&D for this application was the one the company desired (and, personally, I find the specific targeting of an advertising data-gathering tool at children unethical).

Considering that Facebook's objective is to maintain its user base, an eyebrow or two should be raised when extensive efforts are made to entice children to use its services, particularly when independent researchers are highlighting the potential risks to children of early or over-exposure to these types of applications. Your eyebrows should hit geostationary orbit when you consider that children up to the age of 13 are not equipped to understand complex issues such as privacy or the implications of sharing their data or their metadata (details such as who they are messaging, when, and how often).

But hey, the people involved in developing those nudges to entice children into using the application and sharing their data were “just engineers.”

The AIES paper was discussed on Twitter during the week. Some commenters suggested that mere engineers have no agency here: they are a replaceable commodity, there are thousands queueing up to take their jobs, so they can't take an ethical stand and need to do what they are told.

That’s not being “an engineer.” That’s “just following orders.” History has heard that defense before.

Daragh O Brien

Daragh O Brien is a data management consultant and educator based in Ireland. He’s the founder and managing director of Castlebridge. He also lectures on data protection and data governance at UCD Sutherland School of Law, the Smurfit Graduate School of Business, and at the Law Society of Ireland. He is a Fellow of the Irish Computer Society, a Fellow of Information Privacy with the IAPP, and has previously served on the boards of two international professional bodies. He also is a volunteer contributor to the Leaders’ Data Group (www.dataleaders.org) and a member of the Strategic Advisory Council to the School of Business in NUI Maynooth. He is the co-author of Ethical Data & Information Management: Concepts, Tools, and Methods, published in 2018 by Kogan Page, as well as contributing to works such as the DAMA DMBOK and other books on various data management topics.
