A couple of different things have converged in my media feeds that highlight a key issue all data management professionals and leaders in information management need to be conscious of: Group Think.
Group Think manifests itself in at least two ways in the examples I’ll use to illustrate my points. Each has implications for data governance practices and for how we implement them in organizations, particularly in the context of regulated activities (i.e. where there is a legal requirement to do things a certain way) or of activities that are not yet explicitly regulated but raise potential ethical challenges (e.g. potential discrimination arising from bias in processing).
Firstly, there is what I will term “Class A Group Think.” This manifests itself as hostility to, or rejection of, challenges to or questioning of an organization’s decisions or actions when those challenges are raised by bodies external to the organization. The example I’ll use is an ongoing issue in Ireland related to the introduction, by stealth, of what is effectively a national identity card system. The Irish Data Protection Commission issued investigation findings a few weeks ago determining that there was no legal basis for the extension of an identity management system beyond the Department of Employment Affairs and Social Protection (the Social Welfare ministry), despite it having been made a requirement for accessing a wider range of government services such as passport applications, driver’s license applications, or complaints about school bus transportation. This has exposed the Irish Government to significant potential liabilities under data protection laws, including potential civil liability.
The problem is compounded by the fact that various expert commentators and civil society groups have been warning the Government for over three years about the risks in its strategy. These commentators (your humble author included) were dismissed as cranks. The Data Protection regulator had sent draft findings to the Department over a year ago that clearly flagged the issues forming the basis of its ultimate findings. The Department responded with a 470-page rebuttal that doubled down on the drivers for the project. The Department’s own Data Protection Officer, who is required under data protection law to be independent, also raised concerns about the project. He was moved to a different role this past January and replaced by someone with direct ties to the Public Services Card project (the alleged interference with the independence of the DPO is the subject of a separate regulatory investigation).
To paraphrase the Marx Brothers movie “Duck Soup,” the Irish Government didn’t engage with the concerns being raised because they had already paid a month’s rent on the battlefield. How familiar is this vignette to those of us who advise organizations on their data management strategy and obligations? What it illustrates is the critical importance of building into the design of our data management governance and oversight processes a “safety valve” that allows us (and in some cases compels us) to engage with rebuttal of, or challenge to, the data-driven goals of the organization. More often than not, by following Stephen Covey’s mantra and “seeking first to understand, then to be understood,” we can deliver better outcomes for organizations, customers, citizens, and society. But that requires a conscious design decision in the governance structures of the organization, including the “tone at the top” (in this case, a Government Minister who once cheerfully described the card as “mandatory but not compulsory”).
In the context of this Public Services Card, one of the constant refrains from critics of the project has been that they are not opposed to the idea of a national identity card, but that they expect the introduction of one to be subject to appropriate debate and to appropriate legislation introducing safeguards against misuse and abuse of the system. As a minimum starting point, the scheme also needed to comply with existing data protection laws in its design and implementation. It doesn’t, as the Regulator confirmed this past month.
The other facet of Group Think that has been visible in my media feed these past few weeks is the internal Group Think that results in governance processes that prevent or block the escalation of data management issues or concerns within the organization. Katherine O’Keefe and I discuss an earlier manifestation of this in our book Ethical Data and Information Management, in the context of how Facebook reacted internally to data showing a bias in shared content and links towards right-wing content. While engineers raised the issue in internal chat groups, there was no defined mechanism for those concerns to be taken forward in the governance structures of the organization.
This past month, Apple has joined the ranks of other companies with AI voice assistants in having a data privacy scandal. A whistleblower revealed in the Guardian newspaper that staff at an Apple sub-contractor in Cork, Ireland, had been listening to and “grading” audio recordings linked to Siri activations. These included recordings of medical consultations, drug deals, and people having sex. According to the whistleblower, contractors raised concerns about the legality of this processing under EU data protection laws and about the ethical issues arising from hearing things like medical consultations and other intimate moments. However, the grading and evaluation process only allowed staff to report technical issues with the audio (e.g. recording quality); there was no way to flag that a recording was of a private or intimate nature.
Apple has responded in a timely manner by terminating the grading and review processes that were being conducted in Ireland. As a result, 300 contractors have been laid off with a week’s notice (sending the clear message to any other contractors working for Apple that blowing a whistle will have consequences for their colleagues).
At the heart of this story is an internal “group think” that sees audio samples merely as data, and the human grading process merely as a way to better train the AI systems, and as a result doesn’t consider the legal or ethical safeguards that might be needed for the processing. These might include telling users that this kind of processing takes place, being clear as to the legal basis for it and the safeguards around it, and ensuring that staff could properly categorize data that was of concern. As my colleague Katherine O’Keefe puts it: data is people, and that is an aspect Apple appears to have missed.
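To make the point concrete, here is a minimal sketch of what a grading record with that kind of “safety valve” might look like. Everything in it (the field names, the categories, the escalation rule) is hypothetical and illustrative only; it does not describe Apple’s actual grading tools. The point is simply that giving reviewers a first-class way to flag sensitive content, and routing flagged samples out of the training pipeline, is a governance design decision rather than a technical impossibility.

```python
# Hypothetical sketch only: these names and categories do not describe
# Apple's actual grading tools. They illustrate a grading record that
# captures privacy concerns alongside technical ones.
from dataclasses import dataclass
from enum import Enum, auto


class TechnicalIssue(Enum):
    NONE = auto()
    POOR_AUDIO_QUALITY = auto()
    ACCIDENTAL_ACTIVATION = auto()


class SensitivityFlag(Enum):
    NONE = auto()
    MEDICAL = auto()
    INTIMATE = auto()
    POSSIBLE_CRIMINAL_ACTIVITY = auto()
    OTHER_PRIVATE = auto()


@dataclass
class GradingRecord:
    sample_id: str
    transcription_accurate: bool
    technical_issue: TechnicalIssue = TechnicalIssue.NONE
    # The field reportedly missing in practice: a way for the reviewer
    # to say "this recording should not be in the training pool at all."
    sensitivity: SensitivityFlag = SensitivityFlag.NONE
    reviewer_notes: str = ""

    def needs_escalation(self) -> bool:
        """Sensitive samples go to a privacy/ethics review queue
        instead of silently feeding back into model training."""
        return self.sensitivity is not SensitivityFlag.NONE


def submit(record: GradingRecord) -> None:
    # Route each graded sample based on the reviewer's flags.
    if record.needs_escalation():
        print(f"{record.sample_id}: escalated to privacy review "
              f"({record.sensitivity.name})")
    else:
        print(f"{record.sample_id}: returned to training pipeline")


if __name__ == "__main__":
    submit(GradingRecord("sample-001", transcription_accurate=True))
    submit(GradingRecord("sample-002", transcription_accurate=False,
                         sensitivity=SensitivityFlag.MEDICAL))
```

The specifics don’t matter; what matters is that the escalation path exists at all, and that someone consciously designed it into the process.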
Of course, all of this brings us to a third manifestation of Group Think that is prevalent in these two example projects: the blind adherence to a particular vision of technology and data capability.
In the context of the Public Services Card project there is an article of faith in the Irish civil service that it is necessary to create a Single View of the Citizen in order to deliver integrated, joined-up services. In the rush to create the answer, it would seem they have lost sight of the question. The question needs to look at how to deliver outcomes to individuals in a way that meets or exceeds their expectations. And those expectations need to include questions of privacy and security, particularly as any democracy is only ever one election away from a shift to totalitarianism.
In the context of Apple, Siri, and Artificial Intelligence, there is a willingness to accept a “happy path” vision of technology where the machines do all the learning and make our lives better. But, just like Dorothy in Oz, we need to remember that behind the curtain there is usually a man pulling levers to make the magic happen. In the case of AI, the “man behind the curtain” is often an army of contractors reviewing data to fix data quality problems that the machines can’t handle.
Background reading for this article
Apple/Siri
Public Service Card
https://www.newstalk.com/news/public-services-card-data-protection-894624
https://www.siliconrepublic.com/enterprise/data-protection-public-services-card-statement (from 2017)