Data is Risky Business: The Customer Perspective

This article was going to be nice and tightly drawn, picking up on a single theme from my blog post last month on Ethics (Ethics Schmetics) concerning the “ad blocker arms race” that is developing. But a few other things have happened in the last few weeks that also bring “teachable moments” to the table, and they align with the issues raised in the “ad blocker arms race” discussion. So, dear reader, please bear with me. There is a coherent point to all of this. And I apologize to my non-US readers for using the word “server” when I mean “waiter.”

It’s Ad Blockers All the Way Down…

Facebook has fired a salvo against the threats to its advertising business in the last few weeks. They have announced that they will make adverts “indistinguishable from content” to circumvent ad-blocking technologies. This is being done in parallel with providing users with greater control over their ad preferences. The Zuck giveth, the Zuck taketh away, it seems.

On the face of it, this might look reasonable; give users more controls while getting around technologies that block the serving of adverts by making ads indistinguishable from content. Given Facebook’s recent push on ethics, educating users, and helping them understand how their data is used, one would think they’d have done the legwork to make sure this is a fair and ethical thing to do.

But it’s not. Since the dawn of the printing press (and possibly since we first started written communication) there has been the challenge of separating objective content from subjective paid-for material. Content like this has a name – advertorial (adverts that look like content) – and there are strict rules in most jurisdictions requiring that such content be flagged as an advert.

For example, Section 5 of the FTC Act prohibits “unfair or deceptive acts or practices” and has been used in the US to prosecute newspapers and magazines printing advertorial without flagging it as such. TV shows have to disclose if they have received “commercial consideration” for product placements (like the type of mobile phone a character uses in a show, or the type of beverage they drink). In light of these rules, bloggers, particularly celebrity bloggers, are coming under increasing pressure in a number of jurisdictions to disclose if they are being paid to write about the brand they are saying is cool/sexy/able to cure suspicious itching.

So, Facebook’s cure to the ad blocker problem is basically to adopt a practice that has been unlawful for longer than online advertising has existed, recreating a problem that print and other media addressed years ago and that is already covered by law. We have a wonderful technical solution to a question of customer behavior and ethical practices, but the answer may be the wrong one.

Real Data Driven Marketing

At the other end of the spectrum, Procter and Gamble have recently announced they are pulling back from granularly targeted advertising on Facebook and adopting a more “broadcast” model. Their reason: the granular adverts weren’t effective. They didn’t work. And P&G had the data to show that.

Ultimately, P&G are just listening to the Voice of the Customer in this regard – a basic quality management principle. But in doing so, they have done something blindingly obvious but particularly clever: they have used data about marketing conversions to drive decisions about how to market. They found that when they broadened their marketing profiles to wider demographics, they got a better conversion rate from a larger population, which resulted in higher sales.

The WSJ report is clear: P&G isn’t giving up on targeted advertising entirely; they are just being more targeted in how they use it, and making sure they don’t cross a creepy line in doing so. And it isn’t necessarily an ethical decision; more a rational economic decision (but sometimes the same destination is reached by different roads).

However, none of that will matter if Facebook’s advertising platform turns out to have become unlawful through its seamless blending of adverts and content.

The Diner Experience

A related issue arose during an experience my wife and I recently had eating at a popular chain of diners in Ireland. We were in a rush to get somewhere and stopped in for what is usually reasonably fast service and reasonably reliable food. Despite the place being less than half full, that was not to be. Due to a failure of the server to hit the “enter” button on the terminal used to send orders to the kitchen, we were left waiting for nearly an hour for our meal (meaning we missed the thing we were in a rush for). To make matters worse, another staff member came and started clearing our table while we were still eating.

Now, how does this relate to Facebook, advertising, data, and ethics? It’s actually quite simple. Technology adoption versus sensible and obvious controls. In our diner experience, the server who had taken our order walked past our booth a number of times while we were waiting, taking orders from others and bringing them their food. All the while we were still waiting.

Technology had been implemented to “improve” the order-to-serve process, with orders being logged electronically. But the server didn’t hit ‘Enter’. There was no detective control in place (or even a reactive control) that prompted the server in question to ask: “Why have they not got any food on their table?” We had to alert them to the fact we still hadn’t received our food.
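The kind of detective control that was missing is not hard to imagine. The sketch below is purely hypothetical (the order structure, field names, and the ten-minute threshold are all my own assumptions, not anything the diner's system actually has): it simply flags any order that was taken but never submitted to the kitchen within a tolerance window.

```python
from datetime import datetime, timedelta

# Hypothetical detective control for an order-entry system: flag any
# order that was taken but never sent to the kitchen within a tolerance
# window. All names and structures here are illustrative assumptions.

ORDER_TIMEOUT = timedelta(minutes=10)

def find_stalled_orders(orders, now):
    """Return orders taken more than ORDER_TIMEOUT ago but never sent."""
    return [
        o for o in orders
        if o["sent_to_kitchen"] is None
        and now - o["taken_at"] > ORDER_TIMEOUT
    ]

orders = [
    # Taken 55 minutes ago, never sent -- this is our table.
    {"table": 4, "taken_at": datetime(2016, 9, 1, 18, 5), "sent_to_kitchen": None},
    # Taken 5 minutes ago, still within tolerance.
    {"table": 7, "taken_at": datetime(2016, 9, 1, 18, 55), "sent_to_kitchen": None},
    # Taken and sent promptly.
    {"table": 2, "taken_at": datetime(2016, 9, 1, 18, 10),
     "sent_to_kitchen": datetime(2016, 9, 1, 18, 11)},
]

stalled = find_stalled_orders(orders, now=datetime(2016, 9, 1, 19, 0))
# Table 4's order has been sitting unsent for 55 minutes: alert the staff.
```

A periodic check like this, surfaced on the same terminal the server already uses, would have prompted the question "why has table 4 not got any food?" long before the customer had to ask it.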

Needless to say, after being delayed an hour, I didn’t appreciate having my fries and drink swiped off the table while I was still eating. When we complained, we were told that the server had been asked to find out if we were finished and see if we wanted dessert or coffee. This highlights another key issue: communication is a process, and it impacts customer expectation and experience. Specifically, it doesn’t matter to the customer what your internal objectives, processes, and technologies are. The customer only cares about what they experience. If it is at odds with their expectation (like expecting to be allowed to finish their soda), they will complain.


I won’t bore you with the details of what happened next (I complained very loudly) or how the issue was resolved to our satisfaction (it wasn’t). However, it did bring home (yet again) the importance of thinking about process and the expectations of your customers in any data management endeavours. It is the expectation of the customer that is key—whether it is in serving adverts or serving burgers.

As we seek to develop more data-driven business models, and as we replace manual processes with technology-based capabilities, we become more susceptible to issues like data quality problems arising from poor processes, and we risk removing the checkpoints that alert us to errors or to poor end-customer experiences. Procter and Gamble looked at their data and found that targeted adverts had stagnant results, and have adapted their strategy accordingly. Conversely, our server didn’t look at our table and wonder “Where the heck is their food?”

Ultimately, technology is often presented as a panacea for all our ills in business. Whether it is an electronic order-taking system in a diner, a data analytics platform in a bank, or a targeted advertising platform for businesses, none of them will deliver the right results if the data in them isn’t right, or if the experience the end customer gets from interacting with the organization doesn’t match expectations.

Facebook has created a technical solution to an ethical and regulatory problem. However, their “solution” to the ad blocker issue will deliver an experience that legislators and regulators have already found to run counter to the expectations of consumers. And anything Facebook does to identify the content as an advert to meet those requirements will enable an ad blocker tool to be created that blocks exactly that kind of content. This may produce some false-positive matches, but that is the nature of an arms race.
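The circularity of this arms race can be sketched in a few lines. The feed structure and disclosure labels below are my own illustrative assumptions, not Facebook’s actual markup; the point is simply that any machine-readable disclosure added to satisfy regulators gives a blocker something precise to key on.

```python
# Hypothetical sketch: if adverts must carry a machine-readable
# disclosure label to satisfy advertorial rules, a blocker can filter
# on exactly that label. Labels and feed structure are illustrative.

AD_MARKERS = {"sponsored", "paid partnership", "advertisement"}

def strip_adverts(feed):
    """Remove any item whose disclosure label matches a known ad marker."""
    return [
        item for item in feed
        if item.get("disclosure", "").lower() not in AD_MARKERS
    ]

feed = [
    {"text": "Holiday photos from Mary", "disclosure": ""},
    {"text": "This burger cured my suspicious itching", "disclosure": "Sponsored"},
    {"text": "Local news story", "disclosure": ""},
]

cleaned = strip_adverts(feed)
# Only the two unlabelled organic items remain in the feed.
```

Remove the label and the blending may evade the filter, but it then recreates the advertorial disclosure problem; keep the label and the blocker wins. That is the bind the “indistinguishable from content” strategy creates.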

Perhaps P&G are building the basis of a responsible advertising strategy by doing broader-based targeting. It may be that this is the dawn of the ‘Tasty Burger’ advertising model that everyone is looking for, but it can only be delivered with appropriate controls and with an awareness of, and respect for, relevant regulatory governance.

Daragh O Brien

Daragh O Brien is a data management consultant and educator based in Ireland. He’s the founder and managing director of Castlebridge. He also lectures on data protection and data governance at UCD Sutherland School of Law, the Smurfit Graduate School of Business, and at the Law Society of Ireland. He is a Fellow of the Irish Computer Society, a Fellow of Information Privacy with the IAPP, and has previously served on the boards of two international professional bodies. He is also a volunteer contributor to the Leaders’ Data Group and a member of the Strategic Advisory Council to the School of Business in NUI Maynooth. He is the co-author of Ethical Data & Information Management: Concepts, Tools, and Methods, published in 2018 by Kogan Page, as well as contributing to works such as the DAMA DMBOK and other books on various data management topics.
