The unintended consequences of standardless privacy regulation
Welcome to 2020, where we are experiencing record innovation in adtech. Smart technologies, automation, machine learning, and artificial intelligence are all contributing to a safer, more robust ecosystem and consumer experience. Operating at higher levels of efficiency with enhanced capabilities helps mitigate issues that have long plagued us, like fraud, so on the one hand, the future looks bright.
On the other hand, enter the “Great Data Debate.” In today’s always-on, data-driven world, understanding where our data is coming from and what to do with it is crucial to helping brands measure the performance of their ad spend and ultimately grow their user base. The restriction of visibility into attribution data comes at a cost to the entire ecosystem but especially to marketers and consumers.
We are not here to speculate on the primary motivator for this shift toward restricting visibility, but, as with all things in life, we need checks and balances to ensure no one company or person becomes too powerful. The decision to make a portion of advertisers’ data unavailable reduces a marketer’s ability to spend ad budgets intelligently and, more importantly, puts the lion’s share of control over consumer data in the hands of big tech companies. If nothing else, the past few years have given us countless examples of what happens when you put blind faith in a company to be accountable for your privacy and personal information. Consider, too, an internet where a small handful of tech giants can dictate compliance with their own interpretation of data privacy – what happens to consumer choice? The answer is clear: if we say yes to an internet of this making, then privacy and consumer choice become whatever those tech giants want them to be.
Furthermore, there is a false narrative circulating that increased privacy requires restricting opportunity. It does not: companies can put data-driven policies and procedures in place that adhere to new compliance regulations for enhanced data-subject rights, and change how they manage and interact with customers on a consent basis.
Since GDPR took effect in May 2018, executives, scholars, and economists have widely documented that while these new regulations are good news for privacy, they have the unintended consequence of further consolidating power in the hands of tech giants, putting everyone else at a disadvantage. This is, unfortunately, exactly the opposite of the intended result.
At the end of the day, it’s easy to fall prey to the notion that all efforts in the name of “privacy” are good. We tend to accept inappropriately reactive edicts as a mechanism for compliance, but we should not. Ultimately, consumers need to understand the difference between buying the product and being the product. If, as consumers, we expect privacy and anonymity, we must acknowledge there is a cost associated with that. If that cost is unacceptable, we must elect not to patronize the service in question. Period. If consumers are willing to sacrifice their anonymity and resettable data for a free service, then great, a consumer is born! If I pay to subscribe to a service, I expect the price I pay to be greater than the unsold value of my consumption data on an open market. It is a choice; make it an informed one, but be open to the consequences either way.
There’s a lot of information coming at us at all times, and growth comes from our ability to understand, question, and use the data our campaigns generate to drive future initiatives. Ask questions, be critical, don’t spend the easy dollars; there is a way for the entire ecosystem to work together effectively. As champions and advocates for the greater good, we need to ensure a fair and level playing field, maintaining transparency and facilitating the power of choice.