Data Privacy – A Deep Dive (Part 3): A 9Rooftops 3-Part Series

Evan Fung - Director of Analytics

As Director of Analytics, Evan Fung helps 9Rooftops’ clients compile critical website user data that they can use to sharpen their business strategies. His comprehensive approach to analytics focuses on branding and sales goals.

Evan’s “Data Privacy – A Deep Dive” is a three-part series on the complex, sometimes controversial, analytics ecosystem. The series begins with the long, often forgotten history of analytics before turning to more modern, familiar analytics headlines.

Contact Evan and the team at 9Rooftops for an analytics audit today.

Part 3 – Data Privacy: Where Do We Go From Here?

What’s Coming?​

Data Privacy & Its Future

In part 1 of 9Rooftops’ “Data Privacy – A Deep Dive” series, we let history serve as our guide. In part 2, we explored a number of modern analytics controversies that changed cultural perceptions of digital data and prompted legislative interventions.

In our final installment, we look at everything that comes next. What is the future of data privacy?

Let’s explore.

The Future of Data Privacy

GOOGLE’S PRIVACY SANDBOX​

Privacy Sandbox: First introduced in August 2019, Google’s Privacy Sandbox is, according to Google, “a collaborative effort to develop new web technologies that will improve people’s privacy protection and maintain existing web capabilities, including advertising, keeping the web open and accessible to everyone.”

While the project is not yet well-defined, the Sandbox essentially consists of five browser APIs that, together, are intended to replace third-party cookies as we know them:

Trust Token API – Google’s alternative to CAPTCHA challenges; lets a site that already trusts a user issue cryptographic tokens that other sites can redeem to help prevent fraud without identifying the user

Aggregated Reporting API – will allow reporting of performance-related information (e.g., reach, views, impressions) without cross-site tracking of the user

Conversion Measurement API – will signal when user conversions take place, without revealing any personal information, similar to Apple’s SKAdNetwork

Retargeting API – the TURTLEDOVE proposal (Two Uncorrelated Requests, Then Locally-Executed Decision On Victory); allows ad networks to add users to segment groups in the browser based on certain actions

Note: Google announced on January 25, 2022 that Federated Learning of Cohorts would be replaced by the Topics API. Skip to the “Topics API” section to learn more, or continue reading for historical context around FLoC.

Federated Learning of Cohorts (FLoC) – a machine learning technique that lets browsers collaboratively build a shared model without exchanging raw browsing data, in order to place users into various “cohorts” based on browsing activity

Topics API – replaces the controversial Federated Learning of Cohorts (FLoC) as a proposed solution to the deprecation of third-party cookies. Topics API determines a user’s top interests based on browsing history—without involving external servers—and shares that information with participating websites and advertisers.

PRIVACY SANDBOX – FLoC​

The most widely discussed and most contentious of the Privacy Sandbox’s original proposals was Federated Learning of Cohorts.

What was FLoC?
With FLoC, rather than letting websites track users’ browsing activity with third-party cookies, Chrome itself would track users’ browsing activity — locally — and then place the user in various audiences, or “cohorts,” based on those habits. Advertisers would then receive this cohort information from Chrome and target their ads to cohorts, rather than an individual user. Google claimed that FLoC was nearly as effective as cookie-based tracking — delivering 95% of the conversions per dollar.

What was the problem?

Organizations like the EFF (Electronic Frontier Foundation) expressed concern with the new potential privacy risks that FLoC would bring; EFF even created a website (amifloced.org) to let users check if they had been included in Google’s early tests of FLoC.

No other browser developer announced plans to implement FLoC, and several have explicitly said that they would block it. Developer Brave said, “The worst aspect of FLoC is that it materially harms user privacy, under the guise of being privacy-friendly.” Developer Vivaldi said, “We will not support the FLoC API and plan to disable it, no matter how it is implemented. It does not protect privacy and it certainly is not beneficial to users, to unwittingly give away their privacy for the financial gain of Google.”

PRIVACY SANDBOX – TOPICS

What is the Topics API?

Topics replaces the controversial FLoC proposal. With Topics, a user’s browser determines interest categories like “Fitness” or “Travel & Transportation” based on browsing history in a given week. The topics are stored for three weeks before being deleted. When a user visits a participating website, one topic from each of the past three weeks is shared with the site and its advertisers. This process occurs locally, with no external servers (including Google’s own) being involved.
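
For developers, the caller-side surface of Topics is small. Below is a minimal sketch of how a participating site might read topics, assuming a Chrome build with the Topics API enabled; the field names follow Google’s published examples and may change as the proposal evolves.

// Minimal sketch: reading Topics in a browser that supports the API.
// Assumes Chrome with the Topics API enabled; the shape of the returned
// objects follows Google's published examples and may change.

interface BrowsingTopic {
  topic: number;            // numeric ID in the Topics taxonomy (e.g., "Fitness")
  taxonomyVersion: string;  // which version of the taxonomy the ID refers to
  modelVersion: string;     // version of the on-device classifier
}

async function readTopics(): Promise<void> {
  // Feature-detect: browsers without the API simply won't expose the method.
  const doc = document as Document & {
    browsingTopics?: () => Promise<BrowsingTopic[]>;
  };
  if (typeof doc.browsingTopics !== "function") {
    console.log("Topics API not available in this browser.");
    return;
  }

  // Returns up to one topic per epoch (week) for the last three epochs,
  // selected locally by the browser -- no external server is consulted.
  const topics = await doc.browsingTopics();
  for (const t of topics) {
    console.log(`topic ${t.topic} (taxonomy ${t.taxonomyVersion})`);
  }
}

readTopics();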

How is the Topics API different from FLoC?

Reduced Fingerprinting Risk – One of the main differences between Topics and FLoC is that Topics does not group users into cohorts. The Electronic Frontier Foundation pointed out that fingerprinting techniques could be combined with a FLoC cohort ID to uniquely identify a user’s browser among the other users in the same cohort; Topics, by contrast, exposes only a handful of coarse interest categories, leaving less to fingerprint.

Avoids Sensitive Categories – Unlike FLoC, the Topics taxonomy excludes potentially sensitive categories like gender and race.

Increased User Control – Users can see the topics they are part of and block specific ones, as well as opt out of the Topics API entirely.

PRIVACY SANDBOX – LEGAL HURDLES

In January 2021, the Competition and Markets Authority (CMA) in the U.K. announced that it would investigate Google’s Privacy Sandbox to determine whether it could harm competition and concentrate even more ad spend in the hands of the tech giant. As of June 2021, Google had agreed to collaborate with the CMA as it reworks the Privacy Sandbox. Meanwhile, Google also faces a potential antitrust battle in the U.S., with a group of 15 attorneys general arguing, “Google’s new scheme is, in essence, to wall off the entire portion of the internet that consumers access through Google’s Chrome browser.”

It remains to be seen whether Google’s shifting from the controversial FLoC to Topics will assuage any of these concerns.

ADVANCING TECH WILL PUT FURTHER SCRUTINY ON PRIVACY​

The growing ubiquity of digital solutions in everyday life will only heighten concerns about privacy, data protection and the ethical use of data. Two of the fastest-growing areas of technological advancement also represent some of the most significant privacy concerns: the Internet of Things and facial recognition.

Internet of Things (IoT)

By the end of 2020, there were more IoT connections (e.g., smart home devices, connected vehicles, smart watches, etc.) than non-IoT connections (e.g., smartphones, desktops, laptops). Of the 21.7 billion active connected devices worldwide, 54% of them were IoT devices. By 2025, there will be an estimated 30.9 billion IoT connections — almost four IoT devices per person.

According to a study by Accenture, consumers are suspicious of these connected devices that, while adding convenience, collect more and more data about them. More than 76% expressed discomfort with data collection through microphones or voice assistants.

Facial Recognition
The global facial recognition market size was valued at $3.9 billion in 2020 and is expected to grow to $13.9 billion by 2028.

Though facial recognition technology was put into the hands of the masses with the launch of the iPhone X in 2017, consumers have maintained reservations about the technology and its potential uses.

While most Americans (56%) trust the use of facial recognition technology by law enforcement agencies, a notably smaller 36% of the public trusts technology companies with its usage, with an even smaller 18% trusting advertisers to use the technology responsibly.

RISING FOCUS ON DATA ETHICS​

The Open Data Institute defines the rapidly emerging area of data ethics as “a branch of ethics that evaluates data practices with the potential to adversely impact on people and society — in data collection, sharing and use … A failure to handle data ethically can harmfully impact people and lead to a loss of trust in projects, products or organizations.”

The Snowden and Cambridge Analytica sagas obviously come to mind as examples where organizations failed to maintain or purposefully disregarded data ethics. Despite those watershed moments bringing data ethics to the forefront of public consciousness, brands — and the platforms that they advertise on — are increasingly losing sight of data ethics and, in turn, losing consumer trust as a result.

The Christchurch massacre Facebook livestream, YouTube’s controversies surrounding the availability of videos showing children in distressing situations and the U.N. Global Compact on Migration fake news operation are just a few examples of incidents where inattentive or negligent marketers had their ads appear next to unsavory or questionable content, causing demonstrable damage to brands.

The pressure and onus are clearly on marketers to become drastically more vigilant and aware of where and how their money is spent. Failing to do so will undoubtedly be met with considerable consumer backlash, or worse: a study found that over 90% of global consumers expect brands to use technology ethically and governments to intervene if they do not.

PERSONAL PRIVACY OWNERSHIP​

Historically, data and privacy systems evolved within organizations and were largely managed by those organizations, with meaningful user control only recently emerging. In the future, such systems could become much more people-centric with the advent of Personal Information Management Systems (PIMS).

PIMS (also referred to as personal data stores, personal data spaces or personal data vaults) are systems that allow people to control the gathering, storing and sharing of their personal data with third parties.

PIMS fall into two categories: a local storage model or a cloud-based storage model. In a local storage model, information is kept on users’ devices, such as a smartphone or laptop. In a cloud-based model, information is stored either in one location or among various service providers and logically linked.
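
To make the model concrete, here is a hypothetical sketch of the kind of record a PIMS might hold, whether stored locally on a device or in a cloud vault. The field and function names are illustrative only and are not drawn from any particular product.

// Hypothetical shape of a record in a personal data store (PIMS).
// Field and type names are illustrative only.

interface ConsentGrant {
  grantedTo: string;        // third party receiving access, e.g. "example-retailer.com"
  purpose: string;          // why the data is shared, e.g. "order fulfillment"
  attributes: string[];     // which attributes the grant covers, e.g. ["email", "postalCode"]
  grantedAt: Date;
  expiresAt?: Date;         // consent can lapse or be revoked
}

interface PersonalDataStore {
  owner: string;                          // the individual who controls the store
  attributes: Record<string, string>;     // the data itself, e.g. { email: "...", postalCode: "..." }
  grants: ConsentGrant[];                 // every active sharing decision, in one place
}

// The individual revokes a grant from their dashboard; the third party's
// access disappears without each organization tracking consent separately.
function revokeGrant(store: PersonalDataStore, grantedTo: string): PersonalDataStore {
  return { ...store, grants: store.grants.filter(g => g.grantedTo !== grantedTo) };
}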

PIMS also help organizations by facilitating compliance with existing privacy laws like GDPR and CCPA. They make it easier for organizations to gain the effective consent of users. Also, because individuals are in direct control, PIMS facilitate compliance with businesses’ obligations to grant users access to their individual data and to ensure that the data is up-to-date and accurate.

The advantages of PIMS and a people-centric approach are:

  • Greater efficiency for organizations and individuals — data is collected once into a personal data store and used many times; consent is managed via a dashboard rather than relationship by relationship
  • Privacy addressed – data is controlled by the individual
  • Hacking disincentivized – attackers would have to breach many individual data stores rather than one central database
  • Regulatory risk reduced – policy makers and regulators feel less compelled to intervene in how the data economy works

WHAT SHOULD WE DO?​

BUILD TRUST​

At the heart of it all, what organizations and advertisers need to focus on is building (or rebuilding) the waning trust of consumers.

  • More than a third (39%) of consumers said they had lost trust in a company due to a data breach or misuse of data that they had heard about.
  • 81% of consumers said they need to trust a brand to buy from them.
  • A similar 81% refuse to patronize a company that they do not trust.
  • 89% expect to disengage from one that breaches their trust.

Just build consumer trust … easy, right? Perhaps not easy, per se, though there is a relatively clear path forward with specific steps that organizations can take:

  • Show the value of personalization via data collection
  • Be clear and transparent about data privacy
  • Emphasize data security & protection from breaches
  • Embed data ethics into your organization
  • Prepare for a cookie-less future

SHOW VALUE OF PERSONALIZATION​

Marketers face an ongoing struggle to balance respecting and protecting consumers’ data, while simultaneously using that data for marketing personalization.
This tension stems from conflicting views on the consumer side. On one hand, consumers now expect personalization. They want retailers and brands to accurately anticipate their needs based on data insights related to their buying habits and preferences.

  • 81% of consumers want brands to get to know them better
  • 61% of consumers want companies to prioritize personalization
  • 91% of consumers are more likely to shop with brands that send them relevant offers and recommendations.

On the other hand, people do not want personalized services at the expense of privacy.

  • Only 17% of consumers viewed tailored ads as ethical
  • 41% of consumers felt it was creepy to get a text from a brand or retailer while walking past a physical store
  • 35% found it creepy to see social media ads for items they have browsed on a brand website

This paradox, of sorts, has frustrated marketers. Gartner predicts that 80% of marketers will drop their personalization efforts by 2025 due to poor return on investment and data management challenges. But all is not lost. The key for marketers is to always remember the value exchange at hand — customers trade their data with companies in return for personalized and relevant benefits — and by emphasizing the following:

  • Empowerment: Let consumers control how their data is used, in terms of messaging (the type and frequency of messages) and site personalization.
  • Ease: Make it easy for consumers to complete a task or a purchase by remembering and utilizing purchase or browsing history and personal preferences.
  • Relevance: Only send communications that align with consumers’ specific profile, shopping history and patterns.
  • Transparency: Clearly state what data is collected and how it will be used, and ensure privacy policies are easily accessible.

BE CLEAR / TRANSPARENT ABOUT DATA PRIVACY​

Nearly 60% of Americans claim that they have very little to no understanding about what companies do with data that is collected. This lack of understanding is in large part due to the complex and/or confusing ways in which companies present their policies concerning data collection and privacy.

A study by the Advertising Research Foundation (ARF) found consumers often struggle to understand data privacy conditions brands publish online due to marketing jargon and ambiguous language — that is, if they even bother to read them. Pew Research found that only 36% of consumers ever read a privacy policy before agreeing to it, with only 9% and 13% always or often reading the policies, respectively.

The fact of the matter is that the GDPR and CCPA force companies to inform consumers and be transparent about the use of personal data. Organizations are required to state clearly how they plan to use personal data in concise, easy-to-understand terms, free of legal jargon. Moving forward, consumers will undoubtedly demand policies and practices that are even easier to access, read and understand.

While it is a mandate that companies provide transparent policies, at the risk of fines and/or legal consequences, there is also a benefit for companies that comply:

  • 73% of consumers said they were willing to share more personal information if brands are transparent about how it is used
  • 80% of Americans would be more likely to share personal data if they understood the information was stored safely and securely, and 64% of Americans would be more likely to share if the data collector was trusted and reliable

EMPHASIZE DATA SECURITY / PROTECTION FROM BREACHES​

Data leaks and hacks are an unfortunate reality and one of the costs of doing business in the 21st century. There were 1,001 data breaches in the U.S. alone in 2020 — up from 784 in 2015. At the same time, data security is the single largest driver of trust in brands, according to a study by MRS Delphi Group.

Many lists of data security best practices are available, and most would provide a good baseline for companies to follow, so long as a few fundamentals are met:

  • Identify and classify sensitive data – Companies must know exactly what types of data they have in order to protect them effectively. Data discovery technologies can scan data repositories and classify the data into proper categories, which can be updated as data is created, changed, processed or transmitted.
  • Create a data usage policy – Companies should develop a policy that defines the types of access, conditions for data access based on classification, who has access, what constitutes proper data use and the consequences of policy violations.
  • Monitor and control access to sensitive data – Limit access to information based on the concept of least privilege, meaning users receive only those privileges that are essential to perform their intended function (a minimal sketch of this idea follows this list).
  • Use endpoint security systems – Network endpoints are particularly vulnerable to attack, so it is crucial to have an endpoint security infrastructure that includes, at a minimum: antivirus software, antispyware, pop-up blockers and firewalls.
  • Assess and test security systems regularly – Regular spot checks or full-blown mock attacks can highlight issues & weaknesses and allow for proper adjustments.
  • Update & Re-evaluate – Hacking techniques and technology are constantly evolving, so data security systems must update accordingly. Software and hardware should be regularly updated, and the company’s overall approach to security should be re-evaluated periodically.
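
As a simple illustration of the least-privilege point above, here is a hypothetical access check keyed to data classification and role. The categories, roles and policy table are illustrative only; in practice this logic lives in identity and access management tooling.

// Hypothetical least-privilege access check keyed to data classification.
// Categories, roles, and the policy table are illustrative only.

type Classification = "public" | "internal" | "confidential" | "restricted";
type Role = "analyst" | "marketer" | "dbAdmin";

// Each role is granted only the classifications essential to its function.
const allowed: Record<Role, Classification[]> = {
  marketer: ["public", "internal"],
  analyst: ["public", "internal", "confidential"],
  dbAdmin: ["public", "internal", "confidential", "restricted"],
};

function canAccess(role: Role, classification: Classification): boolean {
  return allowed[role].includes(classification);
}

// Usage: a marketer asking for restricted data is denied by default.
console.log(canAccess("marketer", "restricted")); // false
console.log(canAccess("analyst", "confidential")); // true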

EMBED DATA ETHICS INTO ORGANIZATIONS​

Marketers will increasingly be tasked with treating consumers’ data with due care and respect. For those that do this successfully, combining data ethics and company purpose could result in a new competitive advantage.

The Conscious Advertising Network suggests six areas companies should work on for a more ethical approach to data-driven marketing:

  • Eradicating ad-fraud
  • Increasing diversity
  • Gaining consumers’ informed consent
  • Ensuring advertisers are not inadvertently funding hate speech
  • Ensuring advertisers are not inadvertently funding fake news or any other intentionally misleading content
  • Ensuring children’s wellbeing online

The World Federation of Advertisers (WFA) has published the world’s first guide for brands on data ethics in advertising — “Data Ethics – The Rise of Morality in Technology” — which sets out four key principles marketers need to consider so that their organization always uses data ethically:

  • Respect: all usage should respect the people behind the data and use it to improve their lives
  • Fairness: usage should be inclusive, acknowledge diversity and eliminate bias
  • Accountability: open and transparent data practices backed up by robust governance
  • Transparency: open and honest data practices, particularly as AI and machine-learning approaches start to automate decisions

Consumers want brands to act with purpose and deliver positive social change — 66% say it is important for brands to take public stands on social and political issues. Going beyond mere compliance to truly pushing a purpose and social mission can create a virtuous circle of personal data sharing and improved marketing and business performance.

PREPARE FOR COOKIE-LESS FUTURE – FIRST-PARTY DATA​

With a cookie-less future fast approaching, first-party data will be the first line of defense and a key part of the digital transformation required of marketers.
First-party data can stem from:

  • Websites
  • Transactions
  • Owned panels
  • Subscriptions
  • CRM systems

Marketers should ensure that they have a first-party data strategy in place. At a high level, there are three steps they should take:

  1. Determine what data is currently being collected and what data still needs to be collected
    Marketers should make an inventory of the types of data they currently have on consumers to understand if there are any gaps that need to be filled. For example, if location data is important to a company’s ad tactics, a location field can be placed within email signups or other forms.
  2. Build a first-party audience list with privacy top-of-mind
    As previously mentioned, privacy policies should be readily available and easily understandable. It is also recommended that only the minimum amount of data necessary to fulfill a need be collected — e.g., if location data is not necessary upon initial signup, consider holding off on requesting it. This facilitates quicker and simpler conversions, and the data can still be collected through future communications (a small sketch of this approach follows these steps).
  3. Activate the data
    Using the collected data, marketers can create audiences tailored to specific tactics & channels, create lookalike audiences to increase reach, build retargeting criteria and systems and create personalized experiences for consumers.
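
As a small illustration of the data-minimization point in step 2, here is a hypothetical signup payload that collects only what the initial conversion requires and defers optional fields, like location, to later interactions. The field names are illustrative, not a prescribed schema.

// Hypothetical first-party signup payload built around data minimization.
// Only the email (and explicit consent) is required to convert; optional
// fields such as location are deferred to later interactions.

interface SignupPayload {
  email: string;             // the minimum needed to start the relationship
  marketingConsent: boolean; // explicit, recorded consent
  postalCode?: string;       // optional: ask later, only if ad tactics need it
}

function buildSignup(email: string, consent: boolean, postalCode?: string): SignupPayload {
  const payload: SignupPayload = { email, marketingConsent: consent };
  if (postalCode) {
    payload.postalCode = postalCode; // collected only when the user chooses to provide it
  }
  return payload;
}

// Initial signup: quick and minimal.
const initial = buildSignup("user@example.com", true);
// A later preference-center update can enrich the profile.
const enriched = buildSignup("user@example.com", true, "60601");
console.log(initial, enriched);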

PREPARE FOR COOKIE-LESS FUTURE – CLEAN ROOMS​

Companies able to harness the power of first-party data will have a distinct competitive advantage. The issue, however, is that a first-party data strategy is not scalable for many organizations — generally, only large and already successful companies will have the means to implement one. Also, first-party data on its own is only valuable up to a point. Any single company’s first-party data list will represent just a small sample of the much larger consumer pool.

Enter data clean rooms.

Clean rooms allow advertisers, publishers and other companies to pull together their anonymized data into one platform in order to combine and analyze the intersected data, all while protecting any private user information. This enables organizations to tackle cross-media measurement and attribution by matching customer data with aggregated campaign data in a privacy-safe way.
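
Conceptually, the privacy protection comes from joining datasets on pseudonymous keys and releasing only aggregates above a minimum audience size. The toy sketch below illustrates that idea; it is not any vendor’s actual clean-room API, and the threshold value is an assumption.

// Toy sketch of the clean-room idea: join two datasets on a hashed,
// pseudonymous key and release only aggregate counts above a minimum
// audience threshold. Not any vendor's actual API.

interface AdvertiserRow { hashedEmail: string; purchased: boolean; }
interface PublisherRow { hashedEmail: string; campaign: string; }

const MIN_AUDIENCE = 50; // assumed threshold; aggregates below it are suppressed

function campaignConversions(
  advertiser: AdvertiserRow[],
  publisher: PublisherRow[],
): Map<string, number | null> {
  const purchasers = new Set(
    advertiser.filter(r => r.purchased).map(r => r.hashedEmail),
  );

  // Count exposed-and-converted users per campaign.
  const counts = new Map<string, number>();
  for (const row of publisher) {
    if (purchasers.has(row.hashedEmail)) {
      counts.set(row.campaign, (counts.get(row.campaign) ?? 0) + 1);
    }
  }

  // Suppress small cells so no individual can be singled out.
  const released = new Map<string, number | null>();
  for (const [campaign, n] of counts) {
    released.set(campaign, n >= MIN_AUDIENCE ? n : null);
  }
  return released;
}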

The largest and most obvious providers of clean rooms are walled gardens like Google, Facebook and Amazon. Google’s Ads Data Hub (ADH) and Facebook’s Advanced Analytics are currently in the market, while Amazon is in the process of building out its clean room.

While data clean rooms sound promising, they are not without their faults. The largest concern is the inability to take the intersected data beyond the walls of the clean rooms — true to their walled garden labels. This means that parties cannot independently validate the data and must trust that Google, Facebook or Amazon has processed it accurately.

When it comes to walled gardens, there is no single source of data truth like there is for TV or radio. Unless an independent verification party is established in the future, advertisers will simply have to play the game by their rules.

PREPARE FOR COOKIE-LESS FUTURE – DMPs to CDPs​

With third-party cookies going by the wayside, data management platforms (DMPs), which rely heavily on data collected via these cookies, will likely lose significant value in the eyes of marketers. Marketers should instead turn their attention toward customer data platforms (CDPs).

DMPs vs. CDPs

DMPs – The focus of DMPs is on the gathering, categorizing, and classifying of anonymous third-party data collected via cookies, device IDs, and IP addresses. DMPs enable marketers to centrally store this data and use it in conjunction with programmatic ad buying to target and optimize ads.

CDPs – CDPs are focused on first-party data, though they can work with anonymous data as well. CDPs utilize a persistent and unified database to consolidate and integrate customer data from multiple channels into a single profile for each individual customer. Structured data from a CDP can be sent to other marketing technology systems to deliver personalized messaging campaigns.

CDPs are focused on the full customer lifecycle, providing a 360-degree customer view, and are thus more flexible than DMPs in terms of the range of applications they can serve. CDPs also typically have more advanced unification algorithms built in, so unified customer profiles can persist over time.
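
To illustrate the single-profile idea, here is a hypothetical sketch of a CDP-style merge that folds events from several channels into one persistent customer profile when a shared identifier (email, in this toy example) matches. Real CDPs use far richer matching and identity-resolution rules; all names below are illustrative.

// Hypothetical sketch of CDP-style identity resolution: events from
// different channels are folded into one persistent profile when a
// shared identifier (email, in this toy example) matches.

interface ChannelEvent {
  channel: "web" | "email" | "pos" | "crm";
  email: string;
  event: string;
  timestamp: Date;
}

interface CustomerProfile {
  customerId: string;       // persistent ID that outlives any one channel
  email: string;
  events: ChannelEvent[];   // full history across channels
}

function unify(events: ChannelEvent[]): CustomerProfile[] {
  const byEmail = new Map<string, CustomerProfile>();
  for (const e of events) {
    const existing = byEmail.get(e.email);
    if (existing) {
      existing.events.push(e);       // merge into the persistent profile
    } else {
      byEmail.set(e.email, {
        customerId: `cust_${byEmail.size + 1}`,
        email: e.email,
        events: [e],
      });
    }
  }
  return [...byEmail.values()];
}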

In Conclusion

SINCE YOU’VE GONE, I’VE BEEN LOST WITHOUT A TRACE​

Data privacy has a longstanding and complicated history, and its present and future are ever-changing.

Consumers are growing increasingly more protective of their personal data, for good reason — organizations have continually failed to handle their data in a secure and ethical manner.

Organizations should feel compelled to adapt and evolve to better meet the needs of their consumers. Big players like Google, Apple, and Facebook have already started paving the way, though they can hardly agree on the right path. Most organizations can — and likely will have no other choice but to — follow the lead of these big players, and that is fine so long as the organizations are willing to play by the restrictive rules set by the walled gardens.

Organizations that wish to take a more proactive approach can use the options laid out here to craft a gameplan, in hopes of getting ahead of the movement — or at least staying on top of it.

Whatever path is taken, though, the fact of the matter is that the data privacy landscape has changed, and it will continue to change. Some will take solace in the idea that everyone will be affected in the same way. Some will drag their feet and look back longingly to the days when tracking consumers was easier. Meanwhile, those who are assertive and flexible will see this as an opportunity to gain the upper hand and win consumers over with progressive thinking.

In case you missed it, here’s Part 1 and Part 2.

Sources: https://9rooftops.com/news/sources-data-privacy-sources-part-1-2-3-a-9rooftops-3-part-series/

This is not an advertisement, and solely reflects the views and opinions of the author. This website and its commentaries are not designed to provide legal or other advice and you should not take, or refrain from taking, action based on its content.
