Last week, the beauty retailer Sephora made headlines. Not because of a new product launch or some extraordinary TikTok campaign, but because it became the first company the California Attorney General publicly fined for allegedly violating the requirements of the California Consumer Privacy Act (CCPA).
According to the complaint, Sephora failed to communicate transparently to its customers that it sold personal data (tracked through pixels and cookies) to third parties. The brand also did a poor job with its “do not sell my personal information” processes: failing to let consumers reverse the company’s data sharing decisions if they so wished (including via the Global Privacy Control (GPC) signal), neglecting to assess third parties’ data sharing practices, and failing to update contracts consistently, as the CCPA requires. These violations cost the company $1.2 million in settlement fees and a range of compliance remediation obligations.
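For context on the GPC signal at issue: under the Global Privacy Control specification, a browser that has the preference enabled attaches a `Sec-GPC: 1` header to outgoing requests, which businesses subject to the CCPA are expected to honor as a do-not-sell/share request. The sketch below is a minimal, hypothetical illustration of how a server might detect that signal from a dictionary of request headers; it is not drawn from Sephora’s systems or any specific framework.

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC specification, a request header of "Sec-GPC: 1" expresses
    the user's preference not to have their personal data sold or shared.
    `headers` is assumed to be a simple dict of request header names to values.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# Hypothetical usage: gate any data-sharing pipeline on the signal.
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}
if gpc_opt_out(request_headers):
    print("Treat this visitor as opted out of sale/sharing.")
```

In practice, honoring the signal means propagating that opt-out decision to every downstream vendor and tag, which is exactly where the settlement found Sephora’s processes lacking.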
The core of this case is Sephora’s “selling” (read: “sharing”) of consumer personal data with third-party advertising networks and analytics providers. Let’s clarify one crucial detail here: what does selling data actually mean under the CCPA? The regulator has so far interpreted that language to include a range of data sharing practices for “any valuable consideration” and, according to the decision in this case, it also covers a company sharing data with a vendor in exchange for “free or discounted analytics and advertising benefits.” The lack of any definition of what its third parties could do with the data appears to be a critical pitfall in Sephora’s data sharing arrangements. In fact, among the corrective measures the AG imposed on the retailer, Sephora must: 1) identify the third parties it shares data with or sells data to; 2) ensure contracts and processes are in place to limit what third parties can do with data, as defined by its policies; and 3) take specific actions to enforce “do not sell my personal information” across its data sharing/selling ecosystem.
These are basic actions that will help consumers understand with whom a business shares their data and act on data sharing decisions they don’t agree with. They are necessary and helpful measures, yet they only scratch the surface when it comes to trusted data sharing.
The Path Forward For Trusted Data Sharing Includes Closing “The Trust Gap”
In our new research, “Trusted Data Sharing: A Modern Framework For Empowering Individuals And Organizations,” we investigate the opportunities and risks of data sharing and propose a framework that helps organizations make their data sharing processes safer, more controlled, and more trusted.
The research focuses primarily on tackling the most critical point of failure in today’s data sharing ecosystems: “the trust gap.” Put simply, this is the delta between the controls, policies, and governance an organization sets up when collecting personal data and that organization’s ability to ensure the same level of control when sharing that data. We look explicitly at practices that involve sharing personal data, where the need for adequate safeguards is most urgent.
This report provides specific guidance to every company that engages in data sharing practices, as it:
- Defines a model architecture to describe common data sharing arrangements and highlights specific risks and common points of failure.
- Contains a detailed analysis of the benefits that closing the trust gap would bring to consumers and businesses.
- Lists potential controls, including customer consent management, privacy-preserving technology, blockchain, and data intelligence platforms, and maps them against discrete actions in complex data sharing ecosystems.
- Identifies the most appropriate controls against the model architecture of data sharing in complex ecosystems.
Get in touch with us and schedule a guidance session to learn more about the research and start closing the trust gap of your data sharing practices.