As mentioned in my previous two posts (found here and here), I gave a presentation at the 40th Annual Association of National Advertisers/Brand Activation Association Marketing Law Conference titled “The Pursuit of ‘Truth’ in Advertising.” It explored how consumers view the truth in this era of fake news and alternative facts, and how this changing understanding of the truth has affected the advertising ecosystem and the practice of advertising law. Today, I will share the third installment in my series of highlights from my presentation.

Who am I?

Who are you?

In a data-driven world, who I am and who you are really depends on who wants to know. Each advertiser is interested in a different story about me. I don't get to control the story or weigh in on what's the truth. I am the data they collect, and the data is me.

Here Are Some Things to Consider

Brands need to offer consumers real transparency and choice about the use of their data. These principles have always been essential to consumer trust, and they matter even more now in light of the GDPR.

The GDPR took effect in May of this year, ushering in sweeping legal reform of how companies may, and may not, use data. It applies not only to companies in the EU, but also to companies that offer goods or services to people in the EU or monitor the behavior of people in the EU. That reach covers more of us than you'd think.

In particular, the GDPR's definition of personal data is incredibly broad, and the rights of EU consumers to control the story (their data, their truth) are stronger than ever. Consumers can learn whether their personal data has been collected. They can ask the data collectors to "forget" that data by erasing it, and force the entire chain of processors to do the same.

Here in the United States, California has been at the forefront of privacy legislation with the passage of the California Consumer Privacy Act (CCPA). Effective in 2020, the CCPA will provide consumers with rights analogous to those under the GDPR, including: (1) the right to request access to, and then delete, the personal information a business has collected about them, and (2) the right to be notified of the sale of that information.

One can never predict what will happen in Washington, especially now. But there is growing consensus on both sides of the aisle that it's time for federal privacy legislation that provides consumers with broader protection for their data and avoids a patchwork of state laws.

Of course, we still have self-regulation. In September 2018, the BBB’s Accountability Program issued its ninetieth decision in a case involving Chocolate, a mobile video ad exchange and SDK platform. Chocolate was found to have been collecting user data (including device identifiers and precise location data) for interest-based advertising without complying with the DAA Principles of transparency, enhanced notice, and control requirements.

Meanwhile, the FTC, under new leadership, has renewed its focus on data privacy cases. The agency settled charges against PayPal, operator of the Venmo mobile payment app, alleging that users were misled about their ability to control the privacy of their transactions. The settlement requires Venmo to be clearer about the steps consumers can take to make their transactions private.

And don’t forget about class actions and data.

This year, we saw a class action filed against Fiat Chrysler alleging that its 3G "Infotainment" system was vulnerable to hacking and that these vulnerabilities were neither adequately mitigated nor disclosed to consumers. The case is ongoing in Illinois, Michigan, and Missouri.

In today's complicated climate, companies' ability to protect consumer data is limited at best, and all the policies in the world can't stop a major data breach. Responding to a hack is largely a game of mitigation, investigation, and notification, so companies must have an effective prevention and response plan in place.

The Way I See It

No one can truly opt out of a data-driven consumer economy. We transact and lead our lives as data points, sharing different identities with different marketers for different reasons. Consumers are instead forced to rely on marketers, platforms and data brokers to keep their data safe and to make sure it is used responsibly.

To be perceived as proper stewards of consumers' data, companies must regularly recommit to data security and transparency. They must constantly refine their practices to keep pace with changing laws and the threats posed by ever more sophisticated hackers.

Too often, companies falter on this commitment. And regulators and consumers will be watching when they do.