We live in a world increasingly dominated by our personal data.
Some of that data we choose to reveal, for example, through social media, email, and the billions—yes, billions—of messages, photos, and tweets we post every day.
Other data is collected because government programs require it: travel, banking, employment, and other services provided by the private sector are all subject to extensive government data collection and reporting requirements.
Many of our activities generate data that we are not even aware exists, much less that it is being recorded. In 2013, the public carried 6.8 billion cellphones. These devices not only generate digital communications, photos, and video recordings, but also constantly report the user’s location to telephone service providers. Smartphone apps, too, often access location data and share it over the Internet.
Added to the mix are video and audio surveillance, cookies, and other technologies that observe online behavior, and RFID chips embedded in passports, clothing, and other goods—a trove of data collected without our awareness.
Trillions of Transactions a Year
Much of this data is aggregated by third parties we’ve never heard of and with whom we have limited or no direct dealings. According to The New York Times, one such company, Acxiom, alone engages in 50 trillion data transactions a year, almost none of them collected directly from individuals.
Known as information intermediaries, these companies calculate or infer data from demographic details such as income level, education, and gender; census records; and past behavior, such as what clothes and foods someone has purchased. The resulting profiles can be highly revealing, and they are used to determine credit scores, make marketing predictions, and otherwise quantify us.
As the volume, importance and, indeed, value of personal data expand, so too does the urgency of protecting that information from harmful or inappropriate uses. But as we know, that’s not easy.
Most data protection laws in the United States and elsewhere place some or all of the responsibility for protecting privacy on individual subjects through what’s called “notice and consent.”
In 1998, for example, the U.S. Federal Trade Commission, after reviewing the “fair information practice codes” of the United States, Canada, and Europe, reported to Congress that the “most fundamental” principles to protect privacy are “notice” and “consumer choice or consent.”
U.S. statutes and regulations tend to parallel the FTC’s rules and recommendations on notice and choice. All U.S. financial institutions are required to send every customer a privacy notice every year, and doctors, hospitals, and pharmacies provide similar notices, usually on every visit.
The focus on notice and consent is not limited to the United States. The draft of the European Union’s General Data Protection Regulation cites “consent” more than 100 times and emphasizes its importance.
All Our Fault
The truth is that notice and consent laws do little to protect privacy but typically just shift the responsibility for protecting privacy from the data user to the data subject—that would be us. After all, if anything goes wrong, it is our fault because we consented—often without realizing it.
Individual consent is rarely exercised as a meaningful choice. We are all overwhelmed by long, complex privacy policies that most of us never read.
It is no wonder. One 2008 study calculated that reading the privacy policies of just the most popular websites would take an individual 244 hours—or more than 30 full working days—each year.
A reliance on notice and choice both under-protects privacy and can interfere with, and raise the cost of, beneficial uses of data, such as medical research and innovative products and services. This is especially true when personal information is generated by sensors, inferred by third parties, or used by parties with no direct relationship to the individual.
‘Fantasy World’
In a May 2014 report, the U.S. President’s Council of Advisors on Science and Technology described the “framework of notice and consent” as “unworkable as a useful foundation for policy.” The report stressed, “Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent.”
There are better alternatives. One is enacting laws that place substantive limits on risky or harmful data uses. Another is increasing oversight by government and self-regulatory agencies, which could forbid certain uses of personal data by third parties.
Many privacy advocates note that the United States is the only industrialized country without a dedicated privacy office in the federal government. Creating one might help ensure more attention is paid to privacy.
Other efforts are underway to restrict notice and choice to times when they are necessary and meaningful, and then to make them simpler and clearer.
Another promising approach would be to ensure that businesses take responsibility for their uses of personal data by making them legally liable for the reasonably foreseeable harm they cause, rather than allowing them to use notice and consent to continue shifting the responsibility to us.
At minimum, big users of personal data should be required to assess and document the risk those uses pose, and the steps they have taken to mitigate those risks. A more formal approach to managing privacy risks could better protect privacy, lead to greater consistency and predictability over time, and allow data users to make productive uses of data if risks can be mitigated.
The alternative is to continue to rely on notices no one reads, choices no one understands, and the other ineffective tools of the fantasy world that privacy law has become.
Fred H. Cate is a distinguished professor and C. Ben Dutton Professor of Law at Indiana University–Bloomington. This article was previously published on TheConversation.com.