Giving People Property Rights In Data Will Not Solve Privacy, But…

Online privacy can’t be solved by giving people new property rights in personal data. That idea is based on a raft of conceptual errors. But consumers are already exercising property rights, using them to negotiate the trade-offs involved in using online commercial products.

People mean a lot of different things when they say “privacy.” Let’s stipulate that the subject here is control of personal information. There are equally or more salient interests and concerns sometimes lumped in with privacy. These include the fairness and accuracy of big institutions’ algorithmic decision-making, concerns with commodification or commercialization of online life, and personal and financial security.

Consumers’ use of online services will always have privacy costs and risks. That tension is a competitive dimension of consumer Internet services that should never be “solved.” Why should it be? Some consumers are entirely rational to recognize the commercial and social benefits they get from sharing information. Many others don’t want their information out there. The costs and risks are too great in their personal calculi. Services will change over time, of course, and consumers’ interests will, too. Long live the privacy tension.

Online privacy is not an all-or-nothing proposition. People adjust their use of social media and online services based on perceived risks. They select among options, use services pseudonymously, and curtail and shade what they share. So, to the extent online media and services appear unsafe or irresponsible, they lose business and thus revenue. There is no market failure, in the sense used in economics.

Of course, there are failures of the common sort all around. People say they care about privacy, but don’t do much to protect it. Network effects and other economies of scale make for fewer options in online services and social media, so there are fewer privacy options, much less bespoke privacy policies. And companies sometimes fail to understand or abide by their privacy policies.

Those privacy policies are contracts. They divide up property rights in personal information very subtly—so subtly, indeed, that it might be worth reviewing what property is: a bundle of rights to possess, use, subdivide, trade or sell, abandon, destroy, profit, and exclude others from the things in the world.

The typical privacy policy vests the right to possess data with the service provider—a bailment, in legal terminology. The service provider gets certain rights to use the data, the right to generate and use non-personal information from the data, and so on. But the consumer maintains most rights to exclude others from data about them, which is all-important privacy protection. That’s subject to certain exceptions, such as responding to emergencies, protecting the network or service, and complying with valid legal processes.

When companies violate their privacy promises, they’re at risk from public enforcement actions—from Attorneys General and the Federal Trade Commission in the United States, for example—and lawsuits, including class actions. Payouts to consumers aren’t typically great because individualized damages aren’t great. But there are economies of scale here, too. Paying a little bit to a lot of people is expensive.

A solution? Hardly. It’s more like an ongoing conversation, administered collectively and episodically through consumption trends, news reporting, public awareness, consumer advocacy, lawsuits, legislative pressure, and more. It’s not a satisfactory conversation, but it probably beats politics and elections for discovering what consumers really want in the multi-dimensional tug-of-war among privacy, convenience, low prices, social interaction, security, and more.

There is appeal in declaring privacy a human right and determining to give people more of it, but privacy itself fits poorly into a fundamental-rights framework. People protect privacy in the shelter of other rights—common law and constitutional rights in the United States. They routinely dispense with privacy in favor of other interests. Privacy is better thought of as an economic good. Some people want a lot of it. Some people want less. There are endless varieties and flavors.

In contrast to what’s already happening, most of the discussion about property rights in personal data assumes that such rights must come from legislative action—a property-rights system designed by legal and sociological experts. But experts, advocates, and energetic lawmakers lack the capacity to discern how things are supposed to come out, especially given ongoing changes in both technology and consumers’ information wants and needs.

An interesting objection to creating new property rights in personal data is that people might continue to trade personal data, as they do now, for other goods such as low- or no-cost services. That complaint—that consumers might get what they want—reveals that most proposals to bestow new property rights from above are really information regulations in disguise. Were any such proposal implemented, it would contend strongly in the metaphysical contest to be the most intrusive yet impotent regulatory regime ever devised. Just look at the property-rights system planned in intellectual property legislation: highly arguable net benefits come with a congeries of dangers to many values the Internet holds dear.

The better property rights system is the one we’ve got. Through it, real consumers are roughly and unsatisfactorily pursuing privacy as they will. They often—but not always—cede privacy in favor of other things they want more, learning the ideal mix of privacy and other goods through trial and error. In the end, the “privacy problem” will no more be solved than the “price problem,” the “quality problem,” or the “features problem.” Consumers will always want more and better stuff at a lower cost, whether costs are denominated in dollars, effort, time, or privacy.

Jim Harper is a visiting fellow at the American Enterprise Institute and a senior research fellow at the University of Arizona James E. Rogers College of Law.

Techdirt.

Tracking report was wrong, says Facebook …but there is a bug that needs fixing

Facebook has responded to privacy claims made by Belgian researchers but admitted that a “bug” may have caused inadvertent tracking of some users.
Naked Security – Sophos