Thursday, July 9, 2020

AMERICANA / PLAYING MONOPOLY WITH THE BIG BOYS AND GIRLS

Mr. Dipayan Ghosh, Harvard University
GUEST BLOG / By Mathew Ingram, Media Today Columnist, Columbia Journalism Review--The most benign view of Google, Facebook, and Amazon is that any social or political disruption and turmoil these behemoths have caused is a side effect of the beneficial services they provide, and that any outsized market power they hold is the result of good old-fashioned hard work or an accident of economics and technology.

But what if that's not the case? Dipayan Ghosh is a former Facebook staffer and a former policy advisor to the Obama White House who now runs the Digital Platforms and Democracy Project at Harvard, and the author of a new book, Terms of Disservice: How Silicon Valley Is Destructive by Design. Ghosh argues that these companies are monopolists, and that they engage in a wide variety of disturbing conduct—much of it involving the data of their users—not accidentally but deliberately. "I believe that Facebook, Google, and Amazon should be seen as out-and-out monopolists that have harmed the American economy in various ways, and have the potential to do much greater harm should their implicit power go uncurbed," he writes.

Recently at Columbia Journalism Review, we've been discussing some of the themes in Ghosh's book—including privacy, competition, algorithmic accountability, and the idea of a new social contract—in a series of roundtables hosted on Galley, CJR's discussion platform. One roundtable started with a one-on-one conversation about privacy with Ghosh, followed by a day-long open discussion that included Ed Felten, a professor of computer science and public affairs at Princeton and a former Deputy Chief Technology Officer with the White House; Jennifer King, the director of privacy at Stanford Law School's Center for Internet and Society; Olivier Sylvain, a professor of law at Fordham University and director of the McGannon Center for Information Research; and Jules Polonetsky, who is chief executive of the Future of Privacy Forum. The question before the panel was: "Is online privacy broken, and if so what should we do about it?"

Ghosh argued that not only is online privacy broken, but the digital giants have played a key role in breaking it to their advantage, with personal data at the heart of their business model. "These firms increasingly and perpetually violate consumer privacy to serve this consistent business model by collecting personal information in an uninhibited manner," Ghosh says. "And relatively little of that activity is properly scrutinized, resulting in the radical corporate violation of individual privacy." One question that came up in the roundtable was why, after two decades of this digital platform model, there still isn't a federal privacy law. Ghosh says this is a result of what he calls the "privacy paradox": most users don't see the privacy harm when they sign up for a free service—they get immediate gratification from connecting with friends, and only much later do they see the downsides, in the form of data breaches and the like.

Sylvain agreed that much of the danger in online networks is invisible to users, which is why regulation is needed. "Regulators and legislators are better positioned to intervene when consumers cannot easily see the deep or long-term harms and costs," he says. "Notice-and-consent just isn't enough when users cannot measure the costs or understand the full scope of the ways in which companies use/market/leverage their data." King agreed that privacy is a collective good. "I often analogize this to pollution and recycling; we are all harmed by the net effects of the individual negative actions we take, whether it is throwing away another piece of plastic, or sharing or disclosing more personal information online," she says. "Both problems require systemic solutions—trying to change individual behavior is simply not enough."

Felten said that many users may be oblivious to the potential privacy dangers of online networks, but that many of the behaviors we might see as irrational—handing over personal details without thinking, for instance—may in fact be rational. If someone trades their Social Security number to a complete stranger in return for a small benefit, that may seem irrational at first glance, he says, but it may simply reflect a belief that their privacy is not worth much anyway, given how regularly data breaches occur. "In other words, people may 'sell' their data cheaply because they believe that their data is already out there, and available to anybody who wants it," says Felten. "Perhaps the problem is not irrationality, but instead it's cold-eyed rationality in response to an observed failure in privacy protection."

Here's more on the digital platforms and their dominance:

Natural monopoly: In another roundtable, CJR discussed the question of antitrust regulation with Ghosh, as well as Anant Raut, global head of competition policy at Facebook and a former counsel to the antitrust division of the Department of Justice; and Sanjukta Paul, a professor of law at Wayne State University.

Ghosh argued that Facebook, Google, and Amazon are "natural monopolies" in their respective markets, in the same way that railroads, electric utilities, highways, and some telecom networks have been deemed natural monopolies. Paul, however, argued that the law itself helps create monopolies like Facebook: "Without specifically defined legal entitlements, including legally defined corporate privileges, Facebook would never have monopoly power in the first place," she says.

Right to be forgotten: When it comes to privacy, one of the things the European Union offers is what's called the "right to be forgotten," which requires online services to remove personal information in certain cases, such as when an old criminal charge shows up in searches for a person's name. Polonetsky said that the ability to remove data from a service seems like a valuable thing, but he is less convinced about removing articles from a search index. "I am worried that asking search engines to de-index puts far too much discretion in their hands," he says. "I would prefer that this type of request goes to the publisher, and ends up with a court process where both sides can be heard. Search engines should then be obligated to follow the decisions of courts or of democratic legislative process."

Collective action: King said one potential solution to the personal data problem is to allow individuals to pool their information collectively and manage it, including any potential benefits from it. "That could break us out of the current mold of personal data exchange, where individuals are forced to negotiate with companies or platforms to access a product or service," she says. "As long as individuals have to fend for themselves, they will continue to be at a disadvantage in terms of how they can control the access and use of their personal data. My hope is that we can implement forms of data governance that allow individuals to collectively pool and manage their data, to allow both more control and more direct benefit."

Mission Statement:
COLUMBIA JOURNALISM REVIEW’S mission is to be the intellectual leader in the rapidly changing world of journalism. It is the most respected voice on press criticism, and it shapes the ideas that make media leaders and journalists smarter about their work. Through its fast-turn analysis and deep reporting, CJR is an essential venue not just for journalists, but also for the thousands of professionals in communications, technology, academia, and other fields reliant on solid media industry knowledge.

Donate to the Columbia Journalism Review. Click here.
