Facebook’s Public Reckoning

The social-media giant faces decisions on privacy and publishing.

https://www.wsj.com/articles/facebooks-public-reckoning-1521846210

Mark Zuckerberg famously started Facebook out of a Harvard dorm room in 2004, but his social network now boasts more than two billion users world-wide. Facebook hasn’t matured as fast as it has grown, and its recent troubles show that it will need to exercise more control over content and privacy. Or politicians may do it instead.

Facebook is facing a public reckoning after two newspapers reported that data on 50 million users was improperly shared with a firm working for the Trump campaign. The outrage is overwrought, but perhaps inevitable given that Facebook has promoted itself as a guardian of consumer privacy.

In 2014 the company supported a Senate bill to limit the National Security Agency’s access to electronic data, arguing that it was more trustworthy than the government. Last year Facebook joined other tech companies to oppose Federal Communications Commission Chairman Ajit Pai’s rescission of Obama-era privacy rules for broadband providers that didn’t apply to them.

The Internet Association argued that companies like Facebook and Google “have more limited visibility into online practices and consumer information” than broadband providers that are “in a position to develop highly detailed and comprehensive profiles of their customers—and to do so in a manner that may be completely invisible.” Facebook’s alleged data breach has exposed this conceit.

Facebook makes most of its $16 billion in annual profit from harvesting data on users. In 2007 the company decided to start selling personalized ads to fuel its growth. To boost user engagement and generate more ad revenue, Facebook encouraged third-party apps such as inane personality quizzes like “Which Disney Princess Are You?”

Users give apps access to their data, which Facebook bars from being sold or shared, though it has no controls to stop it. Developers can also exploit user data for their own purposes. This is how the Obama campaign targeted and turned out voters in 2012. In 2015 Facebook limited developers’ access to friends’ data, which made this harder to do. Yet users are suddenly outraged that their data may have fallen into the Trump campaign’s hands.

The fuss is much ado about little. A University of Cambridge researcher conducted an online personality quiz on some 270,000 users in 2013. He allegedly shared their data with Cambridge Analytica, which the Ted Cruz and Trump campaigns employed. Facebook says it directed the firm to delete the data after learning about the privacy violation. Cambridge says it did, though a former employee claims otherwise.

All of this despair over the Trump campaign and Facebook has had an incidental benefit: People are finally realizing that the sprawling social network isn’t merely a place to share cat photos. Facebook is the world’s biggest media conglomerate, repository of consumer data and communications network.

Facebook controls 20% of the digital ad market with a $463 billion market value that is more than triple Disney’s. It boasts 70 million more users in the U.S. than Verizon Wireless, the largest mobile network. About 140 million Americans get news from Facebook, while Fox News’s average prime-time viewership is 2.4 million.

A few thousand Russian ads on Facebook didn’t turn the 2016 election, but the proliferation of fake news is tainting public discourse. Facebook deserves some blame since it has refused to separate the wheat from the chaff and pay for quality news. Facebook lets algorithms curate users’ news feeds and “trending” section. But its algorithms favor free and viral content that generates more ad revenue. Algorithms also allow Facebook to avoid responsibility—look ma, no hands!—for content on its site.

***

Mr. Zuckerberg bobbed and weaved through interviews this week, promising to audit third-party apps and improve screening of objectionable content. Facebook is losing teenage users, and Mr. Zuckerberg knows the company’s growth depends on consumer trust.

The problem is that Facebook doesn’t want to accept the responsibility of being essentially America’s largest publisher. Mr. Zuckerberg told the Recode website this week that he wants “to get our policies set in the way that reflects the values of the community so I’m not the one making those decisions” and that “I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world.”

Less high-mindedly, he doesn’t want to pay for content Facebook now grabs for free from news organizations. But Facebook has already had to assure advertisers they won’t be placed next to pornography or bigoted videos. And if Facebook wants to avoid blame for spreading false news or propaganda, it may have to start promoting and paying for content that readers can trust. The risk of political bias is real but can be avoided with editorial judgment of the sort publishers exercise every day.

Section 230 of the Communications Decency Act protects “interactive computer services” like Facebook from liability for user-generated content. But politicians are getting tired of internet giants hiding behind a law that was intended to protect startups.

The Senate passed legislation this week that strips Section 230 immunity for sex-trafficking content, and several Senators have threatened more changes if tech companies don’t clean up their act. The Federal Trade Commission is investigating Facebook’s privacy protections, and Mr. Zuckerberg has said he’s open to more regulation. But it would be far better for Facebook to take more responsibility for its content than for politicians and bureaucrats to do so.
