Commentary
A few years ago, I published
an essay explaining how internet moguls use misdirection to trick us into looking the wrong way at critical moments, just as magicians do.
No one does this more brazenly than Facebook CEO Mark Zuckerberg, who was recently pummeled in a
scathing report issued by the UK House of Commons, as well as in a
new book by Roger McNamee, one of Facebook’s early investors. I called for Zuckerberg’s
resignation last summer after he was less than forthright in his testimony before the U.S. Congress. According to
Slate, his testimony included misdirections galore: “diversions, technical quibbles, and filibusters designed to run out each lawmaker’s four-minute clock.”
Now, Zuckerberg has outdone himself. In a lengthy op-ed published a few days ago in the
Washington Post (and published simultaneously as a
Facebook blog post), he has called for nothing less than the regulation of the entire internet, which, of course, includes Facebook itself.
He has even offered specific strategies, generously advising authorities on how they should regulate four critically important areas: harmful content, election integrity, privacy, and data portability.
Are your B.S. detectors tingling yet?
Why is the bad boy of the internet, whose company lost
$50 billion in market value in March 2018 and has become one of the
least reputable companies in America in recent years, suddenly telling us how to regulate his own company?
Has Zuckerberg, who has been
credibly accused of hacking his way through life since 2003, suddenly turned over a new leaf?
Not at all. If you look with a critical and informed eye at his guidelines for regulation, you will see misdirection worthy of high praise from Penn and Teller.
Harmful Content. Everyone is concerned about false and disturbing content on the net, and no one has done more to spread such content than Mark Zuckerberg—most recently by showing a
17-minute live feed of a murderer systematically killing innocent people in a mosque.
His solution? Have “third-party bodies ... set standards for the distribution of harmful content.”
The trick here is to shift responsibility away from the company. Standards won’t eliminate harmful content, which, Zuckerberg reminds us, is “impossible to remove.” They will, however, give the company some protection against lawsuits and fines. As long as the company is complying with standards, it’s reasonably safe, no matter what horrible content slips through its porous curation procedures. The company might even be able to cut its curation budget, relying on standards to shield it from liability.
Election Integrity. Here, Zuckerberg focuses on just one relatively trivial issue–political ads–and says legislation “should be updated to reflect the reality of the threats” we now face. He doesn’t mention Cambridge Analytica or Russia’s Internet Research Agency, both of which used Facebook ads to reach millions of the platform’s users in 2016, but that’s what he’s talking about. He also doesn’t say how, exactly, Congress or regulators should update the law.
But targeted advertising isn’t a legitimate threat to our electoral process. As I’ve explained
elsewhere, targeted ads are competitive, visible, and subject to confirmation bias, just as billboards and TV ads are, and the same can be said about fake news stories.
Facebook is a serious threat to democracy not because of the ads that companies buy on its platform, but because of its unprecedented power to determine what news and information more than 2 billion people see every day. I’ve spent more than six years discovering, studying, and quantifying disturbing
new forms of influence that the internet has made possible–almost all of which are entirely in the hands of executives at Google and Facebook.
To give you just one quick example of the power Zuckerberg has, his
own published data shows that if he chooses to broadcast “go vote” reminders just to the supporters of one candidate on Election Day in 2020–and that will almost certainly be the
Democratic candidate–Zuckerberg’s message will likely give that candidate at least 450,000 additional votes that day, with no one but Zuck and a couple of his cronies aware of his mischief. That massive manipulation is one of
at least five techniques Facebook can use to tip elections without people knowing.
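The arithmetic behind that estimate is straightforward: a small per-person turnout lift, multiplied across an enormous targeted audience, yields hundreds of thousands of votes. The sketch below uses purely illustrative numbers (audience size and lift are hypothetical assumptions, not Facebook’s actual figures) chosen only to show how such an effect compounds:

```python
# Illustrative back-of-envelope estimate of how a "go vote" reminder
# shown only to one candidate's supporters could shift vote totals.
# All inputs here are hypothetical assumptions, not measured data.

def extra_votes(supporters_reached: int, turnout_lift: float) -> int:
    """Additional votes = audience size x per-person turnout lift."""
    return round(supporters_reached * turnout_lift)

# Assume the reminder reaches 50 million supporters and nudges each
# person's probability of voting up by 0.9 percentage points.
print(extra_votes(50_000_000, 0.009))  # 450000
```

Even a lift well under one percentage point, invisible to any individual user, is enough to produce a six-figure swing at this scale.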
To protect elections, virtually every aspect of Facebook’s operations would need to be strictly regulated, but Zuckerberg is pointing us away from the real threats.
Privacy. Zuckerberg’s solution to the privacy problem we all face these days is for the world to adopt some version of the EU’s 2018
General Data Protection Regulation (GDPR), even though Zuckerberg himself
refused to implement GDPR guidelines throughout his worldwide company after the law went into effect.
This ploy is especially troubling. Facebook is the second most intrusive surveillance operation ever created by humankind (Google is No. 1), generating almost 100 percent of its revenue by monetizing the personal information it collects continuously about us and our children. Zuckerberg’s proposal for protecting our privacy will have no impact on the company’s sleazy business model–a model that inherently violates privacy.
Has the GDPR solved the privacy problem in the EU? Not at all. As Zuckerberg knows better than anyone, it hasn’t stopped Facebook from collecting and monetizing the personal data of a single soul in the EU.
In fact, because it’s so hard for companies to implement the GDPR, its main effect has been to
discourage venture funding for startups and small companies in Europe. It has had
minimal impact on Facebook and Google.
At best, the GDPR creates the illusion that people own and can erase their data after it has been collected. But once collected, that data is incorporated into predictive models that are owned fully by Facebook and Google. The data itself is expendable.
What’s more, no data is ever really removed or deleted. It’s just no longer accessible to you. Multiple copies of your data remain on servers and backup devices for long periods—possibly indefinitely, subject to Facebook’s “
relevant legal or operational retention needs.”
Portability. Zuckerberg should have stopped while he appeared to be ahead. Instead, he added this fourth and somewhat pathetic category of regulation.
By portability, he means that people should be able to remove their data from Facebook and bring it elsewhere. But bring it where exactly? Facebook has no competitors, and if you leave, you’re splitting yourself off from hundreds of friends and family members, almost as if you’ve run off to Siberia without your cell phone. Zuckerberg was probably chuckling when he wrote his musings on regulating portability.
None of Zuckerberg’s sly guidelines even begins to address the three serious threats the company poses to democracy and human freedom: the
aggressive surveillance, the ability to control what billions of people see and don’t see (the
censorship problem), and the ability to
manipulate people’s thinking, behavior, purchases, and votes subliminally.
Will our leaders resist Zuckerberg’s sleight of hand and address the tough issues directly? Given the
wads of money Facebook gives to our politicians, I have my doubts.
Robert Epstein, a former editor-in-chief of Psychology Today, is senior research psychologist at the American Institute for Behavioral Research and Technology. The author of 15 books and more than 300 articles, he is working on a book called “How Google and Facebook Ate Your Brain, and How You Can Get a New One.” Follow him on Twitter: @DrREpstein

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.