Facebook’s Business Model Is Monstrous: Here’s How We Fix It

A giant digital sign is seen at Facebook's corporate headquarters campus in Menlo Park, Calif., on Oct. 23, 2019. Josh Edelson/AFP via Getty Images
John Mac Ghlionn
Commentary

Of all the threats facing the United States, China and Facebook pose two of the biggest.

The United States, to some extent, is limited in its ability to address Chinese aggressions. Facebook, on the other hand, is an American company with an American founder—the United States can take appropriate action. The question, though, is what sort of action should be taken? In this short piece, I propose a somewhat novel solution to the “Facebook Problem.”

At first, Facebook and the Chinese Communist Party (CCP) appear to have little, if anything, in common. On closer inspection, however, they share striking similarities. The CCP has been labeled hostile and autocratic; Facebook, meanwhile, has been described as the “largest autocracy on Earth,” a “hostile foreign power” capable of destroying the United States. Both the CCP and Facebook Inc. are considered a threat to democracy; both are responsible for spreading dangerous disinformation; and both have a history of suppressing free speech. Lastly, both are run by men who appear to answer to no laws in particular. Of course, Mark Zuckerberg is not Xi Jinping. One is directly responsible for the torture occurring in Xinjiang, for example; the other is not. With Facebook, however, Zuckerberg created a Frankenstein’s monster, albeit unknowingly. For this, he must be held accountable. The question, though, is how?
When we discuss Big Tech, we’re referring to Apple, Amazon, Google, Microsoft, and Facebook. The infamous five, contrary to popular belief, are not created equal. People like Elizabeth Warren and Alexandria Ocasio-Cortez are wrong to view Big Tech as some sort of amorphous blob. After all, Apple and Facebook offer completely different services.
In a recent interview with “60 Minutes,” a whistleblower by the name of Frances Haugen discussed the many ways in which Facebook harms society. Haugen, a data scientist from Iowa, called Facebook “substantially worse” than any other social media platform. Before joining Facebook, Haugen worked at both Pinterest and Google. Facebook, as many readers already know, has a notorious history. Since its launch back in 2004, it has been accused of a litany of offenses, from spying on customers to giving users’ data away without their permission. In the interview, Haugen accused decision makers at Facebook of ignoring research documenting the ways in which Instagram (owned by Facebook) fuels eating disorders and depression in young women.
In truth, Haugen’s revelations simply shed more light on a well-known fact—Facebook, in its current form, is deeply problematic. Along with the Chinese regime, it poses one of the greatest threats to democracy.
Former Facebook employee and whistleblower Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing titled, "Protecting Kids Online: Testimony from a Facebook Whistleblower," on Capitol Hill in Washington on Oct. 5, 2021. Matt McClain/Pool via Getty Images

What Can Be Done?

Some, like the aforementioned Warren, are obsessed with the idea of breaking up Big Tech. This, however, is a fool’s errand. The Big 5 are all conglomerates—they have acquired numerous other businesses, from AI research facilities to digital health startups. Facebook Inc. now owns 78 different companies. Ostensibly, it’s a social media company. In reality, though, it’s a multi-headed hydra worth $1 trillion. More importantly, breaking up Facebook Inc. would fail to address the core problem, namely its manipulation of algorithms.

Instead, the U.S. government should home in on the regulation of algorithms. As Haugen explained in the interview, Facebook intentionally “games” the system so that users are presented with the most outrageous content. This manipulation breeds anger; with anger comes engagement; with engagement come greater profits.
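To make that mechanism concrete, here is a minimal, hypothetical sketch of engagement-weighted feed ranking. The post fields, the weights, and the rank_feed function are illustrative assumptions for explanation only; they are not Facebook’s actual system.

```python
# Hypothetical illustration of engagement-weighted ranking.
# Fields, weights, and the scoring formula are assumptions for
# explanation only; they are not Facebook's actual algorithm.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    angry_reactions: int  # provocative posts tend to score high here


def engagement_score(post: Post) -> float:
    # If heated reactions and comments are weighted most heavily,
    # the most outrage-inducing posts rise to the top of the feed.
    return 1.0 * post.likes + 3.0 * post.comments + 5.0 * post.angry_reactions


def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort the feed so the highest-engagement posts appear first.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Cute dog photo", likes=120, comments=4, angry_reactions=0),
        Post("Inflammatory political claim", likes=40, comments=60, angry_reactions=90),
    ]
    for post in rank_feed(feed):
        print(round(engagement_score(post), 1), post.text)
```

In a scheme like this, the inflammatory post outranks the far more widely liked one simply because anger and argument are weighted more heavily; a regulator would, in effect, be auditing choices like those weights.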

Authors at TechCrunch made a bold suggestion: The FDA “must assert its codified right to regulate the algorithm powering the drug of Instagram.” By viewing algorithms as “a drug impacting our nation’s mental health,” the FDA would be in a strong position to rein in Big Tech.
The authors, however, failed to discuss the ubiquity of algorithms. Moreover, the vast majority of algorithms are helpful rather than harmful. Without algorithms, Google’s search engine would be utterly useless. Need to book a taxi? Order a pizza? Book a flight? To do so, we require algorithms—lots of them. In many ways, without these computer-implementable instructions, modern society would grind to a halt. Algorithms are the computational equivalent of water or air—in other words, they’re everywhere.
Instead, I suggest treating algorithms like vehicles. Just as the National Highway Traffic Safety Administration (NHTSA) regulates the safety of motor vehicles and related equipment, the United States stands to benefit from a National Algorithm Safety Administration—an independent agency dedicated to “test driving” algorithms in the same way independent agencies test drive cars. Or, if this seems absurd, how about a National Algorithm Safety Board, modeled on the National Transportation Safety Board (NTSB)? According to the NTSB’s website, that independent federal agency is “charged by Congress with investigating every civil aviation accident in the United States and significant accidents in other modes of transportation—highway, marine, pipeline, and railroad.” A National Algorithm Safety Board could perform a similar function, conducting thorough, expert-led investigations of cases where algorithms cause psychological injury. All investigations would be carried out by an elected team of independent analysts.

For this piece, I reached out to Facebook for comment on the matter; no comment was offered.

In a recent interview with New York Magazine, Scott Galloway, professor of marketing at the New York University Stern School of Business, discussed the many ways in which Big Tech giants like Facebook and Amazon have learned to completely disregard specific regulations and laws. According to Galloway, tech companies are willing to do “whatever is required to scale,” including abusing employees and lying to Congress. As Galloway warned, until an “algebra of deterrence” is implemented, Big Tech companies will continue to operate in immoral—even criminal—ways. With Facebook, an algebra of deterrence is possible—it starts with a National Algorithm Safety Board.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.
John Mac Ghlionn
Author
John Mac Ghlionn is a researcher and essayist. He covers psychology and social relations, and has a keen interest in social dysfunction and media manipulation. His work has been published by the New York Post, The Sydney Morning Herald, Newsweek, National Review, and The Spectator US, among others.