Silicon Valley, in my opinion, has lost its moral compass—or its soul.
The Valley is lined with social media companies that are monetizing people’s weaknesses. Those companies know us better than we know ourselves. They use artificial intelligence to test whatever they want to test on us and learn how we react to those tests. That’s how they make money.
Moral Shift: Consumers Become the Product
Today, there’s almost no silicon in Silicon Valley. It’s mostly companies that monetize the information they’ve gathered about people. Facebook and Google, among other companies, have more psychologists and psychiatrists on their payrolls than some of the largest hospitals in the country. Those specialists work on crafting social media posts.
Social media use isn’t the same as drinking or drug use, but it’s actually a lot more dangerous in the long run, in my opinion. It plays on people’s weaknesses to make money, and it’s 100 percent legal.
These companies decide who will—and who won’t—see certain posts. By showing people specific content that increases engagement, companies are then able to monetize this information.
Major social media companies make their money by selling engagement information to advertisers, who use this information to optimize their advertisements and shove whatever product they’re selling down your throat. In a way, it’s similar to fueling addiction.
There’s that very old saying that if you don’t pay, you are the product. We’re the product for these companies.
It’s not moral, which is a strange word in business. Efficiency is what counts in business; morality, not so much—as long as everything is legal, of course.
These companies gather information and use intelligence to interpret it. When we look at a person or a picture, we immediately draw conclusions about them, which may be either correct or incorrect. The same logic applies to computers and artificial intelligence (AI).
How long you linger on something while scrolling through a given social media site, as well as what types of content you look at there, is all interpreted by AI. Tests are then run to see whether the AI jumps to the correct conclusions about your interests.
Of course, whenever we type or post something, the AI takes our previous actions into account. Based on that, the AI will direct us to whatever it predicts we’re trying to do.
Hopefully, these social media companies are only using our information for financial gain, but my impression is that this isn’t just about money. It’s also about steering political choices and elections, pushing the agendas of some high-ranking people at these companies. They believe they can dictate to a population they dismiss as deplorables what we should or shouldn’t do. To me, that’s a horrible possibility. That isn’t what technology should be about.
Profit becomes the bottom line. These companies’ executives owe a fiduciary duty to their shareholders, and their obligation is to maximize the value of the company’s shares.
How their decisions affect society, how they misinform people, and how they expose the public to things they should never be exposed to aren’t their concerns. But it’s their moral obligation to consider these things if they choose to include morals in their business model—and at some point, some companies do.
Life is like a pendulum, swinging to the left and to the right, back and forth. At some point, society will revolt, or the government will step in and tell people what they can and can’t do, as in the case of China. There, the state is iron-fisted about controlling social media and using it to further its oppression and control.
At some point, the same could happen when these social media companies overstep. We saw this when Twitter founder Jack Dorsey decided to ban then-President Donald Trump from his platform. Yet at the same time, Dorsey claims that Twitter is a news organization. As a news organization, Twitter shouldn’t be allowed to discriminate, yet it still does. Why? Because it has enough money and enough lobbyists to get away with it.