NY Considers New Rules for Social Media Companies Over Child, Teen Use

New York lawmakers have proposed rules, similar to the EU's Digital Services Act, that would restrict how social media platforms can interact with users under 18.
N.Y. Gov. Kathy Hochul speaks onstage at the 2023 Concordia Annual Summit at Sheraton, N.Y., on Sept. 19, 2023. Riccardo Savi/Getty Images for Concordia Summit
Stephen Katte

New York lawmakers have announced new proposals to restrict what social media platforms can do with the personal information of users under 18 and to limit minors' exposure to addictive algorithms.

In an Oct. 11 press release, New York Gov. Kathy Hochul, New York Attorney General Letitia James, State Senator Andrew Gounardes (D-N.Y.), and state Assemblywoman Nily Rozic revealed two bills aimed at reducing the negative impacts of social media on users under 18.

“Our kids are in crisis, and the adults in the room need to step up,” Gov. Hochul said.

“The statistics are extraordinarily disturbing: teen suicide rates are spiking, and diagnoses of anxiety and depression are surging. It’s critical we all stand together to address the youth mental health crisis,” she added.

Social Media Can Be Addictive

One of the bills, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, aims to give families control over what content appears on their children's feeds: families would have to opt in before social media companies could use algorithms that track a child's activity to serve more of the same kind of content.

“Algorithmic feeds have been shown to be addictive because they prioritize content that keeps users on the platform longer,” the governor’s office said.

Under the proposed law, adult permission would be required for kids under 18 to turn on suggested feeds on social media apps.

Parents would also be able to opt their kids out from accessing social media platforms and their push notifications between midnight and 6 a.m. Platforms would need “verifiable parental consent” to continue offering services during these hours.

Under the new proposals, social media platforms would also have to provide parents with tools to cap the number of hours their kids can use them.

Pakistani children point at a computer screen showing a screen grab of a press conference attended by provincial minister Shaukat Yousafzai and streamed live on social media, in Islamabad on June 15, 2019. Farooq Neem/AFP/Getty Images

Companies that violate the proposed laws would face fines of $5,000 for every breach. Parents of the affected child would also be able to seek $5,000 in damages from the social media platform per incident.

However, social media companies will be allowed an opportunity to “cure any claim” brought by the parent/guardian of an affected minor.

Algorithmic feeds are considered addictive because they prioritize content the user likes, keeping users, and particularly minors, on the platform for longer stretches.
According to a Pew Research Center survey of 1,316 teens aged 13 to 17, conducted April 14 to May 4, 2022, the vast majority of respondents reported using YouTube and TikTok daily, with some saying they use the sites almost constantly.

About 77 percent of the teens surveyed said they use YouTube every day, while a smaller majority, around 58 percent, said they are on TikTok at least once a day.

Around half are on Instagram or Snapchat at least once a day, while only a relatively small number, around 19 percent, reported daily use of Facebook.

Protecting Minors’ Data

The logos of Facebook, YouTube, TikTok, and Snapchat on mobile devices in a combination of 2017–2022 photos. AP Photo

The second bill being proposed, the New York Child Data Protection Act, would prevent social media companies from collecting, sharing, and selling personal data about minors without informed consent. For kids under 13, that informed consent must come from a parent.

According to the Pew Research Center study, more than half the respondents felt they were not in control of the information social media companies collect, and often sell.

Around 40 percent of teens said they felt they had little control over their data, while 20 percent reported feeling like there was no way to control what information about them was gathered.

Another 26 percent were unsure how much control they have over their data, while only 14 percent of teens said they think they have a lot of control over how platforms use it.

At the same time, only 8 percent said they were highly concerned about how much personal information social media companies might have about them, and another 13 percent described themselves as very concerned.

Under the second bill, companies that violate the limits on collecting, sharing, and selling children's personal data would face damages or civil penalties of up to $5,000 per breach.

Platforms are also prohibited from using children’s data and online activity to target them with personalized ads.

The Epoch Times contacted several social media platforms for their response to the proposed laws but had not heard back by the time of publication.

The bills put forward are similar to rules already in place in the 27-nation European Union.

Under the EU's Digital Services Act, which came into force across Europe this year, platforms are required to give users an alternative to the automated algorithms that recommend videos and posts based on their profiles and viewing history.

Explicit Images

In another hot-button issue around social media, Pew Research Center’s data also found that around half the parents they surveyed were worried their child could be exposed to explicit content on social media.
This concern has only intensified in recent days following Hamas’s surprise Oct. 7 attack on Israel, as videos and images of the attacks have flooded social media.
Soldiers walk in front of an Israeli police station that was damaged during battles to dislodge Hamas terrorists on Oct. 8, 2023. Jack Guez/AFP via Getty Images
Several schools in the United States, the United Kingdom, and Israel have been advising parents to delete their children’s social media apps to prevent their children from being exposed to violent and traumatic content.

User Control

In a move to give users more control over their feeds, X, the platform formerly known as Twitter, has said that users can now control what kinds of media they want to see. In the “Content you see” settings, users are given the option to choose whether or not to see sensitive media.
X has reported an increase in daily active users and in content coming out of the conflict area. But while it has given people the option to restrict sensitive media, the company said, "In these situations, X believes that, while difficult, it's in the public's interest to understand what's happening in real time."

However, the platform has taken action under its “Violent and Hateful Entities Policy” to remove newly created Hamas-affiliated accounts and is working to “try and prevent terrorist content from being distributed online.”

“X is committed to the safety of our platform and to giving people information as quickly as possible. In the coming days, we will continue to keep our community updated,” it said.

Stephen Katte is a freelance journalist at The Epoch Times. Follow him on X @SteveKatte1