
Meta Platforms to introduce stricter content controls for teens amid global regulatory pressure

Amid mounting regulatory pressure, Meta Platforms (META.O) unveiled stricter content controls for teens on Instagram and Facebook, aiming to shield young users from harmful content.

Facing intense scrutiny and a lawsuit from 33 U.S. states, including California and New York, the company is responding to allegations that its platforms contribute to youth mental health problems and addiction.

In response to increasing global regulatory pressure and mounting legal challenges, Meta Platforms (META.O) announced on Tuesday (9 Jan) its decision to implement more stringent content controls for teenagers on Instagram and Facebook.

The move comes as part of the company’s efforts to protect children from harmful content on its popular social media apps.

In a blog post, Meta revealed that all teenagers will now be automatically placed into the most restrictive content control settings on both Instagram and Facebook.

The company also disclosed its plans to limit additional search terms on Instagram, aiming to make it more difficult for teens to encounter sensitive content related to issues such as suicide, self-harm, and eating disorders while using features like Search and Explore.

According to Meta, these measures, expected to roll out over the coming weeks, are designed to create a more “age-appropriate” experience for young users.

The company is facing growing scrutiny in both the United States and Europe over allegations that its platforms contribute to youth mental health issues and addiction.

In the United States, the attorneys general of 33 states, including California and New York, filed a lawsuit against Meta in October 2023.

The legal action accused the company of repeatedly misleading the public about the dangers of its platforms and their addictive nature.

The lawsuit, which followed a two-year multistate investigation, cited various studies, including Meta’s own research, linking young people’s use of Instagram and Facebook to mental health problems such as depression and anxiety.

In response to the legal challenges, Meta said it shares the states’ commitment to providing teens with safe and positive online experiences.

The company expressed disappointment that instead of working collaboratively with industry players to establish clear, age-appropriate standards for teen apps, the attorneys general chose to pursue legal action.

One of the key catalysts for increased regulatory scrutiny was the testimony of former Meta employee Arturo Bejar in the U.S. Senate.

Bejar alleged that the company was aware of the harassment and other harms faced by teens on its platforms but failed to take adequate action.

On Tuesday (9 Jan), Bejar stated that Meta’s recent changes did not address his concerns, criticizing the company for relying on “‘grade your own homework’ definitions of harm” and not providing an easy way for teens to report unwanted advances.

Under Meta’s new policy, the profiles of all users under 18 will default to the most restrictive settings. Certain types of content will be hidden on both Facebook and Instagram, even if shared by someone a teen follows.

Additionally, specific search terms related to sensitive topics will be restricted, redirecting teens to “expert resources for help” such as the National Alliance on Mental Illness.

While Meta will automatically place all teen accounts in the strictest settings, the company acknowledged that users can change them. Teens will also be prompted to review and update their privacy settings.

For those who adopt recommended privacy settings, Meta will limit actions such as reposting content, tagging or mentioning them, and including their content in Reels and remixes. Only a user’s followers will be able to send messages, and offensive comments will not be visible.

Despite not explicitly addressing ongoing legal actions, Meta stated in its blog post that it regularly consults with experts in adolescent development, psychology, and mental health to ensure the safety of its platforms. The company emphasized its decade-long commitment to developing policies and technology to address sensitive content.

In response to Meta’s announcement, Rachel Rodgers, a psychologist at Northeastern University, applauded the measures as “an important step in making social media platforms spaces where teens can connect and be creative in age-appropriate ways.”

She suggested that these changes provide an opportunity for parents to engage in conversations with their teens about navigating difficult topics.

The move by Meta comes amidst fierce competition with TikTok for the attention of young users.

Recent surveys, such as the Pew Research Center’s 2023 study, indicate a decline in Facebook’s popularity among teens: 63% of U.S. teens report using TikTok and 59% Instagram, compared with just 33% who use Facebook.
