Meta Platforms Inc. announced on Tuesday that it will block more of the content teenagers can see on its social networks Facebook and Instagram. The decision comes amid mounting regulatory pressure from around the world to limit young users' exposure to potentially harmful content, according to a Reuters report.
The tech giant said it would impose stricter content restrictions specifically targeting adolescent users of the apps. Meta will also restrict additional search terms within its photo-sharing app Instagram. The company added that these measures would make it harder for teenagers to find sensitive content related to topics such as suicide, self-harm, and eating disorders, especially when using Instagram features like "Search" and "Explore."
Meta said the protective measures, set to roll out over the coming weeks, are aimed at delivering age-appropriate content to its users.
In the United States and Europe, Meta is facing pressure over allegations that its applications are addictive and have contributed to a mental health crisis among young people.
In October, attorneys general from 33 U.S. states, including California and New York, filed lawsuits against the company. They accused Meta of repeatedly misleading users about the risks associated with its platforms.
In Europe, the European Commission has requested information about the steps Meta is taking to shield children from illegal and harmful content.
The regulatory pressure follows testimony by a former Meta employee before the U.S. Senate, who accused the company of being aware of harassment and other harms faced by teenagers on its platforms but failing to take adequate measures to address them.