The US Federal Trade Commission (FTC) has proposed a blanket ban on Meta (formerly Facebook) from monetizing data belonging to anyone under the age of 18, saying that the social network violated its 2020 privacy order.

Under the proposed changes, Meta, which rebranded from Facebook in October 2021, would be prohibited from profiting from data it collects from users under the age of 18, including data gathered through its virtual reality products.

It would also be subject to other expanded limitations, including on its use of facial recognition technology, and would be required to provide additional protections for users.

“Facebook has repeatedly violated its privacy promises. The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection, in a statement late on Wednesday.

This is the third time the FTC has taken action against Facebook for allegedly failing to protect users’ privacy.

The Commission first filed a complaint against Facebook in 2011 and secured an order in 2012 barring the company from misrepresenting its privacy practices.

According to a subsequent complaint filed by the Commission, Facebook violated the first FTC order within months of it being finalized — engaging in misrepresentations that helped fuel the Cambridge Analytica scandal.

In 2019, Facebook agreed to a second order, which took effect in 2020, resolving claims that it violated the FTC’s first order.

“Today’s action alleges that Facebook has violated the 2020 order, as well as the Children’s Online Privacy Protection Act Rule (COPPA Rule),” said the FTC.

The 2020 privacy order required Facebook to pay a $5 billion civil penalty.

In addition, the FTC has asked the company to respond to allegations that, from late 2017 until mid-2019, Facebook misrepresented that parents could control whom their children communicated with through its Messenger Kids product.

Despite the company’s promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls.

Under the COPPA Rule, operators of websites or online services that are directed to children under 13 must notify parents and obtain their verifiable parental consent before collecting personal information from children.