EU Probes Meta for Not Doing Enough to Ensure Child Safety



  • Facebook and Instagram face a fresh investigation from the EU over concerns that they’re not doing enough to protect minor users on their platforms.
  • Two important concerns have been raised—that the platforms are too addictive and that their age-verification tools are not effective.
  • If Meta is found in violation of the DSA regulations, it will face a massive fine of up to 6% of its annual global revenue.

EU Probes Meta’s Facebook and Instagram for Not Doing Enough to Ensure Child Safety

The European Union has launched a fresh investigation against Meta (Facebook and Instagram) over concerns that it’s not doing enough to protect minor users on its platforms. If these concerns are found to be true, the company will be slapped with a heavy fine.

In a statement released on Thursday, the EU said that it’s worried Facebook and Instagram “may exploit the weaknesses and inexperience of minors and cause addictive behavior.”

The EU is also concerned that these platforms are not doing enough to keep underage users off them. Apparently, the age-verification methods put in place are not as effective as they should be.

Meta has responded to the EU’s concerns, saying that ensuring young people have a safe experience on its platforms is one of its top priorities. The company says it has spent a decade on research and has created around 50 safety tools to protect children online.

It’s worth noting that Meta isn’t lying when it says it has created child safety tools. For instance, it released two major updates in January this year to address mounting regulatory pressure.

First, it reinforced safety measures for teens, including shielding them from sensitive content in the ‘Search’ and ‘Explore’ features of Instagram. Second, Meta blocked under-18 users from receiving DMs from strangers on Instagram and Facebook.

However, the EU is right in saying that the platforms aren’t doing enough to keep young users off them—Meta is just promising safety. This is not enough, because a 12-year-old’s brain can still get fried watching useless reels all day long, even if they’re not ‘sensitive content.’

I’m talking about low attention spans, ADHD, teen mental health concerns, etc. Young users should undoubtedly be kept off these platforms with the help of stronger age verification tools.

Nevertheless, Meta acknowledged the points the EU raised and said that it’s looking forward to explaining its work to the bloc and doing everything in its power to make its platforms a better place for young users.

The Digital Services Act (DSA)

The EU has always been a tad stricter when it comes to apps that deal with minors. However, after the recently introduced Digital Services Act (a set of laws governing everything related to digital platforms), things have reached a whole new level—and that’s a good thing for us consumers.

Under the DSA, companies that are labeled as ‘very large online platforms’ have to do more to protect children. For instance, they need to have more protective features in place so that underage users don’t stumble upon inappropriate content.

Failure to comply with the rules would invite a hefty fine of up to 6% of the company’s annual global revenue—this is the fine I mentioned earlier.
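To get a sense of scale, here’s a rough back-of-the-envelope sketch of what the maximum fine could look like, assuming Meta’s reported 2023 global revenue of roughly $134.9 billion (the actual fine, if any, would be determined by the Commission):

```python
# Illustrative only: the maximum DSA fine is capped at 6% of annual global revenue.
meta_2023_revenue_usd = 134.9e9  # Meta's reported 2023 revenue, ~$134.9 billion
max_fine_rate = 0.06             # DSA cap: 6%

max_fine_usd = max_fine_rate * meta_2023_revenue_usd
print(f"Maximum possible fine: ${max_fine_usd / 1e9:.2f} billion")
```

That works out to roughly $8 billion—a figure large enough to explain why Meta is keen to show the Commission it’s cooperating.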

Note: Very large online platforms are apps/websites that have more than 45 million users per month in the European Union.

Interestingly, this is not the first time Meta has been under fire from the EU. In September 2023, it had to submit a risk assessment report to the commission detailing what it’s doing to protect minors on its platforms.

However, the EU did not find the report satisfactory. In a statement, it said that it’s not convinced that Meta is doing enough to comply with the Digital Services Act (DSA).

On the point of probes, the EU is also investigating Meta for allegedly violating DSA regulations on election disinformation. A separate probe was launched against Meta in April 2024 over concerns that the company is not doing enough to curb the spread of election-related misinformation on its platforms ahead of the upcoming European Parliament elections.

Furthermore, the EU isn’t the only one scrutinizing core Meta apps and operations. The company has received loads of criticism and investigations from other agencies and law enforcement officials around the world, too.

In December 2023, the New Mexico attorney general sued the company, stating that it enables child exploitation, the spread of child abuse material, solicitation, and trafficking. Similarly, in October 2023, Meta was sued by 33 US states for being too addictive for young users.

Meta has addressed both these incidents and said the same thing—that protecting young users is its priority and it will look into the matter and take the necessary steps. Whether and when it will actually do that is the bigger question, in my humble opinion.

Perhaps the EU can impose strict deadlines on the implementation of remedial measures, which could also be verified first-hand by officials?

Read more: Ex-Meta employee accuses company of ignoring his warnings on how Instagram was harming teens
