EU investigating Meta over addiction and safety concerns for minors


Meta is in hot water yet again over its methods (or lack thereof) for protecting children. The European Commission has opened formal proceedings to determine whether the owner of Facebook and Instagram is in breach of the Digital Services Act (DSA) by contributing to children's social media addiction and failing to ensure they have a high level of privacy and safety.

The Commission's investigation will specifically examine whether Meta has properly assessed and acted against the risks posed by its platforms' interfaces. It is concerned about how their designs can "exploit the weaknesses and inexperience of minors and lead to addictive behavior and/or reinforce the so-called 'rabbit hole' effect. Children's physical and mental well-being, as well as their rights, must be respected."

The proceedings will also examine whether Meta takes appropriate steps to prevent minors from accessing inappropriate content, whether its age verification tools are effective, and whether it provides simple, robust privacy settings for minors by default.

The DSA sets standards for very large online platforms and search engines (those with 45 million or more monthly users in the EU) such as Meta. Designated companies' obligations include being transparent about their advertising and content moderation decisions, sharing their data with the Commission, and assessing the risks their systems pose in areas such as gender-based violence, mental health and the protection of minors.

Meta responded to the formal proceedings by pointing to features such as its parental supervision settings, quiet mode, and automatic content restrictions for teens. "We want young people to have a safe, age-appropriate experience online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge facing the whole industry, and we look forward to sharing details of our work with the European Commission," a Meta spokesperson told Engadget.

However, Meta has repeatedly failed to prioritize youth safety. Previous alarming developments include Instagram's content recommendation algorithm surfacing child sexual abuse material, as well as claims that the company designed its platforms to be addictive to young people while serving them psychologically harmful content, such as posts promoting eating disorders and body dysmorphia.

Meta has also served as a hub of disinformation for people of all ages. The Commission already opened formal proceedings against the company on April 30 over concerns including deceptive advertising, data access for researchers, and the lack of an "effective third-party real-time civic discourse and election-monitoring tool" ahead of June's European Parliament elections. Earlier this year, Meta announced it will completely shut down CrowdTangle in August, a tool that clearly shows how fake news and conspiracy theories circulate on Facebook and Instagram.


