Meta faces child safety risks on Instagram

Meta, the parent company of Instagram, is facing increased scrutiny from European Union regulators over child safety concerns on the social media platform. The regulators have issued a formal request for information (RFI) under the Digital Services Act (DSA), seeking details on Meta’s response to these concerns, particularly focusing on the risks associated with the sharing of self-generated child sexual abuse material (SG-CSAM).

The DSA, which became applicable to larger platforms like Instagram in late August, mandates Big Tech companies to address illegal content and implement measures to prevent misuse of their services. The regulation places a strong emphasis on the protection of minors, prompting several early RFIs from the European Commission related to child safety.

This latest request follows a report by the Wall Street Journal (WSJ) indicating that Instagram is grappling with a problem related to CSAM that was exposed earlier in the year. The report suggests that Meta has not effectively addressed the issues identified, despite the establishment of a child safety task force. The company’s recommendation systems reportedly continue to promote content associated with underage-sex material, even after the removal of certain hashtags.

Meta under EU pressure over child safety risks

The EU had previously warned Meta about potential “heavy sanctions” if prompt action wasn’t taken to address child protection issues following the WSJ’s revelations in June. Another WSJ report alleges that, five months later, Meta’s efforts to rectify the situation have been inadequate.

The DSA empowers the European Commission to impose fines of up to 6% of a company’s global annual turnover for violations. Meta, having already been fined over half a billion dollars for Instagram’s violation of data protection rules for minors just over a year ago, could face significant financial penalties if found non-compliant.

The Commission has requested additional information from Meta regarding the measures taken to comply with obligations related to the protection of minors, specifically concerning the circulation of SG-CSAM on Instagram. The inquiry also covers details about Instagram’s recommender system and the amplification of potentially harmful content.

With a deadline of December 22, Meta is required to provide the Commission with the requested child safety data. Non-compliance with RFIs can result in DSA sanctions. The EU regulators’ repeated questioning of Meta’s approach to safeguarding minors could also pose reputational risks for the company. This is the third RFI Meta has received since DSA compliance began, and the second focusing on child safety on Instagram, indicating ongoing assessments by the EU that may lead to further actions or penalties. Meta has been contacted for comment on the latest RFI.