The social media company established a child-safety task force in June after an earlier report by the Journal and researchers at Stanford University and the University of Massachusetts Amherst documented how Meta was struggling to stamp out a web of accounts that trade in underage-sex content.
But months later, new tests by the Canadian Centre for Child Protection, a nonprofit dedicated to the personal safety of children, show that these problems remain pervasive on Meta’s Instagram and Facebook. While Meta has long contended with pedophilic content on its platforms, Facebook’s recommendation algorithms continue to suggest harmful accounts and groups with names such as “Little Girls,” “Beautiful Boys” and “Young Teens Only.” Some of these groups have as many as 800,000 members.