The Wall Street Journal has spent several months reporting on how Meta's platforms, Facebook and Instagram, have surfaced inappropriate sexual content involving children to users. One detailed report revealed how Instagram's recommendation algorithm connected a network of accounts that buy and sell child sexual abuse material, steering them toward one another. A more recent investigation shows the problem extends to Facebook, where an ecosystem of accounts and groups sexually exploits children, some groups counting as many as 800,000 members. Meta's recommendation systems have allowed abusive accounts to find one another through features such as “groups you should join” on Facebook and hashtag autocomplete on Instagram.

In response, Meta said it is restricting how suspicious adult accounts interact with one another: these accounts can no longer follow one another on Instagram, will not be recommended to one another, and will not have their comments visible to other suspicious accounts. Meta has also expanded its list of terms, phrases, and emojis related to child safety and has begun using machine learning to detect connections between different search terms.

These reports, and the changes that followed, come as regulators in the United States and the European Union press Meta over how it protects children across its platforms.