Instagram accounts that primarily feature images of children, but are run by adult users, will no longer be recommended to “potentially suspicious adults.” The update was announced in a blog post detailing the latest expansion of Meta’s child safety features, which includes new blocking and reporting capabilities for teenagers and additional protections for adult-managed accounts that feature children.

Meta has introduced a variety of online safety features for Facebook and Instagram users under 18, and some of these are now being extended to adults who frequently post images of children, a group that Meta says largely consists of parents and talent managers. Instagram will now “avoid recommending” these accounts to potentially suspicious adults, such as those who have already been blocked by teens, and will likewise avoid surfacing those adults to the accounts themselves. The app will also hide comments from potentially suspicious adults on these accounts’ posts, and make it harder for the two groups to find each other in Search.

While Meta says that these adult-managed accounts are “overwhelmingly used in benign ways,” the company has also been accused of knowingly allowing parents who sexually exploit their children for financial gain to remain on Facebook and Instagram. Hiding such accounts from potential predators builds on an update last year that barred accounts that heavily feature children from offering subscriptions or receiving gifts.

Other Teen Account protections coming to accounts that feature kids in the next few months will automatically place them under Instagram’s strictest message settings and filter out offensive and inappropriate comments. Instagram DMs are also getting additional safety features, including a combined report-and-block option for Teen Accounts. Teen users will now also see the month and year when the account they’re messaging joined Instagram, to help them spot potential creeps and scammers.
