Meta launches more parental supervision tools on Instagram

Instagram and Facebook parent company Meta is adding several new parental monitoring tools and privacy features to its platform as social media companies face increased scrutiny over their impact on teen mental health.
But many of the features require minors and their parents to opt in, raising questions about how effective they will be. Instagram, for example, will send teens a notification after they block someone, encouraging them to let their parents “supervise” their account. The idea is to reach children at a moment when they are likely to be more open to parental guidance.
Once teens have opted in, parents can set time limits, see whom their child follows and who follows them, and track how much time the teen spends on Instagram. Parents cannot see the contents of messages.
Last year, Instagram launched parental supervision tools intended to help families navigate the platform and find resources and guidance. The catch is that children have to sign up if they want a parent to monitor their account. It’s unclear how many teens have opted in, and Meta declined to provide a figure.
With supervision enabled, parents can see how many mutual followers their child shares with the accounts the teen follows or is followed by. If a child is being followed by someone none of their friends follow, it could be a red flag that the teen doesn’t know that person in real life.
This “helps parents understand how much their teen knows about these accounts and helps encourage offline conversations about those connections,” Meta said.
Meta will also bring the parental supervision tools already available on Instagram and its virtual reality products to Messenger. The opt-in features let parents see how much time their child spends on the messaging service, along with information such as contact lists and privacy settings, but not who their child chats with.
Such features are useful for families where parents are already involved in their children’s online lives and activities. Experts say that is not the reality for many.
Last month, U.S. Surgeon General Vivek Murthy warned that there isn’t enough evidence to show that social media is safe for children and teens, and called on tech companies to take immediate action to protect kids.
Murthy told The Associated Press that while he is aware social media companies are taking steps to make their platforms safer, it is not enough. For example, although children under 13 are technically barred from social media, many younger children misrepresent their ages to access apps such as Instagram and TikTok, with or without their parents’ permission.
Murthy also said it is unfair to expect parents to manage rapidly evolving technology that fundamentally changes how their children think about themselves, how they form friendships and how they experience the world, technology, he added, that previous generations never had to manage at all.
“We put all of that on the shoulders of parents, and it’s totally unfair,” Murthy said.
Also, starting Tuesday, Meta will encourage, but not force, children to take a break from Facebook, much as it already does on Instagram. After 20 minutes, teens are prompted to leave the app for a while; those who want to keep scrolling can simply dismiss the notification. TikTok recently introduced a similar 60-minute time limit for users under 18, but teens can bypass it by entering a passcode they set themselves, or, for children under 13, one set by a parent.
“Our focus is kind of a suite of tools to help parents and young people maximize their safe and relevant experiences online,” said Diana Williams, who oversees product changes for youth and families at Meta. “We are also looking to build tools that teens can use on their own to learn how to manage and be aware of how they’re spending their time, like ‘Quiet Mode.’”