Instagram, Facebook adding more parental controls

Meta, the parent company of Instagram and Facebook, is adding new parental supervision tools and privacy features to its platforms as social media companies face increasing scrutiny over their effects on teen mental health.

But many of the features require minors — and their parents — to opt in, raising questions about how effective the measures are.

The new measures allow parents to view how much time their teen is spending on the Messenger app and to receive updates about who is on their child's Messenger contact list.

Meta has also implemented a notification on Facebook suggesting teens take a break from the platform after 20 minutes. Another nudges Instagram users to close the app if they have been scrolling through videos for too long late at night.

"I think it's a step in the right direction," said Dr. Nava Silton, a child psychologist. 

However, Dr. Silton says social media platforms need to go even further. 

"Surveying parents and getting a really great, large sample of parents and seeing what their concerns and challenges are I think would be incredibly important," Dr. Silton said. 

Jim Steyer, the CEO and founder of Common Sense Media, called the announcement a "smoke screen."

"None of these new features address the negative impact their business model is having on the well-being of kids, including their mental health. We need national privacy laws to protect kids," Steyer said in a statement.

Last month, U.S. Surgeon General Vivek Murthy warned that there is not enough evidence to show that social media is safe for children and teens and called on tech companies to take "immediate action to protect kids now."