Hello, tech enthusiasts! It’s always exciting to see how technology shapes our lives, and that’s especially true for younger users.
Meta is rolling out Teen Accounts for its Facebook and Messenger platforms, complete with essential built-in protections. This move is intended to enhance the safety and experience for young users.
The feature will first become available in regions including the U.S., U.K., Australia, and Canada, before expanding to other areas. It automatically enrolls teens into an app experience that restricts inappropriate content and unwanted contact.
Last September, Instagram began piloting similar Teen Accounts, prompted in part by pressure from lawmakers urging tech companies to prioritize the safety of teens online.
Meta has made it clear that teens under 16 will need parental approval to change any settings and will face various content restrictions. For instance, they can only receive messages from people they follow or have messaged previously.
Teens’ stories will also be visible only to a limited audience, and they will receive reminders to take a break after an hour of use. The app will also activate a ‘Quiet mode’ overnight to encourage healthier usage patterns.
In addition to these measures, Instagram will prevent teens under 16 from broadcasting live unless their parents permit it. Parental consent will also be required to turn off the feature that blurs potentially sensitive images.
These updates reflect Meta’s commitment to addressing mental health concerns tied to social media use. With over 54 million teens already moved to Teen Accounts on Instagram, Meta is eager to extend safer social media experiences across its platforms.
In a recent study, 94% of parents said these accounts help keep their children safe on the platform, and 85% said they help teens have more positive experiences.