Meta launches new safety features for teen users of Facebook and Instagram
Meta, the parent company of Facebook and Instagram, is launching new tools to protect teens and their privacy on its platforms. Teen users under 16 (or under 18, depending on the laws of their country) will be defaulted into more private settings on these platforms.
“We’re sharing an update on how we protect young people from harm and seek to create safe, age-appropriate experiences for teens on Facebook and Instagram,” Meta stated in a blog post.
The new default privacy settings for teen users of Facebook will limit:
Who can see their friends list
Who can see the people, Pages and lists they follow
Who can see posts they’re tagged in on their profile
Who may comment on their public posts
Teens will also be prompted to review posts they’re tagged in before those posts appear on their profile.
Meta working to stop the spread of teens’ intimate images online
Meta also shared an update on its work to stop the spread of teens’ intimate images online. It is collaborating with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried that intimate images they have taken might be shared on public online platforms without their consent.
“The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place,” it stated.
Meta added that it is also working with Thorn and its NoFiltr brand to create educational materials that reduce the shame and stigma surrounding intimate images, and that empower teens to seek help and take back control if they’ve shared such images or are facing sextortion.