Published on 03:37 PM, May 09, 2023

A brief look inside Meta's recent youth safety initiatives

Meta, the parent company of Facebook and Instagram, has announced a series of youth safety initiatives to protect young people while they use its apps, as part of its ongoing commitment to youth safety online. These initiatives are already in place and are continually reinforced. In a recent press briefing, the company said it has been working closely with experts in mental health, child psychology, and digital literacy to build features and tools that allow users to connect online safely and responsibly.

Meta's approach to youth safety is based on three principles: responsible connections, age-appropriate safeguards, and innovation. To implement these principles, Meta engages in co-design with parents and children, research, and consultations to understand the needs of young people on its platforms. From these efforts, Meta has identified several key takeaways: the need to revisit the notion of parental control with a holistic approach, the importance of promoting responsible behavior online, and the desire of both teens and guardians for a network focused on online safety.

One of the initiatives requires everyone to be at least 13 years old before they can create an account on Facebook or Instagram. Meta also deletes the accounts of anyone under the minimum age as soon as it becomes aware of them. In addition, the company has rolled out features that let people manage their time, prevent unwanted interactions, and control what types of content and accounts they see. Since February 2023, advertisers can only use age and location to reach teens; gender is no longer a targeting option.

Meta has partnered with the National Center for Missing and Exploited Children (NCMEC) to build a global platform called "Take it Down" for teens who are worried intimate images they created might be shared on public online platforms without their consent. This platform can help prevent a teen's intimate images from being posted online and can be used by other companies across the tech industry.

Instagram has also introduced new safety features for minors. Since 2021, everyone under 16 years old defaults to a private account when they join Instagram. New accounts belonging to minors on Facebook automatically default to sharing with 'friends' only, and their default audience options for posts do not include 'public.'

Instagram's Supervision Tools also allow parents around the world to work with their teens to supervise their Instagram experience. The "Restrict" tool was built specifically in response to feedback from young people who said they wanted more control over what was happening when they were being bullied or harassed.

Meta has also designed resources, guides, and programs with information on teen online safety, including the Facebook Safety Center and Bullying Prevention Hub, the Instagram Safety Center, and the Instagram Community Portal.

The company's Community Standards also provide additional protections for minors in policy areas such as bullying and harassment, privacy violations and image privacy, and violent and graphic content. Under its Privacy Violations policy, Meta removes images or videos of minors under 13 years old when the content is reported by the minor, a parent, or a legal guardian.

Meta has also established regional initiatives and partnerships to promote online safety, such as partnering with Brac in Bangladesh to reach 10 million women and teenagers and collaborating with organizations such as Zindegi Trust in Pakistan to promote online safety. In addition, Meta has policies and enforcement mechanisms in place to ensure age-appropriate content and to safeguard users, including zero tolerance for bullying, harassment, hate speech, and self-harm.

Meta's efforts to promote youth safety are not limited to policy and enforcement mechanisms; they also include educational initiatives such as the Kishor Alo Instagram Parents Guide in Bangladesh, the educational curriculum in Sri Lanka, and ChaiChats in Pakistan. Through these efforts, Meta is working to create a safer online environment for youth while also addressing the challenges of data collection and privacy protection.

With more young people going online, engaging parents on the topic of online safety is more important than ever, and Meta says it is committed to helping both young people and caregivers understand how to use its products and tools to create age-appropriate experiences.