
Timeline of tools, features, and resources to help support teens and parents

Updated: a week ago

We’ve built numerous tools, features and resources that help teens have safe, positive experiences, and that give parents simple ways to set boundaries for their teens. Over time, some of these have been improved or consolidated with other features. The timeline below provides a snapshot of some of the many tools and features we’ve announced across our apps and technologies. You can find more information about these tools and features, and how they work, in our Instagram Parent Guide and our Family Center, and additional resources on supportive online experiences in our Resources Hub for Parents and Guardians.

  • October 2010: Instagram launches with the blocking feature.
  • September 2016: We gave people the option to swipe to delete comments that they found inappropriate on Instagram.
  • September 2016: We launched our comment keyword filter on Instagram, allowing people to filter out offensive or inappropriate comments.
  • December 2016: We gave people the option to turn off comments on Instagram.
  • December 2016: We launched anonymous reporting on Instagram of accounts that may be struggling with their mental health, directing those accounts to resources.
  • March 2017: We added the ability for people to connect with crisis support partners like Crisis Text Line and NEDA on Messenger.
  • June 2017: We launched our offensive comment filter control, allowing people to automatically hide certain offensive comments on Instagram. We later expanded the offensive comment filter to include terms related to bullying and harassment.
  • September 2017: We gave people the option to choose who can comment on their posts on Instagram.
  • September 2017: We gave people the option to file an anonymous report of potential self-injury in Live, and provided resources to those affected on Instagram.
  • August 2018: We launched an activity dashboard, which included a daily reminder and a new way to limit notifications on Instagram and Facebook.
  • July 2019: We began showing Comment Warnings on Instagram to prompt people to reconsider posting comments that may be hurtful. We later expanded this feature to include an additional, stricter warning when people repeatedly try to post potentially offensive comments, and more details about what could happen if they choose to proceed.
  • October 2019: We launched Restrict, a feature that allows people to control their Instagram experience without notifying people who may be attempting to target them.
  • December 2019: We began showing Caption Warnings on Instagram, to prompt people to reconsider posting images and captions that may be offensive or hurtful.
  • May 2020: We launched the ability for people to delete multiple comments at once.
  • May 2020: We launched the ability for people to block or Restrict multiple accounts at once. We later launched “multi-block”, an option for people to both block specific accounts and preemptively block new accounts that someone may create to target them.
  • May 2020: We gave people the option to pin comments, giving them an easy way to amplify and encourage positive interactions.
  • May 2020: We gave people the option to manage who can tag and mention them on Instagram, to help protect them from bullies who may try to target them in this way.
  • November 2020: We added a message at the top of all search results when people searched for terms related to suicide or self-injury on Instagram, pointing to resources.
  • February 2021: We launched expert-backed resources when someone searches for eating disorders or body image-related content, and in May we launched a dedicated reporting option for eating disorder content.
  • March 2021: We restricted adults from starting private chats with teens they're not connected to on Instagram and Messenger.
  • March 2021: We began using prompts – or safety notices – to encourage teens to be cautious in conversations with adults they’re already connected to.
  • April 2021: We launched our Hidden Words tool to give people the option to filter DM requests containing certain offensive words, phrases, and emojis.
  • May 2021: We gave people the ability to hide public like counts, to give them more control over their experience.
  • July 2021: We began limiting potentially suspicious adults from finding and following teens in places like Reels, Explore, or ‘Suggested for You’.
  • July 2021: We announced default private account settings for users under 16 when they sign up for Instagram, as well as notifications encouraging existing teens under 16 to switch to a private account.
  • July 2021: We launched our Sensitive Content Control, which allows people to decide how much sensitive content shows up in Explore. We began defaulting all users under 16 into the ‘Less’ setting in Sensitive Content Control on Instagram, to make it more difficult for them to come across potentially sensitive content in Search, Explore, Hashtag Pages, Reels, Feed Recommendations and Suggested Accounts.
  • August 2021: We launched our ‘Limits’ tool, which allows people to automatically hide comments and DM requests from people who don’t follow them, or who only recently followed them.
  • December 2021: We launched ‘Take A Break’ to empower people to make informed decisions about how they’re spending their time. To make sure that teens were aware of this feature, we showed them notifications suggesting they turn these reminders on.
  • December 2021: We began restricting people from tagging or mentioning teens who don’t follow them, and from including those teens’ content in Reels Remixes or Guides, when the teens first join Instagram.
  • February 2022: We launched “Your activity,” which allows people to bulk manage their content and interactions, review their history, and download their information.
  • February 2022: We introduced Personal Boundary for Horizon Worlds and Horizon Venues, preventing avatars from coming within a set distance of each other and making it easier to avoid unwanted interactions. Personal Boundary is automatically turned on for everyone in Horizon Worlds.
  • March 2022: We introduced VR Parental Supervision Tools on Quest, and we launched Family Center and Parental Supervision Tools on Instagram. Initially, Instagram’s supervision tools allowed parents to:
    • View how much time their teens spend on Instagram and set time limits
    • Set specific times during the day or week to limit their teen’s use of Instagram
    • Be notified when their teen chooses to report an account or post, including who was reported and the type of content
    • View what accounts their teens follow and the accounts that follow them
  • March 2022: We announced Favorites and Following, two new ways for people to choose what they see in their Instagram Feed, including giving people the option to see their feeds in chronological order.
  • May 2022: We launched the ability for parents to lock teens out of their apps on the Quest platform.
  • June 2022: We brought more Parental Supervision Tools to Quest headsets, allowing parents to:
    • Approve their teen’s download or purchase of an app
    • Block specific apps that may be inappropriate for their teen
    • Receive “Purchase Notifications,” alerting them when their teen makes a purchase in VR
    • View headset screen time from the Oculus mobile app, so they’ll know how much time their teen is spending in VR
    • See a list of their teen’s friends
    • Limit a teen’s ability to use their Quest with a PC or sideload apps not available on the Quest store
  • June 2022: We introduced Voice Mode in Horizon Worlds, which allows you to choose how you hear people you don’t know. When Voice Mode is set to garbled voices, the voices of people who aren’t your friends sound like unintelligible, friendly sounds.
  • June 2022: We updated Parental Supervision Tools on Instagram to include more options for parents, including:
    • Setting specific times during the day or week when they would like to limit their teen’s use of Instagram
    • Seeing more information when their teen reports an account or post, including who was reported and the type of report
  • June 2022: We launched new Nudges for teens on Instagram that encourage them to switch to a different topic if they’re repeatedly looking at the same type of content on Explore.
  • June 2022: We introduced new ways to verify people’s age on Instagram, including privacy-preserving selfie videos.
  • July 2022: We introduced new tools that allow parents to enable and disable social features for teens they’re supervising in Quest, including disabling the ability for their teens to send or receive chat messages.
  • October 2022: We began nudging people to be kind in DM requests, to discourage offensive or inappropriate DMs.
  • November 2022: We began prompting teens to report accounts to us after they block someone.
  • November 2022: We began defaulting teens under the age of 16 (or under 18 in certain countries) into more private settings when they join Facebook, and encouraged teens already on the app to choose these more private settings.
  • December 2022: We brought age verification tools to Facebook Dating to help verify that only adults are using the service, and to continue our efforts to prevent minors from accessing it.
  • January 2023: We began giving teens more ways to manage the types of ads they see on Facebook and Instagram with Ad Topic Controls.
  • January 2023: We launched Quiet Mode, a feature to help people focus and to encourage them to set boundaries with their friends and followers. We prompt teens to turn on Quiet Mode when they spend a specific amount of time on Instagram late at night. Note: As of September 2024, the functionality offered by Quiet Mode has become part of our new Sleep Mode.
  • January 2023: We made updates to give people more control over the content they see on Instagram. First, we gave people the option to choose to hide multiple pieces of content in Explore at one time. When people select “Not interested” on a post seen in Explore, we aim to avoid showing them this kind of content in other places where we make recommendations, like Reels, Search and more. We also began allowing people to customize their recommendations with keywords. People can add a word or list of words, emojis or hashtags that they want to avoid – like “fitness” or “recipes” – and we’ll work to no longer recommend content with those words in the caption or the hashtag.
  • February 2023: Meta and NCMEC launched Take It Down, a new tool to help prevent the spread of young people’s intimate images.
  • April 2023: We brought Parental Supervision Tools to Horizon Worlds, allowing parents to:
    • See, adjust, and lock safety features like Voice Mode and Personal Boundary.
    • See who their teen follows and who follows their teen.
    • See which apps their teen has used and how much time they’ve spent in Meta Quest and Worlds in the past seven days.
    • Give permission to allow or block their teen from using apps, including Worlds.
  • April 2023: We introduced a new tool, the Meta Quest Browser Website Category Filter, to help parents and guardians manage what their teen can access and view in the Meta Quest Browser.
  • June 2023: We brought Parental Supervision Tools to Messenger, allowing parents to:
    • View how much time their teen spends on Messenger
    • View and receive updates on their teen’s Messenger contacts list, as well as their teen’s privacy and safety settings
    • Get notified if their teen reports someone (if the teen chooses to share that information)
    • View who can message their teen (only their friends, friends of friends, or no one) and see if their teen changes this setting
    • View who can see their teen’s Messenger stories and get notified if these settings change
    • We later added additional features, including the ability for parents to set scheduled breaks and to view their teen’s blocked contacts.
  • June 2023: We began requiring people to send an invite, and to receive the recipient’s permission, before connecting in DMs. We limit these message request invites to text only, so people can’t send any photos, videos, or voice messages, or make calls, until the recipient has accepted the invite to chat. These changes mean people won’t receive unwanted photos, videos, or other types of media from people they don’t follow.
  • June 2023: We began showing teens a notification when they spend 20 minutes on Facebook, prompting them to take time away from the app and set daily time limits.
  • October 2023: We gave people the option to manually hide comments, to give them even greater control over comments that they may find upsetting or unwelcome, in addition to our Hidden Words tool.
  • November 2023: We announced we were founding members of Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies, allowing them to take action across their respective platforms.
  • November 2023: We brought parental supervision tools to Facebook, allowing parents to:
    • See how much time their teen spends on Facebook.
    • Schedule breaks for their teen and access expert resources on managing their teen’s time online.
  • January 2024: We began hiding more types of age-inappropriate content for teens on Instagram and Facebook.
  • January 2024: We began hiding more results in Instagram Search related to suicide, self-harm and eating disorders. When people search for terms related to these topics, we hide the related results and direct them to expert resources for help.
  • January 2024: We began prompting teens to update their privacy settings on Instagram in a single tap with new notifications.
  • January 2024: We launched new nighttime nudges that show up when teens have spent more than 10 minutes on a particular Instagram surface (e.g., Reels or Instagram Direct) late at night. They’ll remind teens that it’s late, and encourage them to close the app. Note: As of September 2024, this functionality has become part of our new Sleep Mode.
  • January 2024: We announced stricter default message settings for teens under 16 (under 18 in certain countries), meaning only people they follow or people they’re already connected to can message them or add them to group chats.
  • January 2024: Building on Instagram’s existing parental supervision tools, we now prompt parents using supervision to approve or deny their teen’s (under 16) requests to change their default safety and privacy settings to a less strict state – rather than just notifying them of the change.
  • February 2024: We worked with the National Center for Missing & Exploited Children (NCMEC) to expand Take It Down to more countries and languages, allowing more teens to take back control of their intimate imagery and help protect themselves from scammers. We also partnered with Thorn to develop updated tips for teens, parents and teachers on what to do if they’re affected by scammers who seek to exploit their intimate imagery.
  • April 2024: We’re testing new features to help protect young people from sextortion and intimate image abuse. These include:
    • An on-device nudity protection feature, which will blur images detected as containing nudity received in Instagram DMs, encourage people to be cautious when sending sensitive images, and educate people both sending and receiving them about the risks involved. The feature will be on by default for teens under 18.
    • New technology to detect accounts that may potentially be engaging in sextortion and prevent them from interacting with and messaging teens.
    • A new educational pop-up message for people who may have interacted with an account we’ve removed for sextortion.
  • April 2024: We started to share more sextortion-specific signals to Lantern, the platform we helped establish that allows participating tech companies to share signals about predatory accounts and behaviors. This industry cooperation is critical, because predators don’t limit themselves to just one platform – and the same is true of sextortion scammers.
  • June 2024: We introduced our privacy-preserving age verification tools on Facebook to even more countries globally. Now when someone attempts to edit their date of birth from under 18 to over 18 on Facebook, they will be prompted to verify their age either by submitting an ID or by uploading a video selfie.
  • September 2024: We launched Instagram Teen Accounts, a new experience for teens, guided by parents. Teen Accounts have built-in protections which limit who can contact them and the content they see, and also provide new ways for teens to explore their interests. We’ll automatically place teens into Teen Accounts, and teens under 16 will need a parent’s permission to change any of these settings to be less strict.
  • November 2024: We began testing a new feature that allows people, including teens, to reset their content recommendations in Explore, Reels and Feed, if they want to start fresh. This builds on the ways we already help people curate what they see in their recommendations, for example tapping on the three dots of a suggested post and selecting ‘interested’ or ‘not interested’.
  • December 2024: We started rolling out Teen Accounts to new teens joining Instagram in the European Union.
  • March 2025: We announced and began to roll out the Instagram School Partnership Program to help address ongoing concerns about online bullying in schools by giving teachers, educators and administrators an easier way to report instances of teen safety issues directly to Meta. We’ll continue improving this program and encouraging US middle and high schools to enroll, to ensure students can have safer online experiences when returning to school this fall.
  • April 2025: We announced that Instagram Teen Accounts will now include new restrictions for Instagram Live and our nudity protection feature.
  • April 2025: We’ll begin making Teen Accounts available on Facebook and Messenger. Teen Accounts on Facebook and Messenger will offer automatic protections to address unwanted contact, sensitive content, and time spent. We’ll begin rolling Facebook and Messenger Teen Accounts out to teens in the US, UK, Australia and Canada and will bring the experience to teens in other regions soon.
  • April 2025: As we continue to roll out additional protections for Teen Accounts, we want to ensure eligible teens are enrolled. We began testing technology to proactively find accounts we suspect belong to teens in the US, even if they list an adult birthday, and place them in Teen Accounts settings.
  • June 2025: We announced that we’re building new technology to detect ads for nudify apps and sharing signals about these apps with other tech companies so they can take action too.
  • July 2025: We added new safety features to DMs in Teen Accounts to give teens more context about the accounts they’re messaging and help them spot potential scammers. Now, teens will see new options to view safety tips and block an account, as well as the month and year the account joined Instagram, all prominently displayed at the top of new chats.
  • July 2025: We strengthened our protections for accounts run by adults that primarily feature children. These include adults who regularly share photos and videos of their children, and adults – such as parents or talent managers – who run accounts that represent teens or children under 13.
  • September 2025: We expanded the Instagram School Partnership Program to help educators report safety concerns, like bullying, directly to us for quicker review and removal. We also expanded access to our online safety curriculum to teach middle schoolers how to stay safe online, including how to recognize and avoid scams and other types of exploitation.
  • September 2025: We expanded our testing of technology to proactively find accounts we suspect belong to teens in the UK, Australia, and Canada, even if they list an adult birthday, and place them in Teen Account settings.
  • October 2025: We announced that Instagram Teen Accounts will be guided by PG-13 movie ratings by default. This means that teens will see content on Instagram that’s similar to what they’d see in a PG-13 movie. Teens under 18 will be automatically placed into an updated 13+ setting, and they won’t be able to opt out without a parent’s permission. And because we know that all families are different, we’re also introducing a new, stricter setting for parents who prefer a more restrictive experience for their teen.
  • October 2025: We announced that we will be introducing simple, new ways for parents to turn off their teens’ access to one-on-one chats with AI characters, and giving them more insights on how teens are interacting with AI.