
Privacy progress update

Table of contents

How we do it
Governance
Creating a culture of privacy
Regulatory readiness process
Privacy risk identification and assessment
Safeguards and controls
Issues management
Privacy Red Team
Incident management
Third-party oversight
Anti-scraping
Risk review
Technical privacy investments
Privacy product outcomes
Ongoing commitment to privacy

How we do it

Since 2019, we've invested significantly in people, products and technology to continue to evolve our rigorous privacy programme. Everyone at Meta is responsible for upholding the experience and privacy expectations that we provide to our users, and we use technology, policies and processes to build privacy protections into our products. With our entire company acting as the first line of defence, following the guidance of our compliance experts, we have a mature privacy programme that ensures oversight and accountability at an unprecedented scale.
Photo of Chief Privacy Officer of Product Michel Protti on Meta’s Menlo Park campus

"We realised we needed an order of magnitude greater investment in privacy – the summer of 2019 was that moment. The degree of change for Meta has been massive – it's required foundational changes to our people, processes and technical infrastructure, and oversight and accountability. And we continue to invest in protecting people's data as systems, technology and expectations evolve."

– Michel Protti, Chief Privacy and Compliance Officer, Product


Governance

Our work on privacy is underpinned by our internal governance structures that embed privacy and data-use standards across the company’s operations.

Privacy product groups

We embed privacy-focused teams within product groups to champion privacy considerations and provide dedicated expertise in each product group. These teams enable front-line ownership of privacy responsibilities across our products. We have a robust and collaborative process for infusing privacy requirements into product roadmaps across Meta. Twice a year, product teams devote an average of 6-8 weeks to privacy roadmapping, dedicating a portion of their roadmaps to building new and improved privacy protections.

Product risk & compliance organisation

Led by Chief Privacy and Compliance Officer, Product, Michel Protti, the product risk & compliance organisation is made up of dozens of teams, both technical and non-technical, focused on guiding the company on privacy strategies.

The product risk & compliance organisation is at the centre of our company’s commitment to its comprehensive privacy programme. Its mission – to instil responsible practices and enable innovation across Meta – guides this work by ensuring that people understand and trust how Meta products and services use their data responsibly.

The product risk & compliance organisation is just one organisation among many across the company that is responsible for privacy. There are thousands of people in different organisations and roles across Meta, including public policy, legal and product teams, who are working to embed privacy, safety, security and other areas of risk management, into all facets of our company operations. Getting privacy right is a deeply cross-functional effort, and we believe everyone at Meta is responsible for that effort.

Privacy and Data Policy

Led by Erin Egan, Vice President and Chief Privacy Officer, Policy, the Privacy and Data Policy team leads our engagement in the global public discussion around privacy, including new regulatory frameworks, and ensures that feedback from governments and experts around the world is considered in our product design and data use practices.

To do so, the Privacy and Data Policy team consults with these groups through a variety of consultation mechanisms. This includes:

  • Ongoing consultations with experts as a part of our product development process, and to inform our long-term approach to privacy
  • Workshops (including data dialogues) and other convenings on complex privacy questions
  • Funding experts to do "deep dive" consultations with us on how we can improve aspects of our privacy programme
  • Supporting cross-industry initiatives such as Trust, Transparency and Control Labs (TTC Labs) and Open Loop that develop innovative solutions that help us and others build technology responsibly
  • Hosting a regular conversation series with leading privacy experts from around the world to discuss a range of pressing privacy policy topics

External privacy events

We support and participate in industry conferences and other events that promote privacy and share advances in privacy practices. In 2025, we shared our advances in automating purpose limitation in our infrastructure at the IAPP’s Privacy, Security and Risk conference. Erin Egan also discussed risk management and AI with Bojana Bellamy, President of the Centre for Information Policy Leadership (CIPL), and Susan Cooper, Meta’s Global Data Protection Officer, in our Privacy Conversations series, an online discussion in which Erin invites leading privacy experts from around the world to discuss pressing privacy policy topics. These events are opportunities for us to both share with and learn from experts, ensuring that we take their perspectives into account as we build products and policies.

Advisory groups

We also partner with various experts via advisory groups. These groups inform Meta’s decision-making on the most novel and complex questions that we face as we build innovative technologies. We regularly consult with groups focused on:
  • Privacy needs around the world
  • Messaging and encryption
  • Wearables, virtual reality and the metaverse
  • Artificial intelligence
  • Monetisation and ads

Privacy Legal

The Privacy Legal team is embedded in the design and ongoing execution of our programme and advises on legal requirements during the course of our privacy review process.

Audit & Privacy Committee

The Audit & Privacy Committee is an independent committee of our Board of Directors that meets at least quarterly and has responsibility for privacy and product compliance oversight, among other things. The Committee is composed of independent directors with a wealth of experience serving in similar oversight roles. At least once per quarter, they receive briefings on the global policy landscape, the state of our privacy programme, the status of the independent third-party assessment of our privacy programme and more.

Internal audit

Internal audit brings independent assurance on the overall health of our privacy programme and the supporting control framework.
A group of people talking in a work meeting, surrounded by paper, tablets and sticky notes.

Creating a culture of privacy

Part of ensuring that everyone understands their role in protecting privacy at Meta is driving continuous privacy learning and education that spans training and internal privacy awareness campaigns.

Training

A core component of our privacy education approach is delivered through our privacy training. Our privacy training covers the foundational elements of privacy and is designed to help everyone here at Meta feel empowered to identify privacy risks and make responsible decisions that help mitigate them, so we can all take pride in not only what we build, but how we build it. One key theme is ensuring that Meta personnel are aware that they are the first line of defence to guard against privacy risks and to mitigate such risks.

Both our annual privacy training and our privacy training courses for new hires and new contingent workers provide scenario-based examples of privacy considerations aligned with our business operations, and include an assessment to test understanding of the relevant privacy concepts. These trainings are updated and redeployed annually so that they cover new and relevant information alongside the core concepts.

In addition to our foundational privacy training, Meta regularly deploys specialist privacy education to targeted audiences, tailored to evolving topics and risks.

Privacy awareness

Another way we drive company-wide awareness around privacy is through regular communication to employees. In addition to our privacy training courses, we deliver ongoing privacy content through internal communication channels, updates from Meta’s leadership and internal Q&A sessions.

Two people in a corridor having a discussion while looking at a tablet computer.

Regulatory readiness process

We have a dedicated team whose job is to help ensure that the company complies with global privacy and data regulations. To achieve this goal, we've developed a comprehensive, end-to-end process that helps us identify and respond to external regulatory obligations. This process provides the company with a coordinated view into all incoming regulatory obligations and a predictable process to support our responses.

Privacy risk identification and assessment

We’ve created our privacy risk management programme to identify and assess privacy risks related to how we collect, use, share and store user data. We leverage this process to identify risk themes, enhance our privacy programme and prepare for future compliance initiatives.

Safeguards and controls

We’ve designed safeguards, including processes and technical controls, to address privacy risks. As part of this effort, we conduct internal evaluations on both the design and effectiveness of the safeguards for mitigating privacy risk.

Issues management

We’ve established a centralised issue management function to facilitate self-identification and remediation of privacy issues. This process spans the privacy issue management lifecycle, from intake and triage through remediation planning to closure with evidence.

Privacy Red Team

We’ve established a Privacy Red Team whose role is to proactively test our processes and technology to identify potential privacy risks. The Privacy Red Team assumes the role of external or internal parties attempting to circumvent our privacy controls and safeguards, which helps us proactively identify areas where we can improve our control environment.

Incident management

In addition to our robust mitigations and safeguards, we also maintain a process to (1) identify when an event potentially undermines the confidentiality, integrity or availability of data for which Meta is responsible, (2) investigate those situations, and (3) take any needed steps to address gaps we identify.

Our incident management programme operates globally to oversee the processes by which we identify, assess, investigate and remediate privacy incidents. Although the privacy and data practices team leads the incident management process, privacy incidents are everyone's responsibility at Meta. Teams from across the company, including legal and product teams, play vital roles. We continue to invest time, resources and energy in building a multi-layered programme that is constantly evolving and improving, and we highlight three components of our approach below.


Proactive identification

We take a layered approach to protecting people and their information – which includes implementing safeguards designed to catch bugs proactively, before they can become a problem. Given the scale at which we operate, we have invested heavily in building and deploying a wide range of automated tools that are intended to help us identify and remediate potential privacy incidents as early and quickly as possible. These automated systems are designed to detect incidents in real time to facilitate rapid response. The oversight and diligence of our employees always play a critical role in helping to proactively identify and remediate incidents. Our engineering teams regularly review our systems to identify and remediate incidents in a timely manner.
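As a toy illustration of that kind of automated detection, a simple scanner might flag unusually large read volumes in an access log. The threshold, field names and actor names below are invented for the example; they are not Meta's actual detection rules:

```python
from collections import Counter

# Illustrative threshold -- not a real Meta value.
BULK_READ_THRESHOLD = 1000

def detect_bulk_readers(access_log):
    """Flag actors whose total read volume exceeds the threshold: a toy
    stand-in for the automated, real-time detectors described above."""
    reads = Counter()
    for event in access_log:
        if event["action"] == "read":
            reads[event["actor"]] += event.get("records", 1)
    return [actor for actor, total in reads.items() if total > BULK_READ_THRESHOLD]

log = [
    {"actor": "service_a", "action": "read", "records": 12},
    {"actor": "service_b", "action": "read", "records": 5000},  # anomalous bulk read
]
detect_bulk_readers(log)  # → ["service_b"]
```

Real systems layer many such signals and run them continuously over streams of events, but the principle is the same: encode an expectation, then surface anything that violates it for rapid response.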

Bug bounty

Since 2011, we have operated a bug bounty programme in which external researchers help improve the security and privacy of our products and systems by reporting potential security vulnerabilities to us. The researchers' findings may lead to further investigation and, if necessary, remediation by Meta. The programme helps us scale detection efforts and fix issues faster to better protect our community, and the bounties we pay to qualifying participants encourage more high-quality security research.
Over the past 15 years, we’ve awarded over USD 25 million to more than 1,400 researchers from 88 countries for helping us detect and fix issues faster, including testing products and features before they are rolled out to our users.

Transparency

While we’ve adopted a number of protections to guard against privacy incidents (e.g. unauthorised access to personal data), if an incident does occur, we believe that transparency is an important way to rebuild trust in our products, services and processes. Accordingly, beyond fixing and learning from our mistakes, our incident management programme includes steps to notify people where appropriate, such as a post in our Newsroom or our Privacy matters blog about issues affecting our community, or working with law enforcement or other officials to address incidents we find.
A photo of a person in conversation with someone slightly out of focus on the right.

Third-party oversight

Third parties are external partners who do business with Meta, but aren’t owned or operated by Meta. These third parties typically fall into two major categories: those who provide a service for Meta (such as vendors who provide website design support or technology solutions to enable our business) and those who build their businesses around our platform (such as app or API developers). To mitigate privacy risks posed by data and personal information exchanged with third parties, we developed a dedicated third-party oversight and management programme, which is responsible for overseeing third-party risks and implementing appropriate privacy safeguards.


Third-party service providers

We’ve also created a third-party assessment process for service providers to assess and mitigate security and privacy risk. Our process requires that these service providers are also bound by contracts containing privacy protections. Their risk profile determines how they are monitored, reassessed and, where appropriate, which enforcement actions to take as a result of violations, including termination of the engagement.

Third-party developers

As part of Meta’s ongoing commitment to fostering a secure and privacy-centric developer ecosystem, we’ve launched new resources and initiatives designed to support responsible data use and reinforce our platform integrity.
Responsible Platform Initiatives Hub: Our Responsible Platform Initiatives Hub centralises Meta’s approach to privacy and security with our third-party developers, reinforcing our commitment to safe and responsible data practices across our platform. This hub provides resources and comprehensive guidance on our compliance-related programmes, responsible development practices and the tools that help developers align with Meta's privacy standards. Through the hub, developers can access necessary resources related to our policies and responsible platform practices, supporting a safer and more transparent ecosystem.
Data access and renewal: We have a consolidated process to request, manage and renew developer access to data, reinforcing our commitment to simplifying the platform and building new developer tools. Under data access renewal, developers undergo a single annual assessment to confirm continued compliance with Meta's privacy and data protection standards and Platform Terms. This process requires developers to explicitly reaffirm their adherence to Meta’s data protection standards and Terms, and to outline how and why each type of data is used.

Anti-scraping

Our anti-scraping team is dedicated to detecting, investigating and blocking patterns of behaviour associated with unauthorised scraping. Scraping is the automated collection of data from a website or app and can be either authorised or unauthorised. Using automation to access or collect data from Meta’s platforms without our permission is a violation of our Terms of Service.


Prevention and mitigation

We continue to invest in infrastructure and tools to make it harder for scrapers to collect data from our services and more difficult to capitalise off it if they do. Examples of these investments include rate limits and data limits. Rate limits cap the number of times anyone can interact with our products in a given amount of time, while data limits keep people from getting more data than they should need to use our products normally.
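A common way to implement such rate limits is a token bucket: each client gets a budget of tokens that refills over time, and each request spends one. A minimal sketch, with illustrative limits rather than Meta's actual values:

```python
import time

class RateLimiter:
    """Toy token-bucket rate limiter: caps how many requests a client
    may make in a given window. Limits here are illustrative."""

    def __init__(self, max_requests: int, per_seconds: float):
        self.capacity = max_requests
        self.tokens = float(max_requests)
        self.refill_rate = max_requests / per_seconds  # tokens per second
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # request rejected: client exceeded the rate limit

limiter = RateLimiter(max_requests=5, per_seconds=60)
[limiter.allow() for _ in range(7)]  # → first 5 True, then False, False
```

Data limits work the same way but budget the volume of records returned rather than the number of requests.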

Investigation and enforcement

We block billions of suspected unauthorised scraping actions per day across Facebook, Instagram and WhatsApp, and we’ve taken a variety of actions against unauthorised scrapers, including disabling accounts and requesting that companies hosting scraped data delete it.
Several people leaning over a table in a meeting room setting.

Risk review

The risk review process – which includes privacy review – is a central part of developing new and updated products, services and practices at Meta. Through this process, we assess how data will be used and protected as a part of new or updated products, services and practices. We review an average of 1,800 products, features and data practices per month across the company, assessing and mitigating risks before they launch.

Our strategy is focused on embedding hundreds of high-quality, reusable internal requirements into our risk review system designed to meet external expectations globally. An example of a requirement is mandating that when requested, user data be deleted within a certain period of time. This way, product teams can consider privacy risks at the start of the product development process, and easily apply requirements to maintain high standards of compliance.

To help ensure that those requirements are working in practice, we’ve developed consolidated and uniform technical solutions to satisfy a given requirement, with continuous compliance checks to confirm the solution is effective.
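A requirement like timely deletion lends itself to exactly this kind of continuous check. A minimal sketch; the 30-day window, field names and request IDs are invented for the example, since the source does not state the actual period:

```python
from datetime import datetime, timedelta, timezone

# Illustrative window -- the actual deletion period is not specified here.
DELETION_SLA = timedelta(days=30)

def overdue_deletions(requests, now=None):
    """Continuous-compliance style check: return deletion requests that
    are past the allowed window and still unfulfilled."""
    now = now or datetime.now(timezone.utc)
    return [
        r["request_id"]
        for r in requests
        if not r["completed"] and now - r["requested_at"] > DELETION_SLA
    ]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
requests = [
    {"request_id": "A", "requested_at": datetime(2025, 5, 25, tzinfo=timezone.utc), "completed": False},
    {"request_id": "B", "requested_at": datetime(2025, 4, 1, tzinfo=timezone.utc), "completed": False},
]
overdue_deletions(requests, now)  # → ["B"]
```

Running a check like this on a schedule turns a written requirement into an automatically verified one: any violation surfaces as a concrete, actionable finding.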

For newly developed products which may not be covered by internal requirements, such as certain AI products and features, we have a robust process in place to triage critical decisions with senior-level subject matter experts. Decisions are then codified into requirements.

A wide shot of five people in conversation standing in front of a wall of windows.

Technical privacy investments

By embedding privacy into Meta’s tools and processes, teams can think about privacy earlier in the product development lifecycle and more easily and consistently deliver privacy benefits for our users.

We’re investing in technological innovations that scale and improve our approach to privacy. By combining the efficiency and scalability of AI with the nuance and expertise of human judgment, we’re better able to ensure consistent decisions that deliver innovative products.


AI for compliance efforts

We believe that the natural evolution of risk management – historically a manual and time consuming process – leverages advanced AI and automation technology to help deliver more consistent privacy benefits for our users. By automating the rote aspects of risk management, we free up our privacy experts to focus on novel, complex or compounding regulatory challenges throughout product development – not just before production. Our work is predicated on the philosophy that AI and humans each have unique qualifications, and by combining them effectively, we can optimise for the appropriate level of AI-human collaboration each compliance task necessitates.
Tasked with integrating AI and automation specifically for compliance, our teams have delivered breakthrough capabilities that bring greater consistency and effectiveness to our risk management. For example, automating the identification of known risks and their mitigations enforces uniform standards and empowers our experts to make precise risk identification and mitigation decisions. In addition, AI-powered detection mechanisms proactively catch code that may fall out of compliance, safeguarding our operations.

Privacy-aware infrastructure

We continue to advance and build our privacy-aware infrastructure (PAI) – innovative and efficient constructs embedded in Meta infrastructure solutions that enable engineers to more easily address complex privacy requirements as they build products. Privacy-aware infrastructure embeds privacy rules directly into code, helping ensure that requirements are automatically respected. For example, we built policy zones that apply across our infrastructure to address restrictions on data, such as using it only for allowed purposes, providing strong guarantees for limiting the purposes of its processing. PAI continues to be crucial in enforcing complex purpose limitation scenarios while ensuring scalability, reliability and a streamlined developer experience.
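Conceptually, a policy zone attaches a set of allowed purposes to data and checks every access against a declared purpose. A minimal Python sketch of that idea (class, field and purpose names are illustrative; Meta's policy zones operate at the infrastructure level, not as an application-level class):

```python
class PolicyZoneError(Exception):
    """Raised when data is accessed for a purpose outside its policy zone."""

class ZonedData:
    """Toy purpose-limitation wrapper: the value carries its allowed
    purposes, and every read must declare what it is for."""

    def __init__(self, value, allowed_purposes):
        self._value = value
        self._allowed = frozenset(allowed_purposes)

    def read(self, purpose: str):
        # Disallowed purposes are rejected before the value is released.
        if purpose not in self._allowed:
            raise PolicyZoneError(f"purpose {purpose!r} not permitted for this data")
        return self._value

location = ZonedData("52.5,13.4", allowed_purposes={"safety_check"})
location.read("safety_check")     # permitted: returns the value
# location.read("ads_targeting")  # would raise PolicyZoneError
```

Embedding the check in the data-access path, rather than in documentation, is what turns a purpose-limitation policy into a guarantee the code itself enforces.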
A waist-up photo of a person holding a coffee cup and looking at their phone while smiling.

Privacy product outcomes

We’ve become a privacy-first product development company, ensuring that every product or feature considers its privacy implications from inception to delivery. This is part of our commitment to delivering privacy in both new product innovations and updates to existing products. To reinforce this commitment, privacy is a core component of performance evaluation for our engineering teams.

You can see this in many examples of products we’ve delivered:


End-to-end encryption in messaging

Since 2016, personal conversations on WhatsApp have been protected by end-to-end encryption, which means no one outside a user’s chats, not even WhatsApp or Meta, can read, listen to or share them. Starting in 2023, we have also been rolling out default end-to-end encryption for all personal one-to-one chats and calls on Messenger, making them even more private and secure.
This has taken years to deliver because we’ve taken the time to get it right. Our engineers, cryptographers, designers, policy experts and product managers have worked tirelessly to rebuild Messenger features from the ground up. Enabling end-to-end encryption on Messenger meant fundamentally rebuilding many aspects of the application protocols to improve privacy, security and safety, while simultaneously maintaining the features that have made Messenger so popular. Our approach was to leverage prior learnings from both WhatsApp and Messenger’s Secret Conversations (Messenger’s early optional end-to-end encryption offering), and then iterate on our most challenging problems such as multi-device capability, feature support, message history and web support. Along the way, we introduced new privacy, safety and control features such as app lock and delivery controls that let people choose who can message them, while improving existing safety features such as report, block and message requests. We’ve also made it easier to opt in to end-to-end encrypted backups by introducing passkey-encrypted backups, allowing users to encrypt their backups with their face, fingerprint or device pin code – the same way they unlock their device.
At its core, end-to-end encryption is about protecting people’s communications, so they can feel safe expressing themselves with their friends and loved ones. We worked closely with outside experts, academics, advocates and governments to identify risks and build mitigations to ensure that privacy and safety go hand in hand. We commissioned an independent human rights impact assessment and designed our products to prevent harm by design, offered robust user controls and invested in proactively identifying abuse.
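The core property end-to-end encryption provides – that the relaying server only ever handles ciphertext – can be illustrated with a toy one-time-pad exchange. This is illustrative only: WhatsApp and Messenger use the Signal protocol, not this scheme, and a real design must also handle key exchange, authentication and forward secrecy.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

# The shared key exists only on the two endpoints (established out of band).
shared_key = secrets.token_bytes(64)

plaintext = b"see you at 7"
key_stream = shared_key[: len(plaintext)]

ciphertext = xor_bytes(plaintext, key_stream)   # sender encrypts
relayed = ciphertext                            # the server only ever sees this
decrypted = xor_bytes(relayed, key_stream)      # receiver decrypts

assert decrypted == plaintext
```

Because the server stores and forwards only `relayed`, it cannot read, listen to or share the conversation: that is the property "no one outside a user's chats" refers to.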

Wearables

Meta’s AI glasses have built-in Meta AI, and let you snap a photo or record a video from your unique point of view, listen to music or take a call – all without having to pull out your phone, so you don’t have to choose between capturing the moment and experiencing it.
Meta’s AI glasses were built with privacy at their core, showing our commitment to responsible innovation and privacy by design. We’ve incorporated stakeholder feedback – gathered from the moment we launched the first version of these glasses, the Ray-Ban Stories – in meaningful and tangible ways.
  • The capture LED is a visible light that lets others know when AI glasses are being used to take photos and record videos, for the user’s gallery or streaming.
  • The glasses are configured so that users cannot capture photos or videos while the capture LED is fully covered: if the LED is obscured, the camera is disabled and the user is notified to remove the obstruction before proceeding.
  • We published an updated white paper that explains our approach to bystander signalling for active capture and AI features.
  • Wrist sensor data (including EMG data) from the Meta Neural Band is processed locally on user devices to enable control of AI glasses. Users can choose to share their wrist sensor data with Meta, if they are experiencing an issue with their Meta Neural Band. The Meta AI app continues to provide easy access to privacy settings to manage information and additional data sharing with Meta.
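The capture-gating behaviour in the list above amounts to a simple precondition check before the camera is enabled. A toy sketch (field names and message text are illustrative, not the device firmware's actual logic):

```python
def camera_allowed(led_state: dict):
    """Toy capture-gating check: capture is blocked while the capture LED
    is fully covered, and the user is told to uncover it."""
    if led_state["fully_covered"]:
        return (False, "Remove the obstruction covering the capture LED to continue.")
    return (True, None)

camera_allowed({"fully_covered": False})  # → (True, None)
```

Tying capture permission to the LED's state is what makes the bystander signal trustworthy: recording cannot happen without the visible indicator.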

Teen Accounts

Teens are automatically placed into Teen Accounts, and teens under 16 need a parent’s permission to change any of the default built-in protections to be less strict. These Teen Accounts features are currently being rolled out globally across Instagram, Facebook and Messenger.

Meta AI on WhatsApp

We also developed Private Processing for Meta AI on WhatsApp, which is built on top of a Trusted Execution Environment (TEE). Private Processing allows users to direct Meta AI to process their requests in a secure and private cloud environment where no one, not even Meta or WhatsApp, can access their personal messages. This technology now powers certain optional Meta AI features on WhatsApp, such as summarising messages and providing writing help. Once Private Processing has finished responding to a request, the messages won't be stored.

Why am I seeing this ad?

Our “Why am I seeing this ad?” tool continues to help people understand why they’re seeing the ads they do on Facebook and Instagram feeds. One key update was summarising information into topics about how activity both on and off our technologies – such as liking a post on a friend’s Facebook Page or visiting a sports website – may inform the machine learning models we use to shape and deliver ads. We have also continued to introduce new examples and illustrations explaining how our machine learning models connect various topics to show relevant ads.

Meta Content Library and AI Tools

Meta Content Library and Content Library API are research tools that provide qualified individuals with access to publicly accessible content from Facebook, Instagram and Threads. Individuals can search, explore and filter the data through either a graphical user interface or a programmatic API.
Accessible data includes posts from Facebook Pages, groups, events, fundraisers, marketplace listings and profiles, posts and stories from Instagram accounts, fundraisers and messages from channels, and posts from Threads profiles. Details about the content, such as the number of reactions, shares, comments and post view counts, are available as well.
Meta launched a partnership with the Secure Data Access Center (CASD, Le Centre d’Accès Sécurisé aux Données), and as part of our collaboration, CASD will independently review research proposals to access Meta Content Library. Their global leadership ensures that the highest standards of integrity and security are maintained throughout the review process, and that applicants to Meta Content Library will continue to receive an independent and expert-led evaluation. This marks a transition from the University of Michigan’s Social Media Archive (SOMAR) as our independent vetting partner, and we are grateful for the expertise and dedication that SOMAR brought to the application review process.
To coincide with our partnership with CASD, researchers will now submit applications to access Meta Content Library through our newly launched Meta-hosted application portal, Research Tools Manager. This application infrastructure was built to enhance the onboarding and support experience for both new applicants and existing researchers.

Educating people about privacy

Our work to communicate transparently includes providing external education to improve people’s understanding and awareness of our practices and ensuring that information is accessible and easy to find. These resources include:
  • Privacy Policy, which details how we collect, use, share, retain and transfer information, as well as what rights and controls people have over their privacy.
  • Privacy Centre, where people can go to better understand our practices so they can make informed decisions about their privacy in a way that is right for them. Through education and access to privacy and security controls, we address some of the most common privacy concerns from the billions of people who spend their time with us every day. Privacy Centre has several modules, including sharing, collection, use, security, youth, generative AI and ads, to directly connect an issue or concern with the relevant privacy and security controls we’ve built across our apps and services over the years.
  • Data and Privacy Section of Newsroom, where we provide more information about how we’ve approached privacy in the context of particular features or issues.
  • Additional educational materials. We are committed to helping users understand our data practices and regularly introduce new educational resources. We launched a new animated video that clearly explains how organic personalisation works and showcases the privacy controls available to users. The video also introduces the role of AI in enhancing personalisation. By adding this video to our existing personalisation resources, we aim to increase awareness of user controls and reinforce Meta’s dedication to transparency.

Giving people greater access and control over their information

To provide greater transparency and control to people, we’ve developed a number of privacy tools for people to understand what they share and how their information is used, including:
  • Privacy Checkup: Guides people through important privacy and security settings on Facebook. It covers five distinct topics to help people control who can see what they share, understand how their information is used and strengthen their account security.
    • Who can see what you share helps people review who can see their profile information, such as their phone number and email address, as well as their posts.
    • How to keep your account secure helps people strengthen their account security by setting a stronger password and turning on two-factor authentication.
    • How people can find you on Facebook lets people review the ways in which others can look them up on Facebook and who can send them friend requests.
    • Your data settings on Facebook lets people review the information they share with apps they've logged in to with Facebook. They can also remove the apps they no longer use.
    • Your ad preferences on Facebook provides information about how ads work on our Products, lets people decide what profile info advertisers can use to show them ads and lets them control who can see their social interactions, such as likes, alongside ads.
  • Manage activity: Allows people to manage posts in bulk, with filters to help them sort and find the content they are looking for, such as posts with specific people or from a specific date range. It also includes a more permanent deletion control, so people can move old posts in bulk to the bin. Posts sent to the bin are deleted after 30 days, unless people manually delete or restore them before then. This feature is available from the activity log on Facebook and Your activity on Instagram.
  • Accounts Centre: A place to help people manage connected experiences and change account settings across their Facebook, Instagram, WhatsApp and Meta accounts. People can choose to add Facebook, Instagram, WhatsApp and Meta accounts to the same Accounts Centre, enabling them to:
    • Manage connected experiences across the accounts in the same Accounts Centre, such as sharing posts or stories to multiple profiles at once, and logging in across accounts. Learn more.
    • Manage individual settings for each account in Accounts Centre, such as personal details, password and security, and your information and permissions. Learn more.
    • Download your information and Access your information: Within Accounts Centre settings, Meta provides tools to make people’s information on our products useful and easy to find, such as activity log, Access your information and Download your information.
    • Manage settings across all accounts at once if they are in the same Accounts Centre, such as ad preferences and activity off-Meta technologies (which provides a summary of activity that businesses and organisations share with us about people's interactions, such as visiting their apps or websites, and gives people the option to disconnect their past activity from their account). Learn more.
    • For WhatsApp users, adding their WhatsApp account to an Accounts Centre is optional and can unlock connected experiences across apps, such as seamlessly sharing their WhatsApp Status to Instagram or Facebook Stories. If they choose this option, their personal messages and calls are still protected with end-to-end encryption, meaning no one outside the chat, not even WhatsApp or Meta, can read, listen to or share them.
  • Meta Quest and Meta Horizon:
    • Social privacy settings: You can choose how social you want to be and how your information is shared with others through your privacy settings. You can make your Meta Horizon profile public or private, your Active Status allows you to control whether to share when you're online or were recently online, and Hide app activity lets you hide your activity for specific apps. These privacy settings are simple and easily accessible from your Quest headset or the Horizon app.
    • Privacy indicator and permissions history: The privacy indicator lets you know when certain privacy-related features are in use – and which of the apps in your library are currently using them. These include your microphone, location, spatial data, eye tracking (Quest Pro only) and natural facial expressions (Quest Pro only). The permissions history feature also shows which of your installed apps have accessed your microphone, location, spatial data, eye tracking (Quest Pro only), natural facial expressions (Quest Pro only) and storage data over the last seven days.
    • Parent-managed accounts: Parents can set up parent-managed Meta accounts for 10-12 year olds (ages may vary depending on location), allowing pre-teens to access a vast array of engaging and educational content in VR – with age-appropriate protections built specifically for them.

Ongoing commitment to privacy

We’re invested in privacy and are committed to continuous improvement.

"Getting privacy right is a continual, collective investment across our company, and is the responsibility of everyone at Meta to advance our mission."

– Michel Protti, Chief Privacy and Compliance Officer, Product

Protecting users’ data and privacy is essential to our business and our vision for the future. To do so, we’re continually refining and improving our privacy programme and our products, as we respond to evolving expectations and technological developments – working with policy makers and data protection experts to find solutions to unprecedented challenges – and sharing our progress as we do.
