
Privacy Progress Update

Since 2019, we’ve invested more than $8 billion in building and continuing to evolve our rigorous privacy program.

01. How We Do It

Today, over 3,000 people across the company focus primarily on privacy. At the heart of this investment are the teams, technology and processes that build privacy protections into our products and create the oversight and accountability for a mature privacy program at an unprecedented scale.


“We realized we needed an order of magnitude greater investment in privacy—the summer of 2019 was that moment. The degree of change for Meta has been massive—it's required foundational changes to our people, processes, and technical infrastructure, and oversight and accountability. And we continue to invest in protecting people's data as systems, technology, and expectations evolve.”


—Michel Protti, Chief Privacy and Compliance Officer, Product

Governance

Our work on privacy is underpinned by internal governance structures that embed privacy and data-use standards across the company’s operations.

As we continue to integrate privacy across the company, we’ve embedded privacy teams within product groups, providing dedicated expertise and deepening each group’s understanding of privacy considerations. These teams enable front-line ownership of privacy responsibilities across our products. We have a robust, collaborative process for building privacy requirements into product roadmaps across Meta: twice a year, product teams devote an average of 6-8 weeks to privacy roadmapping, dedicating a portion of their roadmaps to new and improved privacy protections.

Led by Chief Privacy and Compliance Officer, Product, Michel Protti, the Product Compliance and Privacy team is made up of dozens of teams, both technical and non-technical, focused on guiding the company on privacy strategies.

The Product Compliance and Privacy Team is at the center of our company’s efforts to maintain a comprehensive privacy program. Its mission is to instill responsible data practices and enable innovation across Meta, so that people can understand and trust how Meta products and services use their data.

The Product Compliance and Privacy Team is just one organization among many across the company that is responsible for privacy. There are thousands of people in different organizations and roles across Meta, including public policy, legal and product teams, who are working to embed privacy into all facets of our company operations. Getting privacy right is a deeply cross-functional effort, and we believe everyone at Meta is responsible for that effort.

Led by Erin Egan, Vice President and Chief Privacy Officer, Policy, the Privacy and Data Policy team leads our engagement in the global public discussion around privacy, including new regulatory frameworks, and ensures that feedback from governments and experts around the world is considered in our product design and data use practices.

To do so, the Privacy and Data Policy team engages these groups through a variety of consultation mechanisms. In 2024, these included:

  • Ongoing consultations with experts as a part of our product development process, and to inform our long-term approach to privacy
  • Workshops (including Data Dialogues) and other convenings on complex privacy questions
  • Funding experts to do "deep dive" consultations with us on how we can improve aspects of our privacy program
  • Supporting cross-industry initiatives like Trust, Transparency and Control Labs (TTC Labs) and Open Loop that develop innovative solutions that help us and others build technology responsibly
  • Hosting a regular conversation series and an annual Global Innovation and Policy Flyout with leading privacy experts from around the world to discuss a range of pressing privacy policy topics

We support and participate in industry conferences and other events that promote privacy and share advances in privacy practices. In 2024, we shared our advances in privacy-enhancing technologies at USENIX’s Conference on Privacy Engineering Practice and Respect, and governance and accountability best practices at IAPP conferences with privacy and governance professionals around the world. These are opportunities for us to both share with and learn from experts, to ensure we’re taking their perspectives into account as we build products and policies.

We also partner with various experts via advisory groups, which we expanded in 2024. These groups are created to inform Meta’s decision making on the most novel and complex questions we face as we build innovative technologies, and in 2024 we consulted them on a range of these questions.

The Privacy Legal team is embedded in the design and ongoing execution of our program and advises on legal requirements during the course of our privacy review process.

The Privacy and Product Compliance Committee is an independent committee of our Board of Directors that meets at least quarterly and has responsibility for privacy and product compliance oversight. The Committee is composed of independent directors with a wealth of experience serving in similar oversight roles. At least once per quarter, they receive briefings on, among other things, the global policy landscape, the state of our privacy program, and the status of the independent third-party assessment of our privacy program.

Internal Audit brings independent assurance on the overall health of our privacy program and the supporting control framework.

Creating a Culture of Privacy

Part of ensuring that everyone understands their role in protecting privacy at Meta is driving continuous privacy learning and education that spans training and internal privacy awareness campaigns.

A core component of our privacy education approach is delivered through our privacy training. Our privacy training covers the foundational elements of privacy and is designed to help everyone at Meta feel empowered to identify privacy risks and make responsible decisions that help mitigate them, so we can all take pride not only in what we build, but in how we build it. One key theme is that Meta personnel are the first line of defense in identifying and mitigating privacy risks.

Delivered in an eLearning format, our annual privacy training and our courses for new hires and new contingent workers provide scenario-based examples of privacy considerations aligned with our business operations, along with an assessment that tests understanding of the relevant privacy concepts. These trainings are updated and redeployed annually so that core concepts stay paired with current, relevant information.

Alongside our required foundational privacy training, we maintain a catalog of all privacy training deployed across Meta, spanning a range of topics and risks.

Another way we drive company-wide awareness around privacy is through regular communication to employees. In addition to our privacy training courses, we deliver ongoing privacy content through internal communication channels, updates from Meta’s leadership, internal Q&A sessions, and a dedicated Privacy Day.

We leverage tentpole moments like Data Privacy Day to drive cross-company focus on privacy, featuring talks from Meta leaders, and highlighting key privacy concepts and priorities through engaging content and events.

Regulatory Readiness Process

We have a dedicated team whose job is to help ensure the company complies with global privacy and data regulations. To achieve this goal, we've developed a comprehensive, end-to-end process that helps us identify and respond to external regulatory obligations. This process provides the company with a coordinated view into all incoming regulatory obligations and a predictable process to support our responses.

Privacy Risk Identification and Assessment

We’ve created our Privacy Risk Management program to identify and assess privacy risks related to how we collect, use, share, and store user data. We leverage this process to identify risk themes, enhance our privacy program, and prepare for future compliance initiatives.

Safeguards and Controls

We’ve designed safeguards, including processes and technical controls, to address privacy risks. As a part of this effort, we conduct internal evaluations on both the design and effectiveness of the safeguards for mitigating privacy risk.

Issues Management

We’ve established a centralized Issue Management function to facilitate self-identification and remediation of privacy issues. This process spans the privacy issue management lifecycle, from intake and triage through remediation planning to closure with evidence.

Privacy Red Team

We’ve established a privacy red team whose role is to proactively test our processes and technology to identify potential privacy risks. The Privacy Red Team assumes the role of external or internal parties attempting to circumvent our privacy controls and safeguards, which helps us proactively identify areas where we can improve our control environment.

Incident Management

No matter how robust our mitigations and safeguards, we also need a process to (1) identify when an event potentially undermines the confidentiality, integrity, or availability of data for which Meta is responsible, (2) investigate those situations, and (3) take any needed steps to address gaps we identify.

Our Incident Management program operates globally to oversee the processes by which we identify, assess, mitigate, and remediate privacy incidents. Although the Privacy and Data Practices team leads the incident management process, privacy incidents are everyone’s responsibility at Meta. Teams from across the company, including legal and product teams, play vital roles. We continue to invest time, resources, and energy in building a multi-layered program that is constantly evolving and improving and we highlight three components of our approach below.

We take a layered approach to protecting people and their information—which includes implementing safeguards designed to catch bugs proactively, before they can become a problem. Given the scale at which we operate, we have invested heavily in building and deploying a wide range of automated tools that are intended to help us identify and remediate potential privacy incidents as early and quickly as possible. These automated systems are designed to detect incidents in real time to facilitate rapid response.

Of course, no matter how capable our automated systems become, the oversight and diligence of our employees always plays a critical role in helping to proactively identify and remediate incidents. Our engineering teams regularly review our systems to identify and fix incidents before they can impact people.

Since 2011, we have operated a bug bounty program in which external researchers help improve the security and privacy of our products and systems by reporting potential security vulnerabilities to us. The researchers’ findings may lead to further investigation and, if necessary, remediation by Meta. The program helps us scale detection efforts and fix issues faster to better protect our community, and the bounties we pay to qualifying participants encourage more high-quality security research.

Over the past 10 years, more than 50,000 researchers have joined this program, and around 1,500 researchers globally have been awarded bounties.

While we’ve adopted a number of protections to guard against privacy incidents like unauthorized access to data, if an incident does occur, we believe that transparency is an important way to rebuild trust in our products, services, and processes. Accordingly, beyond fixing and learning from our mistakes, our Incident Management program includes steps to notify people where appropriate, such as posting in our Newsroom or Privacy Matters blog about issues impacting our community, or working with law enforcement or other officials to address incidents we find.

Third-Party Oversight

Third parties are external partners who do business with Meta but aren’t owned or operated by Meta. These third parties typically fall into two major categories: those who provide a service for Meta (like vendors who provide website design support) and those who build their businesses around our platform (like app or API developers). To mitigate privacy risks posed by data and personal information exchanged with third parties, we developed a dedicated third-party oversight and management program responsible for overseeing third-party risks and implementing appropriate privacy safeguards.

We’ve also created a third-party privacy assessment process to assess and mitigate the privacy risk posed by service providers. Our process requires that these service providers are also bound by contracts containing privacy protections. A service provider’s risk profile determines how it is monitored and reassessed and, where violations occur, which enforcement actions we take, up to and including termination of the engagement.

As part of Meta’s ongoing commitment to fostering a secure and privacy-centric developer ecosystem, we’ve launched new resources and initiatives designed to support responsible data use and reinforce our platform integrity.

Responsible Platform Initiatives Hub: We’ve launched a new Responsible Platform Initiatives Hub to centralize Meta’s approach to privacy and security with our third party developers, reinforcing our commitment to safe and responsible data practices across our platform. This hub provides resources and comprehensive guidance on our compliance-related programs, responsible development practices, and the tools that help developers align with Meta’s privacy standards. Through the hub, developers can access necessary resources related to our policies and responsible platform practices, supporting a safer and more transparent ecosystem.

Data Access and Renewal: We introduced a new consolidated process to request, manage and renew developer access to data, reinforcing our commitment to simplifying the platform and building new developer tools. Under data access renewal, developers undergo a single annual assessment to confirm continued compliance with Meta’s privacy and data protection standards and Platform Terms. The process requires developers to explicitly reaffirm their adherence to these standards and terms, and to outline how and why each type of data is used. By consolidating the developer experience, we aim to streamline compliance for developers and deliver reliable security and privacy to their end users.

Anti-Scraping

Our anti-scraping team is dedicated to detecting, investigating and blocking patterns of behavior associated with unauthorized scraping. Scraping is the automated collection of data from a website or app and can be either authorized or unauthorized. Using automation to access or collect data from Meta’s platforms without our permission is a violation of our terms of service.

We continue to invest in infrastructure and tools to make it harder for scrapers to collect data from our services and more difficult to capitalize off of it if they do. Examples of these investments include rate limits and data limits. Rate limits cap the number of times anyone can interact with our products in a given amount of time, while data limits keep people from getting more data than they should need to use our products normally.
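The rate limits described above can be sketched with a standard token-bucket algorithm. The class below is an illustrative assumption, not Meta’s actual implementation, and the capacity and window values are invented for the example.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allow at most `capacity` actions
    per `window` seconds, with tokens refilling continuously."""

    def __init__(self, capacity: int, window: float):
        self.capacity = capacity
        self.refill_rate = capacity / window   # tokens regained per second
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False   # caller has exceeded the rate limit

# A client limited to 5 requests per minute: in a rapid burst of 6 calls,
# the first 5 are allowed and the 6th is throttled.
bucket = TokenBucket(capacity=5, window=60.0)
results = [bucket.allow() for _ in range(6)]
```

A data limit works the same way, with a byte or record budget in place of a request count.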

Because unauthorized scraping often involves guessing or purchasing user and content identifiers, we generate those identifiers internally. We also use new pseudonymized identifiers that help deter unauthorized scraping by making it harder for scrapers to guess, connect, and repeatedly access data.
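One common way to make identifiers unguessable and unlinkable, sketched below purely as an illustration, is to derive a per-surface pseudonymous ID with an HMAC over a server-side secret: raw, sequential IDs never leave the server, and the same user’s IDs on different surfaces cannot be correlated. The function name, key, and surface labels are hypothetical.

```python
import hmac
import hashlib

# Assumption: a secret key held server-side and periodically rotated.
SECRET_KEY = b"server-side-secret"

def pseudonymize(user_id: int, surface: str) -> str:
    """Derive a stable, unguessable ID for one user on one surface."""
    msg = f"{surface}:{user_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

a = pseudonymize(12345, "feed")
b = pseudonymize(12345, "search")
# The two IDs are stable per surface but unlinkable across surfaces,
# and sequential user IDs can no longer be enumerated by counting up.
```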

We’ve blocked billions of suspected unauthorized scraping actions per day across Facebook and Instagram, and we’ve taken a variety of actions against unauthorized scrapers including disabling accounts and requesting that companies hosting scraped data delete it.

Privacy Review

The Privacy Review process is a central part of developing new and updated products, services, and practices at Meta. Through this process, we assess how data will be used and protected as a part of new or updated products, services and practices. We review an average of 1,400 products, features and data practices per month across the company before they ship to assess and mitigate privacy risks.

Komal Lahiri, VP Privacy Review

“Billions of people trust us with their privacy every day. Privacy Review is key to honoring that trust and helps ensure that we innovate responsibly. Our primary goal is to show our users and regulators that we’re meeting our privacy obligations and getting this right.”


—Komal Lahiri, VP, Governance, Risk, and Compliance, Product

As a part of the process, a cross-functional team of privacy experts evaluates potential privacy risks associated with a project and determines if there are any changes that need to happen before project launch to mitigate those risks. If there is a disagreement on the assessment of applicable risks or the proposed product mitigations, the process requires teams to escalate to product and policy leadership and ultimately the CEO for further evaluation and decision.

The development of our new or modified products, services or practices through the Privacy Review process is guided by our internal privacy expectations, which include:

  1. Purpose Limitation: Process data only for a limited, clearly stated purpose that provides value to people.
  2. Data Minimization: Collect and create the minimum amount of data required to support clearly stated purposes.
  3. Data Retention: Keep data for only as long as it is actually required to support clearly stated purposes.
  4. External Data Misuse: Protect data from abuse, accidental loss, and access by unauthorized third parties.
  5. Transparency and Control: Communicate product behavior and data practices proactively, clearly and honestly. Whenever possible and appropriate, give people control over our practices.
  6. Data Access and Management: Provide people the ability to access and manage the data that we have collected or created about them.
  7. Fairness: Build products that identify and mitigate risk for vulnerable populations, and ensure value is created for people.
  8. Accountability: Maintain internal process and technical controls across our decisions, products and practices.
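As an illustration of expectations 2 and 3 above, a retention sweep can tie each record’s lifetime to the purpose it was collected for. The purposes and retention windows below are invented for the example; they are not Meta’s actual policies.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical purpose -> maximum-age table backing a deletion sweep.
RETENTION = {
    "fraud_detection": timedelta(days=90),
    "analytics": timedelta(days=30),
}

def is_expired(created_at: datetime, purpose: str, now: datetime) -> bool:
    """A record older than its purpose's window is eligible for deletion."""
    return now - created_at > RETENTION[purpose]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
assert is_expired(datetime(2024, 1, 1, tzinfo=timezone.utc), "analytics", now)
assert not is_expired(now - timedelta(days=10), "analytics", now)
```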

We’ve also invested in verification reviews and a centralized platform to support operating the Privacy Review process at scale:

  • Centralized Platform: Our platform is a one-stop-shop for managing all aspects of the Privacy Review process. This central repository enables teams to efficiently manage their reviews, requirements, and privacy decisions.
  • Verification Reviews: Privacy Review includes a technical implementation review phase to analyze, verify, and document the technical implementation of privacy mitigations. This process is integrated with our software development tools, helping ensure that what was agreed upon in Privacy Review is actually implemented. Automating this verification step lets us uphold our commitment to user privacy consistently.
02. Technical Privacy Investments

We’re investing in technological innovations that scale and improve our approach to privacy. By embedding privacy into Meta’s tools and processes, teams can consider privacy earlier in the product development lifecycle and deliver privacy benefits for our users.

“Investing in infrastructure helps ensure that privacy is inherent in everything we build. It enables us to continue building innovative, valuable products for people in a privacy-safe way.”

—Michel Protti, Chief Privacy and Compliance Officer, Product


We've made significant strides in enhancing our Privacy Review process through technology investment. Specifically, we’ve introduced a product-level decision framework that has improved consistency and standardization across our reviews. This framework has enabled us to automate and streamline the review process by leveraging past reviews on similar projects, and embed privacy controls directly into our engineering tools. These changes have enabled us to review over 1,400 launches monthly while maintaining consistent privacy standards.

We continue to advance and build our privacy-aware infrastructure (PAI): innovative, efficient constructs embedded in Meta infrastructure solutions that enable engineers to more easily address complex privacy requirements as they build products. PAI embeds privacy rules directly into code, helping ensure requirements are automatically respected. For example, we built Policy Zones, which apply across our infrastructure to enforce restrictions on data, such as using it only for allowed purposes, providing strong guarantees of purpose limitation. PAI continues to be crucial in enforcing complex purpose-limitation scenarios while ensuring scalability, reliability, and a streamlined developer experience.
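The purpose-limitation checks that such infrastructure provides can be illustrated, in a much-simplified form, by annotating data with its allowed purposes and verifying every access against that annotation. The classes and purpose names below are hypothetical, not Meta’s API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AnnotatedData:
    """A value carrying the purposes it may be processed for."""
    value: object
    allowed_purposes: frozenset

class PurposeViolation(Exception):
    pass

def read(data: AnnotatedData, purpose: str):
    """Gate every access on the purpose the caller declares."""
    if purpose not in data.allowed_purposes:
        raise PurposeViolation(f"{purpose!r} is not an allowed purpose for this data")
    return data.value

# Location data annotated as usable only for a safety feature:
location = AnnotatedData("40.7,-74.0", frozenset({"safety_check"}))
read(location, "safety_check")        # permitted
# read(location, "ads_targeting")     # raises PurposeViolation
```

In a production system the annotation would propagate automatically as data flows between services; here it is checked only at the point of read.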

We’re proactively reducing the amount of user data that we collect and use by deploying innovative tools and technology across Meta. We continue to invest in privacy-enhancing technologies (PETs): technologies based on advanced cryptographic and statistical techniques that help minimize the data we collect, process and use. We have also been working to open source this work where it is useful to the broader ecosystem, including PETs for AI through PyTorch. Additionally, our investments in PETs helped enable a new cryptographic security feature on WhatsApp, based on key transparency, that helps verify that your connection and conversation are secure. The feature reduces the possibility of a third party impersonating the person or business a user wants to connect and share encrypted messages with: it checks the validity of the public keys in a conversation against a server-side directory that stores public keys with user information, and provides a publicly available, privacy-preserving audit record so that anyone can verify the data in the directory has not been deleted or modified.
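The directory check at the core of key transparency can be sketched with a Merkle inclusion proof: the client recomputes the tree root from its own entry plus the sibling hashes the server supplies, and compares the result to the published root. This toy two-leaf example is an illustrative assumption only; it omits the real system’s auditable history and the privacy protections applied to directory entries.

```python
import hashlib

def leaf_hash(user: str, pubkey: bytes) -> bytes:
    # Domain-separate leaves from interior nodes to prevent confusion attacks.
    return hashlib.sha256(b"leaf|" + user.encode() + b"|" + pubkey).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(b"node|" + left + right).digest()

def verify_inclusion(user, pubkey, proof, root) -> bool:
    """proof: list of (sibling_hash, sibling_is_left) from leaf toward root."""
    h = leaf_hash(user, pubkey)
    for sibling, is_left in proof:
        h = node_hash(sibling, h) if is_left else node_hash(h, sibling)
    return h == root

# Build a toy two-leaf directory and check one entry against its root.
alice = leaf_hash("alice", b"pkA")
bob = leaf_hash("bob", b"pkB")
root = node_hash(alice, bob)
assert verify_inclusion("alice", b"pkA", [(bob, False)], root)
assert not verify_inclusion("alice", b"pk_forged", [(bob, False)], root)
```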

Similarly, we developed a framework for code and asset removal which guides engineers through deprecating a product safely and efficiently. Deprecating products is a complex feat involving internal and external dependencies, including dependencies on other Meta products that may not themselves be in scope for removal. To address this, our Systematic Code and Asset Removal Framework (SCARF) includes a workflow management tool that saves engineers time by identifying dependencies as well as the correct order of tasks for cleaning up a product. In addition, SCARF includes subsystems for safely removing dead code as well as unused data types.

SCARF powers thousands of human-led deprecation projects alongside the millions of code and data assets it has cleaned automatically. It is additionally useful for our privacy teams, who use the tool to monitor progress of ongoing product deprecations and ensure that they are completed in a timely manner.
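The dependency-ordered cleanup SCARF performs can be sketched as a reverse topological sort of the asset graph: an asset is only deleted once everything that depends on it is gone. The asset names below are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical deprecation: asset -> the assets it depends on.
deps = {
    "mobile_app": {"api"},
    "api": {"database", "photo_store"},
    "database": set(),
    "photo_store": set(),
}

# static_order() yields dependencies before dependents (a build order);
# safe deletion is the reverse: dependents are removed first.
build_order = list(TopologicalSorter(deps).static_order())
deletion_order = list(reversed(build_order))
# e.g. mobile_app is removed first; database and photo_store go last.
```

A real dependency graph would also include assets outside the deprecation’s scope, which bound what can be removed rather than being scheduled for deletion themselves.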

03. Privacy Product Outcomes

We’ve become a privacy-first product development company, ensuring that every product or feature considers its privacy implications from inception to shipping. This is part of our commitment to delivering privacy in both new product innovations and updates to existing products. To reinforce this commitment, privacy is a core component of performance evaluation for our engineering teams.



You can see this in many examples of products we’ve shipped:

Since 2016, personal conversations on WhatsApp have been protected by end-to-end encryption, which means no one outside of a user’s chats, not even WhatsApp or Meta, can read or listen to them. In 2023, we began to roll out default end-to-end encryption for all personal one-to-one chats and calls on Messenger, making them even more private and secure. This ensures that no one sees your messages except you and the people you’re chatting with.

This has taken years to deliver because we’ve taken the time to get it right. Our engineers, cryptographers, designers, policy experts and product managers have worked tirelessly to rebuild Messenger features from the ground up. Enabling end-to-end encryption on Messenger meant fundamentally rebuilding many aspects of the application protocols to improve privacy, security, and safety while simultaneously maintaining the features that have made Messenger so popular. Our approach was to leverage prior learnings from both WhatsApp and Messenger’s Secret Conversations (Messenger’s early, optional end-to-end encryption offering), and then iterate on our most challenging problems, like multi-device capability, feature support, message history, and web support. Along the way, we introduced new privacy, safety and control features like app lock and delivery controls that let people choose who can message them, while improving existing safety features like report, block and message requests. On WhatsApp, where all personal chats and calls are end-to-end encrypted by default, we launched a novel encrypted storage system, built using WhatsApp’s Auditable Key Directory, that gives users the ability to securely save and restore their WhatsApp contacts.

At its core, end-to-end encryption is about protecting people’s communications, so they can feel safe expressing themselves with their friends and loved ones. We worked closely with outside experts, academics, advocates and governments to identify risks and build mitigations to ensure that privacy and safety go hand-in-hand. We commissioned an independent human rights impact assessment and designed our products to prevent harm by design, offered robust user controls and invested in proactively identifying abuse.

Ray-Ban Meta glasses let you snap a photo or record a video from your unique point of view, listen to music or take a call—all without having to pull out your phone. Ray-Ban Meta glasses have been redesigned with a higher quality camera, improved audio and microphone systems, and new features, such as livestreaming and built-in Meta AI, so you don’t have to choose between capturing the moment and experiencing it.

Ray-Ban Meta glasses were built with privacy at their core, and serve as a clear proof point of our commitment to responsible innovation and privacy by design. We’ve incorporated stakeholder feedback, gathered from the moment we launched Ray-Ban Stories, in meaningful and tangible ways.

  • The capture LED is now more noticeable, with a differentiated signaling pattern (solid to blinking) for longer-duration capture such as video recording and livestreaming.
  • We also introduced a tamper-detection feature that prevents recording while the capture LED is covered. If the LED is fully obscured, the camera is disabled and the user is notified to remove the obstruction before proceeding.
  • The Meta View companion app continues to provide easy access to privacy settings to manage information and additional data sharing with Meta.

We launched Instagram Teen Accounts, a new experience for teens guided by parents, which will also come to other Meta platforms. Teen Accounts have built-in protections that limit who can contact teens and the content they see. Teens are automatically placed into Teen Accounts, and teens under 16 need a parent’s permission to make any of the default built-in protections less strict.

  • Private accounts: Accounts for teens under 16 default to private, so teens must accept new followers, and people who don’t follow them can’t see their content or interact with them.
  • Messaging restrictions: Teens are placed into the strictest messaging settings and can only be messaged by people they follow or are already connected to.
  • Sensitive content restrictions: Teens are automatically placed in the most restrictive setting for sensitive content control, which limits the type of sensitive content teens see in places like Explore and Reels.
  • Limited interactions: Teens can only be tagged or mentioned by people they follow.

Supervision Features for Parents: While parents can’t read their teen’s messages, they can see who their teen has messaged in the past seven days.

More details available here.

We launched new generative AI features, including AI stickers, image editing with AI, our AI assistant Meta AI available across our apps, and 28 new AI characters. We’ve updated the Privacy Center guide on Generative AI and other transparency resources so people have information on how we build our AI models, how our features work, and what options and data rights they have in their region.

Our “Why am I seeing this?” tool continues to help people understand why they’re seeing the ads they do on Facebook and Instagram feeds. One key update summarized into topics how activity both on and off our technologies, such as liking a post on a friend’s Facebook page or visiting a sports website, may inform the machine learning models we use to shape and deliver ads. We also introduced new examples and illustrations explaining how our machine learning models connect various topics to show relevant ads. Additionally, we introduced more ways for users to find our ads controls, making Ads Preferences accessible from additional pages in the “Why am I seeing this ad?” tool.

Meta Content Library and Content Library API are research tools that provide qualified individuals with access to publicly available content from Facebook, Instagram and Threads. Individuals can search, explore and filter the data through either a graphical user interface or a programmatic API.

Accessible data includes posts from Facebook Pages, groups, events, and profiles, posts from Instagram accounts and posts from Threads profiles. Details about the content, such as the number of reactions, shares, comments and post view counts are available as well.

Meta partnered with the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan to share public data from Meta technologies in a responsible, privacy-preserving way. This partnership is enabled through ICPSR’s industry-leading Social Media Archive (SOMAR) initiative. SOMAR independently processes and reviews applications for access to Meta Content Library and Content Library API.

Our work to communicate transparently includes providing external education to improve people’s understanding and awareness of our practices and ensuring information is accessible and easy to find.

  • Privacy Policy, which details how we collect, use, share, retain, and transfer information, as well as what rights and controls people have over their privacy.
  • Privacy Center, where people can go to better understand our practices so they can make informed decisions about their privacy in a way that is right for them. Through education and access to privacy and security controls, we address some of the most common privacy concerns from the billions of people who spend their time with us every day. Privacy Center has several modules, including sharing, collection, use, security, youth, generative AI, and ads, to directly connect an issue or concern with the relevant privacy and security controls we’ve built across our apps and services over the years.
  • Data and Privacy Section of Newsroom, where we provide more information about how we’ve approached privacy in the context of particular features or issues.

To provide greater transparency and control, we’ve developed a number of privacy tools that help people understand what they share and how their information is used, including:

  • Privacy Checkup: Guides people through important privacy and security settings on Facebook. Its five distinct topics help people control who can see what they share, how their information is used and how to strengthen their account security.
    • Who Can See What You Share helps people review who can see their profile information, like their phone number and email address, as well as their posts.
    • How to Keep Your Account Secure helps people strengthen their account security by setting a stronger password and turning on two-factor authentication.
    • How People Can Find You on Facebook lets people review the ways others can look them up on Facebook and who can send them friend requests.
    • Your Data Settings on Facebook lets people review the information they share with apps they’ve logged into with Facebook. They can also remove the apps they no longer use.
    • Your Ad Preferences on Facebook provides information about how ads work on our Products, lets people decide what profile info advertisers can use to show them ads, and lets them control who can see their social interactions, such as likes, alongside ads.
  • Manage Activity: Allows people to manage posts in bulk, with filters to help them sort and find the content they are looking for, like posts with specific people or from a specific date range. It also includes a deletion control that provides a more permanent option: people can move old posts in bulk to the trash, where they will be deleted after 30 days unless they manually delete or restore them before then.
  • Accounts Center: A place to help people manage connected experiences and change account settings across their Facebook, Instagram and Meta accounts. People can choose to add Facebook, Instagram and Meta accounts to the same Accounts Center, enabling them to:
    • Manage connected experiences across the accounts in the same Accounts Center, such as sharing posts or stories to multiple profiles at once, and logging in across accounts. Learn more.
    • Manage individual settings for each account in Accounts Center, such as personal details, password & security, and your information & permissions. Learn more.
      • Download Your Information and Access Your Information: Within Accounts Center settings, Meta provides tools to make people’s information on our products useful and easy to find, such as Activity Log, Access Your Information and Download Your Information. To continually enhance these tools, in February 2024 we began including Data Logs in Download Your Information, which provide people with more data about how they use our products. Data Logs include information about content they’ve viewed on Facebook, interactions off of Meta technologies and things we’ve recommended based on their activity.
    • Manage settings across all accounts at once if they are in the same Accounts Center, such as ad preferences and activity off-Meta technologies (which provides a summary of activity that businesses and organizations share with us about people’s interactions, such as visiting their apps or websites, and gives people the option to disconnect their past activity from their account). Learn more.
  • Meta Quest and Meta Horizon:
    • Privacy settings: You can choose how social you want to be and how your information is shared with others through your privacy settings, which you can manage in the Meta Horizon app and on Meta Quest; the settings you choose in either place also apply in Worlds. You can make your Meta Horizon profile public or private, use Active Status to control whether to share when you’re online or were recently online, and use Hide app activity to hide your activity for specific apps. Social privacy settings on Meta Horizon OS are simple and easily accessible from Quest’s settings controls or Worlds’ settings controls.
    • Sensor Lock For Quest: The new sensor lock feature cuts power to your headset’s external cameras and microphones in certain situations, like when the headset goes to sleep. Those sensors stay powered off until you physically press the headset’s power button, so you stay in control. Sensor lock started as an optional feature on Quest 3 and will be included in all future models. On prior Quest devices, the external cameras and microphones are turned off in software rather than powered off directly.
    • Parent-managed Accounts: Parents can set up parent-managed Meta accounts for 10-12 year olds (ages may vary), allowing preteens to access a vast array of engaging and educational content in VR – with age-appropriate protections built specifically for them.
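The 30-day trash window described under Manage Activity above amounts to a simple retention rule, sketched below. The `purge_due` helper and the exact boundary behavior (deletion on or after day 30) are assumptions for illustration, not Meta's implementation.

```python
from datetime import date, timedelta

# From the Manage Activity description: trashed posts are deleted
# after 30 days unless manually deleted or restored sooner.
TRASH_RETENTION = timedelta(days=30)

def purge_due(trashed_on: date, today: date) -> bool:
    """Return True once a trashed post has passed the 30-day window.

    Assumes deletion becomes due exactly 30 days after trashing; the
    real boundary behavior is not specified in the text above.
    """
    return today - trashed_on >= TRASH_RETENTION

print(purge_due(date(2024, 1, 1), date(2024, 1, 30)))  # → False (day 29)
print(purge_due(date(2024, 1, 1), date(2024, 1, 31)))  # → True (day 30)
```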
04. ONGOING COMMITMENT TO PRIVACY

We’re invested in privacy and are committed to continuous improvement.

“Getting privacy right is a continual, collective investment across our company, and is the responsibility of everyone at Meta to advance our mission.”
—Michel Protti, Chief Privacy and Compliance Officer, Product

Protecting users’ data and privacy is essential to our business and our vision for the future. To do so, we’re continually refining and improving our privacy program and our products, as we respond to evolving expectations and technological developments—working with policy makers and data protection experts to find solutions to unprecedented challenges—and sharing our progress as we do.