WhatsApp Launches Parent-Linked Accounts for Pre-Teens
TL;DR
WhatsApp announced on March 11, 2026, that it will allow children under 13 to use the platform through parent-managed accounts with restricted features, no ads, and end-to-end encryption. The move arrives amid a global wave of child online safety legislation, Australia's first-of-its-kind social media ban for minors, and research showing that Meta's existing teen safety tools on Instagram have been largely ineffective — raising the question of whether parental controls on WhatsApp will deliver where previous efforts have fallen short.
On March 11, 2026, WhatsApp — the world's most popular messaging platform, with over 3.3 billion monthly active users — announced it would begin allowing children under the age of 13 to use the service for the first time. The new "parent-managed accounts" represent a significant shift for a platform that has long maintained a minimum age requirement of 16 in most jurisdictions and 13 in the United States, in line with the Children's Online Privacy Protection Act (COPPA).
The decision arrives at a moment of extraordinary global tension over children's digital lives. Governments from Canberra to Capitol Hill are drafting, debating, and in some cases enacting sweeping legislation to restrict minors' access to social media. Meta's answer is not to build higher walls, but to open a supervised door — and the stakes of getting it right could not be higher.
How Parent-Managed Accounts Work
The setup process requires both the parent's and the child's devices to be physically present. A parent registers and verifies the child's phone number, confirms the child's age, and scans a QR code on the child's device to link the two accounts. The parent then sets a six-digit PIN that locks all privacy settings, message request folders, and activity alert configurations — ensuring that only the parent, not the child, can modify these controls.
By default, managed accounts can only exchange messages with contacts already saved in the child's phone. Chat requests from unknown numbers are routed into a separate folder, locked behind the parent's PIN. When a message request does arrive, the child sees a context card showing whether the unknown sender shares any groups with other pre-teen users and which country the sender is located in.
Group access is similarly restricted. Only the parent account can add the child to groups or approve group invitations. Parents receive automatic alerts when their pre-teen adds, blocks, or reports a contact, or when new members join a group the child belongs to.
What's Restricted — and What's Preserved
WhatsApp's pre-teen accounts are stripped down to a messaging-and-calling core. The accounts cannot access Meta AI, Channels, Status updates, or location sharing. Disappearing messages cannot be enabled for one-on-one chats. No ads of any kind will be shown to managed accounts.
Critically, all conversations remain protected by end-to-end encryption. Parents cannot read their child's messages or listen to their calls — a design choice WhatsApp has emphasized as fundamental to the product's identity. "All personal conversations remain private and protected with end-to-end encryption, meaning no one — not even WhatsApp — can see or hear them," the company stated.
The accounts stay linked until the child turns 13, at which point the child receives a notification that the account can convert to a standard WhatsApp profile. Parents will have the option to delay this transition by an additional 12 months.
The Regulatory Pressure Cooker
WhatsApp's announcement does not exist in a vacuum. It lands in the middle of a rapidly accelerating global movement to regulate children's online experiences.
In December 2025, Australia became the first country in the world to enforce a ban on social media for children under 16. Under the Online Safety Amendment (Social Media Minimum Age) Act 2024, platforms including Facebook, Instagram, TikTok, Snapchat, X, YouTube, and Reddit face penalties of up to AU$49.5 million if they fail to take "reasonable steps" to prevent minors from creating or maintaining accounts. Notably, the law imposes no penalties on the children themselves or on their parents.
In the United States, a wave of federal legislation is advancing through Congress. In December 2025, the House Subcommittee on Commerce, Manufacturing, and Trade advanced 18 bills related to child online safety and privacy, including the Kids Online Safety Act (KOSA) and COPPA 2.0. COPPA 2.0 would extend privacy protections to all minors under 17, prohibit targeted advertising to children, and require an "eraser button" allowing parents and teens to delete personal data. Meanwhile, the FTC issued a Final Rule in January 2025 updating the original COPPA framework with stricter requirements on data retention, third-party disclosure, and mandatory written information security programs.
Meta's move to proactively create a regulated space for under-13 users on WhatsApp can be read as an attempt to get ahead of this legislative tide — offering a framework of parental control before governments impose one by force.
A Track Record Under Scrutiny
The critical question is whether Meta can deliver on its promises. The company's track record with teen safety on its flagship platform, Instagram, has drawn withering criticism from researchers and child safety organizations.
In September 2025, a joint investigation by Fairplay and the Molly Rose Foundation found that of 47 Instagram teen safety features tested, only eight — fewer than one in five — were fully functional. Two-thirds (64%) of the safety tools were either "substantially ineffective" or no longer existed. The report, titled "Teen Accounts, Broken Promises," documented how teens continued to be exposed to content promoting suicide, self-harm, and eating disorders despite Meta's safety assurances.
A separate study, conducted by Breakthrough Campaigns for the HEAT Initiative, found that 58% of young teen Instagram users aged 12-15 reported encountering unsafe content and unwanted messages within six months of being migrated to Teen Accounts. Instagram's own design features were found to undermine safety: auto-complete suggested search terms related to eating disorders and self-harm, and teens received follow suggestions for adult strangers.
Meta took more aggressive action in July 2025, removing over 635,000 accounts linked to predatory behavior, including nearly 135,000 Instagram accounts that had posted sexualized content involving children. But the sheer scale of the problem underscores how difficult enforcement is on a platform of Meta's size.
The question for WhatsApp is whether its fundamentally different architecture — private messaging rather than public feeds, no algorithmic content recommendation, no discovery features — makes it structurally better suited to protect children, or whether the shift to under-13 users introduces risks that even end-to-end encryption cannot mitigate.
The Case For — and Against
Proponents argue that WhatsApp's approach is pragmatic. Data from the Pew Research Center shows that nearly 30% of parents with children aged 8-10 say their child already owns a smartphone, and the average age of first phone ownership has dropped to 11.6 years. A majority of parents of 11- and 12-year-olds report that their child has a smartphone, and among teens aged 13-17, smartphone access has reached 95%.
The reality, advocates say, is that children are already messaging — on platforms with fewer controls, or none at all. WhatsApp's parent-managed accounts at least provide a sanctioned, supervised environment. The company has stated that the feature was developed in direct response to parents who had bought phones for their pre-teens and wanted a safe messaging option.
Critics, however, raise several concerns. First, the very act of creating an official under-13 product normalizes social platform use at younger ages, potentially accelerating the trend rather than containing it. Second, while end-to-end encryption protects privacy from external surveillance, it also means that if a child does encounter harmful content or a predatory contact, the encrypted messages cannot be reviewed — not by parents, not by WhatsApp, and not by law enforcement.
Third, the effectiveness of parental controls depends entirely on parental engagement. Research consistently shows that digital literacy and monitoring capacity vary enormously across households, with lower-income and less tech-savvy families often having fewer resources to actively manage their children's online experiences.
The Smartphone Generation: By the Numbers
The demographic reality driving WhatsApp's decision is stark. According to the Pew Research Center's October 2025 survey, 61% of parents of children aged 12 and younger say their child uses a smartphone. Even among children aged 5-7, 12% already own one.
Mental health data adds urgency to the debate. The Mayo Clinic notes that for 12- to 15-year-olds, spending three or more hours per day on social media is linked to a higher risk of mental health concerns. Yet the average American child spends significant time online daily, and messaging apps represent a growing share of that time.
WhatsApp's 3.3 billion user base makes it the dominant messaging platform globally, particularly in markets across Europe, Latin America, South Asia, and Africa. In many of these regions, WhatsApp is not merely a social app — it is the primary communication infrastructure for families, schools, and communities. The pressure to include younger users is, in many markets, a practical necessity rather than a commercial ambition.
The Global Patchwork
WhatsApp has stated it will roll out parent-managed accounts "in select geographies" initially, with gradual expansion over the coming months. The company has not specified which countries will receive the feature first, or whether the rollout will differ based on local regulatory frameworks.
This geographic selectivity is significant. In the European Union, the Digital Services Act and the General Data Protection Regulation (GDPR) impose strict requirements on platforms processing children's data, with some member states setting the age of digital consent as high as 16. In Australia, WhatsApp's messaging-only service was notably exempted from the social media ban, as the legislation targeted platforms with social networking features rather than pure messaging apps. That exemption may have created an opening for WhatsApp to position itself as the "safe" alternative.
The patchwork of global regulations means WhatsApp will need to navigate a complex compliance landscape, potentially offering different feature sets and age thresholds in different markets.
What Comes Next
WhatsApp's parent-managed accounts represent a calculated gamble. By bringing under-13 users into a controlled environment rather than pretending they don't exist, Meta is betting that supervised access is safer than unsupervised reality. The company is also positioning itself favorably against potential legislation — demonstrating that industry self-regulation can work before governments decide it cannot.
But the history of Meta's teen safety efforts casts a long shadow. The gap between announced features and functional protections on Instagram — where less than 20% of safety tools worked as advertised — suggests that launch-day promises and real-world outcomes can diverge dramatically.
The coming months will provide the first real test. Child safety organizations, regulators, and researchers will be watching closely to determine whether WhatsApp's stripped-down, encryption-first approach to child accounts represents a genuine advance in online safety — or another chapter in the familiar cycle of tech industry promises followed by inadequate delivery.
For the millions of families around the world already handing smartphones to children well before their 13th birthday, the answer matters enormously.
Related Stories
Meta Ends End-to-End Encryption in Instagram Direct Messages
Australia's Regulator Orders Stricter Enforcement of Social Media Ban for Under-16s
Meta Suffers Court Losses Compared to 'Big Tobacco Moment'
Meta to Acquire Moltbook, Viral Social Network for AI Agents
Sources (20)
- [1] WhatsApp is launching parent-linked accounts for pre-teens (techcrunch.com)
WhatsApp launched parent-supervised accounts for users under 13, restricted to messaging and calls with PIN-protected parental controls and activity alerts.
- [2] Meta will let kids under 13 use WhatsApp with parent-managed accounts (engadget.com)
Meta introduced parent-managed accounts on WhatsApp with restricted features including no Channels, location sharing, or Meta AI integration.
- [3] Children's Online Privacy Protection Rule (COPPA) (ftc.gov)
COPPA imposes requirements on operators of websites directed to children under 13, requiring verifiable parental consent before collecting personal information.
- [4] WhatsApp launches preteen accounts, here's how they work (9to5mac.com)
Parent-managed accounts allow preteens to use WhatsApp with parental oversight, PIN-protected settings, and restricted contact from unknown users.
- [5] WhatsApp introduces parent-managed accounts for pre-teens (bleepingcomputer.com)
Pre-teen accounts lack access to Meta AI, Channels, or Status, and cannot enable disappearing messages. Parents receive alerts on contact activity.
- [6] WhatsApp's under-13 accounts with parental control: How to set it up, features and limits (gulfnews.com)
Chat requests from unknown contacts are locked behind a parent PIN, and group invite links are similarly restricted on managed accounts.
- [7] WhatsApp to launch parent-managed accounts for under 13s (rte.ie)
WhatsApp revealed parent-controlled accounts amid rising global concerns about the impact of social media and chat apps on children.
- [8] WhatsApp Introduces Parent-Managed Accounts to Give Families More Control Over Pre-Teen Messaging (republicworld.com)
All personal conversations remain private and protected with end-to-end encryption, meaning no one — not even WhatsApp — can see or hear them.
- [9] Social media ban for children under 16 starts in Australia (npr.org)
Australia became the first country to formally bar users under 16 from accessing major social media platforms, with enforcement beginning December 2025.
- [10] Social media age restrictions - eSafety Commissioner (esafety.gov.au)
Age-restricted platforms face penalties of up to $49.5 million AUD for failing to prevent under-16s from maintaining accounts.
- [11] COPPA 2.0, KOSA among 18 children's online safety bills advanced by US House subcommittee (iapp.org)
The House Subcommittee advanced 18 child online safety bills including COPPA 2.0 and KOSA in December 2025.
- [12] Wave of Federal Online Safety Legislation Hits Congress (dwt.com)
COPPA 2.0 would broaden coverage to minors under 17, prohibit targeted advertising to minors, and require an eraser button for personal data.
- [13] Instagram Teen Accounts fail to protect children, safety tools testing reveals (fairplayforkids.org)
Of 47 Instagram safety features tested, only eight were fully functional. Two-thirds were either substantially ineffective or no longer existed.
- [14] Instagram Teen Accounts fail to protect children, first-of-its-kind testing reveals (mollyrosefoundation.org)
Joint investigation by Fairplay and the Molly Rose Foundation found less than 1 in 5 Instagram teen safety features were fully functional.
- [15] Research Among Young Teen Instagram Users - HEAT Initiative (heatinitiative.org)
58% of young teen Instagram users aged 12-15 reported encountering unsafe content and unwanted messages within six months of Teen Account migration.
- [16] Meta launches new teen safety features, removes 635,000 accounts that sexualize children (cnn.com)
Meta removed over 635,000 accounts linked to predatory behavior, including 135,000 Instagram accounts with sexualized content involving children.
- [17] How parents describe their kids' tech use - Pew Research Center (pewresearch.org)
Nearly 30% of parents with children aged 8-10 say their child owns a smartphone. A majority of parents of 11-12 year-olds report their child has one.
- [18] Teens, Social Media and Technology 2024 - Pew Research Center (pewresearch.org)
95% of teens ages 13-17 have access to a smartphone in 2024, up 22% from a decade ago. Average age of first phone is 11.6 years.
- [19] Teens and social media use: What's the impact? - Mayo Clinic (mayoclinic.org)
For 12-15 year-olds, spending three or more hours per day on social media is linked to higher risk of mental health concerns including anxiety and depression.
- [20] Latest WhatsApp Statistics 2026 - Active Users Data (demandsage.com)
WhatsApp has over 3.3 billion monthly active users as of January 2026, with approximately 2.3 billion daily active users.