Kids and Online Privacy: What COPPA Doesn't Actually Protect

COPPA is supposed to protect children's online privacy — but enforcement is weak, workarounds are common, and age verification is easily bypassed. Here's what parents need to know.

A parent in Ohio recently discovered that her 10-year-old had accounts on three platforms she’d never heard of. Each one had collected her daughter’s location data, device identifiers, behavioral data, and in two cases a photo — all without parental knowledge. When the parent looked into it, she found that each platform had technically complied with the Children’s Online Privacy Protection Act. Her daughter had checked a box saying she was 13. That was the end of COPPA’s reach.

This is not an edge case. It is the normal functioning of a system built on a 1998 law trying to govern a 2026 internet. COPPA was written when internet-connected devices were desktop computers shared by families, when social platforms didn’t exist, and when behavioral advertising was in its infancy. Understanding what COPPA actually covers — and why its gaps matter more than what it protects — is essential context for parents trying to make real decisions about their children’s digital lives.

The Problem with Children’s Online Privacy

The gap between what parents believe about their children’s online data and what actually happens to that data is large and growing. According to a 2024 Pew Research Center survey, 81% of parents of children under 13 say they are “very” or “somewhat” concerned about how companies use their children’s personal data. Fewer than 30% correctly understand what COPPA requires companies to do. The result is a meaningful false sense of security: parents assume a federal law is protecting their children in ways it functionally isn’t.

The core problem isn’t that COPPA is a bad law. When it was passed, it was a genuine step forward. The problem is that the internet it was written for no longer exists. COPPA applies to operators of websites and online services “directed to children” and to operators of general-audience sites with “actual knowledge” that they are collecting data from children under 13. Both of those categories contain escape hatches large enough to drive an entire industry through.

A platform can declare itself not “directed to children” even if children routinely use it — as long as the platform doesn’t market itself to children and doesn’t knowingly collect their data. Since “knowingly” is established primarily by self-reported age, and age verification online is, in practice, an honor system, most major platforms avoid COPPA obligations entirely by requiring users to self-certify their age at sign-up. This is not a loophole. It is the intended mechanism. COPPA was designed with the expectation that age verification would improve. It didn’t.

The Electronic Frontier Foundation has documented this dynamic extensively, noting that the structural incentive for platforms is to avoid actual age verification — because genuine age verification would reduce the number of accounts they can monetize. A platform that rigorously excludes under-13 users loses revenue. A platform that accepts a checkbox loses nothing and gains everything.

What the Research Actually Says

The FTC’s 2023 review of COPPA’s effectiveness produced a detailed assessment of the rule’s gaps, and the findings were not flattering to the current framework. The review found that:

  • Most major children’s apps collect data far beyond what parents would consider necessary for the app’s function
  • Parental consent mechanisms, where they exist, are frequently designed to confuse rather than inform
  • Data deletion requests are honored inconsistently and, in many cases, not actually executed — data is “deleted” from user-facing interfaces while being retained in backend systems

Researchers Radesky et al., in a 2020 study published in JAMA Pediatrics, analyzed 135 children’s apps available on the Google Play Store. They found that 72% of apps collected personally identifiable information and shared it with third parties without explicit disclosure. The majority of these apps were designed for children ages 5 and under — the population COPPA is most explicitly intended to protect. The gap between legal compliance and meaningful protection was stark.

Shoshana Zuboff’s 2019 framework of “surveillance capitalism” provides useful conceptual grounding here. Zuboff’s central argument is that the extraction and monetization of behavioral data is not incidental to how modern tech platforms operate — it is their primary business model. When children use apps and platforms, they are not customers purchasing a service. They are sources of behavioral data that is packaged and sold to advertisers, data brokers, and other third parties. COPPA, focused on consent mechanisms and parental notification, addresses the surface layer of this system without touching its economic foundation.

Common Sense Media’s 2024 “Privacy Not Included” research found that of 50 popular apps used by children and teens, only 14 met what the organization considered adequate privacy protections. The remaining 36 had at least one of the following: unclear data retention policies, data sharing with advertisers without meaningful disclosure, behavioral tracking enabled by default, or location data collection beyond what the app’s function required.

| Protection | What COPPA Requires | What Parents Often Assume | What Typically Happens |
|---|---|---|---|
| Age verification | Self-certification (checkbox) | Genuine verification | Checkbox; data collected from minors |
| Parental consent | Required for under-13 on covered sites | Required broadly | Only on sites/apps that identify as child-directed |
| Data deletion | Must delete upon request | Apps delete on command | Varies widely; backend retention common |
| Third-party sharing | Disclosure required for covered sites | Prohibited without explicit consent | Widespread; often buried in terms of service |
| Enforcement | FTC complaint-based | Routine monitoring | Reactive; fines rare and small relative to revenue |
| Behavioral advertising | Prohibited on covered sites | Broadly prohibited | Rampant on non-COPPA-covered platforms used by minors |

The Pew Research 2024 survey of teenagers (ages 13–17) found that 54% of teens say it would be “very difficult” to stop using social media even if they wanted to. More relevantly to privacy: 45% of teens say they have shared personal information online that they later wished they hadn’t, and 62% say they feel like they have “little to no control” over the data companies collect about them. These teenagers are outside COPPA’s protected age range. They are also the cohort that, two to four years earlier, was inside it — and whose data was being collected throughout.

The enforcement picture is consistent with the EFF’s characterization of COPPA as primarily aspirational. Between 2000 and 2023, the FTC brought fewer than 35 enforcement actions under COPPA. The largest fine — $170 million against Google/YouTube in 2019 — was a record at the time and represented approximately 11 hours of YouTube’s annual revenue. TikTok was fined $5.7 million in 2019, also a record at the time. Neither fine changed the underlying business models of the platforms involved.

What to Actually Do

COPPA is not going to protect your child’s data adequately. This is not a reason to panic, but it is a reason to build a different model for how you think about your child’s digital privacy — one that doesn’t depend on regulatory protection that isn’t functioning. The practical steps below are organized by what actually reduces data exposure, rather than what makes parents feel like they’re doing something.

Audit what your child actually uses

Sit down with your child and make a list of every app, platform, website, and game they use with any regularity. Most parents, when they do this exercise, find 30-50% more accounts than they were tracking. For each one, look up the privacy policy — not to read it comprehensively, but to answer three questions: Does this service share data with third parties? Does it use behavioral advertising? Can I request data deletion? The Common Sense Media Privacy Not Included database (commonsensemedia.org/privacy) provides pre-analyzed assessments for hundreds of products and is a practical shortcut.
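If it helps to keep the audit organized, here is a minimal sketch of a checklist generator — purely an illustration, written in Python, where the app names and output filename are placeholders you would replace with whatever your own audit turns up. It produces a spreadsheet with one row per service and one column per question:

```python
# A minimal sketch of an audit checklist generator.
# The app names and output filename below are hypothetical placeholders.
import csv

# Apps and platforms found during the audit (examples only)
apps = ["Example Drawing App", "Example Chat Platform", "Example Puzzle Game"]

# The three questions to answer for each service from its privacy policy
questions = [
    "Shares data with third parties?",
    "Uses behavioral advertising?",
    "Supports data deletion requests?",
]

with open("app_privacy_audit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["App / Platform"] + questions + ["Notes"])
    for app in apps:
        # Leave the answer cells blank; fill them in as you review each policy
        writer.writerow([app] + [""] * len(questions) + [""])

print("Wrote app_privacy_audit.csv with", len(apps), "apps to review")
```

A paper notebook works just as well; the point is that each service gets the same three questions answered, so nothing slips through because it “seems harmless.”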

Treat the age gate as your job, not the platform’s

Age verification online is functionally your responsibility, not the platform’s. If your child is under 13, the operative question isn’t “is this app technically allowed to collect her data” but “does this app have any business model that involves her data.” Platforms that are free to use and ad-supported are almost always monetizing user data. There are no exceptions worth naming. Paid apps with no advertising are meaningfully different — not perfectly private, but categorically different in their data extraction incentives.

Use family account features critically

Apple Screen Time, Google Family Link, and Amazon Parent Dashboard all provide some level of oversight and control, but they also have limitations that matter. These tools are most effective for content filtering and time limits. They are less effective at blocking data collection from apps that have already been installed and authorized. Before granting app permissions, use your child’s device — not your own — to walk through the app’s settings and revoke any permissions (location, contacts, microphone, camera) that are not strictly necessary for the app’s core function. Default permission settings on virtually every app are set to maximum collection.

Talk to older children about behavioral data

For children 10 and up, conversations about how apps make money are more protective than restrictions alone. A child who understands that a free app is collecting and selling her behavioral data to advertisers makes different decisions than a child who assumes the app is simply a free service. This doesn’t have to be a lecture. It can be as simple as: “This app is free. How do you think they pay their employees?” Let the child reason toward the answer. The conversation about AI tools and data follows similar lines — helping kids understand the economic model changes their relationship to it.

Use privacy-protective defaults at the network level

A DNS-level content and tracker blocker (NextDNS, Pi-hole, or similar) blocks tracking requests from apps and browsers across your entire home network, including on devices where you cannot install parental control software. This doesn’t require technical expertise to set up — NextDNS has a family-oriented configuration that takes about 15 minutes to activate. It is not a complete solution, but it meaningfully reduces the data your child’s devices send to third-party trackers on your home network.
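Once the blocker is configured, a quick way to confirm it is actually working is to check whether a few well-known tracking domains still resolve from inside your network. Here is a minimal spot-check sketch in Python; the domains listed are common advertising and analytics hosts used purely as examples, and your blocklist may cover a different set:

```python
# Spot-check whether DNS-level tracker blocking is active on this network.
# The domains below are common ad/analytics hosts, used only as examples.
import socket

test_domains = [
    "doubleclick.net",
    "googleadservices.com",
    "graph.facebook.com",
]

for domain in test_domains:
    try:
        ip = socket.gethostbyname(domain)
        # Many blockers answer with 0.0.0.0 (or a local sinkhole address)
        # instead of failing outright, so treat those answers as blocked too.
        if ip in ("0.0.0.0", "127.0.0.1"):
            print(f"{domain}: blocked (sinkholed to {ip})")
        else:
            print(f"{domain}: NOT blocked (resolves to {ip})")
    except socket.gaierror:
        print(f"{domain}: blocked (does not resolve)")
```

Run it from a device connected to your home Wi-Fi. If the tracker domains still resolve to real addresses, that device is probably bypassing your DNS settings — for example, through a hardcoded DNS server or DNS-over-HTTPS in the browser — and is worth configuring individually.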

Know your data rights under state laws

COPPA is a floor, not a ceiling, and several states have enacted stronger children’s privacy protections than federal law requires. California’s Age-Appropriate Design Code (AADC), enacted in 2022, requires platforms likely to be accessed by children to apply the highest privacy settings by default and restricts collection of location data without explicit opt-in. Similar legislation has passed or is pending in Virginia, Texas, New York, and several other states. Knowing what your state requires gives you leverage that federal law doesn’t.

What to Watch for Over the Next 3 Months

The FTC’s 2023 COPPA review resulted in proposed rule updates that are moving through the regulatory process, and enforcement posture may shift depending on the political environment. Specifically, watch for:

COPPA 2.0 legislative activity. Multiple versions of updated children’s online privacy legislation are pending in Congress as of early 2026. The key provisions to watch are whether they extend protection to ages 13–17 (current law stops at 13), whether they include meaningful age verification requirements, and whether they shift the burden of data protection from parents to platforms.

State-level AADC laws. California’s Age-Appropriate Design Code has been subject to legal challenges from tech industry groups. How courts rule on those challenges will determine whether the strongest state-level children’s privacy protections survive and spread to other states.

AI training data and children. A growing number of researchers and advocacy groups are raising questions about whether children’s data — collected legally or not — is being used to train AI models. This is a gap that existing law doesn’t address, and it’s one worth watching as it develops.

Reviewing the apps your child uses now, before legislation changes, ensures you’re not relying on protections that may or may not materialize.

Frequently Asked Questions

Does COPPA protect teens — kids ages 13 to 17?

No. COPPA’s protections apply only to children under 13. Teenagers have no equivalent federal privacy protections for their online data. Several advocacy groups and the FTC’s own 2023 review have noted this gap, and proposed legislation has included provisions to extend protections to age 16 or 17 — but as of 2026, no such federal law has passed.

Can I request that a company delete my child’s data?

Under COPPA, you can request deletion of data from covered websites and apps — those directed to children or those that have actual knowledge they collected data from an under-13 user. In practice, compliance is inconsistent. For platforms not covered by COPPA (general-audience platforms where your child self-certified their age), you have no automatic federal right to deletion. California residents have broader deletion rights under CCPA, regardless of whether the platform is COPPA-covered.

What does “directed to children” actually mean under COPPA?

The FTC uses multiple factors to determine whether a site or service is “directed to children,” including subject matter, visual content, the use of animated characters, music, celebrities who appeal to children, and advertising targeting. A platform can use all of these features and still argue it isn’t child-directed if it includes an age gate. Courts have generally been deferential to this argument, which is why platforms like TikTok and YouTube (prior to a separate consent decree) operated for years while children used them extensively without being classified as child-directed.

If my child lies about their age to create an account, is the platform off the hook?

Legally, largely yes. If a child self-certifies they are 13 or older, the platform is generally protected from COPPA liability even if the user is actually younger. The FTC has brought cases where platforms had “actual knowledge” of children’s ages despite age certification — for example, through content they posted — but these cases are rare and hard to win.

What’s the most privacy-protective thing I can do right now?

The single highest-impact action is reviewing and revoking unnecessary app permissions on your child’s device. Location access, microphone access, and contact list access granted to apps that don’t need them for core functionality represent the largest unnecessary data exposure for most children. This takes about 20 minutes per device and has immediate effect. Pair it with reviewing the Common Sense Media Privacy Not Included ratings for the most-used apps.

Should I be worried about AI apps specifically?

Yes, more than average. AI-powered apps — particularly AI companion apps, tutoring apps, and AI chat tools — collect behavioral and conversational data that is qualitatively more sensitive than standard app usage data. Conversations with an AI companion may include disclosures about mental health, family dynamics, and peer relationships. This data is generally not protected by COPPA unless the app is specifically child-directed and has obtained parental consent. The research on AI companion apps for kids and teens is worth reading before allowing children to use any AI chat product.


About the author

Ricky Flores is the founder of HiWave Makers and an electrical engineer with 15+ years of experience building consumer technology at Apple, Samsung, and Texas Instruments. He writes about how kids learn to build, think, and create in a tech-saturated world. Read more at hiwavemakers.com.

Sources

  • Federal Trade Commission. (2023). COPPA Rule Review: Staff Report and Proposed Amendments. ftc.gov/coppa-rule-review-2023
  • Electronic Frontier Foundation. (2023). COPPA at 25: Gaps, Workarounds, and What Reform Should Actually Address. eff.org
  • Radesky, J., Hiniker, A., McLaren, C., Akana, J., Schaller, A., & Weeks, H. V. (2020). Prevalence and characteristics of advergames in free children’s apps. JAMA Pediatrics, 174(12), 1151–1157.
  • Common Sense Media. (2024). Privacy Not Included: Annual Product Review. commonsensemedia.org/privacy-not-included
  • Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs.
  • Pew Research Center. (2024). How Teens Navigate Online Privacy. pewresearch.org
  • Pew Research Center. (2024). Parents, Children, and the Digital World: Concerns About Data and Advertising. pewresearch.org