What Happens to Kids Who Don't Learn AI — and Those Who Do
The AI literacy gap is becoming a class gap. High-income kids get AI exposure. Lower-income kids often don't. Here's what the research says will follow from that divide.
There is a version of the AI story that goes like this: AI is democratizing knowledge. Anyone with a smartphone can now access a tool that answers questions, helps with writing, and explains complex ideas in plain language. For the first time, the quality of information available to a kid in a low-income school district approaches what was previously available only to kids with expensive tutors and well-resourced libraries. This version of the story is real. It is also incomplete in a way that matters enormously for how parents and educators respond.
The incomplete part: access to AI tools is not the same as the ability to use them well. And the ability to use them well is distributed, in 2025, in ways that closely track existing socioeconomic inequality. The AI literacy gap is not primarily a technology gap — it’s a preparation gap, a context gap, a quality-of-guidance gap. And the research on what preparation gaps predict about children’s economic futures is not reassuring.
The Problem: Access Without Preparation Is Not Equity
When policymakers talk about the digital divide, they typically mean hardware and connectivity: do students have devices and internet access? In the 2000s and 2010s, closing this gap was the central technology equity project in education. Schools deployed Chromebook carts. Broadband access programs expanded. By the early 2020s, most U.S. students, including lower-income students, had some form of device access.
The AI literacy problem is structurally different. A child with device access and an internet connection can open ChatGPT. Whether they can use it to learn, create, problem-solve, or develop career-relevant skills depends on something harder to distribute than hardware: context, guidance, and practice in using AI for high-value tasks.
Common Sense Media’s 2024 survey on AI access gaps by income found that while AI tool access was roughly comparable across income groups among teenagers (given smartphone penetration), the nature and quality of AI use diverged significantly. Higher-income students were more likely to use AI for learning-oriented tasks: researching topics, getting explanations, working through problems, and building projects. Lower-income students were more likely to use AI for task completion: getting answers to homework questions, generating text they submitted as their own, and social uses like entertainment and communication.
This divergence is not a character difference between students. It is a preparation difference. Students who understand how to use AI as a learning tool, who have been taught to evaluate AI output critically, and who have experienced AI being used in substantive and intellectually demanding ways, are more likely to use it that way independently. Students who haven’t received that preparation default to the easiest available use — which is also, typically, the least educationally valuable.
What the Research Actually Says
The foundational framework for understanding what follows from preparation gaps is the labor market research on technology and wage polarization.
Daron Acemoglu and Pascual Restrepo’s 2022 NBER paper on technology and labor market polarization is one of the most cited analyses of what automation has historically done to income distribution. Their finding: new technologies that automate routine tasks eliminate middle-skill jobs while increasing demand for both very high-skill work (complementary to the new technology) and very low-skill work (non-automatable). The result is polarization — growth at the top and bottom, contraction in the middle — and the workers most harmed are those whose skills most closely matched what the automated tools replaced.
Acemoglu and Restrepo’s analysis describes what has already happened with previous automation waves (manufacturing robots, computerized accounting, logistics software). The AI wave now underway has a wider scope because LLMs can engage with language and reasoning — previously considered non-automatable domains. McKinsey Global Institute’s 2023 analysis of AI and workforce displacement estimated that 12 million U.S. workers may need to change occupations by 2030, with effects concentrated in clerical, customer service, and data processing roles — jobs that historically provided entry-level access to the middle class for workers without four-year degrees.
The distributional implication is direct: if AI eliminates the jobs that served as the traditional entry point for lower-income workers without specialized credentials, the divergence in AI literacy by income group compounds an existing inequality rather than offsetting it.
The World Economic Forum’s 2025 Future of Jobs report found that AI-complementary skills — specifically, the ability to work effectively with AI tools, critically evaluate AI outputs, and apply AI to complex domain problems — were among the fastest-growing skill demands across industries. The report identified a skills gap: demand for these capacities is growing faster than supply, and the workers and students most likely to develop them are those with existing advantages in education access and quality.
OECD’s 2024 review of education technology and skills identified AI literacy as an “emerging foundational skill” — meaning it is moving from specialized to expected, the same transition that coding literacy went through over the past decade. The OECD’s analysis found that AI literacy development was strongly correlated with school quality, teacher preparation, and extracurricular access — all of which track socioeconomic status.
Brookings’ 2024 analysis of AI and educational equity made the equity case most directly. Analyzing data from enrichment programs, school curricula, and summer learning programs, Brookings found that AI-focused enrichment was concentrated in higher-income communities. The mechanisms are familiar: enrichment programs cost money, AI-focused summer programs charge tuition, robotics clubs require materials and adult volunteers with technical knowledge, and schools in high-income districts have better-funded technology programs. The result is that the children who most need preparation for an AI-saturated economy are the least likely to receive it — not because anyone is deliberately excluding them, but because the preparation happens through channels that already favor the advantaged.
| AI Literacy Component | High-Income Access (Est.) | Low-Income Access (Est.) | Primary Gap Mechanism |
|---|---|---|---|
| AI tool access (device + connectivity) | High | Moderate–High | Shrinking; near-parity in most urban areas |
| Enrichment programs using AI | High | Low | Cost and proximity of paid programs |
| School-based AI curriculum | Moderate | Low | Funding, teacher preparation |
| High-value AI use (learning-oriented) | Moderate–High | Low | Preparation and context, not access |
| Critical evaluation of AI output | Moderate | Low | Teaching quality and deliberate instruction |
| Career-relevant AI application | Moderate | Low | Mentorship and network effects |
| Access to AI-literate role models | High | Low | Network effects and professional proximity |
What to Actually Do
The access-and-preparation distinction matters for what parents can actually do — and for what policy can actually do. The following recommendations are aimed at both populations: parents of higher-income children who want to ensure their AI exposure translates into real preparation, and parents of lower-income children who want to access preparation opportunities that may not be fully visible to them.
For all families: distinguish AI use from AI learning
The most important distinction any parent can make is between their child using AI (which most children now do) and their child learning with AI (which is a specific and less common thing). Using AI to get an answer is not preparation. Using AI to explore a question, build something, get feedback on their own work, and iterate toward a better result is preparation. The same tool produces both outcomes depending on how it’s used. Parents who ensure their children experience the second kind of use — regardless of income — are providing meaningful preparation that many schools are not.
Seek out programs with access to technical mentorship
The Brookings analysis identified the absence of AI-literate mentors and role models as one of the most significant gaps in lower-income communities. A child who knows an engineer, a programmer, or a researcher — who has seen in person what it looks like to use AI tools for meaningful work — has a fundamentally different frame for what AI is for than a child who has only seen it used for homework shortcuts. Mentorship programs, after-school maker spaces, and community college partnerships can provide this exposure when families don’t have it in their immediate network. For related context on how access gaps compound over time, the article on the AI literacy gap and future-proofing kids’ careers covers the career implications in depth.
Prioritize AI literacy over AI use
The OECD and WEF research converge on a consistent finding: children who understand what AI is, how it works at a conceptual level, and what its limitations are, are better equipped to use it productively than children who simply have access to it. AI literacy — the knowledge framework, not just the practice — is the preparation that compounds. A child with a conceptual mental model of AI will update that model as AI changes. A child who only knows how to use a specific tool is starting over when the tool changes. Programs and parents who invest in the conceptual layer of AI understanding are investing in preparation that is more durable than technique. The article on AI literacy for kids in middle school outlines what that conceptual layer should include.
Advocate for AI literacy in your child’s school curriculum
The OECD’s 2024 finding that AI literacy development correlates strongly with school quality and teacher preparation is a policy diagnosis, not a fixed reality. Schools that receive parent advocacy for AI literacy curriculum — specific, structured, research-grounded AI education — are more likely to develop it. This is especially true in districts where the administration is aware of the equity implications but hasn’t prioritized curriculum development yet. Asking specifically for curriculum (not just tools) is more effective than asking for “more technology.”
Evaluate enrichment programs by what they teach, not what they use
A camp or program that puts children in front of AI tools for a week is not automatically providing AI literacy preparation. The question to ask is: are children learning to use AI as a thinking tool, or are they learning to use AI as a convenience tool? Programs that include reflection, evaluation of AI outputs, building with AI (rather than just prompting it), and discussion of what AI can and can’t do are providing preparation. Programs that use AI tools as a medium without teaching about the medium are providing exposure, which is worth less and does not reliably produce AI literacy.
Address the network gap deliberately
The single most important non-monetary advantage higher-income children have in developing AI literacy is network access: people in their lives who work with AI, who can model sophisticated use, and who can contextualize AI within real career paths. Families without this network can partially substitute through online communities, mentorship platforms, and sector-specific programs (libraries, community colleges, maker spaces) that connect children with practitioners. The gap is real; it is also addressable with intentional effort.
What to Watch for Over the Next 3 Months
Watch what your child’s school is actually doing with AI — not just what it says it’s doing. “We are preparing students for an AI future” is a statement that can be made by a school that has done nothing and by a school that has restructured its curriculum. The specific questions worth asking: Is there explicit instruction in what AI is and how it works? Are there structured activities requiring students to evaluate AI output critically? Do teachers use AI in their instruction in ways they explain to students? Are AI-related projects integrated into core subjects or limited to a specific elective?
The Brookings analysis found that the schools making the most progress on AI equity were those where AI literacy had been integrated into core curriculum rather than isolated in optional tracks. Schools where AI is only available to students who seek it out — through clubs, electives, or enrichment — are, by definition, selecting for students who already have higher engagement and are more likely to be advantaged. Equity in AI literacy requires universal access through required curriculum, not optional access through enrichment.
The window for building AI literacy as a foundational skill is probably the next 3–5 years. After that, AI literacy will be assumed in the same way that typing and basic computer use are assumed today — no longer differentiated, no longer a competitive advantage, but a baseline. The children who develop it before it’s mandatory will have practiced it longest and will hold it most fluently. The ones who don’t develop it until it’s required will be catching up.
Frequently Asked Questions
Is the AI literacy gap really about socioeconomic status, or is it about individual effort and interest?
Both are real, but they’re not equal. Individual interest matters and some children will seek out AI learning regardless of their circumstances. But the Brookings and Common Sense Media data consistently show that access to high-quality AI preparation — structured, mentored, substantive — is not randomly distributed. It tracks income, school quality, and neighborhood. Individual effort can overcome structural disadvantage, but structure shapes the baseline probability. Addressing the structural side is not in conflict with valuing individual effort.
What if my child’s school doesn’t teach AI literacy at all?
You have two options that are not mutually exclusive: advocate for change (talking to teachers, school boards, and administrators about curriculum), and provide what you can at home or through external programs. Libraries, community colleges, and maker spaces are often underused resources that provide access to both tools and mentorship without the cost of private programs. The article linked above on AI literacy in middle school provides a framework for what home-based learning can look like.
How long does it take to develop meaningful AI literacy?
The research on this is nascent, but the analogy to coding literacy is informative. Studies of coding education found that meaningful coding literacy — the ability to use code as a tool for building and problem-solving — required sustained engagement over 6–18 months of regular practice, not a one-week camp. AI literacy probably follows a similar pattern. Single exposures produce awareness. Sustained, structured practice produces capability.
Are AI literacy gaps likely to widen or narrow over the next decade?
The honest answer from the WEF and McKinsey analyses is: it depends heavily on policy choices. Left to market forces, the gap is likely to widen — the children of knowledge workers and technology professionals will continue to have better access to high-quality preparation. Public investment in AI curriculum, teacher training, and universal program access could narrow it. The trajectory is not fixed.
Does AI literacy guarantee better employment outcomes?
No. The Acemoglu and Restrepo analysis and the McKinsey workforce data show that AI will create new jobs as well as eliminate existing ones, but the new jobs require not just AI literacy but domain expertise — knowing a field well enough to use AI to do advanced work in it. AI literacy is a foundation, not a guarantee: a child who is both AI-literate and develops deep domain knowledge in engineering, healthcare, education, or another field is much better positioned than one who is AI-literate without substantive expertise.
Is providing AI tools at home enough to close the literacy gap?
Not by itself. Access is necessary but not sufficient, and the Common Sense Media data make clear that access alone tends to produce high-volume, low-quality AI use rather than meaningful AI literacy development. What converts access into literacy is guided, purposeful use — with an adult or mentor who models thoughtful engagement with AI tools, discusses what they’re doing and why, and asks evaluative questions. The guidance is the active ingredient.
About the author
Ricky Flores is the founder of Hiwave Makers and an electrical engineer with 15+ years of experience building consumer technology at Apple, Samsung, and Texas Instruments. He writes about how kids learn to build, think, and create in a tech-saturated world. Read more at hiwavemakers.com.
Sources
- Acemoglu, D., & Restrepo, P. (2022). Tasks, automation, and the rise in US wage inequality. NBER Working Paper No. 28920. https://doi.org/10.3386/w28920
- McKinsey Global Institute. (2023). The Economic Potential of Generative AI: The Next Productivity Frontier. McKinsey & Company.
- World Economic Forum. (2025). Future of Jobs Report 2025. WEF.
- OECD. (2024). Education at a Glance 2024: AI and the Transformation of Skills. OECD Publishing. https://doi.org/10.1787/c00cad36-en
- Common Sense Media. (2024). AI Access and Use Among Youth: A National Survey. Common Sense Media Research.
- Brookings Institution. (2024). AI and Educational Equity: Who Gets Prepared and Who Gets Left Behind. Brookings Center on Education Policy.
- Common Sense Media. (2025). AI Use Among Teens and Tweens: 2025 Survey Report.
- Muro, M., Maxim, R., & Whiton, J. (2019). Automation and Artificial Intelligence: How machines are affecting people and places. Brookings Metropolitan Policy Program.