The form was a single page, photocopied slightly crooked. It sat in a stack between the emergency contact card and the field trip authorization at a back-to-school registration table in a cafeteria outside Charlotte, North Carolina, on the second Tuesday of August. A mother of two signed it in under a minute. She remembers the emergency contact card because she had to look up her sister's new phone number. She remembers the lunch application because it asked for her income. She does not remember the acceptable-use policy because there was nothing to remember. It said something about the district's right to provide students with access to digital tools for educational purposes. Eleven forms. Nine minutes. A ballpoint pen running out of ink. Her seven-year-old asking for the third time whether they could go see the classroom.

Her signature now appears in the compliance records of at least four educational technology companies. One of them uses artificial intelligence to monitor student conversations and flag messages related to self-harm. The mother does not know its name. She has never seen its privacy policy, never been told what data it collects from her children, never been offered a chance to say no to one tool while saying yes to another. The edtech industry treats her signature as informed consent. The Federal Trade Commission has a different term for it.

The Architecture of Outsourced Consent

Albert Anker, "The School Examination" (1862). Kunstmuseum Bern. The examiner reviews the student. Nobody examines the examiner. Public domain.

The Children's Online Privacy Protection Act[1] requires verifiable parental consent before an operator collects personal information from anyone under thirteen. The law is clear. Obtaining consent from individual parents is expensive, slow, and introduces friction that kills product adoption in schools. So the industry found a workaround.

COPPA contains a narrow exception: when a school contracts with an edtech provider for a legitimate educational purpose, the school may consent on behalf of parents. The exception carries strict conditions — no commercial use, no advertising, no repurposing — and the operator must provide the school the same detailed notice it would otherwise give each parent directly. The FTC designed this as a limited accommodation. The edtech industry rebuilt it as the foundation of a business model.

The mechanism is consistent across vendors. The company's terms of service state that the school is responsible for obtaining parental consent. The school's acceptable-use policy contains a blanket authorisation for "digital learning tools" without naming any specific platform. The parent signs the form at registration. The vendor points to the school. The school points to the form. The form points to nothing.

Somewhere in that circle, a seven-year-old's data enters a pipeline that nobody in the chain has fully described to anyone else.

What the Law Already Said

In May 2023, the FTC obtained a six-million-dollar order against Edmodo,[2] an educational technology provider that had served tens of millions of students. Edmodo had "unlawfully outsourced its COPPA compliance responsibilities to schools." Its terms of service told schools they were "solely responsible" for compliance. The FTC called this "nonsensical."[3]

The order came three years ago. The architecture it condemned has not changed. But the stakes have. On December 28, 2024, a hacker exfiltrated the personal data of approximately 62.4 million students and 9.5 million teachers from PowerSchool in the largest breach of children's data in history.[4] Names, dates of birth, Social Security numbers. Most of those students had no direct relationship with PowerSchool. Their parents signed an acceptable-use policy that said nothing about it. An attorney representing parents put it plainly: "Kids don't get to consent to using this software, and parents basically don't have a choice about whether their kids use it."[5]

In November 2024, a federal judge reinforced the point. In Shanahan v. IXL Learning, Judge Rita Lin wrote that "neither COPPA nor common-law agency principles support IXL's contention that school districts act as agents of parents when contracting with educational vendors."[6] Schools may consent for educational data uses. Schools cannot bind parents to a vendor's extraneous terms.

The FTC finalized amendments to the COPPA Rule in January 2025, requiring separate, specific parental consent before children's data is used for AI training.[7] The gap between what the law demands and what the industry practises is not narrowing. It is filling with compliance theatre.

The Waiver That Keeps Working

A new generation of AI-powered learning platforms has emerged — tools offering AI tutors, writing assistants, and automated behavioural monitoring. Many make substantive commitments: data not shared with third parties, not used to train models, and stored separately from model providers like OpenAI. But the commitments address what happens to data after collection. They do not address whether collection was authorised in the first place.

The consent mechanism, stated in vendor after vendor's documentation, follows a consistent pattern: the platform "can be used by students under the age of 13 as long as the school has a waiver in place where parents sign to give administrators the right to choose which tools students can use."[8] The waiver does not name any specific AI platform. It does not describe what data the platform collects. It does not mention artificial intelligence. It is the acceptable-use policy that school districts have maintained since the 1990s — a document written for internet filtering and computer lab rules, now serving as the consent mechanism for AI-powered data collection from children who cannot legally consent for themselves.

The vendor points to the school. The school points to the form. The form names no platforms, describes no data practices, and mentions no AI. The FTC has a word for this arrangement: illegal.

The AI Difference

Johannes Vermeer, "Girl Reading a Letter at an Open Window" (c. 1657). Gemäldegalerie Alte Meister, Dresden. She believes the letter is private. Public domain.

A student using an AI tutor does not submit a form. The student has a conversation — asking questions that reveal what they do not understand, what frightens them, what they cannot say out loud in a classroom of thirty. This is cognitive and emotional data, generated in the context of what feels to the child like a private exchange, collected under the authority of a form designed for internet filtering in 1998.

Then there is the monitoring. Many platforms disclose that "inappropriate messages sent to the AI are automatically flagged for administrator review."[9] Every message passes through an automated system that evaluates it against behavioural criteria and routes flagged content to administrators. Every question. Every vulnerable disclosure a student types because the chat window felt safer than raising a hand.
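
None of these vendors publish their flagging criteria, and the sketch below is not any vendor's code. It is a minimal Python illustration of the mechanic the disclosures describe: every message is scanned against a list of terms, and matches are copied into a queue an administrator can read. The term list, field names, and function names are assumptions made for clarity; the keywords "gay" and "lesbian" appear only because the Minneapolis case described below reports Gaggle flagging exactly those words.

```python
# Illustrative sketch only -- not any vendor's implementation.
# The term list, record fields, and routing are assumptions made for clarity.

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical criteria. "gay" and "lesbian" are included only because the
# Minneapolis case described below reports Gaggle flagging those keywords.
FLAG_TERMS = {"hurt myself", "hate my life", "gay", "lesbian"}

@dataclass
class FlaggedMessage:
    student_id: str
    text: str
    matched_terms: list[str]
    flagged_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

admin_review_queue: list[FlaggedMessage] = []

def screen_message(student_id: str, text: str) -> bool:
    """Scan one student message; queue it for administrator review if it matches."""
    hits = [term for term in FLAG_TERMS if term in text.lower()]
    if hits:
        admin_review_queue.append(FlaggedMessage(student_id, text, hits))
    return bool(hits)

screen_message("student-4417", "sometimes I hate my life and I can't say that out loud")
print(len(admin_review_queue))  # 1 -- the disclosure is now an administrative record
```

The matching logic is trivial; the consequence is not. A question typed into a chat window becomes a routed, timestamped record that the acceptable-use policy never mentioned.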

The surveillance extends beyond AI tutors. GoGuardian tracks approximately 27 million students across 11,500 schools. Gaggle monitors roughly 6 million in 1,500 districts. Only one in four teachers report that monitoring is limited to school hours.[10] In Minneapolis, an LGBTQ student was outed to their parents after Gaggle flagged keywords including "gay" and "lesbian" — the school did not talk to the student first.[11] In Lawrence, Kansas, a parent tried to opt their son out of Gaggle surveillance. The principal, deputy superintendent, and school board president all said it was not an option. The parent sued.[12]

A congressional investigation by Senators Markey and Warren found that surveillance companies "have not taken any steps to determine whether student activity monitoring software disproportionately targets students from marginalized groups."[13] The acceptable-use policy that authorised all of this describes none of it.

The Problem Is Not Only American

Quentin Matsys, "The Moneylender and His Wife" (1514). Musée du Louvre. One counts the coins. The other watches. Neither looks at the person who earned them. Public domain.

The consent architecture is rooted in U.S. law, but the pattern is global. The EU's GDPR requires explicit consent before processing children's data.[14] The UK's Age Appropriate Design Code requires privacy protections by default.[15] Canada's PIPEDA and provincial education laws impose parallel consent requirements. Australia's Privacy Act requires reasonable steps to ensure parental consent for children's data. Every jurisdiction has the law. Every jurisdiction has the enforcement gap.

The pattern scales because the vendors scale. A platform built in San Francisco, deployed in Charlotte, in Leeds, in Toronto, and in Melbourne, carries the same terms of service everywhere. The consent form changes language but not structure. The parent in Charlotte signs a form that says "digital learning tools." The parent in London signs one that says "online learning resources." Neither form names the AI platform. Neither describes the data pipeline.

What Parents and Schools Can Do

The legal framework gives parents more power than most realise. In the U.S., FERPA[16] grants parents the right to inspect and review their child's education records — schools must respond within 45 days. In the EU and UK, a Subject Access Request compels any data controller to provide a copy of all the personal data it holds within one month. Canada and Australia provide equivalent rights. Template request letters are available from the Student Data Privacy Project and PASEN.[17]

Parents everywhere should:

  • Request a complete list of every edtech tool used in their child's classroom and the data each tool collects.
  • File a data access request for copies of records held by third-party vendors.
  • Check Common Sense Media's privacy ratings — four out of five edtech applications still fail minimum privacy criteria.[18]
  • Ask five questions at back-to-school night: which platforms collect data from my child, what data they collect, who can see it, how long it is retained, and whether my child can be opted out of specific tools without losing access to all digital learning.

Schools are not the villain of this story. They are caught between vendors who shift compliance downward and parents who are never given the information to exercise their rights. But tools exist. The Student Data Privacy Consortium's National Data Privacy Agreement provides a free, state-aligned baseline contract for every vendor.[19] CoSN's Student Data Privacy Toolkit covers vetting, inventories, and governance.[20] At minimum, schools should conduct a complete edtech inventory, adopt the NDPA, notify parents individually when new tools are adopted mid-year, and provide a per-tool opt-out mechanism. The binary choice — all tools or no tools — is not required by law. Parents should be able to decline the surveillance tool without losing access to the gradebook.
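
A per-tool opt-out is also not technically demanding. The sketch below is a hypothetical illustration of the record-keeping involved: one consent entry per family per tool instead of a single blanket signature. The field names and example tools are assumptions, not the SDPC's or any district's actual schema.

```python
# Hypothetical per-tool consent registry -- an illustration of what a per-tool
# opt-out requires, not the SDPC's or any district's actual schema.

from dataclasses import dataclass

@dataclass
class ToolConsent:
    student_id: str
    tool_name: str        # e.g. "gradebook", "ai_tutor", "activity_monitor"
    data_collected: str   # plain-language description shown to the parent
    consented: bool       # recorded per tool, never per stack

def allowed_tools(records: list[ToolConsent], student_id: str) -> set[str]:
    """Return only the tools this student's parent explicitly approved."""
    return {r.tool_name for r in records if r.student_id == student_id and r.consented}

registry = [
    ToolConsent("student-4417", "gradebook", "grades and attendance", True),
    ToolConsent("student-4417", "ai_tutor", "full conversation logs", True),
    ToolConsent("student-4417", "activity_monitor", "all messages and browsing", False),
]

# Declining the surveillance tool does not forfeit the gradebook.
print(allowed_tools(registry, "student-4417"))  # gradebook and ai_tutor, not activity_monitor
```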

Open-source, self-hosted platforms — including Moodle, Canvas (open edition), Kolibri, and Sage.Education — demonstrate that the current architecture is not inevitable. When student data never leaves infrastructure the school controls, there is no third-party pipeline to authorise. This approach carries real limitations: it requires technical capacity many districts lack and ongoing maintenance underfunded IT departments cannot guarantee. But it proves that the consent problem is a design choice, not a technical constraint.

The Signature at the Cafeteria Table

The mother in Charlotte signed eleven forms in nine minutes. She does not remember the acceptable-use policy. It asked for nothing except a signature. It described nothing except a general right to provide digital tools. It was, in every functional sense, invisible: not because it was hidden, but because it was designed to disappear into a stack of forms that a parent signs in a cafeteria while a child asks to see the classroom.

Her daughter is seven. Her son is ten. Both are using AI-powered tools in their classrooms. Both are generating conversational data stored on servers she has never heard of, governed by privacy policies she has never been shown, under terms of service that place the burden of consent on a parent who was never given the information to provide it.

She signed a form. The industry calls it compliance. The FTC called it illegal three years ago. The form has not changed.


The views expressed are those of the editorial board and do not necessarily reflect the positions of any institution mentioned. Sage.Education is a product of Startr LLC and is cited in this article as one of several open-source alternatives; its inclusion represents a disclosure of interest, not an endorsement over the other platforms named. No individuals quoted in this article were interviewed; all quotes are from published sources. Full disclosure and transparency is a feature, not a bug.


Footnotes


  1. COPPA (Children's Online Privacy Protection Act): a federal law enacted in 1998 (15 U.S.C. §§ 6501–6506) that requires operators of websites and online services directed to children under 13 to obtain verifiable parental consent before collecting, using, or disclosing personal information from children. The FTC enforces COPPA through the COPPA Rule (16 CFR Part 312). ftc.gov/coppa. ↩︎

  2. Federal Trade Commission, "FTC Obtains Order Against Edmodo for Illegally Collecting Children's Personal Information and Using It for Advertising," May 2023. Case No. 202-3129. ftc.gov. The $6 million penalty included a requirement to delete models and algorithms built using improperly collected children's data. ↩︎

  3. Federal Trade Commission, "Policy Statement of the Federal Trade Commission on Education Technology and the Children's Online Privacy Protection Act," May 2022. Adopted unanimously, 5-0. The Commission stated: "Ed tech companies cannot pass compliance off to school administrators, parents, or others through contract provisions or terms of service." ftc.gov. ↩︎

  4. PowerSchool data breach, December 28, 2024. Approximately 62.4 million students and 9.5 million teachers affected. Matthew D. Lane, 19, was charged and sentenced to four years in federal prison. PowerSchool paid approximately $2.85 million in Bitcoin ransom. PowerSchool breach notice. K-12 Dive. ↩︎

  5. Attorney quote from class action filings against PowerSchool. HBS Law. ParentMap. ↩︎

  6. Shanahan v. IXL Learning, Inc., U.S. District Court for the Northern District of California, November 2024. The FTC filed an amicus brief (August 2024) explaining that COPPA does not create a broad agency relationship between parents and school districts. K-12 Dive. FTC amicus brief. ↩︎

  7. Federal Trade Commission, "Children's Online Privacy Protection Rule: Final Amendments," 16 CFR Part 312, finalized January 2025, compliance required by April 2026. The amendments explicitly classify AI training as a non-integral purpose requiring separate parental consent. ftc.gov. ↩︎

  8. Quoted language is representative of consent mechanisms found across multiple AI-powered edtech platforms' published guidance for schools deploying tools with students under 13. Vendors including Flint K-12, Khanmigo, MagicSchool, and others use variations of this phrasing. ↩︎

  9. Content monitoring disclosures are standard across AI-powered edtech platforms. Vendors typically state that inappropriate messages are "automatically flagged for administrator review" and that administrators can view student-AI interactions. These disclosures appear in vendor documentation, not in the acceptable-use policies parents sign. ↩︎

  10. GoGuardian monitors approximately 27 million students across 11,500 schools. Gaggle tracks roughly 6 million students in 1,500 districts. Only 1 in 4 teachers reported monitoring was limited to school hours (CDT, "Off Task," 2023). CDT report. New America. ↩︎

  11. Minneapolis (2021): an LGBTQ student at Roosevelt High School was outed after Gaggle flagged keywords including "gay" and "lesbian." CDT survey: "Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity as a result of digital activity monitoring." LGBTQ Nation. CDT: Chilling Effect. ↩︎

  12. Lawrence, Kansas, 2025. Principal Amy McAnarney, Deputy Superintendent Larry Englebrick, and School Board President Kelly Jones all confirmed opt-out was not an option. The parent filed suit. Lawrence KS Times. ↩︎

  13. Senators Edward Markey and Elizabeth Warren, congressional investigation of student monitoring tools (Gaggle, Bark, GoGuardian, Securly), 2022. Senator Warren report. ↩︎

  14. GDPR and children's data: the EU General Data Protection Regulation (2018) requires explicit, informed consent before processing children's personal data. Article 8 sets a default age of 16, with member states permitted to lower it to 13. gdpr.eu/children. ↩︎

  15. UK Age Appropriate Design Code (Children's Code): enacted 2021, enforced by the Information Commissioner's Office (ICO). Requires that digital services likely to be accessed by children provide privacy protections by default. Applies to any service accessible by children in the UK, regardless of where the company is headquartered. ico.org.uk. ↩︎

  16. FERPA (Family Educational Rights and Privacy Act): a federal law enacted in 1974 (20 U.S.C. § 1232g) that protects the privacy of student education records. Schools must accommodate inspection requests within 45 days. The Department of Education has stated that "schools cannot require parents or students to waive their FERPA rights through ed tech company's terms of service." studentprivacy.ed.gov. ↩︎

  17. Template FERPA request letters: Student Data Privacy Project at studentdataprivacyproject.com. PASEN step-by-step guide at pasen.org. ↩︎

  18. Common Sense Media, "The State of Kids' Privacy." Privacy ratings on 400+ edtech products. Finding: "four out of five edtech applications and services still do not meet minimum criteria to safeguard student privacy and data security." commonsense.org. ↩︎

  19. SDPC (Student Data Privacy Consortium): the National Data Privacy Agreement (NDPA), Version 2, developed by 28 state alliances. privacy.a4l.org. ↩︎

  20. CoSN (Consortium for School Networking): Student Data Privacy Toolkit covering vendor vetting, edtech inventories, and data governance. cosn.org. ↩︎