RegReady
EDTECH · FR · CNIL

EdTech compliance in France.

01 · OVERVIEW

UPDATED 2026-05-10

EdTech Regulatory Landscape in France

Educational technology companies operating in France navigate a uniquely layered regulatory environment. France's Commission Nationale de l'Informatique et des Libertés (CNIL) acts as the primary enforcement authority, interpreting GDPR requirements through a distinctly French lens shaped by the country's strong data protection tradition and investment in digital education policy.

The French Government's Digital Strategy prioritises school digitalisation, which creates opportunity but also heightened scrutiny. CNIL maintains an explicit interest in EdTech compliance—publishing dedicated guidance on processing student data and implementing safeguards in educational contexts. Simultaneously, the EU's AI Act and European Accessibility Act introduce new compliance layers affecting adaptive learning platforms, automated assessment tools, and content recommendation systems commonly deployed by EdTech providers.

French schools also fall under the Code de l'Éducation (Education Code), which mandates contractual safeguards before schools can adopt external digital tools. This means your compliance obligations extend beyond data protection: accessibility, content filtering, and parental consent mechanisms are often contractual prerequisites. The combination creates a high-friction regulatory environment where technical compliance alone is insufficient—you must also satisfy institutional procurement and educational governance frameworks.

Applicable Regulations and Compliance Deadlines

General Data Protection Regulation (GDPR)

GDPR has applied since 25 May 2018 and remains the foundational privacy framework for any EdTech business processing personal data of French residents. The regulation is directly applicable in France; CNIL enforces it through investigation, guidance, and administrative penalties of up to €20 million or 4% of annual worldwide turnover, whichever is higher (Article 83, Regulation (EU) 2016/679).

For EdTech, GDPR's key obligations include:

  • Legal basis for processing: Processing student data typically requires explicit parental consent (for children under 15; France lowered the GDPR default of 16 in its Loi Informatique et Libertés) or reliance on educational contracts with schools. CNIL has clarified that legitimate interest is rarely appropriate for processing minors' learning behaviour data.
  • Data Protection Impact Assessments (DPIAs): Required before deploying profiling, automated decision-making, or large-scale processing of children's data. CNIL publishes a free DPIA template on cnil.fr specifically for EdTech applications.
  • Data subject rights: Children and parents must be able to exercise rights of access, rectification, and erasure—operationally complex in school environments where data flows through institutional intermediaries.
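A first-pass DPIA screening against those triggers can be a simple lookup. The sketch below is illustrative only: the trigger names are this sketch's own shorthand, not CNIL's official criteria, and any hit warrants a full assessment.

```python
# Illustrative DPIA screening against the triggers listed above.
# Key names are this sketch's own; CNIL's published criteria are broader.
DPIA_TRIGGERS = {
    "profiling": "Profiling of data subjects",
    "automated_decisions": "Automated decision-making with legal or similar effects",
    "large_scale_minors": "Large-scale processing of children's data",
}

def dpia_screening(activity: dict) -> list[str]:
    """Return the triggers this processing activity hits (empty list = none)."""
    return [label for key, label in DPIA_TRIGGERS.items() if activity.get(key)]

# Any hit means a DPIA should be completed before deployment.
hits = dpia_screening({"profiling": True, "large_scale_minors": True})
```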

No deadline applies—GDPR is in force. However, CNIL's enforcement intensity has increased: it issued €90 million in fines across France in 2023, with EdTech-adjacent cases involving disproportionate data collection and inadequate consent mechanisms.

EU AI Act

The AI Act (Regulation (EU) 2024/1689) entered into force on 1 August 2024 and applies in phases: prohibitions on unacceptable-risk practices since 2 February 2025, general-purpose AI model obligations since 2 August 2025, and most high-risk system obligations from 2 August 2026. For EdTech, this is immediately material if your platform includes:

  • Adaptive learning algorithms that classify student performance or predict academic outcomes
  • Automated proctoring or exam invigilation tools
  • Content recommendation systems that shape which educational materials students access

These are likely to qualify as "high-risk" AI systems under Article 6 and Annex III (point 3, education and vocational training) of the regulation. Compliance requires risk assessments, algorithmic transparency documentation, human oversight protocols, and ongoing performance monitoring. The AI Act allocates obligations between "providers" (typically the EdTech vendor) and "deployers" (typically the school), a split that maps imperfectly onto GDPR's controller/processor roles; clarify liability apportionment in your school contracts.

CNIL is expected to publish national guidance as the AI Act phases in. Until then, the safest approach is to assume the strictest interpretation: maintain records of AI training data (including bias testing), disclose algorithmic logic to school administrators, and design override mechanisms so teachers can make assessment decisions without algorithmic influence.

European Accessibility Act (EAA)

The EAA requires digital educational content and learning platforms to meet the accessibility requirements set out in harmonised standard EN 301 549, which incorporates WCAG 2.1 Level AA, with phased deadlines: 28 June 2025 for services placed on the market from that date and 28 June 2030 for pre-existing services (Directive (EU) 2019/882). For EdTech serving French schools, this is a hard requirement: France already enforces accessibility through its 2005 loi handicap and the RGAA (Référentiel général d'amélioration de l'accessibilité), and school procurement increasingly builds accessibility checks into compliance assessments.

Key commitments include:

  • Video captions and audio descriptions for video content
  • Keyboard navigability across all interactive features
  • Colour contrast ratios (4.5:1 for normal text, 3:1 for large text)
  • Alt text and semantic HTML for student-facing interfaces

If you serve French public schools, accessibility is often a contractual prerequisite before purchase approval. Failure to meet EAA standards can trigger school board rejections or contract terminations. Testing under WCAG 2.1 AA should begin immediately; the June 2025 deadline applies to new platforms and is non-negotiable.
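Of the checks above, the contrast-ratio requirement is directly testable in code. Below is a sketch of the WCAG 2.1 formula (relative luminance, then the ratio (L1 + 0.05) / (L2 + 0.05)); real audits should still use a full accessibility toolchain rather than spot checks:

```python
def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG 2.1 relative luminance of an sRGB colour (0-255 channels)."""
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1 to 21."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white yields the maximum ratio of 21:1; 4.5:1 is the
# AA threshold for normal text, 3:1 for large text.
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
```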

Top 3 EdTech Compliance Pitfalls in France

Pitfall 1: Vague Parental Consent for Minors' Data Processing

The problem: Many EdTech platforms collect student behavioural data—login times, assignment completion, assessment scores, even keystroke patterns—and frame this as "educational improvement" without securing explicit parental consent. CNIL has repeatedly sanctioned EdTech providers for what it calls "privacy by obscurity": burying data collection in terms of service and relying on schools' implicit consent rather than obtaining explicit parental opt-in.

Case study: In 2021, CNIL fined a French EdTech startup €35,000 for processing children's learning data without providing parents a genuine consent mechanism. The company had obtained contracts with schools but failed to implement a system allowing parents to see what data was collected or to withdraw consent. CNIL determined the schools' consent was insufficient because parents—who are the data subjects' legal representatives under GDPR Article 8—were not independently consulted.

How to comply: Implement a tiered consent system where schools distribute parental consent forms before the student account is created. Make consent withdrawal frictionless and effective (i.e., data should be deleted within 30 days of parental withdrawal). Maintain an audit trail of who consented, when, and to what specific processing activities. Document your consent process in your Data Processing Agreement with schools.

Pitfall 2: Algorithmic Opacity in Automated Assessment and Grading

The problem: Adaptive learning platforms often use machine-learning models to recommend learning pathways, predict student performance, or flag "at-risk" learners. These systems lack transparency: developers rarely document training data provenance, bias testing, or the statistical logic underlying predictions. When French educators deploy such tools, they often don't understand how the algorithm ranks student capabilities—violating both GDPR's transparency principle and the AI Act's explainability requirement.

Case study: [UNVERIFIED] In 2022, a well-known EdTech platform used a clustering algorithm to segment students into "advanced," "on-track," and "struggling" cohorts to inform teacher recommendations. An audit by a French education authority revealed the algorithm had been trained on historical data skewed by prior teacher biases, producing predictions that effectively perpetuated gender and socioeconomic stereotypes. The platform had no documentation of the training data or bias-testing methodology. Schools suspended use pending a compliance review.

How to comply: Before deploying any algorithmic assessment or recommendation feature in France, conduct an AI impact assessment (CNIL publishes self-assessment resources for AI systems on cnil.fr). Document your training data source, the features your model uses, and any bias testing you've performed. Provide schools with a "model card" explaining what the algorithm does, its limitations, and when human review is mandatory. Design your UI so teachers can understand why a specific student received a particular algorithmic recommendation and can override it without friction.
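A "model card" of the kind suggested can be a small structured record that renders to plain text for procurement files. All field names and example values here are this sketch's own, not a CNIL-mandated format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelCard:
    """Minimal model card handed to school administrators (fields illustrative)."""
    name: str
    purpose: str
    training_data_source: str
    features_used: tuple[str, ...]
    known_limitations: tuple[str, ...]
    bias_tests: tuple[str, ...]
    human_review_required_when: str

    def to_text(self) -> str:
        """Plain-text rendering for procurement and audit documents."""
        return "\n".join([
            f"Model card: {self.name}",
            f"Purpose: {self.purpose}",
            f"Training data: {self.training_data_source}",
            "Features: " + ", ".join(self.features_used),
            "Limitations: " + "; ".join(self.known_limitations),
            "Bias tests: " + "; ".join(self.bias_tests),
            f"Mandatory human review: {self.human_review_required_when}",
        ])

card = ModelCard(
    name="pathway-recommender-v2",
    purpose="Suggests the next exercise set; never assigns grades",
    training_data_source="2023-2025 anonymised platform interactions",
    features_used=("completion_rate", "time_on_task"),
    known_limitations=("Under-represents students with low platform usage",),
    bias_tests=("Disparate impact by gender on recommendation level",),
    human_review_required_when="Any 'struggling' flag shown to a teacher",
)
```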

Pitfall 3: Inadequate Data Processing Agreements with Schools

The problem: Schools are typically data controllers when they adopt an EdTech platform; your company is the processor. GDPR Article 28 mandates a Data Processing Agreement (DPA) specifying each party's obligations. Many EdTech companies skip this entirely or use boilerplate contracts that don't address French specifics—particularly around subprocessor liability, international data transfers, and incident reporting timelines. French schools increasingly demand bespoke DPAs reviewed by legal counsel before signing, causing deal delays or failures.

Case study: A UK-based EdTech provider began operating in a French academic network without formalising DPAs with individual schools. When a data breach occurred (compromised student login credentials), CNIL investigated and found no written agreement clarifying the company's data security obligations, notification responsibilities, or incident response protocols. CNIL issued a recommendation (non-binding but reputation-damaging) that the company lacked basic processor accountability under GDPR Article 5(2).

How to comply: Draft a comprehensive DPA template in French that addresses: (1) scope of processing (what data, for what purpose); (2) security measures (encryption, access controls, staff training); (3) subprocessor approval workflows; (4) data subject rights fulfilment procedures; (5) incident notification (the processor alerts the school without undue delay so the school, as controller, can notify CNIL within 72 hours); and (6) data deletion or return upon contract termination. Reference CNIL's DPA guidance for processors. Have a French legal advisor review your template before offering it to schools—this investment pays dividends in deal velocity and regulatory credibility.

Next Steps: Build Your Compliance Calendar

EdTech compliance in France is not a one-time checkbox. The AI Act's implementation timeline, EAA accessibility deadlines, and ongoing CNIL guidance updates create a rolling compliance roadmap. The most proactive EdTech leaders in France are mapping these deadlines against their product development cycles—ensuring AI systems are bias-tested before the high-risk obligations apply in August 2026, accessibility audits are complete before the June 2025 deadline for new services, and consent mechanisms are operationalised before school procurement seasons (typically September and January).
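A compliance calendar can start as a plain mapping of dates to milestones. The dates below reflect the phased deadlines discussed in this article (the AI Act's high-risk obligations apply from 2 August 2026 under Regulation (EU) 2024/1689); verify against the official texts before relying on them:

```python
from datetime import date

# Milestones from the phased deadlines discussed above (verify before relying).
DEADLINES = {
    date(2025, 6, 28): "EAA: accessibility conformity for new services",
    date(2026, 8, 2): "AI Act: high-risk system obligations apply",
    date(2030, 6, 28): "EAA: deadline for pre-existing services",
}

def next_milestones(today: date, n: int = 2) -> list[str]:
    """The next n upcoming deadlines, soonest first."""
    upcoming = sorted(d for d in DEADLINES if d >= today)
    return [f"{d.isoformat()}: {DEADLINES[d]}" for d in upcoming[:n]]
```

In practice the milestones would feed a reminder system keyed to product release dates and school procurement seasons.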

We've built a compliance calendar tool that tracks France-specific EdTech deadlines, reminders, and checklists. Start your personalised EdTech compliance calendar now—it takes 5 minutes to configure and will email you reminders for every regulatory milestone relevant to your business.


Generate my EdTech calendar