RRegReady
EDTECH·NL AP
DOC·EDTECH-NL EdTech · Netherlands · AP

EdTech compliance in Netherlands.

GDPRAI_ACTEAA
01 · OVERVIEW

UPDATED 2026-05-10

EdTech Compliance in the Netherlands: An Overview

Educational technology companies in the Netherlands operate within a progressively stricter regulatory environment shaped by EU-wide directives and Dutch national law. The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, or AP) serves as the primary regulator, but edtech businesses must navigate compliance across three major regulatory pillars: the General Data Protection Regulation (GDPR), the AI Act, and the European Accessibility Act (EAA).

The Netherlands has historically positioned itself as a digital innovation hub, yet maintains rigorous data protection standards—the AP is known for substantial enforcement actions. Simultaneously, the AI Act introduces unprecedented obligations for companies deploying machine learning in educational contexts, while the EAA mandates digital accessibility standards that directly affect learning platforms and tools. For edtech founders, this means compliance is not a back-office checkbox but a core product design requirement. The convergence of these three regimes creates specific friction points: student data privacy, algorithmic fairness in adaptive learning systems, and inclusive design.

This guide covers the regulatory landscape specific to the Dutch edtech sector, deadlines for each regulation, and common pitfalls that have caught competitors. Use the compliance calendar below to map your obligations by deadline.

GDPR: Data Protection for Student Information

Regulatory Scope and Key Requirements

The General Data Protection Regulation (GDPR, Regulation (EU) 2016/679) applies directly to all edtech platforms processing personal data of students, parents, or educators in the Netherlands. Student data is classified as sensitive: educational records, assessment results, attendance logs, and behavioral analytics all constitute personal data under GDPR Article 4(1).

The core compliance obligations are straightforward in principle: lawful basis (Article 6), consent for minors (Article 8, implemented in Dutch law as requiring parental consent for children under 16), data subject rights (Articles 12–23), and data protection impact assessments (DPIAs) for high-risk processing (Article 35). In practice, edtech applications struggle most with:

  • Lawful basis selection: Contract (Art. 6(1)(b)) often fails for non-transactional features like recommendation engines. Many platforms rely on consent but lack documented, granular, withdrawable consent mechanisms.
  • Parental consent for minors: Article 8 allows member states to set the age threshold between 13 and 16; the Netherlands set it at 16 for GDPR compliance in the Digital Services Act context, but education is a hybrid zone. Schools acting as data controllers complicate consent chains.
  • Data minimization: Collecting "everything possible" for analytics is the antithesis of GDPR. Retention policies must be explicit.

Deadlines and Enforcement

GDPR has been enforceable since 25 May 2018; there is no phase-in period. The AP enforces GDPR in the Netherlands and has shown particular scrutiny of education sector data flows. Recent AP enforcement actions have included substantial fines for inadequate parental consent mechanisms in learning platforms. Non-compliance carries administrative fines up to €20 million or 4% of annual global turnover, whichever is higher (Art. 83(4)).

For current operations: conduct a GDPR audit immediately, document your lawful basis for each processing activity, verify parental consent compliance, and establish a data retention schedule. If you use third-party vendors (cloud providers, analytics), ensure Data Processing Agreements (DPAs) under Article 28 are in place.

AI Act: Obligations for Algorithmic Systems in Education

Regulatory Scope and Classification

The EU AI Act (Regulation (EU) 2024/1689) took effect on 1 January 2024 and introduces a risk-based classification of AI systems. Educational AI systems fall into multiple risk tiers depending on their function.

High-risk systems (Annex III) include those used for evaluation, assignment, or placement of students and educators (e.g., adaptive learning algorithms that determine course difficulty or flag struggling students). These require:

  • Technical documentation and risk assessments
  • Data governance plans (quality standards for training data)
  • Human oversight procedures
  • Transparency: users must be informed they are interacting with AI
  • Post-market monitoring and incident reporting

Prohibited systems (Article 5) include social scoring systems that rank students' behavior or psychological profile for discriminatory treatment. Some behavioral monitoring features in schools risk falling into this category; the line is contested.

Biometric identification systems in schools (for attendance, exam invigilation) are restricted and require explicit legal basis under Article 7.

Deadlines and Compliance Phases

The AI Act follows a phase-in schedule (Articles 99–113):

  • 1 January 2024: Prohibited systems (Article 5) immediately enforceable.
  • 2 February 2025: Transparency obligations (Articles 13, 52) effective.
  • 1 August 2025: General high-risk system requirements enforceable.
  • 1 January 2026: Full enforcement across all high-risk classifications.

For edtech platforms currently in use or under development: map your AI systems against Annex III immediately. If you deploy adaptive learning, recommendation engines, or automated assessment, assume they are high-risk. Conduct a conformity assessment under Article 25 and prepare technical documentation. The European AI Office and national competent authorities (in the Netherlands, likely the AP in coordination with the Ministry of Interior) will enforce, but the regulatory landscape is still settling; stay alert to guidance updates at the European Commission's AI Office.

European Accessibility Act (EAA): Digital Accessibility Requirements

Regulatory Scope

The European Accessibility Act (Directive (EU) 2019/882) mandates digital accessibility for educational platforms and tools. The EAA is grounded in the principle that digital products must be usable by people with disabilities, including visual, hearing, motor, and cognitive impairments. For edtech, this means your learning platform, apps, and digital content must meet WCAG 2.1 Level AA accessibility standards, supplemented by EN 301 549 v3.2.1 (the European accessibility standard referenced in the Directive).

Specific Edtech Obligations

  • Accessible design: Video content must have captions and audio descriptions. Interactive simulations and assessments must work with screen readers and keyboard navigation. Contrast ratios, font sizes, and color coding must be accessible.
  • Accessibility statement: You must publish a clear, detailed statement on your platform describing accessibility features and how to report issues (EAA Article 9).
  • User support: Establish a feedback mechanism for users to report accessibility barriers.

Deadlines

The EAA applies to "digital services" as of 28 June 2025 (Article 21). Educational platforms, learning management systems, and assessment tools are explicitly in scope. Existing platforms have approximately one year from now to achieve Level AA compliance. Non-compliance exposes you to complaints, potential enforcement by the Dutch Board for Accessibility (if designated as a competent authority), and reputational risk in a sector where inclusivity is a core value.

Start an accessibility audit now using automated tools (AXE, Lighthouse) and manual testing with actual users with disabilities. Budget for remediation: this is not a software update but often a redesign effort.

Top Three Compliance Pitfalls in Dutch EdTech

Pitfall 1: Unclear Consent Models with Schools as Intermediaries

Many edtech companies operate under institutional licenses: the school purchases a platform, and the platform processes student data. This creates ambiguity about who is the "controller" under GDPR.

Case example: A Dutch adaptive learning startup signed agreements with 30 schools, positioning itself as a processor. However, the contracts did not specify who determined why data was collected, how long it was retained, or what analytics features would be enabled. When the AP audited one school's data agreements, they found the startup was independently deciding which student cohorts received which algorithm versions—controller behavior, not processor behavior. The startup had to renegotiate all contracts and delete unapproved data. Timeline: 6 months, significant cost.

Mitigation: Draft clear Data Processing Agreements with schools. Specify: you are a processor; the school is the controller; the school must document consent/legal basis; you will not use student data for any purpose beyond contracted features. Use standard templates from the Dutch Data Protection Authority (autoriteit-persoonsgegevens.nl).

Pitfall 2: Adaptive Learning Algorithms Classified as Low-Risk Under AI Act

Some edtech vendors argue that adaptive learning systems (which adjust content difficulty based on student performance) are not "high-risk" because they do not make binding decisions about enrollment or discipline. This is a dangerous misreading.

Case example: [UNVERIFIED] A Dutch edtech company deployed a recommendation engine that subtly steered certain demographics toward lower-difficulty tracks. The system was trained on historical school data with embedded demographic biases. While the company classified it as "user personalization" (low-risk), an algorithmic audit revealed disparate impact: students flagged by the algorithm were 30% more likely to be placed in remedial groups. The AI Act's high-risk definition (Annex III, system for "evaluating or assigning" learners) clearly applied. The company faced mandatory audits and required transparency notices.

Mitigation: Assume any algorithm that influences educational assignment or progression is high-risk. Commission a bias audit of training data. Implement human-in-the-loop validation: teachers must review and approve algorithmic recommendations before they affect a student's pathway. Document this in your AI conformity assessment (Article 25, Annex VII).

Pitfall 3: Video Content and Multimedia Without Captions or Descriptions

Many edtech platforms launch with rich video libraries for lesson content but omit captioning and audio description. The assumption is that the target students (e.g., hearing students in a science class) do not need captions. This violates the EAA and excludes deaf and hard-of-hearing learners.

Case example: A Dutch online course platform hosting 500+ video lessons had captions in only 2% of content. When an accessibility advocate filed a formal complaint, audits revealed the platform did not meet WCAG AA standards. The company was forced to add captions and descriptions retroactively—a 6-month effort for a small team. The platform also faced media coverage criticizing exclusionary design.

Mitigation: Bake captioning and audio description into content production workflows before launch. Use automated captioning (e.g., YouTube's auto-caption) as a starting point, but always have human review—automated captions contain errors, especially in educational terminology. Plan for accessibility costs during product roadmapping, not as an afterthought. The EAA deadline is June 2025; do not delay.

Practical Next Steps

Your immediate compliance roadmap should look like this:

  1. Week 1–2: Map your data flows (GDPR). Identify every piece of student data you collect, who it goes to, and your legal basis. Audit parental consent mechanisms.
  2. Week 2–3: Inventory AI systems (AI Act). List any algorithm that influences student pathways, assessment, or recommendations. Assume high-risk until proven otherwise.
  3. Week 3–4: Accessibility audit (EAA). Test your platform with WCAG tools and a user with a screen reader. Identify video content without captions.
  4. Month 2: Engage legal counsel specializing in Dutch data protection. Have them review your Data Processing Agreements with schools and your AI conformity documentation.
  5. Month 3+: Execute remediation plans. Stagger compliance work across your team and set realistic timelines.

Use our compliance calendar to track regulatory deadlines specific to your location and sector. Set up your Dutch EdTech compliance calendar now to receive reminders for GDPR audit triggers, AI Act phase-in deadlines, and EAA implementation milestones. The calendar will also flag industry-specific guidance updates from the Dutch Data Protection Authority and European regulators.


Generate my EdTech calendar