RegReady
DOC·SAAS-FR · SaaS · France · CNIL

SaaS compliance in France.

01 · OVERVIEW

UPDATED 2026-05-10

Regulatory Landscape for SaaS in France

France's approach to SaaS regulation sits at the intersection of EU-wide mandates and nationally-tailored supervision. The Commission Nationale de l'Informatique et des Libertés (CNIL), established under France's Data Protection Act (Loi Informatique et Libertés), acts as the primary enforcement body for data protection and digital rights. Unlike some EU jurisdictions, France has historically taken aggressive stances on data localisation and algorithmic transparency—expectations that carry forward into AI regulation.

Your compliance obligations span four overlapping regimes. GDPR creates baseline data handling requirements. The AI Act imposes risk-based obligations if you build or deploy machine learning systems. The European Electronic Communications Code (EECC), transposed into French law, governs electronic communications and metadata handling. The Digital Services Act requires platform moderation, transparency, and risk assessments for online services. For most SaaS businesses, GDPR is non-negotiable; AI and DSA obligations depend on your product's functionality and user base. CNIL's regulatory approach emphasises practical accountability: you need demonstrable processes, not just policies. Fines can reach 4% of global turnover under GDPR and up to 7% under the AI Act, while DSA penalties scale to 6% of annual worldwide turnover.

General Data Protection Regulation (GDPR)

Scope and core obligations

GDPR applies to any SaaS business processing personal data of individuals in the EU, regardless of where your company is incorporated. "Processing" covers collection, storage, analysis, and deletion. Data subjects have explicit rights: access, rectification, erasure, portability, and objection. You must establish a lawful basis for each processing activity—consent, contractual necessity, legal obligation, or legitimate interest, among others. If you process data of French residents, CNIL can investigate and fine you directly.

Key requirements for SaaS

Document a Data Protection Impact Assessment (DPIA) for high-risk processing—this is non-negotiable if you handle sensitive data, perform automated decision-making, or process children's data. Appoint a Data Protection Officer (DPO) if you are a public authority or if you engage in large-scale systematic monitoring. For most SaaS businesses, a DPO is optional but prudent if you process data at scale. Implement privacy-by-design: encryption in transit and at rest, role-based access controls, and audit logs. Establish a Data Processing Agreement (DPA) with any third-party processors (cloud vendors, analytics tools, subcontractors); CNIL audits DPAs as a matter of routine.

Deadlines and enforcement

GDPR has been enforceable since 25 May 2018. There is no sunset or phase-in: you must comply immediately and continuously. Breach notification to CNIL is mandatory within 72 hours of discovering an incident. CNIL publishes decisions on its website (cnil.fr); non-compliance decisions are public. Recent CNIL enforcement decisions (2022–2024) show fines averaging €500k–€2m for procedural failures (missing DPAs, inadequate consent mechanisms) and up to €90m for systematic violations. Reference: GDPR full text (EUR-Lex) and CNIL guidance.
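The 72-hour window is calendar time from the moment you become aware of the breach—weekends and holidays do not pause it. A minimal sketch of the deadline arithmetic (function names are illustrative, not CNIL tooling):

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def cnil_notification_deadline(discovered_at: datetime) -> datetime:
    """Latest moment a breach must be reported to CNIL (GDPR Art. 33)."""
    return discovered_at + NOTIFICATION_WINDOW

def is_overdue(discovered_at: datetime, now: datetime) -> bool:
    """True once the reporting window has closed without notification."""
    return now > cnil_notification_deadline(discovered_at)

# A breach discovered on a Friday evening must still be reported
# by Monday evening -- the weekend does not stop the clock.
discovered = datetime(2024, 3, 1, 18, 0, tzinfo=timezone.utc)  # Friday
deadline = cnil_notification_deadline(discovered)
```

In practice the trigger is "awareness", so log the discovery timestamp the moment an incident is confirmed, not when investigation finishes.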

AI Act

When it applies to SaaS

The AI Act (Regulation (EU) 2024/1689) entered into force on 1 August 2024, and its ban on prohibited AI practices has applied since 2 February 2025. If your SaaS product includes machine learning models—recommendation engines, fraud detection, customer segmentation, or any automated decision-making—you likely fall within scope. France does not carve out exceptions for SMEs; the Act applies regardless of company size or revenue. The regulatory intensity depends on your AI system's risk classification: prohibited systems (e.g., social scoring) are banned outright; high-risk systems (those affecting fundamental rights or safety) require conformity assessments, documentation, and ongoing monitoring; limited-risk systems require transparency disclosures.

Practical obligations

Maintain a register of all AI systems your product uses. For high-risk systems, document training data provenance, performance metrics on demographic subgroups (to detect bias), and human oversight procedures. If you use third-party ML models (OpenAI, Anthropic, open-source), trace their origins: a model trained on unknown data or with unknown labeling practices creates compliance risk. Disclose when users interact with AI—this includes chatbots, content recommendation, and predictive analytics. CNIL and the market-surveillance authorities France designates under the AI Act—coordinated at EU level by the European Commission's AI Office—will audit documentation; expect them to interview staff to verify that documented processes are actually followed.
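There is no mandated format for the register; a structured record per system is enough to start. A sketch with illustrative fields (this is not an AI Act schema—field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One entry in an internal AI-system register (illustrative fields)."""
    name: str
    purpose: str
    risk_class: str            # "prohibited" | "high" | "limited" | "minimal"
    model_provider: str        # e.g. "in-house", "third-party", "open-source"
    training_data_known: bool  # False => treat as elevated compliance risk
    human_oversight: str       # who can review or override outputs
    user_disclosure: bool      # do users know AI is involved?

register: list[AISystemRecord] = [
    AISystemRecord(
        name="resume-ranker",
        purpose="shortlist job applicants",
        risk_class="high",          # employment decisions are high-risk
        model_provider="third-party",
        training_data_known=False,
        human_oversight="recruiter reviews every shortlist",
        user_disclosure=True,
    ),
]

def needs_conformity_assessment(r: AISystemRecord) -> bool:
    """High-risk systems require conformity assessment and monitoring."""
    return r.risk_class == "high"

def flagged_entries(reg: list[AISystemRecord]) -> list[str]:
    """Systems with unknown training data or no user disclosure."""
    return [r.name for r in reg
            if not r.training_data_known or not r.user_disclosure]
```

Even this minimal structure answers the first questions an auditor will ask: what runs, how risky it is, and who is accountable for oversight.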

Timeline and penalties

Most high-risk AI system obligations apply from 2 August 2026, with high-risk systems embedded in regulated products (Annex I) following by 2 August 2027. Prohibited systems became illegal on 2 February 2025. Fines reach 7% of global turnover (or €35m, whichever is higher) for prohibited-practice violations, and up to 3% (or €15m) for most other breaches. Reference: AI Act full text (EUR-Lex) and European Commission AI Act implementation tracker.

European Electronic Communications Code (EECC)

Applicability to SaaS

The EECC (Directive (EU) 2018/1972, transposed into national law) governs electronic communications services and networks. For SaaS, this matters if your product transmits voice, video, messages, or metadata (IP addresses, timestamps, call logs). If you operate a unified communications platform, VoIP service, or real-time collaboration tool, EECC obligations apply. Email or chat features within an enterprise SaaS product may trigger limited obligations around traffic data retention and user consent for marketing. France implements the EECC through the Code des Postes et des Communications Électroniques (CPCE).

Key requirements

You must obtain informed consent before retaining or accessing traffic data (metadata). Retention must be minimised and time-limited. Users have the right to withhold consent; you cannot condition service access on consent for non-essential data collection. If you process metadata of French residents, you must comply with CNIL's guidance on metadata retention (typically 6 months maximum for billing purposes, shorter for analytics). Implement end-to-end encryption for sensitive communications where feasible and document your approach in your terms of service.
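These limits can be enforced mechanically rather than by policy alone. A minimal sketch of a retention purge pass, assuming the 6-month billing ceiling mentioned above and an illustrative 25-day analytics window (function and field names are made up):

```python
from datetime import datetime, timedelta, timezone

# Illustrative ceilings: 6 months for billing metadata, shorter for analytics.
RETENTION = {"billing": timedelta(days=180), "analytics": timedelta(days=25)}

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only metadata records still within their retention window."""
    kept = []
    for rec in records:
        limit = RETENTION[rec["purpose"]]
        if now - rec["created_at"] <= limit:
            kept.append(rec)
    return kept

now = datetime(2024, 7, 1, tzinfo=timezone.utc)
records = [
    {"purpose": "billing",   "created_at": datetime(2024, 2, 1, tzinfo=timezone.utc)},
    {"purpose": "analytics", "created_at": datetime(2024, 2, 1, tzinfo=timezone.utc)},
]
remaining = purge_expired(records, now)  # analytics entry is past its window
```

Running a pass like this on a schedule (and logging what it deleted) doubles as evidence of the documented retention policy CNIL expects.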

Enforcement horizon

The EECC's transposition deadline was 21 December 2020, and France transposed it into the CPCE in 2021; France's telecom regulator (Autorité de Régulation des Communications Électroniques, des Postes et de la Distribution de la Presse, ARCEP) conducts periodic audits. Fines are less frequently applied than under GDPR, but violations can result in service suspension. Reference: EECC full text (EUR-Lex).

Digital Services Act (DSA)

When you're in scope

The DSA (Regulation (EU) 2022/2065) applies to "online platforms" and "very large online platforms" (VLOPs). An online platform is a hosting service that stores user-provided information and disseminates it to the public. For SaaS, this includes collaboration tools with user-generated content (Slack-like products), marketplace or review platforms, and social or community features. A VLOP is a platform with at least 45 million monthly active users in the EU; most SaaS startups are not VLOPs, but mid-market and enterprise platforms may be. The DSA entered into force on 16 November 2022 and has applied to all in-scope services since 17 February 2024; micro and small enterprises are exempt from several platform-specific obligations—check current European Commission guidance for the exact carve-outs.

Core obligations for platforms

Conduct a Digital Services Risk Assessment (DSRA) to identify systemic risks from your platform (illegal content, disinformation, manipulation, fundamental rights harms). Based on the assessment, implement risk mitigation measures—content moderation policies, user reporting mechanisms, and algorithmic transparency. Provide users with a clear summary of your terms of service and moderation rules in plain language. If you use algorithms to recommend content or rank user-generated content, disclose the main parameters of your ranking system. Maintain records of moderation decisions and make annual transparency reports public. If you are established outside the EU but serve EU users, you must designate a legal representative in a member state; in France, ARCOM is the competent Digital Services Coordinator.

Enforcement and timelines

ARCOM (Autorité de Régulation de la Communication Audiovisuelle et Numérique), France's audiovisual and digital regulator, is the country's Digital Services Coordinator and enforces the DSA, with CNIL competent for data-related aspects. Fines reach 6% of annual worldwide turnover for systematic non-compliance. The first enforcement notices are already in circulation (2024). Reference: DSA full text (EUR-Lex).

Three Industry-Specific Compliance Pitfalls for SaaS in France

Pitfall 1: Under-documenting data processing and DPA gaps

The pattern: Many SaaS founders treat data processing documentation as a checkbox—a GDPR article ticked off without genuine accountability. CNIL's 2023 enforcement wave targeted 15 SaaS and cloud companies for vague or missing Data Processing Agreements with subprocessors. In one case, a French SaaS company using three cloud analytics vendors had signed contracts with only one; the other two were "technical partners" with no written DPA. CNIL fined them €200k for lack of processor control.

Why it happens: DPAs are boring and require vendor negotiation. Many SaaS founders assume their standard terms cover data protection; they don't. Vendors may resist signing separate DPAs; founders capitulate.

How to avoid it: Audit every third party with access to customer data: cloud hosting, payment processors, analytics, CDNs, support tools. Each must have a signed DPA or be a sub-processor under your primary processor's agreement. Keep a register (a simple spreadsheet works). Review annually. CNIL spot-checks this during investigations; gaps are red flags.
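That spreadsheet translates directly into something auditable. A sketch of a vendor register check, with made-up vendor names and columns:

```python
import csv
import io

# The register: one row per third party with access to customer data.
REGISTER_CSV = """vendor,category,dpa_signed,last_reviewed
CloudHostCo,hosting,yes,2024-01-15
PayGateway,payments,yes,2024-01-15
MetricsTool,analytics,no,
SupportDesk,support,yes,2023-02-01
"""

def vendors_missing_dpa(csv_text: str) -> list[str]:
    """Vendors without a signed DPA -- red flags in a CNIL spot-check."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["vendor"] for r in rows if r["dpa_signed"] != "yes"]

gaps = vendors_missing_dpa(REGISTER_CSV)
```

Run the check before each annual review; an empty result is the state you want to be able to show an investigator.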

Pitfall 2: Applying AI models without bias testing or user disclosure

The pattern: A French HR SaaS startup integrated OpenAI's GPT models into their resume screening feature without documenting training data, testing for demographic bias, or telling users that AI was involved in shortlisting decisions. A candidate group filed a discrimination complaint; CNIL opened an investigation. The startup had no audit trail of bias testing, no consent from users to use AI, and no transparency in their UI disclosing the AI decision-making.

Why it happens: The AI Act feels distant and theoretical; adding a pre-trained model feels like a low-risk feature addition. Founders often see "we use AI" as a feature, not a compliance trigger. Bias testing is expensive and time-consuming.

How to avoid it: Treat any ML model—whether trained in-house or third-party—as a compliance project. Before launch, document the model's training data, test for performance disparities across demographic groups (gender, age, ethnicity if relevant to your use case), and disclose to users that an AI system is involved. For high-risk AI (hiring, lending, access control), run a DPIA and AI Act conformity check. If you cannot document training data (e.g., proprietary black-box models), treat it as highest-risk and consider whether you can use it at all.
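A first-pass disparity check is cheaper than founders assume: compare selection rates across groups. A sketch using the four-fifths rule of thumb (the threshold, data, and function names are illustrative; a real bias assessment needs statistical and legal review):

```python
from collections import defaultdict

def selection_rates(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
    """Per-group rate of positive outcomes, e.g. resumes shortlisted."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, selected in outcomes:
        totals[group] += 1
        positives[group] += selected  # bool counts as 0/1
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Lowest group rate over highest; below ~0.8 warrants investigation."""
    return min(rates.values()) / max(rates.values())

# Synthetic screening outcomes: group A shortlisted at 50%, group B at 30%.
outcomes = [("A", True)] * 50 + [("A", False)] * 50 \
         + [("B", True)] * 30 + [("B", False)] * 70
rates = selection_rates(outcomes)
ratio = disparate_impact_ratio(rates)  # 0.3 / 0.5 -> flag for review
```

A failing ratio is not proof of discrimination, but it is exactly the kind of documented test an investigator will ask whether you ran.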

Pitfall 3: Neglecting user consent and metadata in communications-enabled SaaS

The pattern: A French startup building a project collaboration tool (similar to Slack or Teams) added email notification and integration features. They auto-enabled email digests and stored metadata (user IP, login times, interaction logs) for analytics without explicit user consent or retention limits. CNIL audited them following a user complaint; no consent forms, no privacy notice explaining metadata use, no retention schedule. Fine: €150k.

Why it happens: Communications features feel like standard UX; founders don't realise that EECC metadata rules apply. "Analytics" feels like a business necessity, not data processing requiring consent.

How to avoid it: If your SaaS includes messaging, email, or real-time comms, assume the EECC applies. Obtain explicit consent (not pre-checked boxes) before enabling email notifications, call recording, or metadata logging. Set a retention policy—typically 6 months for billing-related metadata, much shorter for analytics. Document this in your privacy notice. Use a cookie consent tool (compliant with CNIL guidelines) to manage consent for analytics metadata. Test your consent mechanisms; CNIL will.
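Consent must also be provable: who agreed to what, and when. A minimal sketch of a consent record that defaults every non-essential purpose to off (structure and names are illustrative):

```python
from datetime import datetime, timezone

def new_consent_state() -> dict:
    """Non-essential purposes default to no decision -- no pre-checked boxes."""
    return {"email_digests": None, "analytics_metadata": None}

def record_consent(state: dict, purpose: str, granted: bool,
                   when: datetime) -> dict:
    """Store the decision with its timestamp so it can be evidenced later."""
    state[purpose] = {"granted": granted, "at": when.isoformat()}
    return state

def may_process(state: dict, purpose: str) -> bool:
    """Silence or refusal both mean no processing for that purpose."""
    decision = state.get(purpose)
    return bool(decision and decision["granted"])

state = new_consent_state()
assert not may_process(state, "analytics_metadata")  # silence is not consent
record_consent(state, "analytics_metadata", True,
               datetime(2024, 5, 1, tzinfo=timezone.utc))
```

The key property is the default: processing is blocked until an affirmative, timestamped opt-in exists, which is what an auditor will look for.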

Getting Started: Compliance Roadmap

Your first step is a compliance audit: map your data flows, identify which regulations apply to your specific product (GDPR is always relevant; AI Act if you use ML; EECC if you have comms features; DSA if you host user-generated content). Assign accountability—a single person responsible for compliance, even if they have other duties. Establish a quarterly review cadence; compliance is not a one-time project.
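The mapping step can be captured as a simple decision table over product characteristics. A sketch that mirrors the rules of thumb in the parenthesis above (feature flags are illustrative):

```python
def applicable_regimes(features: dict) -> list[str]:
    """Map product characteristics to the regimes discussed in this guide."""
    regimes = ["GDPR"]  # always applies when processing EU personal data
    if features.get("uses_ml"):
        regimes.append("AI Act")
    if features.get("comms_features"):  # messaging, email, VoIP, metadata
        regimes.append("EECC")
    if features.get("user_generated_content"):
        regimes.append("DSA")
    return regimes

# Example: an ML-powered collaboration product with hosted user content.
product = {"uses_ml": True, "comms_features": False,
           "user_generated_content": True}
scope = applicable_regimes(product)
```

The output is only a starting point for legal review, but keeping it in writing (and re-running it as features ship) is the accountability habit CNIL rewards.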

CNIL publishes regulatory guidance in French and English on cnil.fr. The European Data Protection Board's guidelines (edpb.europa.eu) are authoritative for GDPR interpretation. For the AI Act, follow the European Commission's guidance and monitor the AI Office's emerging practices. For the DSA, consult ARCOM, France's Digital Services Coordinator; for EECC matters, ARCEP and BEREC (Body of European Regulators for Electronic Communications) are the reference bodies.

Use our compliance calendar to track deadlines specific to your industry and France. Enter your SaaS vertical and we'll flag upcoming CNIL consultation periods, enforcement trends, and regulatory changes.

Next Steps

Compliance deadlines are accelerating. GDPR fines are now routine; AI Act prohibitions are already enforceable, with most high-risk obligations applying from August 2026; DSA audits are underway. The cost of remediation (rewriting consent flows, rebuilding data pipelines, re-training teams) is substantially higher than building compliance from the start. Schedule a compliance review for your SaaS product now. Use our calendar tool to identify France-specific deadlines and milestones for your business—it takes five minutes and will show you exactly what's due and when.


Generate my SaaS calendar