GDPR Article 36 Explained: Prior Consultation with Supervisory Authorities (With 5 Practical Examples)

The General Data Protection Regulation (GDPR) is built around one central idea: preventing harm before it happens. While many GDPR articles focus on how organizations must react to incidents or manage personal data on a daily basis, Article 36 is different. It is about anticipation, risk awareness, and proactive dialogue with regulators.

GDPR Article 36 introduces the concept of prior consultation. In simple terms, it obliges organizations to consult their supervisory authority before processing personal data when that processing is likely to result in a high risk to individuals’ rights and freedoms, and when identified risks cannot be sufficiently mitigated.


What Is GDPR Article 36?

GDPR Article 36 is titled:

“Prior consultation”

It requires data controllers to consult their competent supervisory authority before beginning certain high-risk processing activities.

Article 36 applies when:

  • A Data Protection Impact Assessment (DPIA) has been conducted
  • The DPIA shows that the processing would result in a high risk
  • The controller cannot sufficiently reduce or mitigate those risks

In such cases, the organization must not proceed with the processing until it has consulted the supervisory authority.

This obligation reinforces the GDPR’s preventive approach and ensures that regulators have visibility into potentially dangerous processing activities before they go live.


The Purpose of Article 36

Article 36 serves several key purposes:

Preventing Serious Harm

Rather than waiting for violations or data breaches, Article 36 helps stop problematic processing before it impacts individuals.

Encouraging Responsible Innovation

Organizations are not prohibited from innovating, but they are required to pause and consult when risks are unusually high.

Strengthening Accountability

Controllers must demonstrate that they have thought deeply about risks, attempted mitigation, and sought expert oversight where necessary.

Creating Regulatory Dialogue

Article 36 promotes cooperation between organizations and supervisory authorities instead of adversarial enforcement after the fact.


Article 36 and the Link to DPIA (Article 35)

GDPR Article 36 cannot be understood in isolation. It is directly linked to GDPR Article 35, which governs Data Protection Impact Assessments.

The Logical Flow

  1. You plan a new processing activity
  2. You assess whether a DPIA is required (Article 35)
  3. You conduct a DPIA
  4. The DPIA identifies high residual risks
  5. You cannot adequately mitigate those risks
  6. Article 36 is triggered
  7. You must consult the supervisory authority

In other words, prior consultation is the final escalation step when internal risk management is no longer sufficient.
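The escalation flow above can be sketched as a simple decision function. This is an illustrative model only, with hypothetical parameter names; the GDPR expresses these conditions in prose, not as a formal algorithm:

```python
# Hypothetical sketch of the Article 35 → Article 36 escalation logic.
# Parameter names are this sketch's own, not terms from the regulation.

def requires_prior_consultation(dpia_required: bool,
                                dpia_finds_high_risk: bool,
                                residual_risk_high_after_mitigation: bool) -> bool:
    """Return True when GDPR Article 36 prior consultation is triggered."""
    if not dpia_required:
        return False  # No DPIA needed (Article 35) → Article 36 cannot be triggered
    if not dpia_finds_high_risk:
        return False  # DPIA finds no high risk → proceed with normal safeguards
    # High residual risk that mitigation cannot sufficiently reduce → consult
    return residual_risk_high_after_mitigation

# DPIA required, high risk found, mitigation insufficient → must consult
print(requires_prior_consultation(True, True, True))   # True
# Same scenario, but mitigation brought residual risk down → no consultation
print(requires_prior_consultation(True, True, False))  # False
```

The key point the sketch captures: prior consultation depends on *residual* risk, i.e., the risk remaining after mitigation, not the initial risk level.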


What Counts as “High Risk”?

The GDPR does not define “high risk” in precise terms, but supervisory authorities and the European Data Protection Board (EDPB) provide guidance.

High risk often involves:

  • Large-scale processing of sensitive data
  • Systematic monitoring of individuals
  • Automated decision-making with legal or significant effects
  • Use of new or untested technologies
  • Processing involving vulnerable individuals (children, patients, employees)
  • Data matching or profiling across multiple datasets

If, after mitigation, these risks remain high, Article 36 applies.
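As a rough screening aid, these indicators can be checked programmatically. The “two or more indicators” heuristic below follows EDPB/WP29 DPIA guidance; it signals that a DPIA is likely required, not that Article 36 itself applies, and the indicator names are this sketch’s own:

```python
# Illustrative high-risk screening based on the indicators listed above.
# The two-indicator threshold is a heuristic from EDPB/WP29 DPIA guidance,
# not a rule stated in the GDPR text itself.

HIGH_RISK_INDICATORS = {
    "large_scale_sensitive_data",
    "systematic_monitoring",
    "automated_decisions_with_legal_effects",
    "new_or_untested_technology",
    "vulnerable_data_subjects",
    "data_matching_or_profiling",
}

def likely_needs_dpia(indicators_present: set) -> bool:
    """Two or more matching indicators suggest a DPIA is likely required."""
    matched = indicators_present & HIGH_RISK_INDICATORS
    return len(matched) >= 2

# Systematic monitoring of vulnerable individuals → DPIA very likely needed
print(likely_needs_dpia({"systematic_monitoring", "vulnerable_data_subjects"}))  # True
```

A screening result like this feeds the Article 35 step; only if the subsequent DPIA shows unmitigable high residual risk does Article 36 come into play.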


What Must Be Submitted During Prior Consultation?

When consulting the supervisory authority, the controller must provide detailed documentation, including:

  • A description of the intended processing operations
  • The purposes and legal basis of the processing
  • The categories of personal data involved
  • The categories of data subjects
  • The identified risks to rights and freedoms
  • The measures already taken to mitigate risks
  • A copy of the DPIA itself
  • Contact details of the Data Protection Officer (if applicable)
  • Where relevant, the respective responsibilities of the controller, joint controllers, and processors

The authority must have enough information to assess whether the processing complies with the GDPR and whether additional safeguards are required.
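Before submitting, it can help to verify that the consultation file is complete. The sketch below models the documentation items as a simple checklist; the field names are invented for illustration, since Article 36(3) describes the required content in prose rather than as a fixed schema:

```python
# Illustrative completeness check for an Article 36 consultation file.
# Item names are hypothetical; Article 36(3) lists the content in prose.

REQUIRED_ITEMS = [
    "description_of_processing",
    "purposes_and_legal_basis",
    "categories_of_personal_data",
    "categories_of_data_subjects",
    "identified_risks",
    "mitigation_measures",
    "dpia_document",
    "dpo_contact_details",
]

def missing_items(submission: dict) -> list:
    """Return the checklist items that are absent or empty in a draft."""
    return [item for item in REQUIRED_ITEMS if not submission.get(item)]

draft = {
    "description_of_processing": "AI-based credit scoring",
    "dpia_document": "dpia_v3.pdf",
}
print(missing_items(draft))  # lists the six items still to be supplied
```

An incomplete file is a practical risk in itself: the authority can request the missing information, which pauses the response clock discussed in the next section.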


How Long Does the Supervisory Authority Have to Respond?

Once a valid consultation request is submitted, the supervisory authority has:

  • Up to 8 weeks to provide written advice
  • A possible extension of 6 additional weeks for complex cases (the controller must be informed of the extension, and the reasons for it, within one month)

These periods may be suspended while the authority waits for information it has requested. During this time, the organization should not begin the processing.
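The response window is straightforward to compute. A minimal sketch, using illustrative dates, and ignoring any suspension of the clock while the authority awaits requested information:

```python
# Sketch of the Article 36(2) response window: up to 8 weeks of written
# advice, extendable by a further 6 weeks for complex cases. Dates are
# illustrative; suspension of the period is not modeled here.
from datetime import date, timedelta

def advice_deadline(submitted: date, extended: bool = False) -> date:
    """Latest date for the authority's written advice, absent suspension."""
    weeks = 8 + (6 if extended else 0)
    return submitted + timedelta(weeks=weeks)

print(advice_deadline(date(2025, 1, 6)))                 # 2025-03-03 (8 weeks)
print(advice_deadline(date(2025, 1, 6), extended=True))  # 2025-04-14 (14 weeks)
```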

The authority may:

  • Approve the processing with no further conditions
  • Recommend additional safeguards
  • Require changes to the processing
  • Prohibit the processing if it violates GDPR principles

What Happens If You Ignore Article 36?

Failure to consult when required is a serious GDPR violation.

Possible consequences include:

  • Administrative fines (up to €10 million or 2% of global annual turnover, whichever is higher)
  • Orders to suspend or stop processing
  • Mandatory changes to systems and workflows
  • Reputational damage
  • Increased scrutiny in future audits

Ignoring Article 36 often signals poor governance and weak accountability, which regulators take seriously.


Who Is Responsible for Prior Consultation?

The obligation rests with the data controller, not the processor.

However:

  • Processors should alert controllers if they believe Article 36 applies
  • Data Protection Officers play a key advisory role
  • Legal and compliance teams typically coordinate the consultation

Ultimately, the controller bears full responsibility for compliance.


Example 1: AI-Based Credit Scoring System

A fintech company plans to introduce an AI-driven credit scoring system that automatically determines loan eligibility.

Identified Risks

  • Fully automated decisions with legal effects
  • Potential discrimination based on indirect data correlations
  • Limited transparency into algorithm logic

Mitigation Attempts

  • Bias testing
  • Model explainability tools
  • Human review in some cases

Why Article 36 Applies

Despite safeguards, the DPIA concludes that residual risk remains high, particularly for marginalized groups.

Outcome

The company consults the supervisory authority, which requires:

  • Mandatory human review for borderline cases
  • Regular fairness audits
  • Clear appeal mechanisms for applicants

Example 2: Nationwide Facial Recognition in Public Transport

A public transport authority wants to deploy facial recognition cameras to detect fare evasion.

Identified Risks

  • Biometric data processing
  • Mass surveillance
  • Risk of false positives
  • Impact on freedom of movement

Mitigation Attempts

  • Short data retention
  • Encryption
  • Restricted access

Why Article 36 Applies

Even with safeguards, biometric surveillance in public spaces poses intrinsically high risks.

Outcome

The supervisory authority advises:

  • Narrower deployment scope
  • Alternative non-biometric solutions
  • Stronger legal justification

The project is significantly redesigned.

Example 3: Employee Productivity Monitoring Software

A multinational company plans to install software that tracks:

  • Keystrokes
  • Screen activity
  • Idle time
  • Application usage

Identified Risks

  • Continuous employee surveillance
  • Power imbalance in employment context
  • Psychological harm and chilling effects

Mitigation Attempts

  • Transparency notices
  • Limited access
  • Aggregated reporting

Why Article 36 Applies

Employees cannot freely consent, and the monitoring is systematic and intrusive.

Outcome

After consultation, the authority requires:

  • Removal of keystroke logging
  • Use of anonymized productivity metrics
  • Employee consultation and works council approval

Example 4: Health Data Platform for Predictive Diagnostics

A health-tech startup develops a platform that analyzes genetic and medical data to predict disease risks.

Identified Risks

  • Highly sensitive health and genetic data
  • Risk of misuse by insurers or employers
  • Long-term consequences for individuals

Mitigation Attempts

  • Pseudonymization
  • Restricted access
  • Strong encryption

Why Article 36 Applies

The sensitivity and long-term impact of genetic profiling mean high residual risk remains.

Outcome

The supervisory authority mandates:

  • Stricter access controls
  • Independent ethics oversight
  • Clear data deletion timelines
  • Enhanced consent mechanisms

Example 5: Smart City Mobility Tracking System

A city plans to track citizens’ movements using mobile app data and sensors to optimize traffic and infrastructure.

Identified Risks

  • Location tracking
  • Risk of re-identification
  • Function creep over time

Mitigation Attempts

  • Aggregation
  • Short retention periods

Why Article 36 Applies

Large-scale location monitoring creates ongoing risks to privacy and autonomy.

Outcome

The authority requires:

  • Opt-in participation
  • Strong anonymization
  • Explicit legal limits on future data use

Best Practices for Article 36 Compliance

Organizations can reduce friction and risk by following these best practices:

  • Involve the DPO early in system design
  • Treat DPIAs as living documents
  • Document all mitigation decisions
  • Be transparent and cooperative with regulators
  • Avoid viewing consultation as a failure—it is a safeguard

Early engagement often leads to better system design and stronger trust.


Key Takeaways

GDPR Article 36 is not a bureaucratic obstacle—it is a protective mechanism designed to prevent serious harm before it occurs.

If your organization:

  • Conducts high-risk processing
  • Uses emerging or intrusive technologies
  • Handles sensitive data at scale

Then Article 36 may apply.

Understanding and respecting prior consultation obligations is essential for:

  • Legal compliance
  • Ethical data use
  • Sustainable innovation
  • Long-term regulatory trust

Handled correctly, Article 36 becomes a powerful tool for building responsible, future-proof data systems rather than a barrier to progress.