Examples of GDPR Article 8 – Conditions Applicable to Child’s Consent in Relation to Information Society Services

The General Data Protection Regulation (GDPR) represents one of the most comprehensive frameworks for data privacy and protection worldwide. Within its provisions, Article 8 serves a unique and important purpose — it regulates how consent from children is obtained when they use online services, such as social media platforms, educational websites, gaming networks, and mobile applications.

This article acknowledges that children may not fully understand the implications of sharing their personal data online. Therefore, it imposes additional obligations on service providers and establishes clear age thresholds for valid consent. In this article, we will explore realistic examples and applications of GDPR Article 8, demonstrating how it is implemented across different types of online services and industries.


1. What GDPR Article 8 Actually Covers

Article 8 of the GDPR sets out the conditions under which a child’s consent is valid for the processing of their personal data in the context of “information society services”: formally, services normally provided for remuneration, at a distance, by electronic means, and at the individual request of the recipient. In practice this covers most commercial online activity, from streaming platforms and messaging apps to online education tools and virtual reality environments.

According to Article 8:

  • If a child is under 16 years old, their personal data can be processed on the basis of consent only if that consent is given or authorized by the holder of parental responsibility (typically a parent or guardian).
  • EU Member States can lower this threshold to no less than 13 years old, depending on national law.
  • The service provider must make reasonable efforts to verify that parental consent has been given, taking into account available technology.

In simple terms, Article 8 ensures that online companies do not exploit the lack of understanding children may have about privacy, data collection, or digital risks.
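These rules can be sketched as a simple lookup. The mapping below is illustrative only (national thresholds can change, so current law should always be confirmed), and the function name is our own:

```python
# Article 8(1) default age of consent for information society services.
GDPR_DEFAULT_AGE = 16

# Illustrative national derogations under Article 8(1); Member States
# may choose any age between 13 and 16. Verify against current law.
NATIONAL_CONSENT_AGE = {
    "DE": 16,  # Germany
    "NL": 16,  # Netherlands
    "FR": 15,  # France
    "ES": 14,  # Spain
    "PT": 13,  # Portugal
}

def needs_parental_consent(age: int, country_code: str) -> bool:
    """True if a user of this age needs parental authorization."""
    threshold = NATIONAL_CONSENT_AGE.get(country_code, GDPR_DEFAULT_AGE)
    return age < threshold
```

Falling back to 16 for an unlisted country is the conservative choice, since no Member State may set the bar higher than the GDPR default.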


2. Example 1: Social Media Platforms (Instagram, TikTok, Facebook)

Social media is one of the most common areas where GDPR Article 8 applies. Platforms like Instagram or TikTok often attract millions of young users under 18, and many of them fall below the consent age set in their respective countries.

For example, Germany, the Netherlands, and Ireland keep the GDPR default of 16, while France has set the threshold at 15, Spain at 14, and Portugal at 13. This means that TikTok users in Germany under 16 must have their parents’ consent to create an account, whereas in Portugal, children as young as 13 can register independently.

Practical Application

A social media platform operating in the EU must:

  • Ask the user to confirm their date of birth during registration.
  • If the user’s age is below the threshold, prompt for parental authorization before allowing account creation.
  • Provide an easy and transparent explanation of how personal data will be used, written in child-friendly language.
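The first two steps above can be sketched as a small registration gate. The status strings and function names are hypothetical; a real service would also record the consent decision for accountability:

```python
from datetime import date

def age_on(today: date, dob: date) -> int:
    """Whole years between the declared date of birth and a reference date."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def registration_status(dob: date, national_threshold: int, today: date) -> str:
    """Return the next step: activate the account, or hold it pending
    parental authorization if the declared age is below the threshold."""
    if age_on(today, dob) < national_threshold:
        return "awaiting_parental_authorization"
    return "active"
```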

Verification Methods

In practice, platforms might:

  • Send an email to the parent or guardian for approval.
  • Request that the parent log in via a verified identity service.
  • Use automated systems that check digital signatures or official parental consent forms.

These methods must be reasonable and proportionate to the risk involved in processing the child’s data. For instance, creating a social media profile poses a higher risk of data misuse than registering for a basic educational platform.


3. Example 2: Online Learning Platforms and Virtual Classrooms

Educational services have seen significant growth, especially after 2020. Platforms such as virtual classrooms, language-learning websites, or math tutoring apps collect personal data like names, email addresses, grades, and even video or audio recordings of children during classes.

Under GDPR Article 8, such platforms must:

  • Obtain verifiable parental consent before creating an account for a child below the national threshold.
  • Clearly explain what kind of data is collected — for instance, class recordings, student performance data, and communication logs.
  • Allow parents to withdraw consent or request data deletion at any time.

Example in Practice

A language-learning app that offers courses for children aged 10–15 across Europe must adapt its registration process.

  • In Germany, the platform cannot accept a child’s consent unless it verifies parental permission.
  • In Portugal, where the age of consent is 13, it can accept direct consent from a 14-year-old but not from a 12-year-old.

The platform might send a verification link to the parent’s email, requiring explicit authorization before account activation.

This ensures transparency, parental control, and compliance with Article 8 requirements.


4. Example 3: Online Gaming Networks and Apps

Online gaming platforms — such as PlayStation Network, Xbox Live, or mobile gaming apps — often collect large volumes of personal data: usernames, chat logs, geolocation data, purchase histories, and even behavioral analytics.

Children, especially those under 16, may not realize how much information they are sharing. Some may use real names, reveal personal details in chat rooms, or agree to data collection for in-game advertising.

Compliance Example

To comply with GDPR Article 8:

  • Gaming companies must check the user’s age during registration.
  • If underage, they must seek verifiable parental consent before collecting any data.
  • Parents must have the ability to monitor and manage privacy settings, such as voice chat visibility, friend requests, and location sharing.
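The parental-control requirement can be sketched as a settings object whose defaults are restrictive (privacy by default) and which only a verified guardian may change. All names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    # Restrictive defaults: everything sensitive starts switched off.
    voice_chat_enabled: bool = False
    friend_requests_allowed: bool = False
    location_sharing: bool = False

def update_settings(settings: ChildPrivacySettings, actor_is_guardian: bool,
                    **changes: bool) -> ChildPrivacySettings:
    """Apply changes only when the actor is the verified guardian."""
    if not actor_is_guardian:
        raise PermissionError("only a verified guardian may change these settings")
    for name, value in changes.items():
        if not hasattr(settings, name):
            raise AttributeError(f"unknown setting: {name}")
        setattr(settings, name, value)
    return settings
```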

Case Example

A popular mobile game that includes in-app purchases for children aged 9–14 added a feature requiring parents to digitally sign consent forms before the child could start playing.
In this case:

  • The app provides clear information about data use, such as storing progress or offering personalized recommendations.
  • Parents can later revoke consent, leading to data deletion or account suspension.

This practice directly aligns with Article 8, ensuring that minors do not consent to the processing of their personal data without adult supervision.


5. Example 4: Streaming and Entertainment Services

Streaming platforms like Netflix, YouTube, and Spotify offer personalized recommendations based on user activity — which means they process data such as watch history, search queries, and device identifiers.

For children under the GDPR consent threshold, this profiling and data collection cannot happen without proper authorization.

YouTube Kids Example

YouTube Kids, available in many EU countries, operates differently from regular YouTube. The service requires a parent or guardian to create and manage the child’s account through their own profile.
Parents can:

  • Control search settings.
  • Review content history.
  • Limit the kind of videos available.

This example illustrates how Article 8 fosters data minimization and control by guardians, making sure the child’s digital footprint remains protected.


6. Example 5: E-Commerce Platforms and Online Stores

Online shopping websites that sell toys, digital goods, or games often collect users’ names, addresses, and payment details. When these websites are directed toward children, GDPR Article 8 is triggered.

Practical Implementation

Imagine an online store selling children’s educational toys that encourages users to sign up for a newsletter or loyalty program. If a child aged 12 tries to subscribe:

  • The company must request parental authorization.
  • The newsletter must not contain marketing materials that exploit children’s inexperience or pressure them into purchases.
  • Personal data collected (e.g., email addresses) must not be used for profiling or behavioral advertising without parental consent.

Such practices ensure fair data handling and align with GDPR’s emphasis on transparency and lawful processing.


7. Example 6: Mobile Apps for Children

Many mobile apps — from interactive storybooks to drawing tools — target young users directly. Under Article 8, these developers must implement child-appropriate consent mechanisms and privacy-by-design principles.

Implementation Example

A mobile app designed for kids aged 8–12 may:

  • Ask for the parent’s email during setup.
  • Send a consent form explaining how the child’s drawings and user data are stored.
  • Restrict app use until the parent approves the request.

If a parent later withdraws consent, the app must delete all stored information associated with the child’s account.
This gives guardians real-time control over the child’s personal data and ensures accountability for app developers.
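The withdraw-and-delete behaviour can be sketched as follows; the store is an in-memory stand-in for a real database, and the names are illustrative:

```python
class ChildDataStore:
    """Consent-gated storage: withdrawing consent removes the lawful
    basis for processing, so the associated data is erased."""

    def __init__(self) -> None:
        self._records: dict[str, dict] = {}  # account id -> personal data

    def store_with_consent(self, account_id: str, data: dict) -> None:
        self._records[account_id] = data

    def withdraw_consent(self, account_id: str) -> None:
        # Erase everything held under this account (Article 17 erasure
        # following withdrawal of the Article 8 consent).
        self._records.pop(account_id, None)

    def has_data(self, account_id: str) -> bool:
        return account_id in self._records
```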


8. Example 7: Artificial Intelligence Chatbots and Virtual Companions

Some AI chatbots or virtual assistant apps target teenagers, offering features like emotional support, language practice, or companionship. These services often process sensitive data, such as messages or voice recordings.

GDPR Article 8 in Action

If such a service allows children under the national consent age to register:

  • The developer must verify parental consent before activating the account.
  • The data collected must be limited to what is necessary for service functionality.
  • Conversations and personal details cannot be used for marketing or AI model training without explicit consent from parents.

This ensures children are not unknowingly contributing personal data for commercial purposes or subjected to psychological profiling.
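Data minimisation in this setting amounts to an allow-list: only the fields the service genuinely needs survive to storage. The field names below are invented for illustration:

```python
# Fields a hypothetical language-practice chatbot genuinely needs.
NECESSARY_FIELDS = {"user_id", "display_name", "target_language"}

def minimise(raw_profile: dict) -> dict:
    """Drop every field not on the allow-list before storing the profile."""
    return {k: v for k, v in raw_profile.items() if k in NECESSARY_FIELDS}
```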


9. Example 8: Health and Fitness Apps for Minors

Health-tracking applications, such as step counters or diet planners, increasingly appeal to teenagers. Some apps even collect sensitive biometric data like heart rate, sleep patterns, or calorie intake.

Under GDPR, processing health-related data requires stronger protection, and when minors are involved, Article 8 adds an additional layer of compliance.

Illustration

A fitness app that markets to users aged 13–17 in the EU must:

  • Explicitly mention that parental consent is required if the user is below the national threshold.
  • Ensure that consent forms include details on data storage, retention periods, and rights to erasure.
  • Implement strict security measures to prevent data sharing with advertisers.

This combination of measures demonstrates a high level of compliance with GDPR’s privacy expectations for young users.


10. Example 9: Messaging and Communication Apps

Apps like WhatsApp, Telegram, or Signal allow the exchange of personal messages, images, and videos — often involving children communicating with peers. Such services are covered by GDPR Article 8 if they process data for users under the consent threshold.

Practical Case

When the GDPR took effect in 2018, WhatsApp set its minimum age for European users at 16, matching Article 8’s default rule; in 2024 it lowered that threshold to 13, relying on the national derogations the article permits. Users below the applicable minimum must have a parent or guardian’s authorization.
In practice this is verified by self-declaration at registration and through user reports. Although far from watertight, the approach demonstrates an effort to meet GDPR’s “reasonable efforts” standard.


11. Challenges in Implementing Article 8

While the intent of Article 8 is clear, implementing it presents challenges:

  • Verification difficulties: It’s not always easy to verify parental consent without collecting even more data.
  • Differences across countries: Varying age limits (13–16) create complexity for global platforms.
  • Balancing privacy and accessibility: Overly strict consent systems might discourage legitimate use of educational or communication tools by minors.

Despite these challenges, Article 8 has led many companies to rethink how they design services for young audiences, leading to more privacy-focused solutions.


12. Key Takeaways from Examples

Across all examples, the following principles consistently appear:

  1. Age Verification: Every platform must make reasonable efforts to establish the user’s age before collecting data.
  2. Parental Consent: Services directed at children must ensure parental or guardian approval.
  3. Transparency: Explanations of data use must be simple, honest, and understandable to minors.
  4. Data Minimization: Only essential data should be collected from children.
  5. Right to Withdraw: Parents and children should have the ability to revoke consent and request data deletion.

These elements form the backbone of GDPR Article 8 compliance and represent the spirit of ethical digital engagement with minors.


Conclusion

GDPR Article 8 reflects a modern understanding of how children interact with technology. Its requirements protect minors from data exploitation, ensure parental involvement, and encourage companies to design safer digital environments.

From social media networks and online classrooms to gaming and mobile apps, the examples of GDPR Article 8 in action highlight how consent, verification, and transparency must adapt to protect the most vulnerable users of the internet — children.