The DSA's Blind Spot: Algorithmic Audit of Advertising and Minor Profiling on TikTok

arXiv:2603.05653v1 Announce Type: cross Abstract: Adolescents spend an increasing amount of their time in digital environments where their still-developing cognitive capacities leave them unable to recognize or resist commercial persuasion. Article 28(2) of the Digital Services Act (DSA) responds to this vulnerability by prohibiting profiling-based advertising to minors. However, the regulation's narrow definition of "advertisement" excludes current advertising practices, including influencer marketing and promotional content, that serve functionally equivalent commercial purposes. We provide the first empirical evidence of how this definitional gap operates in practice through an algorithmic audit of TikTok. Our approach deploys sock-puppet accounts simulating a pair of minor and adult users with distinct interest profiles. The content recommended to these users is automatically annotated, enabling systematic statistical analysis across four video categories (formal, disclosed, undisclosed, and no advertisement), as well as each advertisement's topical relevance to the user's interests. Our findings reveal a stark regulatory paradox. TikTok demonstrates formal compliance with Article 28(2) by shielding minors from profiled formal advertisements, yet both disclosed and undisclosed ads exhibit significant profiling aligned with user interests (5-8 times stronger than for adult formal advertising). The strongest profiling emerges within undisclosed commercial content, where brands and creators fail to label promotional content or paid partnerships, and the platform neither corrects this omission nor prevents its personalized delivery to minors. We argue that protecting minors requires expanding the regulatory definition of advertisement to encompass brand and influencer marketing and extending the Article 28(2) prohibition accordingly, ensuring that commercial content cannot circumvent protections merely by operating outside formal advertising channels.

Executive Summary

The article critically examines Article 28(2) of the EU Digital Services Act (DSA), which prohibits profiling-based advertising to minors. While TikTok appears compliant by excluding minors from formal, profiled advertisements, the authors identify a significant regulatory blind spot: the narrow definition of 'advertisement' excludes influencer marketing and promotional content that is functionally equivalent to advertising. Using an algorithmic audit of TikTok with sock-puppet accounts, the study reveals that both disclosed and undisclosed ads exhibit substantial profiling of minors, up to 8 times stronger than for adult formal advertising, particularly in undisclosed content, which often lacks labeling and platform intervention. The findings expose a legal loophole that allows commercial content to evade DSA protections by operating outside formal advertising boundaries.

Key Points

  • 1. Article 28(2)'s narrow definition of 'advertisement' excludes influencer marketing and similar promotional content.
  • 2. Algorithmic audit reveals significant profiling in undisclosed and disclosed ads for minors.
  • 3. Undisclosed content shows the strongest profiling, with no platform correction or prevention.

Merits

Empirical Evidence

First empirical audit demonstrating the definitional gap in practice, providing concrete statistical data on profiling patterns across content types.
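The audit's core comparison can be illustrated with a minimal sketch: for each ad category, compute the share of recommended videos whose topic matches the account's seeded interests, then express the minor account's alignment relative to the adult formal-ad baseline (the reference against which the paper reports its 5-8x figures). The function names, the record schema, and the sample data below are hypothetical, not the study's actual pipeline or results.

```python
# Illustrative sketch of the per-category profiling comparison.
# Each annotated feed record is assumed to carry the video's ad category
# (one of the paper's four classes) and a boolean interest-match flag.
CATEGORIES = ["formal", "disclosed", "undisclosed", "none"]

def alignment_rate(feed, category):
    """Share of videos in a category whose topic matches the account's seeded interests."""
    videos = [v for v in feed if v["category"] == category]
    if not videos:
        return 0.0
    return sum(v["matches_interest"] for v in videos) / len(videos)

def profiling_ratio(minor_feed, adult_feed, category):
    """Interest alignment for the minor account in a category, relative to
    the adult formal-advertising baseline."""
    baseline = alignment_rate(adult_feed, "formal")
    if baseline == 0:
        return float("inf")
    return alignment_rate(minor_feed, category) / baseline

if __name__ == "__main__":
    # Toy feeds: 1 of 4 adult formal ads is interest-aligned (baseline 0.25),
    # 3 of 4 undisclosed ads in the minor feed are aligned (0.75).
    adult = [{"category": "formal", "matches_interest": i < 1} for i in range(4)]
    minor = [{"category": "undisclosed", "matches_interest": i < 3} for i in range(4)]
    print(profiling_ratio(minor, adult, "undisclosed"))  # 3.0
```

Run over all four categories for both accounts, a table of such ratios is what lets one say that undisclosed commercial content is the most strongly profiled class, as the paper reports.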

Demerits

Scope Limitation

Analysis confined to TikTok; results may not generalize to other platforms or user demographics without further validation.

Expert Commentary

This study is a pivotal contribution to the discourse on algorithmic governance and youth protection. The authors effectively bridge legal interpretation with computational analysis, exposing a systemic regulatory inconsistency that undermines the DSA’s intended protective effect. Their methodology—leveraging sock-puppet accounts to simulate real-world user behavior—is both innovative and robust. Importantly, the findings challenge the assumption that compliance with regulatory text equates to substantive protection. The article rightly calls for a paradigm shift: protecting minors cannot be achieved through narrow legal definitions alone. To preserve the integrity of youth-focused digital rights, policymakers must adopt a functional, content-based approach to advertising regulation, recognizing that influence and persuasion operate beyond formal advertising structures. This work sets a new benchmark for empirical legal critique in digital governance.

Recommendations

  • 1. Advocate for legislative amendments to expand the DSA’s definition of 'advertisement' to encompass influencer marketing and undisclosed promotional content.
  • 2. Encourage platform-level transparency mandates requiring disclosure of paid partnerships and commercial intent in all content targeting minors.