NetzDG Legal Content Reporting
Qualitative research into user mental models around Germany's NetzDG law — and data-driven advocacy to a government regulator when a mandated implementation caused a 13% drop in high-severity content reporting.
A Law, Two Reporting Flows, and a Difficult Choice
Germany's Netzwerkdurchsetzungsgesetz (NetzDG) law required Facebook to introduce a dedicated legal reporting flow for specific categories of harmful content, sitting alongside the existing Community Standards (CS) reporting flow. This created an unusual product challenge: users reporting objectionable content now had to decide, in the moment, whether to route their report through German law or through Facebook's own rules.
The research question was deceptively simple: did users understand the difference, and could they reliably make that choice?
2 reporting flows to navigate
12 interviews conducted
5 concepts tested with users
What began as a generative study to inform design became critical evidence in a direct regulatory conversation — ultimately changing how the law was implemented on the platform.
Key Questions
- What leads people to report a post via legal (NetzDG) reporting versus Community Standards reporting — and what drives that choice?
- Is it true that people perceive NetzDG reporting as being for more severe content? What drives that perception?
- How can the scope of legal reporting be defined clearly and comprehensibly so users can make an informed decision?
Qualitative Study in Berlin
I conducted qualitative research in Berlin, Germany, working with MindSpark to moderate sessions in participants' native language, to understand user mental models around both reporting flows. The approach combined individual depth interviews with focus groups, layered with several interactive exercises to probe decision-making in detail.
Twelve 60-minute IDIs exploring how participants thought about NetzDG and CS reporting, and what drove their sense of which flow was "appropriate" for different content types.
Two 90-minute focus groups to observe collective reasoning around the decision to report — surfacing group norms and shared mental models that individual interviews alone might miss.
Participants were split evenly: 50% had no Facebook account (non-account holders), and 50% had recently reported content to Facebook (active reporters). This ensured the research captured both general public awareness of NetzDG and the experience of people who had actually used the reporting system.
| Activity | Purpose & Design |
|---|---|
| Closed card sort | Participants classified types of reported content by perceived severity and allocated them to either NetzDG or Community Standards. This directly tested whether users could reliably distinguish the two — and surfaced the heuristics they were relying on when they couldn't. |
| Usability testing — 5 concepts | Five distinct reporting flow concepts were tested: Current/Recommended Actions (control), Subtext (contextual info below options), Q&A (guided decision tree), Chevron Split (diverging paths), and Opt-In (soft-language NetzDG entry). Each was designed to test a different hypothesis about how to reduce decisional burden. |
| Co-creation exercise | Participants worked to design their "ideal" reporting solution from scratch, unconstrained by the existing concepts. This surfaced needs and preferences that none of the pre-built concepts had fully addressed — particularly around language and uncertainty. |
Participant split rationale: Including non-account holders was a deliberate choice. NetzDG as a law applies to German residents regardless of whether they use Facebook — and understanding public awareness of the law (not just active user awareness) was important context for how the reporting flow should be framed and labelled.
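One way closed card-sort data like this can be summarised is a per-item agreement score: the share of participants who allocated each content type to the majority flow. Items near a 50/50 split are exactly the ones users cannot reliably classify. The sketch below is a minimal illustration in Python; the content categories, participant counts, and allocations are invented for the example and are not the study's actual data.

```python
from collections import Counter

# Hypothetical card-sort allocations: for each content type, the flow each
# participant assigned it to ("netzdg" or "cs"). All values are invented
# for illustration; this is not the study's dataset.
allocations = {
    "incitement_to_hatred": ["netzdg"] * 9 + ["cs"] * 3,
    "spam":                 ["cs"] * 11 + ["netzdg"] * 1,
    "defamation":           ["netzdg"] * 7 + ["cs"] * 5,
    "nudity":               ["cs"] * 10 + ["netzdg"] * 2,
}

def agreement(votes):
    """Return (majority flow, share of participants who chose it)."""
    counts = Counter(votes)
    flow, n = counts.most_common(1)[0]
    return flow, n / len(votes)

for item, votes in allocations.items():
    flow, score = agreement(votes)
    # Items with low agreement are the ones participants could not
    # reliably classify -- candidates for extra guidance in the UI.
    label = "ambiguous" if score < 0.7 else "clear"
    print(f"{item:22s} majority={flow:6s} agreement={score:.2f} ({label})")
```

In this invented dataset, "defamation" lands near the 50/50 line, mirroring the study's finding that legally scoped categories were the hardest for participants to place.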
Users Couldn't Reliably Choose — and Didn't Want To
The central finding was stark: participants could not say with certainty what NetzDG covers, and were deeply reluctant to make what felt like a legal determination. The decision between two reporting paths created anxiety rather than clarity.
- Uncertainty and reluctance: Participants could not say with certainty what NetzDG covers and were uncomfortable choosing between NetzDG and CS reporting; they did not want to make what felt like a legal determination. Fear of repercussions, particularly that the poster might learn their identity, added to their hesitancy.
- Severity perception: Users tended to associate NetzDG with more severe offenses, driven by language indicating "legal" and "law." They guessed its scope was smaller and reserved for extreme content — which led them to underuse it for content that actually fell within its remit.
- What helped: Information explaining NetzDG with specific examples, and clearly delineated reporting paths, helped users feel they were accurately and efficiently reporting. Reducing the burden of the decision was the core design challenge.
| Concept | Performance & User Response |
|---|---|
| Opt-In | Preferred by users who wanted to reduce decision burden. The soft phrasing "I'm of the opinion" gave users confidence to select the NetzDG option without feeling they were making a firm legal declaration. Lowered anxiety around commitment and repercussion. |
| Subtext | Also preferred by users seeking lower decisional burden. Contextual explanatory text beneath each option helped users feel informed without having to navigate away, but only worked well when the explanation was concise and jargon-free. |
| Chevron Split | Preferred by more confident reporters wanting a clear distinction. The visual divergence of paths made NetzDG vs. CS reporting feel genuinely different — but users who became uncertain later in the flow struggled to find a route back to Community Standards reporting. |
| Q&A | Also preferred by confident users, but shared the same loop-back problem. The guided decision-tree logic worked well for users who were certain of the content category, but penalised users who changed their mind partway through. |
| Current / Recommended Actions | Control condition — baseline experience. Users struggled most with this design because no additional context was provided to guide their choice, leaving uncertainty unresolved. |
Design implication: The clearest recommendation from the study was to favour concepts that reduced decisional burden (Opt-In, Subtext) over those that created sharper divergence (Chevron Split, Q&A). Users found it difficult to make confident legal determinations, and designs that acknowledged this uncertainty — rather than demanding resolution — consistently performed better.
When Implementation Went Wrong
The research findings were clear on what users needed. But when the NetzDG reporting flow was eventually implemented — under strict regulatory requirements in the law's wording — the result was a significant decline in key metrics.
High-severity content reporting dropped 13%. This was precisely the kind of content the NetzDG law had been introduced to combat — and the implementation was making it harder, not easier, for users to report it. Strict requirements in the law's wording (for example, requiring the NetzDG form entrypoint to be no more than one click away in the user journey) made it structurally difficult for the product team to implement fixes unilaterally.
The research findings provided a clear explanation for why this was happening: user confusion around multiple entry points, the technical legal language in the regulatory form, and concern about lack of confidentiality were all deterring reporters from completing the NetzDG flow — particularly for the most severe content categories.
Taking the Evidence to the Regulator
With the product team's hands tied by the regulatory requirements, I repurposed the original research and combined it with the post-implementation data to build a case for the German regulator directly. The goal was to demonstrate that the current mandated approach was producing the opposite of the intended effect — and that a research-driven alternative would better serve the law's intent.
The case presented to the regulator focused on four areas:
- Causal evidence of the drop: Data clearly demonstrating the causal relationship between the mandated reporting flow and the 13% decline in high-severity content reports — the content NetzDG was specifically designed to address.
- User confusion around multiple entry points: Research findings showing that the requirement for a prominent, separate NetzDG entry point was creating confusion for users — who, faced with two options, either chose the wrong one or abandoned the report entirely.
- Confidentiality concerns: Evidence that the official NetzDG form's requirement for personal data was deterring reporters — particularly those with concerns about the content creator identifying them. This was having the most severe chilling effect on reports of the most serious content.
- A/B-tested recommendation: A proposal, supported by A/B testing, to simplify the NetzDG entry and implement it in-situ within the existing Facebook reporting flow — rather than as a separate, additional entry point — while retaining full legal compliance.
The initial research had been conducted to inform design — not to argue with a regulator. Repurposing it required reframing the findings around the regulator's own goals, not just Facebook's product goals. The central argument was that the current implementation was actively undermining the law's intent: fewer high-severity reports meant less enforcement, not more.
Combining qualitative and quantitative evidence: The qualitative research from the Berlin study provided the "why" — the specific user behaviours and mental models that explained the drop. The post-implementation data provided the "how much." Neither was sufficient alone: the data showed there was a problem, but the research explained what to do about it.
A/B testing the recommendation: Before approaching the regulator, A/B testing was run to demonstrate that the proposed in-situ approach (integrating NetzDG into the existing reporting flow rather than requiring a separate entry point) would restore reporting rates. Having experimental evidence — not just research findings — was important for a regulatory audience who might otherwise treat user research as subjective.
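For an A/B comparison of report-completion rates like the one described above, a standard analysis is a two-proportion z-test on completions per exposure. The sketch below is a generic illustration of that test, not Facebook's actual experiment tooling; the sample sizes and completion counts are invented for the example.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts, invented for illustration only:
# control = mandated separate entry point, variant = in-situ flow.
z, p = two_proportion_ztest(870, 10000, 1000, 10000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With invented numbers like these, a roughly 13% relative shortfall in the control arm is detectable at conventional significance levels, which is the kind of experimental evidence that carries weight with a non-research audience.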
Addressing the one-click requirement: The regulatory requirement for NetzDG reporting to be accessible within one click was a key constraint. The recommendation did require more than one click to reach the NetzDG path — but the argument was that a path that users could actually find and complete was preferable to one that was technically compliant but deterred reporters. This required the regulator to accept a trade-off between formal compliance and effective compliance.
Regulator-Approved, and It Worked
The German regulator supported the proposed approach. The simplified, in-situ implementation was approved and shipped — and the results were clear.
The approach worked because it was grounded in how users actually think and behave, not in how the law assumed they would. The research-driven design aligned with user mental models, reducing confusion and drop-off, and fully mitigated the 13% decline in high-severity reporting.
The shipped design:
- Simplified, in-situ NetzDG reporting path integrated into the existing Facebook reporting flow
- Opt-In soft language ("I'm of the opinion") applied to reduce decisional anxiety
- Two clearly delineated CTA buttons preserving both reporting paths
This was an unusual research outcome: findings were used not just to inform a product decision, but to influence a government regulator. It established that user research could be a credible input into regulatory dialogue — not just internal product planning.
Responsibilities
Product team constraints: The product team's hands were tied once the NetzDG flow was live — the regulatory requirements left little room for unilateral fixes. This is why the decision was made to go to the regulator directly rather than attempting to engineer around the problem. The product team were supportive of this approach but needed the research to make it credible.
Policy and legal involvement: Presenting to a government regulator required close coordination with Facebook's policy and legal teams. The framing had to be precise: the argument wasn't that the law was wrong, but that the specific implementation approach was failing to achieve the law's stated goals. This distinction was important for maintaining a productive regulatory relationship.
Research as regulatory evidence: User research is rarely used as primary evidence in regulatory contexts — it's more commonly presented as supporting colour. In this case, the research was the central argument: the qualitative findings explained the mechanism, and the data showed the effect. Ensuring the research methodology was defensible to a non-research audience (including lawyers and policy officials) required extra rigour in how the work was documented and presented.
A/B testing as the bridge: The decision to run A/B tests before approaching the regulator was critical. Without experimental evidence that the alternative approach would restore reporting rates, the recommendation could have been dismissed as theoretical. The A/B results transformed the research from "here's what users say" to "here's evidence it will work" — a meaningful shift for a regulatory audience.