August 2, 2026. The deadline the content industry does not know it has.
The EU AI Act's transparency obligation for AI-generated text has not been delayed. It takes effect in less than four months. Here is the plain-English breakdown for publishers, content teams, and anyone who uses AI to write.
The regulation in one paragraph
Regulation (EU) 2024/1689, Article 50(4), second subparagraph. If you use AI to generate or manipulate text that is published to inform the public on matters of public interest, you must disclose that the text has been artificially generated or manipulated. Unless the content has undergone a process of human review or editorial control and a natural or legal person holds editorial responsibility for its publication.
That is the entire obligation. Label it, or prove a human reviewed it. One or the other.
The timeline
- August 1, 2024: EU AI Act enters into force. Regulation published; obligations phased in over the following two years.
- February 2, 2025: Prohibited practices take effect. Social scoring, manipulative AI, and real-time biometric surveillance banned.
- March 5, 2026: Code of Practice second draft published. Adds specificity to Article 50 documentation requirements; final version expected June 2026.
- EU Parliament votes to delay high-risk AI rules. High-risk system obligations pushed to December 2027. Article 50 transparency obligations are NOT included in the delay. The deadline holds.
- August 2, 2026: Article 50 transparency obligations take effect. From this date, in-scope AI-generated text must be labelled or the human review exemption must be documented. Article 99 penalties apply.
The binary choice
From August 2, 2026, every publisher using AI faces this decision:
Path A: Label
Add an AI-generated disclosure to every article produced with AI assistance. Satisfies the regulation. Costs nothing in cash. Cuts reader trust by 62%, per published research. Creates a permanent public record that your editorial operation relies on AI.
Path B: Document
Implement a human review process with per-article attestation, a named editorially responsible person, and a compliance file. No label required. Content publishes clean. The documentation sits in your audit file, not on your articles.
There is no Path C. If your content is in scope and you do neither, you are in violation. Article 99 penalties: up to 15 million euros or 3% of global annual turnover, whichever is higher.
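The "whichever is higher" cap works out as a one-line calculation. A minimal sketch, for illustration only; actual fines are set by regulators case by case, and the function name is ours, not the regulation's:

```python
def article_99_cap(global_annual_turnover_eur: float) -> float:
    """Maximum fine for an Article 50 breach under Article 99:
    the higher of EUR 15 million or 3% of worldwide annual turnover."""
    return max(15_000_000.0, 0.03 * global_annual_turnover_eur)

# For a company with EUR 1 billion in turnover, 3% exceeds the floor:
print(article_99_cap(1_000_000_000))  # 30000000.0
```

For any company with turnover under EUR 500 million, the EUR 15 million floor is the binding number.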
Who is in scope
Article 50(4) applies to "text which is published with the purpose of informing the public on matters of public interest." The regulation does not define that phrase precisely, but the scope is broader than most publishers expect.
The grey zone (content that is neither clearly journalism nor pure marketing copy) is where most B2B content teams sit. A "Complete Guide to GDPR Compliance" published by a SaaS company makes substantive claims a reader might rely on. The conservative legal position: treat it as in scope. The cost of being wrong on the conservative side is a human review you did not strictly need. The cost of being wrong on the aggressive side is a regulatory inquiry.
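The conservative triage described above can be sketched as a simple decision function. This is an illustration of the article's reasoning, not legal advice, and the function and outcome strings are our own invention:

```python
def article_50_4_path(uses_ai: bool, informs_public: bool,
                      human_review_documented: bool) -> str:
    """Illustrative triage for Article 50(4), taking the conservative
    position: grey-zone content should set informs_public=True."""
    if not uses_ai or not informs_public:
        return "out of scope"
    if human_review_documented:
        return "exempt: keep the compliance file"   # Path B
    return "must label as AI-generated"             # Path A
```

Note the asymmetry the text describes: answering `informs_public=True` when you did not need to costs a review; answering `False` when a regulator disagrees costs an inquiry.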
The extraterritorial reach
Article 2 of the AI Act applies the regulation to providers and deployers outside the EU "where the output produced by the system is used in the Union." If your AI-generated content is accessible to EU readers, you are potentially in scope regardless of where your company is based.
This mirrors GDPR's extraterritorial approach and has the same enforcement characteristics: legally expansive, practically targeted at the most visible publishers with the most EU readership. But the legal exposure exists from day one, and the trend in EU digital regulation is toward broader enforcement, not narrower.
What the Code of Practice adds
The General-Purpose AI Code of Practice (second draft, March 5, 2026) operationalizes Article 50 with specifics the regulation itself does not provide. For the human review exemption, the relevant guidance points toward:
- Documentation of who performed the review (named individuals)
- Documentation of what the review consisted of
- Identification of the person holding editorial responsibility
- Evidence that the review occurred before publication
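The four documentation points above map naturally onto a per-article record. A minimal sketch; the field names are our assumptions, not terms from the Code of Practice:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReviewAttestation:
    """Illustrative per-article compliance record for the
    Article 50(4) human review exemption."""
    article_url: str
    reviewers: list[str]          # who performed the review (named individuals)
    review_summary: str           # what the review consisted of
    editorially_responsible: str  # person holding editorial responsibility
    reviewed_at: datetime         # evidence the review occurred...
    published_at: datetime        # ...before publication

    def valid(self) -> bool:
        # The review must be attributable and must precede publication.
        return bool(self.reviewers) and self.reviewed_at <= self.published_at
```

A record like this, generated per article and retained in an audit file, is the shape of evidence the draft guidance points toward.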
The final Code of Practice is expected in June 2026. The documentation bar is rising, not falling.
What this means for your operation
If you use AI in your content workflow and your content informs the public on a topic that matters, you have less than four months to decide between labelling and documenting.
Labelling is instant and free. It also tells every reader, every enterprise customer, and every procurement team that your content is machine-generated. The research on trust impact is unambiguous (see: AI Disclosure Kills Engagement).
Documenting takes setup but preserves your editorial credibility. Your content publishes without a label. Your compliance file sits in a vault. When anyone asks, you have an answer.
Sygil provides the documentation path as a subscription. A written review procedure customized to your workflow. Per-article attestation from named, credentialed reviewers. Monthly compliance reports. A vault that holds everything for the regulatory retention period.
Starting at approximately €400 per month.
Less than four months. Two paths. One decision.
Take the 60-second scope check on our homepage to see if Article 50(4) applies to your content. Or book a 15-minute scoping call and we will walk you through it.