In a statement published today, the Free Speech Coalition reminded adult industry stakeholders that certain provisions of the TAKE IT DOWN Act, legislation that “created a federal criminal prohibition on the nonconsensual publishing of intimate images (including AI-generated ‘deepfakes’) and requires covered platforms to establish a notice-and-removal process for such content within 48 hours of a valid request,” take effect on May 19, 2026.
While the ban on nonconsensual imagery took effect immediately upon the Act being signed into law, the notice-and-removal requirements come into force on May 19.
As noted in the FSC post, the law criminalizes two categories of content: “authentic intimate visual depictions published without consent” and “digital forgeries.” The latter category includes “AI-generated or otherwise computer-manipulated intimate images of an identifiable individual that a reasonable person would find indistinguishable from authentic depictions.”
FSC observed that “any person who knowingly publishes such content using an interactive computer service” is liable under the law, adding that this “targets the individual uploader/publisher, not the platform.”
The notice-and-removal obligations, which take effect May 19, apply to covered platforms, defined as “websites, online services, online applications, or mobile applications that serve the public and primarily provide a forum for user-generated content (including messages, videos, images, and audio).”
“Covered platforms must establish a process by which an individual (or their authorized representative) can submit a removal request,” FSC noted. “The request must include a signature, identification of the content, a good faith statement that it was published without consent, and contact information.”
As for the removal timeline, upon receiving a valid request, covered platforms “must remove the content as soon as possible, but no later than 48 hours after receipt,” FSC explained. “Platforms must also make reasonable efforts to identify and remove known identical copies.”
In terms of the notice requirements, covered platforms must “post a clear, conspicuous, plain-language notice of their removal process and how to submit a request.”
“Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission,” FSC added.
FSC noted that the Act’s definition of the term “covered platform” is broad enough to include “most sites that host user-generated content.”
“Platforms that host any user-uploaded content should assume they are covered and consult with counsel,” FSC added.
FSC also observed that under the law “consent to create an intimate visual depiction does not equal consent to publish it.”
Observing that covered platforms must respond to “valid” removal requests, FSC explained that such requests must be in writing and include the following information:
- a physical or electronic signature of the requestor (or their representative)
- identification of, and information sufficient for the platform to locate, the offending content
- a statement of the requestor’s good-faith belief that the depiction was not consensual
- the requestor’s contact information
FSC also noted the law “includes no provisions that address how platforms can or should deal with erroneous or fraudulent removal requests.”
You can read the full statement on the FSC website.