The “TAKE IT DOWN” Act
On May 19, 2025, the “TAKE IT DOWN” Act (the “Act”)[1] was signed into federal law, establishing criminal penalties for the intentional online disclosure of nonconsensual sexually explicit content—whether AI “deepfakes”[2] or authentic depictions.[3] The Act makes two major changes to existing federal law. First, it amends Section 223 of the Communications Act of 1934 to add a new criminal prohibition on the publication of certain intimate images.[4] Second, it creates new requirements for covered platforms that will be enforced by the Federal Trade Commission.[5] Given this new federal law, schools and educators alike should understand what the Act provides and its implications for students, school policies, and campus administrative procedures.
What is the “TAKE IT DOWN” Act?
The Act criminalizes the publication of AI-generated deepfake or authentic nonconsensual sexually explicit content on covered platforms,[6] and requires covered platforms to promptly remove such depictions upon receiving notice of their existence. Specifically, the Act prohibits the online publication of intimate visual depictions of:
- an adult, where the publication is intended to cause harm or does cause harm to the individual, the content was published without the adult’s consent, and, in the case of an authentic depiction, the content was created or obtained under circumstances in which the adult had a reasonable expectation of privacy and was not a matter of public concern; or
- a minor, where the publication is intended to abuse or harass the minor or to arouse or gratify the sexual desire of any person.
However, liability does not attach when the disclosure: (1) is made in the course of a lawfully authorized investigative or intelligence proceeding; (2) is made in good faith to law enforcement; (3) is made for purposes of medical education; (4) serves a legitimate scientific purpose; or (5) is made by the depicted individual of his or her own depiction.
Violators are subject to mandatory restitution and criminal penalties, including imprisonment, a fine, or both.[7] The Act similarly prohibits and criminalizes threats to publish such intimate visual depictions. The Act’s criminal prohibition takes effect immediately, while covered platforms have one year from the effective date (until May 19, 2026) to establish the required notice-and-removal process.
Under the Act, covered platforms are required to establish a process through which subjects of intimate visual depictions may notify the platform of the existence of, and request removal of, an intimate visual depiction. Covered platforms must remove such depictions within 48 hours of receiving a valid removal request and make reasonable efforts to remove any known identical copies.
How Independent Schools Can Be Proactive
Schools should update their digital use and safety policies and ensure that faculty and staff are trained to recognize and address online exploitation. Specifically, schools should:
- establish clear procedures for reporting content involving students or faculty to both school personnel and law enforcement;
- educate students and parents about their rights under the Act;
- update the Student Handbook to set expectations and policies on violations of the Act;
- promote awareness through age-appropriate curricula that encourage lawful online activity and inform students about the legal dangers and implications of publishing deepfake or nonconsensual explicit content, to help prevent incidents before they occur; and
- prepare for potential litigation by establishing protocols for handling explicit content in investigations, documenting disciplinary actions, and retaining records to support future legal or platform-based removal efforts.
Christopher L. Brigham is a Shareholder at Updike, Kelly & Spellacy’s Hartford office and Chairman of the Employment Practices Group. He focuses his practice on representing and counseling businesses and educational institutions with respect to workplace employment and school law issues. He can be reached at cbrigham@uks.com or (203) 786-8310.
Valerie M. Ferdon is a Shareholder at the firm’s Hartford office and is a member of the Employment Practices Group and the Litigation Practice Group. She can be reached at vferdon@uks.com or (860) 548-2607.
Gillian S. Wilson is an Associate at the firm’s Hartford office and is a member of the Employment Practices Group and the Litigation Practice Group. She can be reached at gwilson@uks.com or (860) 509-5337.
Patricia Moriarty, who assisted in the preparation of this alert, is a Summer Associate at the firm’s Hartford office.
[1] The TAKE IT DOWN Act stands for “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act.”
[2] Deepfakes or digital forgeries are defined in the Act as “any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer generated or technological means…”
[3] TAKE IT DOWN Act, Pub. L. No. 119-12, S. 146, 119th Cong. (2025).
[4] See 47 U.S.C. § 223.
[5] See 15 U.S.C. § 57a(a)(1)(B). A failure to reasonably comply with the notice-and-removal obligations will be treated as an unfair or deceptive act or practice under the Federal Trade Commission Act.
[6] Under the Act, covered platforms are defined as public websites, online services, or applications that primarily provide a forum for user-generated content.
[7] Id. Offenses involving adults are punishable by a fine, imprisonment of up to two years, or both. Offenses involving minors are punishable by a fine, imprisonment of up to three years, or both.