DEFIANCE Act: Federal Lawsuit Rights for AI Deepfake Victims
The Senate passed S.1837 unanimously on January 13, 2026. If enacted, victims of nonconsensual "intimate digital forgeries" could sue for $150,000-$250,000 in liquidated damages plus attorney's fees. House action is pending.
📋 DEFIANCE Act (S.1837) Status Tracker: Senate PASSED | House PENDING
What Is the DEFIANCE Act?
⚗ What the DEFIANCE Act Does
The DEFIANCE Act (Disrupt Explicit Forged Images and Non-Consensual Edits Act, S.1837) amends federal law to create a civil cause of action for victims of nonconsensual "intimate digital forgeries," meaning AI-generated or manipulated sexual imagery.
Core Provision: New Federal Civil Remedy
Victims can sue in federal court anyone who knowingly produced, possessed with intent to disclose, disclosed, or solicited and received an "intimate digital forgery" without consent.
Amends 15 U.S.C. § 6851(b)
Key Features
- Liquidated damages: $150,000 (or $250,000 if tied to stalking, harassment, or sexual assault)
- Attorney's fees: Prevailing plaintiffs recover costs and reasonable fees
- Equitable relief: TROs, preliminary injunctions, permanent injunctions ordering deletion
- Privacy protections: Pseudonymous filing, sealing, protective orders
- 10-year statute of limitations: Runs from the later of discovery or the victim turning 18
What's an "Intimate Digital Forgery"?
The bill defines this precisely: an intimate visual depiction of an identifiable individual that
- Falsely represents the individual or the intimate conduct (in whole or in part)
- Was created via software, machine learning, AI, or other computer-generated means
- Is indistinguishable from authentic imagery to a reasonable viewer
Labels don't save you: Adding "fake" or "AI-generated" disclaimers does not take content out of scope.
🏛 Two Federal Laws: DEFIANCE vs. TAKE IT DOWN
There are now two complementary federal tools for victims:
| | TAKE IT DOWN Act | DEFIANCE Act |
|---|---|---|
| Status | Signed into law (May 19, 2025) | Passed Senate (Jan 13, 2026) |
| Focus | Platform removal + criminal penalties | Victim civil lawsuits |
| Mechanism | 48-hour takedown after valid notice | Federal court lawsuit for damages |
| Covers | Real NCII + deepfakes | AI deepfakes specifically |
| Remedy | Content removal + up to 2-3 years criminal | $150K-$250K + fees + injunctions |
| Platform deadline | May 19, 2026 | N/A (targets individuals) |
Use both: File a TAKE IT DOWN notice to get content removed fast, then pursue DEFIANCE damages against the creator/distributor.
🔥 Why Congress Acted Now: The Grok Deepfake Crisis
The DEFIANCE Act was fast-tracked after reports that xAI's Grok was being used to generate nonconsensual sexually explicit deepfakes at scale.
California AG Investigation (Jan 2026)
Attorney General Rob Bonta announced an investigation into xAI, citing reports that Grok features were used to create nonconsensual sexually explicit images, including of women and children.
The Scale Problem
- "Nudification" apps can generate realistic fake nudes from clothed photos in seconds
- No technical barrier: Anyone with a smartphone can create deepfake porn
- Viral spread: Content spreads across platforms faster than takedown requests
- Minors targeted: High school students creating deepfakes of classmates
Senator Dick Durbin moved to fast-track S.1837 specifically citing the Grok controversy and the need for victims to have meaningful legal recourse.
👤 Who Can Be Sued Under DEFIANCE
Liable Parties
- Creators: Anyone who knowingly produced the deepfake
- Possessors with intent: Anyone who possesses it intending to disclose
- Distributors: Anyone who disclosed it without consent
- Solicitors: Anyone who solicited and received it
Knowledge Requirement
Plaintiff must show defendant knew or recklessly disregarded that:
- The victim did not consent to the specific conduct (creation/disclosure/etc.)
- The content depicts the victim in intimate circumstances
Platform Liability?
Section 230 likely still shields platforms from DEFIANCE claims as intermediaries. The bill targets creators and distributors, not hosts. Use TAKE IT DOWN for platform removal.
Elements of a DEFIANCE Act Claim
📖 Threshold Definitions (Element 0)
Before analyzing the claim elements, these definitions must be satisfied:
"Identifiable Individual"
- The person's body appears (in whole or in part) in the intimate visual depiction or digital forgery
- The person is identifiable by face, likeness, or other distinguishing characteristic (including from information displayed with the depiction)
"Intimate Digital Forgery"
- Falsely represents (in whole or in part) either the identifiable individual OR the conduct/content that makes it "intimate"
- Created via software, machine learning, AI, or other computer-generated/technological means (including adapting, modifying, or manipulating authentic imagery)
- Indistinguishable from an authentic depiction when viewed as a whole by a reasonable person
Labels Don't Matter
Content still qualifies as an "intimate digital forgery" even if labeled "fake," even if context suggests it's fake, even if a disclaimer is attached.
⚗ Claim Type 1: Activity Claim (Production/Disclosure/etc.)
This is the primary claim covering most deepfake scenarios:
Elements to Prove
1. Plaintiff is the subject of an intimate digital forgery (satisfying the definitions above)
2. Defendant knowingly did at least ONE of: (a) produced it, (b) possessed it with intent to disclose, (c) disclosed it, or (d) solicited and received it
3. Plaintiff did not consent to the specific conduct (production/possession/disclosure/solicitation)
4. Defendant knew or recklessly disregarded plaintiff's lack of consent
5. The conduct is in or affects interstate/foreign commerce or uses any means or facility of such commerce
Commerce nexus is easy: Using the internet satisfies this element.
🔨 Claim Type 2: Production-Only Claim (Requires Harm)
A narrower claim specifically for production that requires showing harm:
Elements to Prove
1. Same as Claim Type 1 (plaintiff is the subject of an intimate digital forgery)
2. Defendant knowingly produced the intimate digital forgery
3. Plaintiff did not consent AND defendant knew or recklessly disregarded this
4. Plaintiff was harmed OR was reasonably likely to be harmed by the production
5. Same as Claim Type 1 (commerce nexus)
When to use: When deepfake was created but not yet distributed. The harm requirement is the trade-off for catching pre-disclosure conduct.
💰 Available Remedies
Damages
| Type | Amount | Notes |
|---|---|---|
| Liquidated (standard) | $150,000 | Elected in lieu of proving actual damages |
| Liquidated (aggravated) | $250,000 | If tied to stalking, harassment, or sexual assault |
| Actual damages | Variable | Includes defendant's profits attributable to conduct |
| Punitive damages | Variable | Available for egregious conduct |
Other Relief
- Attorney's fees and costs: Prevailing plaintiffs recover
- Injunctive relief: TRO, preliminary injunction, permanent injunction
- Deletion orders: Court can order defendant to delete/destroy content
- Privacy protections: Pseudonym filing, sealing, protective orders keeping content under court control
Statute of Limitations
Actions must be filed within 10 years of the later of:
- When victim reasonably discovers the violation, OR
- The date the victim turns 18
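The "later of" rule means a minor's clock does not start until age 18, no matter when the forgery is discovered. Not legal advice, and the bill's text controls, but as a minimal sketch of the date arithmetic under that reading:

```python
from datetime import date

def filing_deadline(discovery: date, birthdate: date) -> date:
    """Latest filing date under the bill's 10-year limitations period:
    10 years from the LATER of (a) reasonable discovery of the
    violation or (b) the victim's 18th birthday. Illustrative only."""
    eighteenth = date(birthdate.year + 18, birthdate.month, birthdate.day)
    start = max(discovery, eighteenth)
    # Add 10 years; a Feb 29 start is clamped to Feb 28 in non-leap years.
    try:
        return date(start.year + 10, start.month, start.day)
    except ValueError:
        return date(start.year + 10, start.month, 28)

# A victim who discovers a deepfake at 16 still has until age 28 to file.
print(filing_deadline(date(2026, 3, 1), date(2010, 6, 15)))  # 2038-06-15
```

For an adult victim, the 18th-birthday prong is in the past, so the deadline is simply discovery plus 10 years.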
Evidence Preservation Checklist
⚠ If you just discovered a deepfake of yourself:
- Screenshot everything NOW - URL, username, timestamp, comments, the content itself
- Screen record the page - Scroll from profile to post, showing URL bar throughout
- Download the file if possible - Many platforms allow saving
- File platform report - Use NCII/intimate image reporting (TAKE IT DOWN Act compliant)
- Do NOT confront the poster yet - They may delete evidence
- Contact an attorney - For preservation notices and litigation strategy
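Courts and opposing counsel will probe whether captured files were altered after the fact. A minimal Python sketch of the hash-and-log step a forensic preservation workflow typically includes; the filename and JSON-lines log format here are illustrative assumptions, not a standard:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def log_evidence(path: str, note: str = "") -> dict:
    """Record a SHA-256 hash and UTC capture time for a saved file,
    so its integrity can later be demonstrated. Illustrative sketch."""
    data = Path(path).read_bytes()
    entry = {
        "file": path,
        "sha256": hashlib.sha256(data).hexdigest(),
        "bytes": len(data),
        "captured_utc": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }
    # Append to a simple JSON-lines chain-of-custody log.
    with open("custody_log.jsonl", "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Run it against every screenshot, recording, and downloaded file immediately after capture; re-hashing later and matching the logged digest shows the file has not changed since.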
⏰ Immediate (0-60 Minutes)
If There's Extortion
Preserve: demand amount, payment method, wallet address, deadline, exact wording. This escalates criminal exposure significantly.
📅 Same Day Steps
Preservation Notice Content
Your preservation demand to platforms and suspected posters should request they preserve:
- Upload timestamps and IP logs
- Device identifiers
- Account recovery email/phone
- Payment records if content was promoted/monetized
- All communications related to the content
Need Help Preserving Evidence?
Fixed-fee services for evidence preservation and litigation prep.
Evidence Preservation Package
Professional forensic capture, hash documentation, chain of custody log, exhibit index.
Preservation + Takedown
Evidence package + platform takedown notices + 48-hour follow-up on removals.
Template Letters
📩 Platform Takedown Notice (TAKE IT DOWN Act)
Use this template for platform reporting. After May 19, 2026, covered platforms must remove content within 48 hours of a valid request.
⚗ Demand Letter to Creator/Uploader
Use when you've identified the creator or uploader. This letter puts them on notice and preserves your litigation options.
Bill Status Note
DEFIANCE has passed the Senate but is not yet law. This template cites it as "pending" legislation that will expand liability. Do not represent it as enacted until it passes the House and is signed.
Need Custom Letters?
Attorney-drafted letters tailored to your specific situation.
Platform Takedown Letter
Attorney-drafted notice citing TAKE IT DOWN Act. Sent within 24 hours.
Demand Letter to Creator
Formal cease & desist with litigation hold, citing federal and state exposure.
Full Package
Platform notices + demand letter + evidence preservation + 30-day follow-up.
California Deepfake Laws
🐻 California State Remedies
California provides state-level remedies that work alongside federal law. You can pursue both.
Civil Code §1708.86 (AB 621, Chapter 673)
Creates civil liability for creating or distributing sexually explicit deepfakes.
- Statutory damages: $1,500-$50,000 per violation
- Malice enhancement: Up to $250,000 if defendant acted with malice
- Plus: Actual damages, injunctions, attorney's fees
Chapter 673 (2025) | Effective: January 1, 2026
Civil Code §1708.85 - Expanded NCII Remedy
Original revenge porn civil remedy now explicitly covers AI-generated content depicting identifiable individuals.
- General or special damages
- Injunctions
- Attorney's fees
- Pseudonymous filing to protect victim identity
Penal Code §647(j)(4) - Criminal Revenge Porn
Misdemeanor to intentionally distribute intimate images without consent with intent to cause distress. Report to local police or DA.
⚖ Federal vs. California Claims: Strategy
| | Federal (DEFIANCE) | California (§1708.86) |
|---|---|---|
| Status | Pending (passed Senate) | In effect (Jan 1, 2026) |
| Liquidated damages | $150K-$250K | $1.5K-$50K (up to $250K w/ malice) |
| Venue | Federal court | State court |
| Statute of limitations | 10 years | 3 years (general tort) |
| Discovery | Federal rules | CA rules |
Strategic Considerations
- If DEFIANCE passes: Federal court may be better for higher liquidated damages and longer SOL
- Currently: California state claims are your primary civil remedy
- Both: You can plead state claims in federal court under supplemental jurisdiction
- Malice: California's $250K malice enhancement may exceed DEFIANCE in egregious cases
Frequently Asked Questions
Questions about your deepfake case?
Schedule a confidential consultation with a California-licensed attorney.
Book Consultation