
Can You Sell Adobe Firefly Images? Commercial Rights & IP Safety (2026)

Started by so_frustrated_rn_19 · Dec 6, 2024 · 5 replies
AI image generation terms and IP protections vary by plan. Verify current Adobe terms before commercial use.

so_frustrated_rn_19 OP

Adobe keeps marketing Firefly as "commercially safe" because it was only trained on Adobe Stock images and public domain content. Is this actually true? And what are the exact commercial rights?

  • Can I use Firefly images in client brand work?
  • Does Adobe's IP indemnity actually cover me if someone claims infringement?
  • Firefly in Photoshop (Generative Fill) vs standalone — any license difference?
so_frustrated_rn_19 OP

@TransactionalLaw_Dan_34 This is incredibly useful. I had a client last month who wanted to trademark a logo I created using Firefly as a starting point. I did extensive modifications in Illustrator -- probably 80% of the final mark was my own vector work -- but I wasn't sure if the Firefly origin would be a problem.

What kind of documentation do you recommend keeping? Screenshots of each iteration? Illustrator version history? I want to make sure I'm covered if the trademark is ever challenged.

too_tired_for_this_28

@tyler_92_16 The recursive training data concern you're describing is called "model collapse" in the ML research community, and it's a well-documented phenomenon. When generative models are trained on their own outputs, the quality and diversity of outputs degrades over successive generations.
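You can see the mechanism with a toy sketch (purely illustrative, not how image models actually train): if each "generation" is fit only on a finite sample of the previous generation's outputs, the support of the distribution can only shrink, so diversity ratchets downward.

```python
import random

def collapse_demo(generations=30, sample_size=5, seed=0):
    """Toy model-collapse illustration: each generation is 'trained'
    on a finite sample drawn from the previous generation's outputs.
    Since samples can only contain values already present, the set of
    distinct outputs is monotonically non-increasing."""
    rng = random.Random(seed)
    population = list(range(10))  # 10 distinct "styles" to start with
    diversity = [len(set(population))]
    for _ in range(generations):
        # resample from the previous generation's outputs only
        population = [rng.choice(population) for _ in range(sample_size)]
        diversity.append(len(set(population)))
    return diversity

print(collapse_demo())  # diversity count per generation, never increasing
```

Real generative models degrade more gradually (they interpolate rather than resample verbatim), but the direction of the effect is the same.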

Adobe is aware of this. Their research team published a paper in late 2025 describing their data curation pipeline, which includes filters to identify and exclude AI-generated content from training datasets. They use a combination of C2PA metadata detection and statistical classifiers to flag AI-generated submissions.

However, no filtering system is perfect. Some AI-generated images will inevitably slip through, especially as generation quality improves and becomes harder to distinguish from photographs. This is an industry-wide challenge, not unique to Adobe.
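For anyone curious what C2PA-based detection even looks like at the byte level: a minimal sketch, assuming only that C2PA manifests in JPEGs ride in APP11 (0xFFEB) segments carrying JUMBF boxes. The function name and the substring heuristic are mine, not Adobe's pipeline, and real validation requires a spec-compliant parser plus signature verification. Note it only detects the *presence* of provenance metadata; absence proves nothing, since metadata is easily stripped.

```python
def has_c2pa_app11(jpeg_bytes: bytes) -> bool:
    """Crude heuristic: scan JPEG bytes for APP11 (0xFF 0xEB) segments
    whose payload mentions JUMBF/C2PA identifiers. Illustrative only."""
    data = jpeg_bytes
    i = 0
    while i < len(data) - 1:
        if data[i] == 0xFF and data[i + 1] == 0xEB:  # APP11 marker
            if i + 4 > len(data):
                break
            # segment length field covers itself plus the payload
            seg_len = int.from_bytes(data[i + 2:i + 4], "big")
            payload = data[i + 4:i + 2 + seg_len]
            if b"c2pa" in payload or b"jumb" in payload:
                return True
            i += 2 + seg_len
        else:
            i += 1
    return False
```

Statistical classifiers (the other filter mentioned above) are a separate, much harder problem, which is why neither approach alone is sufficient.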

For the purposes of this thread's topic -- commercial rights -- the recursive training issue doesn't directly affect your rights as a Firefly user. Your commercial license and IP indemnity remain the same regardless of what percentage of the training data is AI-generated. But it's worth monitoring because it could affect output quality over time.

long_time_lurker_28

Following up on my earlier comparison post with some new info: Midjourney just updated their ToS this week (February 24, 2026, apparently), and there are some notable changes.

Midjourney now explicitly states that subscribers own their outputs and that Midjourney will not assert copyright claims over user-generated images. This was always implied but never explicitly stated. They also added language about "best-effort" IP screening -- they claim to have implemented filters that prevent outputs from closely replicating copyrighted works in their training data.

However, they still offer zero IP indemnification. Their new ToS includes a clause that reads: "Midjourney makes no representations regarding the intellectual property status of outputs and disclaims all liability for third-party IP claims." This is unchanged from their previous terms.

So the gap between Midjourney and Firefly on the legal safety front has actually widened. Midjourney's image quality continues to improve, but their legal protections remain the weakest among the major players. For commercial work, especially anything client-facing or brand-critical, Firefly's legal advantages are now even more pronounced.

will_b_28 Attorney

@the_silent_type_20 The EU AI Act has extraterritorial reach, similar to GDPR. Under Article 2, the Act applies to providers and deployers regardless of their location if the AI system's output is "used in the Union."

In your scenario, both parties have obligations. The US agency is a "deployer" of the AI system and has disclosure obligations if they know the content will be distributed in the EU. The German client, as the party publishing the content in the EU market, has the primary disclosure obligation.

In practice, this means US-based agencies working with EU clients need to build EU AI Act compliance into their contracts. The agency should disclose to the client that AI was used, and the client should include appropriate disclosures when publishing in the EU. Both parties should document their compliance steps.

This is essentially the same contractual framework that agencies already use for GDPR data processing. The AI Act adds another layer of compliance, but the mechanics are familiar. If your contracts already address GDPR, adding EU AI Act provisions is a natural extension.

omar_s_8

I'm dealing with a force majeure dispute right now. The other party claims COVID-era supply chain issues are still causing delays in 2026. At some point, a 6-year-old pandemic isn't a force majeure event anymore — it's a known risk that should have been planned for.
