
How Do (AI) Data Ethics Shape Your Creative Workflow?

Community Expert, Sep 29, 2025

One thing that really stands out to me about Adobe’s Firefly-powered features, especially the new Harmonize feature in Photoshop, is that the underlying model is trained only on licensed or public-domain data, never on our personal projects. In other words, the native Adobe AI model in Photoshop (Firefly) is designed to work for you, not from you: it helps you create without using your personal projects as training data.

By contrast, other AI models may use different methods and don’t always provide the same clarity, leaving the consent process less transparent.

 

So here are my questions for the community:

  • Do you think an ethical approach to training data changes how much you trust AI tools?

  • Would transparency about not using your work without consent make you more likely to try features like Harmonize?

  • More broadly, how does the way AI models are trained affect your creativity and your willingness to see them as partners in your process?

I’d love to hear how this transparency shapes your willingness to bring AI into your creative workflow, whether you’re using Adobe’s tools or not.

TOPICS: iPadOS, macOS, Phone, Web, Windows
Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
New Here, Sep 30, 2025

Transparency in how AI models are trained really makes a difference. If I know my personal projects won’t be used without consent, I feel more comfortable experimenting with features like Harmonize. It creates trust and allows me to see AI as a supportive tool rather than something I need to guard against. In creative work, that sense of ethical assurance often matters as much as the results.

Community Expert, Sep 30, 2025

Thank you for sharing your perspective, @stromt_9157. A sense of security is an essential pillar for creative work, and insecurity can certainly lead to creative blocks.

New Here, Oct 02, 2025

You make a thoughtful point. Security really does give people the freedom to take risks and express themselves fully, while insecurity often creates hesitation. I’ve noticed the same idea applies in many areas of life, whether in art, work, or even planning smooth experiences with services like seattleblacklimo.com; stability allows the focus to shift toward creativity rather than worry.

New Here, Oct 01, 2025

Great question, Valdair. For me, data ethics directly impacts trust—and trust is what determines whether I’m willing to invite an AI tool into my creative process.

When a company makes it clear that my personal or client projects won’t be repurposed as training data, it lowers the barrier to experimentation. I can explore features like Harmonize without worrying that my unique style or proprietary work is being fed into a massive model that others might benefit from without my consent. That sense of security makes me more comfortable using the tool freely, which ironically leads to more creativity, not less.

On the flip side, when models are vague about their training sources or when consent feels like an afterthought, I tend to hold back. I’ll use them for drafts, tests, or ideation, but rarely for final client-facing projects. The lack of transparency makes me treat the AI as a “sandbox toy” rather than a genuine partner.

 

So yes—the ethics behind the training data shapes not just my trust, but the depth of integration AI has in my workflow. Transparency turns the tool from something I cautiously test into something I can confidently collaborate with.

Community Expert, Oct 01, 2025

Thanks a lot for your thoughtful contribution, @Amy_Greenz; it really resonates. I especially connect with the way you framed transparency as the difference between treating AI like a “sandbox toy” versus a true collaborator. That metaphor captures how trust shapes not just adoption but depth of use.

 

I’d also add that this ethical clarity doesn’t only affect how much we use AI, but what kinds of projects we feel safe bringing it into. Without that foundation of trust, many creatives (myself included) will hold back on client work or high-stakes projects, which means the tool never reaches its full potential in our workflows. With transparency, on the other hand, it’s not just experimentation that grows; it’s confidence, speed, and the willingness to explore new creative directions.

Community Beginner, Oct 03, 2025

I really think ethics play a huge role in trusting AI tools. The fact that Firefly is trained only on licensed and public data makes me a lot more confident using it — I don’t have to worry about my own projects being taken without consent. That kind of clarity is rare, and it makes a difference.

With other tools, where the training process isn’t as transparent, I sometimes hesitate because I’m not sure what’s happening behind the scenes. And when you’re doing creative work, that little bit of doubt can hold you back.

So yes, transparency like this definitely makes me more willing to try features like Harmonize and actually see AI as something that supports my creativity rather than something I need to be cautious about.

Community Expert, Oct 07, 2025

That’s a great point of view, and I feel the same! Thanks for sharing, @walter_0059.

New Here, Oct 03, 2025

Yes, an ethical approach definitely makes me trust AI tools more. Knowing my work won’t be used without consent gives me peace of mind and makes me more open to experimenting, since it feels like the AI is truly assisting rather than taking.

 
 
 
Community Expert, Oct 08, 2025

Thanks for sharing your point, John!

Community Beginner, Oct 04, 2025

Absolutely—knowing my work isn’t being used to train the model builds real trust. That kind of transparency makes me much more open to exploring features like Harmonize and seeing AI as a true creative partner.

 
 
Community Expert, Oct 09, 2025

Thank you so much for sharing, Ethan 🙂 

New Here, Oct 05, 2025

Absolutely, an ethical approach to training data makes a big difference in how much I trust AI tools. Knowing that Adobe Firefly is trained only on licensed or public-domain content gives me more confidence to use features like Harmonize without worrying about where the AI is pulling inspiration from.

Transparency is key. When I know my personal work isn’t being used to train the AI without my consent, I’m far more open to exploring what it can do. It feels more like a collaboration than a risk.

In my workflow, this kind of clarity makes me more willing to experiment with AI. It shifts the dynamic from “Will this steal my style?” to “How can this enhance my vision?”, which is empowering, creatively.

New Here, Oct 06, 2025

Absolutely — transparency makes a huge difference. Knowing that Firefly is trained only on licensed or public-domain data builds real trust. When creators feel their work is respected, it’s easier to see AI as a true creative partner, not a threat. Ethical AI = more confidence, more creativity.

Community Beginner, Oct 06, 2025

Great questions, Welder! Yes, knowing that an AI like Firefly is trained on licensed data definitely gives me more confidence in it. That kind of transparency makes me feel safer and more open to using these tools in my creative work.

Community Expert, Oct 08, 2025

Great! Thank you so much for sharing your opinion, @nulls_2046.

New Here, Oct 07, 2025

This is without doubt a very important question, and well worth reflecting on.

I’m facing this dilemma myself right now: my work requires auditing an immense number of files, and among the various AIs on the market, I’m wary of using any of them to help with this document analysis, because the documents are confidential.

An AI that conveys trust would undoubtedly have already helped me make a decision and sped up my work.

Community Expert, Oct 07, 2025

Hi Manuel! I suggest you contact the Adobe Support Team for more details on your case:

https://helpx.adobe.com/contact.html

 

Community Beginner, Oct 07, 2025

Honestly, I think ethics in AI training data make a huge difference in how much trust users have.
When I know a model like Firefly is trained only on licensed or public-domain content — and not on my personal work — I feel a lot more comfortable experimenting with it.

It’s not just about protecting artists’ rights, it’s also about creating a sense of respect and collaboration between the tool and the creator. Transparency like this makes AI feel less like it’s “taking” from us and more like it’s “working with” us.

For me, that’s exactly what encourages creativity — knowing the tech I’m using aligns with my own values.

New Here, Oct 19, 2025

Absolutely agree — trust is everything when it comes to creative tools. When users feel confident that their work and others’ creations are respected, it changes the entire relationship with AI. Models trained on ethical data don’t just protect rights — they foster genuine collaboration and make creators feel part of the process rather than exploited by it. That’s the kind of ecosystem that inspires long-term creativity.

 

Community Expert, Oct 09, 2025

Sorry to hear that, @daughtrey_7277.

Community Expert, Oct 09, 2025

Thank you so much for sharing your view, @focused_enthusiasm3800. Yes, especially on websites with forms collecting data, it gets even more concerning. 🙂

New Here, Oct 11, 2025

Yes, an ethical approach to training data builds trust—knowing my work won’t be used without consent makes me more comfortable exploring AI features like Harmonize. Transparency like Adobe’s sets a positive standard and makes AI feel more like a true creative partner, not a silent competitor.

New Here, Oct 13, 2025

That’s actually something I’ve been thinking about too.

New Here, Oct 21, 2025

Thank you!
