Valdair Leonardo
Community Expert
September 29, 2025
Sticky Question

How Do (AI) Data Ethics Shape Your Creative Workflow?

  • September 29, 2025
  • 33 replies
  • 3823 views

One thing that really stands out to me about Adobe’s Firefly-powered features, especially the new Harmonize feature in Photoshop, is that the underlying model is trained only on licensed or public-domain data, never on our personal projects. In other words, Adobe’s native AI model in Photoshop (Firefly) is designed to work for you, not from you: it helps you create without ever using your projects as training data.

By contrast, other AI models may be trained differently and don’t always make that clear, leaving the consent process less transparent.


So here are my questions for the community:

  • Do you think an ethical approach to training data changes how much you trust AI tools?

  • Would transparency about not using your work without consent make you more likely to try features like Harmonize?

  • More broadly, how does the way AI models are trained affect your creativity and your willingness to see them as partners in your process?

I’d love to hear how this transparency shapes your willingness to bring AI into your creative workflow, whether you’re using Adobe’s tools or not.

33 replies

Participant
October 3, 2025

I really think ethics play a huge role in trusting AI tools. The fact that Firefly is trained only on licensed and public data makes me a lot more confident using it — I don’t have to worry about my own projects being taken without consent. That kind of clarity is rare, and it makes a difference.

With other tools, where the training process isn’t as transparent, I sometimes hesitate because I’m not sure what’s happening behind the scenes. And when you’re doing creative work, that little bit of doubt can hold you back.

So yes, transparency like this definitely makes me more willing to try features like Harmonize and actually see AI as something that supports my creativity rather than something I need to be cautious about.

Valdair Leonardo
Community Expert
October 7, 2025

That’s a great point of view! I feel the same way! Thanks for sharing, @walter_0059.

Participant
October 1, 2025

Great question, Valdair. For me, data ethics directly impacts trust—and trust is what determines whether I’m willing to invite an AI tool into my creative process.

When a company makes it clear that my personal or client projects won’t be repurposed as training data, it lowers the barrier to experimentation. I can explore features like Harmonize without worrying that my unique style or proprietary work is being fed into a massive model that others might benefit from without my consent. That sense of security makes me more comfortable using the tool freely, which ironically leads to more creativity, not less.

On the flip side, when models are vague about their training sources or when consent feels like an afterthought, I tend to hold back. I’ll use them for drafts, tests, or ideation, but rarely for final client-facing projects. The lack of transparency makes me treat the AI as a “sandbox toy” rather than a genuine partner.


So yes—the ethics behind the training data shape not just my trust, but the depth of integration AI has in my workflow. Transparency turns the tool from something I cautiously test into something I can confidently collaborate with.

Valdair Leonardo
Community Expert
October 2, 2025

Thanks a lot for your thoughtful contribution, @Amy_Greenz; it really resonates. I especially connect with the way you framed transparency as the difference between treating AI like a “sandbox toy” versus a true collaborator. That metaphor captures the reality of how trust shapes not just adoption but depth of use.


I’d also add that this ethical clarity doesn’t only affect how much we use AI, but what kinds of projects we feel safe bringing it into. Without that foundation of trust, many creatives (myself included) will hold back on client work or high-stakes projects, which means the tool never reaches its full potential in our workflows. With transparency, on the other hand, it’s not just experimentation that grows, it’s confidence, speed, and the willingness to explore new creative directions

Participant
September 30, 2025

Transparency in how AI models are trained really makes a difference. If I know my personal projects won’t be used without consent, I feel more comfortable experimenting with features like Harmonize. It creates trust and allows me to see AI as a supportive tool rather than something I need to guard against. In creative work, that sense of ethical assurance often matters as much as the results.

Valdair Leonardo
Community Expert
September 30, 2025

Thank you for sharing your perspective, @stromt_9157. A sense of security is an essential pillar for creative work, and insecurity can certainly lead to creative blocks.

Participant
October 2, 2025

You make a thoughtful point. Security really does give people the freedom to take risks and express themselves fully, while insecurity often creates hesitation. I’ve noticed the same idea applies in many areas of life, whether it’s art, work, or everyday planning: stability allows the focus to shift toward creativity rather than worry.