I would like to provide some constructive feedback regarding the current workflow for generating AI videos in Adobe Firefly. I am a regular user of Adobe products and have always appreciated the quality and innovation you bring to creative tools. However, the way Firefly handles credit usage for video generation is causing significant frustration, not just for me but for many users raising similar concerns.

At the moment, Firefly requires users to spend credits before they can see what the generated AI video looks like. This means we are asked to pay in advance without knowing whether the end result will be usable, relevant, or even close to what we requested. To put it simply, this feels like being asked to buy a car without knowing what model, colour, or condition it will arrive in. The outcome can vary widely, and there is no opportunity to judge quality before credits are consumed.

While I understand that AI generation has inherent costs, this approach creates a very real risk: Adobe may lose customers to competitors who offer previews. For example, platforms like DeeVid AI already allow users to preview the output before committing credits, ensuring that people only pay for the results they actually want. This model feels fair, transparent, and respectful of the customer's time and investment.

I genuinely believe Adobe Firefly could benefit from adopting a similar preview-first flow, for example:

• A low-res or watermarked preview before credit confirmation
• A "pre-flight check" that shows a short, rough version of the generated motion
• An option to adjust prompts without being penalized for trial and error

This would give users confidence in Firefly's capabilities and reduce the feeling of gambling credits on unpredictable outputs. As much as I prefer to stay within the Adobe ecosystem, I will be exploring alternatives that provide previews before purchase.
I hope this feedback helps illustrate why many users feel hesitant to continue using credit-based video generation as it currently works.

Thank you for taking the time to review this. I genuinely want Firefly to succeed and believe this change would significantly improve the user experience.
Bring the Flux.2 [max] model into Firefly and Photoshop!!! It is crazy that ElevenLabs has it there and not my flagship graphics programs :(
The current velocity graph editor in Premiere is far too clunky and makes it unnecessarily difficult to get the exact curve you're looking for. It's a constant struggle with the UI just to get a smooth ramp. We need a complete overhaul to make keyframe manipulation fluid and intuitive, similar to what the Smoothify plugin achieves.

On top of a better UI, adding a native Curve Presets feature (like Ease In/Out, Elastic, or Custom Bezier presets) would be a massive productivity boost, allowing us to apply consistent motion without manually tweaking handles every single time. We need to spend more time editing and less time fighting the graph.
The ability to see a waveform and the video at the same time.
I find it pretty awkward how the video and audio tracks are in their own separate panels with two different scrollbars working independently. Is it possible to have all the video and audio in just one panel? I know it would mean more scrolling up and down overall, but I'd find it a lot easier than having two scrollbars.
Adobe, you USED to be easy to love. Firefly is decent for prototyping images and small product videos, but there’s no clear “in progress” or time bar after clicking Generate, so I WASTE credits and feel constant FRUSTRATION.
An excellent feature that I believe many could take advantage of would be the ability to take markers in the marker pane and convert them into subclips with CTRL/CMD + U (individually or with multiple selected). Subclips would be created either from marker to marker automatically or, if the marker had a duration defined, the subclip would be the length of the marker range. Subclips would automatically receive the name of the marker.
Hi Adobe Team,

I'd like to suggest an enhancement to Photoshop's masking workflow that could benefit both intermediate and advanced users. Currently, creating precise masks from colour data often requires multiple steps, using channels, Apply Image, Selective Color, and Levels, which can be fiddly and unintuitive.

My idea is an Advanced Masking Panel where users could:

• Select a source channel (RGB, Lab, or individual channels)
• Adjust hue range sliders to isolate specific colours
• Adjust luminance sliders to target highlights, midtones, or shadows
• Preview the mask live and refine it interactively before applying

This would allow users to isolate very specific pixels, for example only the brightest blue highlights in wave spray, without inadvertently selecting background elements. The workflow would remain non-destructive, visual, and much more accessible, reducing trial-and-error while retaining full creative control.

It's a small enhancement conceptually, but it could dramatically simplify advanced masking, making Photoshop more intuitive and powerful for a broad spectrum of users.

Thanks for considering!
I would like an option for adding a watermark AND a border to an image when exporting. This feature is already available in Lightroom mobile for Android, so I don't see why it can't also be added to iOS.
Hello Adobe Firefly Team,

I'm Juan Diaz, Course Director for Compositing & Scene Finishing at Full Sail University. As part of our AI-integrated production curriculum, we actively test Firefly alongside platforms such as ComfyUI, Weavy, Runway, and Pika to evaluate real-world workflow integration.

While Firefly's generation quality and Creative Cloud integration are excellent, I would like to formally request consideration for a node-based modular workflow system within Firefly for advanced users and enterprise pipelines. A node-based architecture would enable:

• Procedural prompt stacking
• Mask and region-based routing
• Multi-pass generation control
• Style and motion layer branching
• Depth and normal pass injection
• Controlled iteration comparisons
• Export-ready compositing structures
• Seamless After Effects / Premiere / Photoshop pipeline flow

Currently, tools like ComfyUI allow modular control but lack Adobe ecosystem integration. Firefly is uniquely positioned to bring professional, non-destructive procedural AI workflows into mainstream creative production.

As an educator integrating AI into compositing workflows, this capability would:

• Elevate Firefly from generative tool to production system
• Align with node-based industry standards (Nuke, Houdini thinking)
• Enable deeper curriculum adoption
• Support enterprise-level pipeline integration

I would welcome the opportunity to participate in beta testing or provide structured educational feedback if such a development path is being considered.

Thank you for your continued innovation in AI-driven creative workflows.

Best regards,
Juan Diaz
Hi there, can you please put back the Essential Graphics panel in Adobe Premiere Pro 2026? Thanks, Kevin
Now that Premiere Pro is distributed exclusively through Creative Cloud and Adobe no longer provides physical boxed software, I'd like to request that Adobe make individual point-release installers (for example, version 23.6) available to subscribers.

Access to specific builds can be important for maintaining compatibility with existing projects, plugins, and established production workflows. When only the newest version, or a limited set of major versions, is available, it can be difficult to reopen older projects in the same environment in which they were created.

Since Creative Cloud is now the sole distribution method, it would be very helpful if subscribers could download and install specific point releases within a major version when needed. I'd appreciate Adobe considering this feature, and I'm curious whether other editors have run into similar situations.
Would be amazing if the Premiere graph editor UI looked and felt more like it does in After Effects. Thicker and less pixelated lines, more reactive easing. Right now it feels pretty clunky in comparison.
I still have my old iPad 7, and it has a previous version of Lightroom that still offers full-screen crop mode. When it was taken away, I figured it would come back in a future update as a "new feature". The mobile app is the very reason I started using Adobe products. Since the removal of full-screen crop, I've sometimes had to spend upwards of 5-10 minutes trying to crop my images properly. Honestly, I don't know why I have to request such a simple feature that we already had while instead getting magic wands that do nothing for professionals. If this is too difficult a task, could you at least completely black out the shadowed edges AND remove the thick white corners?
I don't know if I am using it wrong, but I feel like the zoom/magnification in Adobe Bridge is pretty bad. I would personally much rather have it work like Adobe Lightroom, where you can click to zoom in slightly, then hold and drag to zoom in or out as much as you like. It would make it so much easier to see details in my photos. I hope this can get updated soon; it would make checking photos in Adobe Bridge so much easier, and I feel like it should have been a thing years ago. Thank you.
A built-in notes panel inside Photoshop where users can store and quickly copy text snippets such as titles, captions, hashtags, or recurring content used in designs. This would reduce the need to switch between external text apps and Photoshop during the design process.
When iterating on textures, it becomes time-consuming to repeatedly export each texture set into its own directory. Currently, exports are limited to a single output directory per export, which means I need to re-export each texture set separately into its respective folder. This becomes a time sink for any project with multiple texture sets.

It would significantly improve workflow efficiency if we could create multiple "General Export Parameters" templates, each with:

• its own output directory
• an assigned texture set

After setting these up once, re-exporting could be done with a single click while automatically routing each texture set to its correct folder. This would streamline iteration-heavy pipelines where textures are frequently re-exported.

If there are technical or compatibility reasons this hasn't been implemented, I'd also appreciate insight or recommended alternatives. Thanks!
Currently, users have to create custom bins to organize frequently used effects. It would be much more intuitive to have a simple 'Star' or 'Heart' icon next to each effect in the Effects Panel. Clicking this icon would automatically add the effect to a dedicated 'Favorites' category at the top of the list. This would significantly speed up the workflow for editors who rely on the same 10-15 effects daily.
I have several collections in Lightroom Classic on my desktop computer that are sorted in a custom order and sync to Lightroom Mobile. The order displays correctly in the LR mobile app on both my iPhone and iPad, but when I view the pictures in the LR app on Apple TV, the sort order is not correct, even when I select "custom" in the TV screen option next to the photos. Any idea how to fix this? Thanks.