
swrutherford (Inspiring)

Add a "Save as Preset" option to the Output Module Settings dialog box in After Effects
Open for Voting

After Effects: Currently, to create a "template" in Output Settings, you click the drop-down next to the Output Module and select "Make Template...":

[Screenshot: options presented when clicking the Output Module settings in the Render Queue in After Effects]

...then you have to select "Edit" from the middle of the screen:

[Screenshot: Output Module Templates dialog box, creating a new template]

...and then you can edit your settings.

[Screenshot: Output Module Settings dialog box]

This is the same dialog box that can be accessed by simply clicking the queued item's Output Module settings in the Render Queue:

[Screenshot: queued composition in the Render Queue]

I believe it would be beneficial to have a "Save as Preset" or "Save as Output Module Template" button on the Output Module Settings page, like Photoshop has on its "New Document" dialog box:

[Screenshot: Photoshop "New Document" dialog box with its "Save as Preset" option]

Having the option on the Output Module Settings dialog box would speed up the creation of templates/presets, since it reduces clicks overall. It would also mean that no matter how I arrive at this dialog box, I can still save the template/preset, including when I open these settings by clicking in the Render Queue (i.e., I would not have to back out of the Output Module Settings dialog box just to reload the same box from a different path). It's a small change, but it could streamline workflows when you often need to create new templates for clients. Thank you for your consideration.

Max29840681yomc (Participating Frequently)

Dynamic Link between AE projects
Open for Voting

This might be a long shot, and I don't know if it's technically possible to implement, but I think this would be a major improvement to After Effects as a whole. The easiest way to explain it is the Dynamic Link feature between Premiere Pro and AE, but going from AE projects to other AE projects.

I would have plenty of use cases for this, and it would elevate professional workflows to new heights. It is just like having your own library of functions in coding that you can link into multiple scripts: updating or adding new logic to the source function automatically propagates to every script where it is linked. In this case, the "function" would be an After Effects project that you can reuse in other projects, and that automatically updates in all linked projects whenever the master project changes.

Real-life example: I have a beverage flat label that I wrap around a can with a UV map. I use this label in multiple projects, since there are plenty of different campaigns and animations where this can is animated differently and set into different scenes. Every time anything on the master flat label changes, or we have to translate it into different languages for local campaigns, I have to manually update and re-import the label project in ALL of my different campaigns. If this live link were a feature, I could just make the changes in the master file and they would automatically flow into all linked projects.

This could also be expanded further, the way MOGRTs work: you could control certain properties when the project is imported/live-linked from the master, without manipulating the actual master project. Obviously this would be a pretty major feature, but I'm certain it would have a huge impact on general user experience and complex professional workflows.
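The library-of-functions analogy above can be made concrete in a few lines of Python; a minimal sketch (file names and label text are invented for the example) of one "master" module consumed by multiple "campaign" scripts, where one edit to the master propagates to every consumer:

```python
# Sketch of the "shared library" analogy: a master module ("label.py",
# standing in for the master AE project) imported by several campaign
# scripts. Editing the master once updates every consumer.
import importlib
import pathlib
import sys
import tempfile

sys.dont_write_bytecode = True          # keep the demo free of stale .pyc files
workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)

# "Master project": a shared label definition.
pathlib.Path(workdir, "label.py").write_text("TEXT = 'Cola Classic'\n")

import label                            # every "campaign" just imports the master
campaign_a = f"Campaign A renders: {label.TEXT}"

# One change to the master file...
pathlib.Path(workdir, "label.py").write_text("TEXT = 'Cola Zero'\n")
importlib.reload(label)

# ...and every consumer picks up the new label automatically.
campaign_b = f"Campaign B renders: {label.TEXT}"

print(campaign_a)  # Campaign A renders: Cola Classic
print(campaign_b)  # Campaign B renders: Cola Zero
```

The proposed AE feature would play the role of `importlib.reload`: linked projects re-resolve against the master instead of holding a frozen imported copy.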

AI-Enhanced 3D Camera Tracker (Adobe Sensei Integration)
Open for Voting

Hi Adobe team, I'd love to see a major update to the 3D Camera Tracker in After Effects, something that brings it up to modern AI standards and makes it a truly intelligent scene-analysis tool instead of just a point solver.

Right now, the 3D Camera Tracker works, but it still feels very manual and outdated by comparison. We get tracking points and a solved camera, but we still have to figure out ground planes, scale, object anchoring, occlusion, and scene layout ourselves. What I want to see is an AI-enhanced version of the 3D Camera Tracker powered by Sensei: something that understands the scene, not just tracks it.

What This Could Look Like

1. Context-Aware Auto Camera Solve
- AI-assisted solve that distinguishes between static and dynamic objects
- Better drift prevention by ignoring moving people, cars, etc.
- Real-time confidence feedback so we know whether a solve is solid

2. Automatic Ground & Room Detection
- Auto floor detection and proper world orientation
- Basic room building (floor plus walls from detected planes)
- Scene scale estimation
This alone would save a ton of manual setup time.

3. Full Scene Depth & Dense Reconstruction
- Optional full-scene Z-depth scan
- AI-generated depth-map layer
- Dense point cloud or lightweight mesh generation
- Even experimental Gaussian-splat-style scanning for better object placement
This would dramatically improve occlusion and 3D compositing inside AE.

4. Object-Aware Tracking
Instead of just random track points:
- Detect and label objects (person, desk, chair, vehicle, etc.)
- Allow tracking or stabilizing specific objects
- Generate 3D nulls attached to recognized objects
- Separate dynamic vs. static elements automatically

5. Scene Graph / Structured Data
- Build a basic scene hierarchy (camera, ground, walls, objects)
- Allow us to attach elements to real-world objects instead of guessing track points

6. Lens Detection & Distortion Handling
- Auto-detect focal length
- Detect and correct lens distortion
- Match CG lens settings automatically

7. Modern Export Options
- Export camera, point cloud, depth maps, and object tracks
- Clean export to C4D, Blender, Unreal, etc.
- Include semantic scene data

Why This Matters

As a video creator, I spend a lot of time:
- Hunting for good track points
- Manually defining ground planes
- Fixing drift caused by moving objects
- Creating fake occlusion masks
- Guessing scale

An AI-enhanced tracker would turn this into: drop footage, solve, get a usable 3D scene. It would make After Effects feel years ahead of other compositing tools and remove a lot of friction from 3D integration.

After Effects is already incredibly powerful. Updating the 3D Camera Tracker with Adobe Sensei would make it feel modern again and massively speed up real-world workflows. I'd love to see Adobe explore this direction. Thanks for considering it.
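The "scene graph / structured data" idea in item 5 could be as simple as a small hierarchy of typed, labeled nodes that layers anchor to by name. A minimal Python sketch, with all node names, fields, and values invented for illustration (this is not any actual AE or Sensei API):

```python
# Hypothetical sketch of item 5's scene graph: a solved scene as a
# hierarchy of typed nodes a compositor could attach layers to.
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    name: str             # e.g. "camera", "ground", "chair_01"
    kind: str             # "camera" | "plane" | "object"
    position: tuple       # world-space (x, y, z)
    confidence: float     # solver confidence, 0.0 to 1.0
    children: list = field(default_factory=list)

    def add(self, child):
        """Attach a child node and return it for chaining."""
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first lookup so a layer can anchor to a named node."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None

# A solve might hand back something like this:
scene = SceneNode("scene_root", "object", (0, 0, 0), 1.0)
scene.add(SceneNode("camera", "camera", (0, 1.6, 5.0), 0.97))
ground = scene.add(SceneNode("ground", "plane", (0, 0, 0), 0.92))
ground.add(SceneNode("chair_01", "object", (1.2, 0, -0.4), 0.81))

print(scene.find("chair_01").position)  # (1.2, 0, -0.4)
```

Attaching a 3D null to "chair_01" would then be a name lookup rather than a guess among anonymous track points, which is the core of the request.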

Jason Pursell (Participating Frequently)

Any idea on when some missing features will be added?
Open for Voting

Hi, I'm a user who does mobile-only editing for half the year (when I'm hiking, I only have a phone). As such, I had high hopes for this release, but I am underwhelmed. I'm sure my expectations were too high, but I hoped for at least parity with Rush, since you've deprecated that.

Having said that, the interface is nice and fluid, although there is some wasted space (even on the phone). It hasn't crashed on me like Rush did, so that's good. The audio/music/voiceover/sound-effects features are really great and easy to use. The generate-video-from-image feature is pretty darn cool, I must say. I don't want to sound negative, and I do hope you have plans to expand the features. I'd be willing to pay for a subscription if it includes features on par with or surpassing those of LumaFusion. But currently I'm unsure whether this is ultimately meant to compete with apps like LumaFusion (since you deprecated Rush) or only meant for short and simple TikTok-style videos.

Some examples of features that I hope to see:
1. More control over clip fit, crop, position, size, scale, blending, and rotation. For example, the ability to zoom in/out of a video clip. Rush only had this for still images; being able to zoom in/out of video (with control) would require the next item.
2. Keyframes.
3. Effects.
4. Adjustment layers.
5. Double-tap in a blank area of the timeline to collapse/expand all clips to fit horizontally in view. I understand this is not possible vertically.
6. The ability to select multiple clips in the timeline for editing or dragging.
7. Finer control of titles, especially size and position. For example, the ability to Apply to All for text size and position; currently we're just guessing with the haptics. Or even give us the ability to create a user-generated template.
8. Project settings. Everything seems to be preset by the first imported clip, but there's no way to tell in the app; only aspect ratio is exposed to the user.

Anyway, thank you for taking the time to read.