
kalamazandy
Engaged
Activity
‎Feb 07, 2025
01:25 PM
Oh, it differs. Probably more so because I am now home and have only my terrible laptop monitor and a slightly better but also terrible cheap Asus monitor. I don't do visual color adjustments at home. With Display Color Management turned on, using my less terrible Asus screen, I actually get f1f2f1, but then a27e1f for the gold color. Exporting H.264 using the standard high-bitrate preset (I tried software encoding as well, but that made no difference), I get f3f3f3 and a17e1d. So no DCM is better, but it still changes things a bit. I exported this a year or two ago and got the colors to match, even checked in a browser, and it matched. This is why I am now baffled as to what is going on. It seems to me that exporting a video from Premiere and importing it directly back in should yield the same results.

Then I had a breakthrough, sort of. I rendered ProRes, and it was fine. Changed down to 8-bit, still fine. Turned the Maximum Bit Depth checkbox off, which eliminates the Rec. 709 color space option, and I got the same results as with the ProRes file. So I went back to H.265 and tried Maximum Bit Depth; it was the same as unchecked until I changed to the Main10 profile, so the output ends up being 10-bit. An improvement! The values are now only one digit off at a07d23 and f0f2f0. H.264 with Maximum Bit Depth at High10 actually got closer with f0f2f1 and nailed the gold at a17d24.

So maybe it's more of a lossy compression optimization thing that you just can't control with Adobe's outputs. Out of curiosity, I added WebM into the mix and only changed between 10-bit and 12-bit color depths. Both produced different results. So it could also be that Premiere has different ways of handling color conversion between bit depths. After Effects lets you choose 8-, 16-, and 32-bit compositing. Premiere is geared more toward camera footage, and I'm not actually sure what bit depth it works in, or if it just adjusts automatically to the highest piece of footage?
Either way, I could see rounding issues if you convert an 8-bit color to a 32-bit float and then go back, especially if you're adding color profiles and gamma to the mix. So it seems Premiere just has some issues nailing down the colors on export to H.264 or H.265 unless you use Main10 and Maximum Bit Depth. I have to assume that, with more bits at the same bitrate, you'd get lower quality in most cases, and I'm not sure how well supported 10/12-bit playback is outside of the editing world. So it's probably best to export ProRes and compress the final output elsewhere.
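To make the rounding idea concrete, here is a small sketch in Python (my own illustration of the failure mode, not Premiere's actual pipeline): a full-precision float round trip through linear light is lossless at 8 bits, but if the linear intermediate is quantized to low precision along the way, code values start to drift.

```python
# Hypothetical illustration of 8-bit -> linear float -> 8-bit round trips.

def srgb_to_linear(v):
    """sRGB EOTF: 8-bit code value (0-255) -> linear light (0.0-1.0)."""
    c = v / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(lin):
    """Inverse sRGB transfer: linear light -> 8-bit code value."""
    c = lin * 12.92 if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055
    return round(c * 255.0)

# A clean round trip in float precision is lossless for every 8-bit value...
assert all(linear_to_srgb(srgb_to_linear(v)) == v for v in range(256))

# ...but quantizing the *linear* value to 8 bits first (a lower-precision
# intermediate) collapses shadow values, where the curve allocates few codes.
shifted = [v for v in range(256)
           if linear_to_srgb(round(srgb_to_linear(v) * 255) / 255) != v]
print(f"{len(shifted)} of 256 code values shift after an 8-bit linear intermediate")
```

The point isn't that Premiere does exactly this, only that any pipeline mixing bit depths and transfer curves has plenty of places to pick up off-by-one shifts like f1f2f1 becoming f2f2f2.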
‎Feb 07, 2025
10:35 AM
Trying to remove as many variables as possible here to figure this out. I'm set up with the same color settings. Disabled DCM in both AE and Premiere. I added some text right in Premiere and made it the same two colors mentioned: a17d24 and f1f2f1. I have an sRGB image set as my desktop background with those two colors in it. They sample as the correct values, and the display in Premiere samples as the same values. So far, so good. I'm seeing the same thing in both Windows and Premiere. The export window is actually showing the correct values as well.

I export to H.264, standard for most web applications, and import the file back into Premiere automatically. Add it to my timeline, and it samples as a07d23 and f2f2f2. I tried changing the project settings viewer gamma between 2.2 and 2.4, but it made no difference. I tried overriding the footage color input from the default Rec. 709 to none, manually setting it to preserve RGB. Those do make some changes, but wildly in the wrong direction.

I'm not sure what the Rec. 709 levels for the monitor are. If you are referring to the video monitor for the camera, that shouldn't apply here because I'm using graphics, not footage. Footage is a lot more flexible. Graphics, like motion graphics, I'm being asked to figure out how to make match...at least somewhere. The export-then-import-as-different-colors has me stumped.
‎Feb 07, 2025
09:39 AM
Ah, so you're referring to the Display Color Management button. I played around with having that on/off, and changed the color profile in Windows from the correct monitor profile to just assigning sRGB IEC61966-2.1, and both renders turned out the same with Display Color Management on or off. I know Windows can be set to some funky settings. I have auto color management turned off, by the way.

It also seems we are still speaking two languages. I am not matching a color chip. I am only trying to get the colors of a video export to match other on-screen elements with the same color. I'm not matching footage either. I'd chalk it up to browser differences as well, but we have the same issue just exporting a video and importing it right back into Premiere. The colors shift. So I'm just trying to understand why, in order to learn how to combat it.
‎Feb 07, 2025
07:14 AM
That is great information, and I appreciate the back and forth. What I'm after is relative color, not accurate screen color. By that I mean, I'm creating content that generally goes on a website that is on brand. Let's say it has a17d24 elements all over the page (sRGB, of course). And let's say the background is f1f2f1: mostly white, but not white. They want a video to look as if it doesn't have any borders, with some elements in a17d24 moving around. So the video background is f1f2f1 on a webpage whose background color is the same. They don't want to see any difference in the color.

I can usually get it close, like f1f1f1 or f2f2f2. Perceptually, most people who aren't the designers are perfectly fine with that, especially given their devices aren't incredibly high-quality monitors and their lighting conditions aren't ideal. But the gold color, since it's more toward the middle of that gamma curve, shifts pretty far. It can go from orange to brown in comparison to the same sRGB element sitting next to it on the site. Again, we aren't holding color chips up to a monitor. And the marketing folks understand that different quality monitors show things differently. In a tradeshow booth, when this color is up on a TV, it probably doesn't matter: bad lighting, unknown TV, being in the ballpark is fine. But when it is on screen, next to the same thing on screen, they expect the colors to appear the same relative to each other.

I've found various people saying to change different color spaces in After Effects and Premiere to accomplish different things, to just apply a LUT on export that adjusts colors, and even to export an uncompressed file and re-encode with ffmpeg to apply the transform, or import back into After Effects. So I was just hoping to understand how the settings work with each other a bit better, and what variables I can eliminate.
If I'm not interested in seeing the correct colors myself, can I just eliminate my monitor profile from any calculations, for example? I can import into After Effects and get the view to sample the correct color on screen. I can even get the color to sample correctly in Premiere, right up until I export. The sampled (in sRGB) on-screen color matches the sRGB on the same screen. But when I go to export, the preview color is not the same. So what is confusing is figuring out where the input color transforms happen, the view transforms, and the export transforms.

I'm also not seeing an option to skip color management in Premiere, or maybe I'm misunderstanding the terminology. When I create a sequence, I can only choose one of three working color space options, defaulted to Rec. 709, with the others being Rec. 2100 HLG and PQ. I don't have enough knowledge to understand all of this, and I am having difficulty finding it explained. Resources like illusion have some great information on the importance of calibration and the science behind what happens between a color space, the screen, and your face. But I haven't found much in terms of workflow: understanding the differences between monitor profiles, software working color spaces, interpretation of built-in color spaces (or how to handle colors defined in specific color spaces that don't match your working space), and the output color space.
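One concrete reason a graphic can shift when it crosses those transform boundaries is that sRGB and Rec. 709 share primaries but not transfer curves. A quick sketch in Python (my own math from the published transfer functions, not anything Adobe exposes) shows the same code value landing at different linear light levels under each interpretation:

```python
# The same normalized code value decodes to different linear light under the
# sRGB and Rec.709 transfer functions -- a mistagged graphic shifts in mid-tones.

def srgb_eotf(c):
    """sRGB display transfer function, normalized 0..1 in and out."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rec709_oetf_inverse(c):
    """Inverse of the Rec.709 camera OETF, normalized 0..1 in and out."""
    return c / 4.5 if c <= 0.081 else ((c + 0.099) / 1.099) ** (1 / 0.45)

gold = 0xA1 / 255  # red channel of the brand gold a17d24
print(f"sRGB linear:    {srgb_eotf(gold):.4f}")
print(f"Rec.709 linear: {rec709_oetf_inverse(gold):.4f}")
```

The two decoded values differ by several percent of linear light at this level, which is exactly the kind of mid-tone shift that turns the gold orange-to-brown while near-white f1f2f1 barely moves.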
‎Feb 06, 2025
09:47 AM
I completely agree with the importance of monitors being calibrated for making color adjustments, especially when work is viewed on different machines, in different locations, etc. What I'm getting at is the trouble understanding at which point different profiles and adjustments are applied. I'll try to wade through the two resources you mentioned, but the resources I've found in the past were very heavy on explaining the importance of calibration and had nothing on how to accomplish it in practice.

In my case, I'm not making color adjustments. I'd like to know the workflow for inputting sRGB values, going through After Effects (or starting directly in Premiere), and outputting a video that does not change the sRGB. For things like lower thirds, logo animations, and giant branding rectangles, the colors need to be relatively correct. Currently, I'm having trouble just inputting a color like a17d24 (sRGB) and outputting a video where that block of color samples at a17d24.

I recently had to work with the ACES workflow because I needed to use multi-layer EXR files saved with a specific color profile, and that helped me understand that we have to worry about the input profile, the working profile (usually a linear profile for properly calculating light), a view profile, and an output profile. Of course, I'm probably still hitting the Rec. 709 / sRGB problem here because I still don't understand that part.

Thanks for mentioning the Lumetri Color panel's Settings tab. I forgot that existed, and it wonderfully shows all of the related settings in one place rather than having to look at your Premiere color settings, project settings, sequence, clip, etc. In searching for this problem, I see lots of other people mentioning it, and the "solution" being people suggesting to just adjust the gamma when you go to render.
Even in Premiere, I can't seem to figure out the settings to set a color, then go to the export tab and sample that same color in the export preview. Is it always adjusting the output file based on my monitor profile? Is that the answer, that I have to switch my monitor profile to sRGB before exporting? Because that would be terrible.

I've seen similar problems with graphic designers working in Illustrator and Photoshop, not understanding how profiles work with RGB and their monitors, just thinking "convert the CMYK color to RGB...and done." Of course, it's way more complicated than that. But it's difficult to find the correct color profile setups for Windows or Mac, and how to set up the software as well. I remember at a previous job we hired a professional to come down and calibrate the design team's monitors. They set everything up, and we realized most people were configured incorrectly, because their operating system had the profile applied and so did the Adobe software: the same adjustment was made twice, so it stacked. Even professionals get it wrong. That's why it's so important to see the entire intended workflow and understand what it's doing.

If anyone finds this and has a good resource that walks through (for both Windows and Mac) color settings, proper Rec. 709 workflows, and how to handle sRGB inputs with consistent outputs for web, I'm all ears. That's what I'm after. This was the closest discussion I could find that seemed correct. Most questions went unanswered, or just said to adjust it the best you can.
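The "applied twice, so it stacked" failure is easy to show with numbers. A toy sketch (my own simplification, not what any calibration tool actually computes): if a 2.2 gamma compensation that should happen once is applied by both the OS and the app, the image ends up compensated to an effective 2.2 × 2.2 = 4.84 gamma.

```python
# Toy model of a double-applied gamma compensation.

def apply_gamma(v, g=2.2):
    """Gamma-encode a linear value; this step should happen exactly once."""
    return v ** (1 / g)

mid = 0.5
once = apply_gamma(mid)    # correct single application
twice = apply_gamma(once)  # stacked: same correction applied a second time
print(round(once, 3), round(twice, 3))  # mid-gray lifts visibly the second time
```

A 0.5 mid-gray lands around 0.73 after one pass and around 0.87 after the stacked second pass — a washed-out image, which is exactly what a double-managed monitor chain produces.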
‎Feb 06, 2025
08:35 AM
I swear I had figured this out at some point, but the issue made its way back. It seems that @R Neil Haugen is the most practiced expert in the area, so I made sure to add that tag to help clear things up.

Color grading is one thing, and you might just deal with the fact that the web, even across different browsers, adjusts video gamma slightly differently and may ignore color profiles entirely. From what I can tell, browsers (and many video players in general) assume content is sRGB rather than Rec. 709. It's probably not a huge deal in most cases, but for branded material, where a solid brand color is used on a web page and in a video, that color shift can be significant enough to upset graphic design folks. I gave a video to someone and they said they changed the color profile to sRGB and it matched much more closely. I have no idea how they did that, but that's what they said.

The challenge I always have, regardless of whether it's After Effects or Premiere, is: when I input a color, how do I tell the software that I'm specifying an sRGB color rather than a color in whatever the working color space is? Am I correct that Premiere only allows working in Rec. color spaces, but in After Effects you can specify the color space in ways that are incredibly confusing overall, yet give you serious control?

I would love to find an in-depth training on color spaces in terms of input and output, conversions, and monitors. The ones I find are always focused on making sure I am seeing the color correctly, and that I am outputting the color based on the monitor I am viewing it on. That's definitely right for someone who is color grading. But if I'm doing a four-color brand animation, I want to input the four sRGB values and output the same four sRGB values, ignoring my monitor profiles entirely.
It becomes even more complicated when using an OCIO color workflow with 32-bit EXR files, which are treated as linear, while also having to mix in sRGB values (obviously talking about After Effects here), then moving that footage to Premiere and outputting the final file as Rec. 709...only to have the colors shift because the final output is displayed in a web browser that shifts the colors back toward sRGB. I may be incorrect about what's going on. I'm just trying to explain my understanding, and hoping to be pointed in the right direction on how to input an sRGB value for a solid color and have the output H.264 file, displayed in a web browser, be as close as possible to the input value. Again, understanding there are slight differences in the sRGB values because sRGB is not a straight 2.2 gamma, so you can't just apply a gamma adjustment to get the same values.
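That last point can be backed up with numbers (my own sketch from the published curve, not anything from Adobe): the sRGB curve is a short linear toe joined to a 2.4-exponent segment, which only *approximates* a straight 2.2 power curve, so no single gamma tweak can exactly undo an sRGB/2.2 mismatch across all code values.

```python
# Compare the piecewise sRGB curve against a pure 2.2 power curve and find
# the code value where they disagree the most.

def srgb_piecewise(c):
    """sRGB EOTF: linear toe plus a 2.4-exponent segment, normalized 0..1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pure_gamma_22(c):
    """A straight 2.2 power curve, normalized 0..1."""
    return c ** 2.2

worst = max(range(256),
            key=lambda v: abs(srgb_piecewise(v / 255) - pure_gamma_22(v / 255)))
diff = abs(srgb_piecewise(worst / 255) - pure_gamma_22(worst / 255))
print(f"largest mismatch at code value {worst}: {diff:.4f} in linear light")
```

The worst mismatch sits in the upper mid-tones, not at black or white, which matches the observation that near-white f1f2f1 survives while the mid-range gold drifts.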
‎Oct 16, 2024
12:39 PM
Google also suggests you use glue to keep cheese from sliding off pizza. Google is a terrible source of information. It is a great resource to find good sources of information. If you ask it certain questions it will give you nonsense as solutions, because enough people provided that as an answer in various places and Google forgot to program in sarcasm detection. A lot of the time when I am stumped and search for something, more often than not the first results tell you that you can't do the thing. Keep searching, dig around a bit, and I find the answer, which IS in fact possible. I have found an incredibly high frequency of Mac users responding that you can't do something, compared to others, furthering my opinion that many people think they know Mac well only because they don't do anything that can't be done on their phone. Anyway, I wouldn't go around quoting a search engine as more reputable than a person. The person at least has some level of intelligence. Google's AI is an LLM, so it is both incredibly smart and incredibly stupid, and it doesn't know the difference. [abuse removed by moderator]
‎Oct 16, 2024
11:51 AM
1 Upvote
Prioritizing based on user frequency alone would cause Adobe to completely lose professionals. How many people actually use ProRes 4444? That's a very specialized format that a vast majority of Adobe users will not, and should not, use. But it's important to have for those professionals who rely on it. If we always waited until we saw demand for something before pulling the trigger, we would have very few innovations. The masses didn't want a motorized vehicle, they just wanted a faster horse. Adobe probably supported H.264 well before hardware really handled it. That's why so many people say things like "No, you need ProRes for proxies because H.264 will grind your timeline to a halt." And there are still quite a few people who say that, just because that's what it once was and they haven't tested it since. It may be true if they are working on a machine that's more than 10 years old, but on anything capable of reasonably editing 4K+, it is doubtful that they don't have hardware encoding/decoding specific to H.264.
‎Oct 16, 2024
06:08 AM
That's a great point. If the argument against it is that it requires specialized hardware, then why in the world would Adobe support Mercury Transmit monitors, for example? They choose things with a balance between what their mass users use and what their professional users use. Sometimes what the mass users use ends up being what the professional users use as well. AV1 is likely something that social media content creators would use, and large studio editors likely will not. And that's OK. It's kind of like normal users wanting adaptive cruise control, and an IndyCar driver yelling at everyone, telling them how dumb they are for wanting that in their vehicles because he wouldn't use it. Recognize there are multiple groups of users, request what you want, and help the regular requests along in a way that could benefit you in the long run, if you'd like to. I don't have a 4000-series card yet, so this doesn't benefit me now, but it could in the future. Plus, if someone were to give me a slew of files, I'd love to not have to jump through hoops to use them whenever possible.
‎Oct 16, 2024
05:45 AM
Mismatched audio, for sure, can't be used for doing an audio mix...sort of. Adobe could handle it several ways. The most obvious would be to highlight mismatched audio so the user knows they can get some information from the proxy, but shouldn't go adjusting settings on individual channels of the audio. There are some easy ones, too, like if the original has no audio but your proxy has audio channels: ignore them automatically, or at least do a check for data. Yes, there's an audio stream, and it's completely silent, so go ahead and ignore it. Maybe just warn me if the original has no audio but my proxy does? And still let me use it if I want to. Maybe someone decided to create a proxy of the original video and include the lav mic audio in it, just to use as a stand-in for something else. That could be a good idea; maybe that works for them. I just wouldn't block something like that because "that's not the way people work with Premiere."

I get Canon footage all the time with 4-channel audio and only 1 channel in use. So my proxy has to have matching audio, even though those other 3 channels are garbage because they are silent. Adobe could have a method of flagging silent tracks and allowing mismatches for them. In fact, I would prefer to let it spend the time checking for audio signal on ingest and automatically removing silent streams (as an option). If someone mistakenly recorded a mic to stereo audio and it's only on one channel, it would be great to have the option of just eliminating the right channel and treating the left channel, where there actually was signal, as mono. Right now I have to change that manually in Premiere AND leave the original alone, OR run it through ffmpeg to change the stream of the original, which I don't have time to do 99% of the time, and I wouldn't trust someone who does have the time with that kind of task. Again, this should just be optional.

I'm sure some people do want to keep the original audio format even when it isn't used. Or if the video file was S&Q, and so has no audio stream at all, they may have some process that requires their proxy file to also have no audio streams. I don't know why that would be useful, but it very well could be for someone, and the software should support that workflow.
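The silent-track check being asked for here is cheap to do. A minimal sketch (hypothetical ingest logic, with plain Python lists standing in for decoded PCM samples — this is not a Premiere API):

```python
# Flag audio channels that carry no signal, so mismatched-but-silent streams
# could be ignored instead of blocking proxy attachment.

def silent_channels(channels, threshold=1e-4):
    """Return indices of channels whose peak level never exceeds threshold."""
    return [i for i, samples in enumerate(channels)
            if max((abs(s) for s in samples), default=0.0) <= threshold]

clip = [
    [0.2, -0.4, 0.1],     # ch 0: real signal (the one used mic)
    [0.0, 0.0, 0.0],      # ch 1: recorded but silent
    [0.00005, 0.0, 0.0],  # ch 2: noise floor only
    [0.0, 0.0, 0.0],      # ch 3: recorded but silent
]
print(silent_channels(clip))  # → [1, 2, 3]
```

A real implementation would stream samples rather than hold them in memory, and probably use an RMS window instead of a raw peak, but the decision itself — "these channels contain nothing, allow the mismatch" — is this simple.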
‎Oct 09, 2024
06:29 AM
1 Upvote
Shoot, you're right. For some reason I thought I had hardware-encoded that on my computer at some point, but I may have confused it with another experiment. I am currently using H.265 10-bit 4:2:2 for our proxies so that we can access them remotely more easily. We are already shooting ProRes 422, so we're only using it to lighten things up a bit. But I'm definitely interested in AV1 for the future, since H.265 only works well for us when packaged as a .mov at the moment, because that allows encoding with audio streams that match things like RED cameras and Canon's default 4 channels on the C-series cameras. I wish Premiere would just let you ignore the mismatch and warn you every time or something, but I've only found one workaround, and it is apparently not Mac compatible.

AV1 could also be great for proxies with transparency. Currently we have to use ProRes 4444, which is larger than an image sequence, or I think CineForm has a version that works with alpha as well; I can't remember. It is handy to package image sequences as a video format, but not so much if the video is larger than the images. That's more of an After Effects thing, but I do occasionally use them in Premiere from 3D animation software. It's easier keeping track of a single file, though. And if someone has to offline the sequence and find it again, After Effects is kind of dumb and sometimes changes the framerate, because it acts as if you have completely changed the footage rather than just found the missing footage. So if your footage was 24fps but your machine's default is 30fps, it may change it to 30.
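For anyone wanting to reproduce that proxy recipe outside Premiere, here is one way it could look as an ffmpeg invocation, assembled as an argument list in Python. The flag choices are my assumptions, not anything Premiere exposes, and the file names are made up; note that `libx265` needs a 10-bit-capable build for `yuv422p10le`.

```python
# Sketch: H.265 10-bit 4:2:2 proxy in a .mov wrapper, with every audio
# stream copied through untouched so the channel layout still matches the
# original (the mismatch problem described above).
src = "A001_C002.mov"                 # hypothetical source clip
dst = "proxies/A001_C002_proxy.mov"   # hypothetical proxy path

cmd = [
    "ffmpeg", "-i", src,
    "-c:v", "libx265",          # software HEVC encoder
    "-pix_fmt", "yuv422p10le",  # 10-bit 4:2:2
    "-crf", "23",               # quality-targeted; far lighter than ProRes 422
    "-c:a", "copy",             # keep all audio streams/channels as-is
    dst,
]
print(" ".join(cmd))
```

Copying audio rather than re-encoding is the key detail: it preserves the 4-channel Canon/RED layouts so the proxy's audio matches the original stream-for-stream.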
‎Oct 08, 2024
01:21 PM
1 Upvote
Nvidia has hardware decoding for AV1 starting with the 30xx series, and hardware encoding starting with the 40xx series, with increased resolution support on the newer and higher-end cards. Intel also has hardware acceleration for both starting with Arc. I just wanted to throw out there that hardware acceleration exists for both encoding and decoding. It's just not in camera hardware yet, and the software to utilize the hardware acceleration is behind, which is where this request comes in.
‎Jul 19, 2024
12:47 PM
Which is Alt+Shift drag on top. In After Effects it's just Alt drag on top. I love it when software for very similar purposes has slightly different shortcuts. But Premiere also currently has a bug, like I mentioned, that causes it to not work if the clip has a percentage speed change. It does work fine if the speed change is done as a time remap with keyframes, though. And in After Effects it just works fine all the time.
‎Jul 19, 2024
12:09 PM
1 Upvote
It mostly works, if you add the new file to your bin first (or have a bin open if it's in a different .prproj). Drag the file over top of the timeline clip you want to replace while holding Shift+Alt. That does not work if the clip has a speed change, though, which is a bug. Annoyingly, I tried reporting that, and an Adobe employee said yeah, that seems like a bug, then proceeded to tell me to spend a bunch of time doing steps to show them exactly how it is a bug. I don't have time for that, so for now you just have to change the speed to 100%, replace the clip, then change it back.
‎Jul 18, 2024
11:49 AM
1 Upvote
On Windows, holding Alt+Shift should allow you to replace a clip on the timeline with another copy. In my case, I have to do this because I am replacing footage in an existing project with footage from a production. Since footage in a production, annoyingly, doesn't show in your bin at all, this seems to be the only way to rebuild a timeline so that it actually uses the footage from the production. It's the same footage, so why would I want to do that? I'm sure someone would ask. The answer is that we had already created a Premiere file for each shoot and created proxies for them. I figured out a trick to make proxies despite the stupid headache of Premiere being incredibly difficult about audio mismatches. Yes, Premiere, I'm aware that my proxy file has audio channels and the S&Q footage that we shot has no audio. Since proxies already exist, but they can't be assigned to half the files because of various mismatches, I'm replacing the footage on the timeline one by one.

I got to a nested sequence where the couple pieces of footage in there had their speed set to 200%. When I replaced the footage, holding Alt+Shift, I noticed the footage jumped. I changed the footage speed to 100%, and the replacement stayed in place. Then I just changed the speed back to 200%. I thought it would be annoying to replace a clip with more complicated time remapping, but no: replacing one with time remapping keys works fine, just as you would expect. It's only when the speed has been changed with a straight percentage. So this seems clearly to be just a bug. I did an experiment and changed the speed to 200%, but with a time remapping key, and then it was fine. I also set several keys on another clip and changed it dramatically. Same thing, no issues with changing which footage the clip was referencing.
‎Jul 16, 2024
01:04 PM
...unless you are using a production, because the footage on your timeline isn't in your bin. Productions are really nice, but that is something they overlooked for sure. So the only way to replace something on the timeline with footage from a production is to drag the footage in, which means you need a way to match the in/out points.
‎Jun 26, 2024
01:43 PM
One big oversight I've found is the ability to scale footage to fit within a certain size. Only recently did Premiere finally add the ability to set a scale percentage. That's still annoying, though, because if I have some 12K footage, some 6K, 8K, and 4K, then I'm not going to want to create proxies all at quarter size. I just want them all to fit within something like 2048x1080. Is there maybe a way to do it with the scripting API? With certain cameras I've especially found people like to record cinematic formats, so it isn't just choosing between 16:9 and 17:9. I'm looking to scale files down, apply an appropriate Rec. 709 LUT for whichever camera, and use a low bitrate so the footage can be previewed by folks on the web. I don't have a Mac handy, nor an extra $1k for EditReady, so I'm hoping I can get this done with AME watch folders somehow.
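The "fit within a box" math itself is simple, which makes its absence from the proxy dialog all the more frustrating. A minimal sketch (the function and names are mine, not any Adobe scripting API): scale each source to fit inside 2048x1080, preserve aspect, never upscale, and keep dimensions even for codec friendliness.

```python
# Compute fit-within-a-box proxy dimensions for mixed-resolution sources.

def fit_within(w, h, box_w=2048, box_h=1080):
    """Largest size <= (box_w, box_h) with the same aspect, never enlarged."""
    scale = min(box_w / w, box_h / h, 1.0)  # 1.0 cap: never upscale
    even = lambda n: round(n) // 2 * 2       # codecs prefer even dimensions
    return even(w * scale), even(h * scale)

for src in [(12288, 6480), (8192, 4320), (4096, 2160), (1920, 1080)]:
    print(src, "->", fit_within(*src))
```

With this, 12K, 8K, and 4K sources all land at 2048-wide proxies while an HD source passes through untouched — exactly the behavior a fixed quarter-size percentage can't give you across mixed footage.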
‎Jun 17, 2024
11:02 AM
1 Upvote
Looking at a 4-year-old video explaining that encoding in a new format is the culprit? Well, that may have been true then, but AV1 can now be encoded on the GPU. WebM was definitely in the same boat. If you look at which cards can encode what, so long as you have a reasonably good card from the last 2-3 years, you'll have no problem with encoding speeds.

It's probably likely that Adobe splits things into two categories: mass users and professional users. Mass users are likely your YouTubers and cheaper wedding/church videographers (it's surprising how many of those there are). Those are mostly going to be using, on average, H.264 and whatever comes straight out of camera. Professional users are mostly going to use files as close to the originals as possible, so whatever they are getting straight from the camera or an external recorder like an Atomos-type device. A good number of those will be ProRes, and the things that aren't, like footage from a racing-style drone, will likely be H.264. Unless something causes the mass users to start switching to another editor because it supports AV1, the driving point will likely be a camera manufacturer putting a hardware encoder in their camera system to capture AV1 12-bit 4:2:2. I can't quite remember, but I thought it also supported 4:4:4, which is a one-up on ProRes, which only offers that as 4:4:4:4. That is very annoying, because there's no point in that last 4 unless you've got alpha.

Premiere supports H.264 10-bit, BUT there are a ton of variations in how things are packaged. It is fast with some, and slow with anything I've tried getting out of ffmpeg or similar. Resolve is aware that H.265 can actually be a good edit format if you have reasonably current hardware, and it is one of their options for edit proxies. Resolve may be another driving factor, because Premiere does need to compete with them a bit. We would definitely switch to Resolve if we didn't use After Effects so often. The ability to copy and paste between them (with some degree of success), import AE files directly, and create .mogrt files when needed speeds things up just enough to make the annoyances in color workflow and the extra hard drive space needed for Resolve workflows not worth it for our studio...yet. Project success can always shift things a bit, right?
‎Mar 29, 2024
08:19 AM
I think you've nailed it on the head with the cameras not having it is likely the cause for lack of support. I think, just like people still stuck in thinking they have to use CMYK for anything related to print (which is pretty obsurd given how the majority of printers work and the majority of content is created primarily for on-screen usage with print as a secondary) I think people are Definitely being mislead by their own bias with thinking that Adobe cares about professional use first, prosumers next. If this were true, they wouldn't have put work into creating lightroom cloud, Adobe Rush, Photoshop and illustrator on iPad, etc. They're going to follow the money. One thing you all might be missing are corporate users. There are a Lot of them. And they often barely know how to open the software, but they Pay for it. Granted they are more likely to use Rush in something like this, but Rush is really just Premiere with a different skin on it and lots of limitations so should still apply. A few cameras, like Sony a7sIII for example, shoot to odd H.264 formats that record in 10bit 422 and Premiere edits them incredibly speedy. Part of that is likely because so many computers have some form of h.264 video acceleration built in. A few years ago, I remember seeing folks bashing h.264 in the same way. "not an editing format" "will bring your machine to a screeching halt" "low quality" etc. There are tradeoffs, but Sony is definitely recognizing them. You sacrifice a bit of color inaccuracy for files that are 1/8 the size. For something like an internal corporate video, you'd be an Idiot not to find that an acceptable sacrifice. For something like a feature films, probably not. Often people focus too much on a detail they care about, not thinking about the quality loss on something else. I'm sure we can all relate when choosing something like a lens for a camera. 
Yeah, I'd Love a T1.2 35mm but I don't have 40K to spend on a lens and don't want to spend 2K on a lens that has that, but glass with a ton of distortion anywhere outside of exact center. If you only focus on the aperature, you'll end up with a lens that captures a really narrow DOF. My point is, AV1 Could be a Really nice editing format for users that have a data budget. I have to keep 100% of my footage, forever, because of compliance reasons and I'm currently looking into using H.265 to replace older footage that we are unlikely to use, but are required to keep. (Yes, I know I'd be deleting the old files, but legally there are exceptions for storing the files in an "archival" format without specifying what that means). I was just showing the results to another video professional and he was delightfully surprised that it could hold so much information. AV1 would be even better, but I wouldn't commit to that if I couldn't throw a clip into Premiere. The way I'm converting sLog3 ProResHQ footage to H.265 and the only noticeable loss is film grain in low contrast areas. I tested it on a piece of terrible footage recorded in ProResRaw from an FX6 with the sun through some trees behind the subject, recorded 3 stops too high. Terrible footage. lol None of it was Great, because ProResRaw IS actually lossy. So in the grain you'll get some oversaturated bits. Those changed regardless of converting, but the overall color of the H.265 wasn't "worse." There was a slight color shift in some mid tone orange, but I had to push that Hard to see, well past anything you would do normally. So I get that it might not be "here" yet. But Adobe does things All the time that aren't "ready." I wouldn't assume anyone is lazy, stupid, etc. But it sure would be nice if Adobe would understand that if they added this, then they would help change the industry a bit. As cameras include hardware accelleration, you'll see more of them using formats Like AV1. 
I'm sure that's why Adobe has an interest in it. As soon as a Pro camera maker figures out how to handle the hardware to utilize it, they can do that at an "acceptable loss" for what they get in return: the ability to use cheaper media, less battery drain, and the bragging rights of smaller high-quality files. My guess is it will be a model that crosses over from Prosumer to Cinema. I'm sure AV1 adoption would be appealing to someone like DJI, who requires light weight for almost all of their cameras.
‎Mar 27, 2024
06:32 AM
1 Upvote
I appreciate it, but no thanks. Similar to how folks have been complaining about moving to self-checkout, then complaining about people stealing (often by accident, simply because the self-checkout they offered sort of sucks), it is not my job to fix your software. It used to be that software companies had an interest in identifying and fixing bugs on their own. Now when we inform them of a bug, they forward us to the "correct" way of using a form to report the bug. In many cases they even ask us to repeat the actions on a different machine with a clean operating system install, with no plugins, etc., to see if the problem persists. Again... Not my job. And hey, I'm sure it's not Your job Kevin, so this is a complaint about the way Adobe is running and not You. But it seems they rely Much too heavily on user-voice-style reports rather than just improving their software by having people listen and identify frustrations and probable bugs from forums like this. I provided enough information for a knowledgeable Adobe employee to replicate the problem.

If Adobe wants people to write their own bug reports properly, then they should put a form built into the software again, where you would go to the Help menu and see a "Report bug" button. This should then ask you for that information, AND not all of the information should be required, because it may not be applicable. In this case the issue is related to the way Adobe handles cloud files; how much available VRAM I happen to have is not important. It's also something the software should be able to fill out automatically. Of course, it can't do that if I have to go to a community forum. Bring back the button.

There was a time when, if I was working in a piece of software and posted a question or potential bug, a technician would actually Call me in response. They would do that after they looked into the issue themselves, to offer a temporary solution if needed or to ask for further information that maybe wasn't provided.
I have an interest in making the software better, so I do appreciate the ability to add bug reports and will do that, but Make it Easy. Adobe Really sucks at this at the moment. Make it consistent through all of their software, and provide responses from technical teams rather than a generic "Adobe employee" who just points you to "how to write a bug report" a year and a half after the last response.
‎Feb 02, 2024
06:43 AM
Is there still no way to create a proxy render preset that lets you match audio settings, and set the resolution of only one side while keeping the aspect ratio? It really sucks to send a bunch of files to Media Encoder, not realizing that some of the files weren't recorded with audio in the camera. Premiere doesn't bother checking the files against each other, so it renders them, then tells you that you can't use the file as a proxy because the audio doesn't match. The same goes for when someone uses a Canon and records 4-channel audio even though they aren't using it. (Very annoying.)

It just doesn't make much sense that you need to create different presets for Every change other than framerate. I've got a 16:9 and a 17:9 preset for H.264, H.265, a couple of ProRes sizes, etc. And if I have an audio mismatch, I've got to create another one? Am I missing something?

I am currently looking into a way to just have some watch folders using ffmpeg instead, because you Can do that. Unfortunately, Premiere is really picky about which settings you need in order to be somewhat optimized. We use H.264 when working remote, and only create edit proxies when needed. I'd rather use Media Encoder though, since Premiere already reads those files and edits smoothly. I found a Reddit post that suggested Adobe uses a different package type, which ffmpeg can produce, but not identify or something. I would Really like a solution to create files that will edit OK in terms of just playing back smoothly. And we need it to be something like H.264 or H.265 so we can sync the footage remotely easily.
‎Jan 24, 2024
10:49 AM
Outside of disabling internet entirely on a computer, is there a way to disable all of the features in all Adobe apps that upload information for processing? I have to assume not all users understand how something like Generative Fill works: it uploads your image, processes it, then sends it back. Transcripts in Premiere, selections in some circumstances, etc. You are not always well informed of when something is uploaded. And in some cases the image is actually saved for review — for example, if you rate a generative image, the image itself is saved for Adobe to review, but it does not inform the user of that. I only found that out while searching through some of their very disjointed documentation.

It looked like maybe you could do this with software profiles, which appear to only be available on the full enterprise version (not Teams). But it looks like you can only enable/disable full services, so I'm not even sure it would work. You can disable Firefly, but would that disable non-local options for Select Subject? I'm guessing no. And as they add new features, this will only get worse. CS5 had a registry option to disable it, but when I added the option, Photoshop would only show the splash screen. Windows Firewall might be a solution, but it would be complicated to track if the service that is sending information is not the same one as the running software.

Any other ideas? It would be really nice if Adobe would just give us the option to run the software with all uploading disabled. But if I'm working on some silly social media post or something, toggle it back on, because it wouldn't really matter if an image of a few people standing outside of a paintball arena were to get uploaded offsite. I don't mind things being uploaded. I just need to be sure we are able to control What is being uploaded.
‎Jan 24, 2024
06:57 AM
This is not the correct answer. This is only related to cloud storage. Many of Adobe's programs send pieces of your work to AWS, where the image is processed on their servers, and then the results are sent back to you. And some interactions that users may not be aware of actually cause the image to be stored for review by Adobe, without explicitly telling you. I'm not sure most users know that making a selection with the direct select tool is doing that. They may know that Generative Fill sends data, but Content-Aware Fill seems Awfully similar and Doesn't send anything..... I think. Again, you don't Really know. I'm trying to figure out a way to enable/disable All Creative Cloud features that may send data somewhere else when I'm working on something that may be more sensitive, but have them enabled when I'm just doing something for social media, personal work, or work known to be public. I love the direction of some of the new features, but I Don't love that the only way for me to know they aren't working is to shut off internet on my computer entirely.
‎Sep 27, 2023
04:54 AM
Of course, after I had a moment to look at the code that Bing spit out at me, I found the mistake. Rather than just asking for artboard.name, it tried to define a variable for it, then use that. Sure, that's fairly standard. But it had this in the code:

var name = artboard.name;
artboardNames.push(name);

The text file it gave me had the correct number of artboards, so I knew Part of it worked, but every text entry was just "Adobe Illustrator". So I looked back through the code. I didn't get any errors. I don't know much about Illustrator scripting yet, but I was guessing that "name" might be something reserved as a global variable. It sure would be nice if it gave an error to that effect. This has been my frustration with attempting to learn any programming language: it's not the language part that is difficult. It's trying to tie the language to Illustrator, or Blender, or Windows, etc.

Anyway, here's the code that works. Run it (you can drag and drop the file onto Illustrator, for those who may not know) and it will save a text file on your desktop with all of the artboard names in order. Do some find and replace, whatever you need. Save that text file, then come back to Illustrator and press the button that asks if you want to update the names. And the artboard names get updated to whatever you changed them to.

Keep in mind, I am not a programmer, and I did not write this. It's Artificial (not quite so) Intelligence finding a bunch of bits and pieces of script and smashing them together. So if you do something silly, like add a line return, then it will fail. I see it Was nice enough to think of that potential error and is supposed to give an error instead of changing the names. But I didn't test that out. I just read through it. It looked correct. I saved a copy of the file I was working on, tried it, and it worked. So use at your own risk.

// This script will export all of the artboard names in order, which can be edited in a text editor, then use that text file to update the artboard names in the file.
// Define the file name and path for the text file
var fileName = "artboard_names.txt";
var filePath = Folder.desktop + "/" + fileName;
// Get the document and the number of artboards
var doc = app.activeDocument;
var numArtboards = doc.artboards.length;
// Create an array to store the artboard names
var artboardNames = [];
// Loop through the artboards and push their names to the array
for (var i = 0; i < numArtboards; i++) {
    var artboard = doc.artboards[i];
    artboardNames.push(artboard.name);
}
// Write the array to the text file, separated by newlines
var file = new File(filePath);
file.open("w");
file.write(artboardNames.join("\n"));
file.close();
// Display a message that the file has been created
alert("The file " + fileName + " has been created on your desktop. You can edit it in a text editor and then run this script again to update the artboard names.");
// Ask the user if they want to update the artboard names from the text file
var answer = confirm("Do you want to update the artboard names from the text file?");
// If yes, read the text file and update the artboard names
if (answer) {
    // Open the text file and read its contents
    var file = new File(filePath);
    file.open("r");
    var contents = file.read();
    file.close();
    // Split the contents by newlines and store them in an array
    var newNames = contents.split("\n");
    // Check if the number of new names matches the number of artboards
    if (newNames.length == numArtboards) {
        // Loop through the artboards and update their names from the array
        for (var i = 0; i < numArtboards; i++) {
            var artboard = doc.artboards[i];
            var newName = newNames[i];
            artboard.name = newName;
        }
        // Display a message that the artboard names have been updated
        alert("The artboard names have been updated from the text file.");
    } else {
        // Display a message that the number of new names does not match the number of artboards
        alert("The number of new names in the text file does not match the number of artboards. Please make sure they are equal and try again.");
    }
}
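One fragility mentioned above: an accidental trailing line return in the edited text file makes the count check fail, because split("\n") produces an extra empty entry. A small defensive tweak — my own suggestion, not part of the generated script — is to strip blank trailing lines before comparing counts. It's plain JavaScript, so it works the same in ExtendScript:

```javascript
// Split edited text into names, ignoring a trailing newline or blank
// trailing lines that a text editor may add on save.
function splitNames(contents) {
  var names = contents.split("\n");
  while (names.length > 0 && names[names.length - 1].replace(/\s/g, "") === "") {
    names.pop();
  }
  return names;
}
```

In the script above you would use `var newNames = splitNames(contents);` in place of the bare `contents.split("\n")`.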
‎Sep 26, 2023
11:48 AM
1 Upvote
I really wanted to export all of the names to a file, edit the file, then import that file again. After not getting that to work, I found this. I know very little about Illustrator scripting yet, and made a Dumb mistake: I accidentally copied the "--------------" as well. That wasn't in code-block-formatted text, so I didn't catch it. And rather than giving me errors about the first line, it told me main(); did something illegal. I have other scripts that didn't define the whole thing as a function and then run it, so I removed that. Then it told me var counter was illegal. What the hell? Since no one was commenting about it not working, I took another look and realized my first line was ------------. So to folks wanting this and Also finding Adobe's error system to be mostly useless: be sure Not to accidentally copy extra characters. I guess Adobe just goes ahead and uses them somehow and gives you an error about something Else.
‎May 25, 2023
07:48 AM
2 Upvotes
Oh, I'm aware. But as the video points out, there are people who just use RGB/CMYK because that's what they learned. Actually using CMYK is basically pointless these days; however, an out-of-gamut CMYK check is still required. But that's an entirely different discussion.

Anyway, vectorscopes are LAB. Incredibly useful for skin tones and seeing color shift. An RGB parade is incredibly useful for white balance, and for understanding more about the distribution of your values than a histogram tells you. For instance, if you have a picture of someone on a stage in a spotlight, the histogram will show a nice full bell-curve-like distribution even though the majority of the image is incredibly dark. The RGB parade will show you the distribution of value for the rest of the image... for the most part. It's really just showing all of the values in Y separated by X. So for those who don't know, it gives you something very similar to a histogram, but flattened to a single line of vertical pixels for every column of pixels. It is simply differently useful. Some will find it more or less useful, but still useful.
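The "histogram per column of pixels" idea described above can be sketched in a few lines. This is a toy illustration, not how any real scope is implemented: the image format (rows of [r, g, b] triplets, values 0-255) is a stand-in I made up for the example.

```javascript
// Toy sketch of an RGB parade: for every column of pixels, build one
// 256-bin histogram per channel. A real scope plots these as three
// side-by-side waveforms; here we just collect the buckets.
function rgbParade(image) {
  const width = image[0].length;
  const parade = { r: [], g: [], b: [] };
  for (let x = 0; x < width; x++) {
    const hist = {
      r: new Array(256).fill(0),
      g: new Array(256).fill(0),
      b: new Array(256).fill(0),
    };
    // Every pixel in this column contributes to the column's histograms.
    for (const row of image) {
      const [r, g, b] = row[x];
      hist.r[r]++;
      hist.g[g]++;
      hist.b[b]++;
    }
    parade.r.push(hist.r);
    parade.g.push(hist.g);
    parade.b.push(hist.b);
  }
  return parade;
}
```

The spotlight example follows directly: the dark columns show their pile of near-zero values in every channel, column by column, instead of being swallowed into one global histogram.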
‎May 25, 2023
06:54 AM
I assumed it was related to the tone mapping. I have "Auto Detect Log Video Color Space" in Preferences turned off. After updating to one of the more recent versions (I have auto-update turned on, so I don't know which version from/to it was), the sequences got the "Auto Tone Map Media" checkbox checked. I had turned that off on some C-Log footage, and it is now turned back on again. I wanted to not use any of it because it was causing some terrible errors when media was nested. I think that bug is fixed now, but it doesn't really save me much time, and since it still seems like you folks are trying to figure it out, I'd rather not have it. Regardless of what the cause is, there's a bug somewhere, because Premiere renders the file as I see it, while sending it to AME renders Some of the clips with incorrect color.

Windows 11 Pro 10.0.22621 Build 2261
Premiere Pro v23.4.0 build 56
Media Encoder v23.4 build 47
‎May 25, 2023
06:43 AM
5 Upvotes
Well, yes. You just asked the same question that this entire thread is about. It has become pointlessly heated, too, as people object to a feature that they don't have to use and that already exists, just not in their software. This should exist in Photoshop as well. The people who don't want it don't need to use it. Good for them. The rest of us can use the same workflow in both video and photography. You could then check skin tones without having to reference stupid skin color charts or worry about your monitor being calibrated and viewing in proper lighting. I assume that in adding vectorscopes, an RGB parade would also be part of that, which then also means you can white balance in the same way. Images with multiple colors of light source make white balancing a pain in the butt, and with an RGB parade you can visually see what is happening all across the image and get a good average as your starting point.

I've asked several professional photographers what they thought. They all agreed that they don't think they need it. And HA, they also didn't understand it, because they hadn't ever used it in video software. So I opened up an image in AfterEffects and showed them how it worked. And every single one of them then changed their mind and said they would like to have that tool available in their software as well.

Unless it would harm the software in some way, like making it incredibly slow as a result, it would be silly Not to consider adding it. That would be like Adobe wanting to add something like the essential graphics text panel to Photoshop and me objecting to it because I don't think I would use it. It'd probably be useful to some people, and with how that may change how people create mogrts, maybe that would increase adoption and there would be improved mogrts out there that aren't limited to a single font and size (because making them responsive is a lot of extra work).
So keep your objections to yourself unless you are able to state facts about the harm it would cause you. One objection you might have would be, "Please don't add that, because I struggle with learning new things and currently have a speed advantage compared to others since I've memorized every RGB combination of skin tone and would like to continue working that way."
‎May 19, 2023
12:02 PM
What is going ON with Media Encoder changing things!? Previously, it screwed up only Specific letters from a mogrt. Now, it's messing up just a Few clips. This time it doesn't involve AE at all. And I Think I might have an idea of where the issue is coming from: Premiere has been automatically enabling the "Auto Tone Map Media" checkbox between versions. Honestly, I don't even know what it does other than screw things up. I tried it when it first came out, and it Seemed like it automatically added the correct LUT, only I think maybe it was doing it as an input LUT rather than a creative LUT, even though Premiere's documentation Specifically says NOT to do that, because Lumetri works top-down. That means you'd be adjusting your colors Before your basic corrections, and if the input LUT causes something to be under/overexposed, you can't get that information back.

So where's the bug? I have no clue, other than I think it's related to that, because I have checked the preference off multiple times and it mysteriously turns itself back on. Most of the footage, even from the same camera, is fine within the same video. There are probably a few slight shifts, but 2 of the videos became Very washed out. I rendered the exact same sequence directly from Premiere and it rendered just like my timeline. Checking the clips > Interpret Footage > Color Management: Input LUT = None. -- Use Media Color Space from File: Rec. 709. Which is what all of the other footage, except for the RED footage, is set to.

When I tried out the tone map setting previously, I had the same issue without realizing what had happened, so I decided whatever that setting was doing wasn't a great idea and turned it off. It seems like it turns itself on after some of the updates, but again, I'm unsure exactly. I thought there was a setting in the Preferences at one point, but now it's only in the sequence options?
I'd actually use it if it just automatically added the correct LUT to Lumetri when a Lumetri effect didn't already exist, but that doesn't seem to be how it works. It just changes the colors Fairly close to what they should be after the LUT is applied... without telling you which one it chose, or at what point it's doing it? And when I tried it originally, I created a nested sequence and things were fine. Then I opened Premiere, and it had applied the auto tone map twice to every clip I had nested, without providing a way to turn it off (that I could find).

Regardless of my frustration with why we would get half-baked releases from the Adobe dev team: this is one project; it renders fine from Premiere, and Some of the clips have color issues when rendering with Media Encoder, while others from the same camera on the same shoot do Not have any issue when I render from Premiere. I even closed Premiere and reopened it, then rendered again for good measure, Just in case I had made an error that was only showing up when reopened for some reason, but it rendered fine. I mentioned mogrts because I had the same issue there: they work and render fine within Premiere, but when I sent the render to AME, within text that was set to change to all caps, only a few specific letters ignored that.
‎Apr 13, 2023
05:46 AM
I suppose I could do that, sans the media.