I would like to use Roto Brush 2 on live streams to mask out a human. It doesn't have to be high fps; anything around 10-20 would still be acceptable, but below that would be hard to work with.
Would this be possible?
Roto Brush works in the Layer panel; the Composition panel will not show the result until Roto Brush has finished propagating and you switch back to it. After Effects must render at least a RAM preview before you get any motion. If you want to share the footage live, you are looking at the wrong app.
If you just want to get the work done fairly quickly so you can create a composite, the time it takes Roto Brush to create a matte depends entirely on the footage: its format, frame rate, and frame size, plus your system capabilities. On the very same system, footage not shot with Roto Brush in mind can easily take ten times as long to process as a shot taken with the same camera where a little more attention was paid to the background and the subject.

When I shoot for Roto Brush I almost always shoot at the lowest frame rate possible for the project and use a high shutter speed to reduce motion blur. I make sure that the depth of field is sufficient to give me really good edge detail, that the exposure is as close to perfect as possible, that the lighting is even, that there are few if any reflections or highlights on the edges, and that the color and luminance values of the subject and the background differ enough to avoid problems. Ignore any one of those details and your workload will increase significantly. No roto project is more frustrating than a shot with a shallow depth of field, a lot of motion blur, and a background in the same tonal range as the subject.
Looking for an example of easy-to-roto footage, I browsed Adobe Stock and found only about one shot in twenty that would not require a fair number of additional strokes to fix the background. About half would need some hand masking, and about a third would be nearly impossible to roto.
Thanks for the detailed info! Definitely helped me a lot.
My use case is a bit different from doing it directly in After Effects, though. I am currently writing a JavaScript/Python program (I haven't fully decided on the language yet) that would draw something over a live stream of a person. To make the effect convincing, I first need to mask out the person. The environment would be a typical livestreamer setup: indoors, lit from the front, with a room as the backdrop. My target frame rate would again be 10-20 fps; the rest would be solved with interpolation.
So, two questions: how fast would Roto Brush work in such a setup, and could it be triggered via an API?
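For illustration, the interpolation step mentioned above could look something like this minimal NumPy sketch (the function name and the idea of linearly blending soft masks between segmentation keyframes are assumptions for this example, not any established API):

```python
import numpy as np

def interpolate_masks(mask_a, mask_b, t):
    """Linearly blend two soft segmentation masks (values in [0, 1]).

    mask_a, mask_b: float arrays of the same shape (H x W), produced
    for two segmentation keyframes; t in [0, 1] is the position of
    the intermediate stream frame between them.
    """
    return (1.0 - t) * mask_a + t * mask_b

# Example: segmentation runs at 10 fps while the stream is 30 fps,
# so two in-between masks are synthesized per keyframe pair.
key0 = np.zeros((4, 4), dtype=np.float32)   # person absent
key1 = np.ones((4, 4), dtype=np.float32)    # person covers frame
mid = interpolate_masks(key0, key1, 1.0 / 3.0)
print(float(mid[0, 0]))  # ~0.333, a soft in-between mask value
```

Linear blending only holds up when the person moves little between keyframes; fast motion would need a higher segmentation rate instead.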
It's unlikely that this will work at all. How would you even define the basic strokes as you stream? Beyond that, the motion-field synthesis is far too slow to have any value for real-time work. Let's face it: sports broadcasters use half-million-dollar hardware boxes with custom ASICs to do their motion overlays in real time for a reason. The best you can hope for is something like the depth-based blurs and keying in video-conferencing software, but that has nothing to do with Roto Brush and is its own thing. Since it's based on freely published research, I'm sure you could find reference code for Python and Java/JS and just include it in your own code.
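As a rough illustration of what such reference code typically does downstream of the model: the segmentation network outputs a soft per-pixel person probability, which gets thresholded and used to composite. A minimal NumPy sketch, with the model call itself (e.g. a selfie-segmentation network) assumed to happen elsewhere and stubbed out with synthetic data:

```python
import numpy as np

def composite_over_background(frame, person_prob, background, threshold=0.5):
    """Keep the person, replace everything else with a background.

    frame:        H x W x 3 uint8 camera frame
    person_prob:  H x W float array in [0, 1] from a segmentation
                  model (the model call is outside this sketch)
    background:   H x W x 3 uint8 replacement background
    """
    mask = (person_prob > threshold)[..., None]  # H x W x 1 bool
    return np.where(mask, frame, background)     # broadcasts over RGB

# Tiny synthetic example: a 2x2 frame where the model is confident
# only the left column is the person.
frame = np.full((2, 2, 3), 200, dtype=np.uint8)
background = np.zeros((2, 2, 3), dtype=np.uint8)
prob = np.array([[0.9, 0.1], [0.8, 0.2]], dtype=np.float32)
out = composite_over_background(frame, prob, background)
print(out[0, 0], out[0, 1])  # [200 200 200] [0 0 0]
```

A hard threshold gives jagged edges; real implementations usually feather the mask (e.g. blur `person_prob` before compositing) instead.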
Mylenium
Ah, that's a bummer. But thanks for telling me it's based on free research; I'll definitely look into that!