
Creating 2D animations in VR?

New Here, Oct 23, 2017

Hello!

For the latest version of AE, Adobe acquired SkyBox Studio to work better with 360 and VR footage. From the "Apply immersive effects" video, you can now add new video effects to your 360 scenes.

Unfortunately this is not what I am looking for, and I am asking for your help to figure out how to apply 2D animations to a 360 video scene. We have a scene from a museum where we wish to add pop-up information to the galleries and whatnot.

What is the start-to-end process here?

We've shot the footage with a Samsung Gear 360 and then stitched it in their app, Action Director.

To edit the 360 footage, we apparently need a special format called "equirectangular", which is mentioned in the Adobe support video, but I am not sure how to get that output from the Action Director app. Our video is a stitched MP4 that can be viewed in 360 and uploaded as 360 to Facebook, but I am having trouble loading it as a 360 video in After Effects the way the support video shows.
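(Side note: from what I've read, equirectangular is a projection, a 2:1 unwrap of the sphere, rather than a separate file type, so the stitched MP4 might already be in that layout. A rough sanity check in Python, assuming ffprobe from FFmpeg is installed and using "stitched.mp4" as a placeholder for our file:)

import subprocess

# Quick sanity check: equirectangular frames are 2:1 (e.g. 3840x1920).
# Assumes ffprobe (part of FFmpeg) is installed; "stitched.mp4" is a placeholder name.
def frame_size(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=width,height", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True).stdout.strip()
    width, height = (int(x) for x in out.split(","))
    return width, height

w, h = frame_size("stitched.mp4")
print(f"{w}x{h}:", "2:1, looks equirectangular" if w == 2 * h else "not 2:1, may need re-exporting")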

So, to sum up:

How do I properly load a 360 video into AE and export it afterwards, and how do I add 2D effects to the footage so that the fisheye bending is included and it actually matches the VR experience?


Thanks in advance!
- Ben

1 Correct answer: Dacia's reply below (Nov 07, 2017).
Adobe Employee, Nov 07, 2017

Hey Ben,

Let me see if I can get the right person to help you here. PM me if there is no response in the next 48 hrs.

Thanks,
Kevin

Kevin Monahan - Sr. Community & Engagement Strategist – Pro Video and Audio
Adobe Employee, Nov 07, 2017

Hi Ben!

So exciting that you're diving into new territory!

These are a couple of my favorite resources for getting started and were essential for helping me wrap my head around the workflow.

Tutorial: Getting Started with SkyBox Studio V2 - YouTube

Here is a great video that explains exactly how to overlay pop-ups and text and get things to "curve" into the spherical shot as desired.

360 Video Tutorial Using Skybox - Creating 360 Motion Graphics - YouTube

Now, these tutorials were created before the Skybox tools were integrated natively into AE, so some things will flow a bit differently in the UI.

Things to note:

  • What was once V2 Composer is now called "VR Comp Editor" and can be found under the Window menu.
  • Skybox Project 2D is now called VR Plane to Sphere.
  • Skybox Viewer is now called VR Sphere to Plane.

Here is a broader-strokes breakdown of how the 3 VR tools are used, as well as some helpful tips on navigating the workflows:

VR Comp Editor:

VR Comp Editor is a script that allows users to work with their 360 content in a view-based editing environment. The script creates a series of comps that allow the user to make 2D or 3D edits and add elements to their compositions through Edit comps. Those edits are treated independently and then composited by the script into a final Output comp for rendering.
The edit-based comps give users the ability to see their composition as their audience will in a headset, using the VR Sphere to Plane plugin (formerly known as the Skybox Viewer) and/or a virtual 6-camera rig that is created under the hood. In edit mode, users can use the Camera tool to pan around the 360 space. Because of this intricate comp structure, users need to drive the complex system of comps via the VR Comp Editor. Any time a user wants to add an element in one of their Edit comps or see the result in the Output comp, they should navigate between these comps via the buttons in the Comp Editor, which drives the script to refresh the supporting comps.

It is important to think of the VR Comp Editor as the master control panel for switching between Edit comps and viewing the Output.
Instead of navigating between comps via the Project panel or the Timeline window's comp tabs, users should use the Edit and Open Output buttons.
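If it helps to picture what the view-based environment is doing under the hood, the "sphere to plane" step is just an equirectangular-to-rectilinear projection: pick a camera direction and a field of view, and sample the part of the 360 frame that the camera would see. Here is a rough Python/NumPy sketch of that projection math only (illustrative, not Adobe's or Mettle's actual code):

import numpy as np

# Illustrative only: render what a virtual camera sees inside an equirectangular frame
# (an equirectangular-to-rectilinear projection). Not the plugin's implementation.
def sphere_to_plane(equi, yaw_deg, pitch_deg, fov_deg, out_w=960, out_h=540):
    """equi: H x W x 3 equirectangular image array; returns a flat rectilinear view."""
    h, w = equi.shape[:2]
    # Build one ray per output pixel on an image plane one unit in front of the camera.
    fx = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)
    xs = (np.arange(out_w) - out_w / 2) / fx
    ys = (np.arange(out_h) - out_h / 2) / fx
    xv, yv = np.meshgrid(xs, ys)
    rays = np.stack([xv, yv, np.ones_like(xv)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Rotate the rays by pitch (around x), then yaw (around y).
    p, ya = np.radians(pitch_deg), np.radians(yaw_deg)
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(ya), 0, np.sin(ya)], [0, 1, 0], [-np.sin(ya), 0, np.cos(ya)]])
    d = rays @ rx.T @ ry.T
    # Convert ray directions to longitude/latitude, then to equirectangular pixels.
    lon = np.arctan2(d[..., 0], d[..., 2])
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))
    u = ((lon / np.pi + 1) / 2 * (w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (h - 1)).astype(int)
    return equi[v, u]

For example, sphere_to_plane(frame, yaw_deg=90, pitch_deg=0, fov_deg=90) is roughly what someone sees in the headset after turning 90 degrees to the right; that per-view sampling is conceptually all the view-based environment is doing.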

Common workflows for 2D Edits:

2D edits take flat objects and convert them to fit the spherical environment, using VR Plane to Sphere under the hood (see the sketch after this list for the idea behind that conversion).

  • Adding Text
  • Adding Logos
  • Adding color overlays, shapes, or "flat" motion graphics elements to guide a viewer's attention
  • Using the Clone Stamp to remove unwanted visuals like a tripod shadow
  • Using masks to add/remove objects
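To show what "converting a flat layer to fit the sphere" means mathematically, here is a rough sketch (again illustrative, not the VR Plane to Sphere plugin's code) that stamps a flat RGBA overlay into an equirectangular frame at a chosen yaw/pitch and angular size:

import numpy as np

# Illustrative only: composite a flat RGBA overlay into an equirectangular frame so it
# sits at a chosen yaw/pitch with a chosen angular size. Not the plugin's implementation.
def plane_to_sphere(equi, overlay, yaw_deg, pitch_deg, size_deg=40):
    """equi: H x W x 3 uint8; overlay: h x w x 4 uint8 (RGBA). Returns a composited copy."""
    H, W = equi.shape[:2]
    oh, ow = overlay.shape[:2]
    # Direction of every pixel in the equirectangular frame.
    lon = (np.arange(W) / (W - 1) * 2 - 1) * np.pi
    lat = (np.arange(H) / (H - 1) * 2 - 1) * (np.pi / 2)
    lo, la = np.meshgrid(lon, lat)
    d = np.stack([np.cos(la) * np.sin(lo), np.sin(la), np.cos(la) * np.cos(lo)], axis=-1)
    # Rotate the sphere so the overlay's centre direction lines up with +z.
    p, ya = np.radians(-pitch_deg), np.radians(-yaw_deg)
    ry = np.array([[np.cos(ya), 0, np.sin(ya)], [0, 1, 0], [-np.sin(ya), 0, np.cos(ya)]])
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    d = d @ ry.T @ rx.T
    # Intersect forward-facing rays with the plane z = 1 that carries the overlay.
    front = d[..., 2] > 1e-6
    z = np.where(front, d[..., 2], 1.0)
    x, y = d[..., 0] / z, d[..., 1] / z
    half = np.tan(np.radians(size_deg) / 2)
    u = (x / half + 1) / 2 * (ow - 1)
    v = (y / half + 1) / 2 * (oh - 1)
    hit = front & (u >= 0) & (u <= ow - 1) & (v >= 0) & (v <= oh - 1)
    out = equi.copy()
    ui, vi = u[hit].astype(int), v[hit].astype(int)
    alpha = overlay[vi, ui, 3:4] / 255.0
    out[hit] = (alpha * overlay[vi, ui, :3] + (1 - alpha) * out[hit]).astype(np.uint8)
    return out

The "curve" that gets baked into the flat graphic is exactly that remapping; the plugin and the Edit comp structure around it handle it for you, along with the compositing order.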

Things to note when working with 2D Edits:

  • A 2D edit composites whatever is within the field of view of the VR Master Camera in the Edit comp. Wherever the user points the camera is the part of the frame that will be composited into the Output comp. The 2D edit uses that part of the frame as a "background layer".
  • Because of this, it is important to add a unique 2D edit for each area a user wants to edit: point the camera and leave it, add a new 2D edit, point the camera and leave it. (This is not true for 3D edits.)
  • Also because of this behavior, if a user wants to add elements to overlapping areas of the equirectangular frame, they will notice clipping of those 2D edits in the Output because the "background" layer will obscure the edit below it. A simple workaround is to use the 3D Edit workflow instead.
  • A 2D edit is treated like a layer and patched back onto the sphere as is, at the given projection location, including the background. This lets you edit the precomp directly and apply any effects and/or masks.

 

Common Workflows for 3D Edits:

  • Using 3D Camera Tracker to anchor assets to specific spots in moving footage.
  • Footage Stabilization
  • Using 3D plugins like Trapcode, Plexus, Element 3D, etc.
  • Adding edge blending to help reduce the visibility of stitching seams in the footage.

Things to note when working with 3D Edits:

  • 3D edits behave differently than 2D edits: a user can move the VR Master Camera around the Edit comp and make as many changes or additions as they choose, and all of those elements will make their way into the Output comp regardless of where the camera is facing.
  • 3D edits extract 3D geometry, and the background does not make its way into the Output comp.

Extract Cubemap:

The Extractor was introduced in Version 1 of the Skybox tools for AE. Before the VR Comp Editor workflow was made available in Skybox Version 2, Extractor was used to "flatten" spherical footage in a way that allowed users to work with it more intuitively. The view-based editing of the VR Comp Editor is more intuitive on a lot of levels and does more of the work behind the scenes. The Extract Cubemap workflow still has its advantages, although it requires users to drive the workflow a bit more manually. The Extractor workflow is also more demanding of the system's memory.

In addition to the cube map, the Extractor script creates an Edit comp, an Output comp, and a Preview comp that allows the user to see their edits as a viewer in a headset will. It's important to use the Extractor panel to refresh the Output comp and see the edits integrated into the scene as desired.

Common Extractor Workflows:

  • Extractor creates a virtual 6-camera rig and places it at the center of the cube that it has converted the spherical footage into.
  • By creating a 6-faced cube map of the equirectangular footage, users can more easily visualize where in 3D space the elements they are adding live in relation to each other in the Omniscient Cube view (the sketch after this list shows how a face is pulled out of the equirectangular frame).
  • It also allows for using clone stamps and masks in more intuitive ways, as a user can select one of the camera "faces" to work within.
  • Adding flat motion graphics in this space can create undesired distortion in the "corners" of the cube, though, as they are farther from the camera at the center.
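To make the cube map idea concrete: each of the six faces is just a 90 x 90 degree rectilinear view (front, right, back, left, up, down) pulled out of the equirectangular frame. A tiny sketch, reusing the sphere_to_plane() sketch from earlier in this reply (illustrative only, not the Extractor script):

FACES = {            # face name: (yaw, pitch) of the virtual camera, in degrees
    "front": (0, 0), "right": (90, 0), "back": (180, 0),
    "left": (-90, 0), "up": (0, 90), "down": (0, -90),
}

def extract_cubemap(equi, face_size=1024):
    # Render each face as a 90-degree field-of-view view of the sphere.
    return {name: sphere_to_plane(equi, yaw, pitch, fov_deg=90,
                                  out_w=face_size, out_h=face_size)
            for name, (yaw, pitch) in FACES.items()}

In the Omniscient Cube view you are essentially looking at those six renders arranged around you, which is why distances and relationships between added elements are easier to judge there.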

 

Creator:

Users who do not have 360 footage to work with and who want to create a spherical environment or "canvas" use Creator to create that container. Like Extractor, it creates a cube with a virtual 6-camera rig at the center so a user can place objects within the virtual spherical space. (There is a small sketch of the "canvas" idea after the list below.)

Common Workflows for Creator:

  • Users who do not have equirectangular footage from a camera
  • Users who want to create a 360 environment with traditional footage
  • Users who are creating 3D assets in C4D, Element 3D, or Maya
  • Users who want to build 3D environments with Trapcode Mir or Form for example.
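To make the "canvas" idea concrete: an empty 360 environment is just a 2:1 equirectangular frame where the horizontal axis is longitude and the vertical axis is latitude. Here is a throwaway NumPy sketch (not Creator itself) that builds a simple sky-to-ground gradient you could treat as a stand-in environment:

import numpy as np

# Throwaway sketch of a blank 360 "canvas": a 2:1 equirectangular frame where x is
# longitude (-180..180 degrees) and y is latitude (+90 at the top, -90 at the bottom).
def blank_canvas(width=4096):
    height = width // 2
    lat = np.linspace(90, -90, height)               # +90 (zenith) .. -90 (nadir)
    t = ((90 - lat) / 180.0)[:, None]                # 0 at the top row, 1 at the bottom
    sky, ground = np.array([90, 140, 220]), np.array([40, 60, 40])
    rows = (1 - t) * sky + t * ground                # one RGB value per latitude row
    return np.repeat(rows[:, None, :], width, axis=1).astype(np.uint8)

canvas = blank_canvas()
print(canvas.shape)                                  # (2048, 4096, 3)

Creator essentially hands you that empty sphere plus the camera rig to look around it, so everything you add is authored directly into the 360 space.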

Happy Experimenting and Exploring!!! I hope this helps get the wheels spinning. Please let me know if I can clarify anything!

Cheers,

Dacia

- Dacia Saenz, AE & PR Engineering Teams
Adobe Employee, Nov 16, 2017

Ben,

Did Dacia's response help at all? Sure hope so. Let us know if it did or not.

Thank You,
Kevin

Kevin Monahan - Sr. Community & Engagement Strategist – Pro Video and Audio
Explorer, Jan 15, 2018

Has Adobe created any of their own video tutorials for using this plugin in After Effects?

New Here, Jan 22, 2018
LATEST

Thank you for the extensive answer!

I managed to figure out a rather easy solution when we did the project: simply load the VR footage in Premiere and do the 2D pop-ups in After Effects, then drag the AE project into Premiere and add it wherever you want in the sequence.
