cmiVFX Releases New Nuke 3D Tracking Video
Press Release from cmiVFX
Princeton, NJ (April 5th, 2010) - cmiVFX launches the latest training video in its extensive Nuke collection, Nuke Tracking. Learn how to perform several key tasks involved in 2D and 3D matchmoving without ever leaving the Nuke application. Grain removal and restoration, lens distortion removal, point cloud analysis, object reconstruction, projections, and stereoscopic paired-camera setups are just a few of the topics covered in this robust, fast-paced training video. Get up to speed today by learning all you need to know about these topics in one convenient place. This video is available today at the cmiVFX store for a Special Introductory Offer Price!
Don't forget about the cmiSubscription! Get one today. cmiVFX launched the most affordable subscription plan in Visual FX Training for only $299 USD, and if you were a subscriber, this new training video would already be in your account. For more information check out: <http://www.cmivfx.com/productpages/product.aspx?name=Subscriptions_And_Pric
Nuke Tracking Training Video
3D tracking really simplifies many compositing tasks, whether it's removing unwanted objects or adding in additional elements. Having to swap applications to obtain the necessary camera data has always been time-consuming and has made alterations difficult. Now, Nuke comes with its own camera tracker - so 3D tracking can be done exactly where you need it!
Focus on Nuke's camera tracker and learn the other tasks relevant to a good track: grain/noise removal, lens distortion correction and 2D tracking. After analyzing the point cloud and solving the camera movement, learn how to set up some geometry and recreate the scene with 3D and UV projections. Then animate a new camera movement, turn that into a stereoscopic set-up, and end up with a nice (red & blue) anaglyph as a stereoscopic workflow alternative.
Preparing the Footage
In order to get the most out of the trackers in Nuke it is important to remove any distractions. Grain and noise can throw off the tracker, and when it comes to 3D tracking, rolling shutter is an issue as well.
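To see why grain hurts a tracker, consider a simple temporal median: over a locked-off region, each pixel is the same signal plus random grain, so the median across neighbouring frames rejects single-frame spikes. This is an illustrative plain-Python sketch of the idea, not Nuke's degrain tools:

```python
def temporal_median(frames):
    """frames: lists of pixel values (one channel), same length per frame."""
    out = []
    for pixels in zip(*frames):          # the same pixel across all frames
        s = sorted(pixels)
        out.append(s[len(s) // 2])       # median rejects the grain outlier
    return out

# Three noisy samples of a flat grey (0.5) region; frame 2 has a grain hit.
f1 = [0.49, 0.51, 0.50]
f2 = [0.50, 0.90, 0.50]   # 0.90 is a grain spike
f3 = [0.51, 0.50, 0.49]
print(temporal_median([f1, f2, f3]))   # the spike at index 1 is suppressed
```

A tracker pattern-matching against the grain spike would drift; matching against the median-filtered plate stays on the underlying feature.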
How does Nuke's 2D tracker work, and when would one use it? 2D trackers can be helpful for keeping roto shapes in position. It is even possible to patch areas by connecting the tracker to a CornerPin node.
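The math behind linking four tracks to a corner pin is a homography: the four tracked corner positions determine a projective warp, and the patch is pushed through it each frame. Nuke's CornerPin node does this internally; the sketch below solves the same 8-unknown system in plain Python purely for illustration:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def corner_pin(src, dst):
    """Return a warp (x, y) -> (u, v) from four point correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -x * v, -y * v]); b.append(v)
    h = solve(A, b)  # the 8 homography coefficients
    def warp(x, y):
        w = h[6] * x + h[7] * y + 1.0
        return ((h[0] * x + h[1] * y + h[2]) / w,
                (h[3] * x + h[4] * y + h[5]) / w)
    return warp

# Four tracked corners: this frame the patch simply slid 2 right, 3 up.
warp = corner_pin([(0, 0), (1, 0), (1, 1), (0, 1)],
                  [(2, 3), (3, 3), (3, 4), (2, 4)])
print(warp(0.5, 0.5))   # patch centre follows along: approximately (2.5, 3.5)
```

With real tracks the four corners move independently, so the warp also captures the perspective change of the patch, not just its translation.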
How do we reduce the amount of lens distortion in our footage? This is simple when a lens grid is available, but there are other ways too, such as analyzing straight lines in the sequence or using the lens distortion estimation in Nuke's camera tracker.
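What such tools estimate is, at its simplest, a one-parameter radial model: a point's distance from the optical centre is scaled by `1 + k * r^2`. The sketch below (plain Python, not Nuke's LensDistortion node) applies that model and inverts it by fixed-point iteration, which is one common way to undistort:

```python
def distort(x, y, k):
    """One-parameter radial model in normalised, centred coordinates."""
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return x * s, y * s

def undistort(xd, yd, k, iters=20):
    """Invert the model by iterating r_u = r_d / (1 + k * r_u^2)."""
    x, y = xd, yd                       # start from the distorted point
    for _ in range(iters):
        r2 = x * x + y * y
        x, y = xd / (1.0 + k * r2), yd / (1.0 + k * r2)
    return x, y

x, y = distort(0.4, 0.3, k=-0.2)   # negative k: barrel distortion pulls inward
print(undistort(x, y, k=-0.2))     # recovers approximately (0.4, 0.3)
```

Tracking and solving are done on the undistorted plate; the distortion is reapplied at the end so renders match the original lens.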
3D Camera Tracking
Even though it is a compositing application, Nuke provides its own camera tracker! So how do we solve a scene? What can our point cloud tell us? And how do we produce results that are suitable for our purpose?
Now that the camera movement is solved, we will use our tracking points to place some simple geometry into the scene. We will move it to the correct position in 3D space and check that everything lines up with our original shot.
We will then use our tracked camera to project the footage onto the geometry we just created. This makes it possible to stabilize the shot and create a fairly smooth camera movement, even though we started out with a very shaky shot.
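The line-up check above boils down to pinhole projection: push a point-cloud point through the solved camera and see whether it lands on the tracked 2D feature. A minimal sketch, assuming an axis-aligned camera looking down -Z (Nuke's Camera and ScanlineRender nodes handle the general case):

```python
def project(point, cam_pos, focal):
    """Pinhole projection of a world-space point, camera looking down -Z."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    return (focal * x / -z, focal * y / -z)   # image-plane coordinates

# A point-cloud point 10 units in front of a camera at the origin:
print(project((1.0, 2.0, -10.0), cam_pos=(0.0, 0.0, 0.0), focal=50.0))
# → (5.0, 10.0): if the tracked 2D feature sits there too, the solve lines up
```

When the projected points drift off their features, the solve (or the geometry placement) needs refining before projecting textures.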
Detailing in 3D
In order to make our projected texture stick to our geometry, we will switch to UV projections. That also gives us the option to displace the geometry, so we can create a facade that has contour rather than just a flat card.
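Displacement itself is simple: each vertex moves along the surface normal by an amount read from a map at its UV coordinate. The names below are illustrative, not Nuke API; a minimal sketch for a flat card:

```python
def displace(vertices, normal, lum, scale):
    """vertices: (x, y, z, u, v) tuples; lum: map sampled as lum(u, v) -> 0..1."""
    out = []
    for x, y, z, u, v in vertices:
        d = scale * lum(u, v)                       # depth from the map
        out.append((x + normal[0] * d,
                    y + normal[1] * d,
                    z + normal[2] * d))             # push along the normal
    return out

# Flat card facing +Z; a bright stripe at u > 0.5 pushes vertices forward.
card = [(0, 0, 0, 0.25, 0.5), (1, 0, 0, 0.75, 0.5)]
stripe = lambda u, v: 1.0 if u > 0.5 else 0.0
print(displace(card, normal=(0, 0, 1), lum=stripe, scale=0.2))
# → [(0.0, 0.0, 0.0), (1.0, 0.0, 0.2)]
```

Because the texture is applied in UV space, it stays glued to the surface as the vertices move, which is exactly why the switch away from camera projection matters here.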
Find Best Frame (FBF)
Since a texture can sometimes not be created from just one frame, it is necessary to stitch a couple of frames together. Rendering them out in UV space gives us the possibility to match, merge and paint them.
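The stitch amounts to per-texel fallback: where one frame's UV render leaves a hole (the camera never saw that part of the surface), take the texel from the next frame. In practice this is a stack of Merge nodes and some paint; a plain-Python sketch of the logic:

```python
HOLE = None  # a UV texel no frame covered

def stitch(renders):
    """renders: list of UV textures (flat texel lists), best frame first."""
    out = []
    for texels in zip(*renders):                    # same texel, every frame
        filled = [t for t in texels if t is not HOLE]
        out.append(filled[0] if filled else HOLE)   # first frame that saw it
    return out

frame_a = [0.8, None, 0.7]   # middle texel occluded in this frame
frame_b = [0.9, 0.6, None]
print(stitch([frame_a, frame_b]))   # → [0.8, 0.6, 0.7]
```

Ordering the renders best-frame-first keeps the sharpest coverage and only falls back where it has to.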
The scene now consists only of textured geometry. That gives us the freedom to animate a new camera movement. This camera can be paired with a second one to create stereoscopic images, and if you have red and blue glasses, you can view the set-up as an anaglyph.
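The anaglyph combine itself is just a channel shuffle: red from the left-eye render, green and blue from the right. In Nuke this is a ShuffleCopy or Merge; here is the per-pixel idea in plain Python:

```python
def anaglyph(left, right):
    """left, right: lists of (r, g, b) pixels from the paired cameras."""
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

left_eye  = [(0.9, 0.2, 0.1)]
right_eye = [(0.3, 0.5, 0.6)]
print(anaglyph(left_eye, right_eye))   # → [(0.9, 0.5, 0.6)]
```

The red filter shows each eye only its own view, so the horizontal offset between the paired cameras reads as depth.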