Week 1: Nuke 3D and Camera Projection
In the first week of this module, we were introduced to Nuke's 3D space and to using a camera projection to make a 2D image appear to have depth. Below are my processes for this week.
The image above shows three objects in Nuke's 3D space. To set this up I used nodes specific to the 3D system, such as the 'Scene' node and the 'Camera' node. For these objects to appear in the 2D render, the 'ScanlineRender' node is needed, connected to the camera node as seen below.
Objects in 2D View
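To make that wiring concrete, here is a minimal Python sketch of the same setup (node class names such as Camera2 and Card2 can vary slightly between Nuke versions, so treat this as an assumption-based outline rather than the exact graph used):

```python
import nuke

# A simple 3D object plus a Scene, Camera and ScanlineRender,
# wired the same way as the node graph described above.
card = nuke.nodes.Card2()               # stand-in 3D object
scene = nuke.nodes.Scene()
camera = nuke.nodes.Camera2()
render = nuke.nodes.ScanlineRender()

scene.setInput(0, card)                 # objects feed into the Scene
render.setInput(1, scene)               # obj/scn input of the ScanlineRender
render.setInput(2, camera)              # cam input of the ScanlineRender
```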
Task - Tunnel Cam Projection
The image above is a standard JPEG of a tunnel. I used this image in Nuke to create the illusion of moving down the tunnel, using the 3D space techniques learnt in the examples above.
I needed the 'Project3D' node above to project the 2D image onto geometry so that it could work in 3D space, and then followed the earlier steps to render that 3D scene back out to 2D. To generate the camera movement I used an already set-up camera, with a 'FrameHold' node freezing the projection camera on the first frame while the render camera continues to move, so the image stays locked to the geometry as the view travels down the tunnel. The result is seen below.
Tunnel in 3D View
Finished Result
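As a rough illustration of the projection setup described above (a sketch only: the file path is a placeholder and the animated camera here stands in for the pre-built one), the frame-held camera drives the projection while the moving camera is used for the render:

```python
import nuke

plate = nuke.nodes.Read(file='tunnel.jpg')       # placeholder path for the tunnel image
anim_cam = nuke.nodes.Camera2()                  # stands in for the pre-animated camera

# Freeze a copy of the camera on frame 1 so the projection stays locked.
held_cam = nuke.nodes.FrameHold(first_frame=1)
held_cam.setInput(0, anim_cam)

proj = nuke.nodes.Project3D()                    # Project3D2 in newer Nuke versions
proj.setInput(0, plate)                          # image being projected
proj.setInput(1, held_cam)                       # cam input: the frozen camera

geo = nuke.nodes.Card2()                         # geometry that receives the projection
geo.setInput(0, proj)

render = nuke.nodes.ScanlineRender()
render.setInput(1, geo)
render.setInput(2, anim_cam)                     # render through the moving camera
```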
Task - Cell Cam Projection
For this task, I made use of 3D cards in Nuke, rotoscoping out different parts of the prison cell image seen below. I then arranged these cards in 3D space to build the prison cell's interior and added some prison bars in front to make it seem like the 2D image was in fact 3D.
Prison Cell Image
The image above shows the first rotoscope, of the back wall; the same process was repeated for the ceiling, the floor, and the left and right walls.
After the rotoscope, I added a Premult node to cut the wall out and then applied it to a 3D card, as seen above. This process was repeated for each rotoscope to build the cell, as seen below.
Cards in 3D Space
Finished Result
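To make the card workflow above more concrete, below is a minimal sketch of how a single wall card might be built and placed (translate values are placeholders; the same pattern repeats for each surface):

```python
import nuke

plate = nuke.nodes.Read(file='prison_cell.jpg')  # placeholder path for the cell image

roto = nuke.nodes.Roto()                         # shape drawn around the back wall
roto.setInput(0, plate)

premult = nuke.nodes.Premult()                   # cut the wall out using the roto alpha
premult.setInput(0, roto)

wall_card = nuke.nodes.Card2()                   # one card per surface
wall_card.setInput(0, premult)
wall_card['translate'].setValue([0, 0, -5])      # placeholder position for the back wall

scene = nuke.nodes.Scene()
scene.setInput(0, wall_card)                     # repeat for floor, ceiling and side walls
```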
Week 2: Nuke 3D Tracking
This week for the module we had a look at 3D tracking in Nuke. We first looked at lens distortion and why it matters for 3D tracking. Lens distortion is a result of a lens's optical design, particularly where special lens elements are used to reduce spherical and other aberrations. Below are my processes for this week.
An Example of Lens Distortion
Task - Tracking in 3D Space
The image from the sequence above makes use of the STMap node in the node graph, which is one method of handling lens distortion in Nuke: it remaps the pixels using a UV map, stretching the frame out towards the edges.
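As a rough sketch of that wiring (assuming the distortion map comes in through a Read node; both file names are placeholders), the STMap takes the plate in its first input and the UV map in its stmap input:

```python
import nuke

plate = nuke.nodes.Read(file='harbour_plate.####.exr')   # placeholder plate sequence
uv_map = nuke.nodes.Read(file='undistort_uv.exr')        # placeholder UV/ST map

stmap = nuke.nodes.STMap()
stmap.setInput(0, plate)               # src input: the distorted plate
stmap.setInput(1, uv_map)              # stmap input: the map used to remap the pixels
stmap['uv'].setValue('rgb')            # channels that hold the UV coordinates
```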
Next, I added the CameraTracker node, which tracked 300 points above the waterline, since water itself is very difficult to track. I made sure to remove the bad tracks that were highlighted in amber and red.
I then exported the camera tracking data, which created the three nodes seen above in the node graph. I also switched the viewer to look through the exported Camera1.
A checkerboard was added to highlight the card that is attached to the tracked window, as seen in the viewer above.
Finished Result
Task - Painting Out a Window in 3D Space
For this task, I needed to remove the window highlighted above with the rotoscope. To do this I first used a RotoPaint node to clone the window out, then used a Premult to cut the patch out.
I then followed the same steps as in the previous task: adding a Card node and aligning it with the tracking data created earlier. The image above shows the card in 3D space.
Finally, I connected the necessary nodes to the ScanlineRender so the patch shows up in 2D, and added a Grade node, keyframed several times, to blend the new area in with the building.
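The last step of combining the rendered patch back with the footage could be scripted along these lines (node names like Read1 and ScanlineRender1 are hypothetical stand-ins for the existing graph):

```python
import nuke

plate = nuke.toNode('Read1')              # the original footage
patch = nuke.toNode('ScanlineRender1')    # the patch rendered through the tracked camera

# Grade the patch so it matches the wall; in practice this is keyframed over time.
grade = nuke.nodes.Grade()
grade.setInput(0, patch)

# Composite the graded patch back over the plate.
merge = nuke.nodes.Merge2(operation='over')
merge.setInput(0, plate)                  # B input: background plate
merge.setInput(1, grade)                  # A input: the patch
```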
Finished Result
Week 3: 3DEqualizer
This week for the module, I started looking at 3DEqualizer, a piece of software that focuses on 3D tracking. The tasks this week were mainly about getting used to the new software, as shown in the examples below.
Above is the 3DEqualizer interface with an image sequence imported. I also set up shortcuts for all the key functions that I will need.
Setting Up Lens and Film Back
I then started adding tracking data to the image sequence by placing tracking points, as seen above. I made sure these points started on frame 1 so that pressing 'T' tracks them from frame 1 to 100.
Adding More Tracking Points
Finished Tracking Points
Attaching an Object to a Tracked Point
Week 4: Lenses and Cameras
This week for the module, I returned to 3DEqualizer for a more in-depth look at certain aspects of the software, such as how it handles lenses and cameras. Below are my processes for this week's tasks.
Task - Door Nodal
For this task, I first needed to track 15 points of interest in the video above, making sure I tracked in various places over the course of the video.
I then needed to make sure that the camera settings were correct. Since the video was filmed on a Canon 5D Mk3 with a 35mm lens, I made sure these camera settings were set up in 3DEqualizer, as seen above.
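As a quick sanity check on those settings (assuming the 5D Mk3's full-frame sensor, roughly 36 mm x 24 mm of filmback), the focal length and filmback width give the horizontal field of view the solve should roughly agree with:

```python
import math

filmback_width_mm = 36.0   # full-frame horizontal filmback (assumed)
focal_length_mm = 35.0     # lens used on the shoot

# Horizontal angle of view for a rectilinear lens.
h_fov = 2 * math.degrees(math.atan(filmback_width_mm / (2 * focal_length_mm)))
print(round(h_fov, 1))     # roughly 54.4 degrees
```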
Next, I needed to calibrate the solve so that it takes into account the distortion and the quartic distortion from the lens. The result is seen above.
I also used the distance constraint tool to make sure that my tracking markers were scaled correctly relative to the real-life distances of the location where the video was taken.
Task - Door Freemove
In this task, I needed to track a certain number of points on the image below. This video was much more difficult to track, since the camera was moving freely around the door during recording and there was a lot more motion blur.
Finished Tracking Points
Week 5: 3DE Freeflow and Nuke
This week for the module, I had a look at the pipeline for working between 3DE and Nuke. In the first exercise, I exported a scene from 3DE with all of the geometry and tracking points and rebuilt it in Nuke; the second exercise followed a similar but slightly different approach. Below are my processes for this week.
Task - Exporting 3DE to Nuke
In this task, I needed to transfer the 3DE project above into Nuke. I first needed to scale the original tracking points to the correct dimensions of the area that was shot, using the survey data. I also added a cylinder on the floor to highlight the tracked 3D space.
I then needed to export all of the important data from 3DE for Nuke. To do this I used a different export method for each type of data needed, as seen above, with the results written out as Nuke (.nk) files.
Once in Nuke, I needed to compile all the data into nodes in the graph. To do this I pasted the exported files into the node graph so they became nodes; as seen above, the data has been organised into backdrop nodes.
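Pasting the exported files can also be scripted rather than done by hand; a minimal sketch, with placeholder file names standing in for whatever 3DE actually wrote out:

```python
import nuke

# Each export from 3DE is a .nk snippet; pasting it recreates the nodes in the graph.
for exported in ('3de_camera.nk', '3de_point_cloud.nk', '3de_lens_distortion.nk'):
    nuke.nodePaste(exported)
```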
I also needed to undistort and redistort the image sequence using the lens data exported from 3DE, which is necessary to make sure the tracking markers sit in the correct place.
Finished Import
Task - Removing Writing on Sign
For this task, I needed to remove the writing on the white sign. To do this I used a RotoPaint node frame-held at frame 1109, connected it to a Premult and then to a card placed in 3D space aligned with the tracked markers on the sign. I also had to account for overscan in the nodes I set up at the end.
Card In 3D Space
Finished Result
Week 6: Surveys
This week for the module, I continued to work with 3DEqualizer in more detail, focusing on creating tracking points from positions I placed in 3D space using precise survey data and, in some cases, some simple maths. Below are the processes that I took this week.
Reference Information
I first created a single point to act as the origin for the others to work from. This point is located at the edge of the decking in the reference image above. I made sure to change the survey type to 'exactly surveyed'.
I then continued to add more of these points, this time placing them in the correct locations using the survey data; in the example above I have set the point to an X of 3.61 and a Z of 4.06.
I continued to build these points up so I could fill the scene with lots of accurate points.
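Points like these can also be created through 3DE's Python console. The sketch below is only an assumption-based illustration built around the tde4 module (function names should be checked against the 3DE4 Python API documentation), placing an 'exactly surveyed' point at X 3.61 and Z 4.06 with Y assumed to be 0 on the decking:

```python
# Run from the 3DE4 Python console.
import tde4

pgroup = tde4.getCurrentPGroup()                       # active point group
point = tde4.createPoint(pgroup)                       # new tracking point
tde4.setPointName(pgroup, point, "deck_survey_point")  # hypothetical name

# Survey data drives the 3D position of the point.
tde4.setPointSurveyPosition3D(pgroup, point, [3.61, 0.0, 4.06])
tde4.setPointSurveyMode(pgroup, point, "SURVEY_EXACT") # 'exactly surveyed'
```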
Once all the necessary points were created, I went into manual tracking mode to complete the tracks, making sure each track sat in the correct location for where its point is in 3D space.
Adding More Tracks
The final thing to do was to make sure that the tracks in the manual tracking view line up with the correct information in the lineup display. To do this I readjusted each tracking point to fit within the X cross, as seen above.
Assignment 1
For the first assignment for this module, I was tasked with using 3DE to track a scene with a minimum of 40 tracked points, making sure the tracked points had a low deviation. I also needed to align the points in the scene using the survey data provided and add a 3D cube to the scene. Once finished in 3DE, I exported the camera, lens distortion (LD) data, locator geo and 3D models to Nuke. In Nuke I compiled all the data so the video sequence could be played back combined with the 3DE data, and I also added a clean-up of the fire exit sign in the scene. Below are my processes for this assignment.
I first needed to track the scene; above, I have created 50 tracks, positioned in areas where I can use the survey data to place them correctly in 3D space. I also made sure to track around the fire exit sign, as this will be needed for the clean-up in Nuke.
Tracking Process Breakdown
The image above shows the deviation browser for all of the tracks; I have managed to get an initial average deviation of 0.4. This number will come down as I add refinements such as the lens distortion and focal length adjustments.
To further decrease the average deviation of the tracked markers, I used the lineup view to make sure each tracker sat in the centre of the green cross, which shows where 3DE expects the tracker to be for a good track.
Next, I added the focal length adjustment. As seen in the image above, I used the adaptive method to calculate it, which lowered the deviation to around 0.33.
I also added the lens distortion adjustment, making sure that 3DE calculated both the distortion and the quartic distortion using the adaptive method; this lowered the deviation further to 0.32.
Once the average deviation of the tracked points was sorted, I needed to make sure these points were correctly scaled relative to one another in 3D space. To do this I created a distance constraint along one of the white cupboards, using the reference measurement of 1.4 m (140 cm).
Aligning The Tracked Points To Origin
Next, I added locators to all of the tracked points to confirm that the track had been solved properly, and I also added a cube to the scene, as seen in the image below.
Added Cube In Scene
Once in Nuke, I imported all of the data that was needed from 3DE and organised it into backdrop nodes, as seen above.
The first thing to set up in the node graph was the lens distortion of the video sequence. To do this I used a BlackOutside node, which adds a black border around the image; this is needed because applying the LD data pushes the edges of the image outwards, leaving empty space. I also used two Reformat nodes to hold the image sequence in its undistorted and redistorted states.
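A rough sketch of that undistort branch is shown below; the lens distortion node's class and name depend on the distortion model exported from 3DE, so the LD node here is a placeholder pulled from the pasted export:

```python
import nuke

plate = nuke.toNode('Read1')                      # the distorted video sequence (placeholder name)

# Black border around the plate so the undistort has room to push the edges outwards.
black = nuke.nodes.BlackOutside()
black.setInput(0, plate)

# Lens distortion node pasted from the 3DE export (placeholder name; the class varies
# with the solved model, e.g. a radial standard degree 4 node).
ld_undistort = nuke.toNode('LD_3DE4_Radial_Standard_Degree_4_1')
ld_undistort.setInput(0, black)

# Reformat up to the larger undistorted working format; a matching Reformat brings the
# comp back to the original plate format once the CG has been redistorted.
undistorted = nuke.nodes.Reformat(type='to format')
undistorted.setInput(0, ld_undistort)
```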
For the cube and the locators to show up in the Nuke viewer, I needed to set the nodes up correctly: I connected the locators and the cube, together with the camera data exported from 3DE, into a ScanlineRender, while also applying the lens distortion data I had created.
The final task was to apply a clean-up to the fire exit sign. To do this I first created a rotoscope of the sign and applied a RotoPaint node to the area I wanted to clean up, as seen below.
Applied Cleanup
Finished Nodes
Finished Video Render
Week 7: Creating Shot Footage for Assignment 2
This week for the module, I worked on creating some footage for my upcoming Assignment 2. I shot two sequences and applied markers to the wall and the floor so that the shots will be easier to track when I come to them in 3DE. Below is the footage that I created.
Shot 1
Shot 2
Week 8: Surveys Continued
This week for the module I continued to look at survey data within 3DE, this time importing multiple reference images into the software and tracking 30 points across them. Below are my processes for this week.
Above, I have imported multiple reference images; I will go through each of them to track points that are consistent across all of them.
Creating Tracking Points
I finished off by creating 30 tracked points that span all of the reference images; the next step is to compile the reference images into an image sequence that contains the tracks.
Week 9: Lens Distortion and Grids
This week for the module, I had another look at lens distortion in 3DE, this time using a grid to help calculate the lens distortion of an image sequence. In class I imported a nodal image sequence and tracked 24 points, using a checkerboard grid to generate the lens distortion. Below are my processes for this week.
Finished Sequence
With the image sequence above, I first tracked 24 points using reference images; the result is seen above.
To achieve the lens distortion I first needed to create a new camera that contains the checkerboard.
Next, I used the distortion grid view to snap the grid points to the checkerboard that I imported; this allows 3DE to use the checkerboard data in 3D space.
I used the menu above to calculate the lens distortion using the parameters seen above.
Finished Lens Distortion
This week for the module, I had a look at the workflow of taking a tracked image sequence from 3DE and combining it with a 3D object from Maya, while also using Nuke to apply the lens distortion data to the exported image sequence. The workflow that I went through this week is presented in the images below.
I first started in 3DE where I needed to set up the image sequence and the camera properties. I then went to the 'export to Maya' area under the export tab where I made sure the start frame was set to 1001.
After exporting the image sequence, I brought it into Nuke, where I needed to undistort it, since there is no lens distortion in a 3D package like Maya.
When importing the image sequence into Maya, I had to set up the properties of the script I also imported, which contains all the tracking points from 3DE. I needed to make sure the scale and size of the image plane were properly set up in the placement section, along with the focal length of the Maya camera.
Since the world origin of the scene was already set to a tracked marker, all I needed to do to place the objects in the positions shown above was to create new objects. I also made sure to set up the shading of these objects and added an area light for rendering.
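Creating and shading one of those stand-in objects in Maya could look roughly like this (a sketch assuming Arnold/mtoa is loaded for the aiStandardSurface shader; names are placeholders):

```python
from maya import cmds

# Stand-in object created at the origin, which already lines up with a tracked marker.
cube = cmds.polyCube(name='trackedCube')[0]

# Minimal shading setup: an Arnold surface shader assigned through a shading group.
shader = cmds.shadingNode('aiStandardSurface', asShader=True, name='cubeMtl')
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name='cubeMtlSG')
cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
cmds.sets(cube, edit=True, forceElement=sg)
```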
Finally, once I had finished rendering the shots from Maya, I brought them back into Nuke, where I applied the lens distortion and merged the rendered AOVs over the original sequence to create the final shot seen in the image above.
Final Render
Assignment 2
For the second assignment, I plan on modelling and rigging a raptor dinosaur and compositing it into the shot that I have taken for this assignment.
Above is the footage that I have chosen for this assignment; I plan to make the raptor walk from the left-hand side of the screen to the right.
Reference Shot
The reference image above is an example of what I am trying to achieve with my final shots: the raptor walking through the video sequence from the left-hand side of the screen to the right and, near the end, turning towards the camera.
Raptor Modelling Process In Maya
Tracking Scene in 3DE
The images below show the process I took to track the chosen video sequence, while making sure that the deviation remained low to produce the best possible track of the scene.
The image above illustrates the 35 tracked markers that I have created. However, the deviation at the moment is 0.8264 and this is something I need to bring down.
I used different parameters such as the quartic distortion and distortion to get the deviation of the scene down to 0.2320 as seen in the deviation browser above.
Next, I added locators to the scene, aligned the tracked points to the ground and set one of them as the origin point; I also created a distance constraint along the sofa. I added a cylinder to the scene, as seen above, to check how well an object sits in the tracked space.
I then brought all the exported data from 3DE into Nuke so I could display the tracking data and markers over the original image sequence. I made sure to set up a lens distortion pipeline so that Nuke can handle the lens distortion of the image sequence. The Nuke pipeline I have created above is now ready for when I merge in the raptor render sequence that I will create in Maya.
Further Raptor Modelling In Maya
Rendering In Maya
To render out the raptor in Maya, I had to place it within the scene generated by the MEL script that contains all the tracking data from 3DE. I also made sure to create a ground plane that uses an aiShadowMatte shader, which catches the shadows cast by the raptor. I placed four area lights to replicate the lighting in the original image sequence.
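The shadow-catching ground plane could be set up along these lines (a sketch assuming Arnold/mtoa is loaded; sizes and names are placeholders, and the four area lights are created separately):

```python
from maya import cmds

# Ground plane that only catches the shadows cast by the raptor.
plane = cmds.polyPlane(name='shadowCatcher', width=50, height=50)[0]

# aiShadowMatte shader assigned through a shading group.
matte = cmds.shadingNode('aiShadowMatte', asShader=True, name='shadowMatteMtl')
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name='shadowMatteSG')
cmds.connectAttr(matte + '.outColor', sg + '.surfaceShader')
cmds.sets(plane, edit=True, forceElement=sg)
```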
Rendered Shot
First Full Render Test
Final Rendered Sequence