Object tracking from video

Generate tracking points on a video, making it possible to composite scenes into videos or characters into virtual scenes.

Could be real-time, baked data, or both? Baked = analyzing a video file ahead of time to generate control points. Very useful for placing a video file into a scene for augmented reality work. For example, if a film scene had a mirror in it, Vuo could be told the four corner control points of the mirror, and a 3D quad showing a real-time camera feed could then be placed into the movie's mirror. By baking the tracking data into the scene and using it to drive the 3D quad's transform, the result would be very smooth and accurate.
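A minimal sketch of that idea outside Vuo, using OpenCV in Python: assuming the baked data is a per-frame list of four mirror corners (the file names, corner-data format, and capture sources here are illustrative assumptions, not anything Vuo provides), each live camera frame is corner-pinned onto the baked quad and composited into the movie frame.

```python
# Hypothetical sketch: composite a live camera frame onto four baked
# corner points (e.g. a mirror tracked in a film scene), using OpenCV.
import cv2
import numpy as np

movie = cv2.VideoCapture("film_scene.mov")    # assumed pre-recorded footage
camera = cv2.VideoCapture(0)                  # live camera feed
baked = np.load("mirror_corners.npy")         # assumed shape: (frames, 4, 2)

frame_index = 0
while True:
    ok_movie, scene = movie.read()
    ok_cam, live = camera.read()
    if not (ok_movie and ok_cam) or frame_index >= len(baked):
        break

    h, w = live.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])  # corners of the live frame
    dst = np.float32(baked[frame_index])                # baked mirror corners this frame

    # Warp the live frame onto the baked quad and mask it into the scene.
    H = cv2.getPerspectiveTransform(src, dst)
    warped = cv2.warpPerspective(live, H, (scene.shape[1], scene.shape[0]))
    mask = np.zeros(scene.shape[:2], dtype=np.uint8)
    cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
    scene[mask > 0] = warped[mask > 0]

    cv2.imshow("composite", scene)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break
    frame_index += 1
```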

Component: 

Notes from Team Vuo

Vuo Pro: 

No — available with both Vuo and Vuo Pro licenses

"Track Points in Image" node

A node that takes a stream of images as input, and outputs tracked points and their velocities.

Similar to Kineme CVTools: QC Image To CV Image -> BGR CV Image to Grayscale -> Good Points To Track -> Optical Flow Points.
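For reference, here is a minimal Python/OpenCV sketch of that same chain (not Vuo code): find good features to track, follow them frame to frame with Lucas-Kanade optical flow, and report each point's position and per-frame velocity, which is roughly what the proposed node would output. The input file name is an assumption.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("input.mov")  # assumed input file
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                   qualityLevel=0.01, minDistance=8)

while True:
    ok, frame = cap.read()
    if not ok or prev_pts is None or len(prev_pts) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Track the points from the previous frame into this one.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good_new = next_pts[status.flatten() == 1]
    good_old = prev_pts[status.flatten() == 1]

    # Velocity = displacement per frame, published alongside the point positions.
    velocities = good_new - good_old
    for (x, y), (vx, vy) in zip(good_new.reshape(-1, 2), velocities.reshape(-1, 2)):
        print(f"point ({x:.1f}, {y:.1f}) velocity ({vx:.2f}, {vy:.2f})")

    prev_gray, prev_pts = gray, good_new.reshape(-1, 1, 2)
```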

QC equivalent: 

Kineme CVTools


Complexity: 

●●○○ — A few weeks of work

Potential: 

●●○ — Could expand community

mxfx_SphereDisplace_04

mxfx_SphereDisplace_04_vdmx1

mxfx_SphereDisplace_04 takes an incoming image and extrudes a sphere using the image's luminance values.

A Vuo FX plugin for use in VDMX and CoGe. It includes published ports for mapping control data from the host app (a rough sketch of the displacement technique follows the port list). The ports are:

Distance (controls the distance of extrusion)

posX (X position of the sphere)

posY (Y position of the sphere)

posZ (Z position of the sphere)

rotX (X rotation of the sphere)

rotY (Y rotation of the sphere)

rotZ (Z rotation of the sphere)

scaleX (X scale of the sphere)

scaleY (Y scale of the sphere)
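As promised above, here is a rough Python/NumPy sketch (not the composition itself) of the displacement idea: a latitude/longitude sphere whose vertices are pushed outward along their normals by the image's luminance, scaled by the Distance port. The function name and mesh resolution are assumptions for illustration.

```python
import numpy as np
import cv2

def displace_sphere(image_bgr, distance, lat_steps=64, lon_steps=128):
    # Per-pixel luminance in [0, 1].
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    h, w = gray.shape

    # Unit-sphere vertices on a latitude/longitude grid; for a unit sphere
    # the vertex normal equals the vertex position.
    theta = np.linspace(0, np.pi, lat_steps)        # latitude
    phi = np.linspace(0, 2 * np.pi, lon_steps)      # longitude
    theta, phi = np.meshgrid(theta, phi, indexing="ij")
    normals = np.stack([np.sin(theta) * np.cos(phi),
                        np.cos(theta),
                        np.sin(theta) * np.sin(phi)], axis=-1)

    # Sample luminance at each vertex's UV coordinate.
    u = (phi / (2 * np.pi) * (w - 1)).astype(int)
    v = (theta / np.pi * (h - 1)).astype(int)
    luminance = gray[v, u]

    # Extrude along the normal: radius = 1 + Distance * luminance.
    vertices = normals * (1.0 + distance * luminance[..., None])
    return vertices.reshape(-1, 3)
```

The position, rotation, and scale ports would then correspond to an ordinary model transform applied to the displaced mesh, driven by whatever control data the host app maps onto them.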

Composition and supporting files: 
