A little follow-up info from Jesse.
Unfortunately Microsoft does not provide a Mac driver for their SDK, so only libfreenect2 can be used on Macs.
On a Mac you are therefore limited to the Kinect 1 joints. If you have access to a Windows PC you could try using NI mate on it.
I received this response from Jesse at NI mate:
'With the hands turned off they aren't supposed to be outputted. It's possible it's a bug. You can force disabling them as follows:
Enable the hands
Clear all the fields for the finger joints
The fields should no longer be outputted
Chest and Pelvis are specific to Kinect for Windows v2 Microsoft SDK, I think. Other sensors as well as libfreenect2 won't output those joints.'
So sadly this FR is redundant: NI mate with Kinect v2 does not output the extra joints shown in the first image in this thread.
You only get the same joints as with the v1.
Kinect v1 outputs a depth image, great.
From the node documentation: 'This is a grayscale image in which closer objects are colored darker and farther objects are colored lighter.'
I believe a standard depth image should be the reverse of this: closer objects should be lighter, and vice versa.
Have you found a workaround?
Yes, invert the colours.
All the other Kinect apps around output lighter grey as the closer regions.
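As a sketch of the workaround, inverting an 8-bit grayscale depth buffer is a single pass over the pixels. The function name and buffer layout here are illustrative assumptions, not part of the Vuo or libfreenect2 API:

```c
#include <stdint.h>
#include <stddef.h>

/* Invert an 8-bit grayscale depth image in place, so closer
   (previously dark) pixels become light and vice versa.
   Illustrative helper, not an actual Vuo function. */
static void invertDepthImage(uint8_t *pixels, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; ++i)
        pixels[i] = 255 - pixels[i];
}
```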
but both are in separate windows and should be handling the frame rates independently, right???
Maaaaaybe? I'm not sure, to say the least. Have you had the Desktop Video setup window open while looking at the issue? It might be that the driver wants to lock everything down to the same framerate. Using their mixers locks everything in the chain to the same framerate, but I don't know if this transfers to the Desktop Video products (I'd assume so, since it is the same driver). 30 and 29.97 might be so close that you get partial delivery anyway, but it then drops out when out of sync.
Looking at the camera specs, it seems that you should be able to switch both to 1080p 25fps. Can you try that and see if you have the same issue?
Linear -> exponential sounds great (although the nerd in me would love exp f(x) = x^x however useless it would be (disregard this)).
I think I initially assumed the easing would bunch up wherever the time input was when not set to linear. For instance, that it would bunch up at 0.3 if the easing were set to 0.3 (VuoCurveEasing_Middle = 0.5), but both approaches make sense.
This is where it gets out of hand, but perhaps a 3x2 matrix of options for the in and out easing could work?
This way, if both are set to "in", the whole duration is calculated with an in-easing. In-linear gives in-easing until the middle of the duration, then linear the rest of the way; in-out gives in-out; out-in gives middle easing, and so on. In theory this could be hidden from nodes like the curve node, providing the same function it currently has through the Vuo_curve function while giving a few extra options for other nodes. I assume that would mean re-coding at a different level of nodes if implemented, though (unless it gets its own call at the bottom of the if-nest in VuoCurve.c; VuoCurveEasing_damnUnappeasableUsers(...)).
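The split-duration idea above can be sketched roughly as follows. All names and the quadratic easing choice are illustrative assumptions, not the actual Vuo API:

```c
/* Hypothetical sketch of the 3x2 matrix idea: one easing shapes the
   first half of the duration, another the second half.
   Not the actual Vuo API. */
typedef enum { EASE_IN, EASE_OUT, EASE_LINEAR } Easing;

/* Quadratic easing over t in [0,1]. */
static double applyEasing(Easing e, double t)
{
    switch (e) {
        case EASE_IN:  return t * t;                   /* slow start */
        case EASE_OUT: return 1 - (1 - t) * (1 - t);   /* slow end */
        default:       return t;                       /* linear */
    }
}

/* Combine two easings: `first` shapes the first half of the
   duration, `second` the second half, each rescaled to its half.
   Both halves meet at 0.5, so the curve is continuous. */
static double curveSplit(Easing first, Easing second, double t)
{
    if (t < 0.5)
        return 0.5 * applyEasing(first, t * 2);
    return 0.5 + 0.5 * applyEasing(second, (t - 0.5) * 2);
}
```

For example, `curveSplit(EASE_IN, EASE_LINEAR, t)` gives the "in-linear" combination described above: in-easing up to the middle, then linear to the end.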
Render Image to Window uses the refresh rate of the monitor you are rendering to (I think).
The Blackmagic inputs work at whatever framerate you have coming in from the camera.
As your cameras run at different framerates there is a conflict (I'm pretty damn sure that's where the issue begins).
Are you rendering both windows to the same monitor? Have you tried it with two monitors?
Can you tell the windows how often to render with Fire Periodically, and make those framerates match both the cameras and the monitors refresh rates?
Maybe add a Hold Value with a Fire Periodically on the input so that you are telling it to drop frames from the higher-FPS camera? Maybe take them both down to 25 fps.
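The Hold Value + Fire Periodically combination amounts to keeping only the latest frame from the fast source and sampling it at a fixed output rate. A minimal sketch of that pattern, with illustrative names (not the Vuo API):

```c
#include <stdbool.h>

/* Sketch of the Hold Value + Fire Periodically idea: store only the
   latest frame from a fast camera; a periodic timer samples it,
   implicitly dropping everything in between. Illustrative only. */
typedef struct { int latestFrameID; bool hasFrame; } Hold;

/* Called whenever the camera delivers a frame. */
static void onFrame(Hold *h, int frameID)
{
    h->latestFrameID = frameID;  /* overwrite: older frames are dropped */
    h->hasFrame = true;
}

/* Called by the periodic timer (e.g. every 1/25 s); returns the
   frame to render, or -1 if none has arrived yet. */
static int onTick(const Hold *h)
{
    return h->hasFrame ? h->latestFrameID : -1;
}
```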
Do you use Resolume? I believe you could easily set up both inputs as sources in the demo version to see if it works in that software.