Also, is there anywhere you can direct me to that explains how to use that code in an Xcode project?
I would rather have that be a separate function that I can pass a composition to as a variable, rather than a main() function call.
From the other Xcode project, I see there is some setup that happens in main.mm that creates a context object and a VuoRunnerCocoa object.
Would I need to include some code in there, or can I do it in the AppDelegate?
For the most part, the main.m file in my Xcode projects is just the standard template, except for when I've created AppleScriptObjC projects.
Yes, I can totally see the need for speed when it comes to compile time.
And I know VDMX is using a cache-based system in their VuoCache folder to pre-build the executables.
I also know they've built their own Cocoa framework that extends the Vuo framework, and from my observations
of the log they are using the C++ API with VuoRunner rather than the CocoaRunner, which I understand has its limitations
(which I think I was running into. I've come from straight Objective-C programming and am not too familiar with using C++
and mixing the two. I think a few times I was trying to do some bridging stuff but was probably doing it all wrong! haha)
I will experiment with that and start trying to learn the C++ API.
I know it can also be gathered straight from the GraphViz text info in the comp,
which I'm realizing isn't necessarily straight JSON. (I'm also realizing I need to learn more about json_object.)
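For reference, a .vuo file is GraphViz DOT text rather than JSON: nodes with attributes, plus ordinary DOT edges for the cables. A rough sketch of the shape (the node classes, ports, and attribute values below are illustrative only, not copied from a real composition, and the exact attribute set varies by Vuo version):

```dot
/* A .vuo composition is a GraphViz digraph.
   Node classes and attributes here are illustrative only. */
digraph G
{
FireOnStart [type="vuo.event.fireOnStart" version="1.0.0" label="Fire on Start|<refresh>refresh\l|<started>started\r" pos="75,120"];
MakeTextLayer [type="vuo.layer.make.text" version="1.1.0" label="Make Text Layer|<refresh>refresh\l|<text>text\l|<layer>layer\r" pos="285,120" _text="\"Hello\""];

/* Cables are plain DOT edges between node ports. */
FireOnStart:started -> MakeTextLayer:refresh;
}
```

So the overall structure has to be parsed as DOT; the JSON-style serialized values (the kind json_object deals with) only show up inside quoted attribute strings, as with the `_text` constant above.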
Technically speaking, it does that. Layer 2 is on top of layer 1 - but the list is upside down from what you're used to in the UI 🙃. As a workaround, you can use an "invert list" to have it represented as you're used to.
I hope so, keithlang, though Apple is notoriously hard to rely on for software support on undisclosed matters.
I put a project I was developing in QC a few years ago on ice. It was a kind of vision mixer for Zoom, with some flair, I guess you'd say. It would have been gold during the pandemic, with a virtual webcam output for getting it into Zoom, Teams, et al. I was too busy to pick it up, but I'm thinking I'll make it a custom layer for mimoLive at first. It's an mmhmm sort of thing, but with more TV-studio-production-type settings.
Very cool that you were able to find a solution and that the Vuo Team was able to help you out. I’ve been reading more about how macOS supports webcams, and I realize there are some real hurdles: you have to use IOKit or come up with a solution and have it signed by either Apple or by the same developer as the app you’re using it with. I think it’s a security measure to prevent rogue apps from hijacking your camera while it’s in use.
Thanks, I appreciate the advice. I’ve actually dabbled with both Syphon and NDI. Here are a couple of my Compositions:
I’m actually working on something new that could potentially incorporate both Syphon and NDI, and originally I was thinking potentially a virtual webcam output as well, if that could be supported at some point. It’s looking much less likely now, given the way Apple appears to have locked down video access by apps in macOS.
In the meantime, you can output from Vuo as a Syphon image-stream server and pick that up in other apps that can output a virtual webcam (the still-QC-based mimoLive, for example), and there may still be free apps that do this; CamTwist used to do it back in the day, IIRC.
Maybe some of us can look at paying something to make this a Vuo Pro included feature, based on what keithlang commissioned. I don't imagine his specs would be very different from what we're all looking for?