I don’t think this is really a feature-request scenario, since right now there is no way to render movies with a workflow where the user previews the output they’re actually going to get. Does it really make sense to open it up to voting whether the window should match the dimensions being sent, or is this just a bug? Maybe you all are right, though; just throwing that line of thought out there.
The QCTV code example has Cocoa-based code that shows how this could work.
But also, can’t the spawned window just match the size of the texture it is being sent? Nothing fancy is needed if the output window simply matched the input resolution when the composition is initialized. I’d argue the current behavior, which seems arbitrary, is bug-like.
When a composition uses the Image Generator protocol, there seems to be no way to size the output window other than manually with the mouse cursor.
I was attempting to render a composition to a 1280x720 movie file, and noticed that after converting it to the Image Generator protocol, I couldn't find a way to set the composition window to exactly that resolution.
Am I overlooking some place where this can be set by inputting the desired pixel resolution for the output window?
The reason I added ten votes for this last night is that I think it would immensely increase the usability of Vuo as an offline rendering system.
Jaymie, I was trying to figure out how to make a video of the MIDI-based composition I sent you while capturing audio at the same time, or how to render it offline and then join the audio to it. Basically: “how do I make a video of this with audio, whether offline or via live capture?” The feature that creates movies from the Viewer doesn't seem to be configurable to capture audio.
I guess that for audio there is the “Play Audio File” node, which I haven't yet used with offline rendering but am guessing works. With MIDI, I can't think of any way to do this. In QC, Value Historian helped a lot, since it could capture this kind of data and then be used for offline renders.
It is really hard to come up with a workflow for using Vuo to make videos of audio or MIDI visualizations. That doesn't mean it can't be done, of course; it only means I haven't figured it out.
If Vuo could render offline while outputting values from a MIDI file, then a DAW could essentially be used as a linear editor for Vuo values. That would be amazingly powerful.