This recent discussion about the UI nodes got me thinking again about some things I'd like to see happen with them in the future. I've used UI elements quite extensively in my own projects, mostly in apps I've exported, but there have definitely been some things I've found tedious when designing with them. Before I go any further, I want to say how much I like and appreciate what the Vuo Team has done with them so far. My suggestions are simply meant to take away the tedium of designing with them and to speed up workflow.

The difficulty I've experienced has always revolved around easily making visual changes to how UI elements appear in the render window. I've found that if I don't have a very clear layout or mockup made before I start a composition, it is painfully slow to create UI elements with a professional-looking layout. If I need to add a new UI element, this often means manually editing every other UI element one by one to make room for the new button.

So what I am proposing is some type of visual Render Window editor mode that supports drag-and-drop for adding and editing UI elements, layers, and images right in the render window while a composition is running. A simple floating toolbar for maintaining alignment, position, and size, and for changing a UI element's theme and appearance, would be amazing and would really speed up a composition's development time. The Visual Window Editor would intelligently create the required nodes and connections in the composition, such as attaching them to the Render Window node automatically. Even better would be a feature in the composition window where a user could publish ports to the Visual Window Editor so they were available to connect to when adding a UI element.

Thoughts and/or ideas?

Moderator note: 

Relocated here from elsewhere on vuo.org

Comments

I think that another approach

Submitted by George_Toledo
I think that another approach would be to have nodes that look like typical GUI objects appear on the editor surface, selectable from the node library, and then to be able to hide the non-GUI nodes when deploying to a user.

So, you would lay out all of your slider GUI object nodes (for example) in the editor window itself, hook them to layer nodes and such, hit a button, and all the layer stuff goes away, with the user just seeing the GUI.

The (current) approach of using existing stock Vuo nodes to draw buttons, sliders, and such in one window to send messages that control graphics in a different window is definitely workable for many scenarios as well.

The idea of auto-connections in such a case seems like it could be very cool. I remember there were some issues with QC interaction, as far as "what happens when the patches are rendered into a texture buffer," with the mouse event data in the render window possibly not matching up with what the user is seeing. That's not a reason against it at all, just something to keep in mind if something like this were to be implemented.