I hesitated to talk about this point, because I didn't want to get too off topic, but it does tie in with your point.
Macros in QC were great, and I think that in earlier discussions about VUO and the problems of QC, the judgement that macros were wonky wasn't well considered in the context of their workflow benefits. The macro-esque implementation of an iterator was very handy from a workflow perspective.
Relatedly, if I'm stringing together a multi-stage fluid simulation in VUO, trying to do it with either the ShaderToy node or the GLSL node from parabox, it's astounding how many nodes have to be laid out on just one level of a composition. For me, it becomes difficult to actually build the composition at all without being able to section things away in a macro.
Imagine you had a Render in Image patch in QC containing a GLSL shader and a grid, which then needs to be fed typical values. In VUO, everything that would have lived inside the Render in Image patch soon ends up on one level, along with, more than likely, a bunch of extra "build point" type nodes.
So it was GREAT to see that subcompositions work the way they do, but it's pretty hard to work that way because debugging gets difficult. If only one could just click into a subcomposition the way macros worked.
I often find myself almost having to lay things out in QC first, because otherwise it would be impossible to get the composition worked out to begin with, between debugging shaders without error highlighting and working without macros.
Come to think of it, since I updated to High Sierra, specific shader errors no longer even print to Console. It just gives a generic line referencing VUO, without the detail it used to include.
I was noticing some unexpected results when using textures in shaders, and I'm pretty sure it's because the textures don't have mipmapping enabled.
In QC, if you use the Image Texturing Properties patch, it will format the texture so that mipmapping is enabled; so when you feed shaders various random textures and patterns, the results wind up looking MUCH better in some cases. Without mipmapping, some results look very broken.
I hadn't quite realized until recently how robust the subcomposition support actually is.
But this got me thinking: isn't that most of what's needed to create an iterator-type scenario? What happens if a subcomposition is set up to load itself? Or if a subcomposition is fed a structure that causes it to load itself X times?
I haven't tested what happens now, but if it doesn't just work, it seems like some sort of loop could be added to whatever part of the code loads compositions as nodes, giving iterator-like behavior. I'm sure it's more difficult than that, but it got me thinking about that as the basis of an approach.
I will point out that a fundamental part of the image filter protocol in QC is that published x/y inputs automatically receive mouse events in the editor, without hooking anything up, at least through many major versions, possibly still. That's one step better than even having to load and connect the QC mouse patch; and if you wanted to hook it up manually, you still could. Mouse input, and other forms of hardware interaction that need information about the output window's coordinates, are pretty fundamental to the concept of an interactive programming environment, IMO. Just making an argument for it, and against the hypothesis that it doesn't fit.
You all have done such a great job in general, but every so often, the useful defaults and use cases that were easily available in QC just aren't here. I think maybe I coincidentally stumble on those a lot :)
There are so many different mouse-related patches, and if just one of them did what the one in QC did, it would eliminate the need for all of the linking-together. Or maybe some kind of subcomposition could work similarly.