I decided it's time, after a long time out of the music business, to build a new studio.
I have been looking at control surfaces, as I am planning a hybrid approach, but I do not want a large format analog console: I don't really feel it would be used beyond monitoring, and there are plenty of other ways to achieve that. I DO however want the large console workflow, and that started some R&D into exactly what is possible with touch technology in 2020. After trying every touch UI I could find, I ended up here… and really, this software is a class apart. I was really surprised how little information is out there about using touch interfaces to control a DAW beyond the really basic stuff… Slate Raven is a thing, but it's proprietary, expensive and closed-feeling (not to mention rather Pro Tools slanted… yuk). None of that makes for a platform you can truly make 100% perfect for your individual vision, and for me that is paramount (including customizing the visual design, which for me is a HUGE part of what gives a UI that really special extra something). I'm not a use-the-presets kind of guy.
I probably should also just come clean and admit that… I want a studio console that looks cool. There is REAL ergonomic functionality here, but make no mistake: when clients come to my recording suite, I want jaws to drop and eyes to pop. You may think that's not important, but it's seriously the reason why many facilities still operate large format consoles as giant stereo monitoring desks… It sounds shallow, but it really does matter from a business perspective; there is no getting around it. To the uneducated client coming to view your wares: looks impressive? Worth the price. End of. I'm a realist, and I have fessed up right here that this is at least PART of my motivation for building this idea. Don't judge me…
Hardware control surfaces are all very well, but they always feel too small to use like a large console: you cannot see every track, and you certainly cannot just reach for track 50 while riding track 26… it's just not the same. They also struggle to handle things like EQ elegantly, and they don't allow you to customize the position of the control elements… it feels like a poor halfway house, like viewing a beautiful mountain vista through a small keyhole window… you can look left and right and can technically see everything, but it always feels restricted somehow.
So, after a few weeks of hard work, I present version alpha 1 of my new touch interface, which will be the UI for my new studio project: a large format TOUCH console with three 32" 10-point multitouch screens, making a full console experience a reality. This video shows the UI running on just one screen; eventually I will incorporate four of them (three built into the desk, and a fourth on a mobile kiosk, like an old school tape remote for those of you that remember the old days). I don't know why, but I just decided to buy four large format 10-point multitouch screens and go for it, with little to no information confirming (or denying) that it would work… seems madness now, thinking about it… but hey. IT WORKS!!
Caveat: to run all this visual goodness you need a proper GPU in your system, not an integrated Intel jobby. I also discovered, although I can find no other reference to it anywhere (maybe we are on the bleeding edge and that is why), that AMD GPUs (or, more likely, the AMD software) totally break multi-screen touch in Windows. Literally, after installing Catalyst, the touch screens ALL just control the main monitor; wherever you touch them, the input just pops up on that screen. So beware: AMD, for the moment at least, is a deal breaker; NVIDIA is good to go. Again though, you want a proper GTX/RTX GPU. Not competitive-gaming proper (a 1050 Ti is probably fine), but certainly not just a GPU with ports and 2D acceleration only; this stuff needs at least a modicum of 3D rendering juice to work…
The interface currently works with any number of monitors, as it has the necessary navigation to move between screens (as you can see in the video), and it will function fully with only one display. It controls Reaper and is 100% OSC based. I have screens for:
- Faders, with pan, rec enable, automation mode, solo and mute controls, plus a mini stereo VU per channel
- Sends: 6 aux sends, plus automation mode select, solo, mute and a mini VU per channel
- EQ, which automatically inserts the ReaEQ plugin on touch of any control and brings up its UI on the Reaper screen, so you can edit each of 4 bands (HF, HMF, LMF and LF, each with separate Q control) in real time, with mini VU, automation mode and mute/solo
- A full-screen VU mode, with 60 fps VU per channel plus mute and solo
- A master screen with programmable mute/solo groups, master automation controls that affect all channels, and all-channel select/deselect, mute/unmute and unsolo buttons. The master screen also has a 3D GL backdrop of a nebula rendering in real time… just because it's cool… and SSL-style buttons, because if I could do that, I can in principle make it look however I want (and I was too tired for original button design at that late hour)
- A 9-button locator keypad that jumps to the given marker number, plus previous/next marker buttons, a standard transport button group, and loop enable/skip
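For anyone curious how little glue this actually needs: it is all just OSC messages over UDP, and Reaper's default OSC pattern config exposes addresses like `/track/N/volume`, `/play` and `/action`. Here is a minimal, stdlib-only Python sketch of the sending side. The port number and the marker action ID are just examples from my setup; check your own Reaper OSC device settings and action list before copying anything.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a single OSC message (no bundles) with float/int/string args."""
    type_tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            type_tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, int):
            type_tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, str):
            type_tags += "s"
            payload += osc_pad(a.encode())
    return osc_pad(address.encode()) + osc_pad(type_tags.encode()) + payload

# Fire a few example messages at a Reaper OSC device listening on port 8000
# (port and action ID are assumptions from my setup -- yours may differ).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reaper = ("127.0.0.1", 8000)
sock.sendto(osc_message("/track/3/volume", 0.716), reaper)  # fader move
sock.sendto(osc_message("/play", 1.0), reaper)              # transport play
sock.sendto(osc_message("/action", 40173), reaper)          # e.g. next marker
sock.close()
```

That really is the whole wire format for a fader move or a locate button: an address string, a type-tag string, and a couple of big-endian values.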
All the channels are in groups of 15, which fits better for us 5-fingered mammalian operators out there… really, why we still group things in 8s makes me giggle, considering the roots of those numbers back in the days of early tape machines… people have 5 fingers; touch interfaces need to work in 5s.
All screens pull channel names and send names from the DAW in real time.
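Going the other way is just as simple: Reaper's default pattern config pushes things like `/track/N/name` strings and `/track/N/vu` floats back to the device port, and decoding them takes only a few lines of stdlib Python. A sketch, with the same caveats as above (the port is an example, and this handles single messages only, not `#bundle` packets):

```python
import socket
import struct

def _read_string(data: bytes, off: int):
    """Read one null-terminated, 4-byte-padded OSC string."""
    end = data.index(b"\x00", off)
    s = data[off:end].decode()
    off = end + 1
    off += (-off) % 4  # skip padding to the next 4-byte boundary
    return s, off

def parse_osc(data: bytes):
    """Decode a single OSC message into (address, [args])."""
    address, off = _read_string(data, 0)
    tags, off = _read_string(data, off)
    args = []
    for t in tags.lstrip(","):
        if t == "f":
            args.append(struct.unpack(">f", data[off:off + 4])[0])
            off += 4
        elif t == "i":
            args.append(struct.unpack(">i", data[off:off + 4])[0])
            off += 4
        elif t == "s":
            s, off = _read_string(data, off)
            args.append(s)
    return address, args

def listen(port: int = 9000):
    """Print track names and VU values as the DAW sends them (blocks forever)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(4096)
        addr, args = parse_osc(data)
        if addr.endswith("/name") or addr.endswith("/vu"):
            print(addr, args)
```

Point a Reaper OSC device's outgoing port at `listen()` and the names and meters just arrive; that is all the "real-time" magic amounts to.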
I am really in awe of what can be done, with some time and determination, using only OSC… It has not been super simple working all of this out, but the possibilities are starting to look limitless, especially considering that I had nearly finished this first iteration of the design before I discovered the “actions” command type… at which point my mind was quite literally blown and I realized the full potential of what we are looking at. Reaper is wide open to completely custom control ideas in a way that is really hard for me to process at this stage… There are some features I would like to see, mainly around fader functionality, but I will save that for another post when I have everything finished… Suffice to say I am hugely impressed with this software, and I feel that for DAW control it may not be immediately obvious, to either the developer or some of the other users, exactly what it is capable of. Maybe I am wrong about that, but I have not yet seen anything like this done.

We are on the edge of a brave new world where touch surfaces replace physical controls completely. The ability to create this large format console style workflow is a serious game changer, IMHO. Devices like the Slate Raven hinted at it for sure, but this is another thing entirely. I have tested with multiple instances running and can confirm that I can have 4 (or more) screens, each with 10-point multitouch, all working independently as one homogeneous surface. It's incredible, and I am really excited about how the studio build will turn out as a direct result of this feature.

For the devs of OSC/Pilot, I would like to extend a direct request for support with this project, in terms of features that can enhance the use of the software for a studio control room application. I know you will be aware that it can be used for that (perhaps on a smaller scale), but maybe not exactly HOW MUCH it is capable of, even in its current form…
Two features I would really like ASAP are:
- The ability to store settings in the configuration file. At the moment the settings are global, which means that if I want each screen to have its own instance (and I definitely do), they need to run on different ports. Currently, to achieve this, I have to open OSC/Pilot on the screen I want it on, then change the port on each instance, which of course has to be done every time I start up the system… a minor issue, but one that could prove annoying as the setup becomes more complex.
- And this is really a big one… the software needs to not move the mouse pointer. This may sound simple, but I'm sure it will make the devs quake… Right now, because of how Windows handles touch by default, if you touch a screen, the mouse pointer moves to that location. If you then move the mouse, you have to track all the way back to where you were on the DAW screen; sometimes this can be many screens away, and it is annoying AF. At present, if you double-tap anywhere on the screen, a mouse pointer also APPEARS under your finger… that looks super janky and breaks the UI immersion completely. I am pretty sure there now exists some way in the Windows APIs to stop your application doing this, and achieving that would make the solution feel much more slick and remove the only really annoying issue when working with touch and mouse simultaneously… can we make this happen? I hope so…
I will save other feature requests for another post; most of them are either cosmetic or nice-to-haves. As things stand right now, this is ready to use as a 60-channel hands-on surface, and I personally was not aware that this was possible with NO custom scripting, or even the need to use MIDI commands… How far we have come. So THANK YOU to the dev team for bringing us this far. Please, let's keep an eye on this use case together; it's a huge deal for me, and I am sure I am not alone in wanting studio workflows to return to a format where you can see and interact with EVERY channel in your mix without needing to scroll through banks and menus. It truly is the best of both worlds, and it gives me goosebumps even playing with it in this early state.

One final point I would like to add: after working on the UI for over a week straight (with some loong days in there), I have yet to have a SINGLE crash. Not one hiccup in all that time! For any of you that have worked with Pro Tools for any length of time, that kind of reliability is god-tier. I cannot stress enough how super stable this application is. Well done, people; you got that right for sure.
Yes, this is a long post, but also, yes, this is a huge deal. I don't mean my paltry efforts; that's the very tip of a HUGE, industry-shattering iceberg, in my opinion. I seriously believe this is a “mark this day” moment in the world of DAW control. Either by us or by someone else, and soon, this will become the way things are done. Infinitely reconfigurable workspaces? Custom UIs? Personalized layouts and branded interface elements? Welcome to the future… we literally just brought Star Trek to the recording studio… and it's as obvious and inevitable as the smartphone was, as far as I can see…