Hi! I’ve been experimenting with EboSuite for a while, and I am now working on a live piece. The setup:
A drum machine, the Elektron Analog Rytm mk2 (referred to as AR in this post), is the master: it sends clock, transport, audio and MIDI to Live. The AR can send each of its 8 drum tracks as audio and MIDI sequences on individual channels over USB to Ableton.
I have dedicated tracks hosting EboSuite video clips that are triggered by the incoming MIDI from the drum machine.
This works like a charm. Selected tracks of the AR (in my case bass drum, bass tom and hi-hats) trigger video clips live! Mapping the AR encoders to send MIDI to eFX or ISF parameters makes it possible to tweak the sound on the AR and deform the video clips at the same time. Awesome. Now I have a scene that reacts live to the drum machine.
My big workflow concern is about scaling this up to a whole set, containing maybe 10 scenes (I use ‘scene’ here not in the Ableton sense but as a section of my live piece).
Is there a trick to ‘swap’ media on an EboSuite track? E.g. if I want the bass drum to trigger a different video clip at some point (keeping all other settings unchanged), am I forced to duplicate my track, change the video source clip, mute the previous track and unmute the new one? This could maybe be done programmatically with ClyphX, but it seems quite a heavy procedure for a simple result, especially if it concerns 3 to 5 tracks across 10 ‘scenes’.
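For what it’s worth, here is how I imagine the ClyphX version would look: a single X-Clip whose name carries an action list that mutes the old video track and unmutes the new one (the track names “BD Video A” / “BD Video B” are made up for the example, and I may be misremembering the exact syntax):

```
[SCENE2] "BD Video A"/MUTE ON ; "BD Video B"/MUTE OFF
```

But one such X-Clip per ‘scene’ still means maintaining all the duplicated tracks, which is exactly the heaviness I would like to avoid.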
Am I going to overload my system? Of course this depends on the power of the system in question (in my case a decent i7 MacBook Pro 2016 with 32 GB of RAM), but my question is more general: is EboSuite suitable for a 30-minute live set with 20 to 30 EboSuite tracks plus their FX and shaders?
Specific Live/EboSuite question: how can I map the envelope follower on audio track A to a parameter of an EboSuite plug-in on track B?
I’m facing other issues that are not EboSuite related (can I trigger Live scenes with incoming MIDI program changes? How do I properly route audio alongside Syphon to record synced audio and video output…), and I am still evaluating whether EboSuite is the right tool for my purpose. What I am trying to achieve could be done in many ways (I’m thinking of MadMapper, Vezér, …), but I really love the idea of building the whole set in one piece of software and having audio, MIDI and video talk to each other in Live!
I’ve watched tutorials and examples; what is specific here is the live aspect: I am not using clips but incoming MIDI and audio for all interactions.
If you have any interesting examples or references, workflow tips, or global insights please reply!
EboSuite is very powerful and the developers are doing a great job upgrading the tool and engaging with the community, thanks a lot for that!!!
Thank you for your time and happy holidays everyone !