Hi guys, I've been working on incorporating EboSuite into a streaming setup. Sometimes when performing, certain eFX are not responsive. For example, I really like eFX-Wave and I've mapped on/off and zoom to a MIDI controller. During a performance it sometimes takes a few seconds for the effect to show while I'm looping and performing with audio. However, once I select the video track in Ableton that has eFX-Wave, it immediately responds with the changes I envisioned. Is there a reason for the unresponsiveness?
Sorry to hear about your issue.
I know Max for Live plugins do not deal very well with parameter changes when you turn them off and on in the meantime, especially if they are not visible at the moment the parameters change. This might be related to what you are experiencing (and unfortunately it is something out of our control).
Do you only see this in situations where you automate the on/off of the individual plugins? Alternatively, you could automate the Amount parameters of the plugins. Another option you could try is to group a plugin and disable/enable the group; I have a hunch that this might give a different result as well.
Please let me know how it goes.
Thanks for the prompt response. I really only experience the issue when I'm not automating and just turning on an eFX live. I like the idea of automating all parameters and then enabling and disabling them individually, so I will try that.
One more question: I have created automations using RotaZoom, and sometimes the bound mode changes out of nowhere. I prefer Clip or Wrap (not really sure what the difference is), but sometimes the video will play back in Fold or Clear mode while I'm still in Clip or Wrap mode. To get the desired effect back, I just toggle back and forth between Clip and Wrap. Any way to get around this?
Even one more question: when streaming, I'm getting some serious lag that I've been trying to understand. I'm currently running Ableton and EboSuite with 5 video sources, including 4 live cameras. The aggregate of audio and video is sent via NDI to another Mac that only streams the capture via OBS. When there aren't many fx going on, I've managed to account for the lag in OBS (it's about 1000 ms). However, when several fx are going on, I encounter a varying lag amount. Should I just be more mindful and efficient with the eFX that I am using, or is there a way to normalize the lag?
Thanks in advance!
I think I used the term automating a bit sloppily here; I also meant changing parameters via MIDI. The gist is to try to avoid turning devices on and off while you are also changing other parameters of those devices.
I am not familiar myself with adjusting and measuring the latency with the setup you describe.
What I think is happening is that when you use more demanding fx combinations the render fps goes down, resulting in a different latency. Are you recording at a fixed framerate or a variable one? Variable might help in this case.
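To see why the latency varies with the render fps: each rendered frame takes 1000/fps milliseconds, so any drop in fps adds delay frame by frame. A quick illustration (the function name is mine, just for the arithmetic):

```python
def frame_time_ms(fps: float) -> float:
    """Time one rendered frame takes at a given framerate, in milliseconds."""
    return 1000.0 / fps

# At 30 fps a frame takes ~33 ms; if heavy fx pull the render down to
# 20 fps, each frame takes 50 ms, so the end-to-end lag shifts with load.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(20), 1))  # 50.0
```

This is why a fixed delay compensation in OBS can only be correct for one particular render load.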
A different way to normalize the lag is to simply record the HDMI output of your system on your second Mac with something like a Blackmagic HDMI capture card.
It is difficult for us to understand what is happening with eFX-RotaZoom in your case. Do I understand correctly that the menu is still showing Wrap mode, while the output shows the effect of Clear or Fold mode? The mode shouldn’t change out of nowhere, so this might be a bug that we’ll have to fix. I can’t reproduce this bug here, can you maybe send me the project so I can have a closer look at your situation?
The difference between the modes is as follows:
- Ignore/Clear: when the Tile size is smaller than the texture size, the unused part of the texture is empty (transparent).
- Wrap: when the Tile size is smaller than the texture size, the tile is repeated to fill the unused part of the texture.
- Clip: when the Tile size is smaller than the texture size, the pixels on the edge of the tile are repeated to fill the unused part of the texture.
- Fold: when the Tile size is smaller than the texture size, the tile is repeated and mirrored to fill the unused part of the texture.
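The four modes above behave much like texture wrap modes in graphics APIs. A minimal sketch of the idea on a single axis, assuming nearest-neighbour indexing (the function and mode names here are mine for illustration, not EboSuite's internals):

```python
def bound(i: int, tile: int, mode: str):
    """Map texture index i to a source index inside a tile of size `tile`.
    Returns None for pixels that stay empty (transparent)."""
    if 0 <= i < tile:
        return i                           # inside the tile: unchanged
    if mode == "clear":                    # Ignore/Clear: unused area stays empty
        return None
    if mode == "wrap":                     # Wrap: the tile repeats
        return i % tile
    if mode == "clip":                     # Clip: the edge pixels repeat
        return min(max(i, 0), tile - 1)
    if mode == "fold":                     # Fold: the tile repeats, mirrored
        j = i % (2 * tile)
        return j if j < tile else 2 * tile - 1 - j
    raise ValueError(f"unknown mode: {mode}")

# With a tile of 4 pixels inside an 8-pixel texture:
print([bound(i, 4, "wrap") for i in range(8)])   # [0, 1, 2, 3, 0, 1, 2, 3]
print([bound(i, 4, "clip") for i in range(8)])   # [0, 1, 2, 3, 3, 3, 3, 3]
print([bound(i, 4, "fold") for i in range(8)])   # [0, 1, 2, 3, 3, 2, 1, 0]
```

So Wrap tiles the image, Clip smears the edge pixels outward, and Fold mirrors the tile back and forth.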
Thank you, Timo! I’m so close to figuring out the sustainability of this system.
I wanted to know if there is any way to add a video delay to a specific eVideoIn channel. It seems that with streaming setups involving USB webcams and HDMI-in cameras, there is a differential that I can't seem to line up perfectly. I tried adjusting the individual track delay on an eVideoIn, but it ended up messing up other eVideoIn tracks. So is there any way to offset one eVideoIn without messing up the other eVideoIn tracks?
Sorry, Jared, there is no video delay included in EboSuite.
It is possible to create a video delay with ISF shaders, though there you are limited to a theoretical maximum of 16 buffers. (I found this one, but I haven't tried it: https://editor.isf.video/shaders/3817)
Another thing I found was this free application built in Max that might do what you want: https://www.zachpoff.com/software/live-video-delay/ (I also haven’t tried it).
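The buffer-based approach both of those use boils down to keeping the last n frames in a ring buffer and outputting the oldest one. A minimal sketch of that idea, assuming frames arrive one at a time (the class name and API are mine, for illustration only):

```python
from collections import deque

class FrameDelay:
    """Delay a stream by n frames using a ring buffer of n + 1 slots."""

    def __init__(self, n: int):
        # deque with maxlen automatically drops the oldest frame when full
        self.buf = deque(maxlen=n + 1)

    def process(self, frame):
        self.buf.append(frame)
        # until the buffer fills up, the oldest frame we have is repeated
        return self.buf[0]

# A delay of 2 frames: the output lags the input by two steps.
d = FrameDelay(2)
print([d.process(f) for f in [1, 2, 3, 4, 5]])  # [1, 1, 1, 2, 3]
```

The 16-buffer limit in the ISF shader means n can be at most 16 there, so the maximum delay is 16 frames (about 0.5 s at 30 fps); anything longer needs an external tool like the Max application above.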