Logic doesn't support MIDI output from plug-ins.


Mattinay


@fastfourier -- I have found this entire thread incredibly confusing, but I *thought* the consensus was, before 10.3, that there was no way to get MIDI out of an AU plugin in Logic Pro, even with IAC, because Logic wouldn't pass the MIDI from the plugin to the virtual MIDI port.

 

Can you elaborate on what you meant?

 

This is so confusing!


It is possible, as long as the plugin supports it. It will either use IAC (e.g. WIDI) or plugin-created virtual MIDI ports (e.g. Bidule).

 

In both cases the transmitted MIDI data appears at the Physical Input object in the Click & Ports layer of the Logic environment.

 

This happens internally to the plugin, so it avoids Logic's restrictions. Hopefully the developers will now write in support for "native" MIDI out and we can get the data straight out of the channel strip object instead of using a loopback.


So I am getting MIDI out from AUs by using Blue Cat Audio PatchWork. They have an AU MIDI FX version which can wrap around VST plugins. So add PatchWork to the MIDI FX slot, put the VST you want inside there, and then put the External Instrument plugin into the instrument slot of the same channel. Then you'll have MIDI out working. Unfortunately, there is no way that I know of to get MIDI out from the built-in Logic plugins like Ultrabeat until Apple supports MIDI out from AUs.

 

I'm still not sure what this new addition in 10.3.1 is. I haven't found any extra virtual MIDI ports showing up or anything, so I'm really not sure what that reference in the feature updates actually means.


  • 4 weeks later...

Does AU3 fix this situation? I'd like to write a JUCE plugin and standalone app. Ideally, edits to notes in the app would sync directly in Logic, and the same in the other direction: notes deleted/added in one are deleted/added in the other, without even using "record". Am I dreaming in technicolor? Does Scripter/the environment make this possible?

https://www.juce.com/discover


Hi 

I found that Bidule has an AU MIDI out tool that creates a MIDI port at Click & Ports; it also transmits it from the channel strip. So far that's the only AU I've come across (haven't tried the new v2 of Blue Cat PatchWork yet) that does this.

It's a start, and I've yet to experiment to see if it's any more user-friendly than External Instrument/IAC bus etc. Here's hoping.


  • 3 years later...

Hello,

I got Reaktor 6 Player with the plugin "The Mouth", which has a MIDI output.

I read that "srf_audio" knows a way to get the MIDI output of this plugin via the environment window of Logic Pro.

Can it be done, or do I have to change DAW to get this MIDI data?

Thanks a lot for your help/answers/instructions!


From my experience, you have to change DAW. There may be a way to build it in the environment, I guess, but I can't help you on that one; I have no idea how you would access the MIDI output of a component plugin in the environment. From what I can remember (before I gave up and moved everything over to other DAWs), it's the plugin that does not route MIDI. MIDI out is in the spec, but not implemented in Logic nor in most plugins. VST works fine, so see the suggestion above about using Blue Cat's PatchWork and PatchWork Synth plugins to host VST plugins in Logic.

You cannot generate MIDI in a plugin and get it out to the environment directly. Period.

 

You can use IAC in some cases to loop it around and back in, to record to a track or whatever. You can also use PatchWork, Plogue Bidule, etc., to route MIDI from Reaktor to another plugin.

 

You don't necessarily need to change DAW, but you will have to be creative depending on exactly what you're trying to accomplish.


Yup. PatchWork is 99 euro and is awesome, but between hosting VSTs in Logic and routing MIDI to get around Logic's shortcomings (its lack of implementation so far in this area), I found Logic became more unstable: a major increase in the likelihood that Logic would crash. Not the fault of PatchWork from what I could tell; I found it to be rock solid. As soon as MIDI is involved with Logic outside of its comfort zone, though, make sure you save save save save save, or lose lose lose lose lose. :-) My experience anyway. More robust was to ReWire Reaper in slave mode to Logic as host and run the VSTs in Reaper, but that had other instability and the limitation of one "port" of 16 channels in Reaper. Either way, too flakey for me and too much wasted time, so I moved on to other DAWs for those sorts of projects. Much more efficient, I personally found, for my way of working and wants/needs.

If you route the MIDI around over IAC, you have to be careful about not creating a MIDI feedback loop, which can cause Logic Pro to freeze up. It's totally manageable and crash-free, but you have to pay attention to what you're doing in that regard.

 

If you use one of the subhoster plugins, then you won't have to worry about that. Some of those are free, by the way. You have several options:

 

PatchWork, Plogue Bidule, DDMF, Kushview Element, Image-Line Minihost Modular... to name a few. I currently like Kushview Element as the best free choice.


Thanks, Dewdman. I found that with one instance there was no issue; with many instances, trouble. No loopbacks, just unstable. Anyway, it is what it is. I used the free DDMF "Connect" for ages before they pulled it; it was fantastic, if not always consistent/honest about its latency and overheads. DDMF MetaPlugin is good. I will give Kushview a looksee. My experiences with Bidule have not been pleasant.

You cannot generate MIDI in a plugin and get it out to the environment directly.

You can. Kind of. If it's continuous controller type data, you can sneak it out as automation/fader events without too much trouble. Notes would be a little harder to encode/decode.
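
(As a rough illustration of why notes are harder, here is an untested Scripter-style sketch of one possible encoding: pitch and velocity each smuggled out through their own hidden automatable parameter. The parameter names are invented for the example, and the environment side would still have to reassemble the pitch/velocity pairs, which is the awkward part.)

var PluginParameters = [
    // two hypothetical hidden parameters, named only for this sketch
    {name: "notePitch", type: "lin", minValue: 0, maxValue: 127, numberOfSteps: 127, disableAutomation: false, hidden: true},
    {name: "noteVel", type: "lin", minValue: 0, maxValue: 127, numberOfSteps: 127, disableAutomation: false, hidden: true}
];

function HandleMIDI(event) {
    if (event instanceof NoteOn) {
        SetParameter("notePitch", event.pitch);
        SetParameter("noteVel", event.velocity); // velocity last, acting as the "trigger"
        return;
    }
    if (event instanceof NoteOff) {
        SetParameter("notePitch", event.pitch);
        SetParameter("noteVel", 0); // velocity 0 stands in for note-off in this encoding
        return;
    }
    event.send();
}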

 

Here is an example with a Reaktor LFO controlling the ES1 cutoff on another channel via MIDI CC. After loading the .ens file into Reaktor, turn off Settings -> Sync to External Clock and click the "START" button on the panel to get it going. Maybe someone clever can figure out how to make it do a "loadbang", but I am not that person.

 

https://www.dropbox.com/s/76ri2uaxy1yndq9/automazeontest.zip?dl=0


I don't have Reaktor 5 installed on my system anymore (only Reaktor 6), so your project doesn't load properly for me to try it out. I presume you are generating automation events in Reaktor, which are seen in the environment as fader events; you can then transform those back to MIDI CC events in the environment if you want.

 

I have done similar things with Scripter: you can enable automation for any plugin parameter, hidden or not, and it will then show up in the environment as fader events, which can be converted or used in the environment.

 

However, there are various problems with this that make it difficult to deal with. For one thing, the identifying fader number changes depending on your mixer channel setup, so it's difficult to create a single reusable transformer circuit in the environment that will work across projects. For another, the timestamping of events is lost. Interestingly, when you generate an automation event in Scripter (and I presume in Reaktor 5 as well), there is no timestamp: it happens at the time the plugin processes its process block and goes immediately to the environment, as if performed live. That's kind of OK, but exact atomicity of events cannot be ensured, particularly if you are combining that with other MIDI events which ARE being scheduled in the MIDI scheduler. The generated fader events will probably be slightly early.
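
(A minimal Scripter sketch of that asymmetry, assuming the NeedsTimingInfo flag is enabled so the host provides event.beatPos: ordinary MIDI events can be scheduled to a beat position, but SetParameter has no timing argument and fires immediately. The "cc1" parameter name is made up for the example.)

var NeedsTimingInfo = true; // makes event.beatPos available

var PluginParameters = [{
    name: "cc1", // hypothetical hidden target for CC 1
    type: "lin", minValue: 0, maxValue: 127,
    numberOfSteps: 127, disableAutomation: false, hidden: true
}];

function HandleMIDI(event) {
    if (event instanceof ControlChange && event.number == 1) {
        // no timing argument exists here: the fader event is emitted "now",
        // during the current process block, with no timestamp
        SetParameter("cc1", event.value);
        return;
    }
    // ordinary MIDI events, by contrast, can be scheduled to a beat position
    event.sendAtBeat(event.beatPos);
}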

 

But you kind of have the same problem when you go over IAC: the timestamp is lost... so... there is that...

 

I will try to make and share here a Scripter script that basically converts MIDI CC to fader events. You could then theoretically put it as the last MIDI FX on the mixer strip, which would convert all CC events into fader events; then use an environment transformer to convert them back to CC in the environment. But you'd also have to constantly update that transformer circuit with the exact fader ID numbers, depending on how many MIDI FX plugins are on the channel, I believe. So it would be kind of annoying to work with, but definitely doable. Using IAC is probably more straightforward overall.


The sloppier timing of live MIDI may be perfectly fine for many uses; I'm just pointing it out in case it matters for someone. MIDI timestamping will be lost with this trick, as it will be with IAC too, I might add. It can also get a little weird with the MIDI channel numbers, because the environment treats the channel strip's object ID as the MIDI channel, rather than an actual MIDI channel.

 

It's definitely possible, though, with some fiddling...


Just for the sake of explanation, here is a simple Scripter script that converts any CC encountered by Scripter into a fader value:

 


var PluginParameters = [];

// register 128 hidden, automatable parameters, one per CC number
for (let num = 0; num < 128; num++) {
    PluginParameters.push({
        name: "cc" + num,
        type: "lin",
        minValue: 0,
        maxValue: 127,
        numberOfSteps: 127,
        disableAutomation: false, // must stay automatable to reach the environment
        hidden: true              // keep the 128 sliders out of the plug-in UI
    });
}

function HandleMIDI(event) {

    if (event instanceof ControlChange) {
        // the parameter index matches the CC number,
        // so this emits the CC as a fader event
        SetParameter(event.number, event.value);
        return;
    }

    event.send(); // pass all other events through untouched
}


 

Then you have to use a transformer in the environment, like this:

 

[Attached screenshots: the environment circuit and the transformer settings.]

 

In the above, inst1 has the Scripter script, which converts all incoming CC events into fader events. Then, in the environment, we capture only the fader MIDI events, convert them back to CC events, and pass them on to another channel strip.

 

Also note that we have to subtract 19 from the data1 byte; that is because the mixer channel itself must have 19 other numbered fader values, so in this case the Scripter plugin starts at 20 instead of 0, or something like that. If there are other plugins above Scripter on the channel strip, then a value other than 19 may have to be used.

 

That's more or less a general solution that may explain what is possible. But there are complications. The MIDI channel has to be explicitly set in the environment, for one thing. And the MIDI timestamp, if the event was generated in a MIDI FX plugin, will be lost; it becomes like live MIDI, same as if routed over IAC. So it's not sample-accurate, but that may be fine in many cases.


My comment about the MIDI channel is because, in your example, ES1 is basically operating in OMNI mode, so it doesn't matter; but the fader events are being converted to CC events on MIDI channel 2 in this case. This doesn't affect your use case, but if someone were trying to use a multi-timbral instrument that listens to the MIDI channel, then it would matter.

Hello,

I got Reaktor 6 Player with the plugin "The Mouth", which has a MIDI output.

I read that "srf_audio" knows a way to get the MIDI output of this plugin via the environment window of Logic Pro.

Can it be done, or do I have to change DAW to get this MIDI data?

Thanks a lot for your help/answers/instructions!

 

So, in more direct answer to this question: I just had a quick look at The Mouth's user manual. I don't think this will work in Logic Pro without a third-party subhoster plugin. The reason is that AU plugins basically do not output MIDI. Well, they can, but Logic Pro ignores MIDI from AU instrument plugins; it's not in the spec, and Apple respects the spec. AU plugins can output MIDI, and some other hosts will detect it, but Logic Pro does not, so it won't work.

 

In the Reaktor example that fourier gave, Reaktor is not actually outputting MIDI; it is outputting fader automation, which is then converted back to MIDI inside the environment. Same with the Scripter example I made. But The Mouth appears to output MIDI directly, not through fader automation, so I don't think it would work. It would work if you were using The Mouth in the MIDI FX slot, but then it would not be seeing the audio for input and output, I guess?

 

The reality is that The Mouth would need to be set up in Logic Pro similarly to StutterEdit, as a MIDI-controlled AU FX plugin, and even in that case I guess it can't output MIDI. So it's kind of an interesting situation, but I think there is no way around using Plogue Bidule or one of the other subhost plugins already mentioned, together with the VST version of Reaktor, which should then handle all the audio and MIDI as you are desiring. Running out the door right now; I will try to play around with The Mouth later to see if I can figure anything out, but I think that's most likely all there is to say about it.

