External MIDI object

My External MIDI object apparently does not read what my MIDI port sends it. All control stays with the regular MIDI mixer object, so I can't use audio plugins on it. What did I miss?
Boe

Here are the steps:
Use a software instrument track with MIDI regions.
Assign the input of that track to External Instrument.
On the External Instrument plug-in GUI, set the MIDI destination to your external synth's MIDI channel, and set the input to the inputs on your audio interface that your synth's audio outs are patched to, so that the audio from your external synth is sent into Logic.
Assign effects to the inserts of that software instrument track, or use the sends to a bus.
That should do it.

Similar Messages

  • HELP! External MIDI is playing EARLY compared to the audio tracks!

    Hi Guys
    I have a project in L9 which has about 12 audio tracks, each running a few effects (fairly basic ones, nothing too heavy).
    Tonight I am recording a rhythm part on an external MIDI device (my Kurzweil synth). I'm quantising this part to 1/8 notes, and in the piano roll editor, it's lining up perfectly after quantization.
    However, when I play back the project, the MIDI is all EARLY!
    I've tried soloing just the MIDI and the click, and the MIDI part runs uniformly early compared to the click track.
    I believe this has something to do with the delay compensation that is in effect across all the audio channels... what's happening is that all the audio content is being delayed so all the tracks sync with each other. But for some reason, MIDI is not part of this delay compensation, and is playing in perfect time, and hence sounding early compared to the audio material.
    I've done a test new project with just one MIDI track, and when quantized, it DOESN'T suffer this problem... ie, it aligns perfectly with the click track.
    Is there any way to get external MIDI instruments to be part of the delay compensation process? Or is there some other solution to this issue?
    Really desperate here as this has come out of the blue and I can't think of any way to solve it!
    Thanks heaps guys,
    Mike

    yeloop wrote:
    Is this something new for Logic 9?
    I don't know, I have never used external MIDI with Logic (8 or 9) - but I think it is new, yes.
    I wonder if there's a way to set the global default for MIDI so that it always has delay compensation? I can't see why anyone would want to use an external MIDI device and not have it in perfect time with their audio material!
    Yes, there is. You'll have to set it up in AMS first (choose *New Device*, and then configure the device and cable it). In Logic, you'll have to go into the Environment. You need a *Standard Instrument Object* for a monotimbral device, a *Multi-Instrument Object* for multitimbral devices, and a *Mapped Instrument Object* for drum modules. The explanation starts on pg 1082 of the LP9 PDF manual.
    Once it is all setup, you can make a template of it, or you can import the Environment layer into new projects.
    regards, Erik.

  • How to control external MIDI multi-timbre device

    I have an external midi device that has distinct buttons with which I can control on/off individual sounds. The device has a single midi channel but many different sounds that can be turned on/off in any combination - i.e., I can have two or more sounds turned on which the device mixes down to its stereo outputs.
    Each specific sound has a name and can also be controlled with a midi message in the form of "Bn 49 xx" (on event) or "Bn 4A xx" (off event). I would like to be able to control these sounds from logic pro.
    Note that I have deliberately called these "sounds" rather than "patches". The fundamental difference is that you can only select a single patch at a time, whereas I need to be able to turn on/off any combination of these sounds.
    Hope it is clear to someone what I am trying to do... your comments are most welcome.

    After some serious support from Pancenter (thanks again!), I arrived at a working configuration using Apple’s Environment editor. In the hope that my learning experience is of use to someone else, hereby a quick write up.
    My objective was to control the stops (programs) of two pipe organ modules from Logic Pro 9, preferably with a nice graphic interface.  The modules are broken up into 3 sections that each have a direct midi channel. I already set them up on my midi controller and could play a midi file on them just fine but not the stops. Rather than a typical synth program switch, stops on an organ can be turned on/off in any combination thereby mixing the sounds together.  The documentation I had showed these midi byte sequences for turning on/off a stop:
    Bn 49 rr  = stop on
    Bn 4A rr = stop off
    with ‘n’ being the midi channel, and ‘rr’ the hex code for a particular stop.  With Pancenter’s help, I started playing around with a Fader/Button 3 on the “Click and Ports” layer from the Environment (command 8).  A button can be used to send a variety of midi message types, in my case I need a Control message with its channel set to the appropriate midi channel of the pipe organ module (channel 2 for me).  A midi control message has the form <control byte> <byte 1> <byte 2>. The byte 1 field holds the specific control and byte 2 the value for that control.
    Buttons send their max value when on (depressed) and their min value when off; that value goes out as byte 2.  As you can see from the MIDI sequences listed above, this creates a bit of a complication in my case, because the value ought to stay the same while the control (byte 1) must change.
    I decided to set the button's min value to the stop code I wanted to control, and the max value to that code +1. So my button's outgoing sequences are now (shown in decimal):
    Bn 74 34    stop off
    Bn 74 35    stop on (this one still needs to become Bn 73 34)
    (Note these are decimal: 74 = 4A hex, 73 = 49 hex, and 34 = 22 hex.)
    Using a transformer object right after the button, I detect the Bn 74 35 sequence and change it to the desired Bn 73 34. As it turns out... easy as pie when you get a little help from a friend ;-)
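    For anyone who wants to sanity-check the byte values outside the Environment, here is a minimal Python sketch of the same stop messages (the channel and stop code are examples from my setup; it only builds the bytes and doesn't talk to a MIDI port):

```python
STOP_ON = 0x49   # "Bn 49 rr": stop on  (73 decimal)
STOP_OFF = 0x4A  # "Bn 4A rr": stop off (74 decimal)

def stop_message(channel, stop, on=True):
    """Build the raw 3-byte control-change message for one organ stop.

    channel is 1-16 as printed on the module's panel;
    stop is the module's code for a particular stop (e.g. 0x22 = 34 decimal).
    """
    status = 0xB0 | (channel - 1)  # Bn = control change on channel n
    return bytes([status, STOP_ON if on else STOP_OFF, stop])

# Channel 2, stop 0x22 -- the same bytes the transformer produces:
on_bytes = stop_message(2, 0x22, on=True)    # hex: b1 49 22
off_bytes = stop_message(2, 0x22, on=False)  # hex: b1 4a 22
```

    Actually sending these would need a MIDI library (mido, for instance), but the byte math is the part the button/transformer setup has to get right.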
    This is the button with output to the transformer:
    The inspector details for the button:
    The transformer setup:
    The entire pipe organ module with stops, just need to clean up the names a bit and we're done. Notice that I used a fader text object for the stop headers.
    I am a 2-3 week user of Logic Pro and must admit that I am starting to be thoroughly impressed with its capabilities. The Environment editor is ahhwwsome though perhaps a bit daunting at first.   Thanks again Pancenter for your totally cool help !

  • Can't get the patch/bank window for an external midi device to stay open

    I am going to be very explicit with the description of my problem.
    I am running Snow Leopard (10.6.1) and using Logic 9.0.1 on a MacBook Pro (2.16GHz, Intel Core 2 Duo, 4 GB memory).
    I create a new empty project and add an external midi track.
    I click on the track and select my Roland XV-5050, Channel 1 in the Library window.
    I open the Environment window.
    I choose MIDI Instr. from the pull down menu.
    I select the XV-5050 icon.
    I choose Define Custom Bank Messages under the Options menu.
    I define all the Bank changes for my XV-5050 and close the Define Custom Bank Messages window.
    I right-click on the XV-5050 icon in the Environment window and select Open Object Editor.
    The patch/bank window opens and then immediately closes.
    I can't get the patch window to stay open.
    The same thing occurs when I double-click on the MIDI track in the arrangement window.
    Could someone let me know if they are able to get the patch/bank window to stay open or is it a bug?
    Thanks
    Scott

    Scott,
    Although this doesn't have much to do with your question, could you enlighten me as to how you connected your XV-5050 to your Mac with Snow Leopard? I can't get my Mac to "see" the XV at all.

  • Problem with external MIDI routing has to be MainStage 2 bug

    I posted a bug the other day where I was getting occasional bogus MIDI note events in one external synth while playing a keyboard that was (supposed to be only) sending MIDI data to a different external synth.
    I have now performed two experiments that would seem to indicate that the problem must be caused by MainStage 2 and not by CoreMIDI or by MOTU drivers/external MIDI interfaces.
    Experiment 1:
    I created a new Digital Performer sequence, enabled multi-input mode and routed one keyboard controller to one external synth and another keyboard controller to another external synth, essentially reproducing the MainStage 2 patch. Played both keyboards together for some time with no problems whatsoever. (In MainStage 2, I'd get a glitch pretty quickly)
    Experiment 2: (to eliminate the possibility that Digital Performer (a MOTU product itself) wasn't doing something special)
    I used Max/MSP and created two midiin/midiout connections, one representing the first keyboard controller and external synth, the second representing the second keyboard controller and external synth respectively (duh). Again, I played both keyboards for a while and everything worked perfectly, no glitches.
    I have reported this to Apple several times via their feedback but have heard NOTHING back from them.
    Original bug posting
    (http://discussions.apple.com/thread.jspa?threadID=2137282&tstart=0)

    If I understand you correctly, I think I experienced a similar problem. However, in my case, I don't have two separate midi input devices, but rather one; and two separate external devices I am wanting to control with midi.
    Brief background: I use an EWI to play various softsynths & now external synths via MS2. I also route the audio from my EWI into MS2 for additional processing and I use an external effects processor (Eventide Eclipse) for additional processing and sound manipulation. I send program changes with my EWI to change patches using MS2. My EWI transmits only on midi channel 1, which MS2 is set to receive on. Also, my Eclipse is set to receive midi data on channel 5 (a method I used to keep things separate in my all hardware rig when MOTU MTP AV was my switching device).
    Initially, I thought I could just have my EWI send a program change message on channel 1, and then MS2 would send a program change message to the Eclipse on channel 5, all from the same MS2 keyboard object. It worked, but with one small glitch ... after the program change happened and I started playing my EWI, the Eclipse began receiving my MIDI note information, which confused the Eclipse and made it start trying to change patches, etc. I checked the keyboard object, which was not supposed to transmit on channel 1 to anything external, but I could not figure out how to set it up. Note: none of my devices are set to OMNI, because I learned a long time ago that I want to control routing and specific channels to ensure their separation ... or it's supposed to, anyway.
    What I ended up doing was to create another midi object (this time the little spherical thing, not the keyboard), and assigned it to transmit on midi channel 5 and not receive on any channel. This turned out to be the solution / work around for this situation.
    Another problem I am having with MIDI control when using MS2 as a go-between is an issue with pitch bend information getting messed up as it's sent to my external synth (Yamaha VL70m). For some reason, playing the VL70m by way of MS2 forces my EWI pitch-bend data to hold the VL70m pitch-bend permanently down, making all notes flat. My solution in this case was to set up a direct link with MidiPipe, from my EWI to my VL70m, and filter out any program changes so that I can use MS2 to send a patch change to it but use my EWI to play it directly. It seems to work well, and my pitch-bend is back to normal, but I am not certain (and don't know how to test) whether the VL70m is now getting duplicate note on/off information with the potential of a minuscule delay from MS2 being in the path, as opposed to playing direct from my EWI. It seems unnoticeable to my ears, but it's in the back of my mind.
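    The MidiPipe filter described above amounts to a pass-through that drops program changes (status byte Cn). Here's a minimal sketch of that idea on raw MIDI bytes (not MidiPipe's actual internals, just the logic):

```python
def is_program_change(msg):
    """True if the first byte is a Cn status (program change, any channel)."""
    return len(msg) > 0 and (msg[0] & 0xF0) == 0xC0

def filter_stream(messages):
    """Pass everything through except program changes, mirroring the MidiPipe
    setup: notes and pitch bend go straight to the synth, patch changes don't."""
    return [m for m in messages if not is_program_change(m)]

stream = [
    bytes([0x90, 60, 100]),     # note on, ch 1 -- kept
    bytes([0xC0, 12]),          # program change, ch 1 -- dropped
    bytes([0xE0, 0x00, 0x40]),  # pitch bend, ch 1 -- kept
]
kept = filter_stream(stream)  # note on + pitch bend only
```

    With this in the direct path, patch changes can still reach the synth from MS2 on its own channel while the played notes bypass MS2 entirely.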
    Anyway, these are some of my midi episodes and I think it's pretty clear that Apple needs to quickly address some issues about midi handling and MS2.
    Hope this helps ... and maybe we can come up with some more creative ways to deal with these "undocumented features" (to borrow a saying from the Windows world).

  • PDC & External Midi Device Delays

    Hi All,
    Got my head around the PDC - i think , but -- Logic states that :
    ' Another effect with delay compensation set to All is that MIDI tracks triggering external sound modules will be out of sync. This is because Logic has no direct control over the audio output of external devices. A possible solution for this would be to route the audio outputs from the external MIDI devices to inputs on your audio hardware and monitor them through Logic. This way, the audio streams from the MIDI devices can be compensated during playback. Using Logic's External instrument to route MIDI to your external devices is an ideal way to work in this situation. '
    Question: what would you do if all of your inputs were already being used by other sources (hardwired, or if you only had 2), or you didn't have enough inputs for simultaneous internal monitoring of your MIDI gear, but you still needed the All setting for PDC selected because of the latency induced by plugins? How can you compensate for the external MIDI delay in the new Logic 7 if you can't monitor internally as they suggest? This was not the case in version 6, and I am pretty worried my clients' projects will all now play out of time! I hope I explained it correctly and you understand my question. Please don't say bounce them!!
    Thanks, and I look forward to your suggestions

    Presumably, all you would need to do is have a stereo submix of your external MIDI gear coming up on a stereo input object (software monitoring on, obviously.)
    The whole submix would therefore be PDC-delayed as necessary to match up with the rest of the audio, and you could bounce (real-time only of course) to include all your external instruments in one go.
    What I do is have all external hardware coming up in a mixer (Mackie 1604-VLZ), so you could record individual channels of submixes back to the computer as necessary.
    However, the best solution for most cases is to record your external MIDI gear as audio, which is what I do when I've finished editing, keeping the MIDI tracks just in case.
    This is also handy as you can process with internal FX, automate and offline bounce, and don't need to worry about having all the hardware when you open up the song in a couple of years. (Although you should keep sysex dumps etc as well, obviously...)
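    To make the "All" behavior concrete, here is a small sketch of the arithmetic Logic is doing (the latency figures are invented for illustration, not real plugin numbers):

```python
SAMPLE_RATE = 44100

# Hypothetical plugin latencies per stream, in samples.
latencies = {'drums': 0, 'vocal_bus': 2048, 'uad_bus': 4096}

max_latency = max(latencies.values())

# With PDC set to All, each stream is delayed by the difference,
# so every stream arrives max_latency samples late -- together.
compensation = {name: max_latency - lat for name, lat in latencies.items()}

# An external MIDI synth monitored outside Logic gets no such delay,
# so it leads the compensated mix by max_latency samples:
early_ms = max_latency / SAMPLE_RATE * 1000  # about 93 ms at 44.1 kHz
```

    Routing the synth's audio back through a Logic input puts it inside this scheme, which is why the submix suggestion above works.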

  • How do you set Korg Karma as EXTERNAL MIDI?

    Hi, I have a Korg Karma and I know that I'm supposed to use external Midi so that I can use the Logic effects. Can someone please give me a step by step? It doesn't seem to work. I got my Mbox to work just fine, but as far as using an EXTERNAL MIDI...help!!
    thanks!


  • MainStage does not control external midi devices

    in the Environment I can use the "External Instrument" object to be able to access external legacy devices such as synths and effects units. Since this capability already existed in the Environment, why wasn't it added to MainStage? I have seen this feature request on almost all "rack" type hosting systems such as NI Kore, and it seems to be a terrible oversight on the part of Apple. Can we supplement this by using MainStage objects in the Environment or somehow add this capability using Environment objects to bring this feature to MainStage?

    well I think it should be a little more than just able to send program changes. It should also be able to access the continuous controller mapping through the MainStage layout, it should pass through all midi notes, etc. to the device and it should be able to act as a conduit for the midi clock / tempo features of MainStage. But yes, it doesn't appear to be that difficult to create; you know ... if Apple had thought of it, they could have actually used the External Instrument object which they discuss at great length in Chapter 10 "Working With Instruments and Effects".
    Someone really didn't think this through ... this is fundamental.
    P.S. I came across this AU plugin that was designed for GarageBand for the same purpose. It's missing a few things, like the ability to send an initial program change, and I am not sure if it can pass through MIDI control from another external controller such as a guitar, keyboard, etc. I sent an email to the developer with suggestions. It's called MidiO:
    http://mysite.verizon.net/retroware/

  • Using a second computer as a external midi instrument or DAW slave?

    So this is my current setup:
    iMac 24(FW 800 & 400) with logic 8 & reason 4,
    Focusrite Saffire Pro 40 firewire (x2) sound card/pre-amp,
    Blackbook with MainStage & Reason 4.
    & this is what I would like to do:
    Use my iMac as my main DAW master (currently tracking to an external FW800 drive & pulling loops from a USB drive), and use my Blackbook to run Reason as an external MIDI device or slave.
    My Saffire Pro 40 has 2xFW400 ports so I can connect both machines to it.
    The idea behind this setup is to pull as many software sounds from the Blackbook & mainly use the imac to run Logic.
    Does this make sense? & if so, what's the best way to execute this?
    thanks in advance

    Hello -
    I'm not entirely sure that one audio device such as the Saffire Pro 40 with 2 Firewire ports can be shared between two computers. It was always my impression that the second Firewire port was available to "daisy chain" another audio device or hard drive. Before you proceed, you might want to check with the company that manufactures the Saffire Pro 40 and make sure that two computers can interface with it.
    With regards to using a second computer as a "slave machine", it seems that many, many people do have such configurations to help spread the CPU and memory work-load. My happy Mac Pro is set up to work with a potential of two other "slave machines". When I use this setup, it works wonderfully. But, for most of the time, my happy Mac Pro can manage most sequencing projects (mostly orchestral with LOTS of instruments) on its own. Each computer, however, has its own audio device. All are connected to the Mac Pro using the ADAT lightpipe inputs/outputs from each audio device (I use MOTU's audio devices) and are managed MIDI-wise using Music Lab's MOLCP 3.
    Good luck. . .

  • External MIDI device, can't get more than one sound, Logic Express 9 & Korg N364

    Hi, I'm just beginning to use Logic. I'm trying to get Logic Express 9 to work with my Korg N364 keyboard/workstation as an external MIDI device. I can only get one sound at a time, even though I create multiple tracks and assign different programs (or patches) to them.
      Each time, I do the following: I add a new external MIDI track, change the channel to a new number, put a check in the Program box (as well as Volume & Pan), and choose an instrument or patch from the Program drop-down menu. What happens, though, is that it changes the sound to that instrument alright, but ALL THE PREVIOUS EXTERNAL MIDI TRACKS NOW HAVE THE SAME SOUND. I've followed the tutorials I found on the web and it seems like I'm doing everything correctly in Logic. I hooked up the MIDI cables, added a new device and connected virtual cables in the MIDI Studio window of Mac OS, and added the proper Korg multi instrument in the Environment window of Logic and "un-slashed" the 16 channels. I get the sounds from the Korg no problem; it's just that I can only assign one sound and no more to the external MIDI tracks. Is there something in my Korg I need to set? Or am I missing something in Logic?

    It's not Logic, it's how you have the Korg setup.
    I'm not familiar with that model but it is a workstation so I'm pretty sure you can set it to a "Multi" mode.
    Right now you are running it in single patch mode.
    In single patch mode, only one sound is available on a single MIDI channel. Your Korg probably has a Multi or Combi mode; this is where multiple sounds can be played, and each sound has to have its own MIDI channel.
    You will need to set this up on the Korg first.
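    Once the Korg is in Multi/Combi mode, each Logic track just addresses its own channel. As a rough illustration of the bytes involved (the patch numbers here are hypothetical, not real N364 programs):

```python
def program_change(channel, program):
    """Cn pp: select a patch on one channel (channel 1-16, program 0-127)."""
    return bytes([0xC0 | (channel - 1), program])

def note_on(channel, note, velocity=100):
    """9n kk vv: play a note on that same channel."""
    return bytes([0x90 | (channel - 1), note, velocity])

# Three tracks, three channels, three independent sounds:
setup = [program_change(1, 0),   # e.g. piano on ch 1 (made-up patch numbers)
         program_change(2, 33),  # e.g. bass on ch 2
         program_change(3, 48)]  # e.g. strings on ch 3

# A note sent on channel 2 only triggers the channel-2 sound:
msg = note_on(2, 36)  # hex: 91 24 64
```

    In single mode the synth ignores the channel distinction, which is exactly the "all tracks change to the same sound" symptom described above.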

  • When I try to open a project, the only option for a new track is for external MIDI device; when I try to open an audio or software instrument track, I get an error message saying that "the specified number of tracks could not be created. Reduce the n

    When I try to open a project, the only option for a new track is for external MIDI device; when I try to open an audio or software instrument track, I get an error message saying that "the specified number of tracks could not be created. Reduce the number of tracks to create and try again." I didn't specify ANY number of tracks. Thanks for help!

    Ah.. then I suspect you just need to go to Logic's prefs/Audio and check CoreAudio...
    It has been unchecked somehow...
    Also.. as an aside.. Be careful not to create such large subject names in future when posting here.. They can cause issues with replying to posts in such threads.. due to a bug in the forum software.. so please keep them short and to the point...
    Cheers..
    Nigel

  • Why does Playback stop when I have an external MIDI channel?

    I finally have to try to sync up external gear with Mainstage 2.2.2. This time it's sending midi clock to an Akai MPK25. I set up the external midi instrument channel, set the channel's midi out to port 1 on the Akai, set the Akai to external clock. The Akai seems to pick up the tempo (mostly) but my Playback instances that start with Play action cut out after about 1 measure (yes they're set to Loop).
    Doesn't matter which instance. The clock keeps going, but there's a substantial lag on the transport controls. After a few seconds I can retrigger the Playback that stopped, and then it loops as it should.
    As soon as I delete the external midi strip, everything is back to normal. Has anyone else experienced this? Is it fixable? Maybe an Akai thing?
    I've wrestled with MS to get it gig-worthy for a year and some new bug always seems to pop up.
    System:
    iMac 20" Core2Duo 2GHz, 4GB RAM


  • External midi interface connection. Please HELP!!!!

    Hi,
    I just got the iMac 20 inch widescreen, primarily for use in my recording studio.
    I have tried on three different occasions, to install an external midi interface (MOTU Micro Lite), but it is not recognised in the system's midi set up window, and it's driving me nuts! Can someone please tell me what to do. Maybe I'm not getting some things properly set up before the installation.
    The MOTU installation cd says it is compatible with Mac OS X operating system.

    You've posted in the OS X Server area.
    Any special reason why ?
    Did you install the software for the device ? :P
    http://www.motu.com/products/midi/lite/body.html/view?searchterm=mac%20AND%20OS%20AND%20X
    Your next stop should be (and should have been)
    http://www.motu.com/techsupport

  • External MIDI useless: MIDI timing in sync with audio, is it possible?

    Using a UAD-1, latency is a big issue here. Is it possible to have external MIDI timing in sync when you use groups? With full PDC, MIDI timing is useless; with PDC on tracks & instruments it works until you put a UAD plugin on a bus and compensate for it with track advance to get the audio in sync again - and now external MIDI is way early. This means that with any plugin, external MIDI timing will be compromised!
    Can anyone tell me if there is a way to sync midi and audio and still use groups?
    Without a workaround, Logic Pro is useless. What a mess: instruments dropping out, serious MIDI timing issues, AU validation issues... I'm getting a bit desperate here.
    regards
    Budy

    yes, i know - that's been discussed on this forum though it is not something i feel strongly conversant, only to say that i believe there are ways round the problem.
    the site is www.logicprohelp.com
    and if they can't help you then no-one can.
    If it were me - and it has been, when remixing older cues - I have simply put a delay (or rather a negative delay, which brings it forward) on the MIDI tracks.
    Maybe this is not the best method, but you could record a MIDI-generated click (i.e. a click track with your external module as the source), then loop it in playback against Klopfgeist, adjusting the delay setting until it is tight. This may not offer sample-accurate precision, but MIDI is not capable of sample-accurate precision anyway.
    There are people out there who have this sorted, and my method worked fine for me, but if you do electronic music such as house or whatever, then you may be concerned with really, really tight timing, in which case you might be looking for the "recognized" solution.
    Have a real trawl through this forum - it's been discussed.
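    The click-against-Klopfgeist idea can also be done by measuring: record both clicks and find the lag with a brute-force cross-correlation. This is a toy sketch with synthetic one-sample clicks, not studio code:

```python
SAMPLE_RATE = 44100

def estimate_offset(reference, recorded, max_lag):
    """Brute-force cross-correlation: the lag (in samples) at which the
    recorded click best lines up with the reference click."""
    best_lag, best_score = 0, float('-inf')
    for lag in range(-max_lag, max_lag + 1):
        score = sum(reference[i] * recorded[i + lag]
                    for i in range(len(reference))
                    if 0 <= i + lag < len(recorded))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Toy signals: a click at sample 100 in the reference (Klopfgeist),
# and the external module's click arriving 50 samples early (sample 50).
reference = [0.0] * 1000
recorded = [0.0] * 1000
reference[100] = 1.0
recorded[50] = 1.0

lag = estimate_offset(reference, recorded, max_lag=200)  # -50 samples
delay_ms = lag / SAMPLE_RATE * 1000  # roughly -1.1 ms at 44.1 kHz
```

    A negative lag here means the module plays early, so the measured value (with the sign flipped as your DAW's delay field expects) is the track delay to dial in.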

  • Monitoring 2 External Midi devices at playback

    Hi, I'm a "Newbie." I have a Mac G5, Logic Express 7.1, a MOTU 828 (first generation), and 2 external MIDI devices (Triton Extr and a Roland JV 880). I can only monitor one of the devices at a time at playback - whichever channel I have chosen in the MOTU preferences.
    I know it was a challenge to set up Cubase to monitor both at the same time.
    Is it possible in Logic Express?
    Thanks for any help that might be had out there!
    Borgy

    Couple of things I don't understand about your question.
    Are you running Vienna Ensemble on a separate computer? If not, why not use it as an AU plugin? If you have it installed on the same computer as Logic there would be no need to treat it as an external instrument. If you are running it on a separate machine double check the audio and MIDI hardware connections, drivers for both interfaces, Apple Audio/Midi Utility settings, environment, and finally the settings on your ext instrument (in the track.)
    You mention being unable to hear VE when bouncing, but then you also mention that there is no signal from those tracks showing on the master fader. Is this specifically a problem when bouncing, or are you basically unable to hear and see signal from your VE tracks while recording, playing, mixing?
    If VE runs smoothly from another computer (presumably to ease the burden on your main cpu) and you are able to hear it and see it at the master outputs just as you expect, and you ONLY have a problem when bouncing, then your solution is simple. Make sure that you bounce in "realtime" mode when including external instruments or fx. "Offline" mode will not include them.
    Sorry if I'm covering stuff you know and have already tried; hopefully some portion of this info will help you.
