Best approach to take for TWEAKING (automating) backing audio on the fly?

I want to be able to turn a knob every now and again and have it tweak an aspect (FX or whatever) of an element of my backing audio. My dilemma is that I know the Playback and Loopback plug-ins are CPU monsters, and I'm afraid to have multiple Loopbacks or Playbacks running in the background so that I can tweak them.
Is either the Loopback or the Playback plug-in less CPU intensive to where I could have multiple instances running in the background?
Having just one Playback plug-in running in the background isn't exactly ideal for tweaking (considering there will be a lot of elements stacked into that playback audio, i.e. drums, bass, chords or whatever). I'd like to be able to tweak JUST the backing drums or JUST the backing bass. I realize this would take MULTIPLE Loopbacks or Playbacks.
Could anyone impart some words of wisdom? It's always greatly appreciated!

Hi
I'm not altogether sure Playback is a CPU monster. I've been using up to five instances of it per set (song), tweaking each one, and the overall mix between them, on the fly, exactly as you describe, and I've not noticed any related performance problems in over a year of regular gigging. And if you check out my spec, you'll see I have a very basic MacBook.
The only hit I noticed was that I originally put all my sets (songs) into one big concert, and that did hit my RAM, making the response to the transport buttons lag. Once I split them up into separate concerts, each containing about 10-12 sets, that problem disappeared.
Obviously, to minimise the CPU hit, I set up aux channels for reverb, compression etc., as is standard practice.

Similar Messages

  • HT4796 any idea how long it takes for migrate process. Is it the best way to transfer old computer files to your mac?

Any idea how long the migration process takes to complete? Is it the best way to transfer files from your PC to your Mac?

    If you are using Migration Assistant with a WiFi connection expect 12+ hours, 24 hours is possible.
    If you are using hardwired network connections expect 4 to 8 hours.

  • What is the best approach to take daily backup of application from CQ5 Server ?

    Hello,
How do I maintain a daily backup of the data from the server?
What is the best approach?
    Regards,
    Satish Sapate.

The link shared by Ryan should give enough information.
In case you are backing up a large repository: the Data Store holds large binaries, which are stored only once. To reduce the backup time, remove the datastore from the backup by following [1] (CQ 5.3 example).
    [1] In order to remove the datastore from the backup you will need to do the following:
    Assuming your repository is under /website/crx/repository and you want to move your datastore to /website/crx/datastore
        stop the crx instance
        mv /website/crx/repository/shared/repository/datastore /website/crx/
        Then modify repository.xml by adding the new path configuration to the DataStore element.
    Before:
    <DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
    <param name="minRecordLength" value="4096"/>
    </DataStore>
    After:
    <DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
    <param name="path" value="/website/crx/datastore"/>
    <param name="minRecordLength" value="4096"/>
    </DataStore>
After doing this you can safely run separate backups of the datastore while the system is running, without affecting performance very much.
Following our example, you could use rsync to back up the datastore:
rsync -av --ignore-existing /website/crx/datastore /website/backup/datastore

  • HT1766 How long should it take for a icloud back up of about 6.4GB?  It has been running all night and not complete.

How long should it take for an iCloud backup of about 6.4GB? I selected a manual backup and it has been running all night. Thanks

That's going to depend a lot on your internet connection speed. 6.4 GB is a lot of data, especially if your connection speed is slow (DSL) or low-end cable speeds of ~1.5 Mbps.

  • What Best Approach Tuxedo offers for Integrating Multiple Apps

    Hi Tuxedo Experts,
I have a question about a scenario where I am planning to integrate multiple applications through Oracle Tuxedo. I will not be using it as a Transaction Processing System. Rather, I am trying to use it as an integration layer (somewhat like an ESB). I have a couple of enterprise applications (Siebel & SAP), some telco systems that only accept MML commands on a given port, and some custom-built applications. I have to design and implement the integration layer for this stack of applications.
As far as I have understood Oracle Tuxedo, I will have to develop a Tuxedo client for each application and then make the clients do all integration-related work (like an application adapter/wrapper). I am not thinking about working on SALT to expose web services through Tuxedo, since web services can be created and exposed more easily without Tuxedo. So, I guess I have to do one client per application. The first thing I wanted to confirm was: am I on the right track considering my scenario? Or can Oracle Tuxedo offer a better approach?
Since I will be developing clients for each application that I integrate, would it be good if I skip the server altogether and follow queue-based communication (the Peer-to-Peer Asynchronous Messaging Model)? Or will it cause any implications?
Any guidance/tips are highly appreciated.
    Thanks & Regards,
    Ahsan Asghar

    Hi Ahsan,
    Many customers use Tuxedo as a form of ESB. Partly due to its performance and scalability, but also partly due to its connectivity options. Along with SALT, Tuxedo provides connectivity to:
    JEE via WebLogic Tuxedo Connector in WebLogic Server, or the Tuxedo JCA Adapter
    Mainframes using the Tuxedo mainframe adapters (SNA and TCP for IBM mainframes and OSI-TP for Bull GCOS and the like)
    IBM Websphere MQ adapter
The above are all transparent to Tuxedo clients and servers, i.e., using the SNA mainframe adapter, CICS applications see Tuxedo services as a DPL or APPC remote transaction, and Tuxedo sees those CICS resources as Tuxedo services.
As for writing clients, etc., the real question is where you want the adaptation done and how the systems needing integration can communicate with Tuxedo. For example, as I mentioned in another thread, any service advertised in Tuxedo can be accessed as a SOAP/HTTP(S) web service. Siebel allows calling external web services as business services. SAP allows calling into it via Remote Function Calls (RFCs). So you could expose an SAP function module as a Tuxedo service by writing a trivial Tuxedo server that uses RFC to call SAP. Then expose that Tuxedo service as a web service and call it from Siebel. The real issue becomes whether the SAP function module has the kind of interface you want or need. If not, you could potentially alter the interface to the SAP function module by changing the simple Tuxedo server rather than trying to change SAP.
    Using the above technique, you only need one mechanism in Siebel to interact with any other system, as the mechanics of the communication are hidden. So the same technique (client as you were referring to it I believe) can be used to communicate with SAP and any other external system. For instance, let's say you have an external system you want to call from Siebel and the external system uses IBM WebSphere MQ Series in an asynchronous fashion. You could probably directly build an MQ client in Siebel to communicate with that remote system, but then your Siebel code has explicit knowledge of that remote system and how to communicate with it. If you already have a means of communicating with Tuxedo, you can use the same mechanism to communicate with the external system again by writing a simple Tuxedo server that accepts the request from Siebel and then enqueues the request (with perhaps some transformation first) to MQ, and then waits for a reply on the appropriate MQ reply queue. Thus Siebel sees a synchronous interaction even though the system it is communicating with is operating asynchronously.
    The primary advantage of the above technique is you only have to develop one form of communication for each system, instead of a point to point mess you might otherwise end up with.
Of course the above can also be done with something like Oracle Service Bus, but it sounded as though you had specific reasons for using Tuxedo. OSB might be especially appealing if most of the systems to integrate already use XML, as OSB is very good at performing XML transformations, something Tuxedo isn't especially strong at.
    Regards,
    Todd Little
    Oracle Tuxedo Chief Architect

  • How long does it take for an iPad to arrive in the Chicago area?

    My sister ordered the iPad around May 31 I wanted to know around what time should the iPad arrive.. thanks!

For accurate information, she should call whatever seller she ordered it from. Anything anyone here could post would be just a guess, which may be far from reality.
    Regards.

  • Best way to create and keep track of objects on the fly?

    I'm making a script interpreter and am having some trouble with variables. Basically I have a class Variable. Whenever my interpreter finds the right code it should create a new Variable object with two parameters, the value and the name. Later these Variables have to be accessed of course. I thought of using an array instead but then the number of variables is limited.
    So what is the best way to keep a sort of database for Variables? I thought a linkedlist might be good here but maybe there's an easier option?
    Thanks in advance!

    I think this is exactly what I need. Thanks for the reply!
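The reply the poster found helpful almost certainly pointed at a map keyed by variable name: unlike an array it has no fixed capacity, and unlike a linked list it gives constant-time lookup by name. A minimal sketch in Java (the `VariableTable` class and its method names are illustrative, not from the thread):

```java
import java.util.HashMap;
import java.util.Map;

// A named variable with a value, as described in the question.
class Variable {
    final String name;
    Object value;

    Variable(String name, Object value) {
        this.name = name;
        this.value = value;
    }
}

// Interpreter-side store: create variables on the fly, look them up by name later.
class VariableTable {
    private final Map<String, Variable> vars = new HashMap<>();

    void define(String name, Object value) {
        vars.put(name, new Variable(name, value));
    }

    Variable lookup(String name) {
        return vars.get(name); // null if the script never defined it
    }
}
```

A HashMap grows as needed, so the fixed-size limitation of an array disappears; a LinkedList would also grow, but every lookup would then be a linear scan over all variables.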

  • Best Approach for Sharepoint/Office 365 Solution?

    Hello there,
I have been asked to develop a new solution involving SharePoint Online. I know a bit about SharePoint but have not done any development with it as of yet. I need to understand the best approach to take (e.g. WebPart, separate Web Application, something else?)
In my opinion, the "application" needs are a bit complex. The solution must be hosted in the Cloud (most likely on Azure, or within SharePoint if possible).
    It needs to meet the following objectives:
    Consume data from a SQL Server database (proprietary, no api available), on premises
    Manipulate/Read SharePoint Lists
    Launch Workflows
    Provide seamless authentication (Users log into the SharePoint online service, must not have to log into a separate app)
    Support additional database functionality (e.g. I will have my own tables, sps in SQL, that will be needed) and a UI for users to interact with the custom tables.
    Some concerns I have from reading articles is that some functionality is restricted using the SharePoint online platform.
    Any input on a viable approach would be greatly appreciated. Thank you!
    Cory

You will want to implement a provider-hosted app. A provider-hosted app gives you the ability to host your app on a trusted server where you can have your own database. You can also communicate back into SharePoint using CSOM to manipulate lists and launch workflows. The authentication will flow through from SharePoint Online to your app using OAuth. Some links to help you below:
    http://msdn.microsoft.com/en-us/library/office/fp179887(v=office.15).aspx#SelfHosted
    http://msdn.microsoft.com/en-us/library/office/fp142381(v=office.15).aspx
    Blog | SharePoint Field Notes Dev Tools |
    SPFastDeploy | SPRemoteAPIExplorer

  • Best Approach to create Security / Authorization Schema for an APEX Apps

    Hi,
    I am planning to create a Security / Authorization Schema for an APEX Application.
Just want to know what the best approach is to create the security feature in APEX, so that it can be re-used in other APEX Applications too.
    I am looking for following features...
    1. users LOGIN and then user's name is stored in APEX_USER...
    2. Based on the user, I want to restrict the Application on following levels.
    - TABS
- TABS - Page1 (Report)
    - Page2 (Form)
    - Page2 (Region1)
    - Page2 (Region1, Button1)
    - Page2 (Region1, Items,....)
    AND so on.....basically depending on user....he will have access to certain TABS, Pages, Regions, Buttons, Items...
    I know, we have to create the Authorization Schema for this and then attach these Authorization Schema to the different Level we want.
My question is, what should the TABLE structure be to capture this info for each user... where we will say... this USER will have the following access... AND then we create the Authorization Schema from this table...
Also, what should the FRONT end be, where we enter these details...
SO, wondering... a lot of people may already have implemented this feature... so if you guys can provide the BEST approach (re-usable for other APEX Applications)... that will be really nice.
    Thanks,
    Deepak

    Hi Raghu,
thanks for the detailed info.
so that means... I should have 2 tables...
master table (2 columns - username, password)
        username    password
        user1       xxxx
        user2       xxxx
2nd table (2 columns - username, chq_disp_option)
    - In this table, we don't have Y/N Flag you mentioned..
- Do we have to enter all the regions/tabs/pages in the Application here, or just those regions/tabs/pages which are conditionally displayed?
- So that means on all the Pages/Regions/Tabs/Items in the entire Application, we have to apply the conditional display...
    - suppose we have 3 tabs, 5 pages, 6 regions, 15 items..that means in this table we have to enter (3+5+6+15) = 29 records for each individual users..
              username    chq_disp_option
       user1       re_region1
       user1       re_region2
       user1       tb_main
       user1       Page1
       user1       Page5
   ----        ----
- How are you defining the unique name for Regions... I mean the static ID or the Title?
- Is the unique name for a tab & item the same as the Tab Name (T_HOME) & Item Name (P1_ITEM1), or are you defining it somewhere else?
    Thanks,
    Deepak

  • What approach do i take for implementing this messaging

    Hello friends,
I am new to JMS.
I have a scenario and am not able to decide the approach to take for it. Please do read it, as many of you might already have implemented it.
I have to design a system where there will be two JMS systems running on two different machines, so for both systems I will have to define a sender and a receiver.
Now the approach would be something like: if one of the JMS systems sends some messages, let's say in XML format, then the other JMS system should be able to immediately receive that message and operate on it, and vice versa.
So my problems and confusions are:
1) What type of messaging to use: point-to-point or publish-subscribe? I mean, if the other JMS system sends me a message, then after two seconds sends another message, and my JMS system listens to it only a minute later, it should still be able to get both messages and operate on them.
2) Writing the sender is fine, because that will be based on some action (if point 1 is clear, i.e. where to register the message queue or topic).
However, I am confused about writing the receiver. How do I continuously listen to JMS so that I can read whether some message has come and work on that message? And if two messages, or say n messages, have come, then read those messages and operate on them.
So the important thing here is that my receiver is continuously listening on both JMS systems for messages.
Please do help me with this. I don't want code, just an approach that solves all this.
    Thanks in advance.

    Cross-post:
    http://forum.java.sun.com/thread.jspa?threadID=683144
oh come on rene, don't interfere, no one has replied in the JMS forum so I posted here. if you can help then please help
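Even though the poster asked for an approach rather than code, the receiver half of the question has a standard shape worth sketching: with point-to-point messaging, a queue retains messages sent while the consumer was offline, and registering a MessageListener makes the provider push each message as it arrives, so no polling loop is needed. A minimal sketch against the javax.jms API (the JNDI names are provider-specific placeholders, and a running JMS provider is assumed on the classpath):

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class XmlMessageReceiver {
    public static void main(String[] args) throws Exception {
        // JNDI names below are illustrative; they depend on your provider's setup.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/IncomingQueue");

        Connection conn = factory.createConnection();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(queue);

        // The provider invokes this callback for every message, including
        // any that were enqueued while this receiver was not running.
        consumer.setMessageListener(message -> {
            if (message instanceof TextMessage) {
                try {
                    String xml = ((TextMessage) message).getText();
                    // ... parse the XML payload and act on it ...
                } catch (JMSException e) {
                    e.printStackTrace();
                }
            }
        });

        conn.start(); // begin delivery
    }
}
```

Point-to-point (a queue per direction) fits the scenario described: each message should be consumed exactly once by the other side, and delayed consumers still see everything that was sent. Publish-subscribe would only be needed if several independent receivers must all see each message.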

  • What is best approach to report building in my case?

    Hi all,
    I'm just getting started with Crystal Reports for our Swing-based desktop application.  We need the ability to generate PDF and XLS reports, perhaps later adding web-based dashboarding and interactive reports.  I'm trying to determine the best approach to take with Crystal Reports to fit our application's data.
    Our app stores results in a separate database (either Oracle, SQLServer, or Apache Derby).  The result records contain lots of ID lookups to tables in another database.  This makes using straight SQL for reporting difficult as I would like to avoid cross-database queries.  So I'm thinking of using the POJO reporting approach where our app gathers the results, generates POJOs, and then passes them to the report.
My concern with this POJO approach is that it seems to require loading all results into memory and generating the report in one big step. I've read other posts referring to heap issues. Is there a way to avoid this? Somehow page through report data?
I've also read that Crystal Reports can work with any data provider that implements ResultSet. Is this true? If so, could I create my own custom ResultSet implementation that would let me page through my results without loading everything into memory at once? If possible, please point me to the documentation for this approach. I haven't been able to find any examples.
    If there is a better approach that I haven't mentioned, please let me know. 
    Thanks in advance,
    Guy

The first option is the best one for performance. The only time you should use result sets is when you need to do runtime manipulation of the data through your application that is not achievable in a stored procedure.
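Whatever data source Crystal ends up consuming, the heap concern on the application side can be addressed by fetching rows in fixed-size pages rather than materializing everything at once. Whether the Crystal SDK accepts such a streaming source is a separate question; the paging pattern itself is generic. A sketch (the `PagedResults` class, page size, and fetch callback are illustrative, not part of any Crystal API):

```java
import java.util.Iterator;
import java.util.List;
import java.util.function.BiFunction;

// Streams a large result set in fixed-size pages so that only one
// page of POJOs is in memory at a time.
class PagedResults<T> implements Iterable<T> {
    private final BiFunction<Integer, Integer, List<T>> fetchPage; // (offset, limit) -> rows
    private final int pageSize;

    PagedResults(BiFunction<Integer, Integer, List<T>> fetchPage, int pageSize) {
        this.fetchPage = fetchPage;
        this.pageSize = pageSize;
    }

    @Override
    public Iterator<T> iterator() {
        return new Iterator<T>() {
            private List<T> page = fetchPage.apply(0, pageSize);
            private int offset = 0, i = 0;

            @Override
            public boolean hasNext() {
                if (i < page.size()) return true;
                if (page.size() < pageSize) return false; // last page was short: done
                offset += pageSize;                       // fetch the next page lazily
                page = fetchPage.apply(offset, pageSize);
                i = 0;
                return !page.isEmpty();
            }

            @Override
            public T next() { return page.get(i++); }
        };
    }
}
```

The fetch callback would be wired to a parameterized query (OFFSET/FETCH on SQL Server, ROWNUM windows on Oracle), so the ID lookups against the second database can also be resolved one page at a time instead of all up front.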

  • What steps do I need to take for the iPod mini to remember the last song?

What steps do I have to take for my iPod mini to remember the last song that I played from an album when I turn it on again, after I have turned it off?
Whenever I turn my iPod mini off, it seems to go back to the main menu.
    Any help would be greatly appreciated.

    Depends on how long you are leaving it turned off. If it's more than enough time for the iPod to go into deep sleep, then when next you turn it on, it restarts with the Apple logo and returns to the main menu.
    If it's only for a few hours that you leave it turned off, then the iPod should resume from the point at which you left it.
    I don't know for sure whether the "remember playback position" in iTunes works after the iPod has been in deep sleep.

  • Best External Hard Drive for Carbon Copy Cloner?

    I'm about to do a clean install of Leopard on my *Mac Pro* but I want to back up/clone my current system (10.4.11) to an *external hard drive* first. Based on the recommendations of similar users (Logic Studio) -- a clean install is preferable to an upgrade from Tiger to Leopard. I plan to use *Carbon Copy Cloner* to clone my current system but am unclear as to whether I can use an *external USB drive* versus an external Firewire drive.
    Some people claim that firewire is much faster and more reliable for transferring/backing up data, and more importantly, that it's the only type of external hard drive one can boot from -- but I'm not sure that's true for these Intel-based Macs.
Also, I currently use an *Apogee Ensemble* audio interface which does not like sharing the FireWire bus with other peripherals. I've also heard references to an external hard drive chipset called the 911+ as being important, but I think this may only be in regard to FireWire.
    Bottom line, _can I use an external USB drive with Carbon Copy Cloner on an Intel-based Mac?_
    I'm looking for something around 500 Gigs that can sit on my desk with a minimal footprint. I'd like to partition it into three volumes -- the first for the Tiger clone; the second for weekly Leopard back-up; and the third for sound library storage. I'll likely need to access the third volume in realtime from *Logic Pro 8* (audio application) so the HD should be pretty fast. Recommendations?

    So Kappy & Hatter...
    Allow me to spell this out in practical terms. I'm installing Logic Pro, an audio production application. The consensus is that best results are achieved by installing the application on the internal system drive in bay #1 on the Mac Pro.
    Most professional users of Logic Pro recommend a second internal drive in bay #2 solely for audio recording even though the app runs on the drive in bay #1. The drive in bay #2 is often a Raptor because of the 10,000 rpm spin rate, beneficial for the recording process.
    For sound libraries with large audio samples/loops, often streamed to the app -- a third drive is recommended to store the sound library. It can be internal or external. Many people use external drives for this function on a firewire 800 bus. The emphasis for this drive would be to read quickly and stream the samples quickly. I would assume that a 32MB buffer cache versus 16MB would be beneficial. I was looking for hi-speed if I went the external route -- hence my foray into the realm of eSATA buses, but perhaps I should just get an internal drive for this as I do have drive bays available in the Mac Pro.
Can you recommend drive specs, or a particular hard drive, for this task of storing and streaming the sound library?

  • Any recommendations for an automated backup application?

    Hi,
    A company that I work for is looking for an automated back up solution.
    Currently, they have Intel iMacs running 10.4.10. They want to have automated backups to their Windows server. The protocol used is Samba, if I am not mistaken. They do not want to back up their entire hard drives, only certain files.
    The dot mac Backup application is not what they want.
    I am not familiar with automated back up solutions in an office environment.
    Any recommendations are appreciated!

    I'm not sure if they want to spend the money on an Enterprise level solution, but if they do then there's one outstanding product: BRU.
    If they are looking for something mainly client oriented but that can backup to a network device then my choice is Synchronize! Pro X because it provides an almost complete backup solution. About the only thing it cannot do is backup across multiple optical media. It supports backup to network devices and will even mount a network device if required. It also supports a fairly easy method of creating backup sets consisting of selected files and/or folders.

  • Looking for best application and stylus for iPad

    I am looking to download an application for an ipad to use with a stylus, but don't know which application or stylus is best.  Please share your opinion on the best application and stylus for taking handwritten notes.

Procreate is the only good reason to own an iPad; it has support for the top three styluses. However, iOS 7 has broken the Bluetooth bridge that Pogo Connect, Wacom and Jot need to give pressure-sensitive support. Procreate hardly works now... except if you want to finger paint.
    But when it is fixed I would suggest buying the Pogo Connect and a set of nibs and if I had only one App to use it with I'd use Procreate. It's vastly superior to many desktop Apps too.
    However, Apple relying on third party Stylus support only makes my MS Surface Pro look like where I'll be spending my cash on the next upgrade cycle. Why Apple has not come out with an iPad Pro is a stunning insult.
