Best way to quantize a sloppy live drum recording

I'm actually trying to quantize a guitar part, but for the sake of simplicity, let's say it's a sloppily played drum loop. My guess would be to use either Groove Machine or Quantize Engine, but neither of them changed anything; or perhaps I'm doing it wrong? I basically double-click the audio region I want to fix, select Quantize Engine, and hit "Process", but there doesn't seem to be any change in the original audio file. Do I need to slice the audio file into the individual hits first, or is Logic supposed to detect the start point of each drum hit automatically?
Am I doing this right, or is there a better way to quantize audio?

I haven't tried Flex Mode yet, but wouldn't that be a bit tedious if there are constant 16th notes throughout the whole song? I would imagine there is a way for Logic to detect the individual hits in the recording and then quantize them to the tempo...

Similar Messages

  • What's the best way to delete 2.4 million records from a table?

    We have two tables: a production table and a temp table whose data we want to insert into the production table. The temp table has 2.5 million records, while the production table has billions of records. What we want to do is simple: delete the records that already exist in the production table and then insert the remaining records from the temp table into the production table.
    Can anyone guide me on the best way to do this?
    Thanks,
    Waheed.

    Waheed Azhar wrote:
    The production table is live and data is appended to it on a random basis. If I insert data from the temp table into the prod table, a PK violation exception occurs because a record already exists in the prod table that we are going to insert from temp.
    If you really just want to insert the records and don't want to update the matching ones, and you're already on 10g, you could use the "DML error logging" facility of the INSERT command, which would log all failed records but succeed for the remaining ones.
    You can create a suitable exception table using the DBMS_ERRLOG.CREATE_ERROR_LOG procedure and then use the "LOG ERRORS INTO" clause of the INSERT command. Note that you can't use the "direct-path" insert mode (APPEND hint) if you expect to encounter UNIQUE CONSTRAINT violations, because these can't be logged and would cause the direct-path insert to fail. Since this is a "live" table you probably don't want to use the direct-path insert anyway.
    See the manuals for more information: http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_9014.htm#BGBEIACB
    Sample taken from 10g manuals:
    CREATE TABLE raises (emp_id NUMBER, sal NUMBER
       CONSTRAINT check_sal CHECK(sal > 8000));
    EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('raises', 'errlog');
    INSERT INTO raises
       SELECT employee_id, salary*1.1 FROM employees
       WHERE commission_pct > .2
       LOG ERRORS INTO errlog ('my_bad') REJECT LIMIT 10;
    SELECT ORA_ERR_MESG$, ORA_ERR_TAG$, emp_id, sal FROM errlog;

    ORA_ERR_MESG$                        ORA_ERR_TAG$   EMP_ID   SAL
    ORA-02290: check constraint          my_bad         161      7700
    (HR.SYS_C004266) violated

    If the number of rows in the temp table is not too large and you have a suitable index on the large table for the lookup, you could also try a NOT EXISTS clause in the insert command:
    INSERT INTO <large_table>
    SELECT ...
    FROM temp A
    WHERE NOT EXISTS (
        SELECT NULL
        FROM <large_table> B
        WHERE B.<lookup> = A.<key>
    );

    But you need to check the execution plan, because a hash join using a full table scan on the <large_table> is probably something you want to avoid.
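    One way to check the plan before running the statement is EXPLAIN PLAN plus DBMS_XPLAN. This is only a sketch: large_table, temp and the join columns are placeholders standing in for the real names from the template above.

    ```sql
    -- Sketch: inspect the plan without executing the insert.
    -- large_table / temp / lookup_col / key_col are hypothetical names.
    EXPLAIN PLAN FOR
    INSERT INTO large_table
    SELECT *
    FROM   temp a
    WHERE  NOT EXISTS (SELECT NULL
                       FROM   large_table b
                       WHERE  b.lookup_col = a.key_col);

    -- Show the plan just explained (reads the default PLAN_TABLE)
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    ```

    What you want to see for the NOT EXISTS lookup is an index access on large_table (e.g. an anti-join driven by the index), not a full table scan.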
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • Which is the Best way to upload BP for 3+ million records??

    Hello Gurus,
    we have 3+ million records of data to be uploaded into CRM, coming from Informatica. Which is the best way to upload the data into CRM that takes the least time and is easiest? Please help me.
    Thanks,
    Naresh.

    Do it with BAPI BAPI_BUPA_FS_CREATE_FROM_DATA2.

  • Which is the best way for posting a large number of records?

    I have around 12000 register to commit to the database.
    Which is the best way for doing it?
    What depends on ?
    Nowadays I can't commit such a large number of register... the database seems to hang!!!
    Thanks in advance

    Xavi wrote:
    Nowadays I can't commit such a large number of register
    It should be possible to insert tens of thousands of rows in a few seconds using an insert statement, even with a complex query such as the all_objects view, and commit at the end.
    SQL> create table t as select * from all_objects where 0 = 1;
    Table created.
    Elapsed: 00:00:00.03
    SQL> insert into t select * from all_objects;
    32151 rows created.
    Elapsed: 00:00:09.01
    SQL> commit;
    Commit complete.
    Elapsed: 00:00:00.00
    I meant RECORDS instead of REGISTERS.
    Maybe that is where you are going wrong: records are for putting on turntables.

  • Best way to build an web-based voice record player

    Could anyone please guide me on what technology to use to build an online voice recorder that would record a user's voice using a Flash player on a website, with the voice then saved on the server side? What would be the best way to build something like that?
    Thanks,
    Ket

    Use Flash or Flex for the client side, and Flash Media
    Interactive Server on the server side.

  • Best way to load a bunch of drum sounds?

    Hey, all - In the course of getting rid of some older equipment, I've preserved a bunch of drum sounds that I want to continue using; I've recorded them and edited them into separate audio files (about 250). I'll be triggering them from a percussion controller (TriggerFinger).
    My question is, am I better off loading them into EXS24 or Ultrabeat? It seems like each has pros and cons - Ultrabeat makes it easier to futz with the samples, and to create little beat patterns, but EXS24 seems more flexible in terms of arranging, layering, and (this may be a big factor) importing many files at once.
    Anyone have suggestions on reasons to go one way or the other? Many thanks in advance!

    I think you have answered your own question, to be honest; you have to weigh your desired outcome against the features and benefits of each unit. Speaking personally, however, I have often used the EXS24 for layered drums, as it gives me a greater degree of control between the various groups and also makes it easier to layer more than one sample at a time.
    I often use Ultrabeat when I want more of a drum machine type of feel, where the samples are used to convey that type of feel.
    However, that's me (and others will probably have different ideas).
    jord

  • What is the best way to process a dataset of 2000 records...

    I have a problem where I get a collection of 2000 records. Using these rows I need to look for matches amongst existing rows in a table. Currently the procedure takes each row and scans the table for a match. Is there a way I can scan the table once for all possible matches?
    Thanks

    Assuming you can't retrieve the 2000 rows in one SQL statement another approach might be to create an object collection and cast this to a table in a subsequent SQL statement.
    For example
    CREATE TABLE test (abc NUMBER, def NUMBER, ghi NUMBER);
    INSERT INTO test VALUES (1,2,3);
    CREATE TYPE test_typ AS OBJECT (abc NUMBER, def NUMBER, ghi NUMBER);
    CREATE TYPE test_coll_typ AS TABLE OF test_typ;
    SET SERVEROUTPUT ON
    DECLARE
        coll test_coll_typ := test_coll_typ();
        CURSOR cur_dupes IS
            SELECT abc, def, ghi
            FROM test
            INTERSECT
            SELECT abc, def, ghi
            FROM TABLE(CAST(coll AS test_coll_typ));
    BEGIN
        -- Create some rows in our collection
        coll.EXTEND(3);
        coll(1) := test_typ(2,3,4);
        coll(2) := test_typ(1,2,3);
        coll(3) := test_typ(3,4,5);
        -- Output the duplicates in table "test"
        FOR rec_dupes IN cur_dupes LOOP
            DBMS_OUTPUT.PUT_LINE(rec_dupes.abc||' '||rec_dupes.def||' '||rec_dupes.ghi);
        END LOOP;
    END;
    /
    The disadvantage is that you now have two more objects in your schema. This might not be a problem if they're there for other reasons too, but it is a bit of overkill perhaps if this is their sole reason for being.

  • Best way to update 8 out of 10 million records

    Hi friends,
    I want to update 8 million records of a table which has 10 million records. What could be the best strategy if the table has a BLOB column with 600GB worth of data? The BLOB itself is 550GB. I am not updating the BLOB column.
    Usually with non-BLOB data I have tried the "CREATE TABLE new_table AS SELECT <do the update here> FROM old_table;" method.
    How should I approach this one?

    @Mark D Powell
    To give you some background, my client faced this problem a week ago. It is part of a daily cleanup activity.
    Right now I don't have access to the system due to a security issue; I could only take a few AWR reports and stats while the access window was open. So basically, next time I get access I want to close the issue once and for all.
    Coming to your questions:
    So what is wrong with just issuing an update to update all 8 Million rows?
    In a previous run, a single update with a full table scan in the plan and no parallel degree started reading from UNDO (current_obj#=-1 on the "db file sequential read" wait event) and errored out after 24 hours with a tablespace-full error on the tablespace which contains the BLOB data (a separate tablespace).
    To add to the problem, the redo log files were undersized, only about 50MB.
    The wait events (from DBA_HIST_ACTIVE_SESS_HISTORY) for the problematic SQL ID show:
    - log file switch (checkpoint incomplete) and log file switch completion comprising 62% of the wait events
    - CPU 29%
    - db file sequential read 6%
    - direct path read 2%, with others contributing a little
    30% of the "db file sequential read" samples had current_obj#=-1 and p1 showing the undo file id.
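    A breakdown like the one above can be pulled from the AWR sampled-session history with a query along these lines. This is a sketch: the sql_id literal is a placeholder, and querying DBA_HIST_ACTIVE_SESS_HISTORY requires the Diagnostics Pack license.

    ```sql
    -- Sketch: distribution of sampled wait events for one statement.
    -- 'abcd1234efgh5' is a hypothetical sql_id; CPU samples have a NULL event.
    SELECT NVL(event, 'ON CPU')                              AS event,
           COUNT(*)                                          AS samples,
           ROUND(100 * RATIO_TO_REPORT(COUNT(*)) OVER (), 1) AS pct
    FROM   dba_hist_active_sess_history
    WHERE  sql_id = 'abcd1234efgh5'
    GROUP  BY NVL(event, 'ON CPU')
    ORDER  BY samples DESC;
    ```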
    Is there any concurrent DML against this table? If not, the parallel DML would be an option though it may not really be needed. 
    I think there was in the previous run, and I have asked for it to be avoided in the next run.
    How large are the base table rows?
    AVG_ROW_LEN is 227
    How many indexes are effected by the update if any?
    The last column of the primary key is the only column to be updated (I mean, used in the "SET" clause of the update).
    Do you expect the update will cause any row migration?
    Yes, I think so, because the only column being updated is the same column on which the table is partitioned.
    Now if there is a lot of concurrent DML on the table you probably want to use pl/sql so you can loop through the data issuing a commit every N rows so as to not lock other concurrent sessions out of the table for too long a period of time.  This may well depend on if you can write a driving cursor that can be restarted in the event of interruption and would skip over rows that have already been updated.  If not you might want to use a driving table to control the processing.
    Right now, to avoid the UNDO issue, I have suggested the PL/SQL approach and have asked for the redo logs to be sized at least 10 times larger.
    My big question after seeing the wait event profile for the session is:
    Which was the main issue here, the redo log size or the reads from UNDO that hit the update statement? The buffer gets had shot up to 600 million, and there are only 220k blocks in the table.
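    The commit-every-N-rows idea described above can be sketched roughly as follows. All names are hypothetical; the key point is that the WHERE clause identifies only not-yet-updated rows, so the loop is restartable after an interruption, and since the updated column is the partition key the table needs row movement enabled for rows to migrate between partitions.

    ```sql
    -- Sketch of a restartable chunked update (hypothetical names throughout).
    ALTER TABLE big_table ENABLE ROW MOVEMENT;  -- rows will change partition

    BEGIN
        LOOP
            UPDATE big_table
            SET    part_key_col = :new_value     -- the column in the SET clause
            WHERE  part_key_col != :new_value    -- skips rows already done
            AND    ROWNUM <= 10000;              -- batch of N rows
            EXIT WHEN SQL%ROWCOUNT = 0;          -- nothing left to update
            COMMIT;                              -- bound undo/redo per batch
        END LOOP;
    END;
    /
    ```

    This assumes the column is NOT NULL (it is part of the primary key here); if the new value were computed per row rather than a single bind, a driving table or bulk-collect cursor would be needed instead.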

  • Best way to extract data from 2 years records/Numbers files for tax return?

    Hi
    I want to find a way of generating my annual tax returns from my financial records, which are stored in a separate Numbers file for each calendar year.
    As the tax year in the UK is different from the calendar year (it runs from April 6 to April 5), I would need to extract information from two different Numbers files into a third Numbers file. Is it possible to do this, and would it be relatively straightforward?
    I'd be interested to hear from others who have used Numbers for a similar purpose, and how they have organised things.
    Thanks
    Nick

    The easiest solution is to forget about using separate documents.
    Use one sheet per year and it will be really easy to achieve your goal using only formulas.
    With separate documents, the only solution is to use an AppleScript.
    I could write one, but I'm a bit tired: I have posted a lot of such solutions, and in most cases I don't know whether they satisfied the askers.
    Coding for the clouds isn't my cup of tea, so I think that for some days I will leave AppleScript and work on the archives of my workshop.
    Yvan KOENIG (VALLAURIS, France) Wednesday, August 4, 2010 19:15:58

  • What is the best way to plug my iPhone 5 into a mixer for live music performances?

    I'm thinking of performing live with my iPhone 5, or maybe an iPad that I haven't bought yet. What is the best way to get the sound from the device to a mixer or PA? I'm wondering if anyone has experience with wireless options, and whether they are stable enough for live performance. There don't seem to be any audio interface options yet.

    There is always the 3.5mm minijack out connector. Sure, it's analogue, but a cable with a male minijack on one end and two RCA plugs on the other would work with most mixers and/or PAs.

  • What is the best way to remotely access my sister-in-law's Mac, who lives in another city, to help her with her computer problems?

    As stated above:
    What is the best way to remotely access the Mac of my sister-in-law, who lives in another city, to help her with her computer problems?

    The best way? Get her to bring it to you, especially if she makes good cakes.
    Apple's Back to My Mac isn't really suitable for this: it is designed for a single person who wants their Apple ID on the system. It would mean she would have to share hers with you, and you would also have to set up her Apple ID on your Mac; it is messy and causes trouble with iCloud, iTunes etc.
    You can try Messages if she is able to begin a session with you; see the 'invite to share screen' option in the menus. Weirdly, you need to use a service that isn't from Apple.
    Messages (Mavericks): Share your screen
    It may be better for you to set up LogMeIn or GoToMyPC. They 'dial out' and maintain a constant connection, so you can log in whenever the Mac is powered up. It won't require a human to initiate the process at the other end. Just be aware that the router may cause issues depending on how it is configured; you may need to enable automatic port forwarding. It really depends on which option you choose.

  • Best way to use live traced images?

    I've got a hand-drawn logo I want to use on a Photoshopped image. If I Live Trace it in Illustrator and then Live Paint it, what's the best way to import it into Photoshop for use? I'd like to do some advanced colouring and touching up of the logo in Photoshop before scaling and positioning it correctly on the original image; this would likely be via path shapes so I can retain the scalability of the image.
    What's the best thing to do?

    Use File > Export > Photoshop.

  • Using the new iPad, what's the best way to watch video files (away from home) which are stored on a NAS (WD My Book Live)? Any help would be appreciated!


    Before you go, move the files into iTunes and sync them down. There is no viable way to stream from your NAS drive to the iPad.

  • Best way to record Roland V-drums

    Hello to everyone!
    I am a new owner of a mac mini, as of today. It is a 1.83 GHZ Intel Core 2 duo w/ 2 GB of ram.
    The question I have pertains to recording. I own a Roland TD-10 Vdrum kit, and would love to be able to record into Garageband with it. My question is - What is going to be the best way for me to record?
    I believe I have two options. One would be to go with a FireWire audio interface, like this one:
    http://www.guitarcenter.com/M-Audio-FireWire-Solo-Mobile-Audio-Interface-102951837-i1154073.gc
    The other option would be to get a USB MIDI interface, like this:
    http://www.zzounds.com/item--MDOUNO
    I am very ignorant when it comes to this, and therefore, before purchasing anything, I am asking for advice. The TD-10 drum kit of mine has MIDI ports on it, as well as a line out.
    Any suggestions would be much appreciated.
    Zach

    If you want to record the sounds that your drum kit makes, as audio, you can get the audio interface (however, all you really need is an audio cable).
    If you want to play GB's (or any third-party) drums, get the MIDI interface. MIDI gives you some advantages, like being able to edit each note if you wish; plus there are many drum kits you can install and then play with your hardware.

  • What is the best way to use single shot drum sounds?

    I have a huge collection of WAV-format single-shot drum sounds that I use in FL Studio. What is the best way to program drums in GB using them, besides buying iDrum? I like the flexibility of using drums from different kits when I make my music, so I kind of get bored using a preset drum kit from GB; even layering 3-4 kits isn't as nice as choosing each individual drum sound.

    Would be great to have more info but on a 24fps timeline with 24fps footage, you can simply right click and change the speed/duration to whatever. It'll be more jittery than smooth 60fps slow-mo. If your footage is shot at 60fps and you are editing on a 24fps timeline, just select the clip and right-click and choose MODIFY>Interpret footage and change the fps to 23.976 and you're done. Now when you drop that clip on the timeline it's in smooth slow-mo. Or, you can just toss the original clip on the timeline and change the speed/duration setting to 40%. Pretty much the same thing although the former is better if you need to add any warp stabilization to the clip. Anything you want to speed back up you can either duplicate the clip in the project panel and change it back to the original frame rate OR you can simply adjust it on the timeline to 250% speed which is back to normal. Good luck!
