Can I use BULK COLLECT results as an input parameter for another cursor?

MUSIC      ==> MUSIC table in the remote MUSIC_DB database; 60 million rows
PRICE_DATA ==> PRICE_DATA table in the remote PRICING_DB database; 1 billion rows
These two tables once lived in the same database, but the database outgrew the available hardware and the hardware budget, so the PRICE_DATA table was moved to a separate Oracle database. I need to create a single report that combines data from both tables. A distributed join with a DRIVING_SITE hint will not work because both tables are too large to push to one driving site, so I wrote this PL/SQL block to process the data in small batches.
QUESTION: how can I use BULK COLLECT from one cursor and pass the bulk-collected values as input to a second cursor without explicitly listing each cell of the PL/SQL collection? See the sample pseudo-code below; I am trying to find a more efficient way to code this than hard-coding 100 parameter names into the second cursor.
NOTE: the code below is truly pseudo-code; I had to rename everything to adhere to an NDA. It works and is fast enough for my purposes, but if I want to go from 100 input parameters to 200, I have to add more hard-coded values. There has got to be a better way.
DECLARE
     -- define cursor that retrieves distinct SONG_IDs from MUSIC table in remote music database
     CURSOR C_CURRENT_MUSIC
     IS
    select distinct SONG_ID
    from MUSIC@MUSIC_DB
    where PRODUCTION_RELEASE=1;
     /*  define a parameterized cursor that accepts 100 SONG_IDs and retrieves
          required pricing information  */
     CURSOR C_get_music_price_data (
               P_SONG_ID_001 NUMBER, P_SONG_ID_002 NUMBER, P_SONG_ID_003 NUMBER, P_SONG_ID_004 NUMBER, P_SONG_ID_005 NUMBER, P_SONG_ID_006 NUMBER, P_SONG_ID_007 NUMBER, P_SONG_ID_008 NUMBER, P_SONG_ID_009 NUMBER, P_SONG_ID_010 NUMBER,
               P_SONG_ID_011 NUMBER, P_SONG_ID_012 NUMBER, P_SONG_ID_013 NUMBER, P_SONG_ID_014 NUMBER, P_SONG_ID_015 NUMBER, P_SONG_ID_016 NUMBER, P_SONG_ID_017 NUMBER, P_SONG_ID_018 NUMBER, P_SONG_ID_019 NUMBER, P_SONG_ID_020 NUMBER,
               P_SONG_ID_021 NUMBER, P_SONG_ID_022 NUMBER, P_SONG_ID_023 NUMBER, P_SONG_ID_024 NUMBER, P_SONG_ID_025 NUMBER, P_SONG_ID_026 NUMBER, P_SONG_ID_027 NUMBER, P_SONG_ID_028 NUMBER, P_SONG_ID_029 NUMBER, P_SONG_ID_030 NUMBER,
               P_SONG_ID_031 NUMBER, P_SONG_ID_032 NUMBER, P_SONG_ID_033 NUMBER, P_SONG_ID_034 NUMBER, P_SONG_ID_035 NUMBER, P_SONG_ID_036 NUMBER, P_SONG_ID_037 NUMBER, P_SONG_ID_038 NUMBER, P_SONG_ID_039 NUMBER, P_SONG_ID_040 NUMBER,
               P_SONG_ID_041 NUMBER, P_SONG_ID_042 NUMBER, P_SONG_ID_043 NUMBER, P_SONG_ID_044 NUMBER, P_SONG_ID_045 NUMBER, P_SONG_ID_046 NUMBER, P_SONG_ID_047 NUMBER, P_SONG_ID_048 NUMBER, P_SONG_ID_049 NUMBER, P_SONG_ID_050 NUMBER,
               P_SONG_ID_051 NUMBER, P_SONG_ID_052 NUMBER, P_SONG_ID_053 NUMBER, P_SONG_ID_054 NUMBER, P_SONG_ID_055 NUMBER, P_SONG_ID_056 NUMBER, P_SONG_ID_057 NUMBER, P_SONG_ID_058 NUMBER, P_SONG_ID_059 NUMBER, P_SONG_ID_060 NUMBER,
               P_SONG_ID_061 NUMBER, P_SONG_ID_062 NUMBER, P_SONG_ID_063 NUMBER, P_SONG_ID_064 NUMBER, P_SONG_ID_065 NUMBER, P_SONG_ID_066 NUMBER, P_SONG_ID_067 NUMBER, P_SONG_ID_068 NUMBER, P_SONG_ID_069 NUMBER, P_SONG_ID_070 NUMBER,
               P_SONG_ID_071 NUMBER, P_SONG_ID_072 NUMBER, P_SONG_ID_073 NUMBER, P_SONG_ID_074 NUMBER, P_SONG_ID_075 NUMBER, P_SONG_ID_076 NUMBER, P_SONG_ID_077 NUMBER, P_SONG_ID_078 NUMBER, P_SONG_ID_079 NUMBER, P_SONG_ID_080 NUMBER,
               P_SONG_ID_081 NUMBER, P_SONG_ID_082 NUMBER, P_SONG_ID_083 NUMBER, P_SONG_ID_084 NUMBER, P_SONG_ID_085 NUMBER, P_SONG_ID_086 NUMBER, P_SONG_ID_087 NUMBER, P_SONG_ID_088 NUMBER, P_SONG_ID_089 NUMBER, P_SONG_ID_090 NUMBER,
               P_SONG_ID_091 NUMBER, P_SONG_ID_092 NUMBER, P_SONG_ID_093 NUMBER, P_SONG_ID_094 NUMBER, P_SONG_ID_095 NUMBER, P_SONG_ID_096 NUMBER, P_SONG_ID_097 NUMBER, P_SONG_ID_098 NUMBER, P_SONG_ID_099 NUMBER, P_SONG_ID_100 NUMBER )
     IS
     select
           vpc.SONG_ID
          ,vpc.STOREFRONT_ID
          -- ...plus the aggregated pricing columns (redacted per NDA)
     from PRICE_DATA@PRICING_DB vpc
     where COUNTRY = 'USA'
     and START_DATE <= sysdate
     and END_DATE > sysdate
     and vpc.SONG_ID IN (
               P_SONG_ID_001 ,P_SONG_ID_002 ,P_SONG_ID_003 ,P_SONG_ID_004 ,P_SONG_ID_005 ,P_SONG_ID_006 ,P_SONG_ID_007 ,P_SONG_ID_008 ,P_SONG_ID_009 ,P_SONG_ID_010,
               P_SONG_ID_011 ,P_SONG_ID_012 ,P_SONG_ID_013 ,P_SONG_ID_014 ,P_SONG_ID_015 ,P_SONG_ID_016 ,P_SONG_ID_017 ,P_SONG_ID_018 ,P_SONG_ID_019 ,P_SONG_ID_020,
               P_SONG_ID_021 ,P_SONG_ID_022 ,P_SONG_ID_023 ,P_SONG_ID_024 ,P_SONG_ID_025 ,P_SONG_ID_026 ,P_SONG_ID_027 ,P_SONG_ID_028 ,P_SONG_ID_029 ,P_SONG_ID_030,
               P_SONG_ID_031 ,P_SONG_ID_032 ,P_SONG_ID_033 ,P_SONG_ID_034 ,P_SONG_ID_035 ,P_SONG_ID_036 ,P_SONG_ID_037 ,P_SONG_ID_038 ,P_SONG_ID_039 ,P_SONG_ID_040,
               P_SONG_ID_041 ,P_SONG_ID_042 ,P_SONG_ID_043 ,P_SONG_ID_044 ,P_SONG_ID_045 ,P_SONG_ID_046 ,P_SONG_ID_047 ,P_SONG_ID_048 ,P_SONG_ID_049 ,P_SONG_ID_050,
               P_SONG_ID_051 ,P_SONG_ID_052 ,P_SONG_ID_053 ,P_SONG_ID_054 ,P_SONG_ID_055 ,P_SONG_ID_056 ,P_SONG_ID_057 ,P_SONG_ID_058 ,P_SONG_ID_059 ,P_SONG_ID_060,
               P_SONG_ID_061 ,P_SONG_ID_062 ,P_SONG_ID_063 ,P_SONG_ID_064 ,P_SONG_ID_065 ,P_SONG_ID_066 ,P_SONG_ID_067 ,P_SONG_ID_068 ,P_SONG_ID_069 ,P_SONG_ID_070,
               P_SONG_ID_071 ,P_SONG_ID_072 ,P_SONG_ID_073 ,P_SONG_ID_074 ,P_SONG_ID_075 ,P_SONG_ID_076 ,P_SONG_ID_077 ,P_SONG_ID_078 ,P_SONG_ID_079 ,P_SONG_ID_080,
               P_SONG_ID_081 ,P_SONG_ID_082 ,P_SONG_ID_083 ,P_SONG_ID_084 ,P_SONG_ID_085 ,P_SONG_ID_086 ,P_SONG_ID_087 ,P_SONG_ID_088 ,P_SONG_ID_089 ,P_SONG_ID_090,
               P_SONG_ID_091 ,P_SONG_ID_092 ,P_SONG_ID_093 ,P_SONG_ID_094 ,P_SONG_ID_095 ,P_SONG_ID_096 ,P_SONG_ID_097 ,P_SONG_ID_098 ,P_SONG_ID_099 ,P_SONG_ID_100 )
     group by
           vpc.SONG_ID
          ,vpc.STOREFRONT_ID;
     TYPE SONG_ID_TYPE IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;  -- SONG_ID is numeric; %TYPE cannot anchor to a whole table
     V_SONG_ID_ARRAY    SONG_ID_TYPE;
     v_commit_counter   NUMBER := 0;
BEGIN
     /* open the cursor you intend to bulk collect from */
     OPEN C_CURRENT_MUSIC;
     LOOP
          /* in batches of 100, bulk collect SONG_IDs into a PL/SQL table */
          FETCH C_CURRENT_MUSIC BULK COLLECT INTO V_SONG_ID_ARRAY LIMIT 100;
               EXIT WHEN V_SONG_ID_ARRAY.COUNT = 0;
               /* to avoid a NO_DATA_FOUND error when passing 100 parameters to the OPEN, if the array
                  is not fully populated to 100, pad the array with NULLs to fill up to 100 cells. */
               IF (V_SONG_ID_ARRAY.COUNT >=1 and V_SONG_ID_ARRAY.COUNT <> 100) THEN
                    FOR j IN V_SONG_ID_ARRAY.COUNT+1..100 LOOP
                         V_SONG_ID_ARRAY(j) := null;
                    END LOOP;
               END IF;
          /* pass a batch of 100 to cursor that get price information per SONG_ID and STOREFRONT_ID */
          FOR j IN C_get_music_price_data (
                    V_SONG_ID_ARRAY(1) ,V_SONG_ID_ARRAY(2) ,V_SONG_ID_ARRAY(3) ,V_SONG_ID_ARRAY(4) ,V_SONG_ID_ARRAY(5) ,V_SONG_ID_ARRAY(6) ,V_SONG_ID_ARRAY(7) ,V_SONG_ID_ARRAY(8) ,V_SONG_ID_ARRAY(9) ,V_SONG_ID_ARRAY(10) ,
                    V_SONG_ID_ARRAY(11) ,V_SONG_ID_ARRAY(12) ,V_SONG_ID_ARRAY(13) ,V_SONG_ID_ARRAY(14) ,V_SONG_ID_ARRAY(15) ,V_SONG_ID_ARRAY(16) ,V_SONG_ID_ARRAY(17) ,V_SONG_ID_ARRAY(18) ,V_SONG_ID_ARRAY(19) ,V_SONG_ID_ARRAY(20) ,
                    V_SONG_ID_ARRAY(21) ,V_SONG_ID_ARRAY(22) ,V_SONG_ID_ARRAY(23) ,V_SONG_ID_ARRAY(24) ,V_SONG_ID_ARRAY(25) ,V_SONG_ID_ARRAY(26) ,V_SONG_ID_ARRAY(27) ,V_SONG_ID_ARRAY(28) ,V_SONG_ID_ARRAY(29) ,V_SONG_ID_ARRAY(30) ,
                    V_SONG_ID_ARRAY(31) ,V_SONG_ID_ARRAY(32) ,V_SONG_ID_ARRAY(33) ,V_SONG_ID_ARRAY(34) ,V_SONG_ID_ARRAY(35) ,V_SONG_ID_ARRAY(36) ,V_SONG_ID_ARRAY(37) ,V_SONG_ID_ARRAY(38) ,V_SONG_ID_ARRAY(39) ,V_SONG_ID_ARRAY(40) ,
                    V_SONG_ID_ARRAY(41) ,V_SONG_ID_ARRAY(42) ,V_SONG_ID_ARRAY(43) ,V_SONG_ID_ARRAY(44) ,V_SONG_ID_ARRAY(45) ,V_SONG_ID_ARRAY(46) ,V_SONG_ID_ARRAY(47) ,V_SONG_ID_ARRAY(48) ,V_SONG_ID_ARRAY(49) ,V_SONG_ID_ARRAY(50) ,
                    V_SONG_ID_ARRAY(51) ,V_SONG_ID_ARRAY(52) ,V_SONG_ID_ARRAY(53) ,V_SONG_ID_ARRAY(54) ,V_SONG_ID_ARRAY(55) ,V_SONG_ID_ARRAY(56) ,V_SONG_ID_ARRAY(57) ,V_SONG_ID_ARRAY(58) ,V_SONG_ID_ARRAY(59) ,V_SONG_ID_ARRAY(60) ,
                    V_SONG_ID_ARRAY(61) ,V_SONG_ID_ARRAY(62) ,V_SONG_ID_ARRAY(63) ,V_SONG_ID_ARRAY(64) ,V_SONG_ID_ARRAY(65) ,V_SONG_ID_ARRAY(66) ,V_SONG_ID_ARRAY(67) ,V_SONG_ID_ARRAY(68) ,V_SONG_ID_ARRAY(69) ,V_SONG_ID_ARRAY(70) ,
                    V_SONG_ID_ARRAY(71) ,V_SONG_ID_ARRAY(72) ,V_SONG_ID_ARRAY(73) ,V_SONG_ID_ARRAY(74) ,V_SONG_ID_ARRAY(75) ,V_SONG_ID_ARRAY(76) ,V_SONG_ID_ARRAY(77) ,V_SONG_ID_ARRAY(78) ,V_SONG_ID_ARRAY(79) ,V_SONG_ID_ARRAY(80) ,
                    V_SONG_ID_ARRAY(81) ,V_SONG_ID_ARRAY(82) ,V_SONG_ID_ARRAY(83) ,V_SONG_ID_ARRAY(84) ,V_SONG_ID_ARRAY(85) ,V_SONG_ID_ARRAY(86) ,V_SONG_ID_ARRAY(87) ,V_SONG_ID_ARRAY(88) ,V_SONG_ID_ARRAY(89) ,V_SONG_ID_ARRAY(90) ,
                     V_SONG_ID_ARRAY(91) ,V_SONG_ID_ARRAY(92) ,V_SONG_ID_ARRAY(93) ,V_SONG_ID_ARRAY(94) ,V_SONG_ID_ARRAY(95) ,V_SONG_ID_ARRAY(96) ,V_SONG_ID_ARRAY(97) ,V_SONG_ID_ARRAY(98) ,V_SONG_ID_ARRAY(99) ,V_SONG_ID_ARRAY(100) )
          LOOP
               /* do stuff with data from the Song and Pricing databases coming from the two
                    separate cursors, then continue processing more rows... */
          END LOOP;
          /* commit after each batch of 100 SONG_IDs is processed */        
          COMMIT;
          EXIT WHEN C_CURRENT_MUSIC%NOTFOUND;  -- exit when there are no more rows to fetch from cursor
     END LOOP; -- bulk fetching loop
     CLOSE C_CURRENT_MUSIC; -- close cursor that was used in bulk collection
     /* commit rows */
     COMMIT; -- commit any remaining uncommitted data.
END;
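One way to avoid hard-coding 100 parameter names is to pass the whole batch as a single collection parameter. This is a minimal sketch, assuming SONG_ID is numeric and using the Oracle-supplied SYS.ODCINumberList SQL collection type (a SQL type, unlike an INDEX BY table, can be unnested with TABLE() inside a cursor query):

```sql
DECLARE
     CURSOR C_CURRENT_MUSIC IS
          select distinct SONG_ID
          from MUSIC@MUSIC_DB
          where PRODUCTION_RELEASE = 1;

     -- ONE collection parameter replaces the 100 scalar parameters
     CURSOR C_get_music_price_data (p_song_ids SYS.ODCINumberList) IS
          select vpc.SONG_ID, vpc.STOREFRONT_ID   -- plus whatever pricing columns are needed
          from PRICE_DATA@PRICING_DB vpc
          where vpc.COUNTRY = 'USA'
          and vpc.START_DATE <= sysdate
          and vpc.END_DATE   >  sysdate
          and vpc.SONG_ID IN (select column_value from table(p_song_ids))
          group by vpc.SONG_ID, vpc.STOREFRONT_ID;

     v_song_ids SYS.ODCINumberList;
BEGIN
     OPEN C_CURRENT_MUSIC;
     LOOP
          -- changing the batch size is now a one-line edit; no NULL padding needed
          FETCH C_CURRENT_MUSIC BULK COLLECT INTO v_song_ids LIMIT 100;
          EXIT WHEN v_song_ids.COUNT = 0;
          FOR j IN C_get_music_price_data(v_song_ids) LOOP
               NULL;  -- process each pricing row here
          END LOOP;
          COMMIT;
     END LOOP;
     CLOSE C_CURRENT_MUSIC;
END;
```

One caveat with this design: when the second cursor targets a remote table over a database link, the TABLE() unnest executes locally, which can produce very slow distributed plans.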

I've run into a problem when passing a VARRAY of numbers as a parameter to a remote cursor: it takes an extremely long time to run, and sometimes doesn't finish even after an hour has passed.
Continuing my example from the original entry, I replaced the BULK COLLECT into a PL/SQL table with a VARRAY, and the bulk collect into the VARRAY is fast. I know it works because I can DBMS_OUTPUT.PUT_LINE cells of the VARRAY, so it is being populated correctly. However, when I pass the VARRAY containing 100 SONG_IDs as a parameter to the cursor, execution time is over an hour when I am expecting a few seconds.
The code example below strips the problem down to its raw details: I skip the bulk collect and manually populate a VARRAY with 100 SONG_ID values, then pass it as a parameter to a cursor. The cursor's execution time is unexpectedly long, over 30 minutes and sometimes longer, when I am expecting seconds.
IMPORTANT: if I take the same 100 SONG_IDs and place them directly in the cursor query's IN clause, the SQL runs in under 5 seconds and returns the result. Likewise, if I pass the 100 SONG_IDs as individual cells of a PL/SQL table collection, it also runs fast.
I thought that since the VARRAY is read via a SELECT subquery it is queried locally while the cursor is remote, giving me a distributed-query problem. So I added the DRIVING_SITE hint to force the VARRAY rows to be shipped to the remote server, with the rest of the query running there before returning the result, but that didn't work either; the response was still slow.
Is something wrong with my code, or am I running into an Oracle problem that may require support to resolve?
DECLARE
     /*  define a parameterized cursor that accepts any number of SONG_IDs and
         retrieves the required pricing information  */
     CURSOR C_get_music_price_data (
          p_array_song_ids SYS.ODCInumberList )
     IS
     select  /*+DRIVING_SITE(pd) */
     count(distinct pd.EVE_ID)
     from PRICE_DATA@PRICING_DB pd
     where pd.COUNTRY = 'USA'
     and pd.START_DATE <= sysdate
     and pd.END_DATE > sysdate
     and pd.SONG_ID IN
          ( select column_value from table(p_array_song_ids) )
     group by
           pd.SONG_ID
           ,pd.STOREFRONT_ID;
     V_ARRAY_SONG_IDS SYS.ODCInumberList := SYS.ODCInumberList();
BEGIN
V_ARRAY_SONG_IDS.EXTEND(100);
V_ARRAY_SONG_IDS(  1 ) := 31135  ;
V_ARRAY_SONG_IDS(  2 ) := 31140   ;
V_ARRAY_SONG_IDS(  3 ) := 31142   ;
V_ARRAY_SONG_IDS(  4 ) := 31144   ;
V_ARRAY_SONG_IDS(  5 ) := 31146   ;
V_ARRAY_SONG_IDS(  6 ) := 31148   ;
V_ARRAY_SONG_IDS(  7 ) := 31150   ;
V_ARRAY_SONG_IDS(  8 ) := 31152   ;
V_ARRAY_SONG_IDS(  9 ) := 31154   ;
V_ARRAY_SONG_IDS( 10 ) := 31156   ;
V_ARRAY_SONG_IDS( 11 ) := 31158   ;
V_ARRAY_SONG_IDS( 12 ) := 31160   ;
V_ARRAY_SONG_IDS( 13 ) := 33598   ;
V_ARRAY_SONG_IDS( 14 ) := 33603   ;
V_ARRAY_SONG_IDS( 15 ) := 33605   ;
V_ARRAY_SONG_IDS( 16 ) := 33607   ;
V_ARRAY_SONG_IDS( 17 ) := 33609   ;
V_ARRAY_SONG_IDS( 18 ) := 33611   ;
V_ARRAY_SONG_IDS( 19 ) := 33613   ;
V_ARRAY_SONG_IDS( 20 ) := 33615   ;
V_ARRAY_SONG_IDS( 21 ) := 33617   ;
V_ARRAY_SONG_IDS( 22 ) := 33630   ;
V_ARRAY_SONG_IDS( 23 ) := 33632   ;
V_ARRAY_SONG_IDS( 24 ) := 33636   ;
V_ARRAY_SONG_IDS( 25 ) := 33638   ;
V_ARRAY_SONG_IDS( 26 ) := 33640   ;
V_ARRAY_SONG_IDS( 27 ) := 33642   ;
V_ARRAY_SONG_IDS( 28 ) := 33644   ;
V_ARRAY_SONG_IDS( 29 ) := 33646   ;
V_ARRAY_SONG_IDS( 30 ) := 33648   ;
V_ARRAY_SONG_IDS( 31 ) := 33662   ;
V_ARRAY_SONG_IDS( 32 ) := 33667   ;
V_ARRAY_SONG_IDS( 33 ) := 33669   ;
V_ARRAY_SONG_IDS( 34 ) := 33671   ;
V_ARRAY_SONG_IDS( 35 ) := 33673   ;
V_ARRAY_SONG_IDS( 36 ) := 33675   ;
V_ARRAY_SONG_IDS( 37 ) := 33677   ;
V_ARRAY_SONG_IDS( 38 ) := 33679   ;
V_ARRAY_SONG_IDS( 39 ) := 33681   ;
V_ARRAY_SONG_IDS( 40 ) := 33683   ;
V_ARRAY_SONG_IDS( 41 ) := 33685   ;
V_ARRAY_SONG_IDS( 42 ) := 33700   ;
V_ARRAY_SONG_IDS( 43 ) := 33702   ;
V_ARRAY_SONG_IDS( 44 ) := 33704   ;
V_ARRAY_SONG_IDS( 45 ) := 33706   ;
V_ARRAY_SONG_IDS( 46 ) := 33708   ;
V_ARRAY_SONG_IDS( 47 ) := 33710   ;
V_ARRAY_SONG_IDS( 48 ) := 33712   ;
V_ARRAY_SONG_IDS( 49 ) := 33723   ;
V_ARRAY_SONG_IDS( 50 ) := 33725   ;
V_ARRAY_SONG_IDS( 51 ) := 33727   ;
V_ARRAY_SONG_IDS( 52 ) := 33729   ;
V_ARRAY_SONG_IDS( 53 ) := 33731   ;
V_ARRAY_SONG_IDS( 54 ) := 33733   ;
V_ARRAY_SONG_IDS( 55 ) := 33735   ;
V_ARRAY_SONG_IDS( 56 ) := 33737   ;
V_ARRAY_SONG_IDS( 57 ) := 33749   ;
V_ARRAY_SONG_IDS( 58 ) := 33751   ;
V_ARRAY_SONG_IDS( 59 ) := 33753   ;
V_ARRAY_SONG_IDS( 60 ) := 33755   ;
V_ARRAY_SONG_IDS( 61 ) := 33757   ;
V_ARRAY_SONG_IDS( 62 ) := 33759   ;
V_ARRAY_SONG_IDS( 63 ) := 33761   ;
V_ARRAY_SONG_IDS( 64 ) := 33763   ;
V_ARRAY_SONG_IDS( 65 ) := 33775   ;
V_ARRAY_SONG_IDS( 66 ) := 33777   ;
V_ARRAY_SONG_IDS( 67 ) := 33779   ;
V_ARRAY_SONG_IDS( 68 ) := 33781   ;
V_ARRAY_SONG_IDS( 69 ) := 33783   ;
V_ARRAY_SONG_IDS( 70 ) := 33785   ;
V_ARRAY_SONG_IDS( 71 ) := 33787   ;
V_ARRAY_SONG_IDS( 72 ) := 33789   ;
V_ARRAY_SONG_IDS( 73 ) := 33791   ;
V_ARRAY_SONG_IDS( 74 ) := 33793   ;
V_ARRAY_SONG_IDS( 75 ) := 33807   ;
V_ARRAY_SONG_IDS( 76 ) := 33809   ;
V_ARRAY_SONG_IDS( 77 ) := 33811   ;
V_ARRAY_SONG_IDS( 78 ) := 33813   ;
V_ARRAY_SONG_IDS( 79 ) := 33815   ;
V_ARRAY_SONG_IDS( 80 ) := 33817   ;
V_ARRAY_SONG_IDS( 81 ) := 33819   ;
V_ARRAY_SONG_IDS( 82 ) := 33821   ;
V_ARRAY_SONG_IDS( 83 ) := 33823   ;
V_ARRAY_SONG_IDS( 84 ) := 33825   ;
V_ARRAY_SONG_IDS( 85 ) := 33839   ;
V_ARRAY_SONG_IDS( 86 ) := 33844   ;
V_ARRAY_SONG_IDS( 87 ) := 33846   ;
V_ARRAY_SONG_IDS( 88 ) := 33848   ;
V_ARRAY_SONG_IDS( 89 ) := 33850   ;
V_ARRAY_SONG_IDS( 90 ) := 33852   ;
V_ARRAY_SONG_IDS( 91 ) := 33854   ;
V_ARRAY_SONG_IDS( 92 ) := 33856   ;
V_ARRAY_SONG_IDS( 93 ) := 33858   ;
V_ARRAY_SONG_IDS( 94 ) := 33860   ;
V_ARRAY_SONG_IDS( 95 ) := 33874   ;
V_ARRAY_SONG_IDS( 96 ) := 33879   ;
V_ARRAY_SONG_IDS( 97 ) := 33881   ;
V_ARRAY_SONG_IDS( 98 ) := 33883   ;
V_ARRAY_SONG_IDS( 99 ) := 33885   ;
V_ARRAY_SONG_IDS(100 ) := 33889  ;
     FOR i IN C_get_music_price_data( v_array_song_ids ) LOOP
          /* this is the loop where I pass in v_array_song_ids, populated with
             only 100 cells, and it runs forever */
          NULL;  /* do stuff with data from the Song and Pricing databases,
                    then continue processing more rows... */
     END LOOP;
END;
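One workaround worth trying for the slow VARRAY case above: build the IN-list as literals with dynamic SQL, so that every object the statement references is remote and the whole query can execute at the remote site. This is a hedged sketch, not a tested fix; it assumes numeric SONG_IDs and batches within Oracle's 1000-item IN-list limit:

```sql
DECLARE
     v_ids   SYS.ODCInumberList := SYS.ODCInumberList(31135, 31140, 31142);  -- sample batch
     v_list  VARCHAR2(32767);
     v_count NUMBER;
BEGIN
     -- concatenate the collection into a literal IN-list (safe here because values are numeric)
     FOR i IN 1 .. v_ids.COUNT LOOP
          v_list := v_list || CASE WHEN i > 1 THEN ',' END || TO_CHAR(v_ids(i));
     END LOOP;
     EXECUTE IMMEDIATE
          'select count(distinct pd.SONG_ID)
           from PRICE_DATA@PRICING_DB pd
           where pd.COUNTRY = ''USA''
           and pd.START_DATE <= sysdate
           and pd.END_DATE   >  sysdate
           and pd.SONG_ID IN (' || v_list || ')'
     INTO v_count;
END;
```

The trade-off is a hard parse per batch. A lighter alternative that sometimes fixes the same symptom is a CARDINALITY hint on the TABLE() subquery, telling the optimizer the collection is small so it favors an index-driven plan at the remote site.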

Similar Messages

  • Can I pass an array as an input parameter for a stored procedure on SQL Server 2000

    I am trying to pass an array to a stored procedure residing on my SQL Server 2000 database server. Is this even possible? If it is possible, what is the syntax for this?
    Any help would be greatly appreciated.
    Thanks

    I have passed arrays to and from a database using SQL and ActiveX, including to and from stored procedures, but I cannot recall the precise method used to do so. If memory serves, everything is in the form of a string. You need to do a lot of parsing and 'unparsing' to get this data into your stored procedure.
    You are left with a couple of options to get your data to the stored procedure. I recommend using SQL in LabVIEW wherever possible as it saves the amount of external code calls (and believe me, calling ActiveX procedures developed by someone else in Visual Basic is NOT much fun at all...). You can either send the array and other data to the stored procedure (you will find the syntax in the SQL references in LabVIEW help under SQL), or you can send
    the array to the database, and have the database then act upon the array.
    I strongly recommend making routines (subVIs) to handle these operations.
    Sorry I don't have the syntax, I don't have SQL installed on this machine. If you can't find the syntax in the help, please post here again.
    -Mike Du'Lyea

  • How to use BULK COLLECT in ref cursor?

    hi,
    can we use bulk collect in ref cursor ? if yes then please give small example ..
    thanks

    Try this:
    create or replace type person_ot as object (name varchar2(10)) not final;
    create or replace type student_ot under person_ot (s_num number) not final;
    create type person_tt as table of person_ot;
    create table persons of person_ot;
    declare
        lv_person_list person_tt;
        lv_sql varchar2(1000);
        ref_cur sys_refcursor;
    begin
        lv_sql := 'select new student_ot(''fred'', 100) from dual
                   union all
                   select new student_ot(''sally'', 200) from dual';
        open ref_cur for lv_sql;
        fetch ref_cur bulk collect into lv_person_list;
        close ref_cur;
        for i in lv_person_list.first .. lv_person_list.last loop
            dbms_output.put_line(lv_person_list(i).name);
        end loop;
        forall i in lv_person_list.first .. lv_person_list.last
            insert into persons values lv_person_list(i);
    end;
    /

  • Using bulk collect and for all to solve a problem

    Hi All
    I have a following problem.
    Please forgive me if its a stupid question :-) im learning.
    1: Data in a staging table xx_staging_table
    2: two Target table t1, t2 where some columns from xx_staging_table are inserted into
    Some of the columns from the staging table data are checked for valid entries and then some columns from that row will be loaded into the two target tables.
    The two target tables use different set of columns from the staging table
    When I had a thousand records there was no problem with a direct insert but it seems we will now have half a million records.
    This has slowed down the process considerably.
    My question is
    Can I use the bulk collect and for all functionality to get specific columns from a staging table, then validate the row using those columns
    and then use a bulk insert to load the data into a specific table.?
    So code would be like
    get_staging_data cursor will have all the columns i need from the staging table
    cursor get_staging_data
    is select * from xx_staging_table (about 500000) records
    Use bulk collect to load about 10000 or so records into a plsql table
    and then do a bulk insert like this
    CREATE TABLE t1 AS SELECT * FROM all_objects WHERE 1 = 2;
    CREATE OR REPLACE PROCEDURE test_proc (p_array_size IN PLS_INTEGER DEFAULT 100)
    IS
    TYPE ARRAY IS TABLE OF all_objects%ROWTYPE;
    l_data ARRAY;
    CURSOR c IS SELECT * FROM all_objects;
    BEGIN
    OPEN c;
    LOOP
    FETCH c BULK COLLECT INTO l_data LIMIT p_array_size;
    FORALL i IN 1..l_data.COUNT
    INSERT INTO t1 VALUES l_data(i);
    EXIT WHEN c%NOTFOUND;
    END LOOP;
    CLOSE c;
    END test_proc;
    In the above example t1 and the cursor have the same number of columns
    In my case the columns in the cursor loop are a small subset of the columns of table t1
    so can i use a forall to load that subset into the table t1? How does that work?
    Thanks
    J

    user7348303 wrote:
    checking if the value is valid and theres also some conditional processing rules (such as if the value is a certain value no inserts are needed) which are a little more complex than I can put in a simple

    Well, if the processing is too complex (and conditional) to be done in SQL, then doing it in PL/SQL is justified... but it will be slower, as you are now introducing an additional layer. Data now needs to travel between the SQL layer and the PL/SQL layer. This is slower.
    PL/SQL is inherently serialised, and this also affects performance and scalability. PL/SQL cannot be parallelised by Oracle in an automated fashion. SQL processes can.
    To put it in simple terms: you create PL/SQL procedure Foo that processes a SQL cursor, and you execute that proc. Oracle cannot run multiple parallel copies of Foo. It can perhaps parallelise the SQL cursor that Foo uses, but not Foo itself.
    However, if Foo is called by the SQL engine, it can run in parallel, because the SQL process calling Foo is running in parallel. So if you make Foo a pipelined table function (written in PL/SQL), and you design and code it as a thread-safe, parallel-enabled function, it can be called and executed in parallel by the SQL engine.
    So moving your PL/SQL code into a parallel-enabled pipelined function written in PL/SQL, and using that function via parallel SQL, can increase performance over running the same basic PL/SQL processing as a serialised process.
    This is of course assuming that the processing that needs to be done using PL/SQL code, can be designed and coded for parallel processing in this fashion.
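    The pipelined-function idea above can be sketched roughly as follows. All object names here are hypothetical illustrations, not the poster's actual tables, and the processing body is a placeholder:

    ```sql
    create or replace type num_tt as table of number;
    /
    create or replace function process_staging (p_cur sys_refcursor)
        return num_tt
        pipelined
        parallel_enable (partition p_cur by any)  -- allows the SQL engine to run parallel copies
    is
        v_val number;
    begin
        loop
            fetch p_cur into v_val;
            exit when p_cur%notfound;
            -- complex conditional PL/SQL validation would go here;
            -- rows that fail validation are simply not piped
            pipe row (v_val);
        end loop;
        return;
    end;
    /
    -- driven from parallel SQL, so the function itself runs in parallel:
    insert /*+ append */ into target_table (id)
    select column_value
    from table(process_staging(cursor(select /*+ parallel(s 4) */ id from xx_staging_table s)));
    ```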

  • How to use BULK COLLECT in oracle forms

    hi gurus,
    I am using oracle forms
    Forms [32 Bit] Version 10.1.2.0.2 (Production)
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    I wanna use bulk collect from a database table, let's say <employees>.
    While working at the database level with collections and records it works very well for me, but when I try to use that technique in Oracle Forms it hits me with
    error 591: this feature is not supported in client side programming
    I know I can use cursors to loop through the records of oracle tables,
    but I'm more comfortable using collections and arrays,
    for example
    Set Serveroutput On
    Declare
          Type Rec_T Is Record (
           Empid Number ,
           Empname Varchar2(100)
          );
          Type V_R Is Table Of Rec_T Index By Binary_Integer;
          V_Array V_R;
    Begin
       Select Employee_Id , First_Name
       Bulk Collect
       Into V_Array
          From Employees; 
       For Indx In V_Array.First..V_Array.Last Loop
       Dbms_Output.Put_Line('employees id '||V_Array(Indx).Empid ||'and the name is '||V_Array(Indx).Empname);
       End Loop;      
         End;
    I wanna use this same approach in Oracle Forms, for certain purposes. Please guide me how I can use it...
    thanks...

    For information, you can use and populate a collection within the Forms application without using BULK COLLECT.
    Francois

    Actually I want to work with arrays and index-by tables,
    like
             record_type (variable , variable2);
             type type_name <record_type>  index by binary_integer
            type_variable type_name;
            and in main body of program
            select something
            bulk collect into type_variable
            from any_table;
           loop
                type_variable(indx).variable , type_variable(indx).variable2;
           end loop;
           this is very useful for my logic on which I am working
              like
              type_variable(indx).variable || type_variable(indx-1);
             if it's possible with cursors, then how can I use a cursor that fulfills this logic?
    @Francois
    if it's possible, then how can I populate without using bulk collect?
    thanks
    and for others replies: if I can use stored procedures please give me any example..
    thanks

  • Calling Stored procedure which uses Bulk Collect

    Hi All, I have Oracle stored procedure which uses Bulk Collect and returns table type parameter as output. Can anyone please help me how Can I call this kind of stored procedures which returns table type output using VB and Oracle's Driver. (I am successfully able to call using MS ODBC driver, but I want to use OraOLEDB driver.)

    861412 wrote:
    how Can I call this kind of stored procedures which returns table type output using VB and Oracle's Driver.

    This forum deals with the server-side languages SQL and PL/SQL.
    Your question deals with the client side and the Visual Basic language.

  • How to use Bulk Collect and Forall

    Hi all,
    We are on Oracle 10g. I have a requirement to read from table A and then for each record in table A, find matching rows in table B and then write the identified information in table B to the target table (table C). In the past, I had used two ‘cursor for loops’ to achieve that. To make the new procedure, more efficient, I would like to learn to use ‘bulk collect’ and ‘forall’.
    Here is what I have so far:
    DECLARE
    TYPE employee_array IS TABLE OF EMPLOYEES%ROWTYPE;
    employee_data  employee_array;
    TYPE job_history_array IS TABLE OF JOB_HISTORY%ROWTYPE;
    Job_history_data   job_history_array;
    BatchSize CONSTANT POSITIVE := 5;
    -- Read from File A
    CURSOR c_get_employees IS
             SELECT  Employee_id,
                       first_name,
                       last_name,
                       hire_date,
                       job_id
              FROM EMPLOYEES;
    -- Read from File B based on employee ID in File A
    CURSOR c_get_job_history (p_employee_id number) IS
             select start_date,
                      end_date,
                      job_id,
                      department_id
             FROM JOB_HISTORY
             WHERE employee_id = p_employee_id;
    BEGIN
        OPEN c_get_employees;
        LOOP
            FETCH c_get_employees BULK COLLECT INTO employee_data.employee_id.LAST,
                                                                              employee_data.first_name.LAST,
                                                                              employee_data.last_name.LAST,
                                                                              employee_data.hire_date.LAST,
                                                                              employee_data.job_id.LAST
             LIMIT BatchSize;
            FORALL i in 1.. employee_data.COUNT
                    Open c_get_job_history (employee_data(i).employee_id);
                    FETCH c_get_job_history BULKCOLLECT INTO job_history_array LIMIT BatchSize;
                             FORALL k in 1.. Job_history_data.COUNT LOOP
                                            -- insert into FILE C
                                              INSERT INTO MY_TEST(employee_id, first_name, last_name, hire_date, job_id)
                                                                values (job_history_array(k).employee_id, job_history_array(k).first_name,
                                                                          job_history_array(k).last_name, job_history_array(k).hire_date,
                                                                          job_history_array(k).job_id);
                                             EXIT WHEN job_ history_data.count < BatchSize                        
                             END LOOP;                          
                             CLOSE c_get_job_history;                          
                     EXIT WHEN employee_data.COUNT < BatchSize;
           END LOOP;
            COMMIT;
            CLOSE c_get_employees;
    END;
                     When I run this script, I get
    [Error] Execution (47: 17): ORA-06550: line 47, column 17:
    PLS-00103: Encountered the symbol "OPEN" when expecting one of the following:
       . ( * @ % & - + / at mod remainder rem select update with
       <an exponent (**)> delete insert || execute multiset save
       merge
    ORA-06550: line 48, column 17:
    PLS-00103: Encountered the symbol "FETCH" when expecting one of the following:
       begin function package pragma procedure subtype type use
       <an identifier> <a double-quoted delimited-identifier> form
       current cursor
    What is the best way to code this? Once I learn how to do this, I will apply the knowledge to the real application, in which file A would have around 200 rows and file B would have hundreds of thousands of rows.
    Thank you for your guidance,
    Seyed

    Hello BlueShadow,
    Following your advice, I modified a stored procedure that initially was using two cursor for loops to read from tables A and B to write to table C to use instead something like your suggestion listed below:
    INSERT INTO tableC
    SELECT …
     FROM tableA JOIN tableB on (join condition).
     I tried this change on a procedure writing to tableC with keys disabled. I will try this against the real table, which has a primary key and indexes, and report the result later.
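     The set-based rewrite described above can be sketched as follows (a minimal sketch; the column names and the join column are placeholders, since the original post elides them):

```sql
-- One set-based statement replaces the two nested cursor FOR loops:
-- the database performs the join and the insert in a single operation.
INSERT INTO tableC (id, col_a, col_b)
SELECT a.id, a.col_a, b.col_b
FROM   tableA a
       JOIN tableB b ON b.id = a.id;
```

     Letting the SQL engine do the join in one statement avoids the row-by-row context switches between SQL and PL/SQL entirely, which is usually faster than any cursor-loop variant.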
    Thank you very much,
    Seyed

  • Approach of using Bulk Collect

    Hi Experts,
     How can I use BULK COLLECT when the number of columns in the SELECT statement is uncertain?
    Master table structure:
    Create table tabmst
    (id number,
    cls_input varchar2(2000),
    price number);
     insert into tabmst values (1, 'select product, price from product', 500);
     insert into tabmst values (2, 'select product, price, purchase_dt from product', 100);
     insert into tabmst values (3, 'select * from product', 1000);
     Currently I want to store the SELECT statement from the cls_input column in a local variable, like dyn_qry := cls_input;, by using a cursor.
     Now my question is how to use BULK COLLECT with "Execute Immediate" into a bulk-collect variable, as there is no certainty about the number of columns the SELECT statement will return. Please suggest.
    Sample code:
     I created a TYPE variable, blk_var, to support the Bulk Collect:
     Declare
     dyn_qry varchar2(3000);
     cursor c1 is select * from tabmst;
     begin
     for i in c1 loop
     dyn_qry := i.cls_input;
     Execute immediate dyn_qry into blk_var;
     end loop;
     end;
     Now I want to store the values of the columns each SELECT statement returns when executed as dynamic SQL, but it is uncertain how many columns the dynamic SQL will return.
     Please suggest an approach on the same. Thanks in advance.

    >
     I don't think you can use bulk collect with EXECUTE IMMEDIATE. They do two different things. EXECUTE IMMEDIATE allows the execution of dynamic SQL. BULK COLLECT provides optimization of SELECT statements when loading the contents into collections. I am not aware of any support for BULK COLLECT with EXECUTE IMMEDIATE.
     You may be able to do this a different way. If you must use dynamic SQL (I suggest you don't unless it is absolutely necessary; dynamic SQL is hard to write, hard to debug, hard to maintain, and hard to tune), use a reference cursor instead. You can then use BULK COLLECT with the standard fetch.
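     The reference-cursor route suggested above can be sketched as follows (a minimal sketch; the product table and price column come from the question's own examples). Note that the collection you fetch into must still match the query's column list, so this addresses the bulk-fetch part, not the variable-column part:

```sql
DECLARE
    -- collection type matching the single column we fetch
    TYPE t_price_tab IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
    v_prices  t_price_tab;
    v_cur     SYS_REFCURSOR;
    v_dyn_qry VARCHAR2(3000) := 'select price from product';
BEGIN
    -- open a ref cursor over the dynamic statement ...
    OPEN v_cur FOR v_dyn_qry;
    LOOP
        -- ... then use an ordinary BULK COLLECT fetch against it
        FETCH v_cur BULK COLLECT INTO v_prices LIMIT 100;
        FOR i IN 1 .. v_prices.COUNT LOOP
            DBMS_OUTPUT.PUT_LINE(v_prices(i));
        END LOOP;
        EXIT WHEN v_cur%NOTFOUND;
    END LOOP;
    CLOSE v_cur;
END;
/
```

     When the column list truly varies per row of tabmst, the DBMS_SQL package (DESCRIBE_COLUMNS and related calls) is the usual way to discover it at run time.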

  • Commit / Rollback and errors when using BULK COLLECT

    Hi
     Where should I put the COMMIT when using BULK COLLECT for inserts or updates?
     How many records should I commit at a time?
    declare
    TYPE t_cd_estrutura_comercial IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_estrutura_comercial%TYPE   INDEX BY PLS_INTEGER;
    TYPE t_cd_tipo_estrutura_comercial IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_tipo_estrutura_comercial%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_nm_ciclo_operacional IS TABLE OF  t_ec_pessoa_perfil_ciclo.nm_ciclo_operacional%TYPE   INDEX BY PLS_INTEGER;
    TYPE t_cd_consultora IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_consultora%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_cd_perfil IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_perfil%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_cd_indicador IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_indicador%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_vl_indicador IS TABLE OF t_ec_pessoa_perfil_ciclo.vl_indicador%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_dt_ultima_atualizacao IS TABLE OF t_ec_pessoa_perfil_ciclo.dt_ultima_atualizacao%TYPE   INDEX BY PLS_INTEGER;
    v_cd_estrutura_comercial t_cd_estrutura_comercial;
    v_cd_tipo_estrutura_comercial t_cd_tipo_estrutura_comercial;
    v_nm_ciclo_operacional t_nm_ciclo_operacional;
    v_cd_consultora t_cd_consultora;
    v_cd_perfil t_cd_perfil;
    v_cd_indicador t_cd_indicador;
    v_vl_indicador t_vl_indicador;
    v_dt_ultima_atualizacao t_dt_ultima_atualizacao;
    V_CURSOR  SYS_REFCURSOR;
    n  pls_integer :=0;
    begin
      ---open v_cursor 
      pkg_scnob_batch_indicadores.abre_cursor(275,v_cursor);
       LOOP
       FETCH V_CURSOR   BULK COLLECT INTO
         v_cd_estrutura_comercial,
         v_cd_tipo_estrutura_comercial,
         v_nm_ciclo_operacional,
         v_cd_consultora,
         v_cd_perfil,
         v_cd_indicador,
         v_vl_indicador,
         v_dt_ultima_atualizacao  LIMIT 1000;
           FORALL i IN 1 .. v_cd_estrutura_comercial.COUNT
              MERGE
                 INTO T_EC_PESSOA_PERFIL_CICLO tgt
                 USING ( SELECT v_cd_estrutura_comercial(i) cd_estrutura_comercial,                           
                          v_cd_tipo_estrutura_comercial(i) cd_tipo_estrutura_comercial,                      
                         v_nm_ciclo_operacional(i) nm_ciclo_operacional,                             
                         v_cd_consultora(i) cd_consultora,                                    
                         v_cd_perfil(i)    cd_perfil,                                       
                         v_cd_indicador(i) cd_indicador,                                     
                         v_vl_indicador(i) vl_indicador,                                     
                         v_dt_ultima_atualizacao(i) dt_ultima_atualizacao FROM  dual ) src
                 ON   (   src.CD_ESTRUTURA_COMERCIAL            = TGT.CD_ESTRUTURA_COMERCIAL        
                          AND src.CD_TIPO_ESTRUTURA_COMERCIAL   = TGT.CD_TIPO_ESTRUTURA_COMERCIAL   
                          AND src.NM_CICLO_OPERACIONAL          = TGT.NM_CICLO_OPERACIONAL          
                          AND src.CD_CONSULTORA                 = TGT.CD_CONSULTORA                 
                          AND src.CD_PERFIL                     = TGT.CD_PERFIL                     
                          AND  src.CD_INDICADOR                 = TGT.CD_INDICADOR  )
              WHEN MATCHED
              THEN
                UPDATE
                 SET TGT.VL_INDICADOR  = src.VL_INDICADOR ,                             
                     TGT.DT_ULTIMA_ATUALIZACAO  = src.DT_ULTIMA_ATUALIZACAO                     
              WHEN NOT MATCHED
              THEN
                 INSERT (tgt.CD_ESTRUTURA_COMERCIAL,                           
                         tgt.CD_TIPO_ESTRUTURA_COMERCIAL,                      
                         tgt.NM_CICLO_OPERACIONAL,                             
                         tgt.CD_CONSULTORA,                                    
                         tgt.CD_PERFIL ,                                       
                         tgt.CD_INDICADOR,                                     
                         tgt.VL_INDICADOR,                                     
                         tgt.DT_ULTIMA_ATUALIZACAO)                            
                 VALUES (src.CD_ESTRUTURA_COMERCIAL,                           
                         src.CD_TIPO_ESTRUTURA_COMERCIAL,                      
                         src.NM_CICLO_OPERACIONAL,                             
                         src.CD_CONSULTORA,                                    
                         src.CD_PERFIL ,                                       
                         src.CD_INDICADOR,                                     
                         src.VL_INDICADOR,                                     
                         src.DT_ULTIMA_ATUALIZACAO) ;
                -- Should I commit here?
                --         commit;
           n := n + SQL%ROWCOUNT;
        EXIT WHEN  V_CURSOR%NOTFOUND; 
      END LOOP; 
    exception
       when others then
           rollback;  -- and log errors
    end;

     There are a lot of questions like this.
     Actually, there is no precise maximum number of records to be committed.
    The COMMIT
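     A common pattern, matching the LIMIT 1000 fetch in the question, is to commit once per processed batch rather than once per row (a sketch with illustrative src_table/tgt_table names, not a rule about commit frequency; committing inside the loop gives up all-or-nothing semantics, so the job must be restartable):

```sql
DECLARE
    CURSOR c_src IS SELECT id FROM src_table;   -- src_table is illustrative
    TYPE t_ids IS TABLE OF src_table.id%TYPE INDEX BY PLS_INTEGER;
    v_ids t_ids;
BEGIN
    OPEN c_src;
    LOOP
        FETCH c_src BULK COLLECT INTO v_ids LIMIT 1000;
        FORALL i IN 1 .. v_ids.COUNT
            INSERT INTO tgt_table (id) VALUES (v_ids(i));
        COMMIT;  -- one commit per 1000-row batch keeps each transaction small,
                 -- at the cost of losing atomicity across the whole load
        EXIT WHEN c_src%NOTFOUND;
    END LOOP;
    CLOSE c_src;
END;
/
```

     If a batch fails midway, everything committed so far stays committed, which is why a restartable design (e.g. tracking the last processed key) usually accompanies this pattern.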

  • How can I use the line input and mic at the same time to record to several tracks with SONAR

     I have been recording only the line input because SONAR 4 does not recognize the mic in. Help me!
     Thank you. Sorry for my English.

     LIMACAR wrote:
     How can I use the line input and mic at the same time to record to several tracks with SONAR 4? I have been recording only the line input because SONAR 4 does not recognize the mic in. Help me!
     Thank you. Sorry for my English.
     It depends on which soundcard you have there, but if your card is capable of:
     ASIO -> activate the I/O drivers on the Sonar/Options/Audio/Drivers tab (mic/line sources should be listed there --> activate) and select the mic and/or line sources in the tracks' "I" dialog
     WDM/KS -> use the Surround Mixer or the Windows recording controls for recording source selection (the mic should be listed there)
     MME32 -> same as w/ WDM/KS
     If you use Asio4All --> same as w/ WDM/KS & MME32
     No mic and line source simultaneously w/ WDM/KS and MME32.
     You may be able to do this w/ the kX drivers.
    jutapa
    ADDED:
     You can also install a modded version of the Audigy 2 drivers/software, but I have never done this w/ Live! 5, so I can't be sure if you get ASIO support for your card.
     Here are the instructions --> http://www.tech-pc.co.uk/audigy-2.php
     jutapa
     Message Edited by jutapa on 05-25-2006 02:48 PM

  • How to use BULK COLLECT in Oracle Forms 11g

     Forms shows the error "Feature is not supported in Client Side Program" when I try to implement Bulk Collect in Forms 11g.
     I need to load the full data from the DB into my form, because using a cursor is very slow.
     Is there any method/workaround to achieve this?

     declare
     type arr is table of emp%rowtype;
     lv_arr arr;
     begin
     select * bulk collect into lv_arr from emp;
     /* code written here to process the data and write it to a file */
     end;
     Unless you are inserting/updating the data you are holding in the array into a database table, I don't think there is much performance gain in using bulk collect in conjunction with writing a file. Bulk processing increases performance by minimizing context switches between the SQL and PL/SQL engines, nothing more, nothing less.
     In any case, bulk processing is not available in Forms; if you really need to make use of it, you need to do it in a stored procedure.
    cheers
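     The stored-procedure route mentioned above could look roughly like this (an illustrative sketch, not the poster's code: the emp table with empno/ename columns and the DATA_DIR directory object are assumptions, and UTL_FILE needs a directory object the schema can write to):

```sql
CREATE OR REPLACE PROCEDURE dump_emp IS
    TYPE t_emp_tab IS TABLE OF emp%ROWTYPE;
    v_rows t_emp_tab;
    v_file UTL_FILE.FILE_TYPE;
BEGIN
    -- one bulk fetch instead of a row-by-row cursor loop
    SELECT * BULK COLLECT INTO v_rows FROM emp;
    v_file := UTL_FILE.FOPEN('DATA_DIR', 'emp.txt', 'w');
    FOR i IN 1 .. v_rows.COUNT LOOP
        UTL_FILE.PUT_LINE(v_file, v_rows(i).empno || ',' || v_rows(i).ename);
    END LOOP;
    UTL_FILE.FCLOSE(v_file);
END dump_emp;
/
```

     The form then simply calls dump_emp over the database connection. For very large tables, a LIMIT-controlled fetch loop would keep PGA memory bounded instead of loading everything at once.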

  • Can I use HDMI port on my Macbook Pro for digital audio INPUT?

    Trying to find a way to get digital audio input to my new MBP with retina.  HDMI supports audio and video, so can I use that port as an input?

     It is not recommended to use Time Machine in a partition, either on the boot drive or on externals, because of problems in the past with that arrangement.
    You also need hardware protection in addition to software protection.
     If you want to do what I do, which is to keep a second 50% partition holding a bootable clone of the first:
     1: Use the Boot Camp software to create yourself a 50% sized partition, then exit the program. (You can also do it in Disk Utility, but it's tricky.)
     2: Head to Disk Utility, select the BOOTCAMP partition, and change its name and format to Mac OS Extended (Journaled).
     (To map out bad sectors in advance, use Disk Utility's Erase with the next-to-last security option on the right; this improves reliability and makes for faster reads.)
     3: Download Carbon Copy Cloner, then select your Macintosh HD partition and clone it to the second partition.
    4: Whenever you want to boot from the second partition, just hold the option key down on the keyboard to select it to boot from.
     You can access the folders on the second boot partition as well, to grab deleted files you accidentally erased (a rare thing for most people); however, you should otherwise leave it alone and keep it as a clone.
    You can update the clone when you need to, or schedule it to run automatically.
    You should also do this occasionally with a external drive, as the internal drive can die and take both partitions with it.
    I currently do this with my laptop, that way if I'm mobile and have a serious issue, I can boot from the cloned partition in seconds without carrying a drive around. If I need large space for something I wasn't aware of, I can erase the clone partition and later clone it again.
     Software does all the work; just pick a time you're not using the machine and let it run. Painless, really.
    Most commonly used backup methods
