Advice needed on Bulk Load activity

Hi,
I'm posting this problem here in the hope of getting the best possible suggestion for the scenario below.
Any thoughts will be much appreciated. Thanks in advance for your time.
I have a table with 150 million rows and four columns:
Table A
userId
productId
eventId
createDate
Now I have 50 million new rows to be merged/loaded.
Of these new rows, most (80%) belong to existing users.
I need to update the existing users (40 M) and insert the new users (10 M).
Can anyone suggest how to approach this?

It is going to be a routine activity. I did not receive information about the refresh frequency, so let's assume it is going to be once a week.
This cannot be done offline; multiple services access this data and their SLAs are strict.
I do like the idea of CTAS, but what I need is a merge into tableA using the new 50 million rows.
So how about these options?
SOLUTION A:
step 1) Load the new 50 million rows into a stage table, tableS.
step 2) Create a new empty tableB using CTAS.
step 3) INSERT /*+ APPEND PARALLEL(tableB, 12) */ INTO tableB
        SELECT /*+ PARALLEL(x 12) PARALLEL(y 12) */
               NVL(y.userId, x.userId),
               NVL(y.productId, x.productId),
               NVL(y.eventId, x.eventId),
               NVL(y.createDate, x.createDate)
        FROM tableA x FULL OUTER JOIN tableS y
          ON x.userId = y.userId;
step 4) Drop tableA.
step 5) Rename tableB to tableA.
SOLUTION B:
Partition exchange would definitely be an excellent option, but we need input on what basis we could partition.
If I partition by date, the users in the new file could be spread across many partitions, and it won't be possible to exchange a single partition.
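For comparison, the single-statement version of what I'm after would be a MERGE driven by the stage table. A minimal sketch, assuming tableS already holds the 50 million new rows and userId is the only join key (as in the FULL OUTER JOIN above):
MERGE INTO tableA a
USING tableS s
ON (a.userId = s.userId)
WHEN MATCHED THEN
  UPDATE SET a.productId  = s.productId,
             a.eventId    = s.eventId,
             a.createDate = s.createDate
WHEN NOT MATCHED THEN
  INSERT (userId, productId, eventId, createDate)
  VALUES (s.userId, s.productId, s.eventId, s.createDate);
Whether this can meet the SLAs at 50 million rows against a 150 million row table is exactly what I'm unsure about, hence the two options above.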

Similar Messages

  • Bulk create Active Directory Users and Groups in PowerShell using Excel XLSX source file instead of CSV

Hi Scripting Guy. I am a Server Administrator who is very familiar with Active Directory but new to PowerShell. Like many SysAdmins, I often need to create multiple accounts (ranging from 3-200) and add them to multiple groups (ranging
from 1-100). Previously I used VBS scripts in conjunction with an Excel .XLS file (not a CSV file). Since VBS is essentially out the door and PowerShell is in, I am having to re-create everything.
I have written a PowerShell script that bulk creates my users and adds them to their corresponding groups; however, it can only use a CSV file (NOT an XLS file). I understand that "CSV is much easier to use than Excel worksheets", but
most times I have three sets of nearly identical groups (for Dev, QA and Prod). Performing Search and Replace on the Excel template across all four Worksheets ensures the names used are consistent throughout the three environments.
I know each Excel Worksheet can be exported as a separate CSV file and the PowerShell scripts then used as is, but since I am not the only SysAdmin who will be using these, it leads to "unnecessary time lost", not to mention the reality that even
though you clearly state "These tabs need to be exported using this naming standard" (to work with the PowerShell scripts), that is not the result.
I've been tasked to find a way to modify my existing PowerShell/CSV scripts to work with Excel spreadsheets/workbooks instead, with no success so far. I have run across many articles/forums/scripts that let you update Excel or export AD data into an Excel
spreadsheet (even specifying the worksheet, column and row), but nothing for what I am trying to do.
I can't imagine that I am the ONLY person in this situation with this need, so I am hoping you can help. How do I modify my existing scripts to say "use this Excel spreadsheet, and this specific worksheet in the spreadsheet"
prior to performing the New-ADUser/Add-ADGroupMember commands?
    For reference, I am including Worksheet/Column names of my Excel Spreadsheet Template as well as the first part of my PowerShell script.  M-A-N-Y T-H-A-N-K-S in advance.
       Worksheet:  Accounts
         Columns: samAccountName, CN_DisplayName_Name, sn_LastName, givenName_FirstName, Password, Description, TargetOU
       Worksheets:  DevGroups / QAGroups / ProdGroups
         Columns:  GroupName, Members, MemberOf, Description, TargetOU
    # Load PowerShell Active Directory module
    Write-Host "Loading Active Directory PowerShell module." -foregroundcolor DarkCyan # -backgroundcolor Black
    Import-Module ActiveDirectory
    Write-Host " "
    # Set parameter for location of CSV file (so source file only needs to be listed once).
    $path = ".\CreateNewUsers-CSV.csv"
    # Import CSV file as data source for remaining script.
    $csv = Import-Csv -path $path | ForEach-Object {
    # Add '@saccounty.net' suffix to samAccountName for UserPrincipalName
$userPrincipal = $_."samAccountName" + "@saccounty.net"
    # Create and configure new AD User Account based on information from the CSV source file.
    Write-Host " "
    Write-Host " "
    Write-Host "Creating and configuring new user account from the CSV source file." -foregroundcolor Cyan # -backgroundcolor Black
    New-ADUser -Name $_."cn_DisplayName_Name" `
    -Path $_."TargetOU" `
    -DisplayName $_."cn_DisplayName_Name" `
    -GivenName $_."givenName_FirstName" `
    -SurName $_."sn_LastName" `
    -SamAccountName $_."samAccountName" `
-UserPrincipalName $userPrincipal `

    Here is the same script as a function:
Function Get-ExcelSheet {
    Param(
        $fileName = 'C:\scripts\test.xls',
        $sheetName = 'csv2'
    )
    # Note: the 'Microsoft.Jet.OLEDB.4.0' provider is 32-bit only; a 64-bit
    # PowerShell session will fail with "provider is not registered" (see the errors below).
    $conn = New-Object System.Data.OleDb.OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=$fileName;Extended Properties=Excel 8.0")
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "Select * from [$sheetName$]"
    $conn.Open()
    $cmd.ExecuteReader()
}
    It is called like this:
Get-ExcelSheet -fileName c:\temp\myfilename.xlsx -sheetName mysheet
    Do NOT change anything in the function and post the exact error.  If you don't have Office installed correctly or are running 64 bits with a 32 bit session you will have to adjust your system.
    ¯\_(ツ)_/¯
Hi JRV,
    My apologies for not responding sooner - I was pulled off onto another project this week.  I have included and called your Get-ExcelSheet function as best as I could...
    # Load PowerShell Active Directory module
    Write-Host "Loading Active Directory PowerShell module." -foregroundcolor DarkCyan # -backgroundcolor Black
    Import-Module ActiveDirectory
    Write-Host " "
    # JRV This Function Loads the Excel Reader
Function Get-ExcelSheet {
    Param(
        $fileName = 'C:\scripts\test.xls',
        $sheetName = 'csv2'
    )
    $conn = New-Object System.Data.OleDb.OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=$fileName;Extended Properties=Excel 8.0")
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "Select * from [$sheetName$]"
    $conn.Open()
    $cmd.ExecuteReader()
}
    # Set parameter for location of CSV file (so source file only needs to be listed once) as well as Worksheet Names.
    $sourceFile = ".\NewDocClass-XLS-Test.xlsx"
# NOTE: these assignments sit outside the ForEach-Object pipeline below, so $_
# has no value here; they would need to move inside the loop to take effect.
# Add '@saccounty.net' suffix to samAccountName for UserPrincipalName
$userPrincipal = $_."samAccountName" + "@saccounty.net"
# Combine SurName & GivenName for DisplayName
$displayName = $_."sn_LastName" + ". " + $_."givenName_FirstName"
    # JRV Call the Get-ExcelSheet function, providing FileName and SheetName values
    # Pipe the data from source for remaining script.
    Get-ExcelSheet -filename "E:\AD_Bulk_Update\NewDocClass-XLS-Test.xlsx" -sheetName "Create DocClass Accts" | ForEach-Object {
    # Create and configure new AD User Account based on information from the CSV source file.
    Write-Host " "
    Write-Host " "
    Write-Host "Creating and configuring new user account from the CSV source file." -foregroundcolor Cyan # -backgroundcolor Black
    New-ADUser -Name ($_."sn_LastName" + ". " + $_."givenName_FirstName") `
    -SamAccountName $_."samAccountName" `
-UserPrincipalName $userPrincipal `
    -Path $_."TargetOU" `
Below are the errors I get:
    Exception calling "Open" with "0" argument(s): "The 'Microsoft.Jet.OLEDB.4.0'
    provider is not registered on the local machine."
    At E:\AD_Bulk_Update\Create-BulkADUsers-XLS.ps1:39 char:6
    + $conn.open()
    + ~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : InvalidOperationException
    Exception calling "ExecuteReader" with "0" argument(s): "ExecuteReader
    requires an open and available Connection. The connection's current state is
    closed."
    At E:\AD_Bulk_Update\Create-BulkADUsers-XLS.ps1:40 char:6
    + $cmd.ExecuteReader()
    + ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : InvalidOperationException

  • Bulk loading BLOBs using PL/SQL - is it possible?

    Hi -
    Does anyone have a good reference article or example of how I can bulk load BLOBs (videos, images, audio, office docs/pdf) into the database using PL/SQL?
    Every example I've ever seen in PL/SQL for loading BLOBs does a commit; after each file loaded ... which doesn't seem very scalable.
    Can we pass in an array of BLOBs from the application, into PL/SQL and loop through that array and then issue a commit after the loop terminates?
    Any advice or help is appreciated. Thanks
    LJ

    It is easy enough to modify the example to commit every N files. If you are loading large amounts of media, I think that you will find that the time to load the media is far greater than the time spent in SQL statements doing inserts or retrieves. Thus, I would not expect to see any significant benefit to changing the example to use PL/SQL collection types in order to do bulk row operations.
    If your goal is high performance bulk load of binary content then I would suggest that you look to use Sqlldr. A PL/SQL program loading from BFILEs is limited to loading files that are accessible from the database server file system. Sqlldr can do this but it can also load data from a remote client. Sqlldr has parameters to control batching of operations.
    See section 7.3 of the Oracle Multimedia DICOM Developer's Guide for the example Loading DICOM Content Using the SQL*Loader Utility. You will need to adapt this example to the other Multimedia objects (ORDImage, ORDAudio .. etc) but the basic concepts are the same.
Once the binary content is loaded into the database, you will need to write a program to loop over the new content and initialize the Multimedia objects (extract attributes). The example in 7.3 contains a sample program that does this for the ORDDicom object.
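For the original commit-every-N question, here is a minimal PL/SQL sketch of the BFILE approach; the directory object MEDIA_DIR, the driver table load_list(file_name) and the target table media_tab(fname, content BLOB) are all hypothetical names:
DECLARE
  v_bfile BFILE;
  v_blob  BLOB;
  v_count PLS_INTEGER := 0;
BEGIN
  FOR f IN (SELECT file_name FROM load_list) LOOP
    v_bfile := BFILENAME('MEDIA_DIR', f.file_name);
    -- insert an empty LOB and get its locator back
    INSERT INTO media_tab (fname, content)
    VALUES (f.file_name, EMPTY_BLOB())
    RETURNING content INTO v_blob;
    DBMS_LOB.FILEOPEN(v_bfile, DBMS_LOB.FILE_READONLY);
    DBMS_LOB.LOADFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile));
    DBMS_LOB.FILECLOSE(v_bfile);
    v_count := v_count + 1;
    IF MOD(v_count, 100) = 0 THEN
      COMMIT;  -- commit every 100 files instead of after each one
    END IF;
  END LOOP;
  COMMIT;  -- final commit for the last partial batch
END;
/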

  • Load html files onto repository by bulk loader

hi
My subject is self-explanatory.
I want to load HTML files. I created one HTML file (for example test.html), then created test.html.md.properties and
wrote this in it:
nodeType = Ad
But when I run the bulk loader script it says:
java.rmi.RemoteException: null is not defined. It must be, in order to create node: /<parent>/test.html
please help me

    Hi;
I've created a webservice proxy (Stub) in JDeveloper 9.0.3 pre. This class is a proxy for a webservice, and it works fine in JDeveloper, but I need to upload this class to an Oracle 8i 8.1.7.3 database; the database must be the client for this webservice. When I try to create a java source on the database it gives me some compilation errors, because it can't find some classes (XERCES, SOAP, and JAXP) that are needed by the webservice proxy (Stub). So my question is: how do I load these classes into the DB? I've found a document on OTN that says (http://otn.oracle.com/tech/java/jsp/content.html -> Unleash the Power of Java Stored Procedures):
    "Database as Active Web Service Client
    A Java stored procedure can be a Web service requester. Attaching such a stored
    procedure as a trigger allows the database to automatically invoke external Web
    services upon data-driven events. Non-Java modules such as PL/SQL procedures
    and DBMS packages can be encapsulated with a Java wrapper that will invoke
    external Web services, on their behalf. Here are the steps for a proof of concept:
    ' Load XERCES, SOAP, and JAXP jar files.
    ' Grant connect and resolve to the SOAP server/port.
    ' Load the SOAP client Java class.
    ' Create a PL/SQL package to wrap the SOAP client as
    trigger.
    The complete code sample and instructions will be posted on the Oracle OTN
    Web site."
It's dated June 2002; could someone give me a hint on how to do it?
    Thanks
Mario A Chavez
Sorry, due to lack of bandwidth, it took us longer than we thought to provide this. As I have just responded in the OracleJVM forum, we will be posting the complete demo/code sample, using Apache SOAP client libs, in a couple of weeks. The demo is working; we are documenting it.
Drop me an email ([email protected]) if you can't wait.
Kuassi

  • Bulk load of user security details

    Hi,
Any tips on how to go about a bulk load of user details in BPC? We have a list of users and their required authorisations. Maybe using scripts or something else? I know using DTS is one way, but I'm trying to find out if there is any workaround.

My advice to customers and partners is to always build a security matrix in Excel to determine all the assignments. The matrix helps to determine whether you have captured all the correct teams, task assignments, and access. Try to set up users without any access first, then set up Member Access profiles and Task profiles. Bring this all together via the TEAM assignment, since a team may have only one Task profile but multiple Member Access profiles. While building security may take time, there are methods to minimize current and future maintenance after the initial setup. Plus, once you set up the process via the admin console, you can see in the table structures just how complex the assignments are for each of the components. Once the tables are set, I still believe (and I may be wrong) that an admin needs to process at a minimum the TEAMS from the admin console, to establish the connections required by users for access.
    Hope this helps.

  • How to UPDATE a big table in Oracle via Bulk Load

    Hi all,
in a datastore target on Oracle 11g, I have a big table with 300 million records; the structure is one integer key + 10 attribute columns.
In the IQ source I have the same table with the same size; the structure is one integer key + 1 attribute column.
What I need to do is UPDATE that single field in Oracle from the values stored in IQ.
Any idea how to organize the dataflow and the target writing mode efficiently? Bulk load? API?
    thank you
    Maurizio

    Hi,
You cannot do a bulk load when you need to UPDATE a field, because all a bulk load does is append records to your table.
Since you have to UPDATE a field, I would suggest going for SCD with
source > TC (Table Comparison) > MO (Map Operation) > KG (Key Generation) > target
    Arun
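If the IQ data can first be landed in an Oracle staging table, the update itself can then be a single set-based statement. A sketch with hypothetical names (big_table is the 300 million row target, iq_stage the landed IQ data):
MERGE INTO big_table t
USING iq_stage s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.attr1 = s.attr1;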

  • Bulk Insert Task: Cannot bulk load because the file could not be opened. Operating system error code 3 (The system cannot find the path specified.)

I get the following error after I changed the path in the config file from
    \\vs01\d$\\Deployment\Files\temp.txt
    to
    C:\Deployment\Files\temp.txt
    [Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "C:\Deployment\Files\temp.txt" could not be opened. Operating system error code 3(The system cannot find the path specified.).". 

I think I know what's going on. The Bulk Insert task runs by executing a SQL command (BULK INSERT) internally on the target SQL Server to load the file. This means the service account of the target SQL Server must have permissions on the file you are trying to load. It also means you need to use a UNC path to specify the file location (if the target server is on a different machine).
    Also from BOL (see section Usage Considerations - last bullet point)
    http://msdn.microsoft.com/en-us/library/ms141239.aspx
    * Only members of the sysadmin fixed server role can run a package that contains a Bulk Insert task.
    Make sure you take care of this as well.
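For reference, a sketch of what the task ends up running on the target server; the table name and share are assumptions based on the paths above:
BULK INSERT dbo.Temp
FROM '\\vs01\d$\Deployment\Files\temp.txt'  -- UNC path reachable by the target SQL Server service account
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');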
    HTH
    ~Mukti

  • Bulk loading of Customer data into Application

    Hi Guys,
I am working on the development of the Teleservice module on a new instance.
Now I need to migrate the data on the old instance to the new instance.
Please let me know whether I have to use only APIs to create the customers in Apps or whether I can bulk load into the seeded tables directly.
This has to include Service Requests data as well.
Please let me know if there is any integration violation if we go with bulk loading the data directly.

You do not need to develop code for loading customer data anymore. Oracle has provided the Bulk Import functionality in 11.5.8 for importing customer information (using the Oracle Customers Online/Oracle Data Librarian modules). If you would like to create accounts in addition to customer parties, you will have to use the TCA V2 APIs or the Customer Interface program. For migrating the service requests, I guess the only option is to use APIs. HTH, Venit

  • PL/SQL Bulk Loading

    Hello,
I have one question regarding bulk loading. I have done a lot of bulk loading.
But my requirement is to call a function which does some DML operation and gives back a ref key so that I can insert into the fact table.
I can't use a DML function in a select statement (that gives an error). The other way is an autonomous transaction, which I tried and it works, but performance is very slow.
How do I call this function inside the bulk loading process?
Help !!
xx_f is the function using the autonomous transaction.
See my sample code:
declare
  cursor c1 is select a, b, c from xx;
  type l_a is table of xx.a%type;
  type l_b is table of xx.b%type;
  type l_c is table of xx.c%type;
  v_a l_a;
  v_b l_b;
  v_c l_c;
begin
  open c1;
  loop
    fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
    forall i in 1..v_a.count
      insert into xxyy (a, b, c)
      values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
    commit;
    exit when c1%notfound;  -- checked after the insert so the final partial batch is not lost
  end loop;
  close c1;
end;
I just want to call the xx_f function without the autonomous transaction, but with bulk loading. Please let me know if you need more details.
    Thanks
    yreddyr

    Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
    Depending on what it does, an alternative could be something like:
    DECLARE
       CURSOR c1 IS
          SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
       TYPE l_a IS TABLE OF whatever xx_f returns;
       TYPE l_b IS TABLE OF whatever xx_f returns;
       TYPE l_c IS TABLE OF whatever xx_f returns;
       v_a l_a;
       v_b l_b;
       v_c l_c;
    BEGIN
       OPEN c1;
       LOOP
          FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
          BEGIN
             FORALL i IN 1..v_a.COUNT
                INSERT INTO xxyy (a, b, c)
                VALUES (v_a(i), v_b(i), v_c(i));
          END;
          EXIT WHEN c1%NOTFOUND;
       END LOOP;
       CLOSE c1;
END;
John

  • CBO madness after bulk loading

This is an extension of my other recent posts, but I felt it deserved its own space.
    I have a table of telephone call records, one row for each telephone call made or received by a customer. Our production table has a 10-field PK that I want to destroy. In my development version, the PK for this table is a compound key on LOC, CUST_NO, YEAR, MONTH, and SEQ_NO. LOC is a char(3), the rest are numbers.
    After a bulk load into a new partition of this table, a query with these 5 fields in the where clause chooses a second index. That second index includes LOC, YEAR, MONTH, and two other fields not in the PK nor in the query. The production instance does the same thing, and I was certain that having the 5-field PK would be the magic bullet.
    Oracle SQL Developer's autotrace shows a "Filter Predicates" on CUST_NO and SEQ_NO, and then the indexed range scan on the other 3 fields in the second index. Still noteworthy is that query on just LOC, CUST_NO, YEAR and MONTH does use the PK.
    Here are the steps I've taken to test this:
    1. Truncate the partition in question
    2. Drop old PK constraint/index
    3. Create new PK constraint/index
    4. Gather table stats with cascade=>TRUE
    5. Bulk load data (in this case, 1.96 million rows) into empty partition
    6. autotrace select query
    7. Write to dizwell in tears
This table also has two other partitions for the past two cycles, each with around 30 million rows.
    Yes, gathering table stats again makes things behave as expected, but that takes a fair bit of time. For the meantime we've put an index hint in the application query that was suffering the most.

"First, the CBO doesn't actually choose a full table scan, it chooses to use a second index."
Depending on the query, of course. If the CBO thinks a partition is empty, I would suspect that it would find it most efficient to scan the smallest index, and the second index, with fewer columns, would be expected to be smaller. If it thinks they are equally costly, I believe it will use the one that was created first, though I wouldn't want to depend on that sort of failure.
"I've lowered the sample percentage to 10% and set CASCADE to FALSE and it still takes 45 minutes in production. The staging table was something I was considering. Are statistics included in partition exchange? I've asked that question before but never saw an answer."
Yes, partition-level statistics will be included. Table-level statistics will be automatically adjusted. From the SQL Reference:
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_3001.htm#i2131250
    "All statistics of the table and partition are exchanged, including table, column, index statistics, and histograms. Oracle Database recalculates the aggregate statistics of the table receiving the new partition."
    You could also just explicitly set table-level statistics, assuming you don't need too many histograms, possibly gathering statistics for real later on.
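A sketch of setting statistics explicitly on the freshly loaded partition; the owner, table and partition names are hypothetical, and the numbers would come from the load itself:
BEGIN
  DBMS_STATS.SET_TABLE_STATS(
    ownname  => USER,
    tabname  => 'CALL_RECORDS',
    partname => 'P_CURRENT_CYCLE',
    numrows  => 1960000,   -- row count known from the bulk load
    numblks  => 60000);    -- rough estimate
END;
/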
    Justin

  • How to improve performance for Azure Table Storage bulk loads

    Hello all,
    Would appreciate your help as we are facing a challenge.
We are trying to bulk load Azure table storage. We have a file that contains nearly 2 million rows.
We need to reach a point where we can bulk load 100,000-150,000 entries per minute. Currently, it takes more than 10 hours to process the file.
    We have tried Parallel.Foreach but it doesn't help. Today I discovered Partitioning in PLINQ. Would that be the way to go??
    Any ideas? I have spent nearly two days in trying to optimize it using PLINQ, but still I am not sure what is the best thing to do.
    Kindly, note that we shouldn't be using SQL/Azure SQL for this.
    I would really appreciate your help.
    Thanks

    I'd think you're just pooling the parallel connections to Azure, if you do it on one system.  You'd also have a bottleneck of round trip time from you, through the internet to Azure and back again.
    You could speed it up by moving the data file to the cloud and process it with a Cloud worker role.  That way you'd be in the datacenter (which is a much faster, more optimized network.)
    Or, if that's not fast enough - if you can split the data so multiple WorkerRoles could each process part of the file, you can use the VM's scale to put enough machines to it that it gets done quickly.
    Darin R.

  • Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).

    Hi,
    I have a file where fields are wrapped with ".
    =========== file sample
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    ==========
I have a .NET method to remove the wrap characters and write out a file without them.
    ======================
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    ======================
    the .net code is here.
    ========================================
public static string RemoveCharacter(string sFileName, char cRemoveChar)
{
    object objLock = new object();
    FileStream objInputFile = null, objOutFile = null;
    lock (objLock)
    {
        // Build the output path once; the original computed a second, different
        // Guid in the return statement, so the returned name never matched the file written.
        string sOutFile = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
        try
        {
            objInputFile = new FileStream(sFileName, FileMode.Open);
            objOutFile = new FileStream(sOutFile, FileMode.Create);
            int nByteRead;
            // Copy byte by byte, dropping every occurrence of the character to remove.
            while ((nByteRead = objInputFile.ReadByte()) != -1)
                if (nByteRead != (int)cRemoveChar)
                    objOutFile.WriteByte((byte)nByteRead);
        }
        finally
        {
            objInputFile.Close();
            objOutFile.Close();
        }
        return sOutFile;
    }
}
    ==================================
However, when I run the bulk load utility I get this error:
    =======================================
    Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
    ==========================================
    the bulk insert statement is as follows
    =========================================
BULK INSERT Temp
FROM '<file name>'
WITH (FIELDTERMINATOR = ',', KEEPNULLS);
    ==========================================
Does anybody know what is happening and what needs to be done?
    PLEASE HELP
    Thanks in advance 
    Vikram

    To load that file with BULK INSERT, use this format file:
    9.0
    4
    1 SQLCHAR 0 0 "\""      0 ""    ""
    2 SQLCHAR 0 0 "\",\""   1 col1  Latin1_General_CI_AS
    3 SQLCHAR 0 0 "\",\""   2 col2  Latin1_General_CI_AS
    4 SQLCHAR 0 0 "\"\r\n"  3 col3  Latin1_General_CI_AS
Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
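Hooked up to the original statement, that would look something like this (format file path hypothetical):
BULK INSERT Temp
FROM '<file name>'
WITH (FORMATFILE = 'C:\formats\quoted.fmt', KEEPNULLS);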
    Or, since you already have a .NET program, use a stored procedure with table-valued parameter instead. I have an example of how to do this here:
    http://www.sommarskog.se/arrays-in-sql-2008.html
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Bulk Load question for an insert statement.

I'm looking to put the following statement into a FORALL statement using BULK COLLECT and I need some guidance.
Am I going to put the SELECT statement into a cursor and then load the cursor values into a variable of a defined nested table type?
    INSERT INTO TEMP_ASSOC_CURRENT_WEEK_IDS
    SELECT aor.associate_office_record_id ,
    sched.get_assoc_sched_rotation_week(aor.associate_office_record_id, v_weekType.start_date) week_id
    FROM ASSOCIATE_OFFICE_RECORDS aor
    WHERE aor.OFFICE_ID = v_office_id
    AND (
    (aor.lt_assoc_stage_result_id in (4,8)
    AND v_officeWeekType.start_date >= trunc(aor.schedule_start_date)
    OR aor.lt_assoc_stage_result_id in (1, 2)
    ));

    I see people are reading this so for the insanely curious here's how I did it.
Type AOR_REC is RECORD(
  associate_office_record_id dbms_sql.number_table,
  week_id                    dbms_sql.number_table); -- RJS. *** Type for use with BULK COLLECT / FORALL statements.
v_a_rec AOR_REC; -- RJS. *** Variable of the defined type.
CURSOR cur_aor_ids IS -- RJS. *** Cursor for BULK COLLECT.
  SELECT aor.associate_office_record_id associate_office_record_id,
         sched.get_assoc_sched_rotation_week(aor.associate_office_record_id, v_weekType.start_date) week_id
  FROM ASSOCIATE_OFFICE_RECORDS aor
  WHERE aor.OFFICE_ID = v_office_id
  AND (
        (aor.lt_assoc_stage_result_id in (4,8)
         AND v_officeWeekType.start_date >= trunc(aor.schedule_start_date))
        OR aor.lt_assoc_stage_result_id in (1, 2)
      )
  FOR UPDATE NOWAIT;
BEGIN
  OPEN cur_aor_ids;
  LOOP
    FETCH cur_aor_ids BULK COLLECT INTO
      v_a_rec.associate_office_record_id, v_a_rec.week_id; -- RJS. *** Bulk load the cursor data into a buffer.
    FORALL i IN 1..v_a_rec.associate_office_record_id.COUNT SAVE EXCEPTIONS
      INSERT INTO TEMP_ASSOC_CURRENT_WEEK_IDS
        (associate_office_record_id, week_id)
      VALUES
        (v_a_rec.associate_office_record_id(i), v_a_rec.week_id(i)); -- RJS. *** Single FORALL bulk INSERT statement.
    EXIT WHEN cur_aor_ids%NOTFOUND;
  END LOOP;
  CLOSE cur_aor_ids;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('ERROR ENCOUNTERED IS SQLCODE = ' || SQLCODE || ' AND SQLERRM = ' || SQLERRM);
    dbms_output.put_line('Number of INSERT statements that failed: ' || SQL%BULK_EXCEPTIONS.COUNT);
END;
    Easy right?

  • Retry "Bulk Load Post Process" batch

    Hi,
First question: what is the actual use of the scheduled task "Bulk Load Post Process"? If I am not sending out email notifications, not syncing to LDAP, and not generating passwords, do I still need to run this task after performing a bulk load through the utility?
    Also, I ran this task, now there are some batches which are in the "READY FOR PROCESSING" state. How do I re-run these batches?
    Thanks,
    Vishal

    The scheduled task carries out post-processing activities on the users imported through the bulk load utility.

  • OIM Bulk Load: Insufficient privileges

    Hi All,
    I'm trying to use the OIM Bulk Load Utility and I keep getting this error message:
    Exception in thread "main" java.sql.SQLException: ORA-01031: insufficient privileges
    ORA-06512: at "OIMUSER.OIM_BLKLD_SP_CREATE_LOG", line 39
    ORA-06512: at "OIMUSER.OIM_BLKLD_PKG_USR", line 281
    I've followed the instructions and gone over everything a few times. The utility tests the connection to the database OK.
I don't know much about Oracle databases, so I am not sure how to do even basic troubleshooting. Could I just give my OIMUSER full permissions? Shouldn't it have full permissions as it is?
I did have to create a tablespace for this utility; maybe the OIMUSER needs to be given access to it? I have no idea....
    Any help would be greatly appreciated!
    Alex

I got the same error; at that time the OIM DB user had the following permissions:
    CREATE TABLE
    CREATE VIEW
    QUERY REWRITE
    UNLIMITED TABLESPACE
    EXECUTE ON SYS.DBMS_SHARED_POOL
    EXECUTE ON SYS.DBMS_SYSTEM
    SELECT ON SYS.DBA_2PC_PENDING
    SELECT ON SYS.DBA_PENDING_TRANSACTIONS
    SELECT ON SYS.PENDING_TRANS$
    SELECT ON SYS.V$XATRANS$
    CONNECT
    RESOURCE
Later the DBA provided the following additional permissions and it worked like a charm:
    CREATE ANY INDEX  
    CREATE ANY SYNONYM  
    CREATE ANY TRIGGER  
    CREATE ANY TYPE  
    CREATE DATABASE LINK  
    CREATE JOB  
    CREATE LIBRARY  
    CREATE MATERIALIZED VIEW  
    CREATE PROCEDURE  
    CREATE SEQUENCE  
    CREATE TABLE  
    CREATE TRIGGER  
    CREATE VIEW  
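Applied by a DBA as a single statement, that second list amounts to something like this (user name taken from the error text above):
GRANT CREATE ANY INDEX, CREATE ANY SYNONYM, CREATE ANY TRIGGER, CREATE ANY TYPE,
      CREATE DATABASE LINK, CREATE JOB, CREATE LIBRARY, CREATE MATERIALIZED VIEW,
      CREATE PROCEDURE, CREATE SEQUENCE, CREATE TABLE, CREATE TRIGGER, CREATE VIEW
TO oimuser;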
