Realising SQL*Loader-style bulk load functionality through ALSB

Hi All,
I want to replace a Bourne shell script that FTPs a file of raw data from a certain location and uses SQL*Loader to load the raw data into an Oracle database. An Oracle control file splits the data into discrete chunks and inserts each into the correct column within a table.
From looking through the ALSB documentation, using the FTP transport seems straightforward. However, the options for uploading the raw data to the database seem overkill for my needs. I do not need to modify the data prior to inserting it into the database.
I can see that there are 3 options available to me:
(1) Use XQuery to parse the raw data, split it into fields, and use fn-bea:execute-sql to insert the data into the database.
(2) Use a Java callout, again parse the raw data and build up a POJO or collection, and then persist the data using an object-relational mapping tool or vanilla JDBC.
(3) Use a Java callout to execute SQL*Loader.
Can anyone think of any other options? For example, is it possible to invoke SQL*Loader from ALSB other than through a callout?
Thanks in advance,
Colin
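For what it's worth, option (3) ultimately just shells out to sqlldr, whether from a Java callout or any other external-process mechanism. A minimal sketch of assembling that command line (Python purely for illustration; the connect string, control file, and data file names are invented):

```python
import subprocess

def build_sqlldr_command(userid, control_file, data_file, log_file):
    # Assemble the sqlldr invocation an external-process callout would run.
    # userid/control/data/log/errors are standard SQL*Loader parameters.
    return [
        "sqlldr",
        f"userid={userid}",
        f"control={control_file}",
        f"data={data_file}",
        f"log={log_file}",
        "errors=0",  # fail fast on any bad record
    ]

cmd = build_sqlldr_command("scott/tiger@orcl", "load_raw.ctl",
                           "raw_data.dat", "load_raw.log")
# subprocess.run(cmd, check=True)  # only where sqlldr is actually installed
```

Whether this is launched from a Java callout (ProcessBuilder) or a wrapper script, the moving parts are the same: a control file describing the field layout and the raw data file.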

You can use fn-bea:execute-sql() to insert data into the database.
You have to write an Oracle function and declare: PRAGMA AUTONOMOUS_TRANSACTION;
After that, inside the Oracle function you can have the insert statements or invoke a stored procedure to insert the data.
In your XQuery you will then have to select from your Oracle function, e.g.: select functionname('datatobeinserted') from dual;
Regards,
Fabio Douek

Similar Messages

  • User Interface for bulk loading images using interMedia

    I would like to create an interface where users could bulk load images to a database. Has anyone created a web (or other) interface that would perhaps call a PL/SQL procedure or SQL*Loader?
    Is there a way for users to upload images from their own computers in bulk? Would they need to utilize SQL*Plus?
    While I have seen the examples and plan to create a web interface for uploading images one at a time, I have been asked to find a way for the users to upload images in bulk themselves (instead of requesting us technical people to do it).
    Thanks for any suggestions.
    Judy

    Originally posted by Simon Oxbury:
    Hi,
    There's a sample on OTN that discusses loading multimedia data in bulk into the interMedia types using both SQL*Plus (with PL/SQL) and SQL*Loader. Check out the following URL: http://otn.oracle.com/sample_code/products/intermedia/htdocs/avi_bulk_loading.html
    One major difference to consider between SQL*Loader and SQL*Plus (with PL/SQL) is that SQL*Loader can load data from files on the machine running SQL*Loader, which may be a different machine than the database, although it still needs an Oracle installation. SQL*Plus with PL/SQL, by contrast, can load data only from directories that are accessible to the database server and that have been defined in the server using the CREATE DIRECTORY command, which requires privileges. Also note there are restrictions and issues specific to both NT and Unix when it comes to accessing network directories from the server.
    If SQL*Loader looks like a possibility, you might want to think about a simple Java program, Perl script, or some such, to create the SQL*Loader scripts. On the other hand, if you get into Java, then you could use Java to do the upload and, at the same time, provide some level of application-specific user interaction and/or error reporting, etc. It's easy to get a list of file names in a directory using the File.list or File.listFiles methods in Java. On the other hand, if we're talking LOTS of files, then SQL*Loader may turn out to be more efficient.
    In order to better understand the variety of ways in which our customers are using interMedia, we'd be very interested in knowing a little more about your application, the interMedia functionality that you are using, and how you are developing and deploying your application. If you are able to help us, please send a short email message with some information about your application, together with any comments you may have, to the Oracle interMedia Product Manager, Joe Mauro, at [email protected]. Thank you!
    Regards,
    Simon
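    Simon's suggestion of scripting the creation of the SQL*Loader input is easy to sketch. Here is a stand-in (Python rather than Java/Perl, purely illustrative; the "id,filepath" layout is an assumption that would pair with a control file whose LOB column uses a LOBFILE clause):

```python
import os
import tempfile

def make_sqlldr_data_lines(directory):
    # List a directory and emit one "id,filepath" line per file -- the data
    # file a LOBFILE-style SQL*Loader control file would consume.
    lines = []
    for i, name in enumerate(sorted(os.listdir(directory)), start=1):
        lines.append(f"{i},{os.path.join(directory, name)}")
    return lines

# tiny self-contained demo over a throwaway directory
with tempfile.TemporaryDirectory() as d:
    for name in ("a.jpg", "b.jpg"):
        open(os.path.join(d, name), "w").close()
    demo_lines = make_sqlldr_data_lines(d)
```

    As Simon notes, the same directory walk in Java is a one-liner with File.listFiles.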

  • Initial load of articles through ARTMAS05

    Hi Retail experts, I need to build IDocs to perform the initial load of articles through ARTMAS05.
    Although we tried to use an IDoc from a DEMO system as a template, we couldn't get a successful IDoc so far. The function module we are using is BAPI_IDOC_INPUT1, with IDoc type ARTMAS05.
    Does anybody have a guideline to set this up?
    Thanks in advance.

    I would welcome Bjorn's input on this, but, generally I accomplish this using LSMWs. I use SWO1 to explore the business object, but use LSMW (Legacy System Migration Workbench) to mass load. In the case of LSMW, you simply call the transaction LSMW.
    - From here, define a project, subproject and object. (Eg: Project = INITLOAD {initial data load}, Subproject = LOMD {logistics master data}, object = ARTMAS {article master}).
    - Maintain the object attributes. Here, you can choose from four options: standard/batch input, batch input recording, Business Object Method (BAPI) or IDoc. Choose the Business Object method, use object BUS1001001 and method CLONE.
    - Define your source structure. Here, you will lay out what the input file's STRUCTURE is going to look like (not the fields). Since it's ARTMAS, it's not realistic to put all data into a single row in a text file, so you will likely use a structured input file - especially for variants, site-specific and sales-org-specific data.
    - Define your source fields. Here you will define the fields that are in your input file and assign them to your input structure. A lot of work goes into this step. Note - I would try to use names very close to the SAP names, since there is an automapping tool. Also, you can copy SAP table structures into your field structures, which is very helpful if you plan to use say 75 - 80 percent of the fields of a particular structure.
    - Maintain structure relations. You will assign your input structures to the corresponding ARTMAS structures in this step.
    - Map the fields and maintain conversion rules. Here you assign your input fields to the ARTMAS fields. Also, you can code ABAP in this step for conversion/translation purposes. It depends on your chosen ETL or ETCL or ECTL or CETL methodology (E = Extract, C = Cleanse, T = Transform, L = Load) on whether you will write ABAP conversion rules in this step.
    - Specify input files. This is where the data resides in its text file input format. Typically, you will use a small data set that sits on your PC to test it; then for a mass load, create a server-side directory on the SAP server, place the input file there, and use it from there. This speeds processing for large files considerably.
    - Assign files. Here you assign the previously specified input file to an input structure
    - Read data. This actually reads the data so that you can see how it's going to come in.
    - Convert data. This creates a pseudo-IDoc: it is not an IDoc yet, but it is in IDoc format.
    - Start IDoc generation. This converts the converted file into a true IDoc.
    - Start IDoc processing. Here, your IDoc moves from 64 status to (hopefully) 53 status.
    Well, I hope this helps, and I would be interested in Bjorn's input. Also, Bjorn, what did you mean by the WRKKEY comment? I've never heard or seen a reference to this.

  • Bulk loading of Customer data into Application

    Hi Guys,
    I am working on the development of the Teleservice module on a new instance.
    Now I need to migrate the data on the old instance to the new instance.
    Please let me know whether I have to use only APIs to create the customers in Apps, or whether I can bulk load directly into the seeded tables.
    This also has to include Service Request data.
    Please let me know if there is any integration violation if we go with bulk loading the data directly.

    You do not need to develop code for loading customer data anymore. Oracle has provided the Bulk Import functionality in 11.5.8 for importing customer information (using the Oracle Customers Online/Oracle Data Librarian modules). If you would like to create accounts in addition to customer parties, you will have to use the TCA V2 APIs or the customer interface program. For migrating the service requests, I guess the only option is to use APIs. HTH, Venit

  • PL/SQL Bulk Loading

    Hello,
    I have one question regarding bulk loading. I have done a lot of bulk loading.
    But my requirement is to call a function which will do some DML operations and return a ref key so that I can insert into the fact table.
    I can't use a DML function in a select statement (that will give an error). The other way is using an autonomous transaction, which I tried and it works, but performance is very slow.
    How do I call this function inside the bulk loading process?
    Help !!
    xx_f is a function which uses an autonomous transaction.
    See my sample code:
    declare
       cursor c1 is select a, b, c from xx;
       type l_a is table of xx.a%type;
       type l_b is table of xx.b%type;
       type l_c is table of xx.c%type;
       v_a l_a;
       v_b l_b;
       v_c l_c;
    begin
       open c1;
       loop
          fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
          forall i in 1..v_a.count
             insert into xxyy (a, b, c)
             values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
          commit;
          exit when c1%notfound; -- after processing, so the last partial batch is not lost
       end loop;
       close c1;
    end;
    I just want to call the xx_f function without an autonomous transaction, but with bulk loading. Please let me know if you need more details.
    Thanks
    yreddyr

    Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
    Depending on what it does, an alternative could be something like:
    DECLARE
       CURSOR c1 IS
          SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
       TYPE l_a IS TABLE OF whatever xx_f returns;
       TYPE l_b IS TABLE OF whatever xx_f returns;
       TYPE l_c IS TABLE OF whatever xx_f returns;
       v_a l_a;
       v_b l_b;
       v_c l_c;
    BEGIN
       OPEN c1;
       LOOP
          FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
          BEGIN
             FORALL i IN 1..v_a.COUNT
                INSERT INTO xxyy (a, b, c)
                VALUES (v_a(i), v_b(i), v_c(i));
          END;
          EXIT WHEN c1%NOTFOUND;
       END LOOP;
       CLOSE c1;
    END;
    John

  • Bulk loading Watchfolders settings

    I'm looking for a way to load a pre-defined list of watchfolders and their output locations into AME's watchfolder window, without doing it one by one (I have 150 different watchfolders).
    Currently I have to load each watchfolder one at a time, defining its outputs and locations. I have a list of 150 watchfolders and their respective output locations, and I need to load this into Adobe Media Encoder through a bulk method.
    Is there a way I can change the Watchfolder.dll file, or an equivalent way of loading these settings into AME? This would save me hours at a time.
    AME currently loses some of its watchfolders if AME crashes or closes, and then on relaunching the application some aren't showing up.
    Any help in this area would be appreciated
    AME CC 2014

    Hello Mike,
    you might try to check that your JREPSVR is started in read/write mode (not read-only,
    which is the default). You should have a -W in the CLOPT (after --), and you should
    see something like this in the ULOG when JREPSVR starts:
    repository file XXXX (yyyy records) is writable
    (don't remember the exact wording, though)
    Hope this helps,
    /Per
    "Mike" <[email protected]> wrote:
    >
    I am trying to do a bulk load for Tuxedo services (simpapp comes with the Tuxedo software) and get an error message like the following:
    Exception in thread "main" bea.jolt.ServiceException: Service .GETKEYS is not available.
    at bea.jolt.JoltRemoteService.init(JoltRemoteService.java:108)
    at bea.jolt.JoltRemoteService.<init>(JoltRemoteService.java:64)
    at bea.joltadm.JSvcPkgTbl.initTable(jbld.java:768)
    at bea.joltadm.JSvcPkgTbl.<init>(jbld.java:748)
    at bea.joltadm.JBldDefRec.<init>(jbld.java:111)
    at bea.joltadm.jbld.main(jbld.java:703)
    I guess something is wrong with the bulk load data, which looks like the following:
    service=TOUPPER
    export=true
    inbuf=STRING
    outbuf=STRING
    It would be appreciated if someone could point out what's wrong.
    Thanks,
    Mike

  • ORA-06502 during bulk load

    I am using v11.2 with the new Jena adapter.
    I am trying to upload data from a bunch of N-Triple files to the triple store via the bulk load interface in the Jena adapter (aka bulk append). The code does something like this:
    while(moreFiles exist)
    readFilesToMemory;
    bulkLoadToDatabase using the options "MBV_JOIN_HINT=USE_HASH PARALLEL=4"
    Loading the first set of triples goes well. But when I try to load the second set of triples, I get the exception below.
    Some thoughts:
    1) I don't think this is a data problem, because I uploaded all the data during an earlier test, and when I upload the same data on an empty database it works fine.
    2) I saw some earlier posts with a similar error, but none of them seem to be using the Jena adapter.
    3) The model also has an OWL Prime entailment in incremental mode.
    4) I am not sure if this is relevant, but... before I ran the current test, I mistakenly launched multiple Java processes that bulk loaded the data. Of course I killed all the processes and dropped the sem_models and the backing RDF tables they were uploading to.
    EXCEPTION
    java.sql.SQLException: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    ORA-06512: at "MDSYS.SDO_RDF_INTERNAL", line 3164
    ORA-06512: at "MDSYS.SDO_RDF_INTERNAL", line 4244
    ORA-06512: at "MDSYS.SDO_RDF", line 276
    ORA-06512: at "MDSYS.RDF_APIS", line 693
    ORA-06512: at line 1
    at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
    at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:131)
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:204)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
    at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1034)
    at oracle.jdbc.driver.T4CCallableStatement.doOall8(T4CCallableStatement.java:191)
    at oracle.jdbc.driver.T4CCallableStatement.executeForRows(T4CCallableStatement.java:950)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1222)
    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3387)
    at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3488)
    at oracle.jdbc.driver.OracleCallableStatement.execute(OracleCallableStatement.java:3840)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1086)
    at oracle.spatial.rdf.client.jena.Oracle.executeCall(Oracle.java:689)
    at oracle.spatial.rdf.client.jena.OracleBulkUpdateHandler.addInBulk(OracleBulkUpdateHandler.java:740)
    at oracle.spatial.rdf.client.jena.OracleBulkUpdateHandler.addInBulk(OracleBulkUpdateHandler.java:463)
    at oracleuploadtest.OracleUploader.loadModelToDatabase(OracleUploader.java:84)
    at oracleuploadtest.RunOracleUploadTest.main(RunOracleUploadTest.java:81)
    thanks!
    Ram.

    The addInBulk method needs to be called twice to trigger the bug. Here is a test case that passes only while the bug is present! (It is to remind me to remove the workaround code when the fix gets through to my code).
    @Test
    public void testThatOracleBulkBugIsNotYetFixed() throws SQLException {
         char nm[] = new char[22 - TestDataUtils.getUserID().length() - TestOracleHelper.ORACLE_USER.length()];
         Arrays.fill(nm, 'A');
         // actual name is TestDataUtils.getUserID() + "_" + nm
         TestOracleHelper helper = new TestOracleHelper(new String(nm));
         GraphOracleSem og = helper.createGraph();
         Node n = RDF.value.asNode();
         Triple triples[] = new Triple[]{ new Triple(n, n, n) };
         try {
              og.getBulkUpdateHandler().addInBulk(triples, null);
              // Oracle bug hits on second call:
              og.getBulkUpdateHandler().addInBulk(triples, null);
         } catch (SQLException e) {
              if (e.getErrorCode() == 6502) {
                   return; // we have a work-around for this expected error
              }
              throw e; // some other problem
         }
         Assert.fail("It seems that an Oracle update (has the ora jar been updated?) resolves a silly bug - please modify BulkLoaderExportMode");
    }
    Jeremy

  • Can someone reply this - pre-populating (bulk load) the OID - URGENT

    gurus,
    i'm using following -
    Database --> Oracle 9i
    Portal --> Oracle Portal 9iAS Release 2
    there are about 10,000 portal users. I would like to pre-populate the OID from the existing employee repository (the employee repository is a custom Oracle database).
    Question - is there a white paper that gives you all the APIs required to do so? I have to accomplish the following tasks -
    1. create users
    2. give them privileges
    3. assign them to groups
    4. assign a default group to users
    I need to achieve the above as part of pre-populating the OID.
    Ideas, anyone?
    thanx a bunch.

    Hero,
    I just went through an exercise where I did a bulk load of users and did exactly the four steps you're asking for. I also applied the users to designated groups.
    I'm on HP-UX, but the solution can apply to any O/S.
    How do I get this to you?

  • Bulk Load and Auto-Provision

    I am wondering if there is an easy way to trigger auto-provisioning of managed resources based on a bulk load. For instance, after importing users through the bulk load utility I want the Membership rules to be executed, which will assign users to the correct roles and therefore initiate the provisioning process.
    Thanks,
    Pete

    I would suggest a custom scheduled task that updates users with an empty hashmap. Essentially a "touch" function to update a user with the same data which will then trigger the group memberships.
    -Kevin

  • Creation of new partner function through /AFS/BAPI_SALESORD_CHANGE

    Hi all,
    Can anybody help me to create partner functions through the BAPI /AFS/BAPI_SALESORD_CHANGE?
    In the selection screen we will pass the VBELN, PARVW and KUNNR values.
    Any kind of little help is very much appreciated.

    That's all you need to pass to change the partner type, like bill-to or ship-to.
    You don't create partner functions from this BAPI. It is for changing the sales order and the details in the sales order. Creating a partner function comes under SPRO.
    Ask your sales functional people to create the new partner function.

  • Installed FF 8.1 and after reboot, tried to start FF and got the message "Could not load XRE functions"

    I had been running, I believe, FF 6.3. I was getting nag popups to update FF, so I went to Help and used the selection there to update. In doing so (I tried this method more than once) it would download and install like a usual update until completion. Then it would say "Could not complete update" or "Could not restart Firefox because an instance is running. Close FF and try again." But there was no visible FF running, not even in Device Manager.
    I will mention here that I did install Skype last week, and have seen a couple of messages about Skype not loading on startup but it running in the toolbar, if this may have anything to do with it.
    So, looking for help at the FF site, I saw it recommended to download and install FF 8.1, so I did that. Then from Control Panel I uninstalled FF, rebooted, installed fresh (leaving the old settings and bookmarks etc.), then rebooted again.
    When I started FF 8.1 I now get the message "Could not load XRE Functions".
    So, remembering that the last program installed on this box was Skype, I went to Control Panel and uninstalled it ---- and when I did, at the finish I got that same message "Could not load XRE Functions". ---- Interesting ---- hmmm, but how do I get FF installed so I can get on with life?
    I just tried to verify this new account and got the same "Couldn't Load" message when I clicked on the email link for this forum.
    Many thanks in advance for anyone's help.

    Do a clean (re)install and delete the Firefox program folder (C:\Program Files\Mozilla Firefox\).
    Download a fresh Firefox copy and save the file to the desktop.
    * Firefox 8.0.x: http://www.mozilla.com/en-US/firefox/all.html
    Uninstall your current Firefox version if possible.
    *Do NOT remove personal data when you uninstall the current version or you lose your bookmarks and other data in the profile folder.
    Remove the Firefox program folder before installing that newly downloaded copy of the Firefox installer.
    *It is important to delete the Firefox program folder to remove all the files and make sure that there are no problems with files that were leftover after uninstalling.
    Your bookmarks and other profile data are stored elsewhere in the Firefox Profile Folder and won't be affected by a reinstall, but make sure that you do not select to remove personal data if you uninstall Firefox.
    *http://kb.mozillazine.org/Profile_folder_-_Firefox
    *http://kb.mozillazine.org/Profile_backup
    *http://kb.mozillazine.org/Standard_diagnostic_-_Firefox#Clean_reinstall

  • Bulk Load option doesn't work

    Hi Experts,
    I am trying to load data to HFM using the Bulk load option, but it doesn't work. When I change the option to SQL insert, the loading is successful. The logs say that the temp file is missing, but when I go to the specified location, I see the control file and the tmp file. What am I missing to have bulk load working? Here's the log entry.
    2009-08-19-18:48:29
    User ID...........     kannan
    Location..........     KTEST
    Source File.......     \\Hyuisprd\Applications\FDM\CRHDATALD1\Inbox\OMG\HFM July2009.txt
    Processing Codes:
    BLANK............. Line is blank or empty.
    ESD............... Excluded String Detected, SKIP Field value was found.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    RFM............... Required Field Missing.
    TC................ Type Conversion, Amount field could be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    Create Output File Start: [2009-08-19-18:48:29]
    [TC] - [Amount=NN]     Batch Month File Created: 07/2009
    [TC] - [Amount=NN]     Date File Created: 8/6/2009
    [TC] - [Amount=NN]     Time File Created: 08:19:06
    [Blank] -      
    Excluded Record Count.............. 3
    Blank Record Count................. 1
    Total Records Bypassed............. 4
    Valid Records...................... 106093
    Total Records Processed............ 106097
    Begin Oracle (SQL-Loader) Process (106093): [2009-08-19-18:48:41]
    [RDMS Bulk Load Error Begin]
         Message:      (53) - File not found
         See Bulk Load File:      C:\DOCUME~1\fdmuser\LOCALS~1\Temp\tWkannan30327607466.tmp
    [RDMS Bulk Load Error End]
    Thanks
    Kannan.

    Hi Experts,
    I am facing a data import error while importing data from a .csv file into an FDM-HFM application.
    2011-08-29 16:19:56
    User ID...........     admin
    Location..........     ALBA
    Source File.......     C:\u10\epm\DEV\epm_home\EPMSystem11R1\products\FinancialDataQuality\FDMApplication\BMHCFDMHFM\Inbox\ALBA\BMHC_Alba_Dec_2011.csv
    Processing Codes:
    BLANK............. Line is blank or empty.
    ESD............... Excluded String Detected, SKIP Field value was found.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    RFM............... Required Field Missing.
    TC................ Type Conversion, Amount field could be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    Create Output File Start: [2011-08-29 16:19:56]
    [ESD] ( ) Inter Co,Cash and bank balances,A113000,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],1
    [ESD] ( ) Inter Co,"Trade receivable, prepayments and other assets",HFM128101,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],35
    [ESD] ( ) Inter Co,Inventories ,HFM170003,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],69
    [ESD] ( ) Inter Co,Financial assets carried at fair value through P&L,HFM241001,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],103
    [Blank] -      
    Excluded Record Count..............4
    Blank Record Count.................1
    Total Records Bypassed.............5
    Valid Records......................0
    Total Records Processed............5
    Begin SQL Insert Load Process (0): [2011-08-29 16:19:56]
    Processing Complete... [2011-08-29 16:19:56]
    Please help me solve the issue.
    Regards,
    Sudhir Sinha

  • Issue with Bulk Load Post Process Scheduled Task

    Hello,
    I successfully loaded users in OIM using the bulk load utility.  I also have LDAP sync ON.  The documentation says to run the Bulk Load Post Process scheduled task to push the loaded users in OIM into LDAP.
    This works if we run the Bulk Load Post Process Scheduled Task right away after the run the bulk load.
    If some time has passed and we go back to run the Bulk Load Post Process Scheduled Task, some of the users loaded through the bulk load utility are not created in our LDAP system. This creates an out-of-sync situation between OIM and our LDAP.
    I tried to use the usr_key as a parameter to the Bulk Load Post Process Scheduled Task without success.
    Is there a way to force the re-evaluation of these users so they would get created in LDAP?
    Thanks
    Khanh

    The scheduled task carries out post-processing activities on the users imported through the bulk load utility.

  • Issue with Bulk Load Post Process

    Hi,
    I ran the bulk load command line utility to create users in OIM. I had 5 records in my csv file, out of which 2 users were successfully created in OIM; for the rest I got an exception because the users already existed. After that, if I run the bulk load post process to LDAP-sync, generate the password, and send notification, it does not work even for the successfully created users. Ideally it should sync the successfully created users. However, if there is no exception during the bulk load command line utility, then LDAP sync works fine through the bulk load post process. Any idea how to resolve this issue and sync in OID the users which were successfully created? Urgent help would be appreciated.

    The scheduled task carries out post-processing activities on the users imported through the bulk load utility.

  • Bulk loading BLOBs using PL/SQL - is it possible?

    Hi -
    Does anyone have a good reference article or example of how I can bulk load BLOBs (videos, images, audio, office docs/pdf) into the database using PL/SQL?
    Every example I've ever seen in PL/SQL for loading BLOBs does a commit; after each file loaded ... which doesn't seem very scalable.
    Can we pass in an array of BLOBs from the application, into PL/SQL and loop through that array and then issue a commit after the loop terminates?
    Any advice or help is appreciated. Thanks
    LJ

    It is easy enough to modify the example to commit every N files. If you are loading large amounts of media, I think that you will find that the time to load the media is far greater than the time spent in SQL statements doing inserts or retrieves. Thus, I would not expect to see any significant benefit to changing the example to use PL/SQL collection types in order to do bulk row operations.
    If your goal is high performance bulk load of binary content then I would suggest that you look to use Sqlldr. A PL/SQL program loading from BFILEs is limited to loading files that are accessible from the database server file system. Sqlldr can do this but it can also load data from a remote client. Sqlldr has parameters to control batching of operations.
    See section 7.3 of the Oracle Multimedia DICOM Developer's Guide for the example Loading DICOM Content Using the SQL*Loader Utility. You will need to adapt this example to the other Multimedia objects (ORDImage, ORDAudio .. etc) but the basic concepts are the same.
    Once the binary content is loaded into the database, you will need to write a program to loop over the new content and initialize the Multimedia objects (extract attributes). The example in 7.3 contains a sample program that does this for the ORDDicom object.
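    The commit-every-N change suggested above is mechanical. A runnable sketch of the batching pattern (Python with an in-memory sqlite3 standing in for Oracle; the table and column names are invented):

```python
import sqlite3

def load_in_batches(conn, rows, batch_size):
    # Insert rows one at a time but commit only every batch_size rows --
    # the commit-every-N pattern, instead of a commit per file.
    cur = conn.cursor()
    full_commits = 0
    for i, (name, blob) in enumerate(rows, start=1):
        cur.execute("INSERT INTO media (name, content) VALUES (?, ?)",
                    (name, blob))
        if i % batch_size == 0:
            conn.commit()
            full_commits += 1
    conn.commit()  # flush the final partial batch
    return full_commits

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE media (name TEXT, content BLOB)")
n_commits = load_in_batches(
    conn, [(f"file{i}", b"\x00\x01") for i in range(10)], batch_size=3)
```

    The same shape carries over to a PL/SQL loop (a counter plus COMMIT), or to SQL*Loader, which batches commits via its ROWS parameter.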
