Optimization for bulk data upload

Hi everyone!
I've got the following issue:
I have to do a bulk data upload using JMS, deployed on GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
My web interface uploads a file and then delegates processing to a Stateless Session Bean, which reads N lines at a time and sends them as a message to a JMS queue. The message-driven consumer has to parse each line, validate it against the data already in the DB, and finally persist the new data.
This process is very CPU-intensive, and I need to improve its running time. I tried changing the GlassFish default JMS and JDBC pool sizes, but it made no significant difference.
Do you have any advice that could help me?
Thanks in advance!
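(For illustration, here is a minimal sketch of the producer side described above, assuming JMS 1.1 on GlassFish 2.1. The JNDI names, the chunk size, and the plain-text message format are placeholders to adapt; the point is that sending N lines per message amortises the per-message JMS overhead across the whole chunk.)
==========================
import java.io.BufferedReader;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;

// Producer sketch: read the uploaded file N lines at a time and send each
// chunk as ONE JMS message. The JNDI names and CHUNK_SIZE are assumptions.
@Stateless
public class UploadProcessorBean {

    private static final int CHUNK_SIZE = 500; // tune against message size limits

    @Resource(mappedName = "jms/UploadConnectionFactory") // assumed JNDI name
    private ConnectionFactory connectionFactory;

    @Resource(mappedName = "jms/UploadQueue")             // assumed JNDI name
    private Queue queue;

    public void process(Reader fileContent) throws Exception {
        Connection con = connectionFactory.createConnection();
        try {
            Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            BufferedReader reader = new BufferedReader(fileContent);
            List<String> chunk = new ArrayList<String>(CHUNK_SIZE);
            String line;
            while ((line = reader.readLine()) != null) {
                chunk.add(line);
                if (chunk.size() == CHUNK_SIZE) {
                    sendChunk(session, producer, chunk);
                    chunk.clear();
                }
            }
            if (!chunk.isEmpty()) {
                sendChunk(session, producer, chunk); // last partial chunk
            }
        } finally {
            con.close(); // also closes the session and producer
        }
    }

    // One TextMessage carries a whole chunk, newline-separated.
    private void sendChunk(Session session, MessageProducer producer,
                           List<String> chunk) throws Exception {
        StringBuilder body = new StringBuilder();
        for (String l : chunk) {
            body.append(l).append('\n');
        }
        producer.send(session.createTextMessage(body.toString()));
    }
}
==========================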

Hi! Thank you for your answer!
The heavy processing is in the MDB.
I'm grouping every N lines read in the EJB and then sending them as one message to the JMS queue. The MDB then parses and persists each line into several related tables.
Thanks again!
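(And a matching sketch of the consuming MDB, assuming container-managed transactions and a single target table; in the real case each line goes to several related tables, and the queue, pool, table, and column names here are placeholders. Persisting each chunk with one JDBC batch, instead of one INSERT round trip per line, is usually the biggest win on this side.)
==========================
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.annotation.Resource;
import javax.ejb.MessageDriven;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;
import javax.sql.DataSource;

// Consumer sketch: parse all lines of one message and insert them with a
// single JDBC batch. Validation against existing data is elided.
@MessageDriven(mappedName = "jms/UploadQueue") // assumed JNDI name
public class UploadConsumerBean implements MessageListener {

    @Resource(mappedName = "jdbc/UploadPool")  // assumed JDBC pool
    private DataSource dataSource;

    public void onMessage(Message message) {
        try {
            String body = ((TextMessage) message).getText();
            Connection con = dataSource.getConnection();
            try {
                PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO upload_line (content) VALUES (?)"); // assumed table
                for (String line : body.split("\n")) {
                    // validation against data already in the DB would run here
                    ps.setString(1, line);
                    ps.addBatch();     // queue the insert locally
                }
                ps.executeBatch();     // one round trip for the whole chunk
            } finally {
                con.close();
            }
            // the container-managed transaction commits when onMessage returns
        } catch (Exception e) {
            // a runtime exception rolls the transaction back; the broker may redeliver
            throw new RuntimeException(e);
        }
    }
}
==========================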

Similar Messages

  • Enhancement_CIN_Capture Incoming Excise Invoice-J1IEX bulk data upload

    Dear All,
    Sub: CIN_Capture Incoming Excise Invoice-J1IEX bulk data upload option requirement
    We are capturing the incoming excise invoices manually in transaction J1IEX. Given the huge volume of data, it is very difficult for us to enter it manually, so we now require a bulk data processing option to upload the data from an Excel file (we receive the softcopy from the supplier).
    We found the BAPI named BAPI_EXCINV_CREATE_FROMDATA, but the updated version of this BAPI is not available in our system. As per the Indian Government norms, the current Excise Duty tariff is as below:
    1. Basic Excise Duty (BED 8%)
    2. Education Cess (ECess 2%)
    3. Secondary Education Cess (SECess 1%)
    We observed that SECess (1%) is not covered in the above-mentioned BAPI, so we are not in a position to proceed further.
    Kindly let us know whether any other relevant option would solve the purpose. We are in a quite difficult situation uploading the data to our system, so please do the needful.
    Regards,
    PrabuM

    Please note that CIN uses the 'MB_MIGO_BADI' definition, and 'CIN_PLUG_IN_TO_MIGO' is the implementation. You can create multiple implementations of this BAdI. The same BAdI is used for single-step Capture & Post of the excise invoice in MIGO. Kindly use this BAdI as per your needs. The SAP standard does not support any BAPIs for goods receipts with excise updates.

  • Reg:Efficient solution for a data upload scenario

    Hi All,
    I have the following task:
    I need to move data from a legacy system (which generates data only in the form of flat files) to SAP R3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution approaches:
    1) Write a BDC program to upload the data.
    2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
    Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in advance.
    B.Lavanya

    Hi Lavanya,
    Required data from a legacy system (which generates data only in the form of flat files) to SAP R3 as FB01 journals: use BDC for this, because it is better for large source files.
    The output file should be generated periodically (daily, weekly, fortnightly, etc.): if this output file contains acknowledgments for the data uploaded by the above process, create an ABAP report for it and schedule it. But if the output contains some other IDoc data which you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, just create an ABAP report that writes the data to a file on the application server, and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta

  • Function module Vs BDC for master data upload

    Hi ,
    Please advise whether we should use the following function modules for master data upload, or whether we should go for BDC.
    MP_RFC_SINGLE_CREATE
    MP_RFC_INACT_CHANGE
    MPLAN_CREATE
    MPLAN_CHANGE
    MPLAN_SET_DELETION_INDICATOR
    ASSET_MASTERRECORD_MAINTENANCE
    MPLAN_ITEM_CREATE
    MPLAN_ITEM_CHANGE
    GL_ACCT_MASTER_SAVE
    Actually, we have already used these function modules in our upload program, but we are not sure whether they will create any data inconsistency.
    Please let me know if we should continue using the FMs, or whether there is any risk in using them and we should replace them with BDC.
    Thanks in advance.

    Hi Vikram,
    It is better to search for a BAPI for uploading the master data, because we have problems with both BDC and FMs.
    If you use FMs, they may not contain all the fields you want. If you go for BDC, it is not maintainable across future releases: if you upgrade, the screens may change.
    If there is no BAPI, then it is better to go for BDC.
    Thanks

  • LSMW used only for master data upload?

    Hi
    Can you please let me know whether LSMW is used only for master data upload, or whether we can also use it for transaction data?

    Hi Christino,
    I have come across standard SDN threads which deal with uploading master data; refer to them:
    [SDN reference for uploading master data using LSMW|how can we upload master data by using LSMW]
    [SDN reference for which upload method is preferred (master data or transaction data)|Which one is better for uploading data LSMW or ECATT?]
    Good Luck & Regards.
    HARSH

  • Tutorial for new Data Upload feature in 4.1?

    Greetings,
    Is there a tutorial available for using the new Data Upload feature in 4.1? We have not upgraded to 4.1 yet, and I would like to try out the feature in my APEX test workspace so I can use it once we are upgraded.
    Thanks,
    John

    I installed the Product Portal Sample Database and went to page 21. Very nice looking. Is there any tutorial for this Sample application? In other words, is there a tutorial that uses this application as its basis?
    What I am really looking for (my apologies if I have not been clear) is a tutorial that steps you through the process of setting up the new feature of Data Upload in APEX 4.1. I would like to create a new application on my test workspace and learn how to set up the Data Upload page.
    Seeing the Data Load in action is very helpful though. Thanks for pointing me to this.
    Thanks,
    John

  • Query for monitor data upload

    Hi, Experts
    Normally in a cube we just have the request ID, which is only a number and carries no other information (request date, time, selection, type of data upload, ...).
    Can I make a BEx query show information just like the cube's Manage screen? We have to check whether a duplicate selection request was left undeleted, or whether some request is missing, in the case of multiple DataSources loading into one cube.
    I cannot find any useful query among the BW statistics queries.
    thanks in advance.

    I also cannot find enough information in table RSMONICDP.
    In our case, cube 0COOM_C02 has lots of InfoSources; some are full uploads and some are delta uploads. All of the InfoPackages are scheduled in one process chain.
    When I went through the log of this process chain, I found that errors happened on some days, so sometimes the process chain did not finish. That means cube 0COOM_C02 has missing requests and duplicated requests.
    It is hard to find all of the problem requests via the cube's Manage screen, because there are so many requests and the window is so small. So my question is: is there any BEx query or BW table that shows information similar to the cube's Manage > Requests tab?
    Then I could analyze them in Excel, which is quite easy for me.
    Thank you all

  • Xls. sheet for Master data upload

    Hi, can anybody suggest or send me a sample of how to maintain an .xls sheet template of the relevant fields for infotypes 0, 1, 2, 7, 8, etc., for the purpose of data upload?
    <removed by Moderator>
    thanks
    S Mishra

    Hi Mishra,
    You can look into SAP's Standard Business Blueprint Templates (Data Transfer Tool) to get an idea.
    Check Note 1060029 - SAP Best Practices for HCM US - Variants, Misc, LSMW.
    You will find files for infotypes, and you can set up your templates based on these.
    The other way is to go to SE11 and enter the infotype structure PNNNN (with NNNN being the infotype number).
    It presents you with the structure; copy that structure and you can create your Excel sheet from that template.
    Good luck!
    Kumarpal Jain.

  • Clob is not working for bulk data files in PL/SQL XML program

    Hi Odie,
    You helped us fix our issue before:
    "https://forums.oracle.com/forums/thread.jspa?threadID=2238458&tstart=105"
    Working fine: the program works for smaller data sizes.
    Issue: we now have a problem with larger data sizes.
    We get the error below:
    Arguments
    P_dir_name='/tmp'
    P_file_name='CCBGO.COLO_CNG.RESPONSES.20120802.00054131826'
    Environment will now switch to UTF-8 code-set.
    Parts of this log file may not display correctly
    as a result. This is an expected behavior.
    XML_REPORTS_XENVIRONMENT is :
    /apps/applmgr/product/OFDEV/ofdevora/806/guicommon6/tk60/admin/Tk2Motif_UTF8.rgb
    XENVIRONMENT is set to /apps/applmgr/product/OFDEV/ofdevora/806/guicommon6/tk60/admin/Tk2Motif_UTF8.rgb
    Current NLS_LANG and NLS_NUMERIC_CHARACTERS Environment Variables are :
    American_America.UTF8
    stat_low = 8B
    stat_high = 0
    emsg:was terminated by signal 11
    We appreciate your earlier support.
    Kindly suggest.
    Many Thanks,
    Ramesh.

    Thanks Alex,
    You are right that it is a concurrent program error, but it works for a small amount of data and generates the output; it does not work for larger data.
    I have placed the code I used below; kindly suggest where I am going wrong.
    I am calling the .rdf through the concurrent program; I've used the below query in the RDF:
    select
    BATCHHEADER
    ,BATCHTRAILER
    ,RqUID
    ,Severity
    ,PmtRefId
    ,StatusDesc
    ,ErrorDesc
    ,AsOfDate
    ,AsOfTime
    ,RqUID1
    ,SPRefId
    from table(CL_CXFRFXFH_PKG.rcacknowledgments(:P_dir_name,:P_file_name));
    Kindly find below the code for the package CL_CXFRFXFH_PKG.
    ==========================
    CREATE OR REPLACE PACKAGE BODY APPS.CL_CXFRFXFH_PKG IS
    FUNCTION rcacknowledgments (p_directory IN VARCHAR2, p_filename IN VARCHAR2)
      RETURN TRecordTable PIPELINED
    IS
      nb_rec   NUMBER := 1;
      tmp_xml  CLOB;
      tmp_file CLOB;
      rec      TRecord;
    BEGIN
      dbms_lob.createtemporary(tmp_file, TRUE);
      tmp_file := dbms_xslprocessor.read2clob(p_directory, p_filename);
      rec.BATCHHEADER  := regexp_replace(tmp_file, '.*<BATCHHEADER>(.*)</BATCHHEADER>.*', '\1', 1, 1, 'n');
      rec.BATCHTRAILER := regexp_replace(tmp_file, '.*<BATCHTRAILER>(.*)</BATCHTRAILER>.*', '\1', 1, 1, 'n');
      LOOP
        -- extract the nb_rec-th embedded XML document from the batch file
        tmp_xml := regexp_substr(tmp_file, '<\?xml[^?]+\?>\s*<([^>]+)>.*?</\1>', 1, nb_rec, 'n');
        EXIT WHEN tmp_xml IS NULL OR length(tmp_xml) = 0;
        nb_rec := nb_rec + 1;
        SELECT RqUID, Severity, PmtRefId, StatusDesc, ErrorDesc, AsOfDate, AsOfTime, RqUID1, SPRefId
          INTO rec.RqUID
             , rec.Severity
             , rec.PmtRefId
             , rec.StatusDesc
             , rec.ErrorDesc
             , rec.AsOfDate
             , rec.AsOfTime
             , rec.RqUID1
             , rec.SPRefId
          FROM xmltable(
                 '/CMA/BankSvcRq' passing xmltype(tmp_xml)
                 columns RqUID      varchar2(3000) path 'RqUID'
                       , Severity   varchar2(3000) path 'XferAddRs/Status/Severity'
                       , PmtRefId   varchar2(3000) path 'XferAddRs/Status/PmtRefId'
                       , StatusDesc varchar2(3000) path 'XferAddRs/Status/StatusDesc'
                       , ErrorDesc  varchar2(3000) path 'XferAddRs/Status/ErrorDesc'
                       , AsOfDate   varchar2(3000) path 'XferAddRs/Status/AsOfDate'
                       , AsOfTime   varchar2(3000) path 'XferAddRs/Status/AsOfTime'
                       , RqUID1     varchar2(3000) path 'XferAddRs/RqUID'
                       , SPRefId    varchar2(3000) path 'XferAddRs/SPRefId'
               );  -- this closing parenthesis and semicolon were missing in the post
        PIPE ROW (rec);
      END LOOP;
      dbms_lob.freetemporary(tmp_file);
      RETURN;
    END;
    END;
    ============================================
    Many Thanks,
    Ramesh.

  • Bulk Data Upload

    Hi
    We have a requirement to load bulk data, which would be a full dump (not incremental) in CSV format, almost every week from other applications.
    This implies that I can drop my tables and rebuild them using the CSV files I have received.
    I was just wondering whether there is any really efficient tool or utility in Oracle (or outside) to import huge amounts of data, apart from SQL*Loader, external tables, and Data Pump.
    Regards
    Kapil

    I don't know of any tool apart from SQL*Loader/external tables and Data Pump.
    You may find tools which you can buy (and which claim to be really good).
    Honestly, if you want to load flat-file data (gigabytes or kilobytes) into Oracle, there is nothing better than SQL*Loader, "if you use all its capabilities" (external tables and SQL*Loader are the same thing; just the wrapper is different).
    Cheers

  • Merge for bulk data

    Hi all,
    I want to insert bulk data from an external table into a database table. The program compiles successfully, but after executing it, the data is not inserted into the database. Please help me.
    External table:-bck_hotel
    HOTEL_CODE     NUMBER
    HOTEL_NAME     VARCHAR2(100)
    HOTEL_TYPE     VARCHAR2(100)
    HOTEL_ADDRESS VARCHAR2(100)
    HOTEL_NUMBER     NUMBER
    HOTEL_FACILITY     VARCHAR2(100)
    HOTEL1     VARCHAR2(100)
    LATITUDE     NUMBER
    LONGITUDE     NUMBER
    Database table:-hotel
    HOTEL_CODE     NUMBER
    HOTEL_NAME     VARCHAR2(100)
    HOTEL_TYPE     VARCHAR2(100)
    HOTEL_ADDRESS     VARCHAR2(100)
    HOTEL_NUMBER     NUMBER
    HOTEL_FACILITY      VARCHAR2(100)
    Code:
    DECLARE
      -- collection types and variables (missing from the posted snippet)
      TYPE t_code IS TABLE OF bck_hotels.hotel_code%TYPE;
      TYPE t_name IS TABLE OF bck_hotels.hotel_name%TYPE;
      TYPE t_type IS TABLE OF bck_hotels.hotel_type%TYPE;
      TYPE t_addr IS TABLE OF bck_hotels.hotel_address%TYPE;
      TYPE t_num  IS TABLE OF bck_hotels.hotel_number%TYPE;
      TYPE t_fac  IS TABLE OF bck_hotels.hotel_facility%TYPE;
      v_hotel_code     t_code;
      v_hotel_name     t_name;
      v_hotel_type     t_type;
      v_hotel_address  t_addr;
      v_hotel_number   t_num;
      v_hotel_facility t_fac;
      CURSOR cur_hotels IS
        SELECT hotel_code, hotel_name, hotel_type, hotel_address, hotel_number, hotel_facility
          FROM bck_hotels;
    BEGIN
      OPEN cur_hotels;
      LOOP
        FETCH cur_hotels BULK COLLECT
          INTO v_hotel_code, v_hotel_name, v_hotel_type, v_hotel_address, v_hotel_number, v_hotel_facility
          LIMIT 1000;
        EXIT WHEN v_hotel_code.COUNT = 0; -- nothing left to process
        FORALL i IN 1 .. v_hotel_code.COUNT
          MERGE INTO hotels tgt
          USING (SELECT v_hotel_code(i) AS hotel_code, v_hotel_name(i) AS hotel_name,
                        v_hotel_type(i) AS hotel_type, v_hotel_address(i) AS hotel_address,
                        v_hotel_number(i) AS hotel_number, v_hotel_facility(i) AS hotel_facility
                   FROM dual) src
          ON (src.hotel_code = tgt.hotel_code)
          WHEN MATCHED THEN UPDATE
            SET tgt.hotel_name = src.hotel_name, tgt.hotel_type = src.hotel_type,
                tgt.hotel_address = src.hotel_address, tgt.hotel_number = src.hotel_number,
                tgt.hotel_facility = src.hotel_facility
          WHEN NOT MATCHED THEN
            INSERT (tgt.hotel_code, tgt.hotel_name, tgt.hotel_type, tgt.hotel_address, tgt.hotel_number, tgt.hotel_facility)
            VALUES (src.hotel_code, src.hotel_name, src.hotel_type, src.hotel_address, src.hotel_number, src.hotel_facility);
      END LOOP;
      CLOSE cur_hotels;
      COMMIT; -- the posted snippet never committed, one likely reason no rows appeared
    END;

    Hello,
    I wonder why you are using BULK COLLECT, when the same can be accomplished by a single MERGE statement.
    The below should help:
    MERGE INTO hotels tgt
    USING (SELECT hotel_code,
                  hotel_name,
                  hotel_type,
                  hotel_address,
                  hotel_number,
                  hotel_facility
             FROM bck_hotel) src
    ON (src.hotel_code = tgt.hotel_code)
    WHEN MATCHED THEN
      UPDATE
         SET tgt.hotel_name     = src.hotel_name,
             tgt.hotel_type     = src.hotel_type,
             tgt.hotel_address  = src.hotel_address,
             tgt.hotel_number   = src.hotel_number,
             tgt.hotel_facility = src.hotel_facility
    WHEN NOT MATCHED THEN
      INSERT (tgt.hotel_code,
              tgt.hotel_name,
              tgt.hotel_type,
              tgt.hotel_address,
              tgt.hotel_number,
              tgt.hotel_facility)
      VALUES (src.hotel_code,
              src.hotel_name,
              src.hotel_type,
              src.hotel_address,
              src.hotel_number,
              src.hotel_facility);
    Is that not true?
    Regards,
    P.

  • Creation of SO based on the input in table format  for bulk data : urgent

    Hi,
    The data from an external system will be sent to SAP in bulk, as a table; i.e., they are going to send around 30 purchase orders from the external system. I need to fetch all of them at once and create a sales order for each one.
    Could anyone please suggest a workable approach for this?
    shyam

    Hi Shyam,
    Proceed as below:
    1. Identify the file layout, i.e. how it is structured for multiple items.
    2. Upload the data into an internal table.
    3. Loop at the internal table.
    4. Populate the data required for the BAPI structures.
    5. At the end of each PO number, call a BAPI to create a sales order.
    Make sure you clear/refresh the structures and internal tables used for the BAPI.
    Regards
    Eswar

  • How to improve performance for bulk data load in Dynamics CRM 2013 Online

    Hi all,
    We need to bulk update (or create) contacts in Dynamics CRM 2013 Online every night, due to data updated from another external data source. The data size is around 100,000 records, and the load currently takes around 6 hours.
    We are already using the ExecuteMultiple web service to handle the integration; however, the 6-hour duration is still not acceptable, and we are seeking advice on further improvement.
    Any help is highly appreciated.  Many thanks.
    Gary

    I think Andrii is referring to running multiple threads in parallel (see
    http://www.mscrmuk.blogspot.co.uk/2012/02/data-migration-performance-to-crm.html - it's a bit dated, but should still be relevant).
    Microsoft do have some throttling limits applied in CRM Online, and it is worth contacting them to see if you can get those raised.
    100,000 records per night seems a large number. Are all these records new or updated, or are some unchanged, in which case you could filter them out before uploading? Or are there useful ways to summarise the data before loading?
    Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk
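    (The parallelism suggested above is a generic pattern; below is a minimal sketch of it in Java. submitBatch is a hypothetical placeholder for whatever bulk call the integration makes, e.g. an ExecuteMultiple request; it is not part of any CRM SDK, and the thread count and batch size must be tuned against the service's throttling limits.)
    ==========================
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    // Generic parallel-batch loader: several batches in flight at once,
    // each batch being one bulk request to the target service.
    public class ParallelBatchLoader {

        private static final int THREADS = 4; // tune against throttling limits

        public static void loadAll(List<List<String>> batches) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(THREADS);
            for (List<String> batch : batches) {
                pool.submit(() -> submitBatch(batch)); // one service call per batch
            }
            pool.shutdown();                          // no more work accepted
            pool.awaitTermination(6, TimeUnit.HOURS); // wait for in-flight batches
        }

        // Hypothetical placeholder: replace with the real bulk-update call.
        private static void submitBatch(List<String> batch) {
            System.out.println("Submitting " + batch.size() + " records");
        }
    }
    ==========================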

  • .CSV file for Master data upload in GRC PC

    Hi all,
    We want to upload some master data for GRC Process Controls 3.0. Since we do not have access to the MDUG tool yet, and we have a demo planned soon, we decided to go with the approach of uploading some data using the program GRPCB_UPLOAD.
    So I uploaded a .csv file with the following structure and was able to create master data, but the object name was not updated.
    Structure
    Object Type     
    Object ID     
    Infotype     
    Sub-type     
    Start     
    End     
    Object abbreviation     
    Object Name     
    Language
    In addition, for some objects we'd like to populate the other attributes with data too.
    Would anyone have the .csv format for such a detailed upload for Process / Subprocess / Control?
    Regards,
    Preksha

    Hi all,
    First of all, thanks in advance. I've tried to upload a template similar to Preksha's, but when I upload it through the GRPCB_UPLOAD program it doesn't work properly: when I execute the background process and then check its status, it has been cancelled.
    In addition, I followed the recommendations from the PDF called "Process Control Upload Master Data v1" released by the RIG in June 2008, but I still can't upload the structure correctly.
    Do you have any idea, or could you explain how to create a correct template? Do you have any clue as to what I might have done wrong?
    Thanks a lot.
    Regards.

  • Bulk data upload in DMS

    Hi Experts,
    I want to create documents along with attachments in DMS. Can I use LSMW for this purpose, or are there any BAdIs?
    Can we upload files using LSMW?
    Please mention a step-by-step solution to achieve this requirement.
    Regards,
    Sam

    Hi Sam,
    Additional data can also be added through BDC; we have done it in our current project.
    First make a BDC recording considering all the fields, and then pass the values in as variables.
    An ABAPer can show you this; take help from an ABAP expert.
    Hope this resolves the query.
    Regards,
    Ravindra
