Creation of data packages due to a large amount of datasets leads to problems

Hi Experts,
We have built our own generic extractor.
When data packages are created (due to a large amount of datasets), different problems occur.
For example:
Datasets are doubled and appear twice: once in package one and a second time in package two. Since those datasets are not identical, information is lost while uploading them to an ODS or Cube.
What can I do? SAP will not help because it is a generic DataSource.
Any suggestions?
BR,
Thorsten

Hi All,
Thanks a million for your help.
My conclusions from your answers are the following:
a) Since the ODS is a standard ODS, no datasets are deleted within the transformation; they are aggregated.
b) Uploading a huge amount of datasets is possible in two ways:
   b1) with selection criteria in the InfoPackage and several uploads
   b2) without selection criteria in the InfoPackage and therefore an automatic split of the datasets into data packages
c) Both ways should have the same result within the ODS.
Ok. Thanks for that.
So far I have only checked the data within the PSA. In the PSA the number of datasets is not equal for variants b1 and b2.
I guess this is normal technical behaviour of BI.
I am fine as long as the results in the ODS are the same for b1 and b2.
Have a nice day.
BR,
Thorsten

Similar Messages

  • Regarding TXT File data truncation due to large amount of data

    Hi Guys,
    I am downloading data to a .txt file in the background. The records get truncated due to the large amount of data; with less data it works fine.
    I have checked the internal table size, and in any case I have declared it with OCCURS 0 only.
    So please help me find out what the reason may be. I am confused: is there any limitation for TXT files?
    Please help me, guys.
    Thanks in advance.
    Prabhu.R

    Hi Rakesh,
    There are two ways:
    1. Ask your BASIS team to increase the memory level.
    2. Check the PACKAGE SIZE option of the SELECT statement.
    With it you don't select all the data at once but in packets of the specified size, so you fetch the data packet by packet and process it.
    Just press F1 on PACKAGE SIZE; that explanation will be enough to proceed further.
    Thanks,
    Vinod.

  • Data Transfer Process (several data packages due to a huge amount of data)

    Hi,
    a)
    I've been uploading data from ERP via PSA to an ODS and InfoCube.
    Due to the huge amount of data in ERP, BI splits the data into two data packages.
    When processing the data to the ODS, the system deletes a few datasets.
    This is done not in step "Filter" but in step "Transformation".
    General Question: How can this be?
    b)
    As described in a), data is split by BI into two data packages due to the amount of data.
    To avoid this behaviour I entered a few more selection criteria within the InfoPackage.
    As a result I upload the data several times, each time with different selection criteria in the InfoPackage.
    Finally I have the same data in the ODS as in a), but this time without data being deleted in step "Transformation".
    Question: What is the general behaviour of BI when splitting data into several data packages?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusions from your answers are the following:
    a) Since the ODS is a standard ODS, no datasets are deleted within the transformation; they are aggregated.
    b) Uploading a huge amount of datasets is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage and therefore an automatic split of the datasets into data packages
    c) Both ways should have the same result within the ODS.
    Ok. Thanks for that.
    So far I have only checked the data within the PSA. In the PSA the number of datasets is not equal for variants b1 and b2.
    I guess this is normal technical behaviour of BI.
    I am fine as long as the results in the ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten

  • Getting Short dump due to large amount of data

    Hi experts,
    When we run the RALM_ME_MEASP_FULL_DOWNLOAD_SD program, every time we get a short dump due to the large amount of data.
    Please suggest how to run this program without a short dump.
    Thanks & Regards
    Prashant Gupta

    Hi,
    I guess you are running the wrong app. The service you are running is for MAM20. If you are interested in MAM30 or MAM25, please use the correct service.
    If you have a look for
    *FULL_DOWNLOAD_SD
    you will find them all. Use the ones without the RALM in front; then the timeout should be solved as well. Besides that, there is a great guide available that also helps to solve some issues around server-driven replication:
    [MAM SERVER DRIVEN SETUP GUIDE|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/818ac119-0b01-0010-ba8b-b6e3f3490a63?quicklink=index&overridelayout=true]
    Hope that helps to solve your issue.
    Regards,
    Oliver

  • Send large amounts of data

    Hello everyone,
    I made an applet that receives data, signs it, and returns the signed data. When the amount of data is too big, I break it into blocks of 255 bytes and use the method Signature.update.
    OK, this is working fine, but performance is poor due to the large amount of blocks. Is it possible to increase the size of the blocks?
    Thanks.

    Hi,
    You cannot change the block size, but you can change what you send in.
    You may get better performance by sending multiples of your hash function's block size, as the card will not have to do internal buffering.
    You could also do as much of the work in your host application as possible and then send in only the data that you need to operate on with the private key. Generating the hash of the message does not require the private key, so it can be done on your host. You then send the result of the hash to the card to be encrypted with the private key. This will be the fastest method.
    Cheers,
    Shane
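
    A minimal host-side sketch of the approach Shane describes, assuming SHA-256 as the hash; the applet class/instruction bytes in the commented APDU lines are hypothetical placeholders, not part of the original thread:
    import java.security.MessageDigest;
    public class HostSideDigest {
        public static void main(String[] args) throws Exception {
            byte[] message = new byte[1000000]; // the large payload stays on the host
            // Hash on the host: this step needs no private key.
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(message);
            // Only the fixed-size 32-byte digest crosses to the card, e.g. as one APDU:
            // CommandAPDU cmd = new CommandAPDU(0x80, 0x10, 0x00, 0x00, digest);
            // ResponseAPDU sig = channel.transmit(cmd); // the card applies the private key
            System.out.println("bytes sent to card: " + digest.length);
        }
    }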

  • Azure Cloud service fails when sent large amount of data

    This is the error;
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine. Large amounts of data cause this error.
    How can I fix this??

    Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as maxReceivedMessageSize:
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes.

  • Open Large amount of data

    Hi
    I have a file on the application server in .dat format. It contains a large amount of data, maybe 2 million records or more, and I need to open the file to check the record count. Is there any software or any option to open the file? I have tried opening it with Notepad, Excel, ... and it gives an error.
    please let me know
    Thanks

    Hi,
    Try this:
    Go to AL11.
    Go to the file directory. For the file there will be a field called length, which is the total length of the file in characters.
    If you know the length of a single line, divide the length of the file by the length of a single line; I believe you will get the number of records. For example, a file of 260,000,000 characters with 130-character lines holds 2,000,000 records.
    Thanks,
    Naren
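
    If a copy of the file can be reached from a workstation, a short Java program can also count the records without loading the whole file into an editor (a sketch, assuming newline-delimited records; the file path arrives as a command-line argument):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    public class RecordCount {
        public static void main(String[] args) throws IOException {
            long count = 0;
            // Stream line by line so memory use stays constant, even for millions of records.
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                while (in.readLine() != null) {
                    count++;
                }
            }
            System.out.println(count + " records");
        }
    }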

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer with about 30,000 pages, and the only thing I get is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea), and if you don't find one, open a support message with SAP with very specific instructions for recreating the problem from a clean input schedule.
    Good luck,
    Ethan

  • How to send large amount of XML data in one CLOB variable

    Hi,
    I am sending a large amount of XML data to a TCP/IP port in one CLOB variable.
    My requirement is to send the whole data in one go in one CLOB variable.
    But that CLOB variable is not sufficient to hold all the data.
    Please suggest a solution.
    Thanks in advance

    Hi, here is my code:
    CREATE OR REPLACE PACKAGE BODY APPS.XXMB_WIP_PROD_TAG_DOOR_PKG
    AS
    PROCEDURE xxmb_get_xml_data_1270 (
    -- errbuf OUT VARCHAR2,
    -- retcode OUT NUMBER,
    p_org IN VARCHAR2,
    p_limit_to_global IN VARCHAR2,
    p_label IN VARCHAR2,
    p_printer IN VARCHAR2,
    p_quantity IN VARCHAR2,
    p_print_method IN VARCHAR2,
    p_enable_release IN VARCHAR2,
    p_enable_serial_no IN VARCHAR2,
    p_release IN VARCHAR2,
    p_rep_group IN VARCHAR2,
    p_cart_type IN VARCHAR2,
    p_cart_no_from IN VARCHAR2,
    p_cart_no_to IN VARCHAR2,
    p_serial_no IN VARCHAR2
    )
    AS
    CURSOR c_xml_data_door (
    p_org IN VARCHAR2,
    p_label IN VARCHAR2,
    p_printer IN VARCHAR2,
    p_quantity IN VARCHAR2,
    p_print_method IN VARCHAR2,
    p_rep_group IN VARCHAR2,
    p_release IN VARCHAR2,
    p_cart_type IN VARCHAR2,
    p_cart_no_from IN VARCHAR2,
    p_cart_no_to IN VARCHAR2,
    p_serial_no IN VARCHAR2
    )
    IS
    SELECT xxasa.item_id AS item_id, xcs.serial_number AS serial_number,xxcpf.cart_type,xcs.destination_cart_num cart,xcs.destination_slot_num slot
    CURSOR c_product_detail (
    l_product IN NUMBER,
    l_serial_num IN VARCHAR2,
    p_limit_to_global IN VARCHAR2
    )
    IS
    SELECT xcra_specie.reference_id AS reference_id,
    xcra_ege.attribute_value AS ege, xcs.item_id AS item_id,
    AND msib.inventory_item_id = l_product
    and xcs.organization_id = nvl(p_org, xcs.organization_id)
    AND xcs.serial_number = NVL (l_serial_num, xcs.serial_number);
    /*-------------------------------------------------------+
    | Cursor to fetch the data for special Message Label |
    +-------------------------------------------------------*/
    CURSOR c_count (p_item_id IN NUMBER)
    IS
    SELECT xcrav.attribute_value, xcs.serial_number, xcs.cabinet_number
    FROM xxmb_czmfg_ref_attributes xcrav,
    cz_config_attributes cca,
    AND msib.organization_id = xcs.organization_id
    AND msib.inventory_item_id = xcs.item_id;
    /*--------------------------+
    | Common variables |
    +--------------------------*/
    v_limit_to_global VARCHAR2 (100);
    l_label_count NUMBER := 1;
    total_rec NUMBER;
    l_rewrite VARCHAR2 (1) := 'N';
    l_file_count NUMBER := 1;
    l_separate_line VARCHAR2 (10);
    BEGIN
    fnd_profile.get ('WMS_LABEL_OUTPUT_DIRECTORY', l_output_dir);
    fnd_profile.get ('WMS_LABEL_FILE_PREFIX', l_output_file_prefix);
    l_request_id := apps.fnd_global.conc_request_id;
    l_output_file_name :=
    l_output_file_prefix || l_request_id || l_file_end;
    l_dir_seperator := '/';
    IF (INSTR (l_output_dir, l_dir_seperator) = 0)
    THEN
    l_dir_seperator := '\';
    END IF;
    v_label := p_label;
    v_printer := p_printer;
    v_quantity := p_quantity;
    V_LIMIT_TO_GLOBAL := P_LIMIT_TO_GLOBAL;
    L_XML_CONTENT := '<?xml version="1.0" encoding="UTF-8" ?>';
    L_XML_CONTENT := L_XML_CONTENT || '<!DOCTYPE labels SYSTEM "label.dtd">';
    L_XML_CONTENT := L_XML_CONTENT || '<labels>';
    FOR r_xml_data_door IN c_xml_data_door (p_org,
    p_label,
    p_printer,
    p_quantity,
    p_print_method,
    p_rep_group,
    p_release,
    p_cart_type,
    p_cart_no_from,
    p_cart_no_to,
    p_serial_no
    )
    LOOP
    -- dbms_output.put_line ( 1 );
    FOR r_product_detail IN
    c_product_detail (r_xml_data_door.item_id,
    r_xml_data_door.serial_number,
    v_limit_to_global
    )
    LOOP
    -- dbms_output.put_line ( 2 );
    -- l_xml_content := '<?xml version="1.0" encoding="UTF-8" ?>';
    -- l_xml_content := l_xml_content || '<!DOCTYPE labels SYSTEM "label.dtd">';
    -- l_xml_content := l_xml_content || '<labels>';
    fnd_file.put_line (fnd_file.LOG, 'label cnt: ' || l_label_count);
    dbms_output.put_line (l_label_count);
    L_XML_CONTENT := L_XML_CONTENT || '<label _FORMAT='
    || '"'
    || 'lib://FRD/'
    || v_label
    || '"'
    || ' _PRINTERNAME='
    || '"'
    || v_printer
    || '"'
    || ' _QUANTITY='
    || '"'
    || v_quantity
    || '"'
    || '>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Color">'
    || R_PRODUCT_DETAIL.COLOR
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "Model">'
    || R_PRODUCT_DETAIL.model
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Build_Date">'
    || R_PRODUCT_DETAIL.BUILD_DATE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Cart">'
    || R_PRODUCT_DETAIL.ASSY_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Slot">'
    || R_PRODUCT_DETAIL.ASSY_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Line">'
    || R_PRODUCT_DETAIL.ASSY_LINE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Cart">'
    || R_PRODUCT_DETAIL.FINISH_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Slot">'
    || R_PRODUCT_DETAIL.FINISH_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number">'
    || R_PRODUCT_DETAIL.SERIAL_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number_Barcode">'
    || R_PRODUCT_DETAIL.SERIAL_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Specie">'
    || R_PRODUCT_DETAIL.SPECIE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "Truck_Group">'
    || R_PRODUCT_DETAIL.TRUCK_GROUP
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Label_Sequence_No">'
    || L_LABEL_COUNT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "WIP_Cart">'
    || R_PRODUCT_DETAIL.WIP_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "WIP_Slot">'
    || R_PRODUCT_DETAIL.WIP_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Cabinet_Sequence_No">'
    || R_PRODUCT_DETAIL.CAB_SEQ_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "RAW_PART_NO">'
    || R_PRODUCT_DETAIL.RAW_PART_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "JC">'
    || R_PRODUCT_DETAIL.JC
    || '</variable>' ;
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "QC">'
    || R_PRODUCT_DETAIL.QC
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Thickness">'
    || R_PRODUCT_DETAIL.THICKNESS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Width">'
    || R_PRODUCT_DETAIL.width
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Length">'
    || R_PRODUCT_DETAIL.length
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Overlay">'
    || R_PRODUCT_DETAIL.OVERLAY
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Options">'
    || R_PRODUCT_DETAIL.OPTIONS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stop">'
    || R_PRODUCT_DETAIL.stop
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Profile_No">'
    || R_PRODUCT_DETAIL.PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Door_Style">'
    || R_PRODUCT_DETAIL.DOOR_STYLE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Glaze">'
    || R_PRODUCT_DETAIL.GLAZE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Shape">'
    || R_PRODUCT_DETAIL.SHAPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Glass">'
    || R_PRODUCT_DETAIL.GLASS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Side">'
    || R_PRODUCT_DETAIL.HINGE_SIDE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Type">'
    || R_PRODUCT_DETAIL.HINGE_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "EGE">'
    || R_PRODUCT_DETAIL.EGE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Door_Style_Code">'
    || R_PRODUCT_DETAIL.DOOR_STYLE_CODE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Technique">'
    || R_PRODUCT_DETAIL.FINISH_TECHNIQUE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Location">'
    || R_PRODUCT_DETAIL.HINGE_LOCATION
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Construction_Type">'
    || R_PRODUCT_DETAIL.CONSTRUCTION_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_Type">'
    || R_PRODUCT_DETAIL.PANEL_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_Profile_No">'
    || R_PRODUCT_DETAIL.PANEL_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_Profile_No">'
    || R_PRODUCT_DETAIL.RAIL_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_1_Length">'
    || R_PRODUCT_DETAIL.RAIL_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_Profile_No">'
    || R_PRODUCT_DETAIL.STILE_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_2_Length">'
    || R_PRODUCT_DETAIL.RAIL_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_1_Length">'
    || R_PRODUCT_DETAIL.STILE_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_2_Length">'
    || R_PRODUCT_DETAIL.STILE_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_1_Width">'
    || R_PRODUCT_DETAIL.PANEL_1_WIDTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_1_Length">'
    || R_PRODUCT_DETAIL.PANEL_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_2_Width">'
    || R_PRODUCT_DETAIL.PANEL_2_WIDTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_2_Length">'
    || R_PRODUCT_DETAIL.PANEL_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'</label>';
    /*-----------------------------------------+
    | Handling XML data for special message |
    +-----------------------------------------*/
    FOR rec_count IN c_count (r_product_detail.item_id)
    LOOP
    L_XML_CONTENT := L_XML_CONTENT || '<label _FORMAT='
    || '"'
    || 'lib://FRD/SpecMessage_Door.btw'
    || '"'
    || ' _PRINTERNAME='
    || '"'
    || v_printer
    || '"'
    || ' _QUANTITY='
    || '"'
    || v_quantity
    || '"'
    || '>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number">'
    || REC_COUNT.SERIAL_NUMBER
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Special_Message">'
    || REC_COUNT.ATTRIBUTE_VALUE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Cabinet_Sequence_No">'
    || REC_COUNT.CABINET_NUMBER
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'</label>';
    EXIT WHEN c_count%NOTFOUND;
    end LOOP;
    -- L_XML_CONTENT := L_XML_CONTENT || '</labels>';
    fnd_file.put_line (fnd_file.LOG, l_xml_content);
    dbms_output.put_line ( l_xml_content );
    L_LABEL_COUNT := L_LABEL_COUNT + 1;
    -- apps.inv_print_request.sync_print_tcpip (l_xml_content,
    -- l_job_status,
    -- l_printer_status,
    -- l_status_type,
    -- l_return_status,
    -- l_return_msg
    END LOOP;
    END LOOP;
    l_xml_content := l_xml_content || '</labels>';
    fnd_file.put_line (fnd_file.LOG, l_xml_content);
    apps.inv_print_request.sync_print_tcpip (l_xml_content,
    l_job_status,
    l_printer_status,
    l_status_type,
    l_return_status,
    l_return_msg
    );
    L_XML_CONTENT := null;
    /*--------------------------------------------------------------------------------------+
    | APPS.INV_PRINT_REQUEST.SYNC_PRINT_TCPIP will send the XML data to TCP/IP Port |
    +--------------------------------------------------------------------------------------*/
    fnd_file.put_line (fnd_file.LOG,
    'Printer Status:' || ' ' || l_printer_status
    );
    fnd_file.put_line (fnd_file.LOG,
    'Return Status:' || ' ' || l_return_status
    );
    fnd_file.put_line (fnd_file.LOG,
    'Return Message:' || ' ' || L_RETURN_MSG
    );
    COMMIT;
    EXCEPTION
    WHEN OTHERS
    THEN
    fnd_file.put_line
    (fnd_file.LOG,
    'Unexpected error in the xxmb_get_xml_data_1270 procedure, error is : '
    || SQLERRM
    || ', '
    || SQLCODE
    );
    END xxmb_get_xml_data_1270;
    END xxmb_wip_prod_tag_door_pkg;
    /

  • Create new table, 1 column has large amount of data, column type?

    Hi, sure this is a noob question but...
    I exported a table from sql server, want to replicate in oracle.
    It has 3 columns; one of the columns holds the configuration for reports, so each field in that column has a large amount of text in it, some upwards of 30k characters.
    What kind of column do I make this in Oracle? I believe VARCHAR2 has a 4k limit.
    creation of the table...
    CREATE TABLE "RPT_BACKUP"
    "NAME" VARCHAR2(1000 BYTE),
    "DATA"
    "USER" VARCHAR2(1000 BYTE)
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER DEFAULT DIRECTORY "XE_FTP" ACCESS PARAMETERS ( records delimited BY newline
    skip 1 fields terminated BY ',' OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES
    ARE NULL ) LOCATION ( 'REPORT_BACKUP.csv' ))
    Edited by: Jay on May 3, 2011 5:42 AM
    Edited by: Jay on May 3, 2011 5:43 AM

    Ok, I tried that, but now when I do a SELECT * FROM the table I get:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-30653: reject limit reached
    ORA-06512: at "SYS.ORACLE_LOADER", line 52
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages and take appropriate action.

  • FI-SL data package too large on delta

    Hi Guys,
    I'm loading data from the FI-SL totals extractor (3FI_SL_xx_TT).
    No problem with the delta init, but when we try to load a regular delta there is a problem with the size of the packages; the package sizes are totally irregular: 70,158, 52,398, 299,784, 299,982, 57,243, ... So I'm getting a TSV_TNEW_PAGE_ALLOC_FAILED error on the transfer from PSA to InfoCube.
    Unfortunately we have to load a heavy bunch of data in the delta because they want this data monthly (a business issue), so I'm not able to load it in small requests every day.
    To make things worse, I have the start routine to carry forward the balance (sapnote_0000577644 - DataSource 0EC_PCA_3 and FI-SL line item DataSources).
    It's not taking the package size of the InfoPackage into consideration (1000 KB / 2 processes / 10 pkgs per IDoc info).
    Config:
    - BI 7.0 SP 23/AIX/Oracle
    - ECC: AIX/Oracle PI_BASIS 2008_1_700 000 / SAP_ABA 700 015 / SAP_BASIS 700 015 / SAP_APPL 600 013
    I checked SAP note sapnote_0000917737 - Reducing the request size for totals record DataSources, but it does not apply to us...
    Someone have any idea how to fix it?
    Thanks in advance,
    Alex

    Chubbyd4d wrote:
    Hi masters,
    I have a package of 11,000 lines of code with around 70 procedures and functions in my DB.
    I use arrays to handle the global data, and almost every procedure and function in the package uses the arrays.
    The package was creating problems in debugging; it gave me the "Program too large" error, so the idea came up to split the package into smaller packages.
    However, the problems that I would like to discuss are:
    what is the advantage of splitting the package into a few smaller packages, and what is the impact on memory?
    If I chunk the package into a few packages, will it be easier to use a GTT instead of arrays for the global data? What is the impact on performance, since a GTT uses I/O?

    One of our larger packages is over 20,000 lines and around 500 procedures on a 10.2 database.
    No problem or complaints about it being too large.
    As for splitting packages, that's entirely up to you. Packages are designed to put all related functionality in one place. If you feel a need to split it out into separate packages, then perhaps consider grouping together related functionality at a smaller granularity than you previously had.
    In relation to memory, smaller packages will take up less space obviously, but if you are going to be calling all the packages anyway then they'll all get loaded into memory and still take up about the same amount of memory (give or take a little).
    GTT's are generally a better idea than arrays if you are dealing with large amounts of data or you need to run queries against that data. If you need to effectively query against the data then you'll probably end up with worse performance processing an array in a query style than the IO overhead of a GTT. Of course you have to look at the individual processes and weigh up the pros and cons for each.

  • Couldn't copy large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from EnterpriseDB (EDB) to Oracle and vice versa.
    The datatype of the field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO to bind values.
    I can successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (though I can read large amounts of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I can copy any amount of data from Oracle to EDB but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to directly call the insert with a column of XMLType. The solution I was using was to build a wrapper function in PL/SQL.
    Here it is (for insert, but I suppose the update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
    insert into AQAOST_FILES (file_no,program_no,ost_xml,btl_xml) values(file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    here is the sqlmap file I used
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    and here is a simple program:
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
     public class TestInsertXMLType {
          public static void main(String[] args) throws Exception {
               String resource = "sql-map-config-xmlt.xml";
               Reader reader = Resources.getResourceAsReader(resource);
               SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
               OracleDataSource dataSource = new OracleDataSource();
               dataSource.setUser("test");
               dataSource.setPassword("test");
               dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
               Connection connection = dataSource.getConnection();
               sqlMap.setUserConnection(connection);
               AqAost aqAost = new AqAost();
               aqAost.setFileNo(3);
               aqAost.setProgramNo("prg");
               Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
               Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
               aqAost.setOstXML(readerToString(ostXMLReader));
               aqAost.setBltXML(readerToString(bltXMLReader));
               sqlMap.insert("insert", aqAost);
               connection.commit();
          }

          public static String readerToString(Reader reader) {
               StringWriter writer = new StringWriter();
               char[] buffer = new char[2048];
               int charsRead = 0;
               try {
                    while ((charsRead = reader.read(buffer)) > 0) {
                         writer.write(buffer, 0, charsRead);
                    }
               } catch (IOException ioe) {
                    throw new RuntimeException("error while converting reader to String", ioe);
               }
               return writer.toString();
          }
     }
     package com.sg2net.jdbc;

     public class AqAost {
          private long fileNo;
          private String programNo;
          private String ostXML;
          private String bltXML;

          public long getFileNo() {
               return fileNo;
          }
          public void setFileNo(long fileNo) {
               this.fileNo = fileNo;
          }
          public String getProgramNo() {
               return programNo;
          }
          public void setProgramNo(String programNo) {
               this.programNo = programNo;
          }
          public String getOstXML() {
               return ostXML;
          }
          public void setOstXML(String ostXML) {
               this.ostXML = ostXML;
          }
          public String getBltXML() {
               return bltXML;
          }
          public void setBltXML(String bltXML) {
               this.bltXML = bltXML;
          }
     }
    I tested the insert and it works correctly
    ciao,
    Giovanni

  • Streaming large amounts of data over socket causes corruption?

    I'm writing an app to transfer large amounts of data via a simple client/server architecture between two machines.
    Problem: if I send the data too 'fast', the data arrives corrupted:
    - Calls to read() return wrong data (wrong 'crc')
    - Subsequent calls to read() do not return -1 but allow me to read e.g. another 60 or 80 KB.
    - available() always returns '0'; but I'll get rid of that method anyway (as recommended in other forum entries).
    The behaviour is somewhat difficult to reproduce, but it fails reliably for me when transferring the data between two separate machines and when setting the number of packets (Sender.TM) to 1000 or larger.
    Workaround: reduce the number of packets sent to e.g. 1, or introduce the 'sleep' on the sender side. Another workaround: changing to java.nio.* alone did not help, but when I got rid of the Streams and used solely ByteBuffers, the problem disappeared. Unfortunately the Streams are required by other parts of my application.
    I'm running the code on two dual-CPU machines connected via
    Below are the code of the Sender and the Listener. Please excuse the style as this is only to demonstrate the problem.
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.SocketChannel;
    import java.util.Arrays;
    public class SenderBugStreams {
        public static void main(String[] args) throws IOException {
            InetSocketAddress targetAdr = new InetSocketAddress(args[0], ListenerBugStreams.DEFAULT_PORT);
            System.out.println("connecting to: "+targetAdr);
            SocketChannel socket = SocketChannel.open(targetAdr);
            sendData(socket);
            socket.close();
            System.out.println("Finished.");
        }

        static final int TM = 10000;
        static final int TM_SIZE = 1000;
        static final int CRC = 2;
        static int k = 5;

        private static void sendData(SocketChannel socket) throws IOException {
            OutputStream out = Channels.newOutputStream(socket);
            byte[] ba = new byte[TM_SIZE];
            Arrays.fill(ba, (byte)(k++ % 127));
            System.out.println("Sending..."+k);
            for (int i = 0; i < TM; i++) {
                out.write(ba);
    //            try {
    //                Thread.sleep(10);
    //            } catch (InterruptedException e) {
    //                e.printStackTrace();
    //                throw new RuntimeException(e);
    //            }
            }
            out.write(CRC);
            out.flush();
            out.close();
        }
    }
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.ServerSocketChannel;
    import java.nio.channels.SocketChannel;
    public class ListenerBugStreams {
        static int DEFAULT_PORT = 44521;

        public static void main(String[] args) throws IOException {
            ServerSocketChannel serverChannel = ServerSocketChannel.open();
            serverChannel.socket().bind(new InetSocketAddress(DEFAULT_PORT));
            System.out.print("Waiting...");
            SocketChannel clientSocket = serverChannel.accept();
            System.out.println(" starting, IP=" + clientSocket.socket().getInetAddress() +
                ", Port="+clientSocket.socket().getLocalPort());
            //read data from socket
            readData(clientSocket);
            clientSocket.close();
            serverChannel.close();
            System.out.println("Closed.");
        }

        private static void readData(SocketChannel clientSocket) throws IOException {
            InputStream in = Channels.newInputStream(clientSocket);
            //read and ingest objects
            byte[] ba = null;
            for (int i = 0; i < SenderBugStreams.TM; i++) {
                ba = new byte[SenderBugStreams.TM_SIZE];
                in.read(ba); // BUG: may return before the buffer is full (see the reply below)
                System.out.print("*");
            }
            //verify checksum
            int crcIn = in.read();
            if (SenderBugStreams.CRC != crcIn) {
                System.out.println("ERROR: Invalid checksum: "+SenderBugStreams.CRC+"/"+crcIn);
            }
            System.out.println(ba[0]);
            int x = in.read();
            int remaining = 0;
            while (x != -1) {
                remaining++;
                x = in.read();
            }
            System.out.println("Remaining:"+in.available()+"/"+remaining);
            System.out.println(" "+SenderBugStreams.TM+" objects ingested.");
            in.close();
        }
    }

    Here is your trouble:
    in.read(ba);
    read(byte[]) does not read N bytes, it reads up to N bytes. If one byte has arrived then it reads and returns that one byte. You always need to check the return value of read(byte[]) to see how much you got (and also check for EOF). TCP chops up the written data into whatever packets it feels like, and that makes read(byte[]) pretty random.
    You can use DataInputStream which has a readFully() method; it loops calling read() until it gets the full buffer's worth. Or you can write a little static utility readFully() like so:
        // Returns false if it hits EOF immediately. Otherwise reads the full buffer's
        // worth. If it encounters EOF mid-packet it throws an IOException.
        public static boolean readFully(InputStream in, byte buf[])
            throws IOException
        {
            return readFully(in, buf, 0, buf.length);
        }

        public static boolean readFully(InputStream in, byte buf[], int pos, int len)
            throws IOException
        {
            int got_total = 0;
            while (got_total < len) {
                int got = in.read(buf, pos + got_total, len - got_total);
                if (got == -1) {
                    if (got_total == 0)
                        return false;
                    throw new EOFException("readFully: end of file; expected " +
                                           len + " bytes, got only " + got_total);
                }
                got_total += got;
            }
            return true;
        }
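
    For example, the listener's readData loop above could call it like this instead of the bare in.read(ba) (a sketch reusing the thread's own names; EOFException comes from java.io):
        for (int i = 0; i < SenderBugStreams.TM; i++) {
            byte[] ba = new byte[SenderBugStreams.TM_SIZE];
            if (!readFully(in, ba)) {
                throw new EOFException("stream ended after " + i + " packets");
            }
            System.out.print("*");
        }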

  • Storing large amounts of data

    Hello,
    I'd like to use Berkeley DB for logging large amounts of data, i.e. structures that are ~400 KB in size which I need to store ~10 times per second for up to several hours, but I run into quite big performance issues the more records I insert into the database. I've set the pagesize to its maximum (64 KB; I split my data into several packages so it doesn't get stored on an overflow page) and experimented with several cache sizes (8 MB, 64 MB, 2 GB, 4 GB), but I haven't managed to get rid of the performance issues, independent of which access method I use (although I got the "best" results when using DB_QUEUE, but that varies heavily from day to day).
    To get to the point: performance starts at "0" seconds per insert (where 1 "insert" = 7 real inserts because of splitting up the data); between the 16,750th and 17,000th insertion it takes ~0.00352 seconds per insert, by the 36,000th insertion it already takes about 0.0074 seconds per insert, and so on...
    Does anyone have an idea on how I can increase my performance? Because when the time needed for each insertion keeps increasing over time, it's not possible to keep the program running at its intended speed at some point.
    Thanks,
    Thomas

    Hello,
    A good starting point are the suggestions in the Berkeley DB Reference Guide at:
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/am_misc_tune.html
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/transapp_tune.html
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/transapp_throughput.html
    Thanks,
    Sandra

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use a CachedRowSet on the bean for the page for getting the data. This works very well in my development environment at the moment; I am testing on a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So I was wondering, apart from that instance, when viewing in paged mode does the component get all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of paging. How do you specify which set of 20 records to extract from SQL?
    Thanks for your help!!
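
    One common way to page on the database side, sketched here with plain JDBC and Oracle-style ROWNUM windowing (the orders table and its columns are hypothetical), is to ask for just the requested window of rows instead of the whole result set:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    public class PagedQuery {
        // Returns one page, e.g. rows 21-40 for page = 2 and pageSize = 20.
        // The caller is responsible for closing the ResultSet and its statement.
        static ResultSet fetchPage(Connection con, int page, int pageSize) throws SQLException {
            String sql =
                "SELECT * FROM ("
              + "  SELECT t.*, ROWNUM rn FROM ("
              + "    SELECT id, name FROM orders ORDER BY id" // hypothetical query
              + "  ) t WHERE ROWNUM <= ?"
              + ") WHERE rn > ?";
            PreparedStatement ps = con.prepareStatement(sql);
            ps.setInt(1, page * pageSize);       // upper bound of the window
            ps.setInt(2, (page - 1) * pageSize); // lower bound of the window
            return ps.executeQuery();
        }
    }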
