Load hung due to hung data packet

Hi,
We have a daily problem where the load hangs because of a hung data packet. Every day we change the QM status to red and update the data packet manually, after which the load completes. Is there any solution to prevent this problem other than the manual fix?
Sridhar

Hi,
You can try reducing the data packet size of the IP.
Go to the maintenance of the IP.
Scheduler menu option --> DataS. Default Data Transfer.
In the pop-up window you can see the
maximum size of a data packet in kByte for the Full Upload, Delta Upload and Initializing Delta.
E.g. if you are facing the problem in a full load,
reduce the data packet size to half.
Save and then execute the IP.
This should solve your problem.
Regards
Shilpa

Similar Messages

  • Start routine: only once per the load but not once per data packet

    Hi,
    I would like to execute some code in the start routine (in the update rules) only once per load, not once per data packet.
    How can I implement this?
    Regards,

    I once had the same requirement, but in a DataSource in R/3. I'm not sure if the same solution would work in BW, though. I used a memory ID to keep the variable value between packets:
    DATA: ... n_globalvar TYPE n ...
    Then I added
    IMPORT n_globalvar FROM MEMORY ID 'ZMEMID01'.
    at the start of the routine to retrieve the value from the memory ID into the variable.
    At the end of the code, I exported the variable back to the same memory ID:
    EXPORT n_globalvar TO MEMORY ID 'ZMEMID01'.
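    A minimal sketch of that pattern as it might look in a BW update-rules start routine, assuming the data packets are processed sequentially in the same work process (which, as noted above, is not guaranteed in BW); the memory ID 'ZMEMID01' and the counter name are purely illustrative:
    * Run the expensive part of the start routine only once per load.
    * Assumption: the ABAP memory is empty before the first packet of the load;
    * otherwise clear the memory ID explicitly before the load starts.
    DATA: n_packet_count TYPE i.
    IMPORT n_packet_count FROM MEMORY ID 'ZMEMID01'.  "stays initial on packet 1
    IF n_packet_count IS INITIAL.
    * ... code that must run only once per load ...
    ENDIF.
    n_packet_count = n_packet_count + 1.
    EXPORT n_packet_count TO MEMORY ID 'ZMEMID01'.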

  • Data Load Error due to Master data deletion

    Hi,
    While doing the transactional data load I am getting following error.
    Master data/text of characteristic ZFOCUSGRP already deleted (Message no. RSDMD138)
    ZFOCUSGRP is an InfoObject (with text). Last week we changed the source system from CRM to R/3, and during that time we deleted all the texts of ZFOCUSGRP manually from the table.
    This error does not happen every time; sometimes the load runs properly. I ran RSRV for the InfoObject ZFOCUSGRP and the InfoCube, but the error still occurs.
    Is there any way to fix this error?
    Thanks in advance.
    Thanks
    Vinod

    Check these threads:
    Re: Error while running InfoPackage
    Master data/text of characteristic 0MATERIAL already deleted
    Master data/text of characteristic ZXVY already deleted
    Hope it helps..

  • Data packet not yet processing in ODS load??

    Hi all,
    I got an error when loading data from the IS to the ODS. Can someone let me know why it happened and how to resolve it? Thank you in advance.
    Here is the error message in the monitor:
    Warning: data packets 1 & 2 arrived in BW; processing: data packet not yet processed.
    (No data packet numbers could be determined for request REQU_77H7ERP54VXW5PZZP5J6DYKP7)
    Processing end:
    Transfer rules (0 records): missing messages
    Update PSA (0 records): missing messages
    Update rules (0 records): missing messages

    John,
    I don't think it's a space problem. In ST22 open the dump and read the detailed sections "What happened" and "How to correct the error"; that will help you solve the problem.
    Check note 613440 as well.
    Note 647125
    Symptom
    A DYNPRO_FIELD_CONVERSION dump occurs on screen 450 of the RSM1 function group (saplrsm1).
    Other terms
    DYNPRO_FIELD_CONVERSION, 450, SAPLRSM1
    Reason and Prerequisites
    This is caused by a program error.
    The screen contains unused, hidden fields/screen elements that are too small for the screen check that was intensified with the current Basis patch (kernel patch 880). These fields originate in the 4.0B period of BW 1.0 and are never used.
    Solution
    Depending on your BW system release, you must solve the problem as follows:
    BW 3.0B
               Import Support Package 14 for 3.0B (BW 3.0B Patch 14 or SAPKW30B14) into your BW system. This Support Package will be available when note 571695 with the short text "SAPBWNews BW 3.0B Support Package 14", which describes this Support Package in more detail, is released for customers.
    BW 3.1 Content
               Import Support Package 8 for 3.1 Content (BW 3.10 Patch 08 or SAPKW31008) into your BW system. This Support Package will be available when note 571743 with the short text "SAPBWNews BW 3.1 Content Support Package 08" is released for customers.
    The dump occurs with the invisible G_NEW_DATUM date field on the bottom right of the screen, which is only 1 byte long and can be deleted.
    You can delete the following unused fields/screen elements:
    %A_G_NEW_NOW                Selectionfield group
    G_NEW_ZEIT                  Input/output field
    G_NEW_UNAME                Input/output field
    G_NEW_DATUM                Input/output field
    %#AUTOTEXT021               Text field
    G_NEW_NOW                  Selection button
    G_NEW_BATCH                 Selection button
    You can delete these fields/screen elements because they are not used anywhere.
    This deletion does not cause any problems.
    After you delete the fields/screen elements, you must also delete the following rows in the flow logic in screen 450:
    FIELD G_NEW_DATUM           MODULE DOKU_NEW_DATUM.
    FIELD G_NEW_ZEIT            MODULE DOKU_NEW_ZEIT.
    The function group is then syntactically correct.
    Unfortunately, we cannot provide an advance correction.
    The aforementioned notes may already be available to provide information in advance of the Support Package release. However, in that case the short text still contains the words "preliminary version".
    For more information on BW Support Packages, see note 110934.
    Thanks
    Ram

  • How to skip an entire data packet while data loading

    Hi All,
    We want to skip some records based on a condition while loading from the PSA to the Cube, for which we have written ABAP code in the start routine.
    This is working fine.
    But there is a data packet in which all the records are supposed to be skipped, and there it gives a dump with the exception CX_RSROUT_SKIP_RECORD.
    The ABAP code written is
    DELETE SOURCE_PACKAGE WHERE FIELD = 'ABC'.
    For that particular data packet all the records satisfy the condition and get deleted.
    Please advise how to skip the entire data packet if all the records satisfy the delete condition, and how to handle the exception CX_RSROUT_SKIP_RECORD.
    Edited by: Rahir on Mar 26, 2009 3:26 PM
    Edited by: Rahir on Mar 26, 2009 3:40 PM

    Hi All,
    The dump I am getting is:
    The exception 'CX_RSROUT_SKIP_RECORD' was raised, but it was not caught anywhere along the call hierarchy.
    Since exceptions represent error situations and this error was not adequately responded to, the running ABAP program 'GPD4PXLIP83MFQ273A2M8HU4ULN' has to be terminated.
    But this happens only when all the records in a particular data packet get skipped.
    For the rest of the data packets it works fine.
    I think if the data packet (with 0 records) itself could be skipped, this would be resolved, or the exception would be taken care of.
    Please advise how to resolve this and avoid 'CX_RSROUT_SKIP_RECORD'.
    Edited by: Rahir on Mar 27, 2009 6:25 AM
    Edited by: Rahir on Mar 27, 2009 7:34 AM
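    A hedged sketch of the two usual patterns in a BI 7.0 transformation; the field name FIELD and the value 'ABC' are taken from the post above, and whether a 0-record packet is then accepted without further messages should be verified on your release:
    * Option 1 - start routine: delete the matching records up front. If every
    * record of this data packet matches, SOURCE_PACKAGE is simply left empty;
    * no exception is raised here.
    DELETE source_package WHERE field = 'ABC'.
    * Option 2 - field routine: skip one record at a time; the framework itself
    * catches CX_RSROUT_SKIP_RECORD in this context.
    IF source_fields-field = 'ABC'.
      RAISE EXCEPTION TYPE cx_rsrout_skip_record.
    ENDIF.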

  • Last Data packet for the data load.

    Hi All,
    How do I find the last data packet loaded to the target for a data load in SAP BI? Is there a table for this?
    Thank you,
    Adhvi

    Hi Adhvirao,
    When you load data from ECC, go to the monitor ---> Details ---> check the data packets which failed, double-click on such a data packet, and at the bottom you will be able to see the IDoc number on the ECC side that is still pending in ECC and contains data. You can execute it manually towards the BI side through BD87.
    Thanks,
    Deepak

  • Data Load Fails due to duplicate records from the PSA

    Hi,
    I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
    Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
    How can I set up the process chain to do so?
    Your answer to the above two questions is appreciated.
    Thanks,

    Hi Sesh,
    There are two places where the DTP checks for duplicates.
    First, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In this case you first have to clean up the previous error stack.
    Second, it will clean up duplicates across data packages, provided the option is set in your DataSource. But note that this will not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation.
    Hope this helps,
    Pieter
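    A minimal sketch of the start-routine filtering Pieter mentions, for duplicates within one data package; the key field MATNR is only an example, so replace it with the real semantic key of your master data:
    * Keep a single record per key within this data package.
    SORT source_package BY matnr.
    DELETE ADJACENT DUPLICATES FROM source_package COMPARING matnr.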

  • Data packets/size - Generic extractor

    Hi all,
    We built a custom function module based DataSource and it extracts data to BW in one big packet of 900,000+ records, so the load takes about 18 hours. We are trying to split the BW extraction into smaller data packets to improve performance but are unable to do so. Following is our extraction program.
    Please let me know where we are going wrong.
    This program fetches/builds E_T_DATA. The issue is that it does not split the data into packets the way the SAP standard extraction programs do.
    ""Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"  TABLES
    *"      E_T_DATA STRUCTURE  Z0333W OPTIONAL
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
      DATA: lr_range_name TYPE rsselect-fieldnm.
      DATA: st_e_t_data TYPE z0333w.
      STATICS: l_cursor TYPE cursor.
      STATICS: called(1) TYPE c VALUE 'N'.
    * Maximum number of lines for DB table
      STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
      FIELD-SYMBOLS: <l_range> TYPE ANY,
                     <l_t_range> TYPE STANDARD TABLE.
      IF i_initflag = 'X'.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Fill parameter buffer for data extraction calls
        g_s_interface-requnr    = i_requnr.
        g_s_interface-isource   = i_dsource.
        g_s_interface-maxsize   = i_maxsize.
        g_s_interface-initflag  = i_initflag.
        g_s_interface-datapakid = i_datapakid.
        g_flag_interface_initialized = sbiwa_c_flag_on.
        REFRESH g_t_select.
        REFRESH g_t_fields.
        APPEND LINES OF i_t_select TO g_t_select.
        APPEND LINES OF i_t_fields TO g_t_fields.
      ELSE.
    * First data package of first table -> open cursor
        IF g_counter_datapakid = 0.
    *--Get selection ranges
          LOOP AT i_t_select.
            MOVE-CORRESPONDING i_t_select TO <l_range>.
            APPEND <l_range> TO <l_t_range>.
          ENDLOOP.
          l_maxsize = g_s_interface-maxsize.
    * Fetch plants for the company code
          PERFORM get_plants.
    * Fetch MAST data into an internal table as we will be using MAST for validation
    * of whether a BOM exists or not.
          SELECT * FROM mast INTO TABLE it_mast
                             WHERE stlan = '1' OR stlan = '6'.
          SORT it_mast BY matnr werks stlan.
    * Material BOM information
    * First data package -> OPEN CURSOR
         OPEN CURSOR WITH HOLD l_cursor FOR
        SELECT mast~matnr
               mast~werks
               mast~stlnr
               mast~stlan
               mast~stlal
               stko~stlty
       FROM mast INNER JOIN stko
         ON stkostlnr = maststlnr AND
            stkostlal = maststlal
       FOR ALL entries IN gt_werks
      WHERE mast~matnr IN gr_matnr AND
            mast~werks IN gr_werks AND
            mast~stlan IN gr_stlan AND
            mast~werks = gt_werks-werks AND
            mast~stlal = '01' AND
            stko~stlty = 'M'  AND                    "Material BOM only
             ( mast~stlan = '1' OR mast~stlan = '6' ).
        ENDIF.
    * Fetch records into the interface table
    * named E_T_'Name of extract structure'.
        REFRESH: gt_mat_bom,gt_mat_bom1.
        FETCH NEXT CURSOR l_cursor
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE  gt_mat_bom
                   PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR l_cursor.
          RAISE no_more_data.
        ELSE.
    * Get BOM data and fill E_T_DATA
          PERFORM get_bom_data TABLES e_t_data.
        ENDIF.
    * Increment package
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDIF.
    ENDFUNCTION.
    Thanks,
    Anirudh.

    I'm not sure, but this might help:
    * Fetch records into interface table.
    * named E_T_'Name of extract structure'.
      DO.
        REFRESH: gt_mat_bom,gt_mat_bom1.
        FETCH NEXT CURSOR l_cursor
          APPENDING CORRESPONDING FIELDS
          OF TABLE gt_mat_bom
          PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          EXIT.
        ELSE.
    * get BOM data and fill E_T_DATA
          PERFORM get_bom_data TABLES e_t_data.
        ENDIF.
    * Increment Package
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDDO.
      CLOSE CURSOR l_cursor.
      RAISE no_more_data.
    Rob

  • Error in data loads - Error occurred in the data selection

    Hi,
    We have to do an init load for period 010 from DataSource 0EC_PCA_3.
    When the period changes we usually copy the previous InfoPackage, change the fiscal year/period and run an init load.
    This time we had to change the fiscal period to 010; we created a new InfoPackage but forgot to change the fiscal year/period (it was still 009) and scheduled an init load.
    The load failed with the following error messages in the extraction part:
    1) Syntax error in RSCDELTA
    2) Error occurred in the data selection (RSM 340)
    The REQUEST part in the monitor screen is green.
    The TRANSFER part in the monitor screen is green.
    EXTRACTION shows:
    Data request received --- green
    Data selection scheduled --- green
    Error occurred in the data selection --- RED
    Processing (data packet): No data --- RED
    We deleted the red request in the ODS, deleted the init request under Scheduler -> Init. options for the source system, changed the fiscal year/period to 010 and triggered a new load, but it gives the same error message again.
    How can we rectify this and load to the target successfully?
    Please suggest

    Please check the job log in the source system. It will give you the exact reason why extraction is failing. You need to act accordingly.
    Thanks..
    Shambhu

  • Data package and data packet

    Hi,
    I want to know the difference between a data package and a data packet, and where each comes up in SAP BW.
    With regards
    Tushar

    Hello,
    The term "data package" is related to the DTP, which is used to load data from the PSA to further data targets.
    Start and end routines work at package level, so the routine runs for each package one by one. By default a package contains data sorted by key (the non-unique keys/characteristics of the source or target), and by setting semantic keys you can change this grouping. A package with more data will take more time to process than a package with less data.
    The term "data packet" is related to the InfoPackage, which is used to load data from the source system into BI (the PSA).
    As per the SAP standard, around 50,000 records per data packet is preferred.
    A commit and save is performed for every data packet, so fewer data packets are preferable.
    If you have 100,000 records per data packet and there is an error in the last record, the entire packet fails.
    Hope it helps!

  • Arrived in BW Processing: Data packet not yet processed

    Hello Gurus,
    Data is being loaded from an export DataSource (8RL*****) to 2 data targets.
    The overall QM status is red (it says processing is overdue).
    The details tab shows:
    Extraction: Error occurred
    Transfer (IDoc and TRFC): Error occurred
    Processing (data packet): Error occurred
       -- Transfer rules: Error occurred
         Transaction data (Bewegungsdaten) received. Processing being started
         Message missing (35676 records): Transfer rules finished
      Update rules (0 records): Missing messages
      Update (0 new / 0 changed): Missing message
      Processing end: Missing message
    I checked the LUWs in SM58 but didn't find anything.
    I checked the InfoSource (PSA) - one request is RED, but when I check the data in the PSA there are no errors.
    What should I do in this case?
    What might be the exact error?
    Kindly inform.
    Regards,
    NIKEKAB

    Hi,
    Check the DataSource in RSA3. If it works fine and you can see the data in RSA3, there is no problem at the DataSource level; then check the mappings and any routines in BW for that DataSource. If these are also fine, check the options below.
    See dumps in ST22 and SM21 as well.
    Check the RFC connection between ECC and BW, i.e. RSA1 --> Source Systems --> right-click on the source system and choose Check.
    The BWREMOTE or ALEREMOTE user must have the following profiles, so add them in BW; one of these two users is used in the background to extract the data from ECC:
    S_BI-WHM_RFC, S_BI-WHM_SPC, S_BI-WX_RFC
    Also check the following:
    1. Connections from BW to ECC and ECC to BW in SM59.
    2. Port, partner profiles and message types in WE20 in ECC & BW.
    3. Dumps in ST22 and SM21.
    4. If IDocs are stuck: in the RSMO details tab (in BW) you can see the OLTP IDoc numbers at the bottom; take those IDoc numbers, go to ECC and check their status in WE05 or WE02. If there is an error, check the log; otherwise go to BD87 in ECC, enter the IDoc numbers, execute them manually, then refresh the monitor in RSMO.
    5. Check for stuck LUWs in SM58 with User Name = * (star), execute, select your LUW, execute it manually and check RSMO in BW.
    See on SDN:
    Re: Loading error in the production system
    Thanks
    Reddy

  • Data packet not getting processed

    Hi SDN's
    I am loading data from one ODS to 4 regions. The source ODS is loaded successfully, but from there to the data targets the load is failing or taking a long time.
    Up to the transfer rules the data is fine; in the update rules the data packets are not getting processed.
    Kindly suggest a solution; points will be assigned.
    Thanks in advance.

    Hi Katam,
    In the target ODSs go to the monitor screen for the particular request -> in the menu bar go to Environment -> Transact. RFC -> In the Data Warehouse -> enter the ID and date -> execute.
    Check whether there are entries in that queue. Usually this queue gets stuck and you need to execute the LUWs in the queue manually.
    If it says "transaction recorded" you need to execute it manually.
    Please revert if there are any issues.
    Edited by: Pramod Manjunath on Dec 19, 2007 4:48 PM

  • WHAT IS DATA PACKET SIZING IN BW?

    Hi Gurus,
    WHAT IS DATA PACKET SIZING IN BW? Is it a modeling, extraction or reporting related topic?
    Regards, RAMU.

    Hi,
    To control the data packet size (the number of records in a data package) you can make settings in the InfoPackage.
    In the InfoPackage: Scheduler --> DataS. Default Data Transfer. This is a local setting that applies whenever you run the load through that InfoPackage; the changes you make here take priority over the default settings (SBIW) and apply only to loads from this InfoPackage. The default is 20000 kB; try half of it. Specify it for the correct update method, i.e. for a delta InfoPackage you should maintain the size against the delta upload. You can also define the number of data packets per Info-IDoc, e.g. 10.
    You can also try the RSCUSTV* transactions (where * is an integer) to change data load settings.
    To change the data package size for flat file extraction, use transaction RSCUSTV6.
    To change the data package size for uploads from an R/3 system, set the value in R/3 Customizing: transaction SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Try
    http://help.sap.com/saphelp_nw04/helpdata/en/51/85d6cf842825469a51b9a666442339/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    This is related to loading performance; take a look at the document 'BW load performance and analysis'.
    Hope this helps.
    Thanks,
    JituK
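    For reference, a small sketch that reads the control parameters maintained via SBIW in the source system; ROIDOCPRMS is the standard customizing table behind that step, but the field names used below are assumptions, so verify them in SE11 before relying on this:
    * Display the data transfer control parameters of the source system.
    * Field names (MAXSIZE, MAXLINES) are assumed - check the table in SE11.
    DATA: lt_prms TYPE STANDARD TABLE OF roidocprms,
          ls_prms TYPE roidocprms.
    SELECT * FROM roidocprms INTO TABLE lt_prms.
    LOOP AT lt_prms INTO ls_prms.
      WRITE: / ls_prms-maxsize,    "max. packet size in kB
               ls_prms-maxlines.   "max. lines per data packet
    ENDLOOP.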

  • IDoc looks fine, but "Processing (data packet)" : No Data

    Hi all,
    We are trying to load data from R/3 ECC 5.0 into BI 7.0. During the loading process the status stays yellow.
    When we check the details in the extraction monitor, the message under "Transfer (IDocs and TRFC)" is "Request IDoc and Info IDoc 1: Application document posted".
    But "Processing (data packet)" shows No Data. We checked the IDoc number on R/3 in transaction BD87: "IDoc entries in tRFC queues" shows a red light under the section "IDoc in outbound processing".
    Under "IDoc entries in tRFC queues" there is no relevant record for this IDoc. We also used transaction SM37 to check the job execution based on the request number of the data load; the job is shown as released.
    Can anyone give us an idea of what happened and how we can fix it?
    Thank you very much
    SF

    Geeta,
    Try this... I have heard there needs to be a setting made in transaction SMWQR:
    you have to register the CSA* queue. Try getting more inputs on this; this is what I know about it.
    Hope this helps.
    Regards,
    Nick.

  • Data packets not updated in PSA

    Dear Friends,
    Can anybody help me with this topic?
    A process chain is in error and the error message is "Data records were marked as incorrect in the PSA".
    How can I maintain the PSA to upload the relevant data packet,
    or is there any other way to edit the incorrect records or remove the error?
    Thanks in advance for your help
    With regards
    RYD

    Hi Riyad,
    Go to the PSA (Manage) --> select the erroneous records --> correct them --> save.
    Then go to the process chain --> right-click at the point where the status is red --> Repeat.
    This will load the data.
    Hope it helps!
    Regards,
    Pavan
