Manual processing of data packets

Hi,
I am trying to manually process data packets into the cube, but while the process is running I get an ABAP debugging screen. What does this mean?
Thanks & Regards
AP

Hi,
Please check whether there is any error message.
Check in SM37 whether there are any jobs for this load.
If there are none, set the request to red and try reloading.
Normally you load the data only up to the PSA; you do the manual update only when you have edited records there, and in that case you should also make sure that you delete the request from the InfoCube first.
Hope this helps.

Similar Messages

  • Processing data packet (no data)

    Hi all,
    My issue is: I am trying to load data using a custom DataSource and a custom function module. The job runs successfully, but no records are pulled over. However, I do get records using the extractor checker in the source system. This works fine in our production environment (R/3) but does not work in our test environment (mySAP).
    Has anyone encountered a similar problem, or can you suggest a possible solution?
    Thank you.
    Rydell

    Hi Hari,
    Thank you!
    There are no start routines, but there is a date selection in the InfoPackage. It uses an OLAP variable covering the last 7 days. Everything shows up green, except that the processing of the data packet shows a warning of no data.
    I am confused whether this is a problem on the BW side or on the source (SAP) side.
    Regards,
    Rydell
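
    A frequent cause of a green load with 0 records from a custom function module is that the InfoPackage selection (here the OLAP date variable) arrives in the I_T_SELECT table but is never applied, or that the extractor signals end-of-data too early. Below is a minimal sketch of the selection handling inside such an extractor, assuming it is modelled on the template RSAX_BIW_GET_DATA_SIMPLE and reads from a hypothetical table ZMYDATA with a date field BUDAT; I_T_SELECT, I_MAXSIZE and E_T_DATA are the template's interface parameters.

    * Sketch only: the real extractor fills E_T_DATA packet by packet via
    * a cursor and raises NO_MORE_DATA once the cursor is exhausted.
    DATA: ls_select LIKE LINE OF i_t_select,
          lr_budat  TYPE RANGE OF sy-datum,
          ls_budat  LIKE LINE OF lr_budat.

    * Convert the InfoPackage selection into a range table. If this loop
    * finds nothing, check that the field is flagged as selectable in the
    * DataSource and that the OLAP variable actually resolves to values.
    LOOP AT i_t_select INTO ls_select WHERE fieldnm = 'BUDAT'.
      MOVE-CORRESPONDING ls_select TO ls_budat.
      APPEND ls_budat TO lr_budat.
    ENDLOOP.

    * Apply the selection when reading the data.
    SELECT * FROM zmydata
             INTO TABLE e_t_data
             UP TO i_maxsize ROWS
             WHERE budat IN lr_budat.
    IF e_t_data[] IS INITIAL.
      RAISE no_more_data.
    ENDIF.

    Since the same load works in production but not in the test system, it is also worth checking whether the test system actually contains records in the date range that the last-7-days variable resolves to; RSA3 run without that restriction would still return data.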

  • Updating data packets serially

    Hi
    How can I update data packets serially, i.e. one packet after another?
    For example, I have a data flow that updates data into the PSA and then into the InfoCube, which is a serial update. But during processing the system takes 2 or 3 data packets together and processes them in parallel. What is the way to process the data packets one after another?
    I got an error when processing 8 data packets. The first 4 were updated successfully and the last 2 were updated successfully, but packets 5 and 6 were not updated; they failed with an unknown error, even though the data is clean, in good format and just like the other packets.
    Thanks in Advance
    Sreekanth J

    Hello Sreekanth J,
    By design, each data packet can be processed independently. If two data packets are failing, there must be a reason behind it worth investigating. Check for short dumps or system log entries.
    If you still want to process one data packet at a time, load the data to the PSA first and then from the PSA to the data targets; that processing happens in one background job, one packet at a time.
    Again, I am not sure the latter option will guarantee that the data packets go through. Also, try right-clicking the failed data packets and selecting manual update again.
    Cheers
    Aneesh
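
    If the goal is simply to limit how many data packets are sent and processed in parallel, the control parameters for data transfer (maintained in the source system via SBIW) are worth checking. A minimal sketch, assuming the standard control parameter table ROIDOCPRMS and a hypothetical BW logical system name BWCLNT100, that just displays the current settings:

    * Sketch only: read the data transfer control parameters that
    * influence packet size and the number of parallel processes.
    DATA ls_prms TYPE roidocprms.

    SELECT SINGLE * FROM roidocprms
           INTO ls_prms
           WHERE slogsys = 'BWCLNT100'.   "hypothetical BW logical system
    IF sy-subrc = 0.
      WRITE: / 'Max. packet size (kB):  ', ls_prms-maxsize,
             / 'Max. parallel processes:', ls_prms-maxprocs.
    ENDIF.

    Reducing the number of parallel processes there makes the packets arrive (and usually get processed) more sequentially, at the cost of longer load times.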

  • Arrived in BW Processing: Data packet not yet processed

    Hello Gurus,
    Data is being loaded from an export DataSource (8RL*****) to 2 data targets.
    The overall QM status is red (it says processing is overdue).
    The details tab shows:
    Extraction: Error occurred
    Transfer (IDocs and TRFC): Error occurred
    Processing (data packet): Error occurred
       -- Transfer rules: Error occurred
          Transaction data (Bewegungsdaten) received, processing being started
          Missing message (35676 records): transfer rules finished
       Update rules (0 records): missing messages
       Update (0 new / 0 changed): missing message
       Processing end: missing message
    I checked the LUWs in SM58 but didn't find anything.
    I checked the InfoSource (PSA): one request is in red, but when I check the data in the PSA there are no errors.
    What should I do in this case?
    What might the exact error be?
    Kindly inform.
    Regards,
    NIKEKAB

    Hi,
    Check the DataSource in RSA3. If it is working fine and you are able to see the data in RSA3, there is no problem at the DataSource level; then check the mappings and any routines in BW for that DataSource. If these are also fine, check the options below.
    See dumps in ST22 and SM21 as well.
    Check the RFC connection between the ECC and BW systems, i.e. RSA1 -> Source Systems -> right-click on the source system and choose Check (a quick programmatic check is sketched below).
    The BWREMOTE or ALEREMOTE user must have the following profiles, so add them. One of these two users is used in the background to extract the data from ECC, so add these profiles in BW.
    S_BI-WHM_RFC, S_BI-WHM_SPC, S_BI-WX_RFC
    And also check the following things.
    1. Connections from BW to ECC and from ECC to BW in SM59.
    2. Check the port, partner profiles and message types in WE20 in ECC & BW.
    3. Check dumps in ST22 and SM21.
    4. If IDocs are stuck: find the OLTP IDoc numbers at the bottom of the details tab of the RSMO screen (in BW), take those IDoc numbers, go to ECC and check their status in WE05 or WE02. If there is an error, check the log; otherwise go to BD87 in ECC, enter the IDoc numbers, execute them manually, and then refresh RSMO.
    5. Check for stuck LUWs in SM58 (User Name = *), run it, look at the stuck LUWs, select your LUW, execute it manually and then check RSMO in BW.
    See this SDN thread:
    Re: Loading error in the production system
    Thanks
    Reddy
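
    As a supplement to the RFC connection check above, here is a minimal sketch of a programmatic connection test; RFC_PING is the standard connection test function module, and ECCCLNT100 is a hypothetical destination name (use the one defined in SM59 for your source system).

    * Sketch only: ping the source system destination and report errors.
    DATA lv_msg TYPE c LENGTH 255.

    CALL FUNCTION 'RFC_PING'
      DESTINATION 'ECCCLNT100'
      EXCEPTIONS
        system_failure        = 1 MESSAGE lv_msg
        communication_failure = 2 MESSAGE lv_msg
        OTHERS                = 3.
    IF sy-subrc <> 0.
      WRITE: / 'RFC connection problem:', lv_msg.
    ELSE.
      WRITE: / 'RFC connection OK.'.
    ENDIF.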

  • Data packet not getting processed

    Hi SDNers,
    I am loading data from one ODS to 4 regions. The source ODS is loaded successfully, but from there to the data targets the load fails or takes a long time.
    Up to the transfer rules the data is fine; in the update rules the data packets are not getting processed.
    Kindly suggest a solution; points will be assigned.
    Thanks in advance

    Hi Katam,
    In the target ODSs, go to the monitor screen for the particular request -> in the menu bar choose Environment -> Transact. RFC -> In the Data Warehouse -> enter the ID and date -> execute.
    Check whether there are entries there. Usually this queue gets stuck and you need to execute the LUWs in the queue manually.
    If an entry says 'transaction recorded', you need to execute it manually.
    Please revert if there are any issues.
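
    To get a quick overview of how many tRFC LUWs are stuck for the BW destination, the tRFC status table can be queried directly. A minimal sketch, assuming the standard tRFC status table ARFCSSTATE and a hypothetical destination name BWCLNT100:

    * Sketch only: count outstanding tRFC LUWs per status (e.g. RECORDED,
    * SYSFAIL, CPICERR) for one destination.
    DATA: lv_state TYPE arfcsstate-arfcstate,
          lv_count TYPE i.

    SELECT arfcstate COUNT( * )
           FROM arfcsstate
           INTO (lv_state, lv_count)
           WHERE arfcdest = 'BWCLNT100'   "hypothetical BW destination
           GROUP BY arfcstate.
      WRITE: / lv_state, lv_count.
    ENDSELECT.

    Entries in status RECORDED are the ones that still have to be executed, either from SM58 or via the monitor path described above.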

  • Upload data packet manually.

    Hi all,
    I had just concluded an init upload from PSA to cube.
    Simultaneously, at times, I was trying to upload some of the packets manually to speed up the process.
    Fine, all was going right.
    The following day I noticed my request in red and 2 of the data packets giving the same error:
    Data packet 000007 from request REQU_40HAPL9LAUVI6N0IP9P44DK0W is blocked
    Data packet 000008 from request REQU_40HAPL9LAUVI6N0IP9P44DK0W is blocked
    I was unable to upload these two data packets manually.
    I had to run the process again.
    Can anyone help me with an explanation?
    What does it mean for a data packet to be blocked?
    There wasn't any short dump either.
    Thanks in advance
    Shalabh

    Hi Roberto ,
    Message Class   RSM1
    Message No.  405
    Shalabh

  • Foreign trade data defaulting to manually processed invoices.

    Hello Experts,
    How can foreign trade data be defaulted on manually processed invoices?
    Presently it is being entered manually, and we need to default this data automatically on the invoice.
    Regards,
    Preeti

    We have maintained all the necessary settings for the foreign trade data.
    However, while creating the billing document for export, the foreign trade data (the source and destination mode of transport) is not defaulted automatically. Can this be auto-populated?

  • Data packet not yet processed in ODS load??

    Hi all,
    I got an error when I loaded data from the InfoSource to the ODS. Can someone let me know why, and how to resolve it? Thank you in advance.
    Here is the error message in the monitor:
    Warning: data packets 1 & 2 arrived in BW; processing: data packet not yet processed.
    (No data packet numbers could be determined for request REQU_77H7ERP54VXW5PZZP5J6DYKP7)
    Processing end:
    Transfer rules (0 records): missing messages
    Update PSA (0 records): missing messages
    Update rules (0 records): missing messages

    John,
    I don't think it's a space problem. In ST22, open the dump and read the detailed sections ('What happened' and 'How to correct the error'); they will help you solve the problem.
    Check note 613440 as well.
    Note 647125:
    Symptom
    A DYNPRO_FIELD_CONVERSION dump occurs on screen 450 of the RSM1 function group (saplrsm1).
    Other terms
    DYNPRO_FIELD_CONVERSION, 450, SAPLRSM1
    Reason and Prerequisites
    This is caused by a program error.
    The screen contains unused, hidden fields/screen elements that are too small for the intensified screen check introduced with the current Basis patch (kernel patch 880). These fields originate from the 4.0B period of BW 1.0 and are never used.
    Solution
    Depending on your BW system release, you must solve the problem as follows:
    BW 3.0B
               Import Support Package 14 for 3.0B (BW 3.0B patch 14 or SAPKW30B14) into your BW system. This Support Package will be available when note 571695 with the short text "SAPBWNews BW 3.0B Support Package 14", which describes this Support Package in more detail, is released for customers.
    BW 3.1 Content
               Import Support Package 8 for 3.1 Content (BW 3.10 patch 08 or SAPKW31008) into your BW system. This Support Package will be available when note 571743 with the short text "SAPBWNews BW 3.1 Content Support Package 08" is released for customers.
    The dump occurs with the invisible G_NEW_DATUM date field on the bottom right of the screen, which is only 1 byte long and can be deleted.
    You can delete the following unused fields/screen elements:
    %A_G_NEW_NOW    Selection field group
    G_NEW_ZEIT      Input/output field
    G_NEW_UNAME     Input/output field
    G_NEW_DATUM     Input/output field
    %#AUTOTEXT021   Text field
    G_NEW_NOW       Selection button
    G_NEW_BATCH     Selection button
    You can delete these fields/screen elements because they are not used anywhere.
    This deletion does not cause any problems.
    After you delete the fields/screen elements, you must also delete the following rows in the flow logic in screen 450:
    FIELD G_NEW_DATUM           MODULE DOKU_NEW_DATUM.
    FIELD G_NEW_ZEIT            MODULE DOKU_NEW_ZEIT.
    The function group is then syntactically correct.
    Unfortunately, we cannot provide an advance correction.
    The aforementioned notes may already be available to provide information in advance of the Support Package release. However, in that case the short text will still contain the words "preliminary version".
    For more information on BW Support Packages, see note 110934.
    Thanks
    Ram

  • IDoc looks fine, but "Processing (data packet)" : No Data

    Hi all,
    We are trying to load data from R/3 ECC 5.0 into BI 7.0. During the load the request status stays yellow.
    When we check the details in the extraction monitor, the message under "Transfer (IDocs and TRFC)" is "Request IDoc and Info IDoc 1: Application document posted", but "Processing (data packet)" shows "No Data".
    We checked the IDoc number on R/3 in transaction BD87: "IDoc entries in tRFC queues" shows a red light under the section "IDoc in outbound processing", yet there is no relevant record for this IDoc in the tRFC queues. We also used SM37 to check the job for the request number of the data load; the job is shown as released.
    Can anyone give us an idea of what happened and how we can fix it?
    Thank you very much
    SF

    Geeta,
    Try this... I have heard there needs to be a setting made in transaction SMQR: you have to register the CSA* queue. Try to get more input on this; this is what I know about it.
    Hope this helps.
    Regards.
    Nick.

  • Error when trying to view reports or manually process the TFS data warehouse and Analysis Services cube

    Hello Guys,
    I am trying to configure reporting for TFS using SQL Server, but I get the following error when viewing any report:
    So I tried to process the cube manually to check whether it works. I am following this article: https://msdn.microsoft.com/en-us/library/ff400237.aspx
    When I click on GetProcessingStatus and invoke it (with the last field set to TRUE), I get the following error:
    Please advise how to resolve this issue so I can see the reports.

    I have managed to resolve this issue. Note that for all purposes of this question and answer, mydomain\tfsadmin is a generic user (used to install all software). This is a proof-of-concept account, for test purposes.
    The thing is that during installation of SQL Analysis Services I had given the username mydomain\tfsadmin (a generic user for testing) as the Analysis Services administrator, instead of the 'Domain Admins' group.
    Still, I have managed to resolve this issue.
    Steps:
    1. Make sure that the user (mydomain\tfsadmin) is a member of Analysis Server -> TFS_Analysis db -> Roles -> TFSWarehouseAdministrator and TFSWarehouseDataReader. (This actually happens automatically when you run the TFS Admin Console, configure reporting and provide the username that will access the Analysis db, etc. In my case the user is mydomain\tfsadmin.)
    2. It is bad practice to process the cube manually (you can do this to make sure there are no errors, but only after completing the following steps up to step 5).
    3. Also make sure that NT AUTHORITY\NETWORK SERVICE is a member of the Analysis db -> Roles -> TFSWarehouseAdministrator; this resolves the error shown in the 2nd screenshot in the question.
    4. Then you can right-click on the Analysis db and run Process. If this throws the error shown in my reply above, you need to follow https://msdn.microsoft.com/en-us/library/vstudio/ff400237.aspx
    a. Open http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx
    b. Choose ProcessWarehouse and run it.
    c. Choose ProcessAnalysisDatabase, type 'Full' and run it.
    d. Choose GetProcessingStatus, enter 'TRUE' in the last field and run it.
    e. I don't get any errors at this point.
    5. Now you can connect to the Analysis server via SQL Server Management Studio, right-click on the Analysis db (TFS_Analysis) and click Process. It all works fine.
    6. Now you can browse to the report URL (to get this URL you can open Team Explorer 2013, connect to your team project, click 'Reports' on the right-hand side, then click 'Go to Site').
    DONE.

  • How to set a data packet to red status.

    I have a big load which ended mostly in green data packets, except for two that were red because of server/resource issues. I tried to update them manually and they turned yellow, but all processes have ended and they are still yellow. I want to try a manual update again, but I can't, because they have to be in red status. How can I change the status of a single packet? This was a pretty hard load, and I can't afford to lose what has so far made it OK into the InfoCube.

    Hi,
      You need to force the request to red; then your data packets will turn red as well. Update the two packets manually, and then force the request back to green.
    Regards,
    Malar

  • Problem while processing transaction data from DSO to cube

    Hi Guru's,
    we are facing a problem while processing transaction data from a DSO to a cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (search for the BI request name); it should give you the details about the request. If it is active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not, and whether it is accessing/updating some tables or not doing anything at all.
    If it is running and you can see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, e.g. reading from or inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased (see the sketch below).
    Thanks,
    JituK
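
    As a rough way of verifying that records are actually arriving, the F fact table of the cube can be counted while the load runs. A minimal sketch, assuming a custom InfoCube ZSALES whose F fact table is /BIC/FZSALES (both names are hypothetical):

    * Sketch only: count the rows in the cube's F fact table to see
    * whether the load is still adding records.
    DATA: lv_tab   TYPE tabname VALUE '/BIC/FZSALES',
          lv_count TYPE i.

    SELECT COUNT( * ) FROM (lv_tab) INTO lv_count.
    WRITE: / 'Rows in', lv_tab, ':', lv_count.

    Running this a couple of minutes apart shows whether the request is still writing data or has stalled.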

  • How to set the data packet in order??

    We have 20 data packets that should be processed in sequential order. How can we make them process in order?
    Example: the data packets currently arrive in this order: 20, 1, 10, 5, 9, 13, 4, 8, 13, 2, 15, 19, 14, and so on.
    I have to get them processed in order. How do I do it?

    Hi,
    I think you should just use a process chain.
    Call transaction RSPC and build the chain in the order needed.
    Regards
    Tom
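
    Once the chain is built in RSPC, it can also be triggered from ABAP. A minimal sketch, assuming the public API function module RSPC_API_CHAIN_START and a hypothetical chain name ZPC_LOAD_CUBE:

    * Sketch only: start a process chain that was modelled in RSPC.
    DATA lv_logid TYPE rspc_logid.

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZPC_LOAD_CUBE'   "hypothetical chain name
      IMPORTING
        e_logid = lv_logid
      EXCEPTIONS
        OTHERS  = 1.
    IF sy-subrc = 0.
      WRITE: / 'Chain started, log ID:', lv_logid.
    ENDIF.

    Within the chain, the load steps then run strictly in the modelled order.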

  • How to process large data files in XI? 100 MB files?

    Hi All,
    At present we have a File-to-IDoc scenario. The problem is the size of the file: we need to transfer a 100 MB file to the SAP R/3 system. How can such a huge amount of data be processed?
    Thanks in advance and regards
    Rakesh

    Hi,
    In general, an extra sizing for XI memory consumption is not required. The total memory of the SAP Web Application Server should be sufficient except in the case of large messages (>1MB).
    To determine the memory consumption for processing large messages, you can use the following rules of thumb:
    Allocate 3 MB per process (for example, the number of parallel messages per second may be an indicator).
    Allocate 4 kB per 1 kB of message size in the asynchronous case, or 9 kB per 1 kB of message size in the synchronous case.
    Example: asynchronous concurrent processing of 10 messages with a size of 1 MB each requires 70 MB of memory: (3 MB + 4 * 1 MB) * 10 = 70 MB.
    With mapping or content-based routing, where an internal representation of the message payload may be necessary, the memory requirements can be much higher (possibly exceeding 20 kB per 1 kB of message, depending on the type of mapping).
    The size of the largest message thus depends mainly on the size of the available main memory. On a normal 32-bit operating system there is an upper boundary of approximately 1.5 to 2 GB per process, which limits the largest possible message size.
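
    To make the rule of thumb concrete, here is a minimal sketch that computes the estimate quoted above for a given number and size of messages (factor 4 for asynchronous, 9 for synchronous processing):

    * Sketch only: memory_MB = ( 3 + factor * size_MB ) * messages.
    DATA: lv_messages TYPE i VALUE 10,
          lv_size_mb  TYPE i VALUE 1,
          lv_factor   TYPE i VALUE 4,   "asynchronous case
          lv_mem_mb   TYPE i.

    lv_mem_mb = ( 3 + lv_factor * lv_size_mb ) * lv_messages.
    WRITE: / 'Estimated memory (MB):', lv_mem_mb.   "-> 70 for this example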
    Please check these links:
    /people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts [original link is broken]
    Input Flat File Size Determination
    /people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp
    data packet size  - load from flat file
    How to upload a file of very huge size on to server.
    Please let me know whether your problem is solved.
    Regards
    Chilla..

  • Manual Process VS System Process or Interaction

    Hi everyone,
    I'm a business consultant and I'm still taking my first steps in Oracle BPA.
    My question is about the representation of activities in a business process. In Visio you can use different symbols to represent manual processes and processes that use systems.
    In Oracle BPA you have Human Tasks and Automated Activities. So, if I want to represent a manual process, should I choose the Human Task symbol, or only if there is a workflow pattern associated with that activity? You could, for example, just be counting cash in a vault; would that be a Human Task in Oracle BPA?
    Thank you in advance for your help.
    Pedro

    Awesome stuff, Sudhir. It is really great logic to use. Thanks for your help.
    This is for future generations...
    One note: the field symbol should be declared as TYPE ANY instead of as a standard table, because T180 is used as a global work area (structure) in program SAPMV45A rather than as an internal table.
    Here is the modified code.
    FIELD-SYMBOLS: <fs_t180> TYPE any.
    * Read the global work area T180 of program SAPMV45A to find out
    * which sales document transaction is currently running.
    DATA: wa_t180    TYPE t180,
          c_t180(15) TYPE c VALUE '(SAPMV45A)T180'.

    ASSIGN (c_t180) TO <fs_t180>.
    IF <fs_t180> IS ASSIGNED.        "assignment fails outside SAPMV45A
      wa_t180 = <fs_t180>.
      IF wa_t180-tcode = 'VA01' OR wa_t180-tcode = 'VA02'.
        "... logic specific to sales order create/change ...
      ENDIF.
    ENDIF.
    Thanks,
    Vigneswaran S
