IDoc not getting processed in Real Time job

Hello,
The configuration was done as per the link below to receive IDocs from SAP in BODS.
How to Receive IDOC in BODS from SAP (Outbound to SAP)
An IDoc was sent, and Real Time Services in the Management Console shows that the request was received successfully, but I cannot see the output XML file being created. When I checked the real-time services and client interfaces, all of them were green.
However, the real-time job status shows it is in a running state, and it has been running for 2 days. The trace shows the following, with no error message:
(14.2) 04-24-14 18:47:41 (18417:1140171104)      JOB: Optimizing job <JOB_BODS_RT_IDocMatmas01>.
(14.2) 04-24-14 18:47:47 (18417:1140171104)      JOB: Job <JOB_BODS_RT_IDocMatmas01> is started.
(14.2) 04-24-14 18:47:47 (18417:1140171104) DATAFLOW: Data flow <DF_RT_IDocMatmas01> is initialized.
(14.2) 04-24-14 18:47:47 (18417:1140171104) DATAFLOW: Data flow <DF_RT_IDocMatmas01> using IN MEMORY Cache.
(14.2) 04-24-14 18:47:48 (18417:1140171104) DATAFLOW: Data flow <ActaDefaultNotificationSender> is initialized.
(14.2) 04-24-14 18:47:49 (18417:1140171104) DATAFLOW: Data flow <ActaDefaultNotificationSender> using IN MEMORY Cache.
Can anyone please let me know if there is anything missing in the configuration? How do I resolve this issue?
DS version: 4.2

Hi,
Try reimporting the IDoc metadata into PI using transaction IDX2. Also check the port definition in ECC and whether the segment is released. Please go through the discussions below; they may help you.
EDISDEF: Port XXX segment defn E2EDK35 in IDoc type ORDERS05 CIM type
IDoc 0000000000181226 was saved but cannot or should not be sent
Regards,
Priyanka

Similar Messages

  • RSEOUT00 - Idoc not getting processed

    Hi All,
    I have a few IDocs in the production system with status 30, and they are not getting processed using RSEOUT00.
    I tried using a background job / foreground processing / BD87 as well, but they are not getting converted to status 03.
    The receiver port is xmlgen.
    Is there anything like an IDoc lock, or something else, that is preventing them from being processed?
    Please let me know if you have any ideas.

    Check out the multiple threads on SDN regarding this issue:
    idoc remains random in status 30
    Some of the common pointers are a locked IDoc (check in SM12) and SM58 to see whether anything is stuck in the RFC queue (though I doubt that would be the case).
    How are you generating your IDoc: a standard FM or MASTER_IDOC_DISTRIBUTE? A minimal sketch of the latter is shown below.
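    For reference, a minimal ABAP sketch of generating an outbound IDoc via MASTER_IDOC_DISTRIBUTE. The message type, IDoc type and receiver partner shown are placeholders from a MATMAS-style scenario, and the variable names are illustrative only; treat this as a sketch, not a complete program.

      DATA: ls_edidc TYPE edidc,            " control record (illustrative name)
            lt_edidd TYPE TABLE OF edidd,   " data records (segments)
            lt_comm  TYPE TABLE OF edidc.   " communication IDocs created

      ls_edidc-mestyp = 'MATMAS'.           " message type (example)
      ls_edidc-idoctp = 'MATMAS03'.         " basic IDoc type (example)
      ls_edidc-rcvprt = 'LS'.               " receiver partner type
      ls_edidc-rcvprn = 'RECEIVER_LS'.      " receiver partner (placeholder)

      " ... APPEND the data segments to lt_edidd here ...

      CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
        EXPORTING
          master_idoc_control            = ls_edidc
        TABLES
          communication_idoc_control     = lt_comm
          master_idoc_data               = lt_edidd
        EXCEPTIONS
          error_in_idoc_control          = 1
          error_writing_idoc_status      = 2
          error_in_idoc_data             = 3
          sending_logical_system_unknown = 4
          OTHERS                         = 5.

      COMMIT WORK.   " the IDoc is only written and dispatched after the commit

    A missing COMMIT WORK after the call is one reason IDocs generated from custom code can appear to hang before dispatch.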

  • Article IDoc not getting processed in PI for a particular article type

    Hello All,
    I have some strange behaviour in my system:
    1. We create an article IDoc using BD10 for a particular article type.
    2. The IDoc is created successfully.
    3. We process the IDoc in BD87 to push it to the PI system.
    4. But I cannot see the message in PI, and only for this one article type.
    I'm wondering why this is not getting processed for one article type only and not for the others.
    We also checked the basic filter for the message type in the distribution model, and I could not see any condition in PI either.
    Is there anything else I should check? Can you please help us with this?
    Regards, Sethu

    Hi,
    Try reimporting the IDoc metadata into PI using transaction IDX2. Also check the port definition in ECC and whether the segment is released. Please go through the discussions below; they may help you.
    EDISDEF: Port XXX segment defn E2EDK35 in IDoc type ORDERS05 CIM type
    IDoc 0000000000181226 was saved but cannot or should not be sent
    Regards,
    Priyanka

  • ALE - Idoc not getting processed automatically, instead going fine in Debug

    Hello All,
    I have a problem with inbound IDoc processing.
    The IDoc settings were set to 'Immediate processing' in WE20, the process code is set correctly, and the Z-function module that processes the IDocs is attached to the process code.
    The error on the inbound side is 'Function module not allowed: APPL_IDOC_INPUTI' with status 51. I am sure I have to do something with the ALE settings.
    Can somebody help me with the correction steps, or the sequence of steps, for this ALE process?
    Thanks & Regards, Jilan

    Hi Jilan,
    Please follow these steps:
    ALE Configuration:
    •     WE30 - IDOC type creation
    •     WE31 - Create segment
    •     WE81 - Message type creation
    •     WE82 - Link IDOC type to Message type
    •     SE37   - Create Inbound Function Module
    •     BD51  - Maintain entry for Function Module
    •     WE57 - Assign Function Module to Message Type and IDoc Type
    •     BD57  - Link Function Module, IDOC type and Message type
    •     WE42 – Create Inbound Process Code
    •     BD67  - Link Process code to Function Module
    •     WE20 - Create Partner Profile
    •     BD64  -  Display Distribution Model
    •     WE02 -  IDoc List, Display all Inbound/Outbound IDocs
    •     WE14 -  Outbound Processing of IDocs
    •     BD20 -  Inbound Processing of IDocs
    1st Step:  Create a Segment (WE31)
    A segment is a structure for passing data in an IDoc. It actually contains the IDoc data, just like a DDIC table/structure. Create the segment with all the required fields, then save it. To actually use the segment, you have to release it via the menu Edit --> Set Release; otherwise you cannot use it.
    If you later want to change the segment, you cannot do so until you cancel the release via Edit --> Cancel Release.
    2nd Step:  Create an IDoc Type (WE30)
    After creating the segment, we have to create the IDoc type. An IDoc type is like the envelope of a letter: it contains the data inside it, plus some more information such as the address. An IDoc type can be basic or extended.
    Basic IDoc type: an existing SAP IDoc type (e.g. MATMAS) or a custom IDoc type.
    Extension IDoc type: when we need extra fields on top of an existing IDoc type, we can extend that basic IDoc type with another segment. This is called an extended IDoc type.
    3rd Step:  Create a Message Type (WE81)
    The message type is like the postman who delivers the letter.
    4th Step:  Attach the Message Type to the IDoc Type (WE82)
    5th Step:  Create a Function Module (SE37)
    Write the processing logic in a function module with the standard ALE inbound interface.
    6th Step:  Register the function module and its input type, i.e. 0/1/2 (BD51)
    7th Step:  Assign the Message Type, IDoc Type and Function Module (WE57)
    8th Step:  Create a Custom Process Code (WE42)
    9th Step:  Attach the Function Module to the Process Code (BD67)
    10th Step: Test inbound IDocs with the test tool (WE19)
    Example function module: IDOC_INPUT_ORDERS (standard FM to create sales orders). A skeleton of the custom inbound interface is sketched below.
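    For orientation, a skeleton of the standard ALE inbound interface that the custom function module from step 5 typically implements (compare IDOC_INPUT_ORDERS). The name Z_IDOC_INPUT_EXAMPLE, the segment name Z1EXAMPLE and the processing logic are placeholders only; this is a sketch, not a complete implementation.

      FUNCTION z_idoc_input_example.
      *"  IMPORTING
      *"     VALUE(INPUT_METHOD) LIKE  BDWFAP_PAR-INPUTMETHD
      *"     VALUE(MASS_PROCESSING) LIKE  BDWFAP_PAR-MASS_PROC
      *"  EXPORTING
      *"     VALUE(WORKFLOW_RESULT) LIKE  BDWF_PARAM-RESULT
      *"     VALUE(APPLICATION_VARIABLE) LIKE  BDWF_PARAM-APPL_VAR
      *"     VALUE(IN_UPDATE_TASK) LIKE  BDWFAP_PAR-UPDATETASK
      *"     VALUE(CALL_TRANSACTION_DONE) LIKE  BDWFAP_PAR-CALLTRANS
      *"  TABLES
      *"      IDOC_CONTRL STRUCTURE  EDIDC
      *"      IDOC_DATA STRUCTURE  EDIDD
      *"      IDOC_STATUS STRUCTURE  BDIDOCSTAT
      *"      RETURN_VARIABLES STRUCTURE  BDWFRETVAR
      *"      SERIALIZATION_INFO STRUCTURE  BDI_SER

        DATA: ls_contrl TYPE edidc,
              ls_data   TYPE edidd,
              ls_status TYPE bdidocstat.

        LOOP AT idoc_contrl INTO ls_contrl.

          " Read the data segments of this IDoc and post the application document
          LOOP AT idoc_data INTO ls_data WHERE docnum = ls_contrl-docnum.
            CASE ls_data-segnam.
              WHEN 'Z1EXAMPLE'.                " placeholder segment name
                " move ls_data-sdata into the segment structure and process it
            ENDCASE.
          ENDLOOP.

          " Report the result back to ALE: 53 = posted successfully, 51 = error
          CLEAR ls_status.
          ls_status-docnum = ls_contrl-docnum.
          ls_status-status = '53'.
          ls_status-msgty  = 'S'.
          APPEND ls_status TO idoc_status.

        ENDLOOP.
      ENDFUNCTION.

    The module still has to be registered in BD51 and linked via WE57 and BD67, as in the steps above, before ALE will call it for the process code.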
    Reward if useful.
    regards,
    Tanmay

  • IDOC status 64 not getting processed.

    Hi Gurus,
    I have a problem where IDocs with status 64 are not getting processed by the background job. The IDocs were getting processed by the background job until this evening; suddenly, although no changes were made, the processing of IDocs with status 64 has stopped and they keep piling up. When I checked the log of the background job it says finished, and the log has the following information:
    17.09.2009 19:05:23 Job started                                                                   00           516          S
    17.09.2009 19:05:23 Step 001 started (program RBDAPP01, variant POS_IDOCS, user ID SAPAUDIT)      00           550          S
    17.09.2009 19:05:25 No data could be selected.                                                    B1           083          I
    17.09.2009 19:05:25 Job finished                                                                  00           517          S
    Kindly advise on this.
    Regards,
    Riaz.

    When I process the IDocs using BD87, they get processed without any issues, no dump is displayed, and each IDoc gets status 53 after processing via BD87.
    There is another problem, however: other jobs scheduled to run in the background, apart from this one, are getting cancelled, with two different error logs for different jobs, as follows.
    1) 18.09.2009 02:00:06 Job started                                             00           516          S
    18.09.2009 02:00:06 Logon not possible (error in license check)             00           179          E
    18.09.2009 02:00:06 Job cancelled after system exception ERROR_MESSAGE      00           564          A
    2) 18.09.2009 00:55:27 Job started                                                          00           516          S
    18.09.2009 00:55:27 Step 001 started (program RSCOLL00, variant , user ID JGERARDO)      00           550          S
    18.09.2009 00:55:29 Clean_Plan:Gestartet von RSDBPREV                                   DB6PM         000          S
    18.09.2009 00:55:29 Clean_Plan:beendet                                                  DB6PM         000          S
    18.09.2009 01:00:52 ABAP/4 processor: DBIF_RSQL_SQL_ERROR                                00           671          A
    18.09.2009 01:00:52 Job cancelled                                                        00           518          A
    This is a production server; the licence is valid until 31.12.9999, so there should be no issue with the license.
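    Since the job log above shows program RBDAPP01 with variant POS_IDOCS ending in "No data could be selected", it is worth re-running that step in dialog with the same variant and checking whether its selection (message type, partner, creation date/time) still covers the IDocs sitting in status 64. A minimal sketch, assuming you are allowed to run the report directly; the wrapper report name is illustrative:

      REPORT z_check_rbdapp01_selection.

      " Re-run the background job's step in dialog with the same variant so that
      " the selection screen can be compared against the IDocs stuck in status 64.
      SUBMIT rbdapp01
        USING SELECTION-SET 'POS_IDOCS'    " variant name taken from the job log
        VIA SELECTION-SCREEN
        AND RETURN.

    If the variant restricts the creation date or time to a range that no longer matches the newly created IDocs, that would produce exactly the "No data could be selected" message seen in the log.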

  • Unable to Run Real Time Job Idoc Processing.

    Hi experts,
    I am trying to run a job in which I process an SAP IDoc.
    My source is a MATMAS03-type IDoc message.
    I am running it through the Management Console.
    The RFC is working fine, but when the IDoc is getting processed, at the end I am getting the error below.
    (12.2) 12-28-11 18:11:08 (E) (6132:5932) Unknown: R3RfcClient(DI_CLIENT) ActaIDocFunc::Process() encountered processing error for Requeste(1) (Processing timeout.) (IDOC 0000000000650143+/T90CLNT090). (BODI-300129)
    Here DI_CLIENT is my RFC destination, which I have maintained using the link below:
    http://wiki.sdn.sap.com/wiki/display/BOBJ/ConfigureSAPtosendIDOCstoDI
    Please help, and share if you have any other link or sample document for transferring data from SAP ECC using IDocs.
    thanks

    Hello Sasha,
    I am also getting the same problem, but apart from error 300129 I am also getting 300137.
    Is it the case that I have to use "R3RfcClient" only as the Program ID?
    Do you have any clue how to solve the other error as well?
    Thanks in advance.

  • IDOCS Error - Not getting processed automatically

    Hi All,
    We are loading a hierarchy for a product from the R/3 system using the standard DataSource.
    When we execute the InfoPackage, the IDocs are not getting processed automatically.
    We are facing the error message below.
    Error when updating Idocs in Business Information Warehouse
    Diagnosis
    Errors have been reported in Business Information Warehouse during IDoc update:
    No status record was passed to ALE by the applicat
    System Response
    Some IDocs have error status.
    Procedure
    Check the IDocs in Business Information Warehouse . You do this using the extraction monitor.
    Error handling:
    How you resolve the errors depends on the error message you get.
    But when we go to BD87 and process the IDocs manually, they are getting posted and the hierarchy is loaded.
    Can someone please guide me on what the issue with the IDocs is and how to make them post automatically?
    Thanks in Advance
    Regards,
    Sachin

    Hi,
    This happens due to non-updated IDocs in the source system, i.e. it occurs whenever LUWs are not transferred from the source system to the destination system. If you look at the status tab in RSMO, the error message will appear as "tRFC Error in Source System", "tRFC Error in Data Warehouse" or simply "tRFC Error", depending on the system from which data is being extracted. Sometimes IDocs are also stuck on the R/3 side because there were no processors available to process them. The solution is to execute the LUWs manually. Go to the menu Environment -> Transact. RFC -> In the Source System from RSMO, which will ask you to log in to the source system. The "Status Text" for stuck LUWs may be "Transaction recorded" or "Transaction waiting". Once you encounter this type of status, execute the LUWs manually using "F6" or Edit -> Execute LUWs (F6). Just keep refreshing until you get the status "Transaction is executing" in the production system. You can also see the stuck IDocs in transaction BD87. Now the data will be pulled into BW.
    Hope it helps a lot.
    Thanks and Regards,
    Kamesham

  • Real time job doesn't automatically receive changes in SAP ECC 6

    Hi Experts,
    I'm trying to (using Data Integrator) automatically retrieve data from SAP ECC and store it in a local database table. I followed the steps written in this article: http://wiki.sdn.sap.com/wiki/display/BOBJ/Receiving+IDOCs. I'm modifying some records in the Cost Center Master Data (CSKS) and using COSMAS01 as the IDoc in which I'm trying to send the information.

    All seems to be OK, since when I manually sent a Cost Center data row using the /nbd16 transaction, SAP ECC displayed a message saying that an IDoc had been generated. Also, when I checked the IDoc status in BD87 it had status 03, meaning that it had been sent correctly.

    In Data Services, when I clicked on 'View Data' in the local database table that I put in the dataflow, I could see the rows that I manually sent from the ECC. However, after truncating the local table, it bothered me that when I tried to manually send a second IDoc with another row from the Cost Center Master Data table, my real-time job didn't receive the request and consequently didn't insert the desired rows. I didn't change anything in the configuration, so my first question to any of you is: do you know what could cause my real-time job to not receive the request from ECC? I tried to make a third attempt, but at my company they had to shut down the ECC server, so it's going to be a while before I can try again.

    Now, the REAL question of this post is that after I sent the first IDoc successfully (and before manually sending the second one mentioned earlier), I changed a record in the CSKS table directly in the ECC, hoping that it would automatically generate a COSMAS01 IDoc and send all the data in the table to the Data Integrator. This didn't happen and no IDoc was generated, so do any of you know why the automatic change didn't trigger the sending of the IDoc with the data?

    As I said before, I made the configuration in ECC and DI following the steps in the link written at the start of the post, and it was tested OK by successfully sending a COSMAS01 IDoc once, manually using bd16.

    In advance I thank you all for your cooperation. This is my first thread in the SDN forum, so please also excuse any mistakes in my English.

    Best regards...

    You do not need to schedule this job every 10 minutes. Why?
    If we can't schedule this job every 10 minutes, how am I able to retrieve the delta records into the queue (RSA7)?
    What is your advice, i.e. how should it be scheduled in order to get the delta?
    Thanks

  • Webservices for real time job.

    Hi all,
    I have configured a real-time job that is available as a web service. There was no problem the first time. However, I wanted to change some of the input and output fields of the real-time job.
    So I modified the real-time job in the Designer.
    I removed and recreated the real-time service in the Administrator. I have also renamed the real-time service.
    I recreated the web service for the renamed real-time service.
    However, I am still getting the "old" WSDL with the old real-time service name.
    It seems like the WSDL is not updated.
    Am I missing any steps to "refresh" the WSDL?
    Thank you.

    Which version of DI?
    The WSDL should get updated once you add the real-time service to the web service.
    Do you see the real-time service on the Web Services status page?
    Click on View WSDL; it will open the WSDL in IE. Do a find for the service name in it.

  • Real Time Job with no Message Target?

    Hi All. I'm curious to know if anyone has built a real-time job without an XML message target. I thought that a message source 'getting' from a topic does not require a response, but Data Services is not letting me get away with not having a message target.
    The message I'm receiving is one that I want to insert into a table; it is not being used in a lookup to then send a new record out to a message target.
    Real-time jobs are still a new concept to me. Thanks in advance for your help.

    Hi,
    Try using a Row_Generation transform: generate one row and pass a hard-coded value as the output message. You cannot do away with an output message in the case of a real-time job.
    Regards,
    Suneer

  • File lock in real time job

    Hi All,
    I encountered a file lock error when creating a real-time job.
    After a dataflow, I have a script to move the processed file to an archive folder (e.g. move c:\source\order.xml c:\archive). When I test-run it, I receive a 50306 error. It says "The process cannot access the file because it is being used by another process. 0 file(s) moved.". However, the dataflow and script run fine in a batch job, and I can move the files manually after the job fails. Can anyone help me with that? Is it something to do with the settings of the real-time services?
    Many thanks!
    Knight

    Hi,
    Not sure, but you can check in SM12 whether there is any lock entry; if so, delete it manually and then check again.
    Ray

  • Error while running Real Time Job through SOAP UI

    Hi Experts,
    I am using a real-time job to search for duplicates in the database using attribute values. After running the job I get the correct result in the XML message.
    I configured the real-time job in the Management Console, took the WSDL, and provided the input in SoapUI; then I get the following message after 2 timeouts.
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <soapenv:Body>
    <soapenv:Fault>
    <faultcode>soapenv:Sender</faultcode>
    <faultstring>Web Services is unable to process the request to call real-time service 'RT_SEARCH' using Access Server 'uplt-abk-052:4000'. Error: Server sent back error: Communication Error. See real time job log for details.</faultstring>
    <faultactor/>
    <detail/>
    </soapenv:Fault>
    </soapenv:Body>
    </soapenv:Envelope>
    In the real-time job log file I am getting a message like this:
    (12.2) 06-01-10 09:37:41 (E) (0620:2732) Unknown: SP(RT_SEARCH, UPLT-ABK-052:3501)::flowThread() Flow became invalid while waiting for reply from real time job. (BODI-300136)

    In the Management Console, please take a look at Web Services -> the particular job -> history log -> error log and trace log to see more details of the error. That should definitely show what the cause of the error is.
    -Subhadra

  • Is there a way to create dependency on the real-time jobs

    Hi,
    We have around 80 real-time services running and loading the changed data into the target.
    The process being used is:
    IBM Informix > IBM CDC > JMS (XML messages) > DS real-time services > Oracle EDW.
    With the above process, whenever there is a change in both the fact table and the dimension table, both real-time services load the data into the target at the same time. This timing causes issues when looking up data.
    Is there a way we can create a dependency to resolve the timing issue and make sure the lookup table is loaded before the master table is loaded?
    Please let me know.
    Thanks,
    C

    Hello
    With the design you currently have, you will have potential sequencing issues. There is no magic in Data Services to solve this.
    You might want to consider building more complex real-time jobs that accept more complex data structures and have logic to process the data in dependency order.
    Michael

  • Bulky messages are not getting processed

    Hi All,
    Recently we upgraded our XI systems to EHP4.
    After the upgrade, bulky files are not getting processed in the XI system, while small files are processed successfully.
    We had faced the same issue earlier; at that time we maintained some parameters in transactions RZ10 and RZ11, and the HTTP timeout parameter under SXMB_ADM --> Integration Configuration.
    Those parameters are still in place, but now the message is not getting processed in XI when we check SXI_MONITOR. Message mapping also does not seem to be the problem, as we haven't put any bulk-related logic in there.
    The message is not getting processed beyond the Receiver Grouping step. Please find the attached log from the PerformanceHeader:
    <?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
    <!-- Receiver Grouping -->
    <SAP:PerformanceHeader xmlns:SAP="http://sap.com/xi/XI/Message/30">
      <SAP:RunTimeItem>
        <SAP:Name type="ADAPTER_IN">INTEGRATION_ENGINE_HTTP_ENTRY</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134229.930472</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="ADAPTER_IN">INTEGRATION_ENGINE_HTTP_ENTRY</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134232.052577</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="CORE">INTEGRATION_ENGINE</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134232.059576</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="CORE">INTEGRATION_ENGINE</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134232.07151</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="DBQUEUE">DB_ENTRY_QUEUING</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134232.071518</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="DBQUEUE">DB_ENTRY_QUEUING</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134237.239947</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_RECEIVER_DETERMINATION</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134237.241179</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_RECEIVER_DETERMINATION</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134237.250385</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_INTERFACE_DETERMINATION</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134239.999045</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_INTERFACE_DETERMINATION</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134240.001292</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_RECEIVER_MESSAGE_SPLIT</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134240.001413</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="PLSRV">PLSRV_RECEIVER_MESSAGE_SPLIT</SAP:Name>
        <SAP:Timestamp type="end" host="evoxicqab">20091210134240.026156</SAP:Timestamp>
      </SAP:RunTimeItem>
      <SAP:RunTimeItem>
        <SAP:Name type="DBQUEUE">DB_SPLITTER_QUEUING</SAP:Name>
        <SAP:Timestamp type="begin" host="evoxicqab">20091210134240.026164</SAP:Timestamp>
      </SAP:RunTimeItem>
    </SAP:PerformanceHeader>

    Hi
    Have you checked this thread? The same issue is discussed there:
    Performance of XI Interfaces
    Also check this blog:
    /people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts
    Regards
    Ramesh

  • Data packet not getting processed

    Hi SDNers,
    I am loading data from one ODS to 4 regions. The source ODS is loaded successfully; from there to the data targets, the load is failing or taking a long time.
    Up to the transfer rules the data is successful; in the update rules, the data packets are not getting processed.
    Kindly suggest a solution; points will be assigned.
    Thanks in advance

    Hi Katam,
    In the target ODSs, go to the monitor screen for the particular request -> in the menu bar go to Environment -> Transact. RFC -> In the Data Warehouse -> give the ID and date -> execute.
    Check whether there are entries there. Usually this queue will be stuck and you need to execute the LUWs in the queue manually.
    If it says "Transaction recorded", you need to execute it manually.
    Please revert if there are any issues.
