Flat file to Asynch RFC: R3 not getting updated

Hi all,
I have a flat file to async RFC scenario. I can see the file getting deleted and the chequered flag in MONI. In adapter monitoring I can see the message being processed for the RFC, but the R3 table it is supposed to update is not being updated.
Can anybody give ideas where the problem lies? In earlier blogs someone suggested checking SM58, but I don't know what inputs to give it.
Help is highly appreciated and points will be awarded.
Regards,
Nisheeta

Hi Nisheeta,
1) I hope your receiver RFC adapter is working fine (in adapter monitoring it should be green).
2) You mentioned that your RFC works correctly with the given inputs (which in this case you are passing through the file, right?). Just check that once more.
3) Call transaction SM58. This tool lists only those transactional RFCs that could not be carried out successfully or that had to be planned as batch jobs. The list includes the LUW ID and an error message. Error messages displayed in SM58 are taken from the target system. To display the text of a message, double-click on it.
So you have to check SM58 in SAP29 (the target system).
Regards
Suraj
PS: Guys, stop asking for points, let her problem get solved first... Thanks

Similar Messages

  • File to JDBC -  JDBC database not getting updated using UPDATE_INSERT

    Hi,
    I have done a file to JDBC scenario. The file is successfully picked up by the sender file channel and received by the receiver JDBC channel. There is no error in the receiver communication channel. I used the UPDATE_INSERT action in the mapping, which should update the database with every record it receives, but I found there is no update in the database table. I do not have access to the database, so I made a sender JDBC channel querying all data in it to see what new values are populated.
    I made a dummy sender JDBC channel to check which values are updated on a run of the above interface, but I found that no update occurs. Please help me out of this problem. (See the JDBC sketch after this thread for what UPDATE_INSERT amounts to.)
    Thanks
    Deepak Jaiswal

    The reason I ask is that it will show whether the message got successfully delivered to the target system.
    I want to see information like the following:
    2010-04-14 14:31:35 Success Message successfully received by messaging system. Profile: XI URL: http://server:port/MessagingSystem/receive/AFW/XI Credential (User): PIISUSER
    2010-04-14 14:31:35 Success Using connection JDBC_http://sap.com/xi/XI/System. Trying to put the message into the receive queue.
    2010-04-14 14:31:35 Success Message successfully put into the queue.
    2010-04-14 14:31:35 Success The message was successfully retrieved from the receive queue.
    2010-04-14 14:31:35 Success The message status set to DLNG.
    2010-04-14 14:31:35 Success Delivering to channel: Communication channel
    2010-04-14 14:31:35 Success MP: Entering module processor
    2010-04-14 14:31:35 Success MP: Processing local module localejbs/CallSapAdapter
    2010-04-14 14:31:35 Success Receiver JDBC adapter: processing started; QoS required: ExactlyOnce
    2010-04-14 14:31:35 Success JDBC adapter receiver channel CC: processing started; party  , service DB_BS
    2010-04-14 14:31:35 Success Database request processed successfully
    2010-04-14 14:31:35 Success MP: Leaving module processor
    2010-04-14 14:31:35 Success The message was successfully delivered to the application using connection JDBC_http://sap.com/xi/XI/System.
    2010-04-14 14:31:35 Success The message status set to DLVD.
    raj.
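
    For reference, here is a minimal plain-JDBC sketch of what the UPDATE_INSERT action effectively does: an UPDATE keyed on the key columns, falling back to an INSERT when no existing row was updated. The table, column and connection names below are placeholders, not taken from this thread.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class UpdateInsertDemo {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:yourdb://host:port/db", "user", "password")) {
                String key = "4711";
                String value = "new value";
                // Step 1: try to UPDATE the existing row identified by the key column.
                try (PreparedStatement upd = con.prepareStatement(
                        "UPDATE demo_table SET col_value = ? WHERE col_key = ?")) {
                    upd.setString(1, value);
                    upd.setString(2, key);
                    int updated = upd.executeUpdate();
                    // Step 2: if no row matched the key, INSERT a new one instead.
                    if (updated == 0) {
                        try (PreparedStatement ins = con.prepareStatement(
                                "INSERT INTO demo_table (col_key, col_value) VALUES (?, ?)")) {
                            ins.setString(1, key);
                            ins.setString(2, value);
                            ins.executeUpdate();
                        }
                    }
                }
            }
        }
    }

    If the channel reports success but the table still looks unchanged, checking whether the key values in the payload actually match existing rows tells you which branch (UPDATE or INSERT) should have fired.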

  • Data is not getting updated in table using RFC

    Hi Experts,
    In my scenario, I am calling an RFC using an RFC receiver channel. After running the scenario, the channel shows that the RFC executed successfully. But when I check the tables in the R/3 system, the data is not getting updated.
    Moreover, when we tried to execute the RFC manually in the R/3 system, the data was uploaded into the table successfully.
    Could anybody tell me what the reason could be that the data is not uploaded into the table when we send it through XI?
    Regards,
    Sari

    Hi Sari,
    As you have a scenario with an RFC receiver, and as you mentioned that the tables are not updated when you run through PI but are updated when you execute the RFC manually, the following are the options you can check (a call sketch follows this thread):
    -- Check the RFC communication channel; if everything is OK there, your RFC is getting triggered successfully. Since the tables are still not updated, go to SXMB_MONI, check the log, take the payload after mapping, and compare it with the input you use when you execute the RFC manually. I suspect the input you enter manually and the input the RFC receives through PI are different, and that is causing the problem; the comparison will show you the difference. In other words, the problem is most likely the data and not the RFC communication channel.
    -- Otherwise, if possible, configure your own user ID in the PI RFC receiver channel and put a breakpoint on the ABAP side, so that when PI hits the RFC you get it in debug mode and can see what is going wrong.
    Thanks,
    Bhupesh
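
    To make the comparison concrete, here is a minimal SAP JCo 3 sketch for replaying the after-mapping values outside PI, so the result can be compared with the manual run. It assumes a configured destination named "ABAP_AS" and a hypothetical RFC Z_UPDATE_TABLE with one import parameter and one table parameter; all names are placeholders, not from this thread.

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoTable;

    public class RfcReplay {
        public static void main(String[] args) throws JCoException {
            // Destination "ABAP_AS" must be configured beforehand (e.g. an ABAP_AS.jcoDestination file).
            JCoDestination dest = JCoDestinationManager.getDestination("ABAP_AS");
            JCoFunction fn = dest.getRepository().getFunction("Z_UPDATE_TABLE");
            if (fn == null) {
                throw new IllegalStateException("Z_UPDATE_TABLE not found in the repository");
            }
            // Fill the call with exactly the values from the after-mapping payload in SXMB_MONI.
            fn.getImportParameterList().setValue("I_KEY", "4711");
            JCoTable rows = fn.getTableParameterList().getTable("T_DATA");
            rows.appendRow();
            rows.setValue("FIELD1", "value taken from the payload");
            // Execute the call and then check the R/3 table contents.
            fn.execute(dest);
            System.out.println("RFC executed; compare the table contents with the manual run.");
        }
    }

    If the table updates with this call but not through PI, the difference is in the payload; if it does not update here either, the problem lies inside the RFC itself, which the ABAP-side breakpoint Bhupesh mentions will show.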

  • File to Proxy----Tables not getting updated.

    Hi all,
    I have a file to proxy scenario, where data from a file is uploaded into a BAPI and in turn updated into tables.
    If I take the payload from MONI and test it in SPROXY, the tables get updated.
    But when I run the scenario from XI, the tables are not getting updated.
    Please help......

    Hi,
    Check this out in case you missed any step. It covers exactly your scenario:
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy
    Hope this helps you.
    Regards
    Aashish Sinha
    PS : reward points if helpful

  • File does not get updated in document library

    I'm encountering weird behavior which I hope you can suggest how to troubleshoot further. There are some document libraries where, whenever I upload a new version of a file, the file's contents do not get updated (when I open the file after check-in, it still shows the old content), but the Modified By and Modified fields do get updated.
    Remote Blob Storage is turned on for our site; could it be RBS-related? However, only a few document libraries exhibit this kind of behavior.

    Hi Kev,
    There's no particular pattern in the differences between libraries exhibiting the behavior and those that do not. If it matters, the Style Library is experiencing this kind of behavior. We also have a document library renamed to "Images" (its URL is /Images) that also exhibits this behavior. I also tried disabling and enabling versioning, still the same behavior. I even tried deleting the file and re-uploading with the same file name, still the same behavior!

  • How to keep the flat file disassembler in BizTalk from splitting output into multiple files???

    Hi,
    How do I keep the flat file disassembler in BizTalk from splitting the output into multiple files according to the number of transaction sets?
    The map flow is from a flat file to X12 standard files.
    Please advise...

    You can go through the following links:
    Developing Custom Pipeline Components
    How to Develop BizTalk Custom Pipeline Components - Part1
    Custom BizTalk Pipeline Disassembler Component
    Creating a custom BizTalk 2010 pipeline component–Part I
    And one more which talks about extending the FFDASM: http://msdn.microsoft.com/en-us/library/ee267856(v=bts.10).aspx
    Regards,
    Rachit
    Vote, if you find it useful.

  • Changes to Excel files are not getting updated

    File updates are not reflected. I modified content, added data validation and saved the file back; the next day the file is still the way it was before modification.
    Another file on another site shows a similar situation: content was changed (a form was added), but the next day the change is not reflected.

    Hi,
    According to your post, my understanding is that changes to Excel files are not getting updated.
    I recommend configuring Excel Services data refresh by using Secure Store and an external Office Data Connection (ODC) file.
    For more information, you can refer to:
    Configure Excel Services data refresh by using external data connections in SharePoint Server 2013
    If you use PowerPivot, you can refer to:
    Refreshing PowerPivot Data in SharePoint 2013
    If you use Office Web Apps, you can use New-SPWOPISuppressionSetting -extension xlsx -action view to refresh data.
    You can refer to:
    PowerPivot for SharePoint - Browser Refresh Fails (Data Refresh not supported in Office Web Apps)
    In addition, here is a similar thread for your reference:
    http://social.technet.microsoft.com/Forums/office/en-US/2c4009f3-62bc-4af0-9e16-e40e9d418e3b/user-changes-to-sharepoint-documents-lost
    Best Regards,
    Linda Li
    TechNet Community Support

  • Matlab workspace not getting updated while labview mex file is running

    Hi,
    Let me start with an overview of the system for better understanding.
    The system acquires data from a DAQ6070e using LabVIEW 7.1. This data is then used in MATLAB for processing. All of this needs to happen in real time, hence the data has to be acquired in LabVIEW and output to MATLAB in real time.
    In order to do this we generated the LabVIEW and MATLAB code separately. After days of hard work we managed to import the LabVIEW VI into MATLAB using the Math Interface Toolkit, but we have now run into another interesting problem.
    The MATLAB workspace is not getting updated while the LabVIEW VI is running inside MATLAB; it only updates after the VI has run. For example, if I need to monitor the system for 300 seconds, then ideally the LabVIEW VI should output the samples of each acquisition to MATLAB as they are acquired. What actually happens is that LabVIEW keeps running for 300 seconds without outputting any data during those 300 seconds and only outputs the last set of acquisition values at the end of the 300 seconds, which defeats the purpose of the whole program.
    Kindly help
    Regards
    Manuj

    Thanks mate,
    What you said about the MEX file not being able to "store" data between runs, I realised this morning. As a result, I don't think your suggestion of a for loop will be effective either, as the LabVIEW MEX file will not acquire data during MATLAB code execution and vice versa.
    Hence I have decided to do it the other way around: we will now try to run the MATLAB code in LabVIEW.
    This leads to another problem.
    As this is a real-time operation, I need to find a way of storing the data in computer memory and then reading from it (when I say memory I do not mean writing it to a file and reading from it, as that is slow), because I think the buffer I am specifying in my AI Config is not big enough.
    I was thinking maybe I should use an RT FIFO queue, but I have not been able to find out the difference between buffer memory and FIFO memory, i.e. is buffer memory supposed to be RAM or hard disk memory?
    I am posting the VI here for easy understanding.
    All help is very much appreciated
    Attachments:
    RT-loop2ver21.vi 231 KB

  • OpenScript 12.2.0.1 Failed to access the databanks,CSV index file not getting updated

    Hi,
    I have been working with OATS for quite some time. Of late I'm facing a unique problem out of nowhere: when I run a web functional script I get the error "oracle.oats.scripting.modules.basic.api.exceptions.VariableNotFoundException:"
    even though I have added the required databank. These scripts were working fine for the past 6 months and suddenly started giving me problems.
    One thing I noticed is that if I make any modification to the databank CSV file, the corresponding INDEX file is not getting updated; hence the system is unable to read the databank data, I guess.
    Has anyone faced the same problem? Please help.


  • Adapter Engine Cache not getting Updated

    Hello everyone,
    My Adapter Engine cache is not getting updated on my development server.
    Please find attached a file with the error description I get after refreshing the full ID cache from the Administration tab.
    Regards,
    Sushant

    Hi Sushant,
    Please check whether the RFC destination INTEGRATION_DIRECTORY_HMI is working or not.
    Check the parameters as suggested by others for this destination type.
    Check whether the user maintained in this RFC destination is locked; if so, please unlock it.
    Try to change the password for this user and try again.
    If none of these help, please refer to SAP Note 764176.
    Regards
    Bhargava Krishna

  • Base table is not getting updated

    Hi friends,
    I am trying to update attribute18 of the per_vacancies table by adding 1 to it.
    When I run the concurrent program it does not update the table.
    What might be the reason?
    This is my code:
    CREATE OR REPLACE PACKAGE BODY apps.update_vacancy_pkg
    IS
       PROCEDURE update_vacancy_prc (p_err_buf OUT VARCHAR2, p_ret_code OUT NUMBER)
       IS
          l_object_version_number   per_vacancies.object_version_number%TYPE;
          l_vacancy_id              per_vacancies.vacancy_id%TYPE;
          l_inv_pos_grade_warning   BOOLEAN;
          l_inv_job_grade_warning   BOOLEAN;
          l_validate                NUMBER := 0;
          l_assignment_changed      NUMBER;
          l_return_status           VARCHAR2 (240);
          l_openings                VARCHAR2 (240);
          l_err_msg                 VARCHAR2 (200);
          l_vac_id                  per_vacancies.vacancy_id%TYPE;
          l_attr                    VARCHAR2 (100);
          l_attr1                   VARCHAR2 (100);
          l_attr2                   VARCHAR2 (100);
          l_attr3                   VARCHAR2 (100);
          l_attr4                   VARCHAR2 (100);
       BEGIN
          -- Pick up the latest vacancy with its object version number and number of openings
          BEGIN
             SELECT object_version_number, attribute7, vacancy_id
               INTO l_object_version_number, l_openings, l_vac_id
               FROM per_vacancies
              WHERE vacancy_id IN (SELECT MAX (vacancy_id) FROM per_vacancies);
          EXCEPTION
             WHEN OTHERS
             THEN
                l_vac_id := 0;
                l_openings := 0;
          END;
          -- Build the new attribute18: keep the prefix up to 'C' and add 1 to the number that follows it
          BEGIN
             SELECT MAX (attribute18) INTO l_attr FROM per_vacancies;

             SELECT REGEXP_SUBSTR (l_attr, '.*C') INTO l_attr1 FROM DUAL;

             SELECT REGEXP_REPLACE (l_attr, '^.*C(.+)$', '\1') INTO l_attr2 FROM DUAL;

             SELECT TO_NUMBER (l_attr2) + 1 INTO l_attr3 FROM DUAL;

             SELECT l_attr1 || l_attr3 INTO l_attr4 FROM DUAL;
          EXCEPTION
             WHEN OTHERS
             THEN
                l_attr4 := 0;
          END;

          l_object_version_number := 0;
          l_assignment_changed := 0;
          l_return_status := 0;

          per_vacancy_swi.update_vacancy
             (p_validate                   => l_validate,
              p_effective_date             => SYSDATE,
              p_vacancy_id                 => l_vac_id,
              p_object_version_number      => l_object_version_number,
              p_number_of_openings         => l_openings,
              p_budget_measurement_value   => l_openings,
              p_attribute18                => l_attr4,
              p_assignment_changed         => l_assignment_changed,
              p_return_status              => l_return_status);

          COMMIT;

          fnd_file.put_line (fnd_file.LOG, 'Executed Vacancy ' || l_vac_id);
          fnd_file.put_line (fnd_file.LOG, 'Assignment changed = ' || l_assignment_changed);
          fnd_file.put_line (fnd_file.LOG, 'Return Status = ' || l_return_status);
          fnd_file.put_line (fnd_file.LOG, 'l_attr4 = ' || l_attr4);
       END;
    END;
    The log file shows attribute18 with 1 added to it, but the table is not getting updated.
    Can someone suggest a solution please?
    Thanks
    Edited by: 776317 on Apr 27, 2011 9:15 PM
    Edited by: 776317 on Apr 27, 2011 10:01 PM

    But how come the seeded API has this?
    PROCEDURE update_vacancy
    (p_validate in number default hr_api.g_false_num
    ,p_effective_date in date
    ,p_vacancy_id in number
    ,p_attribute17 in varchar2 default hr_api.g_varchar2
    ,p_attribute18 in varchar2 default hr_api.g_varchar2
    ...);
    Enable trace/debug, this may give you an idea about the data which is not saved.
    Thanks,
    Hussein

  • HIREDATE & FIREDATE of Employee are not getting updated in t-code PA20.

    Hi all,
    HIRE DATE & FIRE DATE are not getting updated whenever employee data is loaded into SAP (transaction PA20),
    even though the input data contains both the HIRE DATE and FIRE DATE fields.
    Can anyone please suggest why this is happening?
    Thanks,
    Lava.

    Hi Girishkg,
    The default IIS limit for the upload file size is 30MB. If the selected files are larger than 30MB, the files are not uploaded.
    The limit can be set in the web application's web.config file or in IIS.
    There are two detailed articles about setting the limit:
    http://www.brainlitter.com/2009/07/13/sharepointcannot-upload-documents-larger-than-30mb-on-windows-server-2008-or-sbs-2008-application-servers/
    http://expectedexception.wordpress.com/2011/02/08/upload-multiple-files-fail-without-error/
    Feel free to reply if the issue still exists.
    Best regards
    Wendy Li
    TechNet Community Support

  • CPS parameters are not getting updated in SAP

    We have recently migrated our SAP and CPS servers; since then we have been facing a few CPS issues, and they are affecting a lot of key batch jobs. One typical example is given below.
    CPS is supposed to trigger a DD simulation job by copying the SOURCE_DATE_ID (1-Dec-2006) and modifying the parameters BUDAT and FAEDA_HIGH. The parameter FAEDA_HIGH is calculated as time.now() + 5 working days, as defined in the expression. When we tried to run the job today with the date 24 Dec 2010, CPS calculated FAEDA_HIGH correctly as 2-Jan-2011. In SAP the variant gets created, but the date value for the field FAEDA_HIGH is not getting updated properly; it displays some other date, e.g. it shows 4-Jan-2011. I removed the expression and gave another date for the FAEDA_HIGH field, 10/01/2011, but in SAP it is not getting updated.
    Has anyone faced a similar kind of issue? Please share your experience.
    Regards,
    Ramprasad.

    Hi,
    Please find our XBP version below:
    XBP connection pool for SAP system ISU
      XMI interface version: 2.0
    Since we have taken a backup and restored the SAP database, the CPS transport file already exists in the system.
    Basically, what is happening is that the CPS parameter FAEDA_HIGH expressions are not used for the due date calculation. CPS just copies the source date-id to the target id, and similarly for the due date, based on the difference:
    source_id variant date 01-12-2006, due date 02-12-2006 (next day)
    target_id variant date 25-12-2010, due date 26-12-2010 (next day)
    I tried changing the source date id from 01-12-2006 to 24-12-2010:
    Source variant date 24-12-2010, due date 04-01-2011 (10 days)
    Target variant date 25-12-2010, due date 05-01-2011 (10 days)
    Regards,
    Ram

  • Acrobat 7.0 properties not getting updated

    Hello everybody,
    The PDF properties of the file that I am changing in my application are not reflected in the PDF file.
    I have used the code below to set the property:
    CAcroPDDoc acroPDDoc;
    acroPDDoc.SetInfo(_T("Creator"), pdfDocProperties.m_Creator);
    acroPDDoc.Save(1, strFileName);
    The document properties are set fine for the file in Acrobat 9.0, whereas this does not work properly for Acrobat 7.0.
    Below is the code for CAcroPDDoc::Save:
    long CAcroPDDoc::Save(short nType, LPCTSTR szFullPath)
    {
        long result;
        static BYTE parms[] = VTS_I2 VTS_BSTR;
        InvokeHelper(0x17, DISPATCH_METHOD, VT_I4, (void*)&result, parms, nType, szFullPath);
        return result;
    }
    Is the syntax wrong, or is this as per the specification of Acrobat 7.0?
    Regards,
    Nethaji

    OK Aandi,
    let me summarize what our problem is.
    We are using the code below to set a document property, e.g. the Title, of a PDF file through our application:
    CAcroPDDoc acroPDDoc;
    acroPDDoc.CreateDispatch(_T("AcroExch.PDDoc"));
    if (acroPDDoc.Open(strFileName))
    {
        acroPDDoc.SetInfo(_T("Title"), pdfDocProperties.m_Title);
        acroPDDoc.Save(1, strFileName);
        acroPDDoc.Close();
    }
    The Title field is not getting updated in the PDF file; whenever acroPDDoc.Save is executed, the Title field stays as it was. We have tried the several scenarios given by you, but still we cannot conclude this one.
    The CAcroPDDoc::Save code is:
    long CAcroPDDoc::Save(short nType, LPCTSTR szFullPath)
    {
        long result;
        static BYTE parms[] = VTS_I2 VTS_BSTR;
        InvokeHelper(0x17, DISPATCH_METHOD, VT_I4, (void*)&result, parms, nType, szFullPath);
        return result;
    }
    The above CAcroPDDoc::Save works properly for Acrobat 9.0; it updates the Title field.
    This problem occurs in Acrobat 7.0 and not in 9.0.
    This is our problem, please help us.

  • Backing bean properties not getting updated

    I have an input text in my JSP page and it is component-bound to a backing bean. The problem is that when I change the value of the inputText in the JSP, the value is not getting updated in the backing bean.
    Here is the inputText definition in the JSP page:
    <h:inputText binding="#{backing_employeeEdit.firstName}"
    id="firstName"/>
    and here is the setter in the backing bean:
    public void setFirstName(HtmlInputText inputText1) {
        this.firstName = inputText1;
    }
    Please note that there are no validation errors, because the page navigates successfully. The only problem is that the new value is not updated.

    I don't know what exactly you want to do... Here is Java code that is working fine for me.
    Emp.java
    public void action() {
        System.out.println(inputText.getValue());
        Emp tt = new Emp();
        inputText.setValue(inputText.getValue().toString() + "bilal");
        tt.setInputText(inputText);
        System.out.println(inputText.getValue());
    }
    JSP...
    <f:view>
    <h:form>
    <h:inputText binding="#{emp.inputText}" />
    <h:commandButton value="submit" action="#{emp.action}" />
    </h:form>
    </f:view>
    faces-config file contents....
    <managed-bean>
        <managed-bean-name>emp</managed-bean-name>
        <managed-bean-class>com.nous.application.Emp</managed-bean-class>
        <managed-bean-scope>session</managed-bean-scope>
    </managed-bean>
    Here the scope has to be session, otherwise you will lose the old value...
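
    As an alternative to component binding, here is a minimal sketch of value binding, where JSF writes the submitted text straight into a String property during the Update Model Values phase. The bean and property names below (EmployeeEditBean, firstName, save) are illustrative, not taken from the thread; the JSP would reference the property as value="#{backing_employeeEdit.firstName}" instead of binding="...".

    // Hypothetical managed bean using value binding instead of component binding.
    public class EmployeeEditBean {

        private String firstName;

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }

        public String save() {
            // By the time the action method runs, firstName already holds the submitted value.
            System.out.println("Submitted first name: " + firstName);
            return null; // stay on the same view
        }
    }

    With value binding the setter receives a plain String, so the backing bean no longer needs the HtmlInputText component at all, whereas the component-bound variant only sees the new value through inputText.getValue().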
