I want to capture system message during data upload.

Hello,
I want to capture the system messages issued during a data upload.
How should I do this?
Suppose that during CALL TRANSACTION the system shows a message such as:
"Info record for vendor 102925 and material DYND80000 does not exist"
How can I capture that message?

Hi Megha,
CALL TRANSACTION tcode USING i_bdcdata
                        MODE lws_mode
                      UPDATE lws_update
               MESSAGES INTO i_messages.

  CLEAR wa_error.
  IF sy-subrc NE 0.
    cnt_failed = cnt_failed + 1.
    wa_error-status = c_fail.
    LOOP AT i_messages. "WHERE msgtyp EQ 'E' OR msgtyp EQ 'A'.
      flg_fail = c_x.
*---Call FM to get the error message text
      CALL FUNCTION 'MESSAGE_PREPARE'                       "#EC *
        EXPORTING
          language               = 'E'
          msg_id                 = i_messages-msgid
          msg_no                 = i_messages-msgnr
          msg_var1               = i_messages-msgv1
          msg_var2               = i_messages-msgv2
          msg_var3               = i_messages-msgv3
          msg_var4               = i_messages-msgv4
        IMPORTING
          msg_text               = lws_text
        EXCEPTIONS
          function_not_completed = 1
          message_not_found      = 2
          OTHERS                 = 3.
    ENDLOOP.
  ENDIF.
Reward points if this helps.
Manish
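For completeness, here is a minimal sketch of the declarations the snippet above relies on; the variable names are taken from the snippet, while the mode/update values are illustrative assumptions. CALL TRANSACTION ... MESSAGES INTO collects its messages in an internal table of the standard structure BDCMSGCOLL:

```abap
* Messages collected by CALL TRANSACTION ... MESSAGES INTO are stored
* in an internal table of the standard structure BDCMSGCOLL.
DATA: i_bdcdata  TYPE TABLE OF bdcdata    WITH HEADER LINE,
      i_messages TYPE TABLE OF bdcmsgcoll WITH HEADER LINE,
      lws_text   TYPE c LENGTH 120,  " message text from MESSAGE_PREPARE
      lws_mode   TYPE c VALUE 'N',   " 'N' = no screen display (assumption)
      lws_update TYPE c VALUE 'S'.   " 'S' = synchronous update (assumption)
```

Each BDCMSGCOLL row carries msgid, msgnr and msgv1-msgv4, which is exactly what MESSAGE_PREPARE needs to build the full message text.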

Similar Messages

  • Has anyone been able to upload an ibooks file with audio only files (m4a) in it?

    Has anyone been able to upload an ibooks file with audio only files (m4a) in it? I keep getting the following error message during the upload in iTunes Producer: ERROR ITMS-9000: "Files of type audio/x-m4a are not allowed outside of widgets. then it names the file as an m4p file. Everything works beautifully on the iPad through Preview, and validates through iTunes Producer up until the attempted upload. If you've been able to accomplish this, please let me know how you prepared your audio files. Many thanks.

    Hello Fellow iBook Authors!
    Today I received the same error that you all have been discussing. I tried selecting the DRM option,
    and this did not work for me, though I'm glad it did for some. Here's what I did as a workaround...
    Since iBooks Author did not have a problem with videos, I simply used one of my video programs, ScreenFlow, to turn the audio into an .m4v video file. I added an image and extended its timing to span the length of the audio file, then exported it as an .mov. I then opened the file in QuickTime and exported it to iTunes.
    You can use iMovie, Camtasia, or any other program that will let you export the audio as a movie file. Does this make sense? I hope this helps, at least in the short term.
    Michael Williams

  • How do I set up an auto reply for all received messages during dates that I am not using email?

    I want an auto reply to go to all senders of email during dates that I will not be able to access email. I want to customize this reply.

    https://support.mozilla.org/en-US/kb/vacation-response

  • Columns Required during Data upload in DTW

    Hi,
    How can we know which columns are required during an upload of opening balances in DTW?
    I am trying to upload BP balances, but if the required columns are left empty, the upload does not happen.
    Jyoti

    Check this WIKI which explains 'How can I determine which fields are mandatory in a Data Transfer Workbench (DTW) template? '
    [https://www.sdn.sap.com/irj/sdn/wiki?path=/pages/viewpage.action&pageid=36438280]

  • Capturing system time and date and the time difference and date difference.

    Hello Experts,
    I have a requirement where I need to capture the start time and date of a SELECT query, as I run the program around midnight (i.e. 11:55 p.m.).
    I tried the system fields sy-datum and sy-uzeit to capture the system date and time when my program runs, but they get reset when the date changes.
    I also tried GET TIME, but that refreshes sy-datum as well, so I capture the next day's date, which results in a wrong calculation of the time.
    Please help with your valuable answers.
    Thank you,
    Srinivas.

    Hi,
    As per my understanding, you want to fetch the data created between the program's last run date/time and the current date/time.
    If I am right, please follow the procedure below:
    1. Create a custom table to store the program name and the date/time of its last execution.
    2. At the INITIALIZATION event, capture the current date and time:
    initialization.
      v_curr_time = sy-uzeit.           " Current time
      v_curr_date = sy-datum.           " Current date
    3. Get the last run date and time from the custom table.
    4. Execute your logic.
    5. Finally, update the custom table with the date and time captured in step 2 (do not read sy-datum again at this point; use the global variables populated in step 2).
    Thanks
    Subhankar
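    The steps above can be sketched as follows; the custom table name ZLAST_RUN and its fields are illustrative assumptions, not a confirmed design:

    ```abap
    REPORT zcapture_last_run.

    * Hypothetical custom table ZLAST_RUN with fields:
    *   progname TYPE progname, last_date TYPE sy-datum, last_time TYPE sy-uzeit.
    DATA: v_curr_date TYPE sy-datum,
          v_curr_time TYPE sy-uzeit,
          wa_last_run TYPE zlast_run.

    INITIALIZATION.
    * Capture the timestamp once, before any long-running step.
      v_curr_date = sy-datum.
      v_curr_time = sy-uzeit.

    START-OF-SELECTION.
    * Read the previous run's timestamp from the custom table.
      SELECT SINGLE * FROM zlast_run INTO wa_last_run
        WHERE progname = sy-cprog.

    * ... select the records created between wa_last_run-last_date/time
    *     and v_curr_date/v_curr_time, then process them ...

    * Persist the timestamp captured at INITIALIZATION, not a fresh
    * sy-datum, so the next run does not miss records created while
    * this run was still executing.
      wa_last_run-progname  = sy-cprog.
      wa_last_run-last_date = v_curr_date.
      wa_last_run-last_time = v_curr_time.
      MODIFY zlast_run FROM wa_last_run.
    ```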

  • Capturing Dynamic lengths during data transmissions From CRM to Others

    hi friends
    I am sending data from CRM to SAP and then from SAP back to CRM.
    During the transmission, one of the fields, an amount, has length 30 in CRM but 17 in SAP; when sending values from CRM, if the amount length exceeds 17, my application short dumps.
    I have a validation part on the SAP server where I can add some code to avoid the short dump, but what exactly do I need to code to capture the dynamic length of the CRM amount values?
    Does anyone have an idea how to capture dynamic field lengths during these transmissions?
    Regards
    Anil P

    Hi Surender,
    I think I still have a problem. I have about 26 fields in the dynamic selections. I am filling one field and checking for that one in the program. Below is my program; please let me know where I am going wrong.
    TYPES: RSDS_SELOPT_T LIKE RSDSSELOPT OCCURS 10.
    TYPES: BEGIN OF RSDS_FRANGE,
             FIELDNAME LIKE RSDSTABS-PRIM_FNAME,
             SELOPT_T  TYPE RSDS_SELOPT_T,
           END OF RSDS_FRANGE.
    TYPES: RSDS_FRANGE_T TYPE RSDS_FRANGE OCCURS 10.
    TYPES: BEGIN OF RSDS_RANGE,
             TABLENAME LIKE RSDSTABS-PRIM_TAB,
             FRANGE_T  TYPE RSDS_FRANGE_T,
           END OF RSDS_RANGE.
    DATA: WA_RSDS_RANGE TYPE RSDS_RANGE.
    CALL FUNCTION 'RS_REFRESH_FROM_DYNAMICAL_SEL'
      EXPORTING
        CURR_REPORT        = SY-CPROG
        MODE_WRITE_OR_MOVE = 'M'
      IMPORTING
        P_TRANGE           = WA_RSDS_RANGE
      EXCEPTIONS
        NOT_FOUND          = 1
        WRONG_TYPE         = 2
        OTHERS             = 3.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.

  • Error Message during Data Migration.

    Hi All,
    I have to migrate data for opportunities. I read the data and pass it to 'BAPI_BUSPROCESSND_CREATEMULTI'. This FM returns the message "A log has been generated for single document". When I then pass the GUID to FM 'BAPI_BUSPROCESSND_SAVE', it gives the error message "The document could not be saved".
    I am not able to find out why it is not saving that GUID. If anyone has any idea, or if you have sample code, please reply.
          CALL FUNCTION 'BAPI_BUSPROCESSND_CREATEMULTI'
            TABLES
          HEADER       = lt_bapibus20001_header_ins
          OPPORTUNITY  = lt_bapibus20001_opportunity
          INPUT_FIELDS = lt_bapibus20001_input_fields
          PARTNER      = lt_bapibus20001_partner_ins
          STATUS       = lt_bapibus20001_status_ins
          RETURN       = lt_bapiret2.
          lw_objects_to_save-guid  = lv_guid.
          APPEND lw_objects_to_save TO lt_objects_to_save.
          CALL FUNCTION 'BAPI_BUSPROCESSND_SAVE'
            EXPORTING
              update_task_local = space
              save_frame_log    = GC_X
            IMPORTING
              LOG_HANDLE        = LOG_HANDLE
            TABLES
              objects_to_save   = lt_objects_to_save
              saved_objects     = lt_saved_objects
              return            = lt_bapiret2.

    I was going through the log and there is an error 250 in there too.

  • System crash during data archiving

    Gurus:
    We have a serious concern about what happens if our R3 server crashes during our data archiving runs:
    1) If the crash happens in the write phase, can we simply remove the generated archive and restart the write job?
         We are NOT sure whether the second run would archive the records that were already archived in the failed first run.
    2) If the crash happens in the delete phase, is a simple restart of the same delete job enough? (We use "store before delete".)
    Because this archiving is very important to us, please help.
    Thanks a lot!

    Christoph:
    Thank you for your reply.
    This crash has not happened yet.
    We are making a plan about how to cope with this possibility.
    But we do not have experience with any crash during archiving process.
    Therefore we are asking for people's experience with the following scenarios:
    1) If the crash happens in the write phase, can we simply remove the generated archive and restart the write job?
    We are NOT sure whether the second run would archive the records that were already archived in the failed first run.
    2) If the crash happens in the delete phase, is a simple restart of the same delete job enough? (We use "store before delete".)
    Would you please help?
    Thank you again.

  • List of data targets is not visible during data upload

    Hi all,
    I am trying to load user-defined transactional data into an InfoObject. I have done all the necessary customization steps, such as creating the application component, assigning the data sources, creating InfoPackages, and then creating update rules in the InfoCubes; moreover, I wrote a routine which calculates sales revenue based on cost and quantity sold.
    My problem is that when I created the InfoPackage, it does not list any data targets. Please, can anyone give tips in this regard?
    thanks in advance
    regards,
    a.fahrudeen

    Hi Fahrudeen,
    I am a little confused here... you say you want to load transaction data into an InfoObject? What was that?
    You can load transaction data only into data targets such as InfoCubes and DataStore Objects. If you are loading data into InfoObjects, that means you are loading master data, for which you obviously won't have any data targets listed in your InfoPackage. Only when loading transaction data will the data targets be listed in your InfoPackage.
    Regards
    Manick

  • Remove double records during data upload from one InfoCube to another

    Dear Experts
    We have transactional financial data available in an InfoCube, including values cumulated by period. Some companies have 12 reporting periods (0FISCPER3) and some companies have 16 periods (but all 16 periods are not always filled). The data must be prepared for a consolidation system which expects only 12 periods. Therefore I built a routine with the following logic:
    If period > 12, result = 12, else result = source field.
    But as the data target is (and must be) an InfoCube, the new values with reporting period 12 do not overwrite the existing ones; they are summarised instead. This means the original records with period 12 coexist with the new records - see the example:
    Records before transformation:
    Period   Amount
    12       100
    13       120
    Records after transformation in the InfoCube:
    Period   Amount
    12       100
    12       120
    This would lead to the following aggregation:
    Period   Amount
    12       240
    But as the values per period are cumulated, the consolidation system only needs the last period, so there should be only one record left in the InfoCube:
    Period   Amount
    12       120
    Is it possible to delete duplicate records, or do you have any other idea how to keep only the record with the last period (e.g. period 13) and assign it the value 12?
    Thanks a lot in advance for your help!
    Regards
    Marco
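    One possible sketch of the routine logic, assuming the deduplication can run on an internal table (end-routine style) before the data reaches the InfoCube; all structure and field names here are illustrative:

    ```abap
    * Illustrative end-routine-style logic: the result table and its
    * fields (comp_code as the key, fiscper3, amount) are assumptions.
    TYPES: BEGIN OF ty_rec,
             comp_code TYPE c LENGTH 4,
             fiscper3  TYPE n LENGTH 3,
             amount    TYPE p LENGTH 8 DECIMALS 2,
           END OF ty_rec.
    DATA: lt_result TYPE STANDARD TABLE OF ty_rec,
          ls_rec    TYPE ty_rec.

    * Sort so that, per key, the record with the highest period comes first.
    SORT lt_result BY comp_code ASCENDING fiscper3 DESCENDING.

    * Keep only the first (= latest-period) record per key ...
    DELETE ADJACENT DUPLICATES FROM lt_result COMPARING comp_code.

    * ... and only then cap the period at 12, so the cube receives one
    * record per key instead of two records that would be summed up.
    LOOP AT lt_result INTO ls_rec WHERE fiscper3 > '012'.
      ls_rec-fiscper3 = '012'.
      MODIFY lt_result FROM ls_rec.
    ENDLOOP.
    ```

    With the example data (period 12 / 100 and period 13 / 120), this keeps only the period-13 record and reassigns it period 12, which matches the desired result above.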

    Hi,
    You have two options here: you can put a DSO between the DataSource and the InfoCube and load the delta using the change log.
    The second option is to delete the overlapping request from the InfoCube; it deletes the previous request and loads the new one.
    Check the below article:
    [Automatic Deletion of Similar or Identical Requests from InfoCube after Update|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0431c48-5ba4-2c10-eab6-fc91a5fc2719]
    Hope this helps...
    Rgs,
    Ravikanth

  • Issue while creating service PR during data upload from xl

    Hi Sap gurus,
    I have an issue while uploading data from an Excel sheet for purchase requisitions.
    I am using the following BAPI; the code follows.
      CALL FUNCTION 'BAPI_REQUISITION_CREATE'
*       EXPORTING
*         skip_items_with_error          =
        IMPORTING
          number                         = prn
        TABLES
          requisition_items              = i_item
          requisition_account_assignment = i_accnt_assngmnt
*         requisition_item_text          =
*         requisition_limits             =
*         requisition_contract_limits    =
          requisition_services           = i_item_service
          requisition_srv_accass_values  = i_item_servval
*         requisition_services_text      =
          return                         = return1.
    The purchase requisition is being created; however, it does not update the service details, even though I am using item category 'D', which is the service category, and I pass the service details to the BAPI. Please can you help me resolve this issue?
    Also, please tell me the difference between the line number and the package number in the service line item details.
    I am passing the item number to both fields, line number and package number.
    Or is it a configuration issue? Please let me know.
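    One thing worth checking (an assumption on my part, not confirmed by the thread): BAPIs do not commit their database changes on their own, so without an explicit commit after BAPI_REQUISITION_CREATE, parts of the document may never reach the database. A minimal sketch:

    ```abap
    * After BAPI_REQUISITION_CREATE, check the RETURN table for errors
    * and commit explicitly; without this, the created document (or
    * parts of it, such as service lines) may never be persisted.
    READ TABLE return1 WITH KEY type = 'E' TRANSPORTING NO FIELDS.
    IF sy-subrc <> 0.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = 'X'.   " wait for the update task to finish
    ELSE.
      CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
    ENDIF.
    ```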

    Hi Experts,
    If I use the mapping shown below, then I get the error "Business Partner C6220 does not exist" and the BPs are not created.
    1 C6220 PERS_NUMBER
    2 Mr PERS_TITLE_KEY
    3 A PERS_FIRSTNAME
    4 D PERS_INITIAL
    5 Kale PERS_LASTNAME
    6 [email protected] PERS_E_MAIL
    7 A1 Road PERS_STREET
    8 Pune PERS_CITY
    9 IN PERS_COUNTRYISO
    10 411001 PERS_POSTL_COD1
    11 13 PERS_REGION
    12 PERS_LANGU_CORR
    Can someone suggest solution?
    Thanks and regards
    Ambar

  • Capture a message

    Hello guru's,
    I have the following problem:
    I have a BPM where I make a sync call to a partner. My abstract message interface sends out an ORDERS05 IDoc and needs to receive an ORDRSP05 IDoc. Now I get back a different kind of message: an error message. So if I look in my moni, I see at Response Message a message with a different format.
    Now I want to capture that message and do some mapping with it. So I created a switch which determines whether the response is an IDoc or not. If not, I wanted to do some other mapping, but this doesn't work; I get the message "no payload found".
    How can I map a response message that is different from ORDRSP05? Is it possible?
    Best regards,
    Guido

    When you get this different message, is there any error reported by the system? I mean, is any exception raised? In that case you can use an exception handler and map this new message in an error mapping.
    But, as you said, does the error have an XI SOAP structure that is understood by XI? Only then would you be able to put a mapping on it.
    VJ

  • Avoiding system messages in oracle forms.

    Hi All,
    I'm working with Oracle Forms using Oracle Form Builder in R12. I want to suppress the system message (the save message), so I am using :system.message_level := 5; but I still cannot suppress the save message. How can I solve this?
    Thanks & Regards,
    Macks

    You could override the ON-MESSAGE trigger; Forms will then fire this trigger instead of showing the standard message. Whatever you write there will happen: you can suppress the given message and let everything else go through.

  • FIM data upload issues

    Hi
    When i am using FIM 10.0 to upload data from flat file i get an error as follows:
    Failed to generate ATL file: The job launch for job 'test_FF' failed with error 'The repository bods for the batch job test_FF cannot be found.'
    Experts please advise
    regards
    Divneet

    Hi Marc,
    thank you for your efforts.
    The problem has been solved and the job was executed successfully.
    The error was a wrong entry for the Data Services URL: "http://localhost:8080/" was used instead of the name of the server.
    Although the system message was "Data Services Information. The information you provided is correct", this was the cause of the failed job execution.
    Sometimes one has to review things again and again to find such a snag.
    Thank you!
    Manuel

  • Tables used in MM Master Data Upload using LSMW or BDC?

    Hi,
    Can anyone provide me with the following information about uploading MM Master Data using LSMW or BDC :-
    1. Which tables are used for uploading Material Master, Vendor Master, Info Record, Open PO, Open PR, RFQ, and Open Contracts/Agreements?
    2. What problems are faced during data upload?
    3. What errors are encountered during upload?
    4. What is the difference between LSMW and BDC? Both can be used for data upload, so what are the differences between them?
    5. Is there anything else to remember/know during data upload?
    Thanks,
    Lucky

    Hi Lucky,
    Don't get angry at my response.
    Every question you have posted is already available in this forum;
    you just need to search for each individual topic.
    Whatever you have posted, you have definitely got solutions from others,
    and most of those answers were fetched from existing threads (60-70%);
    only a few solutions were written from scratch.
    So it is better to search the forum first, and only if you are not satisfied with the solutions, then post.
    It's just a friendly talk.
    And reward points for helpful answers.
    Thanks
    Naveen khan
