Creating reconciliation events from a flat file--a design question

Hello,
I am currently evaluating an existing OIM implementation to rebuild it using OIM 11g and have a question regarding the ideal method to create reconciliation events from a flat file.
The current implementation uses a web service call to process a flat file and create the reconciliation events. This runs every hour.
Although this looks neat, I don't think there is any need to go to that extent.
If OIM cannot consume the flat file directly, meaning if the data needs some massaging, I can always load the data from the flat file into an external table, write a PL/SQL procedure to transform it into a global temporary table, and create reconciliation events from there.
What would be the ideal method to load data from a flat file into OIM?
Thanks
Khanh

If it's a flat file, have you looked at the GTC option? And why any staging in between? OIM can read flat files just fine, either through the GTC or through your own recon code.
-Bikash
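
If you do end up writing your own recon code rather than using the GTC, the core of it is usually just a loop that parses each record and hands it to the reconciliation API. Below is a minimal sketch, assuming OIM 11g's ReconOperationsService, an illustrative resource object name ("FlatFileUser") and a made-up CSV layout; verify the exact method signatures against your OIM javadoc.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.HashMap;
    import java.util.Map;

    import oracle.iam.platform.Platform;
    import oracle.iam.reconciliation.api.ReconOperationsService;

    public class FlatFileRecon {

        // filePath and the CSV layout (userId,firstName,lastName,status) are assumptions
        public void createEvents(String filePath) throws Exception {
            // Look up the recon API from the OIM platform (running inside the server,
            // e.g. from a scheduled task)
            ReconOperationsService recon =
                    Platform.getService(ReconOperationsService.class);

            BufferedReader reader = new BufferedReader(new FileReader(filePath));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] f = line.split(",");

                    // Keys must match the recon field names defined on the resource object
                    Map<String, Object> data = new HashMap<String, Object>();
                    data.put("User ID", f[0]);
                    data.put("First Name", f[1]);
                    data.put("Last Name", f[2]);
                    data.put("Status", f[3]);

                    // One recon event per record; 'true' marks the event data as
                    // complete so the recon engine can process it immediately
                    recon.createReconciliationEvent("FlatFileUser", data, true);
                }
            } finally {
                reader.close();
            }
        }
    }

Any data massaging (date formats, status values and so on) can be done in the same loop, which is usually simpler than staging the data in an external table first.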

Similar Messages

  • Need to create sale order from the flat file & mail has to be sent

    Hi Experts,
    I have a requirement to create a sales order from a flat file, and once the order is created, a mail has to be sent to the customer as well as to an internal user with the order details. I want to know how this process can be implemented and what adapters are needed to execute it.
    It would be very helpful if I could get a step-by-step procedure.
    Points assured for any helpful answers.
    Thanks in Advance
    Jai

    HI Jai,
    You need to create two interfaces, as the file will be sending the sales order details that you need to capture in an IDoc or RFC, which then triggers the sales order creation. Standard BAPIs are also available for this.
    The RFC or BAPI will respond with the sales order details, which you then route to the mail adapter using BPM and an async-sync bridge.
    File ---> XI (BPM) ---> BAPI/RFC (Request)
    Mail Adapter <--- XI (BPM) <--- BAPI/RFC (Response)
    Refer to the links below for step-by-step guides:
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit - File to RFC
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1685 (original link is broken) - File to Mail
    /people/krishna.moorthyp/blog/2005/06/09/walkthrough-with-bpm - Walk through BPM
    /people/siva.maranani/blog/2005/05/22/schedule-your-bpm - Schedule BPM
    /people/sriram.vasudevan3/blog/2005/01/11/demonstrating-use-of-synchronous-asynchronous-bridge-to-integrate-synchronous-and-asynchronous-systems-using-ccbpm-in-sap-xi - Use of Synch - Asynch bridge in ccBPM
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1403 (original link is broken) - Use of Synch - Asynch bridge in ccBPM
    If you use IDocs, then in IDoc inbound processing you trigger the sales order creation, and then send the generated sales order IDoc outbound to the mail adapter:
    Flat file ---> XI ---> IDOC
    IDOC ---> XI ---> Mail
    This scenario is a bit easier to develop compared to using a BAPI with BPM.
    /people/ravikumar.allampallam/blog/2005/06/24/convert-any-flat-file-to-any-idoc-java-mapping - Any flat file to any Idoc
    Configure the IDOC-XI-Mail scenario using the following weblog:
    /people/michal.krawczyk2/blog/2005/03/07/mail-adapter-xi--how-to-implement-dynamic-mail-address
    /people/sravya.talanki2/blog/2005/08/18/triggering-e-mails-to-shared-folders-of-sap-is-u - Triggering Email from folder
    Thanks
    Swarup

  • Process chains from the flat file by using filezilla client version in  BI

    Hi experts,
    Please let me know how to create process chains for flat files transferred using the FileZilla client.
    So far I have not worked with FileZilla FTP. Can anybody give a detailed step-by-step procedure for finding the flat files, downloading them, and creating process chains from them?
    Thanks & Regards,
    Babu..

    Hi,
    Check these:
    Process chain configuration for Flat file loading
    http://wiki.sdn.sap.com/wiki/display/BI/Howtowriteroutinetofetchcurrentday%27sfilename
    Regards,
    Suman

  • Create Shopping cart (or PO) in SRM/EBP from a flat file

    Hi all
    I am looking for a function module, BAPI, or any program that I can use to create a shopping cart in SRM from a flat file.
    Is there a way to set a breakpoint in the ITS transactions that are used to create shopping carts through the portal? Can we debug the HTML templates?
    If anyone knows how to do any of those, please let me know.
    Your help is greatly appreciated.
    Thanks
    Sreenivas

    Hi,
    You would get a better response to your question from the Internet Transaction Server (ITS) forum.
    Regarding your question about the breakpoint: yes, it can be done.
    Regards
    Raja

  • BAPI_PO_CREATE1 not able to create PO's for multiple rows from the flat file

    Hi
    I am uploading POs from a flat file into SAP using BAPI_PO_CREATE1. Everything works fine if the flat file has only one record.
    If the flat file has more than one record, then while loading the second record the BAPI returns an error message. I am calling the BAPI in a loop.
    The strange thing is that if I load the second record individually, the program is able to create the PO. So only when I have multiple records in the flat file am I unable to load the POs into SAP. I debugged and checked all the internal tables passed to the BAPI. All seem to have the data correctly, but still the BAPI fails.
    Any idea where I am going wrong?
    The code looks something like this.
    LOOP AT HEADER_ITAB.
      PERFORM FILL_HEADER_RECORDS.
      LOOP AT ITEM_ITAB WHERE EBELN EQ HEADER_ITAB-EBELN.
        PERFORM FILL_ITEM_RECORDS.
      ENDLOOP.
      PERFORM CREATE_PO_VIA_BAPI.
    ENDLOOP.

    What is the error message? Are you trying something like this:
        LOOP AT T_DATA1.
          AT NEW LIFNR.
            READ TABLE T_DATA1 INDEX SY-TABIX.
            PERFORM INIT_TABLES.
            PERFORM FILL_DATA.
*       Call the BAPI to create PO
            PERFORM CREATE_PO.
          ENDAT.
        ENDLOOP.
    FORM CREATE_PO .
      CALL FUNCTION 'BAPI_PO_CREATE1'
        EXPORTING
          POHEADER                     = POHEADER
          POHEADERX                    = POHEADERX
*         POADDRVENDOR                 =
*         TESTRUN                      =
*         MEMORY_UNCOMPLETE            =
*         MEMORY_COMPLETE              =
*         POEXPIMPHEADER               =
*         POEXPIMPHEADERX              =
*         VERSIONS                     =
*         NO_MESSAGING                 =
*         NO_MESSAGE_REQ               =
*         NO_AUTHORITY                 =
*         NO_PRICE_FROM_PO             =
        IMPORTING
          EXPPURCHASEORDER             = EXPPURCHASEORDER
          EXPHEADER                    = EXPHEADER
          EXPPOEXPIMPHEADER            = EXPPOEXPIMPHEADER
        TABLES
          RETURN                       = RETURN
          POITEM                       = POITEM
          POITEMX                      = POITEMX
*         POADDRDELIVERY               =
          POSCHEDULE                   = POSCHEDULE
          POSCHEDULEX                  = POSCHEDULEX
          POACCOUNT                    = POACCOUNT
*         POACCOUNTPROFITSEGMENT       =
          POACCOUNTX                   = POACCOUNTX
*         POCONDHEADER                 =
*         POCONDHEADERX                =
          POCOND                       = POCOND
          POCONDX                      = POCONDX
*         POLIMITS                     =
*         POCONTRACTLIMITS             =
*         POSERVICES                   =
*         POSRVACCESSVALUES            =
*         POSERVICESTEXT               =
*         EXTENSIONIN                  =
*         EXTENSIONOUT                 =
*         POEXPIMPITEM                 =
*         POEXPIMPITEMX                =
*         POTEXTHEADER                 =
          POTEXTITEM                   = POTEXTITEM
*         ALLVERSIONS                  =
          POPARTNER                    = POPARTNER.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          WAIT   = 'X'
        IMPORTING
          RETURN = RETURN1.
      DATA: L_NAME TYPE LFA1-NAME1.
      CLEAR L_NAME.
      SELECT SINGLE NAME1
               FROM  LFA1
               INTO L_NAME
               WHERE LIFNR = POHEADER-VENDOR.
      LOOP AT RETURN.
        WRITE : / RETURN-TYPE,
                  RETURN-ID,
                  RETURN-MESSAGE.
        WRITE : '--> For vendor:',
                 POHEADER-VENDOR,
                 L_NAME.
      ENDLOOP.
    ENDFORM.                    " CREATE_PO

  • Custom code for Target Source Reconciliation from a flat file

    Hi Experts,
    I need help writing custom code for Target Source Reconciliation from a flat file into OIM. The flat file will contain account details for different application instances. I am working on 11gR2.
    Thanks,
    Subin
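
    A minimal sketch of the core loop for this, assuming a made-up CSV layout of resourceObject,accountId,login,status where the first column names the resource object for each application instance, and assuming 11gR2's ReconOperationsService (ignoreEvent is used only to skip records that already match what OIM holds; verify both method signatures against your javadoc):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.HashMap;
    import java.util.Map;

    import oracle.iam.platform.Platform;
    import oracle.iam.reconciliation.api.ReconOperationsService;

    public class FlatFileTargetRecon {

        // Assumed CSV layout: resourceObject,accountId,login,status
        public void run(String filePath) throws Exception {
            ReconOperationsService recon =
                    Platform.getService(ReconOperationsService.class);

            BufferedReader reader = new BufferedReader(new FileReader(filePath));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] f = line.split(",");
                    String resourceObject = f[0];

                    // Keys must match the recon field names defined on that resource object
                    Map<String, Object> data = new HashMap<String, Object>();
                    data.put("Account ID", f[1]);
                    data.put("Login", f[2]);
                    data.put("Status", f[3]);

                    // Skip unchanged accounts, raise an event for everything else
                    if (!recon.ignoreEvent(resourceObject, data)) {
                        recon.createReconciliationEvent(resourceObject, data, true);
                    }
                }
            } finally {
                reader.close();
            }
        }
    }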

    All right, all right, not so quickly.
    I am at the stage of trying to populate a one-dimensional array, but I am stuck in one place. This is the program:
    import java.io.*;
    public class FromFile {
        public static void main(String[] args) throws IOException {
            File inputFile = new File("mac.txt");
            FileReader in = new FileReader(inputFile);
            int c;
            for (int i = 0; i < 10; i++) {
                // read() returns the character code, or -1 at end of file
                c = in.read();
                System.out.println(c);
            }
            in.close();
        }
    }
    I am trying to read "1 2 3 4" from the text file.
    This is the result so far...
    49
    32
    50
    32
    51
    32
    52
    -1
    -1
    -1
    Well, I think I know what's wrong: I must convert the ASCII character codes into
    ints, but I don't know how to do it. A good book or tutorial on streams would
    come in handy. Could you correct it?
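
    For the conversion question at the end: rather than reading raw character codes with FileReader.read(), one simple option is to let java.util.Scanner tokenize the file and parse each token as an int. A small sketch, still assuming the same "mac.txt" containing "1 2 3 4":

    import java.io.File;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Scanner;

    public class FromFileInts {
        public static void main(String[] args) throws IOException {
            List<Integer> numbers = new ArrayList<Integer>();

            // Scanner splits the input on whitespace and parses each token as an int,
            // so "1 2 3 4" becomes the ints 1, 2, 3, 4
            Scanner scanner = new Scanner(new File("mac.txt"));
            while (scanner.hasNextInt()) {
                numbers.add(scanner.nextInt());
            }
            scanner.close();

            System.out.println(numbers); // prints [1, 2, 3, 4]
        }
    }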

  • What is the best way to load and convert data from a flat file?

    Hi,
    I want to load data from a flat file, convert dates, numbers and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
    The rows where all to_number, to_date and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises some exception) should go into table STG_ERR.
    What is the best and easiest way to achieve this?
    Thanks,
    Carsten.

    Hi,
    thanks for your answers so far!
    I gave them a thought and came up with two different alternatives:
    Alternative 1
    I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
    The columns of the staging table have the target format (date, number).
    The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
    Alternative 2
    The columns of the staging table are all of type varchar2 regardless of the target format.
    I define data rules for all columns that require a later conversion.
    I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
    The rows that cannot be loaded go automatically into the error table.
    When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
    What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
    Further, I would prefer using expressions in the mapping for converting the data.
    What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
    I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format, but that's another mapping and a lot of I/O.
    As far as I know, I need the Data Quality option for using data rules, is that true?
    Is there another alternative without any of these drawbacks?
    Otherwise I think I will go for alternative 1.
    Thanks,
    Carsten.

  • Data from a Flat file into a Cube

    Hi experts!
    Just a quick one: could we load data from a flat file directly into an InfoCube?
    Or would we need to create an ODS, load the flat file data there, and later load from the ODS into the cube?
    Thanks you very much for your time!!

    Hi,
    You can directly load the flat file data into the InfoCube. This should not be a problem.
    Create a flat file datasource according to the structure of the flat file. Create Transformations and DTP to the Cube.
    Load the flat file data into the PSA and then DTP the request into the cube.
    If the flat file load is a full, one-time load, and the data has to be deleted and reloaded on the next load, then the above approach is fine.
    But if your load is a daily delta load, then it would be appropriate to have a DSO in between in the data flow.
    Hope it helps.

  • Problem in the BDC program to upload the data from a flat file.

    Hi,
    I am required to write a BDC program to upload the data from a flat file. The conditions are as mentioned below:-
    1) A selection screen will be prompted to the user, who needs to provide: the file path on the presentation server (with F4 help for this obligatory parameter) and the field separator, e.g. @, #, $, %, etc. (fields in the file will be separated by this special character), or the fields may be tab-delimited.
    2) Finally after the data is uploaded, following messages need to be displayed:-
    a) Total Number of records successfully uploaded.
    b) Session Name
    c) Number of Sessions created.
    The problem is that when each record is fetched from the flat file, it needs to be split into individual fields at the delimiter (or at tabs, if the file is tab-delimited) before proceeding in the usual manner.
    It would be great if you could provide me with the logic, pseudocode, or sample code for this BDC program.
    Thanks,

    Here is an example program. If you require the delimiter to be a TAB, then enter TAB on the selection screen; if you require the delimiter to be a comma, slash, pipe, whatever, then simply enter that value. This example covers only the uploading of the file, not the BDC; I assume that you know what to do once you have the data in the internal table.
    REPORT zrich_0001.
    TYPES: BEGIN OF ttab,
            rec TYPE string,
           END OF ttab.
    TYPES: BEGIN OF tdat,
           fld1(10) TYPE c,
           fld2(10) TYPE c,
           fld3(10) TYPE c,
           fld4(10) TYPE c,
           END OF tdat.
    DATA: itab TYPE TABLE OF ttab.
    data: xtab like line of itab.
    DATA: idat TYPE TABLE OF tdat.
    data: xdat like line of idat.
    DATA: file_str TYPE string.
    DATA: delimitor TYPE string.
    PARAMETERS: p_file TYPE localfile.
    PARAMETERS: p_del(5) TYPE c.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      DATA: ifiletab TYPE filetable.
      DATA: xfiletab LIKE LINE OF ifiletab.
      DATA: rc TYPE i.
      CALL METHOD cl_gui_frontend_services=>file_open_dialog
        CHANGING
          file_table = ifiletab
          rc         = rc.
      READ TABLE ifiletab INTO xfiletab INDEX 1.
      IF sy-subrc = 0.
        p_file = xfiletab-filename.
      ENDIF.
    START-OF-SELECTION.
      TRANSLATE p_del TO UPPER CASE.
      CASE p_del.
        WHEN 'TAB'.
          delimitor = cl_abap_char_utilities=>horizontal_tab.
        WHEN others.
          delimitor = p_del.
      ENDCASE.
      file_str = p_file.
      CALL METHOD cl_gui_frontend_services=>gui_upload
        EXPORTING
          filename = file_str
        CHANGING
          data_tab = itab.
      LOOP AT itab into xtab.
        CLEAR xdat.
        SPLIT xtab-rec AT delimitor INTO xdat-fld1
                                         xdat-fld2
                                         xdat-fld3
                                         xdat-fld4.
        APPEND xdat to idat.
      ENDLOOP.
      LOOP AT idat into xdat.
        WRITE:/ xdat-fld1, xdat-fld2, xdat-fld3, xdat-fld4.
      ENDLOOP.
    Regards,
    Rich Heilman

  • Uploading Data from a Flat File

    Hi
    I am trying to upload data from a flat file to the MDS. I have a few questions on the process.
    a) XI would be the interface, and one end would be a file adapter with the flat file format. On the other end, which interface should I use - ABA Business Partner In or MDM Business Partner In? I do not understand the differences between them and where which one should be used.
    b) At the moment, I want to map the data to the standard object type Business Partner BUS1006. This also has a staging area already defined in the system. Can I view the data which is imported into the staging area? How so?
    c) The structure (or data) that I wish to upload has few fields and does not map to the BP structure easily. In such a scenario, does it make sense to (i) create a new object type, or (ii) create a new Business Partner type with the appropriate fields only?
    What I need to know is whether either of these options is feasible, and what the pros and cons of doing this are, in terms of effort, skillset and interaction with SAP Development.
    Kindly do reply if you have any answers to these questions.
    Regards,
    Gaurav

    Hi Markus,
    Thanks for your inputs.
    I have tried uploading some data from a flat file to the Business Partner object type BUS1006. I initially got some ABAP parsing errors on the MDM side, but after correcting those, I find my message triggers a short dump, with the method SET_OBJECTKEYS not finding any keys in the BP structure that has been created.
    Q1 - How do I get around this problem? Is it necessary for me to specify keys in the PartyID node of the interface ABABusinessPartnerIn, or is it something else?
    Q2 - How is this general process supposed to work? I would assume that for staging, I would get incomplete master data or data from flat files, which need not necessarily contain keys. The aim is to use the matching strategies in the CI to identify duplicates and consolidate them.
    Thanks in advance for your reply.
    Regards,
    Gaurav

  • Error while uploading data from a flat file

    Hi All,
    I am trying to load data from a flat file into an ODS. The flat file contains a field, say FLAT01, of numerical type (type Decimal, length 18, 2 decimals). I have created an InfoObject, say ZIO10, with type CURR and the currency field 0CURRENCY. In the transfer rules, I have assigned a constant for 0CURRENCY as the flat file doesn't have any currency field.
    The problem is that the flat file doesn't have any value in that field, FLAT01, for some records, in fact most of the records. When I try to load it into BW from the application server, it throws the error "Contents from field ZIO10 cannot be converted in type CURR ->longtext".
    I have debugged the transfer rules and found that the empty value is being represented as '#', and when it tries to copy that into the CURR type field, it throws the error.
    Can someone please let me know how to solve it? Why is it taking '#' for an empty value? Can't we change that?
    Any help would be highly appreciated.
    Best Regards,
    James.
    Message was edited by:
            James Helsinki

    Hi SS,
    I am sorry that I was not clear with my explanation. I have created ZIO10 as a keyfigure with type AMOUNT and the currency as 0CURRENCY.
    There is no currency coming in from the flat file so I used a constant in transfer rules to fill 0CURRENCY.
    The field which is coming as '#' is FLAT01 which is assigned to ZIO10.
    Best Regards,
    James.

  • Multiple idocs from single flat file

    Hi All
    I want to send data from a flat file to SAP(file to idoc)
    My flat file structure is
    id,name,number,city
    2,R1,234,SD
    2,R2,457,MD
    3,R4,789,HG
    3,R6,235,HG
    The field 'id' will change. After every change in 'id', a separate IDoc should be created.
    I have checked the following thread.
    Re: Content conversion for seperate idoc
    In the above thread, it is suggested to map the field with removeContext and use splitByValue (on value change), then do the mapping accordingly; that way you can create 3 IDocs.
    I'm confused about how to do these mappings.
    Please explain the mapping in detail.
    Please help
    Regards
    Reema

    If your source data type is like:
    MT_Source
      Record (0..unbounded)
        id (1)
        name (1)
        number (1)
        city (1)
    then in the sender file communication channel you have to specify the file content conversion parameters as:
    Document Name: MT_Source
    Recordset Structure: Record,*
    Choose + to add more parameters:
    Record.fieldSeparator: ,
    Record.fieldNames: id,name,number,city
    Record.endSeparator: 'nl'
    Then do the mapping as:
    id ----> removeContext ----> splitByValue (value change) ----> target IDoc
    Map the other fields according to your requirement.

  • Step by Step details on how to load data from a flat file

    Hi, can anyone explain how to load data from a flat file? Please give me step-by-step details. Thanks.

    Hi Sonam,
    It is very easy to load data from a flat file when compared with other extraction methods.
    Here are the steps to load transaction data from a flat file:
    Step 1: Create the flat file.
    Step 2: Log on to SAP BW (transaction code RSA1 or RSA13) and observe the flat file source system icon (the PC icon).
    Step 3: Create the required InfoObjects.
    3.1: Create an InfoArea (Modeling > InfoObjects (root node) > context menu > Create InfoArea).
    3.2: Create a characteristic/key figure InfoObject catalog (select the InfoArea > context menu > Create InfoObject Catalog).
    3.3: Create the characteristic and key figure InfoObjects according to your requirement and activate them.
    Step 4: Create an InfoSource for transaction data, create the transfer structure and maintain the communication structure.
    4.1: First create an application component (select InfoSources under Modeling > InfoSources (root node) > context menu > Create Application Component).
    4.2: Create the InfoSource for transaction data (select the application component > context menu > Create InfoSource), select "Flexible Update", give the InfoSource a name and continue.
    4.4 (important): Assign the DataSource (expand the application component > expand your InfoSource > context menu > Assign DataSource). Under Source System, browse and choose your flat file source system (the PC icon) and continue.
    4.5: Define the DataSource/transfer structure for the InfoSource: select the Transfer Structure tab and fill the InfoObject column with the necessary InfoObjects in the sequence of the flat file structure.
    4.6: Assign transfer rules: select the Transfer Rules tab and choose "Propose Transfer Rules" (the spindle-like icon). If the data target is an ODS, include 0RECORDMODE in the communication structure. Then activate.
    Step 5: Create the data target (InfoCube/ODS object).
    5.1: Create the InfoCube (select your InfoArea > context menu > Create InfoCube).
    5.2: Create the InfoCube structure: fill the structure pane with the required InfoObjects (select the InfoSource icon, find your InfoSource, double-click, and select "Yes" for InfoObject assignment). Maintain at least one time characteristic (e.g. 0CALDAY).
    5.3: Define and assign dimensions for your characteristics, then activate.
    Step 6: Create update rules for the InfoCube using the InfoSource (select your InfoCube > context menu > Create Update Rules, enter your InfoSource under DataSource, press Enter and activate the update rules). Now you can see the data flow.
    Step 7: Schedule/load the data.
    7.1: Create an InfoPackage (select InfoSources under Modeling > expand your application component > expand your InfoSource > select the DataSource assignment icon > context menu > Create InfoPackage). Give the InfoPackage a description, select your DataSource and continue.
    On the External Data tab, select Client Workstation or Application Server and give the file name, file type and data separator.
    On the Processing tab, select "PSA and then into Data Targets".
    On the Data Targets tab, select the data targets and tick the "Update Data Target" checkbox.
    On the Update tab, select "Full Update".
    On the Schedule tab, select "Start Data Load Immediately" and press the "Start" button.
    Step 8: Monitor the data: check the data in the PSA and check the data in the data targets.
    I hope this will help you.

  • Mapping Error: INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY when attempting to Upsert from a Flat File source to SFDC Account and Contact target

    I'm getting an INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY error when running a mapping that attempts to upsert data from a flat file to SFDC Account and Contacts. I also tried to split.
    The error message for Account:
    WRITER_2_*_1> WRT_8164 [2015-07-26 08:49:57.707] Error loading into target [Account] : Error received from salesforce.com. Fields []. Status code [INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY]. Message [Please enter a value for Country].
    WRITER_2_*_1> CMN_1053 [2015-07-26 08:49:57.707] : Rowdata: ( RowType=1(update) Src Rowid=1 Targ Rowid=1
    The upsert is based on the External ID: Account_External_ID__c
    The error message for Contact:
    WRITER_1_*_1> WRT_8164 [2015-07-26 08:49:55.305] Error loading into target [Contact] : Error received from salesforce.com. Fields []. Status code [INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY]. Message [Please enter a value for Country].
    WRITER_1_*_1> CMN_1053 [2015-07-26 08:49:55.305] : Rowdata: ( RowType=0(insert) Src Rowid=1 Targ Rowid=1
    The upsert is based on the External ID: Contact_External_ID__c
    Any ideas on how to proceed?

    Hello,
    Can you please help me understand the limitations of the free data loader? In this link - http://www.informaticacloud.com/editions-integration.html# - I see the below features listed:
    No-code, wizard driven cloud integration
    Multi-tenant SaaS solution
    Database and file connectivity
    Flexible scheduling
    Bulk API support (for Salesforce.com)
    Unlimited rows/day
    24 jobs/day
    1 Secure Agent
    Limited to 1 user
    Community support
    Cloud Data Masking
    Questions:
    1. When I view licenses in my free data loader, under Feature Licenses, it shows the license type for Salesforce Connectivity/Bulk API as "Trial". Can't I create a scheduled data synch task to upsert records in Salesforce using Bulk API mode?
    2. Is the email notification option (for success, warning and failure of a data synch task) available in the free version (and not as a trial)?
    3. I understand there is a limit of 24 jobs/day. But is there a limit on the number of scheduled data synch tasks that can be created?
    4. Data Masking is listed as a feature above for the free edition. However, when I view the licenses in my free data loader, Data Masking is shown as "Trial". Can you please clarify this?
    5. Is there a limit on the number of connections that can be created?
    Thanks
    Sanjay

  • BAPI to upload line items from a flat file to VA01

    Hi guys,
    I have a requirement wherein I need to upload data containing line items from a flat file to VA01. Please tell me how I should go about this.
    Thanks and regards,
    Frank.

    Hi
    Frank, this code might help you. BAPI_SALESDOCU_CREATEFROMDATA1 is the BAPI to create a sales document. If this is helpful to you in any way, please reward points.
    For any further queries, my mail ID is [email protected]
*   Include           YCL_CREATE_SALES_DOCU                            *
*&      Form  salesdocu
*       This subroutine is used to create a sales order
*  -->  P_HEADER            Document header data
*  -->  P_HEADERX           Checkbox for header data
*  -->  P_ITEM              Item data
*  -->  P_ITEMX             Item data checkboxes
*  -->  P_LT_SCHEDULES_IN   Schedule line data
*  -->  P_LT_SCHEDULES_INX  Checkbox schedule line data
*  -->  P_PARTNER           Document partner
*  <--  P_W_VBELN           Sales document number
    DATA:
      lfs_return like line of t_return.
    FORM create_sales_document changing P_HEADER  like fs_header
                                       P_HEADERX like fs_headerx
                                       Pt_ITEM   like t_item[]
                                       Pt_ITEMX  like t_itemx[]
                                       P_LT_SCHEDULES_IN  like t_schedules_in[]
                                       P_LT_SCHEDULES_INX like t_schedules_inx[]
                                       Pt_PARTNER  like t_partner[]
                                       P_w_vbeln  like w_vbeln.
*   This PERFORM is used to fill the required data for sales order creation
      perform sales_fill_data changing p_header
                                       p_headerx
                                       pt_item
                                       pt_itemx
                                       p_lt_schedules_in
                                       p_lt_schedules_inx
                                       pt_partner.
*   Function module to create the sales and distribution document
      perform sales_order_creation using p_header
                                         p_headerx
                                         pt_item
                                         pt_itemx
                                         p_lt_schedules_in
                                         p_lt_schedules_inx
                                         pt_partner.
      perform return_check using p_w_vbeln .
    ENDFORM.                                 " salesdocu
*&      Form  commit_work
*       To execute an external commit
    FORM commit_work .
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          WAIT = c_x.
    ENDFORM.                                 " Commit_work
*   Include ycl_sales_order_header: to fill header data and item data
    INCLUDE ycl_sales_order_header.
*&      Form  return_check
*       To validate the sales order creation
    FORM return_check using pr_vbeln type vbeln.
    if pr_vbeln is initial.
        LOOP AT t_return into lfs_return .
          WRITE / lfs_return-message.
          clear lfs_return.
        ENDLOOP.                             " Loop at return
      else.
        perform commit_work.                 " External Commit
        Refresh t_return.
        fs_disp-text = text-003.
        fs_disp-number = pr_vbeln.
        append fs_disp to it_disp.
      if p_del eq c_x or p_torder eq c_x or
        p_pgi eq c_x or p_bill eq c_x.
        perform delivery_creation.           " Delivery order creation
        endif.                               " If p_del eq 'X'......
      endif.                                 " If p_w_vbeln is initial
    ENDFORM.                                 " Return_check
    *&      Form  sales_order_creation
*       text
*  -->  P_P_HEADER            text
*  -->  P_P_HEADERX           text
*  -->  P_PT_ITEM             text
*  -->  P_PT_ITEMX            text
*  -->  P_P_LT_SCHEDULES_IN   text
*  -->  P_P_LT_SCHEDULES_INX  text
*  -->  P_PT_PARTNER          text
    FORM sales_order_creation  USING    P_P_HEADER like fs_header
                                        P_P_HEADERX like fs_headerx
                                        P_PT_ITEM like t_item[]
                                        P_PT_ITEMX like t_itemx[]
                                        P_P_LT_SCHEDULES_IN like t_schedules_in[]
                                        P_P_LT_SCHEDULES_INX like t_schedules_inx[]
                                        P_PT_PARTNER like t_partner[].
        CALL FUNCTION 'BAPI_SALESDOCU_CREATEFROMDATA1'
        EXPORTING
          sales_header_in     = p_p_header
          sales_header_inx    = p_p_headerx
        IMPORTING
          salesdocument_ex    = w_vbeln
        TABLES
          return              = t_return
          sales_items_in      = p_pt_item
          sales_items_inx     = p_pt_itemx
          sales_schedules_in  = p_p_lt_schedules_in
          sales_schedules_inx = p_p_lt_schedules_inx
          sales_partners      = p_pt_partner.
    ENDFORM.                    " sales_order_creation
