FLAT FILE ISSUE IN LSMW

Hi,
I am migrating the customer master in LSMW using BUS0050.
I am using two target structures: BKNA1 (general customer master record) and BKNVK (customer master contact person).
BKNA1-> NAME1,SORTL,STRAS,ORT01,PSTLZ,LAND1,SPRAS,STCEG
|
|_____BKNVK -> NAME1, ABTNR, NAMEV
I have created two source structures which have a hierarchical relationship.
ZCUSTHEADER
|
|______ZCUSTCONTACT
I have mapped all source fields from these structures to the target structures (BKNA1 and BKNVK).
My dilemma is that I am not using a flat source structure but a hierarchical one, and a given customer can have more than one contact.
I am not sure in what format I should request the flat files from the legacy system.
If I use two different files, 1) one containing the customer header data for all customers and 2) another containing all customers' contact information, then the problem is: how will LSMW know which contact records belong to which customer header record?
Or do I need to use just one file? What should the structure of the flat file look like in that case?

Hi Rajesh
Please go to the LSMW option <b>Maintain Source Fields</b>.
Define a field 'IDENTIFIER' at the top of the field list of each source structure and double-click on it.
In the <b>Identifying Field Content</b> field, enter 'HEADER' for the ZCUSTHEADER structure and 'LINE' for the second structure, ZCUSTCONTACT.
Now, when you request the legacy data, the lines containing the contents of the first structure should begin with HEADER, and the lines for the second structure should begin with LINE.
Hope this helps.
For example:
HEADER         20050801       AB             
LINE           21             0001001549     
LINE           50             2310030        
HEADER         20050901       AB             
LINE           21             0001001549     
LINE           50             2310030
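For your structures, a single file could then look like the following sketch, where HEADER lines carry the ZCUSTHEADER fields (NAME1, SORTL, STRAS, ORT01, PSTLZ, LAND1, SPRAS, STCEG) and LINE lines carry the ZCUSTCONTACT fields (NAME1, ABTNR, NAMEV). The values are purely illustrative:
HEADER   ACME Corp    ACME   Main Street 1   Berlin   10115     DE   EN   DE123456789
LINE     Smith        0001   John
LINE     Meyer        0002   Anna
HEADER   Beta Ltd     BETA   High Street 5   London   WC1A2AB   GB   EN   GB123456789
LINE     Jones        0001   Mary
The contact lines that follow a HEADER line belong to that customer, which is how LSMW resolves the hierarchy from a single file.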

Similar Messages

  • I need a sample flat file for working lsmw

    Hi experts,
    Please send me a sample flat file that you have used for LSMW or BDC. I am new to ABAP and need it for training purposes. I have only worked with LSMW for converting 3 fields and have to work more on it.

    Hi,
    z368     0001     sai     i     in     en
    z345     0001     rao     r     in     en
    This is a tab-delimited flat file for uploading vendor master data (XK01).
    Go to Specify Files, give the file name and path, and specify tab as the delimiter.
    LSMW: use a recording for practice.
    We use LSMW to upload large amounts of data. Much of the logic is provided by the system, and error handling is also very easy.
    LSMW supports the batch input (session) method and direct input; it does not support the call transaction method. BAPIs and IDocs can also be used.
    We need to follow some steps. For the session method, the steps are:
    1. Maintain Object Attributes: specify which method you are using, then select and execute it. There are 4 options:
    *Standard Batch/Direct Input: use predefined objects provided by SAP.
    *Batch Input Recording: record a particular transaction (give a name -> select Overview -> provide a recording name -> do the recording -> select Default All -> save -> F3).
    *BAPI
    *IDoc
    Then save and go back.
    2. Maintain Source Structures: maintain a structure, which is essentially an internal table -> save -> back.
    3. Maintain Source Fields: maintain the fields; you can create them or copy them from the repository -> save -> back.
    4. Maintain Structure Relations: here the source structure is assigned to the recording, if a recording is used -> save -> back.
    5. Maintain Field Mapping and Conversion Rules: here the fields are mapped; use Extras -> Auto Field Mapping, and check Extras -> Unassigned Fields for anything left over -> save -> back.
    6. Maintain Fixed Values, Translations, User-Defined Routines: here you can specify fixed values and user-defined subroutines (see the sketch right after this list) -> save -> back.
    7. Specify Files: give the file path, name and delimiter details -> save -> back.
    8. Assign Files: assign your file; it is usually assigned automatically -> save -> back.
    9. Read Data: execute, and the records are read.
    10. Display Read Data: shows the records that were read.
    11. Convert Data: execute -> F3 (the records are converted).
    12. Display Converted Data: execute -> F3.
    13. Create Batch Input Session: creates the session -> execute -> F3.
    14. Run Batch Input Session: this leads to SM35 -> Process -> check the log for errors.
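    As an illustration of step 6, a user-defined rule in the field mapping is just a few lines of ABAP. A minimal sketch, assuming a source field ZVENDOR-LAND1 and a target field LFA1-LAND1 (your structure and field names will differ):
    " Hypothetical conversion rule: normalise the legacy country key to the SAP format.
    IF zvendor-land1 IS INITIAL.
      lfa1-land1 = 'IN'.                      " assumed default country
    ELSE.
      TRANSLATE zvendor-land1 TO UPPER CASE.  " 'in' from the file becomes 'IN'
      lfa1-land1 = zvendor-land1.
    ENDIF.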
    Also go through these links.
    Steps for LSMW (detailed documents): http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
    http://sapabap.iespana.es/sapabap/manuales/pdf/lsmw.pdf
    http://www.sap-img.com/sap-data-migration.htm
    https://websmp207.sap-ag.de/lsmw
    http://www.sapbrain.com/TOOLS/LSMW/SAP_LSMW_steps_introduction.html
    http://esnips.com/doc/8e732760-5548-44cc-a0bb-5982c9424f17/lsmw_sp.ppt
    http://esnips.com/doc/f55fef40-fb82-4e89-9000-88316699c323/Data-Transfer-Using-LSMW.zip
    http://esnips.com/doc/1cd73c19-4263-42a4-9d6f-ac5487b0ebcb/LSMW-with-Idocs.ppt
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI.ppt
    http://esnips.com/doc/7582d072-6663-4388-803b-4b2b94d7f85e/LSMW.pdf

  • Flat file issue in extract program

    Hi All
    I have developed an extract program which downloads data from SAP and dumps it into a flat file in the desired format.
    The flat file contains various rows.
    In one row, I want to keep 80 characters blank at the end.
    Ex:
    HDR2345 c 20060125 0r42z1005.5USD
    Now, after "USD", I want to keep 80 characters as blank spaces.
    If I code it as v_record+33(80) = ' '. it does not work.
    Please suggest something.
    Regards
    Pavan

    Hi Pavan,
    Please try this code:
    DATA: L_SPACE TYPE X VALUE '20'.   " X'20' represents the space character
    MOVE L_SPACE TO V_RECORD+33(80).
    or
    MOVE L_SPACE TO V_RECORD+79(80).
    This should work.
    Regards,
    Ferry Lianto
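    If the problem is that trailing blanks are lost when the record is written to the application server file, another option is to pass an explicit length in the TRANSFER statement so the blanks are preserved. A minimal sketch, assuming the record is a fixed-length character field of 113 characters (33 data characters plus 80 blanks) written with OPEN DATASET; the file name is illustrative only:
    DATA: v_record(113) TYPE c,
          v_file TYPE string VALUE '/tmp/extract.txt'.
    OPEN DATASET v_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    " Positions 34 to 113 of v_record stay blank; LENGTH writes them out anyway.
    TRANSFER v_record TO v_file LENGTH 113.
    CLOSE DATASET v_file.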

  • Issue in calling webdynpro for loading flat file in IP

    Hello,
    We have implemented the How-To paper on loading flat files in IP, and we are facing an issue with it. The generated Web Dynpro runs on the ABAP application server, but we need to call it from a web template which runs on the portal server.
    These two servers are different, and we do not want to hard-code the URL in the web template because the server names for the Web Dynpro will change in the Q and P systems. Can anyone advise how we can read the ABAP application server details in WAD, perhaps using JavaScript, and then generate the URL for the Web Dynpro dynamically?
    Regards,
    Deepti

    Hi Deepti,
    We are also facing the same challenge. Could you please post the solution for changing the URL path dynamically?
    I have created the ABAP function module based on the below link.
    http://wiki.sdn.sap.com/wiki/display/Snippets/BICustomWebItemtoLaunchanABAPWeb+Dynpro
    But I am not sure how to call the function module using a custom web item and JavaScript. Could you please explain the steps for that?
    Thanks,
    Gopi R
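    For reference, the function module described in the wiki page above essentially only needs to return the absolute URL of the Web Dynpro application on the current ABAP application server. A minimal sketch, where the function module name and the application name ZFILE_UPLOAD are made up for illustration and EV_URL is a STRING exporting parameter:
    FUNCTION z_get_wd_upload_url.
      " Builds the absolute URL on the current server, so nothing is hard-coded in WAD.
      cl_wd_utilities=>construct_wd_url(
        EXPORTING application_name = 'ZFILE_UPLOAD'
        IMPORTING out_absolute_url  = ev_url ).
    ENDFUNCTION.
    The web template can then call this function module (for example via the custom web item from the link) and redirect to the returned URL with JavaScript.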

  • Issue in lsmw ; error in read file step

    hi,
    I'm facing a really strange issue in LSMW.
    I have created one header structure and 3 sub-structures.
    Up to that point everything works fine,
    but when I create a 4th sub-structure, the Read Data step gives the following error:
    "Generation aborted. Reason: No fields with equal names"
    If I delete that extra structure again, it works fine.
    I really don't understand what the problem is. Is there any restriction on the number of structures?
    I need your help, guys.
    Thanks in advance
    shekhar

    In step 13, Create Batch Input Session, select the Keep Batch Input Folder checkbox and go to step 14, Run Batch Input Session.
    Select your session name and click Process. A window appears; select Display Errors Only. The session then runs without you having to press Enter on every screen; if there is an error, it stops and displays the error screen.
    Mohan

  • Regarding Upload of Goods Issue  data from flat file

    Hi,
    I am working on an interface for goods issue.
    The following fields are provided in the flat file: BLDAT, BUDAT, BWART, WERKS and LGORT.
    We also have to maintain the internal order number in SAP, which is not provided in the flat file; how do we go about that?
    I also have a doubt: since item-level data is not provided in the flat file, how will that data be captured?
    Should I go for BAPI_GOODSMVT_CREATE or write a BDC?

    Hi,
    You can only do a goods issue against a delivery. I don't understand how you can issue goods using a flat file; you will have to check with your functional consultant.
    regards,
    Mahesh
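    If BAPI_GOODSMVT_CREATE is the chosen route, the flat file fields map roughly as in the sketch below. This is only a sketch under several assumptions that need to be verified: GM code '03' for a goods issue, a movement type such as 261 for an issue to an internal order, the ORDERID item field for the order number, and the item data (material, quantity) being determined outside the file, since the file does not carry it.
    DATA: ls_header TYPE bapi2017_gm_head_01,
          ls_code   TYPE bapi2017_gm_code,
          ls_item   TYPE bapi2017_gm_item_create,
          lt_items  TYPE TABLE OF bapi2017_gm_item_create,
          lt_return TYPE TABLE OF bapiret2.
    " These variables are assumed to be filled from the flat file or from lookups in SAP.
    DATA: lv_bldat TYPE sy-datum, lv_budat TYPE sy-datum, lv_bwart TYPE bwart,
          lv_werks TYPE werks_d,  lv_lgort TYPE lgort_d,  lv_matnr TYPE matnr,
          lv_menge TYPE menge_d,  lv_aufnr TYPE aufnr.
    ls_header-doc_date   = lv_bldat.      " BLDAT from the flat file
    ls_header-pstng_date = lv_budat.      " BUDAT from the flat file
    ls_code-gm_code      = '03'.          " goods issue (assumed GM code)
    ls_item-material     = lv_matnr.      " item data determined in SAP, not in the file
    ls_item-plant        = lv_werks.      " WERKS from the flat file
    ls_item-stge_loc     = lv_lgort.      " LGORT from the flat file
    ls_item-move_type    = lv_bwart.      " BWART from the flat file, e.g. '261'
    ls_item-entry_qnt    = lv_menge.
    ls_item-orderid      = lv_aufnr.      " internal order number maintained in SAP
    APPEND ls_item TO lt_items.
    CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
      EXPORTING
        goodsmvt_header = ls_header
        goodsmvt_code   = ls_code
      TABLES
        goodsmvt_item   = lt_items
        return          = lt_return.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
    Check lt_return for error messages before committing in a real program; a BDC on the goods issue transaction remains the fallback if the BAPI does not cover every required field.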

  • Flat file truncation issue

    I am attempting to perform a fairly standard operation, extract a table to a flat file.
    I have set all the schemas, models and interfaces, and the file is produced how I want it, apart from one thing. In the source, one field is 100 characters long, and in the output, it needs to be 25.
    I have set the destination model to have a column physical and logical length of 25.
    Looking at the documentation presented at http://docs.oracle.com/cd/E25054_01/integrate.1111/e12644/files.htm - this suggests that setting the file driver up to truncate fields should solve the issue.
    However, building a new file driver using the string 'jdbc:snps:dbfile?TRUNC_FIXED_STRINGS=TRUE&TRUNC_DEL_STRINGS=TRUE' does not appear to truncate the output.
    I noticed a discrepancy in the documentation: the page above says 'Truncates strings to the field size for fixed files', while the help tooltip in ODI says 'Truncates the strings from the fixed files to the field size', which might explain the observed lack of truncation.
    My question is - what is the way to enforce field sizes in a flat file output?
    I could truncate the fields separately in each of the mappings using substr, but that seems counter-intuitive and loses the benefit of the tool.
    Using ODI Version:
    Standalone Edition Version 11.1.1 - Build ODI_11.1.1.5.0_GENERIC_110422.1001

    bump
    If this is an elementary issue, please let me know what I've missed in the manual.

  • Issue with flat file loading timing out

    Hello
    I have a scenario where I am loading a flat file with 65k records into a cube. The problem is that in the loading process I have to look up the 0MATERIAL table, which has a million records.
    I do have an internal table in the program, where I select a subset of the material table, around 20k to 30k records. But my extraction process takes more than 1.5 hours and fails (timed out).
    How can I address this issue? I tried building indexes on the material table and it is not helping.
    Thanks,
    Srinivas.

    Unfortunately, this is BW 3.5, so there is no end routine option here. And I tried both the .csv and the plain text file methods, and both are creating problems for us.
    This is the complete code; do you see any potential issues?
    Start routine (main code):
    DATA: wa_datapak TYPE transfer_structure.
    REFRESH: i_oldmats, zi_matl.
    " Collect all the old material numbers from the flat file into internal table i_oldmats.
    LOOP AT datapak INTO wa_datapak.
      i_oldmats-/bic/zzoldmatl = wa_datapak-/bic/zzoldmatl.
      COLLECT i_oldmats.
    ENDLOOP.
    SORT i_oldmats.
    " ZI_MATL only has records where old materials exist; this gets about 300k of the 1M records.
    SELECT /bic/zzoldmatl material
           FROM /bi0/pmaterial
           INTO zi_matl
           FOR ALL ENTRIES IN i_oldmats
           WHERE /bic/zzoldmatl = i_oldmats-/bic/zzoldmatl.
      COLLECT zi_matl.
    ENDSELECT.
    SORT zi_matl.
    Transfer rule routine (main code):
    IF tran_structure-material = 'NA'.
      READ TABLE zi_matl INTO zw_matl
           WITH KEY /bic/zzoldmatl = tran_structure-/bic/zzoldmatl
           BINARY SEARCH.
      IF sy-subrc = 0.
        result = zw_matl-material.
      ENDIF.
    ELSE.
      result = tran_structure-material.
    ENDIF.
    Regards,
    Srinivas.
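    A common rework of this start routine, sketched under the assumption that I_OLDMATS and ZI_MATL behave as standard internal tables, is to replace the SELECT ... ENDSELECT loop with a single array fetch and to skip the database call when the driver table is empty:
    SORT i_oldmats BY /bic/zzoldmatl.
    DELETE ADJACENT DUPLICATES FROM i_oldmats COMPARING /bic/zzoldmatl.
    IF i_oldmats[] IS NOT INITIAL.
      SELECT /bic/zzoldmatl material
             FROM /bi0/pmaterial
             INTO TABLE zi_matl
             FOR ALL ENTRIES IN i_oldmats
             WHERE /bic/zzoldmatl = i_oldmats-/bic/zzoldmatl.
      " Keep ZI_MATL sorted so the BINARY SEARCH in the transfer rule still works.
      SORT zi_matl BY /bic/zzoldmatl.
    ENDIF.
    Whether this alone brings the runtime below the timeout depends on the indexes on /BI0/PMATERIAL, but it removes the row-by-row COLLECT processing inside the database loop.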

  • XML to Flat File design issue

    Hi,
    I am a newbie to SSIS, but I was able to create an SSIS package which extracts XML data from one of the SQL Server columns using an Execute SQL Task and passes it to a Foreach Loop container. The container holds an XML Task that applies a transform to each input XML and appends it to a flat file (additionally using a Script Task within the Foreach container).
    All good so far, but now I want to apply a conditional split to the output of the Execute SQL Task above and additionally pipe it into an exactly similar additional process (a Foreach container running the XML Task as described above) for a different kind of flat file.
    If I alter the design to use the data flow approach and apply the out-of-the-box Conditional Split, I run into not knowing how to connect and execute more than one Foreach container with the embedded XML and Script Tasks (data flow to control flow connection).
    It is easy to put everything in a Sequence container and repeat the Execute SQL Task, but to me that is very inefficient.
    Any creative ideas or pointers to some internet content that explains how this can be done most efficiently?
    I hope my question makes sense; let me know if you need more clarification.
    Thanks in advance.
    SM

    As I understand it, you are asking for a way to create conditional branches that do a different type of processing for each subset of data. You can do it like this:
    1. Add a set of tasks for the additional processing of each type of flat file and link them to the Execute SQL Task.
    2. Set the precedence constraint evaluation operation to Expression and Constraint for each of them: make the constraint OnSuccess and base the expression on your condition. You may need to create SSIS variables to capture the field values used in the expression.
    Visakh

  • Flat  File to InfoCube Load Issue

    Hello All,
    I am a newbie in the SAP BW world. This is probably a simple question for the BW experts.
    This is a crude implementation of the Fu and Fu example.
    Environment: SAP BW 7.0
    I am trying to load a flat file into an InfoCube.
    1. I am able to successfully preview the data from the DataSource.
    2. I am also able to load the data into the PSA, which I can see in the PSA maintenance screen.
    The issue arises when I try to load the data into the InfoCube by running the DTP process; it simply does not load the data.
    I am unable to figure out what the issue is.
    The following is the structure of the flat file, with one line of data:
    Day      Unit of Measure     Customer ID     Material Number     Sales Representative ID     Sales Office     Sales Region     Price of Material     Sales quantity     Sales revenue     Currency
    20100510     gms     1122334455     92138     9710274     Chicago     NorthEast     890     50     928     USD
    Following is the structure of the Info Cube
    Characteristics
    Time
    Unit
    Customer ID
    Material Number
    Sales Representative ID
    Sales Office
    Sales Region
    Measures
    Price of Material     
    Sales quantity     
    Sales revenue
    --- Also, where can I find a data flow diagram or a list of the steps needed to design a standard cube-loading process?
    Please tell me where I am going wrong.

    I think I made progress, but I still do not see the data in the InfoCube.
    When I go to the PSA (source system -> Manage PSA -> select the line item -> PSA Maintenance), I see the complete record, like this:
    1     1     05/10/2010     GMS     1,122,334,455     92,138     9,710,274     CHICAGO     NORTHEAST     890     50     928
    But when I run the DTP process I see the following error message. I think this is due to a data type mismatch between the flat file and the InfoCube.
    Please advise.
    The error I see is:
    No SID found for value 'GMS ' of characteristic 0UNIT (message class BRAIN, number 70)
    Also, where can I click or extract the InfoCube structure in plain text so that I can post it on the forum for better visibility?
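    The error itself means that 'GMS' is not a known unit key for 0UNIT. If the file has to keep using such external unit codes, one option (a sketch; it assumes the code is maintained as a commercial unit code in customizing) is to convert it to the SAP-internal unit in a routine using the standard unit conversion exit:
    DATA: lv_ext(3) TYPE c VALUE 'GMS',   " external unit code from the file
          lv_unit   TYPE meins.           " SAP-internal unit of measure
    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = lv_ext
        language       = sy-langu
      IMPORTING
        output         = lv_unit
      EXCEPTIONS
        unit_not_found = 1
        OTHERS         = 2.
    IF sy-subrc <> 0.
      " Unit code not maintained (transaction CUNI / table T006A); handle the error here.
    ENDIF.
    The simpler fix, of course, is to put a valid unit key (for example G for grams) directly into the flat file.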

  • JCA flat file read issue

    Hi,
    We need to read a flat file, transform it to the destination XML format, and then send it to the destination file location.
    Steps we have done:
    1. Create the JCA adapter and configure the flat file schema
    2. Create a proxy service based on the JCA
    3. Create a transformation from the source schema to the target schema (which has no namespace)
    4. Create a business service for sending the output
    Everything works as expected when testing from the OSB test console. But when the file is placed in the source folder, the output XML has the namespace xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" on the root node.
    e.g.
    <Root xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" >
    <Child>
    <Child1/>
    <Child2/>
    <Child3/>
    </Child>
    </Root>
    But the expected output is:
    <Root>
    <Child>
    <Child1/>
    <Child2/>
    <Child3/>
    </Child>
    </Root>
    We tried converting the XML to a string and then replacing the xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" value with a blank.
    We also tried a hard-coded XML using an Assign action instead of the transformation; even the hard-coded XML ends up with the namespace on the root node.
    But the end system is failing because of this namespace value.
    Please help me resolve the issue.
    Thanks,
    Vinoth

    Ideally your end system should not fail if you declare any number of namespace prefixes in the XML, unless they are actually used in elements or attributes within the XML. They should try to resolve this on their side.
    But to see what is going on in OSB, can you please log the $body variable just before the Publish action and paste the content here? Or, if possible, send the sbconfig of the proxy and business service to me at the email address mentioned in my profile.

  • In LSMW flat file date format to be converted to user profile date setting.

    hi all,
    I have a flat file in which the date is in mm/dd/yyyy format. I converted the data using LSMW, but this conversion is only valid if the user has set the date format to mm/dd/yyyy in the user profile settings (Own Data). If the user has some other setting, it will give an error. So how should I convert the date and do the mapping so that it works for any user setting? Please help.

    Sunil,
    Use the function module below to format a date according to the current user's date format setting:
    DATA: idate TYPE sy-datum,
          tdat8 TYPE string.
    CALL FUNCTION 'DATUMSAUFBEREITUNG'
      EXPORTING
        idate = idate
      IMPORTING
        tdat8 = tdat8.
    Amit.
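    If the requirement is the other way round, turning the mm/dd/yyyy value from the flat file into the internal YYYYMMDD format regardless of the user's date setting, a conversion rule in the LSMW field mapping can simply reorder the parts. A sketch (the literal value stands in for the source field of your LSMW structure):
    DATA: lv_in(10)  TYPE c VALUE '01/25/2006',   " value as delivered in the file
          lv_date(8) TYPE c.
    " Reorder mm/dd/yyyy into yyyymmdd, the internal format expected by date fields.
    CONCATENATE lv_in+6(4) lv_in+0(2) lv_in+3(2) INTO lv_date.
    " lv_date now holds '20060125' and can be assigned to the target date field.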

  • Flat file load issue-Urgent

    Hello
    I'm trying to load a master data InfoObject from a flat file and getting errors.
    The InfoObject ZCOLLECT_DATE has two fields: ZQUARTER and ZCOLLECTDATE.
    ZCOLLECT_DATE is of data type NUMC, length 5, with conversion routine PERI5 and output length 6.
    I created the flat file in Excel the following way:
    I left the first row blank.
    Second column: 1/2008 (for ZQUARTER)
    Third column: 01/11/2008 (for ZCOLLECTDATE)
    I saved it as a CSV (Save As -> CSV).
    In the InfoPackage, I selected this file on the External Data tab, with file type CSV, data separator ';' and escape sign '"'.
    When I try to load the InfoObject, I get the following error:
    "Error in conversion exit CONVERSION_EXIT_PERI5_INPUT"
    I thought this was because the length in my InfoObject is 5, whereas ZQUARTER is of length 6 in my flat file (e.g. 1/2008) and ZCOLLECTDATE is in mm/dd/yyyy format.
    Any thoughts on what the issue is?
    Thanks
    Alex

    Hi Hemant,
    I changed it to 12008, 22008, etc., but I still get the error "Error in conversion exit CONVERSION_EXIT_PERI5_INPUT".
    Also, when I save the .xls file as CSV, I get two pop-ups saying some features are not compatible and only the compatible features will be saved. Are these normal warnings, or could they be causing the issue?
    Also, since the InfoObject only consists of dates, do I need to specify conversion routine PERI5 in the InfoObject maintenance at all?
    Thanks
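    One way to narrow this down is to call the conversion exit directly in a small test program (or test the function module in SE37) and see which external values it accepts. A sketch; the sample value and the output length are assumptions to check against your system:
    DATA: lv_ext(7) TYPE c VALUE '1.2008',   " candidate external format
          lv_int(5) TYPE n.                  " internal PERI5 value
    CALL FUNCTION 'CONVERSION_EXIT_PERI5_INPUT'
      EXPORTING
        input  = lv_ext
      IMPORTING
        output = lv_int.
    WRITE: / lv_int.
    Whichever format the exit accepts without an error is the format the flat file column has to use.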

  • Issue importing flat files into SQL server 2008R2

    I have CSV files where the last column may not always have data. SSIS (version 2008 R2) inserts the next row's data into the empty cells, which causes my package to fail.
    I know this is resolved in 2012, but we are still using 2008 R2. Any suggestions?
    Thanks.

    Hi sqlBI2014,
    After testing the issue that the Flat File Connection Manager cannot handle a file with an uneven number of columns in each row in my SQL Server 2008 R2 environment, I can reproduce it.
    Based on my research, the issue is caused by the column delimiter getting first preference over the row delimiter, so the next row's data is inserted into the empty column cells. This is by design in SQL Server 2005, SQL Server 2008 and SQL Server 2008 R2.
    The good news is that the issue is fixed in SQL Server Data Tools, which comes with SQL Server 2012. In SQL Server 2012, by default, the Flat File Connection Manager always checks for a row delimiter in unquoted data and starts a new row when a row delimiter is found. This enables the connection manager to correctly parse files with rows that are missing column fields.
    If you still want to work around the issue in SQL Server 2008 R2, there is a sample component posted on CodePlex that might help you:
    http://ssisdfs.codeplex.com/
    References:
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2013/03/13/flat-file-source-cannot-handle-file-with-uneven-number-of-columns-in-each-row.aspx
    https://connect.microsoft.com/SQLServer/feedback/details/293193/ssis-import-of-flat-file-with-uneven-number-of-columns
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Ssis issue - load recent added 5 flat files

    Hi,
    I have a query: I have 5 flat files in one folder which I loaded into a SQL Server table today. Tomorrow another 5 flat files will be added to the same folder, and I want to load only the recently added 5 flat files.
    Can anyone give me an answer?

    You need to have a loop, and inside it a step that checks the file modified date to pick up the recent files. Even better, if the file name has a timestamp part, use that to determine the latest files.
    See an example here:
    http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html
    Visakh
