File transaction

Using the File transactions  
Posted: Sep 30, 2008 11:55 AM
Hi,
While creating a logical file and uploading it in the development system using transaction FILE, I get a pop-up to create a transport request, which I create.
Now the question is: if I need to create the logical file and upload the files to the application server in quality and production, do I need to create new entries in quality and production, or do I create them in development, giving the quality and production system paths in separate entries, and transport them in two separate requests to quality and production respectively?
I will certainly award points for the solution.
Thanks and Regards,
Thiru

Hi,
Just create it in the DEV server and assign the physical path to the logical file path, for example
Physical path:   F:\archivecomp\<SYSID>\<FILENAME>.archiving
and transport it to quality and production.
Regards,
Raju.

Similar Messages

  • FILE transaction - How to use.

    Hi,
    In the FILE transaction I need to set up a logical file name and then link it to a physical file. How do I access the physical file folder? For example, there is a physical file /usr/sap/trans/data/textfile... how do I display this?
    regards,
    Warren.

    Hi,
    Go to transaction FILE.
    Step 1) Create the syntax group, for example UNIX.
    Step 2) Assign the logical file to the logical path.
    Step 3) Assign the physical path (/usr/sap/trans/data/textfile) to the logical path (ZTEST_FILEPATH).
    Thanks
    PK
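
    To actually display the physical file (e.g. /usr/sap/trans/data/textfile from the question), you can browse it in transaction AL11, or read it in a small report. A minimal sketch, assuming a plain text file (the report name and variables are made up for illustration):

    REPORT zdisplay_textfile.   " hypothetical report name

    DATA: lv_file(128) TYPE c VALUE '/usr/sap/trans/data/textfile',
          lv_line(200) TYPE c.

    " Open the application server file and list it line by line
    OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      DO.
        READ DATASET lv_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        WRITE / lv_line.
      ENDDO.
      CLOSE DATASET lv_file.
    ENDIF.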

  • Problem in Flat file transaction data loading to BI7

    Hi Experts,
    I am new to BI 7. I have a problem activating the DataSource for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file only has price per unit and quantity sold.
    Flat file fields are:
    Cust id   SrepID  Mat ID  Price per unit  Unit measure  Quantity   TR. Date
    cu100    dr01      mat01         50                 CS              5     19991001       
    created info objects are
    IO_CUID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    When I created IO_QTY, the unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown and I was confused about which object I should map it to. At the same time, when I enter IO_QTY it prompts a dialog box with 0UNIT, so I said OK. Now I can't map CS to a unit, as the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level? Your solutions will be much appreciated with points.
    Awaiting your solutions
    Thanks
    Shrinu

    Hi Sunil,
    Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab in the creation of the DataSource. I have not reached the data target yet.
    Could you please answer my whole question?
    Will you be able to send some screen shots to my email id [email protected]
    Thanks
    Shri

  • Problem in loading of flat file transaction data into BI7

    Hi Experts,
    I am new to BI 7 and have a problem with the creation of the DataSource.
    I have fields in flat file are:
    CUSTID,SREPID,MATID,PRICE PER UNIT,UNIT,QUANTITY,TRANS.DATE
    one of the row under the above fields is
    C100,SR100,MAT01, 50,CS,5,19991005
    Created infoobjects are:
    IO_CUSID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    IO_QTY was created with Unit/currency '0UNIT'
    In the creation of the DataSource, under the Fields tab, the system showed an extra CS field, and I was confused about which InfoObject should be mapped to this field. At the same time, I created IO_QTY with 0UNIT, so when I entered IO_QTY under the InfoObjects column the system automatically created the 0UNIT InfoObject. I don't completely understand why one of the units (CS) appears under the Fields column. Could you clarify this for me, please?
    Could you also tell me the procedure for creating a transfer routine for sales revenue, as we don't have this field in the flat file? In an earlier version of BW (Addison-Wesley's step-by-step book) I did this at InfoSource level. Here I am loading the data without an InfoSource, so I would like to know whether it is possible to create a transfer routine at DataSource level, or whether I need to create an InfoSource in this case. If an InfoSource is needed, could you also explain the procedure to create the transfer routine for sales revenue there?
    Your solutions will be more appreciated with points.
    Thank you,
    Shri

    Hi,
    1. Open the Administrator Workbench: Modeling, from the menu or using transaction RSA1.
    2. Go to Source Systems (File) and create a new system.
    3. Double-click on DataSources and create a DataSource.
    4. Name the DataSource and choose Transaction Data as the DataSource data type.
    5. On the next screen, go to Extraction and activate the DataSource.
    6. Go to Preview and press Read Preview Data.
    7. The data will be loaded: save and activate the DataSource.
    8. Go to InfoProvider, go to an InfoArea and create an InfoCube.
    9. Name the InfoCube and press Create.
    10. Display all InfoObjects: choose your InfoObjects, move the characteristics and key figures to the InfoCube by drag and drop, and activate.
    11. Go to the InfoCube and create a transformation.
    12. Click on the 'Show Navigator' button. Match the fields in the navigator. Save and activate.
    13. Go to InfoProvider, your InfoCube, and create a data transfer process.
    14. Choose Extraction Mode: Full; Update: Valid records update, reporting possible (request green). Activate and execute.
    15. Double-click on the DataSource and press Read Preview Data.
    16. Create an InfoPackage by right-clicking on the DataSource. Execute the load into the PSA.
    17. Verify data was loaded into the PSA before proceeding with the next step.
    18. Execute the data transfer process.
    19. Monitor the results (menu: Goto: DTP Monitor).
    20. Try to display the data in the BW front end.
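
    For the sales revenue calculation asked about in the question, one option is a field routine on IO_REV in the transformation created in step 11. A minimal sketch of the routine body (the SOURCE_FIELDS component names are assumptions based on the flat file layout in the post):

        " Only this body is typed between the generated comment lines in
        " the routine editor. The source field names below are assumptions;
        " use the component names of your DataSource's SOURCE_FIELDS structure.
        DATA: lv_price TYPE p DECIMALS 2,
              lv_qty   TYPE p DECIMALS 3.

        lv_price = SOURCE_FIELDS-price.
        lv_qty   = SOURCE_FIELDS-quantity.

        " Sales revenue = price per unit * quantity sold
        RESULT = lv_price * lv_qty.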
    Thanks,
    Sankar M

  • Using the File transactions

    Hi,
    While creating a logical file and uploading it in the development system using transaction FILE, I get a pop-up to create a transport request, which I create.
    Now the question is: if I need to create the logical file and upload the files to the application server in quality and production, do I need to create new entries in quality and production, or do I create them in development, giving the quality and production system paths in separate entries, and transport them in two separate requests to quality and production respectively?
    I will certainly award points for the solution.
    Thanks and Regards,
    Thiru

    Hello,
    Normally you create these entries in the development system and transport them to quality and production.
    Remember: most of these entries are client-independent customizing.
    The whole point of these logical definitions is to have an abstraction between the file names used in ABAP programs and the real file names on the file system.
    For example: we have interface files that reside in the development system (T44) on a path named /usr/sap/T44/data/in/fi, and in the quality system (A44) on /usr/sap/A44/data/in/fi.
    So we defined a logical path ZFI_FI_IN that points to the physical path /usr/sap/<SYSID>/data/in/fi/<FILENAME>. That fits all systems, because <SYSID> is a parameter which is translated by the function module FILE_GET_NAME.
    There are some more parameters you may also use, like <CLIENT> or <DATE>.
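
    At runtime the program resolves the logical name with that function module, roughly like this (the logical file name below is only an example, not one from the post):

    DATA lv_physical_name(250) TYPE c.

    " Translate the logical file name maintained in transaction FILE into
    " the physical name valid for the current system (<SYSID> etc. are filled in).
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZFI_FI_IN_FILE'   " example logical file name
      IMPORTING
        file_name        = lv_physical_name
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.

    IF sy-subrc = 0.
      OPEN DATASET lv_physical_name FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      " ... process the file ...
      CLOSE DATASET lv_physical_name.
    ENDIF.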
    Regards Wolfgang

  • EBS BAI2 file - Transaction 475, check clearing - extra character appending

    Hello experts,
    We are using algorithm 13 for processing checks, which are processed using transaction 475 from the incoming bank file.
    In the BAI2 file format, the check number is contained in record 16 in the following format:
    16,475,58740,0,9180914733,67689/
    88,CHECK NO=0000000067689
    The problem we are having is: if the EBS program is run as an automated batch job, then SAP appends the '/' (that appears at the end of the line in record 16) to the check number that it uses to match the documents for clearing. Therefore, in the above example, our SAP system tries to find check number '67689/' in the check lots to match against the payment document. However, our check numbers are just 67689, so SAP is not able to find a match and posts the document in 'On Account' status.
    But if the same file is run manually in the Workstation upload mode, then there is no extra '/' appearing in the check number and SAP correctly clears the Cash Clearing account. I know there is nothing wrong in the file sent to us by the bank. Plus running it manually does not cause this issue.
    Has anyone else faced this problem before and if you can suggest on what possibly could be going wrong then it would be a great help for me.
    Thanks!

    I understand wanting to find the issue - not just apply a band aid.  And I understand the frustration of not being able to recreate an issue in a test system.  But at least a search string is only configuration and not ABAP code in a user exit....  And if it works, it will relieve your users from clearing the checks manually.
    If you do go with a search string, you should be able to isolate the last 5 digits by only mapping those.  For example, you'd set the search string to: 
    CHECK NO=00000000#####
    Then in the mapping, blank out all values except for the 5 # symbols - so the mapping would be 17 blank spaces followed by 5 #'s.  I've been able to successfully extract reference numbers for ACH deposit clearings this way - I don't see why it wouldn't also work in your situation with check numbers.
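    Visualised, the suggested search string and its target mapping would be (blanks shown here as underscores):
    Search string:  CHECK NO=00000000#####
    Mapping:        _________________#####   (17 blanks, then the 5 # symbols)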
    Regards,
    Shannon

  • MT940 file transactions appear in different years (2011-2012)

    Hi
    We have received a MT940 file with the value date and document date in different years.
    e.g.  :61: 1203013012....
    Here 12 means 2012, 0301 means value date = 03 January 2012, 3012 means document date = 30th December 2011
    However the posting is made in 2012 only for both value date and document date which is incorrect
    Value date = 03 Jan 2012
    Document date = 30 Dec 2011
    This is a compliance / SOX issue as the document date year posted is incorrect.
    There are only 2 characters used to specify the year in this case which is 2012 - there should be a facility to specify 2011 also
    I assume this scenario is quite common and must have occurred for other users also
    I would appreciate any feedback on this item
    Thanks in advance
    Rahul Kochhar

    Thanks Nico
    Actually the issue is with changing the file because this would be considered as a SOX violation and would need approvals.
    We have asked the users to use FF67 and do a manual posting line by line and modify the document date manually.
    Both options need manual intervention and workaround and are tedious in nature.
    I was under the impression that this would be a standard issue and faced by many people hence the post -  maybe SAP has a note that can be used?
    Please do let me know if you have any more suggestions
    Thanks
    Rahul

  • How to archive using SARA and the FILE transaction

    Hi,
    I am using ECC 6.0. I have a requirement to do archiving. Please suggest how to archive using FILE and SARA.
    Regards,
    Asif

    Hi Asif
    Please go through the following links..
    http://www.sap-basis-abap.com/sapta009.htm
    Re: archiving data through SARA tcode
    If you face any problem, search for notes on SARA; you will get enough information.
    Hope this helps.
    Thanks
    Anindya

  • Flat file transactional data load error

    I am trying to load a flat file into a BPC model. I am using the DM package 'Import transactional data load from flat file' to load the data. The transformation file and data file validate correctly.
    However, when I run the DM package, I hit the message below.
    Task Name : Load
    Cannot perform Read.
    Model : Package Status : Error.
    We have two secured dimensions in the model. I have tried different combinations, even with the primary admin role, and I still get the same error message.
    Is this a security-related error? The model has been transported from DEV. In DEV, I am not facing any errors.
    Any advise/ help?

    Hi King,
    I think in the back end you need to check the real-time load behavior option: is it in planning mode or loading mode? If possible, share a screenshot of your error message.
    Go to RSA1 ---> select your cube ---> right-click ---> Change Real-Time Load Behavior ---> change it to planning mode.
    Regards,
    Saida Reddy Gogireddy

  • Performance Optimizing with limitation file transactions

    Hi there,
    After installing an SAP NetWeaver Portal 6.0, I have worked on page compression to get better performance when loading the welcome (home) page. With the HttpWatch tool, I can see that more than 60 files are transported from server to client over the network until the home page is loaded. Is there a way to limit the number of these files (similar to the file transfer when loading the SDN home page, which is only around 20 files)?
    Thanks for info and help.
    Regards,
    Cengiz

    Yes,
    By using light framework page you can reduce the footprint, see http://help.sap.com/saphelp_nw70/helpdata/en/87/6d57c6fe824f3bbbcae725f4729bee/frameset.htm
    Note that there are some restrictions.
    Dagfinn

  • File Transactions - Can I use JTA or JCA

    Hi,
    I am a novice to JTA or JCA.
    My requirements are as follows: I have to read some information from an object and, based on that, append this information to a particular file. There are about 50 files that I have to deal with.
    Now, suppose an error occurs with one of the files; I then have to roll back all the file manipulations I have done. How do I do this?
    Is this kind of functionality possible with JTA? If not, what else can I use?
    Cheers,
    --- Chandan !!!!!!!!

    Hi,
    For this to work with JTA, you need to get an XAResource implementation for each file. This instance is either programmed by you, or by a JCA vendor for file systems (not sure if there are any).
    The idea would be to make the editing of the file actually work on a COPY of the file, created at opening time. If you do commit on the XAResource handle, then the copy replaces the original file. On rollback, you just delete the copy. However, you also need to have locks in order to handle concurrent edits.
    Guy

  • Use transaction FILE to store data from a cube into a file with Open Hub

    Hi people:
    I'm using BI 7.0. My requirement is to produce a flat file using the information from a virtual cube. The file name must contain the number of the month and the year. I know that this is possible through the FILE transaction.
    Can anybody give me a clue how this transaction is used? What are the steps to assemble the name of the file? Or is there any other option? I have defined the physical directory where the file must be placed.
    Any help will be great. Thanks in advance.

    Hi,
    pick up the code which you need from below.
    REPORT RSAN_WB_ROUTINE_TEMP_REPORT .
    TYPES: BEGIN OF y_source_fields ,
             /BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
             ZINT_HU__Z_WM_HU TYPE /BIC/OIZ_WM_HU ,
             CREATEDON TYPE /BI0/OICREATEDON ,
             ROUTE TYPE /BI0/OIROUTE ,
             PLANT TYPE /BI0/OIPLANT ,
             PLANT__0STREET TYPE /BI0/OISTREET ,
             PLANT__0CITY TYPE /BI0/OICITY ,
             PLANT__0REGION TYPE /BI0/OIREGION ,
             PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             /BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
             ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
             ZRECVPLNT__0CITY TYPE /BI0/OICITY ,
             ZRECVPLNT__0REGION TYPE /BI0/OIREGION ,
             ZRECVPLNT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             KYF_0001 TYPE /BI0/OIDLV_QTY ,
             ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
             REFER_DOC TYPE /BI0/OIREFER_DOC ,
           END OF y_source_fields .
    TYPES: yt_source_fields TYPE STANDARD TABLE OF y_source_fields .
    TYPES: BEGIN OF y_target_fields ,
             RECORDTYPE TYPE /BI0/OISTREET ,
             CONTAINER TYPE /BI0/OICITY ,
             /BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
             TRACKINGNUMBER TYPE /BIC/OIZ_WM_HU ,
             PO TYPE /BI0/OICITY ,
             STAGEDDATE TYPE /BI0/OICITY ,
             MOVEMENTTYPE TYPE /BI0/OICITY ,
             ROUTE TYPE /BI0/OIROUTE ,
             PLANT TYPE /BI0/OIPLANT ,
             PLANT__0STREET TYPE /BI0/OISTREET ,
             PLANT__0CITY TYPE /BI0/OICITY ,
             PLANT__0REGION TYPE /BI0/OIREGION ,
             PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             ORIGINCONTACTNAME TYPE /BI0/OISTREET ,
             ORIGINCONTACTPHONE TYPE /BI0/OISTREET ,
             /BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
             ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
             ZRECVPLNT__0CITY TYPE /BI0/OISTREET ,
             ZRECVPLNT__0REGION TYPE /BI0/OISTREET ,
             ZRECVPLNT__0POSTAL_CD TYPE /BI0/OISTREET ,
             DESTINATIONCONTACTNAME TYPE /BI0/OISTREET ,
             DESTINATIONCONTACTPHONE TYPE /BI0/OISTREET ,
             RCCCODE TYPE /BI0/OISTREET ,
             GLCORCLLICODE TYPE /BI0/OISTREET ,
             JFCCODE TYPE /BI0/OISTREET ,
             DESCRIPTIONOFWORK1 TYPE /BI0/OISTREET ,
             DESCRIPTIONOFWORK2 TYPE /BI0/OISTREET ,
             INSTRUCTIONS TYPE /BI0/OISTREET ,
             REQUESTEDSHIPDATE TYPE /BI0/OICITY ,
             ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
             REQUESTEDDELIVERYDATE TYPE /BI0/OICITY ,
             ATTSEORDER TYPE /BI0/OICITY ,
             CUBE TYPE /BI0/OISTREET ,
             WEIGHT TYPE /BI0/OISTREET ,
             PIECES TYPE /BI0/OIREFER_DOC ,
             REEL TYPE /BI0/OISTREET ,
             REELSIZE TYPE /BI0/OISTREET ,
             VENDORSKU TYPE /BI0/OISTREET ,
             ATTSESKU TYPE /BI0/OISTREET ,
             COMPANYNAME TYPE /BI0/OISTREET ,
             OEM TYPE /BI0/OISTREET ,
             REFER_DOC TYPE /BI0/OIREFER_DOC ,
             REFERENCENUMBER2 TYPE /BI0/OISTREET ,
             REFERENCENUMBER3 TYPE /BI0/OISTREET ,
             REFERENCENUMBER4 TYPE /BI0/OISTREET ,
           END OF y_target_fields .
    TYPES: yt_target_fields TYPE STANDARD TABLE OF y_target_fields .
    * Begin of type definitions -
    *TYPES: ...
    * End of type definitions -
    FORM compute_data_transformation
         USING     it_source TYPE yt_source_fields
                   ir_context TYPE REF TO if_rsan_rt_routine_context
         EXPORTING et_target TYPE yt_target_fields .
    * Begin of transformation code -
      DATA: ls_source TYPE y_source_fields,
            ls_target TYPE y_target_fields,
            var1(10),
            var2(10),
            year(4),
            month(2),
            day(2),
            date(10),
            it_workdays type table of /bic/pzworkdays,
            wa_workdays type /bic/pzworkdays,
            sto_date(10),
            V_tabix TYPE sy-tabix,
            Y_tabix TYPE sy-tabix,
            sto_var1(10),
            sto_year(4),
            sto_month(2),
            sto_day(2),
            sto_final_date(10),
            W_HEADER LIKE LS_TARGET-RECORDTYPE,
            W_HEADER1(12) TYPE C VALUE 'HEDR00000000',
            W_FOOTER LIKE W_HEADER VALUE 'TRLR0000',
            CNT(5),
            CMD(125) TYPE C.
    **********CODE FOR GENERATING CSV FILE PATH*******************
    data: OUTFILE_NAME(100) TYPE C,
          OUTFILE_NAME1(10) TYPE C VALUE '/sapmnt/',
          OUTFILE_NAME3(18) TYPE C VALUE '/qoutsap/ATTUVS',
          DATE1 LIKE SY-DATUM,
          DD(2) TYPE C,
          MM(2) TYPE C,
          YYYY(4) TYPE C.
    MOVE SY-DATUM+6(2) TO DD.
    MOVE SY-DATUM+4(2) TO MM.
    MOVE SY-DATUM(4) TO YYYY.
    CONCATENATE YYYY MM DD INTO DATE1.
    CONCATENATE OUTFILE_NAME1 SY-SYSID OUTFILE_NAME3 '.CSV' INTO
    OUTFILE_NAME.
    **********END OF CODE FOR GENERATING CSV FILE PATH*************
      OPEN DATASET OUTFILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    * Code for generating Header.
      CONCATENATE W_HEADER1  SY-DATUM SY-UZEIT INTO W_HEADER.
      APPEND W_HEADER TO ET_TARGET.
      TRANSFER W_HEADER TO OUTFILE_NAME.
      CLEAR W_HEADER.
    * End of code for generating Header.
    * Code for excluding the rows whose Quantity (PIECES) equals zero.
      LOOP AT it_source INTO ls_source where KYF_0001 NE '0'.
    * End of code for excluding the rows whose Quantity (PIECES) equals zero.
        MOVE-CORRESPONDING ls_source TO ls_target.
        ls_target-RECORDTYPE = 'PKUP'.
        ls_target-CONTAINER = ''.
        ls_target-TRACKINGNUMBER = ls_source-ZINT_HU__Z_WM_HU.
        ls_target-PO = ''.
    * Date Conversion for Staged Date.
        var1 = ls_source-CREATEDON.
        year = var1+0(4).
        month = var1+4(2).
        day = var1+6(2).
        CONCATENATE month '/' day '/' year INTO date.
    * End of Date Conversion for Staged Date.
        ls_target-STAGEDDATE = date.
        ls_target-MOVEMENTTYPE = 'P'.
        ls_target-ORIGINCONTACTNAME = ''.
        ls_target-ORIGINCONTACTPHONE = ''.
        ls_target-DESTINATIONCONTACTNAME = ''.
        ls_target-DESTINATIONCONTACTPHONE = ''.
        ls_target-RCCCODE = ''.
        ls_target-GLCORCLLICODE = ''.
        ls_target-JFCCODE = ''.
        ls_target-DESCRIPTIONOFWORK1 = ''.
        ls_target-DESCRIPTIONOFWORK2 = ''.
        ls_target-INSTRUCTIONS = ''.
        ls_target-REQUESTEDSHIPDATE = date.
    * Calculating STO Creation Date + 3 working days.
        select /BIC/ZWORKDAYS from /bic/pzworkdays into table it_workdays.
        loop at it_workdays into wa_workdays.
            if  wa_workdays-/bic/zworkdays = ls_source-CREATEDON.
                V_tabix = sy-tabix.
                Y_tabix = V_tabix + 3.
            endif.
            If sy-tabix = y_tabix.
                sto_date = wa_workdays-/bic/zworkdays.
            endif.
        Endloop.
        clear v_tabix.
        clear Y_tabix.
        sto_var1 = sto_date.
        sto_year = sto_var1+0(4).
        sto_month = sto_var1+4(2).
        sto_day = sto_var1+6(2).
        CONCATENATE sto_month '/' sto_day '/' sto_year INTO sto_final_date.
    * End of calculating STO Creation Date + 3 working days.
        ls_target-REQUESTEDDELIVERYDATE = sto_final_date.
        ls_target-ATTSEORDER = ''.
        ls_target-CUBE = ''.
        ls_target-PIECES = ls_source-KYF_0001.
        ls_target-REEL = ''.
        ls_target-REELSIZE = ''.
        ls_target-VENDORSKU = ''.
        ls_target-ATTSESKU = ''.
        ls_target-COMPANYNAME = 'AT&T'.
        ls_target-OEM = ''.
        ls_target-REFERENCENUMBER2 = '0'.
        ls_target-REFERENCENUMBER3 = '0'.
        ls_target-REFERENCENUMBER4 = '0'.
        APPEND ls_target TO et_target.
        TRANSFER ls_target TO OUTFILE_NAME.
        CNT = CNT + 1.
      ENDLOOP.
        CNT = CNT + 2.
    * Code for generating Footer.
      SHIFT CNT LEFT DELETING LEADING SPACE.
      CONCATENATE W_FOOTER CNT INTO W_HEADER.
      APPEND W_HEADER TO ET_TARGET.
    * End of code for generating Footer.
    * Code for file permissions
      TRANSFER W_HEADER TO OUTFILE_NAME.
      CLOSE DATASET OUTFILE_NAME.
      CONCATENATE 'chmod 644' OUTFILE_NAME INTO CMD SEPARATED BY SPACE.
      CALL 'SYSTEM' ID 'COMMAND' FIELD CMD.
    * End of code for file permissions
    * End of transformation code -
    ENDFORM.
    Hope it helps
    bhaskar
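
    Coming back to the original question about getting month and year into the file name: one option is to define the logical file name in transaction FILE with a physical name that contains the <PARAM_1> and <PARAM_2> placeholders (e.g. <FILENAME>_<PARAM_1>_<PARAM_2>.csv) and resolve it with FILE_GET_NAME. A minimal sketch (the logical file name Z_OPENHUB_FILE is an assumption):

    DATA lv_fname(250) TYPE c.

    " <PARAM_1>/<PARAM_2> in the physical name are replaced by the values
    " passed here, so the generated file name carries month and year.
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'Z_OPENHUB_FILE'
        parameter_1      = sy-datum+4(2)    " month
        parameter_2      = sy-datum(4)      " year
      IMPORTING
        file_name        = lv_fname
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.

    IF sy-subrc = 0.
      OPEN DATASET lv_fname FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
      " ... write the extracted records ...
      CLOSE DATASET lv_fname.
    ENDIF.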

  • File system transactions

    Is there a jdbc driver or a library of some sort that allows me to enlist file system operations in the same transaction with database operations? I have a weblogic 8.1 ejb application that generates some files to be ftped by clients, and I want the file generation to be part of the same database transaction, so if something fails in the database then I don't want to generate the file and vice-versa, if the file generation fails for whatever reason (such as out of space) I want to rollback the database transaction. I know there is an article about doing file transactions (http://www.onjava.com/lpt/a/1273) but I was thinking that since 2002 when this article was published someone has written a library that is suitable to use in a j2ee environment.
              Thank you,
              Costa

    It's not the only driver that can work with any DB. Any type 3 driver will do just the same as it basically needs another driver on the server end to actually connect to the DB (a type 3 driver is actually nothing more than a proxy, possibly including secure communication and other features).
    However, it will not be able to join tables from different databases, even of the same type (e.g. 2 SQL Servers). If the server doesn't do it itself the JDBC driver won't.
    Alin.

  • LSMW: Error concerning the logical path in Specify Files step

    Hi,
    I am trying on an ECC 6.0 EHP6 system to upload (open) POs from an existing SAP ERP system. Since I faced some complications in batch input recording method, I decided to go with the standard batch/direct input method.
    I selected object 0085 (Purchase Order) and method 0001 (Purchase Order). The program that is used is RM06EEI0. I followed all the usual steps, but in the "Specify Files" step I get the message:
    '****.lsmw.conv' does not exist; edit the logical path using transaction FILE
    Being aware of both the FILE and SF01 transactions, I created the logical path and file (through the FILE transaction). However, after the modification, I get the message:
    Logical file '****' is not assigned to physical file '****.lsmw.conv'
    There is also the related SAP Note 753511 (Logical and physical path and file name in transaction LSMW) that refers to this case.

    Hi,
    In the Specify Files step, click on the radio button 'Legacy Data: On the PC (Frontend)',
    enter the input file location, e.g. C:\mydocument\desktop\test.txt,
    and give a name in the Name field, e.g. mydocument.
    In the Delimiter section, select Tabulator.
    In the File Structure section, tick 'Field Order Matches Source Structure Definition'.
    File type: 'Record End Marker (Text File)'.
    Then press the Enter key and carry out the other steps as usual.
    Secondly, you said that you have used batch input recording:
    after the creation and recording are finished, scroll down in the recording, place the cursor on each unwanted field and remove the unwanted input fields.
    I hope your problem gets resolved. You can find step-by-step PDFs via a Google search.
    Thanks
    Sanjeet Kumar

  • How to create the logical file name

    Hi All,
    I have a requirement to post Inventory Management data. For this I need to execute the standard program RM07MMBL, which has a logical file name on the selection screen, but I have placed my input file on the application server.
    Can anyone tell me how to create the logical file name that refers to the physical path? I also tried t-code SF01 and table FILENAMECI and found that we need to add an entry to this table, but I really don't know how, since this table cannot be maintained in SM30 either.
    Help me, please.
    Thanks in advance!
    Regards,
    Ramkumar

    Hi Ram,
    Try using FILE transaction code...
    Follow these steps to create it:
    Double-click on Logical File Path Definition.
    Click on New Entries.
    Give a logical file path name such as Z_LOGICAL_PATH and save it.
    Now choose this path and double-click on Assignment of Physical Paths to Logical Path.
    Double-click on the OS name.
    Give a description and a physical path name taken from transaction AL11, and save it.
    Now double-click on Logical File Name Definition.
    Click on New Entries.
    Give a logical file name, e.g. Z_LOGICAL_FILENAME
    Physical file: test
    Data format: BIN
    Logical path: Z_LOGICAL_PATH
    Hope this helps
    Regards,
    Phani
    Message was edited by:
            Sivapuram Phani Kumar
