LSMW in ECC6.0 - Read Data Error

Dear SAP Gurus,
I am trying to create an LSMW object in ECC 6.0 for uploading G/L master data for a chart of accounts. I have got through all the steps up to "Read Data".
On executing "Read Data" I get this error:
"Loading from front end is not allowed for packed/hexadec. fields"
This comes up even though I have not set the "Hexadecimal Lth field"; nothing works, and the code page selected is ASCII (not IBM DOS).
Please Help!
Thanks

Sorry, it doesn't work. I am not in a position to understand why this error is coming up in the first place.
While specifying the file I am giving the parameters correctly; that is, I am not setting the code page to IBM DOS.
Parameters:
File Contents: Data for One Source Structure
Delimiter: Tabulator
File Structure:
     Field Names at Start of File
     AND
     File Type: Record End Marker (Text File)
Code Page: ASCII

Similar Messages

  • Meter reading data error... error message Details: CRM_IU_IC_BI013

    Hello all,
    After I confirm the BP and premise in Web UI 2007, I go to Meter Reading and get two messages. One is the error message Details: CRM_IU_IC_BI013.
    When I click on Details, I get the following information:
    Diagnosis
    An error (communication error) occurred during the remote call of function module ISU_METERREAD_GET_REMOTE in meter reading data processing.
    System Response
    The system cannot execute the action.
    Procedure
    This is probably only a temporary disruption. Restart your application later.
    If the problem is a permanent one, contact SAP.
    The second message is an information message saying that new meter reading results will be created.
    Any suggestions?
    Thanks,
    Raj.

    Hi,
    In most cases I'd expect an RFC or authorization issue here. To get the details, debugging would be needed to check which return code the function module provides. Before that, you could check the remote connection and whether the RFC user has sufficient access rights.
    KR
    Uwe

  • LSMW-step 9 read data

    Dear All,
    Kindly find the image below; I am not able to execute step 9 (Read Data) in LSMW.
    Regards
    Rajasekaran

    Hi Raja,
    Go to 'Maintain Source Fields' and check for the field whose type you have set to hexadecimal. Change it to C.
    Thanks & Regards,
    Ramagiri

  • LSMW Read Data Error

    Hi Experts,
    While specifying the path to upload legacy data in LSMW, I have given the LAN server path directly, to read the flat file from the specified network location.
    For example:
    beprod4\em\4_Testing_Phase\05_Object\Upload Files\Master Codes\Master Codes.txt
    An error message "File 'beprod4\em\4_Testing_Phase\05_Object\Upload Files\' does not exist or is currently locked" comes up while reading the data from the specified path.
    Can you advise me on possible reasons for the error and how to resolve it?
    Thanks,
    Kris.

    Hi,
    In the LSMW steps, after 'Specify Files', did you assign the file in the next step ('Assign Files')?
    Also check that the file exists in that location and is in the same format (separators etc.) as specified in the 'Specify Files' step.
    Regards
    Shiva

  • Must Read, Data Errors Caused by HW & SW

    Read the article even if you don't read the underlying PhD Thesis [you should]. This is why we have some problems in LR that are not Adobe's fault.
    http://preview.tinyurl.com/2dz9ry
    Mel

    Interesting article, but it's not news. File systems are what they are. As a Mac user I see file system errors, too. The simple fact is that storage capacities have grown faster than file systems have changed. Who would have thought terabyte drives would be so accessible? This was considered mainframe territory not too long ago. I left the PC side of the table, so I can't comment personally on NTFS's ability to handle larger drives. On the Mac side of things, the combination of Unix journaling and HFS+ keeps a fairly good handle on things. Oddly, this one file system is not on the list.
    But the key issue is vigilance.
    Few people check their drives until things get wobbly. Worse, they back up bad data since they have no clue the file system is getting a bit screwed. And, even worse, they back up to drives of questionable quality.
    My apologies to anyone who may be offended by that observation.
    My suggestions to anyone who can't afford to lose a byte of data: test your drives, internal and external, A LOT. Test them often, test them before backing up, test them after backing up, note the age of your drives and think "three years and they're gone". Maybe sooner. As professionals we have no time for downtime. Data recovery is frighteningly expensive and time consuming.
    A few hard facts: Smart Reporter is not all that smart; it tests hardware integrity, not data integrity. No OS monitors the integrity of an external drive. Not all backup software does a byte-by-byte comparison. If it works fast, it's streaming data, not cloning data.
    Bringing this back to LR. While it's easy to poke at the flaws and eccentricities of a 1.1 program, it does what it does. I have high hopes for this program.

  • LSMW:problem While reading Data

    Hi Experts,
    In the Maintain Source Fields step I specified one field called Name with length 180 and type CHAR. The problem is that while reading the data, only characters up to length 60 come through; the rest (from 60 to 180) are truncated and show as blank. What exactly can be the problem? I am not able to sort it out.
    Is there any limit to the length of a particular field that we can specify?
    Any pointers will be highly appreciated.
    Regards,
    Rahul

    If you are passing text of up to 180 characters but the target field is only 60 characters long, then it takes only 60 characters. Put a breakpoint on the text field in step 5 (Maintain Field Mapping and Conversion Rules) and check...

  • Lsmw error 'Specify Files first' while executing option 'Read Data'.

    Hi All,
    While trying to create mass users through LSMW, I am getting the error 'Specify Files first' while executing the option 'Read Data'.
    I have specified a file (.txt), manually created (with one tab between each term), and used the option 'On the PC (Frontend)'. There was no error while saving.
    Next, I executed 'Assign Files', where the file appeared automatically.
    Then, when I try to execute 'Read Data', I get the error 'Specify Files first'.
    Can anyone help in resolving this error?

    Hello Plaban,
    What is your purpose? Are you trying to upload the data with more than one file variant?
    There are two possibilities to make file names more variable:
    1. If you choose system-dependent file names by setting the corresponding flag (in the first step, 'Maintain Object Attributes'), you can define file names for each system in the 'Specify Files' step by assigning the files to a specific system ID (with a double-click on the specified file).
    2. You can use wildcards in the file names, to which you can assign several values.
    I hope this helps you.
    Regards,
    Blanca

  • LSMW Read Data behaving differently in production and in ECC

    Hi All,
    I have written an LSMW for CJ12 long text and it works fine in development, but in production it gives an error at the 13th step: BDC_INSERT, transaction code .. is invalid. When I tried to simulate the issue, I found a difference already in the Read Data step: 399 transactions are read in development, whereas only 267 are read in production for the same file. I debugged the display read program and observed that it uses GUI_UPLOAD to read the data from the text file; there the GUI_UPLOAD internal table already contains 399 records in development and 267 in production. I tried to work out why this happens but couldn't figure it out. Can anyone help me understand why? We are using ECC 6 with EHP4 in development and EHP3 in production, and production is to be upgraded to EHP4.

    If you are really certain about the source file (which I doubt), then there are only two remaining possibilities:
    a) Your LSMW object is different (transport it again from development to production; only then can you be sure that the LSMW object and the source are the same).
    b) A real SAP error because of the different release levels. Open a ticket with SAP to get help.

  • LSMW - Read Data not showing in Converted Data

    Hi,
    I am using LSMW to load data from a flat file (via recording). I have it working except for one issue. When I display the read data, all fields are shown, and no errors pop up when I select Convert Data, but when I select Display Converted Data the last few fields are missing. Any ideas why, or how to fix this?
    Any help is appreciated, as I do not know how to proceed.
    Thanks,
    Annette

    Hi Bush,
    Currently I am also facing the same problem. 'Display Read Data' shows all the field values and there is no error message in 'Convert Data', but only 2 fields are displayed in 'Display Converted Data'; no values are shown in the rest of the fields. Kindly suggest how you fixed this problem.

  • LSMW Read Data shortdump

    Hi guys,
    I'm currently testing out LSMW and trying to create a new material for transaction MM01.
    But when I click on step 9 (Read Data), I get a short dump.
    What happened?
        The current ABAP program had to be terminated because the
        ABAP processor detected an internal system error.
        The current ABAP program "/1CADMC/SAP_LSMW_READ_0000
         because the ABAP
        processor discovered an invalid system state.
        1:   SQL error
        2:   Invalid value in call
        3:   Screen number in header (field DNUM) and in ID
        4:   Internal error in the database interface
        8:   Memory filled (used up)
        16:  Buffer too small for data
        32:  Unknown table in call
        64:  Invalid selection
        128: Object with this key exists more than once
    Here are my source fields:
    Source Fields
           MM01SS                    MM01 Source Structure
               MATNR                          C(018)    Material
               MBRSH                          C(001)    Industry Sector
               MTART                          C(004)    Material Type
               MEINS                          C(003)    Base Unit Of Measure
               MAKTX                          C(040)    Material Description
    Below are links to the field mapping screenshots:
    Screen shot 1: http://img93.imageshack.us/img93/4043/1ju0.jpg
    Screen shot 2: http://img167.imageshack.us/img167/3240/2rn4.jpg
    The dummy.txt file contains:
    MATNR     MBRSH     MTART     MEINS     MAKTX
    AC26     I     aa     BAG     xxx1
    AC27     I     bb     BAG     xxx2
    AC28     I     aa     BAG     xxx3
    AC29     I     bb     BAG     xxx4
    AC30     I     aa     BAG     xxx5
    AC31     I     bb     BAG     xxx6
    AC32     I     aa     BAG     xxx7
    AC33     I     bb     BAG     xxx8
    AC34     I     aa     BAG     xxx9
    AC35     I     bb     BAG     xxx10
    AC36     I     aa     BAG     xxx11
    AC37     I     bb     BAG     xxx12
    AC38     I     aa     BAG     xxx13
    AC39     I     bb     BAG     xxx14
    AC40     I     aa     BAG     xxx15
    AC41     I     bb     BAG     xxx16
    AC42     I     aa     BAG     xxx17
    AC43     I     bb     BAG     xxx18
    AC44     I     aa     BAG     xxx19
    AC45     I     bb     BAG     xxx20
    AC46     I     aa     BAG     xxx21
    AC47     I     bb     BAG     xxx22
    AC48     I     aa     BAG     xxx23
    Did I leave out anything?

    Oh yes, one more thing: the file assignment part.
    Files
            Legacy Data          On the PC (Frontend)
                Test data                      C:\Documents and Settings\XXXXXXX\Desktop\dummy.txt
                                               Data for One Source Structure (Table)
                                               Separator Tabulator
                                               Field Names at Start of File
                                               With Record End Indicator (Text File)
                                               Code Page ASCII
            Legacy Data          On the R/3 server (application server)
            Imported Data        File for Imported Data (Application Server)
                Imported Data                  TEST_CREATE_MATERIAL.lsmw.read
            Converted Data       File for Converted Data (Application Server)
                Converted Data                 TEST_CREATE_MATERIAL.lsmw.conv
            Wildcard Value       Value for Wildcard '*' in File Name

  • LSMW / In Production, throws me out of SAP at READ DATA - 9th stage

    Hi experts,
    I am trying to load data via LSMW from a text file. My issue is that
    when I do this in production, it just crashes and throws me out of SAP at the "Read Data" stage.
    So, any clue as to why this happens, and how to fix it?
    Thank you.

    Hi Srinivas,
    Strange that this only occurs in production. Could you try switching on debugging and stepping forward until the error occurs? Perhaps that will give a clue as to what the problem might be.
    Regards,
    John.

  • Issue regarding LSMW, not showing all records after Read Data

    Dear Experts,
    Please help. I need to configure an LSMW for ROH material, so I made a recording and then created a structure accordingly.
    The problem is that when reading data from the .txt file, the Display Read Data step shows only 50 records; the rest are not displayed, but no error is shown either. So my question to the experts: is there any limit on reading data, and if so, how can it be raised?
    Please suggest.
    regards,
    sandy

    The Read Data step has a selection screen where you can define from which record to which record you want to read;
    make sure this selection is empty for a test.
    Further, check whether your source file really has more than 50 records.
    Then make sure that you really read this source file and not some other version (check the path and file name in step 7 of LSMW).

  • Error reading data from CLOB column into VARCHAR2 variable

    Hi all,
    I am hitting an issue retrieving data > 8K (minus 1) stored in a CLOB column into a VARCHAR2 variable in PL/SQL.
    The "problem to be solved" here is storing DDL, in this case a "CREATE VIEW" statement longer than 8K, for later retrieval (and execution) using dynamic SQL. Given that the EXECUTE IMMEDIATE statement can take a VARCHAR2 variable (up to 32K(-1)), this should suffice for our needs. However, somewhere in the process of converting this VARCHAR2 text to a CLOB for storage, then retrieving the CLOB and attempting to put it back into a VARCHAR2 variable, a standard ORA-06502 exception ("PL/SQL: numeric or value error") is thrown. Consider the following code:
    set serveroutput on
    drop table test1;
    create table test1(col1 CLOB);
    declare
      cursor c1 is select col1 from test1;
      myvar VARCHAR2(32000);
    begin
      myvar := '';
      for i in 1..8192 loop
        myvar := myvar || 'a';
      end loop;
      INSERT INTO test1 (col1) VALUES (myvar);
      for arec in c1 loop
        begin
          myvar := arec.col1;
          dbms_output.put_line('Read data of length ' || length(myvar));
        exception when others then
          dbms_output.put_line('Error reading data: ' || sqlerrm);
        end;
      end loop;
    end;
    /
    If you change the loop upper bound to 8191, all works fine. I'm guessing this might have something to do with the database character set: we've recently converted our databases to UTF-8 for internationalization support, and that seems to have changed some underlying assumptions regarding character processing.
    As far as the dynamic SQL issue goes, we can probably use the DBMS_SQL interface instead, with its EXECUTE procedure that takes a PL/SQL array of VARCHAR2(32K); the only issue there is reading the data from the CLOB column and then breaking that data into an array, but that doesn't seem insurmountable. Still, this same basic issue (when a 9K text block, say, turns into a >32K block after being CLOBbered) seems to come up in other text-processing situations as well, so any ideas on how to resolve it would be much appreciated.
    Thanks for any tips/hints/ideas...
    Jim

    For those curious about this, here's the word from Oracle support (courtesy of Metalink):
    RESEARCH
    ========
    Test the issue on different DB versions and with different character sets.
    -- Testing the following PL/SQL blocks using the direct assignment method (myvar := arec.col1;) on different database versions and character sets.
    SQL> create table test1(col1 CLOB);
    -- Insert four CLOB values into test1.
    declare
      myvar VARCHAR2(32767);
    begin
      myvar := RPAD('a',4000);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('a',8191);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('b',8192);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('c',32767);
      INSERT INTO test1 (col1) VALUES (myvar);
      commit;
    end;
    /
    -- Testing the direct assignment method.
    declare
      cursor c1 is select col1, length(col1) len1 from test1;
      myvar VARCHAR2(32767);
    begin
      for arec in c1 loop
        myvar := arec.col1;
        --DBMS_LOB.READ(arec.col1, arec.len1, 1, myvar);
        dbms_output.put_line('Read data of length: ' || length(myvar));
      end loop;
    end;
    /
    The following is a summary of the test results:
    ===================================
    1. If the database character set is WE8ISO8859P1, the above direct assignment method (myvar := arec.col1;) works on database versions 9i/10g/11g without any errors.
    2. If the database character set is UTF8 or AL32UTF8, the above direct assignment method (myvar := arec.col1;) generates "ORA-06502: PL/SQL: numeric or value error" when the length of the CLOB data is greater than 8191 (= 8K - 1). The same error can be reproduced across all database versions 9i/10g/11g.
    3. Using the DBMS_LOB.READ(arec.col1, arec.len1, 1, myvar) method to read the CLOB data into a VARCHAR2 variable works for both the WE8ISO8859P1 and UTF8 character sets and on all database versions.
    So, it seems as I'd surmised: UTF8 changes the way VARCHAR2 and CLOB data is handled. Not too surprising, I suppose; may you all be lucky enough to stay away from this sort of issue. But the DBMS_LOB.READ workaround is certainly sufficient for the text-processing situations we currently find ourselves in.
    Cheers,
    Jim C.
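    To illustrate the DBMS_LOB.READ workaround described above, here is a minimal sketch against the same test1 table, assuming each stored CLOB fits into a single 32767-byte VARCHAR2 buffer (multi-byte data or larger CLOBs would need a smaller chunk size and a read loop):
    -- run with serveroutput on to see the lengths printed
    declare
      cursor c1 is select col1 from test1;
      myvar  VARCHAR2(32767);
      amount INTEGER;
    begin
      for arec in c1 loop
        amount := 32767;  -- maximum characters to read; DBMS_LOB.READ returns the count actually read
        DBMS_LOB.READ(arec.col1, amount, 1, myvar);  -- read from offset 1 into the VARCHAR2 buffer
        dbms_output.put_line('Read data of length: ' || amount);
      end loop;
    end;
    /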

  • Error in Reading data from a xml file in ESB

    Hi,
    I created an inbound file adapter service which reads data from an XML file and passes it to a routing service, which in turn updates the database
    (everything created in JDeveloper).
    But I am getting an error: the data is not being updated in the database. When I check the database (select * from table) it shows one row selected, but I cannot find the data.
    I also did the transformation mapping.
    I think there may be an error in reading the data from the XML file, but I am not sure.
    Please reply as soon as possible; it is very urgent.

    Michael R wrote:
    The target table will be created when you execute the interface, if you set the option on the Flow tab as instructed in step #6 of the "Setting up ODI Constraint on CLIENT Datastore" section.
    Option: CREATE_TARG_TABLE, Value: true
    Hi Michael,
    That was not the answer I needed; I am sorry that I was unable to clarify my question. The project executed successfully with some warnings: the target table is automatically created in the database and also populated with data. But when I right-click the Target Datastore (in the Mapping tab of the interface) and then select Data to view the data inserted in the target table, the data is not shown by the View Data operation; instead I get an error.
    I am facing this error at the 10th step of
    "Creating a New ODI Interface to Perform XML File to RDBMS Table Transformation",
    where it says:
    Open the Interface tab. Select the Mapping tab, right-click Target Datastore - CLIENT, and then select Data. View the data inserted in the target table. Close the Data Editor. Close the tabs...
    In my case, when I use SQL Developer I can see the data successfully inserted in my target table and also in the error table (data that can't satisfy the constraint), but I was unable to verify this by following the 10th step above, and got this error.
    Thanks

  • Error while reading data through External Table!!!

    CREATE TABLE "COGNOS"."EXT_COGNOS_TBS9_TEST"
    (     "ITEM_DESC" VARCHAR2(200 BYTE),
    "EXT_CODE" VARCHAR2(20 BYTE),
    "RC_DATE" DATE,
    "RES_KD_AMNT" NUMBER(18,3),
    "RES_FC_AMNT" NUMBER(18,3),
    "NRES_KD_AMNT" NUMBER(18,3),
    "NRES_FC_AMNT" NUMBER(18,3),
    "TOTAL" NUMBER(18,3),
    "OF_WHICH_OVR1" NUMBER(18,3)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY "EXTDATADIR"
    ACCESS PARAMETERS
    ( RECORDS
    DELIMITED BY NEWLINE LOAD WHEN *({color:#ff0000}EXT_CODE LIKE 'TBS9%'{color})* FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL )
    LOCATION
    ( 'TBS9_TEST.CSV'
    External table creation went through successfully but am getting error while reading data. Am quite sure error is because of above line in red color. Could you please help me in transforming logic.
    Thanks in Advance,
    AP

    Let's start with the basics...
    1) You state that you are getting an error. What error do you get? Is it an Oracle error (i.e. ORA-xxxxx)? If so, please include the error number and the error message as well as the triggering statement. Or is the problem that rows are being written to the reject file and errors are being written to the log file? If so, which record(s) are being rejected, and what reasons are given in the log file? Or is the problem something else?
    2) You state that you are quite sure that the problem relates to the LOAD WHEN condition. What makes you quite sure of this?
    Justin
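    If the failing part does turn out to be the LOAD WHEN clause (as far as I know, the ORACLE_LOADER condition syntax only accepts = and != comparisons, not LIKE), one possible workaround is to drop the filter from the access parameters and apply it in plain SQL instead, for example through a view. A rough sketch only, using the hypothetical names EXT_COGNOS_TBS9_ALL and EXT_COGNOS_TBS9_TEST_V:
    -- Hypothetical unfiltered external table (same columns, directory and file as above)
    CREATE TABLE "COGNOS"."EXT_COGNOS_TBS9_ALL"
    (   "ITEM_DESC"     VARCHAR2(200 BYTE),
        "EXT_CODE"      VARCHAR2(20 BYTE),
        "RC_DATE"       DATE,
        "RES_KD_AMNT"   NUMBER(18,3),
        "RES_FC_AMNT"   NUMBER(18,3),
        "NRES_KD_AMNT"  NUMBER(18,3),
        "NRES_FC_AMNT"  NUMBER(18,3),
        "TOTAL"         NUMBER(18,3),
        "OF_WHICH_OVR1" NUMBER(18,3)
    )
    ORGANIZATION EXTERNAL
    (   TYPE ORACLE_LOADER
        DEFAULT DIRECTORY "EXTDATADIR"
        ACCESS PARAMETERS
        (   RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('TBS9_TEST.CSV')
    )
    REJECT LIMIT UNLIMITED;

    -- Apply the TBS9 prefix filter in SQL, where LIKE is available
    CREATE OR REPLACE VIEW "COGNOS"."EXT_COGNOS_TBS9_TEST_V" AS
    SELECT *
    FROM "COGNOS"."EXT_COGNOS_TBS9_ALL"
    WHERE "EXT_CODE" LIKE 'TBS9%';
    ORACLE_LOADER also writes a log file (and, if configured, a bad file) to the default directory, which should show the exact reason any records are rejected, along the lines Justin asks about.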

Maybe you are looking for

  • Process for adding a boolean option to the web service API

    Hey guys, Here's a little background: I'm currently working on adding an optional "strict" mode to some of the unmarshalling functions in SchemaMarshaller that will throw exceptions when receiving bad data for certain fields, and also improving the d

  • Acrobat v8 Standard Admin point install patches

    We have an application that will not work with Acrobat 9 so at this point I need to create an Admin point install for version 8 and would like to update the admin point to the most recent update for Acrobat 8.  Is there a chart I can use that list th

  • No main program for include..

    Hi, I have created an include, and when I do a syntax check I get the message "there is no main program for this include". What does that mean, and how do I overcome this issue? Only when this is resolved can further statements be checked. Thanks

  • When to use the NEVER transaction attribute in EJB

    Hi all, can anyone tell me when to use NEVER? Please tell me the scenario in which we would use it. Thank you

  • Can't read image file

    Dear all, I had Forms version 10.1.2.0.2 and was using WebUtil to upload an image and to download it. My problem is that sometimes the image files are uploaded and sometimes not => I got "Can't read image file D:\...\image.jpg". I upgraded to forms 1