Are LSMW runs copied with a client export/import?

At present we are migrating our SAP system landscape from release 4.6C to mySAP ERP 2005. Before migrating the quality assurance system (FQ2), we imported a client export of the production client (system FP2, client 150) into client 151 there, using the SAP_ALL profile. Shortly afterwards, the upgrade to mySAP ERP 2005 was performed. The business departments now report that the LSMW runs defined in the FQ2 system differ from those in the FP2 system (production), or rather still reflect the state before the client import. It looks as if the runs, or the definitions behind them, were not copied at all. Are LSMW runs copied along with a client export/import? How can we verify that?

Hi,
Are you running the report for all selections (all the data)?
Try executing the report with a data selection and check.
Is the data in the InfoCube compressed?
Regards
KP

Similar Messages

  • SYSTEM Copy with Data (SAP Export/Import way)

    Hello,
    There is a requirement at my client site to build an SAP system copy from the production system without copying data, but
    it should have the programs/structures/repository/tables & views etc.
    We have thought of building the SAP system with the export/import method, then deleting the client from the copied system
    and after that running a remote client copy from the source to the target system with the SAP_CUST profile.
    But I have heard that with the SAP export/import method the data copy can be skipped so that only the structure is copied. If there is any way
    of this kind, please help me by letting me know.
    Thanks
    Deepak Gosain

    Hi Deepak,
    Kindly refer to the SCN link on the difference between the client copy export/import and the remote copy method:
    Difference between remote client copy and client Import/Export - Remote client copy steps
    BR
    SS

  • Materialized View with "error in exporting/importing data"

    My system is 10g R2 on AIX (dev). When I impdp a dump from another box, also 10g R2, the dump log file contains an error about the materialized view:
    ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    ORA-02354: error in exporting/importing data
    Desc mv_emphora
    Name              Null?    Type
    C_RID                      ROWID
    P_RID                      ROWID
    T$CWOC            NOT NULL CHAR(6)
    T$EMNO            NOT NULL CHAR(6)
    T$NAMA            NOT NULL CHAR(35)
    T$EDTE            NOT NULL DATE
    T$PERI                     NUMBER
    T$QUAN                     NUMBER
    T$YEAR                     NUMBER
    T$RGDT                     DATE
    As I checked here and on Metalink, the information I found has little to do with the MV. What was the cause?

    The log has 25074 lines in total, so I used grep from the OS to pull out the lines involving the MV. Here they are:
    grep -n -i "TTPPPC235201" impBaanFull.log
    5220:ORA-39153: Table "BAANDB"."TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5845:ORA-39153: Table "BAANDB"."MLOG$_TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent meta data will be skipped due to table_exists_action of truncate
    8503:. . imported "BAANDB"."TTPPPC235201"                     36.22 MB  107912 rows
    8910:. . imported "BAANDB"."MLOG$_TTPPPC235201"               413.0 KB    6848 rows
    grep -n -i "TTCCOM001201" impBaanFull.log
    4018:ORA-39153: Table "BAANDB"."TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5844:ORA-39153: Table "BAANDB"."MLOG$_TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    9129:. . imported "BAANDB"."MLOG$_TTCCOM001201"               9.718 KB      38 rows
    9136:. . imported "BAANDB"."TTCCOM001201"                     85.91 KB     239 rows
    grep -n -i "MV_EMPHORA" impBaanFull.log
    8469:ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    8558:ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    8560:ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"
    25066:ORA-31684: Object type MATERIALIZED_VIEW:"BAANDB"."MV_EMPHORA" already exists
    25072: BEGIN dbms_refresh.make('"BAANDB"."MV_EMPHORA"',list=>null,next_date=>null,interval=>null,implicit_destroy=>TRUE,lax=>FALSE,job=>44,rollback_seg=>NULL,push_deferred_rpc=>TRUE,refresh_after_errors=>FALSE,purge_option => 1,parallelism => 0,heap_size => 0);
    25073:dbms_refresh.add(name=>'"BAANDB"."MV_EMPHORA"',list=>'"BAANDB"."MV_EMPHORA"',siteid=>0,export_db=>'BAAN'); END;
    The number in front of each line is the line number in the import log.
    Here is my import syntax:
    impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE
    Yes, I can create the MV manually, and I have no problem refreshing it manually after the import.
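    ORA-12081 indicates that Data Pump tried to run DML against the materialized view's container table, which does not allow direct updates. Since the MV can be rebuilt manually anyway, one workaround (a sketch based on the command above; most shells need the double quotes escaped, or put the parameters in a PARFILE) is to exclude the MV from the import and refresh it afterwards:
    impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE EXCLUDE=MATERIALIZED_VIEW EXCLUDE=TABLE:"IN ('MV_EMPHORA')"
    -- afterwards, rebuild the snapshot contents on the target
    EXEC DBMS_MVIEW.REFRESH('BAANDB.MV_EMPHORA', 'C');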

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing a strange issue during export/import from a memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in an SAP workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how can we do export/import in module pool program to Workflow method?
    Regards,
    Sanjana

    Hi Sanjana,
    Please check the link. Here you can find some examples also.
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    P.S
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    You just need to create a key and an ID to save data into the INDX table. Once it is saved, you can use the same key and ID to import it.
    Again, don't forget to delete the record using the same key and ID once you have imported it, otherwise it may give an error.
    Regards,
    Sandeep Katoch
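    For reference: EXPORT ... TO MEMORY ID is only visible within the exporting program's internal session, which is why the IMPORT in the workflow method (a different session) fails with sy-subrc = 4, while the INDX cluster table from the links above works across sessions. A minimal sketch of that approach (the area 'zz' and the key value are placeholders):
    DATA: gv_value TYPE string,
          gv_key   TYPE indx-srtfd VALUE 'ZWF_DEMO_KEY'.
    " Module pool program: write the value to the INDX database cluster.
    EXPORT value = gv_value TO DATABASE indx(zz) ID gv_key.
    " Workflow method: read it back with the same area and key.
    IMPORT value = gv_value FROM DATABASE indx(zz) ID gv_key.
    IF sy-subrc = 0.
      " Delete after a successful import so later runs do not read stale data.
      DELETE FROM DATABASE indx(zz) ID gv_key.
    ENDIF.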

  • Copy Folder with Joins; Export/Import Folder with Joins in the EUL

    OK, I've got a custom folder which has been made up by dragging items from 2 or 3 other folders into it.
    It then has quite a few joins of its own.
    When trying to create a workbook from it, it takes 9 minutes to run a query.
    I need to work out what is slowing it down. If I create the same workbook against the folder which has the majority of the items in the custom folder, it runs instantly.
    So I suspect it is one of the joins causing it.
    My plan was to duplicate the folder, then remove joins until I find out which one is causing it.
    However, if I cut and paste the folder, I get a copy without the joins.
    If I export the folder and import it, I get a copy without the joins.
    Question then: how can I get a copy of a folder WITH the joins?
    I'm slightly concerned that when I export my EUL from the dev database and import it into the live database I'm not going to get any joins, since the export and import into the dev database is not retaining the joins.
    Anyone ?

    Hi,
    The preferences for Disco Plus are set in the pref.txt file on the apps server, and for Disco Desktop in the Windows registry. I think the defaults are set to on, so unless you have changed them this is unlikely to help.
    I think I read somewhere that the 11g optimiser will remove unused outer joins where there is a foreign key constraint. I may have made that last bit up, as I cannot find a reference to it, but it may be worth exploring.
    To speed things up you could look at why this join is slowing things down. It could be that you need an index on the join column.
    The join actually is used, in that it has to check that a record exists in the other table. This is why Discoverer cannot remove the join from the complex folder query: if it did, and there were no matching records in the other table, you would get a different result.
    Rod West

  • Working fine when running with Eclipse; when exported, it doesn't

    I have made a little MP3 player with the help of the JLayer 1.0 library. It runs perfectly when I run it from Eclipse. But when I export it to a .jar file, something strange happens.
    The program starts OK. Then I click the add song button, a popup appears and asks me to choose a file. I click the choose
    file button, it opens a JFileChooser, I choose a file, everything is OK so far. Now I click OK so that the song will be added to the program, but nothing happens. When running under Eclipse it would add the song to the JTree and so on; now it does absolutely nothing: the box won't disappear, no error message, nothing. I wanted to check whether it was some kind of bug in the button, but it does execute some basic statements at the beginning; when it has to create
    the new data structure, it never comes back. I tested this by printing text whenever the program completed a statement, and found the statement that makes the program idle. The program doesn't freeze, it just won't do anything, and it won't even finish the method that is supposed to run when OK is clicked.
    Any suggestions? I'm totally clueless.

    Yes, you probably don't have the application correctly set up to run outside of Eclipse. How to do that is different for each IDE. I am aware of the following tutorial for NetBeans, and (probably) Eclipse has something like that. You need to ask at the Eclipse website for that, since these forums are for the Java language, not Eclipse support.
    http://www.netbeans.org/kb/articles/javase-deploy.html
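    One quick check worth doing (a sketch; "mp3player.jar" is a placeholder for the exported file name): run the jar from a console instead of double-clicking it, so any exception the GUI swallows is printed to the terminal:
    java -jar mp3player.jar
    If a required library is missing from the jar's classpath (for example, JLayer not being bundled or referenced in the manifest's Class-Path), it typically shows up here as a NoClassDefFoundError at exactly the point where the program goes idle.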

  • VBA and RFC - How to deal with tables in export/import parameters

    Hi,
    maybe one of you can support me...
    I have a couple of VBA modules reading and writing data to SAP by means of function modules. As usual, they use export and import parameters as well as tables (in the tables section). They work very well.
    Now I want to execute some newer function modules, and they don't use tables in the tables section (because that is considered obsolete nowadays); instead, they expect the tables as export resp. import parameters. I have tried it in a couple of ways, without any success.
    Has anybody tried this before? And if so, do you mind sharing some snippets?
    Best regards, Thomas

    Hello Thomas,
    I think we discussed the same problem here, but unfortunately without any result. Please take a look at this post and let us know.
    Cheers
    Stefan

  • Success with export/import of Discoverer portlets?

    Hi folks.
    Before I begin working with the 9iAS export/import procedures, can anyone tell me whether they've been successful in exporting and importing pages with Discoverer portlets from one 9iAS server to another?
    Are there any gotchas?

    Ravishankar,
    Thanks.
    So are you saying:
    Pages with Discoverer portlets will be transported with the Discoverer portlets in place.
    However, after the transport, the end-user will have to "Edit Defaults" again.
    And user customizations would have to be re-applied.
    John

  • Regarding Distribution Monitor for export/import

    Hi,
    We are planning to migrate a 1.2 TB database from Oracle 10.2g to MaxDB 7.7, and we are currently testing the migration on a test system.
    First we tried a simple export/import, i.e. without Distribution Monitor: the export finished in 16 hours, but the import had been running for more than 88 hours, so we aborted it. We then found that Distribution Monitor lets us spread the export/import load across multiple systems so that the import completes within a reasonable time. Using 2 application servers for the export/import, the export completed within 14 hours, but again the import ran for more than 80 hours and we aborted it. We also did table splitting for the big tables, but no luck. 8 parallel processes were running on each server, i.e. on one CI and 2 app servers. We followed the DistributionMonitorUserGuide document from SAP.
    I observed that on the central system, CPU and memory utilization was above 94%, while on the 2 added application servers it was very low, around 10%. Please find the system configuration below:
    Central Instance - 8 CPU (550 MHz), 32 GB RAM
    App Server 1 - 8 CPU (550 MHz), 16 GB RAM
    App Server 2 - 8 CPU (550 MHz), 16 GB RAM
    Also, when I used the Unix top command on the app servers, I could see only one R3load process in the run state; the other 7 R3load processes were sleeping. On the central instance, all 8 R3load processes were in the run state. I think the app servers not running all 8 R3load processes at a time could be the reason for the very slow import.
    Could someone please let me know how to improve the import time? It would also be helpful if someone who has done a database migration from Oracle 10.2g to MaxDB could tell how they did it, and whether any specific document is available for a migration from Oracle to MaxDB.
    Thanks,
    Narendra

    > Also, when I used the Unix top command on the app servers, I could see only one R3load process in the run state; the other 7 R3load processes were sleeping. On the central instance, all 8 R3load processes were in the run state. I think the app servers not running all 8 R3load processes at a time could be the reason for the very slow import.
    > Could someone please let me know how to improve the import time?
    R3load connects directly to the database and loads the data. The question here is: how is your database configured (in the sense of caches and memory)?
    > It would also be helpful if someone who has done a database migration from Oracle 10.2g to MaxDB could tell how they did it, and whether any specific document is available for a migration from Oracle to MaxDB.
    There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are specially trained for certain databases and know tips and tricks for improving the migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus
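    As a starting point for the cache/memory question, the relevant MaxDB settings can be inspected with dbmcli (a sketch; the database name and DBM credentials are placeholders, and parameter names vary between MaxDB versions - CACHE_SIZE and MAXCPU are the classic names):
    dbmcli -d TST -u control,secret param_directget CACHE_SIZE
    dbmcli -d TST -u control,secret param_directget MAXCPU
    An undersized I/O buffer cache on the target database is a common reason why the R3load import runs far slower than the export.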

  • Export / Import of Apex Data

    Hello,
    We are planning to move our Oracle databases from Windows servers to Linux ones. In a test lab, we successfully exported all data from an Oracle DB instance on Windows and imported it into an Oracle DB instance on Linux (with the old export/import Oracle tool).
    But in fact it was all data except the Apex data. The complete Apex environment is available on our new Linux server, but there is no trace of our workspaces and applications. The only available workspace is Internal.
    We do not want to export/import all workspaces, applications, images and files one by one (by hand in the Apex environment).
    - Is there a particular method to follow to include the Apex data in the export/import process?
    OR
    - Is there a trick or a utility program that can automate the export/import of Apex data?
    Thank you for your help.
    - Patrice

    Patrice,
    There is an Apex utility (it's called ApexExport) that can be set up to run workspace, application and page exports. It's a Java-based utility and can sometimes be a little maddening to get configured, but it will do the job once you have it running. The utility is on the database machine in the Apex installation directory. There is pretty good documentation available from Oracle, and John Scott blogged about it here: http://jes.blogs.shellprompt.net/2006/12/12/backing-up-your-applications/
    Earl
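    For illustration, a typical invocation looks like this (a sketch; paths, connect string, credentials and the workspace ID are placeholders, and the Oracle JDBC driver jar must be on the classpath):
    cd $ORACLE_HOME/apex/utilities
    java -cp .:$ORACLE_HOME/jdbc/lib/ojdbc5.jar oracle.apex.APEXExport -db dbhost:1521:orcl -user system -password secret -workspaceid 1234567890
    Running the class with no arguments prints the usage text, which also shows the -applicationid and -instance options.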

  • SQL Dev 3.0 Prod: Export/Import connection won't accept network path

    Update....
    SQL Developer 3.0 Production (Win XP 32-bit, with JRE download):
    Export/Import connection won't accept a network path,
    but you can map the network path to a drive and use that.
    Steps:
    1. Set up SQL Developer with at least one Oracle database connection.
    2. On menu "View > Connections" (if you don't see the connection tree window).
    3. Right-click "Connections", do Export to a local drive as an *.xml file (e.g. c:\connections.xml).
    4. Again right-click "Connections".
    5. Click Export Connections.
    6. Click Browse.
    7. Enter a network drive path for the *.xml file,
    for example a path name such as \\MyFriendsPC\path\connections.xml.
    Result: SQL Developer won't accept the UNC-style path.
    8. In Windows Explorer, open C:\.
    9. Copy c:\connections.xml.
    10. In Windows Explorer, go to \\MyFriendsPC\path\.
    11. Paste connections.xml at \\MyFriendsPC\path\.
    12. In SQL Developer, right-click "Connections" in the connection tree box.
    13. Do "Import Connections".
    14. Click "Browse".
    15. Enter the UNC path \\MyFriendsPC\path\connections.xml.
    Result: SQL Developer won't accept the UNC-style path.
    PS: If you manually enter the UNC path in the box before the "Browse" button, it works. Or if you mapped the network folder as a drive, it works.
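    For the drive-mapping workaround, the share can be attached from a Windows command prompt (a sketch using the share path from the steps above; Z: is any free drive letter):
    net use Z: \\MyFriendsPC\path
    After that, Z:\connections.xml is accepted in the export/import dialogs.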

    > So what exactly should I do?
    What you should not do is mess around with the default cipher suite settings, unless you really know what you're doing, which isn't in evidence.

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files without XML files and the rest is size. For example, one site of 114 MB produces a CMP file with no XML files, while small sites do not have this problem. If size were the problem, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters for Export-SPWeb that make the exported CMP match the format produced by the export/import process in the UI.
    As a next step we could try developing custom code using the Publishing Service, but before doing that I would like to understand why the export/import process in the UI generates a CMP that contains no XML files.
    If no one can answer this, I would appreciate some general help on understanding exactly what the export/import process does -- that is, the one that runs when you select the export or import option in the Site Manager drop-down. Understanding what it actually does will help us troubleshoot why there are no XML files in certain export CMPs and assist in determining an alternative approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two sub-sites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue, but I'm not sure. Did you ever happen to find a solution to this problem?
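    For reference, the Export-SPWeb attempt mentioned above takes roughly this form (a sketch; the site URL and output path are placeholders, and as noted it does not reproduce the translation-package format that the UI export creates):
    Export-SPWeb -Identity "http://server/sites/variationsource" -Path "C:\Exports\site.cmp" -IncludeVersions All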

  • Export/import tablespace with all objects (data, users, roles)

    Hi, I have a problem, or rather a question, on the topic of exporting/importing a tablespace.
    On the one hand I have a 10g database (A), and on the other hand an 11g database (B).
    On A there is a tablespace called PRO.
    Furthermore, 3 users:
    PRO_MAIN - contains the data - tablespace PRO
    PRO_USER1 with a role PRO_UROLE - tablespace PRO
    PRO_USER2 with a role PRO_UROLE - tablespace PRO
    Now I want to transfer the whole tablespace PRO (including the users PRO_MAIN, PRO_USER1, PRO_USER2 and the role PRO_UROLE) from A to B.
    On B, I've created the user PRO_MAIN and the tablespace PRO.
    On A, I execute the following statement:
    expdp PRO_MAIN/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
    On B:
    impdp PRO_MAIN/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
    Result:
    The user PRO_MAIN was imported with all the data.
    But I'm missing PRO_USER1, PRO_USER2 and the role PRO_UROLE...
    I assume I've used the wrong parameters in my expdp and/or impdp.
    It would be nice if anybody could give me a hint.
    Thanks in advance.
    Best Regards,
    Frank

    When you do a TABLESPACE mode export by specifying just the tablespaces, all that gets exported are the tables and their dependent objects. The users, roles, and the tablespace definitions themselves don't get exported.
    When you do a SCHEMA mode export by specifying the schemas, you will get the schema definitions (if the schema running the export is privileged) and all of the objects that the schemas own. A schema does not own roles or tablespace definitions.
    In your case, you want to move:
    1. schemas - one of which you have already created on your target database
    2. roles
    3. everything in the tablespaces owned by multiple schemas.
    There is no single export/import command that will do all of this. This is how I would do it:
    1. Move the schema definitions:
    a. either create them manually, or
    b1. expdp schemas=<your list of schemas> include=user
    b2. impdp the results from b1.
    2. Move the roles:
    expdp full=y include=role ...
    Remember, this will include all roles. If you want to limit what gets exported, use:
    include=role:"IN ('ROLE1', 'ROLE2', ...)"
    Then impdp the roles just exported.
    3. Move the user data:
    a. If you want to move all of the schemas' objects, like functions, packages, etc., you need to use a schema mode
    export:
    expdp user/password schemas=a,b,c ...
    b. If you want to move only the objects in those tablespaces, use a tablespace export:
    expdp user/password tablespaces=tbs1, tbs2, ...
    c. Import the dump file generated in step 3:
    impdp user/password ...
    Hope this helps.
    Dean
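    Put together for the names in this thread, the sequence might look like this (a sketch; run as a privileged user such as SYSTEM, and adjust the quoting for your OS, e.g. via a PARFILE):
    On A:
    expdp system/pw FULL=y INCLUDE=USER:"IN ('PRO_MAIN','PRO_USER1','PRO_USER2')" INCLUDE=ROLE:"IN ('PRO_UROLE')" DIRECTORY=backup_datapump DUMPFILE=users_roles.dmp LOGFILE=users_roles.log
    expdp system/pw TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
    On B:
    impdp system/pw FULL=y DIRECTORY=backup_datapump DUMPFILE=users_roles.dmp LOGFILE=imp_users_roles.log
    impdp system/pw DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=imp_TSpro.log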

  • What roles are required to run export/import?

    Hi,
    I want some info on the roles required to run export/import.
    I have done the following as the SYS user:
    GRANT EXP_FULL_DATABASE TO IMP_TEST;
    GRANT IMP_FULL_DATABASE TO IMP_TEST;
    But with the above commands IMP_TEST gets all privileges (including on the system tables).
    Is there any mechanism to restrict the user to exporting and importing only the tables, and nothing else?
    Thanks and Regards

    Hi!
    If you create a user like this
    CREATE USER imp_test IDENTIFIED BY password;
    then the user will be able to import or export only its own schema. If you grant the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles to this user, as you did, then the user will be able to export or import the whole database (including the SYS schema). But as far as I know you can't place any further restrictions on the user.
    yours sincerely
    Florian W.

  • Problem Exporting / Importing print forms with program RSTXSCRP

    Hi!
    There is a problem exporting/importing print forms with program RSTXSCRP.
    When I transfer a print form from one SAP system to another, the logo image and some text are missing... What could be the reason?
    Maybe there is another way to transfer print forms between SAP systems without using program RSTXSCRP?
    Will reward,
    Mindaugas

    Hi,
    When you transfer print forms from one SAP system to another, logos are not transferred automatically. They are actually .bmp files which need to exist in every client; only then will the logo be visible in your print output when you run the print program. Hence logos need to be created explicitly (e.g. uploaded via transaction SE78) in every client where you want to run your print program.
    thanks,
    sksingh
