Sandbox system - Test data

Hi,
I am not sure which section to post this thread in.
Can anybody provide me with a document that gives the basic steps to configure a sandbox system for creating test data?
We have a sandbox system installed, and we have the test data in text files covering around 280 tables related to the SD, MM, FI and IS-Retail modules.
I need to know the basic configuration that has to be made for an IDES system to import all this test data, and the easiest method for importing all these flat files and preparing the system as a test system. The client wants this done in 2 or 3 days; I am not sure if that is possible.
Could anybody please advise?
Regards,

> Source System (Production): Solaris 10/Oracle (NW SP17 and ECC 6.0 with EHP3 (DIMP, SAP-APPL and EA-APPL))
> Target System (Sandbox): Windows 2K3/Oracle
> 1. Since we have EHP3 in the source system, is there any impact when doing the system copy?
> 2. How do we prepare the target system with the same patch level of SAP software components as our source system?
You can't do the system copy as "I install a new system and overwrite the database".
There are a few points to consider:
a) a heterogeneous system copy requires a certified migration consultant on-site, otherwise you'll lose support for problems during the migration and for the target system (see http://service.sap.com/osdbmigration and SAP Note 82478 - SAP System OS/DB Migration).
b) the only supported way of doing such a migration is using R3load as the tool of choice. You'll have to start sapinst on the source system, export your database and import it into the target system
c) direct database copies are not possible (Solaris SPARC is a big-endian platform, Windows is little-endian; Windows won't be able to open the files created by Solaris)
Markus

Similar Messages

  • Not able to change the data of test data containers in production system

    Dear All,
    We have created eCATT scripts in the Development SolMan system and moved the transports to the Production SolMan system. The customer wants to change the data in the test data containers and run the scripts in the production system, but we are not able to edit the data.
    Maybe the reason is that transaction SCC4 has the following option set:
    Changes and Transports for Client-Specific Objects
    • No changes allowed
    The customer doesn't want to change the above option, but wants to change the test data containers to supply different data and run the eCATT scripts.
    Could you please let me know the solution for this?
    Your help is really appreciated.
    Thanks,
    Mahendra

    eCATT has a feature whereby you don't need to transport the scripts or test configurations to your target system. You can keep all your scripts and test data in SolMan and run a script against any other system in your landscape using the system data container and target system.
    Maintain production as one of the target systems in the system data container in SolMan and point to that system while running the script. Change the test data in SolMan to run the script.
    Let me know if you need more information
    thanks
    Venkat

  • System Wide Data Formats Not copied from Development to test

    I am using OBIEE 10g. When I copy my dashboards and reports via the Catalog Manager from the development environment to the test environment, the system-wide data formats are not copied over. That is, all the ones I saved under "Column Properties->Column Format->Save As System Wide Default" are not being copied. I am using archive and unarchive in the Catalog Manager to move from the development to the test environment. So my question is: where are these system-wide defaults stored, and what do I need to do to get them copied over?
    Thanks in advance!

    Hi,
    The system-wide column formats are stored in the catalog under the system/metadata folder.
    Copy this folder from the dev environment to the test environment via the Catalog Manager, then restart the Presentation Services.

  • Test data for mb01

    Hi,
    Can we get some test data for posting a GR in MB01? If yes, where can we get the data?
    I am on a 4.7 IDES system.
    Regards,
    CA

    Hi,
    You cannot do the testing on the production server. If you want to test, use a copy of the production system (a sandbox) or the quality server; data might be available there. Alternatively, you can find open purchase orders in ME2N by entering selection parameter WE101 (open goods receipt)
    and then post the GR with MB01.
    =-=
    Pradip Gawande

  • How to load Test data from a Text file in ECATT

    Hi,
    I have created a test configuration with a test script, system data container, and test data container.
    I have done the recording of a transaction and created the script. Parameterization is done for the script, and I have imported those parameters from the script into the data container.
    I am trying to load the data from a text file on the local workstation. The data is not being read.
    Please explain this in detail (step by step), as I am very new to eCATT.
    I am trying this on SAP ECC 6.0 IDES server.
    Thanks in Advance
    Vikas Patil


  • Oracle EPM 11.1.2 issue with system-jazn-data.xml & HIT entries

    Have been working on configuring Oracle EPM 11.1.2 and have one final issue from the diagnostic utility that I cannot figure out. Configuration sequence is as follows and each step is installed in its own database:
    Step 1 - Foundation/Shared Services/Calc Mgr/EPMA/Essbase to a single relational DB. I am not configuring the web server until the final step.
    Step 2 - Hyperion Performance Scorecard
    Step 3 - Planning
    Step 4 - Profitability
    Step 5 - RA and configure web server.
    I have used both SQL Server Express 2008 and Oracle DB 11g and get the same result.
    When I complete the install, restart all of the services, and run the diagnostic utility, I get a failure in Foundation Services indicating that the file "system-jazn-data.xml" cannot be found. No real help is provided with the error message, and I have found no help in the docs or on the web. I have searched the disk and the file seems to be in the proper place per the docs. I have done partial configs and do not get the error. I have compared the system-jazn-data.xml file from the successful config to the one from the failed config; they are identical. Both files seem to be bloated with tens of thousands of lines, most of them blank.
    I had reached a point where I thought the issue was related to Performance Scorecard and removed that step. I am now getting the error again.
    Anyone seeing this issue? Is it just a bogus message in the diagnostic report and can be ignored? Any other thoughts?
    Thanks
    EPMCloud

    Update - After going through the install many more times, I still do not know what the issue is, but I believe I have figured out how to resolve it. It appears that if you go back (after everything is installed and configured) and reconfigure the application server for Foundation Services, the issue is corrected.
    I am running some final tests now, and if I discover something different, I will update the post.
    EPMCloud

  • Test data corrupted issue while re-opening the Test in PTF8.53

    Hi,
    While re-opening the test case in PTF 8.53, there is a pop-up message saying the test data is corrupted.
    It ran successfully yesterday; trying to reopen it today, I get this issue.
    Any solution to retrieve it?

    Actually... I am opening the form from Windows XP. -- Meaning what? Are you opening it in the browser or in the Forms Builder?
    -- In the browser.
    When I create a new form in Windows XP using Developer Suite (I have Developer Suite installed), I am able to connect to the database. I checked tnsping on the server; it's working.
    This means Developer Suite has been configured for the database. -- I configured Developer Suite (172.16.7.123) against the 9i DB (172.16.7.2),
    but I am not able to open the form. -- What error are you getting when you open the form?
    -- When I open the form in the browser (http://appsworld.ncc.com:7778/forms/frmservlet?config=test),
    it asks me for the username, password, and database.
    I gave username=ncc, password=nccpwd, database=test.
    In the DB server's tnsnames.ora I have given the DB details under "test".
    If I do "tnsping test" the result is OK, but when I open the form in the browser I get the ORA-12514 error after entering the above details.
    But when I give the username, password, and database of the infrastructure, i.e.
    username=system, password=oradba, database=orcl, then the form opens in the browser.

  • How to generate test data for all the tables in oracle

    I am planning to use PL/SQL to generate test data in all the tables in a schema. The schema name is given as an input parameter, along with the minimum records for master tables and the minimum records for child tables. The data should be consistent in the columns that are used for constraints, i.e., the same column values should be reused.
    I am planning to implement something like:
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl = minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to insert into each child table of each master table.
    I will drive it from ALL_TABLES WHERE OWNER = schemaname, plus ALL_TAB_COLUMNS and ALL_CONSTRAINTS WHERE OWNER = schemaname, using the DBMS_RANDOM package.
    Does anyone have a better idea for doing this? Is this functionality already available in the Oracle DB?
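    A minimal sketch of the kind of dictionary-driven generator described above might look as follows; the procedure name and its simplifications are assumptions for illustration. It handles only NUMBER, VARCHAR2, and DATE columns and ignores constraints, which the real sp_schema_data_gen would have to honor by processing parent tables first and reusing their key values in child tables.
         -- Hypothetical helper: generate p_rows random rows for one table,
         -- driven by ALL_TAB_COLUMNS and DBMS_RANDOM.
         CREATE OR REPLACE PROCEDURE sp_table_data_gen (
             p_owner IN VARCHAR2,
             p_table IN VARCHAR2,
             p_rows  IN PLS_INTEGER
         ) AS
             l_cols VARCHAR2(4000);
             l_vals VARCHAR2(4000);
         BEGIN
             FOR c IN (SELECT column_name, data_type, data_length
                         FROM all_tab_columns
                        WHERE owner = p_owner
                          AND table_name = p_table
                        ORDER BY column_id)
             LOOP
                 l_cols := l_cols || c.column_name || ',';
                 l_vals := l_vals ||
                     CASE c.data_type
                         -- random integer between 1 and 999
                         WHEN 'NUMBER'   THEN 'TRUNC(DBMS_RANDOM.VALUE(1, 1000))'
                         -- random lowercase string no longer than the column allows
                         WHEN 'VARCHAR2' THEN 'DBMS_RANDOM.STRING(''L'', '
                                              || LEAST(c.data_length, 10) || ')'
                         -- random date within the last year
                         WHEN 'DATE'     THEN 'SYSDATE - TRUNC(DBMS_RANDOM.VALUE(0, 365))'
                         ELSE 'NULL'
                     END || ',';
             END LOOP;
             l_cols := RTRIM(l_cols, ',');
             l_vals := RTRIM(l_vals, ',');
             FOR i IN 1 .. p_rows LOOP
                 EXECUTE IMMEDIATE 'INSERT INTO ' || p_owner || '.' || p_table
                                || ' (' || l_cols || ') VALUES (' || l_vals || ')';
             END LOOP;
             COMMIT;
         END sp_table_data_gen;
         /
    The schema-level procedure would then loop over ALL_TABLES for the owner and call this helper with minrecinmstrtbl or minrecsforchildtable as appropriate.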

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice, define the basic hierarchy (table collection, tables, data) and start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. Store the attribute combinations in Oracle tables.
    5. Build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated (see the sketch after this list).
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
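    As an illustration of step 5, a generator function for VARCHAR2 built from the attributes listed in step 2 might look like this (the function name and parameter set are assumptions for illustration, not part of any Oracle API):
         FUNCTION gen_varchar2 (
             p_min_len  IN PLS_INTEGER,
             p_max_len  IN PLS_INTEGER,
             p_prefix   IN VARCHAR2 DEFAULT NULL,
             p_suffix   IN VARCHAR2 DEFAULT NULL,
             p_null_pct IN NUMBER   DEFAULT 0   -- percentage of NULLs to generate
         ) RETURN VARCHAR2 IS
             l_len PLS_INTEGER;
         BEGIN
             -- occasionally return NULL, per the requested percentage
             IF DBMS_RANDOM.VALUE(0, 100) < p_null_pct THEN
                 RETURN NULL;
             END IF;
             -- random length between the min and max attributes
             l_len := TRUNC(DBMS_RANDOM.VALUE(p_min_len, p_max_len + 1));
             RETURN p_prefix || DBMS_RANDOM.STRING('L', l_len) || p_suffix;
         END gen_varchar2;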
    The second approach I have used more often. I would call it the top-down approach, and I use
    it when test data is needed for an existing system. The main use case here is to avoid
    having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    The second approach I use for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables: a BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
              TESTCASE NUMBER,
              DEPTNO   NUMBER(2),
              DNAME    VARCHAR2(14 BYTE),
              LOC      VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look AFTER the
         test for that test case is performed.
    Each of these tables is a mirror image of the actual application table with one new column
    added that holds the test case number (TESTCASE).
    To create test case #3, identify or create the DEPT records you want to use for test case #3.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, add all the
    records from the BEFORE data set and add a new one for the new record.
    When you want to run test case #3 the process is basically (ignore for this illustration that
    there is a foreign key between DEPT and EMP):
    1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. insert the test data set records for SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. compare the actual results with the expected results.
         This is done by a function that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3 (a sketch follows this list).
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
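    The comparison in step 4 can be sketched as two MINUS queries over the example tables above (a symmetric difference; this is an illustration, not the poster's actual function). If the query returns no rows, the actual DEPT contents match the expected image for test #3:
         (SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
          MINUS
          SELECT DEPTNO, DNAME, LOC FROM DEPT)
         UNION ALL
         (SELECT DEPTNO, DNAME, LOC FROM DEPT
          MINUS
          SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3);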
    This second approach uses data the users (QA) are already familiar with, is scalable, and
    makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
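    For example, a boundary row for the 14-character DNAME column used above could be added to the BEFORE set like this (the values are illustrative):
         INSERT INTO DEPT_TEST_BEFORE (TESTCASE, DEPTNO, DNAME, LOC)
         VALUES (4, 99, RPAD('X', 14, 'X'), 'EDGE');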
    Which of these approaches makes the most sense for you?

  • Problem about implementation of a controller in main_vi based on the test data of sub_vi

    Hi Group:
    I am developing a measurement and control system based on the PCI-6040E and SCXI-1001, 1102, and 1160.
    But now I have a question, which is how to implement a controller in the main_vi based on the test data from a sub_vi.
    My program consists of a main_vi and several sub_vis; the main_vi is used for control and sends the operation signals to the hardware (relays), and the sub_vis are used for sampling data.
    For example: a sub_vi collects data from the hardware, and the control algorithm and output are implemented in the main_vi, so the question is how to transfer the test data from the sub_vi to the main_vi synchronously.
    I tried to set the output connector, but the data only returns when the sub_vi is closed.
    Does anybody have a good idea about it?
    Thanks a lot

    Hi hanwei,
    I'm making the assumption that you are not using LabVIEW Real-Time. If you are, you should use RT FIFOs. Otherwise, follow MikeS81's advice.
    I would go a step further and say that most likely it would be a better design to remove the loop (while loop?) from the subVI and instead place the subVI within a while loop in your top-level VI. This way, on each execution of your subVI it will quickly return your data. Then you can also perform all the queue and dequeue operations in the two separate while loops in your top-level VI. This organization will be easier to read, document, and understand.
    Mike Lyons
    National Instruments
    http://www.ni.com/devzone

  • Delete a sequence of test data from SE37

    Hi
    I am relatively new to ABAP and I was wondering whether it is possible to delete a few test data records from SE37
    in one go, for example deleting 10 records of test data belonging to one user.
    I thank you for your reply.
    Sincerely
    Yuval

    Hi Peery
    Instead of directly deleting the entries from table EUFUNC,
    enter the author name in table EUFUNC and get the data.
    Then, based on object name, date, and time, manually delete the entries from the function modules' test data.
    That maintains data consistency and keeps your system stable.

  • Guidence regarding creating  a test data and running any function module

    Hi pals,
       Can you please tell me how to create test data, commit, and run any function module, in detail (step-wise)?

    Hi yawmark,
    I apologise for including a poor code example earlier on; I had to come up with something quick. I will, however, visit the suggested sites you mentioned, thank you.
    I did, however, create a simple class that compiles and can be used to set, return, reset, and print a few details about a person. I hope it is of better use than my previous example. Here it is:
    public class SimplePerson {
        private int age = 0;
        private String firstName = "";
        private String lastName = "";

        /** Creates a new instance of SimplePerson */
        public SimplePerson() {
        }

        /** Sets the age of this person */
        public void setAge(int takeAge) {
            age = takeAge;
        }

        /** Returns the age of this person */
        public int getAge() {
            return age;
        }

        /** Sets the First Name of this person */
        public void setFirstName(String takeName) {
            firstName = takeName;
        }

        /** Returns the First Name of this person */
        public String getFirstName() {
            return firstName;
        }

        /** Sets the Last Name of this person */
        public void setLastName(String takeName) {
            lastName = takeName; // the original assigned firstName here by mistake
        }

        /** Returns the Last Name of this person */
        public String getLastName() {
            return lastName; // the original returned firstName here by mistake
        }

        /** Resets the details of this person back to the default form. */
        public void resetAll() {
            age = 0;
            firstName = "";
            lastName = "";
        }

        /** Prints all the details this person has */
        public void printAll() {
            System.out.println("Age: " + Integer.toString(age)
                    + "\n First Name: " + firstName
                    + "\n Last Name: " + lastName);
        }
    }
    Cheers mate

  • InfoSet in DEV system returns data, in other systems not

    Hello
    I created an InfoSet containing two ODSes and three characteristics. The InfoSet returns data correctly in the development system. I transported the solution (ODSes, characteristics, and InfoSet) to the other systems (Test and QA BW systems). Data has been loaded into the ODSes and the characteristics and activated, but the InfoSet returns no data there.
    Kindly please, let me know your hints on this.
    Thank you in advance,
    eMeS

    Check if you have some authorization issue...

  • Upload test data file

    Hi all,
       After changing the test data, is it required to open the test data container and upload the latest file?
      How can the latest test data be picked up automatically?
    Regards,
    Sree...

    Hi Jwalith,
    After a test data file is created in the system,
    you add it to the container by going to the Variants tab:
    select the radio button "External Variants / Path" and specify the path of the file where it is located in the system.
    All the variants created in the file will then be displayed in the variants list.
    Hope this helps.
    Best regards,
    Harsha

  • Clean up all test data

    Hi all,
    How are y'all? My colleagues and I have been loading data into our BWQ system to test our extraction processes individually. Now I want to clean up all this test data from all over the system (master data, ODS, PSA, cubes, etc.) and start loading QA data systematically in order to run an integrated data load test.
    Every time I try to clean up the data from one of these data containers I get a message like:
    "The data cannot be deleted because it has already been loaded into other InfoProviders. Do you really want to delete the data and all the associated settings? If you do so, you need to set up those settings all over again."
    And some other messages that I do not remember at this moment.
    What I need is structured, sequential steps to clean up the data completely in the system. Can anybody help?
    Thanks,
    Prathibha.

    Hi,
    Start deleting data from the highest level,
    i.e. InfoCube -> ODS -> master data objects.
    You need to delete the data from all the associated objects. The relationships can easily be found from the data flow and, in the case of an InfoObject, the where-used list.
    Hope it helps.
    Regards,
    Aditya

  • VssNullProver stopped after installing Update Rollup 3 for System Center Data Protection Manager 2012 R2

    I recently installed Update Rollup 3 for System Center Data Protection Manager 2012 R2.
    As part of the update I updated all DPM Agents, including the ones on our Hyper-V Servers (which are part of a cluster). I also rebooted every DPM Protected Servers. After reboot all Hyper-V Servers raise a warning that the VssNullProvider service has stopped.
    Backup seems to work properly. I can start the VssNullProvider service manually, but after the next backup the service is stopped again. Exactly the same issue occurs on our test Hyper-V cluster. All our Hyper-V servers run Windows Server 2012 R2, and
    we use System Center Virtual Machine Manager 2012 R2.
    Something is not ok with Update Rollup 3. Any suggestions?
    Boudewijn Plomp, BPMi Infrastructure & Security

    Hi
    See the new blog post: "Support Tip: Service Manager alert for the VSSNullProvider service after installing DPM 2012 R2 UR3".
    Regards, Mike J. [MSFT]
    Thanks!
    Boudewijn Plomp, BPMi Infrastructure & Security
