Test Data in Quality

Hi Everyone,
Please tell me: what is the essential data that needs to be tested on the Quality server for a particular plant?
How is this testing carried out before the data is transferred to the Production server?
Thanks in advance.

Hi Tehsin,
Thanks for your response.
What do you mean by transactional testing?
How is job authorization testing carried out?
Please share a detailed answer.
Thanks in advance.

Similar Messages

  • How can eCATT be used to test a Z program on the DEV system with data on the Quality system?

    Hello All,
    Background: the custom object I want to test (e.g. a report program) exists on the DEV system, while the required data (e.g. the values passed to the program's selection-screen fields) exists on the Quality system.
    Question: how can I test the program using eCATT without transporting it to the Quality box? Can anyone give me insights on how to work around this data-availability issue?
    My efforts: I created a system data container specifying an RFC connection for the Quality system, and a test data container with the data available on the Quality box, specifying 'Quality' as the target system for the data. I then created a test script (for transaction SE38), did the parameterization (program name and the other selection-screen fields as parameters), and set the target system for each parameter to QUALITY. Finally, I created a test configuration with all the required parameters.
    When I ran the test script from the test configuration, it started to run on the Quality system but failed, saying the program/object does not exist on the Quality box.
    Importance: since no data is available on the DEV box, the program/object currently has to be moved to the Quality system, which means multiple transport requests are created.


  • Maintaining eCATT Scripts and Test Data Files in the Mercury Quality Center Tool

    Hi all,
    As our client is using Mercury Quality Centre for test management:
    1. I want to save my eCATT scripts into the Mercury Quality Centre tool. How do I do this?
    2. I should then be able to run the same scripts from Quality Centre. What are the steps for doing this?
    Does anyone have a solution for this? Please tell me.
    Thanks in advance,
    sree
    Message was edited by: Rajesh Bhattathiripad .A

    Rajesh, Hi.
    There is a special Mercury solution that requires some professional services assistance but might be able to help you with getting a built-in capability for your eCATT scripts as a new test type in Quality Center. This new test type will give you the ability to:  
        - Save a reference to your eCATT scripts in QC (not the entire eCATT test but a link to it on the SAP backend)
        - Run the eCATT script from QC
        - Save a log and the run's status into QC after execution of an eCATT script has ended
    For this to be implemented on your system, you should probably contact Mercury's PSO.
    If you'd like to do it yourself (and there is a lot of work and knowledge involved), you can create your own "home-made" solution using SAP's standard recording APIs and Quality Center's VAPI-XP test type to generate a script that will call the SECATT transaction and actually execute it on demand.
    Please note that saving the actual eCATT Blob into QC is not really possible since it is ABAP based and has to reside in the R3 system only.
    I hope this can help a little.

  • Test data for MB01

    Hi,
    Can we get some test data for posting a goods receipt (GR) in MB01? If yes, where can we get the data?
    I am on a 4.7 ides system.
    Regards,
    CA

    Hi,
    You cannot do this testing on the production server. If you want to test, use a copy of PRD (a sandbox) or the quality server; data might be available there. Alternatively, you can fetch open POs from ME2N by entering selection parameter WE101 (open goods receipts) and then post the GR with MB01.
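    If you'd rather create the goods receipt programmatically than type it into MB01, here is a minimal ABAP sketch using BAPI_GOODSMVT_CREATE (GM code 01 = goods receipt for purchase order, movement type 101; the PO number, item, and quantity are placeholders you would take from the ME2N list):
    REPORT ztest_gr_post.
    DATA: ls_header TYPE bapi2017_gm_head_01,
          ls_code   TYPE bapi2017_gm_code,
          lt_items  TYPE TABLE OF bapi2017_gm_item_create,
          ls_item   TYPE bapi2017_gm_item_create,
          lt_return TYPE TABLE OF bapiret2.
    * Header: posting and document dates
    ls_header-pstng_date = sy-datum.
    ls_header-doc_date   = sy-datum.
    ls_code-gm_code      = '01'.        "01 = goods receipt for purchase order
    * One item: GR against an open PO (placeholder values from ME2N/WE101)
    ls_item-move_type = '101'.
    ls_item-po_number = '4500000001'.
    ls_item-po_item   = '00010'.
    ls_item-entry_qnt = 10.
    ls_item-mvt_ind   = 'B'.            "B = goods movement for purchase order
    APPEND ls_item TO lt_items.
    CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
      EXPORTING
        goodsmvt_header = ls_header
        goodsmvt_code   = ls_code
      TABLES
        goodsmvt_item   = lt_items
        return          = lt_return.
    * Commit only if no error message came back
    READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
    IF sy-subrc = 0.
      CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
    ELSE.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = 'X'.
    ENDIF.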
    =-=
    Pradip Gawande

  • Testing Data Integrity

    Hello,
    I am trying to backup the important data on my Emac using DVD's. Once I have my DVD's burned, I'd like to test them as time goes on to make sure that they are still in good shape. Is there software that I can use to test data integrity? I expect that after a certain amount of years, the DVD's will begin to have problems. It seems that monitoring the condition of these DVD's would be a good practice.
    Any advice for me?
    Thanks

    That's a good question to which I don't have a definite answer. To be really thorough, such a utility would need to do a checksum test on every file on the disc, which would be massively time-consuming. The usual hard-drive-oriented disk repair utilities such as Tech Tool Pro, DiskWarrior, and Disk Utility can't be used to run "verify" checks on a read-only DVD volume, as far as I can tell or recall. You might get better exposure for this question by posting it to the broader Tiger usage forum.
    The NIST does have advice on storing discs to prolong their useful life: "Care and Handling of CDs and DVDs" and the NIST CD-R Guidelines.
    And, of course, start with high-quality blank discs in the first place: see the Blank DVD Media Quality Guide.
    Thanks for your reply. I thought I read long ago that there was software to check disc condition, but perhaps I am wrong about this. If I am the only one thinking of doing this, there must be a better way to make sure my backup discs don't fail me when needed.

  • How to load Test data from a Text file in ECATT

    Hi,
    I have created a test configuration with a test script, system data container, and test data container.
    I have recorded a transaction and created the script. Parameterization is done for the script, and I have imported those parameters from the script into the data container.
    I am trying to load the data from a text file on the local workstation, but the data is not being read.
    Please explain this in detail (step by step) as I am very new to ECATT.
    I am trying this on SAP ECC 6.0 IDES server.
    Thanks in Advance
    Vikas Patil


  • Not able to change the data of test data containers in production system

    Dear All,
    We have created eCATT scripts in the development SolMan system and moved the transports to the production SolMan system. The customer wants to change the data in the test data containers and run the scripts in the production system, but we are not able to edit the data.
    Maybe the reason is that transaction SCC4 has the following option set:
    Changes and transports for client-specific objects: No changes allowed
    The customer doesn't want to change this option, but still wants to edit the test data containers to supply different data and run the eCATT scripts.
    Could you please let me know the solution for this?
    Your help is really appreciated.
    Thanks,
    Mahendra

    eCATT has a feature whereby you don't need to transport the scripts or test configuration to your target system. You can keep all your scripts and test data in SolMan and run a script on any other system in your landscape using the system data container and target system.
    Maintain production as one of the target systems in the system data container in SolMan and point to that system while running the script. Change the test data in SolMan and run the script from there.
    Let me know if you need more information
    thanks
    Venkat

  • Test data corrupted issue while re-opening a test in PTF 8.53

    Hi,
    While re-opening a test case in PTF 8.53, a pop-up message appears saying the test data is corrupted.
    The test ran successfully yesterday; trying to reopen it now produces this issue.
    Is there any solution to retrieve it?


  • Populating the test data in table of IDES ECC 5.0 in Oracle

    Hi Guys,
    I have installed IDES ECC 5.0 successfully without any errors, but I don't see any data in tables like PA0001. Can somebody give me the steps for populating the tables with test data? I was able to sign on using DDIC in client 000.
    Thanks,

    You are logging on to the wrong client; the IDES demo data is in client 800. Check transaction SCC4 to see which clients exist on the system.

  • SO_NEW_DOCUMENT_ATT_SEND_API1... please give me test data

    Hi Experts,
    I am using the function module SO_NEW_DOCUMENT_ATT_SEND_API1 to send mail to users without any attachment. Please give me test data.
    Why am I using SO_NEW_DOCUMENT_ATT_SEND_API1 instead of SO_NEW_DOCUMENT_SEND_API1? Because I need text in BOLD, so I have to send the data in HTML format.
    Please help me with this issue.
    Thanks.

    Hi,
    The two function modules you mention are used as follows:
    SO_NEW_DOCUMENT_ATT_SEND_API1 - sends emails with text and attachments
    SO_NEW_DOCUMENT_SEND_API1 - sends emails with text only
    Check the following example; it should be useful to you.
    CONSTANTS : C_HIGH TYPE SODOCCHGI1-PRIORITY VALUE '1' .
    DATA : I_CONTENT TYPE TABLE OF SOLISTI1  ,
    I_REC TYPE TABLE OF SOMLRECI1 .
    DATA : WA_DOCDATA TYPE SODOCCHGI1 ,
    WA_CONTENT TYPE SOLISTI1   ,
    WA_REC     TYPE SOMLRECI1  .
    * Fill document data
    WA_DOCDATA-OBJ_NAME    = 'MESSAGE' .
    WA_DOCDATA-OBJ_DESCR   = 'test'    .
    WA_DOCDATA-OBJ_LANGU   = 'E'       .
    WA_DOCDATA-SENSITIVTY  = 'F'       .
    WA_DOCDATA-OBJ_PRIO    = C_HIGH    .
    WA_DOCDATA-NO_CHANGE   = 'X'       .
    WA_DOCDATA-PRIORITY    = C_HIGH    .
    * Fill object content
    CLEAR WA_CONTENT .
    WA_CONTENT-LINE = 'test mail' .
    APPEND WA_CONTENT TO I_CONTENT .
    * Fill receivers
    CLEAR WA_REC .
    WA_REC-RECEIVER = SY-UNAME .
    WA_REC-REC_TYPE = 'B'.
    APPEND WA_REC TO I_REC .
    CALL FUNCTION 'SO_NEW_DOCUMENT_SEND_API1'
      EXPORTING
        DOCUMENT_DATA              = WA_DOCDATA
        COMMIT_WORK                = 'X'  " commit so the send order is actually processed
      TABLES
        OBJECT_CONTENT             = I_CONTENT
        RECEIVERS                  = I_REC
      EXCEPTIONS
        TOO_MANY_RECEIVERS         = 1
        DOCUMENT_NOT_SENT          = 2
        DOCUMENT_TYPE_NOT_EXIST    = 3
        OPERATION_NO_AUTHORIZATION = 4
        PARAMETER_ERROR            = 5
        X_ERROR                    = 6
        ENQUEUE_ERROR              = 7
        OTHERS                     = 8.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
      WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
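    To get the bold text you asked about, one option (a minimal sketch, assuming your release accepts document type 'HTM' for SO_NEW_DOCUMENT_SEND_API1) is to send the same content as HTML:
    * Reuse WA_DOCDATA and I_REC from above; an HTML body makes <b>...</b> render bold
    REFRESH I_CONTENT .
    WA_CONTENT-LINE = '<html><body><b>test mail</b></body></html>' .
    APPEND WA_CONTENT TO I_CONTENT .
    CALL FUNCTION 'SO_NEW_DOCUMENT_SEND_API1'
      EXPORTING
        DOCUMENT_DATA  = WA_DOCDATA
        DOCUMENT_TYPE  = 'HTM'   " render the body as HTML instead of RAW text
        COMMIT_WORK    = 'X'
      TABLES
        OBJECT_CONTENT = I_CONTENT
        RECEIVERS      = I_REC
      EXCEPTIONS
        OTHERS         = 8.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
      WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.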
    Regards
    Kiran

  • How to avoid multiple users accessing the same test data via parameterization in LR?

    I am using LR 11.5; I have the following test data:
    TestData
    1
    2
    3
    4
    5
    When I run this script from the Controller with 3 users, LR assigns the data as user1->1, user2->1, user3->1.
    How do I achieve this instead: user1->1, user2->2, user3->3?
    Any help would be great.


  • SharePoint PPS 2013 Dashboard Designer error "Please check the data source for any unsaved changes and click on Test Data Source button"

    Hi,
    I am getting the error below in SharePoint PPS 2013 Dashboard Designer while creating the Analysis Services data source using PROVIDER="MSOLAP";DATA SOURCE="http://testpivot2013:9090/Source%20Documents/TestSSource.xlsx":
    "An error occurred connecting to this data source. Please check the data source for any unsaved changes and click on Test Data Source button to confirm connection to the data source."
    I have checked all the sites and followed all the steps, but I am still getting the error. It is frustrating: everything appears to be configured correctly, yet the error persists.
    Thanks in advance.
    Poomani Sankaran

    Hi Poomani,
    Thanks for posting your issue.
    You need to install SQL Server 2012 ADOMD.NET on your machine; then browse the URL below, which shows step by step how to create a SharePoint dashboard with an Analysis Services data source.
    http://www.c-sharpcorner.com/UploadFile/a9d961/create-an-analysis-service-data-source-connection-using-shar/
    I hope this is helpful to you; if it works, please mark it as Answered.
    Regards,
    Dharmendra Singh (MCPD-EA | MCTS)
    Blog : http://sharepoint-community.net/profile/DharmendraSingh

  • eCATT question regarding handling multiple line items in a test data container

    Hi all,
    We are using eCATT for automation and we have a query.
    The question: our test data has two levels:
    1st level: header information [one entry per header]
    2nd level: item-level information [multiple item entries are possible per header]
    For example:
    Header
    Sales ordno Customer
    S001           C001
    Item
    Salesordno itemNo Material
    S001          001         M001
    S001          002         M002
    We need to call an API to create the sales order using eCATT, and we want to drive it from a test data container.
    How can we approach this?
    Reg,
    Ganesh

    Hi,
    You can do that by using the ABAP...ENDABAP statement. Within this block you can write any ABAP logic, and you can also call RFCs and BAPIs.
    All you need to do is write your ABAP code in SE38 and make sure it works; then copy the code into the eCATT tool and place it between the ABAP and ENDABAP statements, as in the sketch below.
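    For illustration, a minimal sketch of such a block (a hypothetical example, assuming BAPI_SALESORDER_CREATEFROMDAT2; the order type, org data, and customer/material numbers are placeholders you would feed from your test data container parameters):
    ABAP.
    * Create one sales order from a header entry plus multiple item entries
      DATA: ls_header  TYPE bapisdhd1,
            lt_items   TYPE TABLE OF bapisditm,
            ls_item    TYPE bapisditm,
            lt_partner TYPE TABLE OF bapiparnr,
            ls_partner TYPE bapiparnr,
            lt_return  TYPE TABLE OF bapiret2,
            lv_vbeln   TYPE bapivbeln-vbeln.
      ls_header-doc_type   = 'TA'.      "order type (placeholder)
      ls_header-sales_org  = '1000'.    "org data (placeholders)
      ls_header-distr_chan = '10'.
      ls_header-division   = '00'.
      ls_partner-partn_role = 'AG'.     "sold-to party, e.g. C001 from the header row
      ls_partner-partn_numb = 'C001'.
      APPEND ls_partner TO lt_partner.
    * One APPEND per item row of the test data container
      ls_item-itm_number = '000010'.
      ls_item-material   = 'M001'.
      APPEND ls_item TO lt_items.
      ls_item-itm_number = '000020'.
      ls_item-material   = 'M002'.
      APPEND ls_item TO lt_items.
      CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDAT2'
        EXPORTING
          order_header_in = ls_header
        IMPORTING
          salesdocument   = lv_vbeln
        TABLES
          order_items_in  = lt_items
          order_partners  = lt_partner
          return          = lt_return.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = 'X'.
    ENDABAP.
    You can then either run one test configuration variant per order, or pass the item fields as multi-value parameters and fill lt_items from them inside the block.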
    hope this helps.
    >>reward if this helps.
    regards,
    kvr

  • How to change the HTML/XML test data to test my_abap_proxy in SE80?

    Hello,
    I am trying to test my_abap_proxy in SE80 using an HTML/XML file from my desktop; the test data is as below:
      <LoadingDate>20110422</LoadingDate>
      <DocumentDate>20110422</DocumentDate>
      <SDDocumentCategory>G</SDDocumentCategory>
    ...and so on.
    Now I want to change the loading date in this test data. I tried after loading the file in the Test Service Provider screen of my_abap_proxy in SE80, but I could not change ANY data in the test data. Please let me know how to change the data.
    Thank you

    Hello ABAP_SAP_ABAP,
    You need to select the option "Generate template data"; on the displayed template you will see the option for editing in the top-left corner of the toolbar.
    After editing you can do an XML syntax check and test your proxy.
    Hope this helps.
    Thanks,
    Greetson

  • How to generate test data for all the tables in oracle

    I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum number of records for master tables and the minimum number of records for child tables. The data should be consistent in the columns that are used for constraints, i.e. using the same column values.
    I am planning to implement something like:
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl = minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to insert into each child table of each master table;
    driven by all_tables where owner = schemaname, plus all_tab_columns and all_constraints where owner = schemaname, using the dbms_random package.
    Does anyone have a better idea for doing this? Is this functionality already available in the Oracle DB?

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice, define the basic hierarchy: table collection, tables, data; then start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want an 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. Store the attribute combinations in Oracle tables.
    5. Build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated.
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
    The second approach is the one I have used more often. I would call it the top-down approach, and I use
    it when test data is needed for an existing system. The main use case here is to avoid
    having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    I use this approach for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables: a BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the rows as they should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
         TESTCASE NUMBER,
         DEPTNO NUMBER(2),
         DNAME VARCHAR2(14 BYTE),
         LOC VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the rows as they should look AFTER the
         test for that test case is performed.
    Each of these tables is a mirror image of the actual application table with one new column
    added that contains the TESTCASE number.
    To create test case #3 identify or create the DEPT records you want to use for test case #3.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, add all the
    records from the BEFORE data set plus a new row for the new record.
    When you want to run test case #3 the process is basically (ignoring for this illustration that
    there is a foreign key between DEPT and EMP):
    1. Delete the records from SCOTT.DEPT that correspond to the test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. Insert the test data set records into SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. Compare the actual results with the expected results.
         This is done by a function that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3.
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
    This second approach uses data the users (QA) are already familiar with, is scalable, and
    makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
    Which of these approaches makes the most sense for you?
