Test results for PrintService example needed

I have just added a demo of the PrintService to
the web start examples at physci.
http://www.physci.org/jws/#prs
Slight hitch in testing, though - there is no printer
attached to this development box.
Would anybody with a printer care to open one
or two short documents, fire off a test print, and
report back?
Does it work at all?
Does the print work well?
The code is sandboxed, & an archive of the
source (*.java, *.jnlp & build.xml) is also
available at the link.

I am guessing this isn't the entire problem, but I don't see that memcpy is a supported Windows Store API:
http://msdn.microsoft.com/en-us/library/windows/apps/dn424765.aspx
Do you see anywhere else that says it is supported?
Matt Small - Microsoft Escalation Engineer - Forum Moderator
If my reply answers your question, please mark this post as answered.
NOTE: If I ask for code, please provide something that I can drop directly into a project and run (including XAML), or an actual application project. I'm trying to help a lot of people, so I don't have time to figure out weird snippets with undefined
objects and unknown namespaces.

Similar Messages

  • Unable to download web test results for failed Availability Monitor request - HTTP 500

    When attempting to download the Web Test results for a failed request in my availability monitor, the server returns HTTP 500.
    The full URL attempting to be accessed is https://stamp2.app.insightsportal.visualstudio.com/api/WebTestResult?fileName=eProd Dx1 Api_2015-02-25T16:58:00.000Z.webtestresult&subscriptionId=REMOVEDFORSECURITYPURPOSES&resourceGroup=eprod-dx1webapi&webTestId=eprod dx1 api-eprod-dx1webapi&location=us-ca-sjc-azr&timestamp=1424883480
    I can provide the sub ID as needed.

    Thinking this is an Azure Portal issue. Please move.

  • Web test result for a URL which needs a client certificate to authenticate

    Hi,
    We want to check the response of an .asmx URL that needs a client certificate for authentication.
    I got the cert object and then passed it to the Invoke-WebRequest cmdlet, but no matter what I try I always get this error:
    "The underlying connection was closed".
    Code 1:
    $WebClient = New-Object System.Net.WebClient
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
    $WebClient.DownloadString("https://server1/mywebservices/myser.asmx")
    Code 2:
    $url = "https://server1/mywebservices/myser.asmx"
    $cert = (Get-ChildItem cert: -Recurse | where {$_.Thumbprint -eq "abcdefgh3333…..something"} | Select -First 1)
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
    #load my client certificate defined by thumbprint
    $HTTP_Request = [System.Net.WebRequest]::Create($url)
    $HTTP_Request.ClientCertificates.Add($cert )
    # We then get a response from the site.
    $HTTP_Response = $HTTP_Request.GetResponse()
    Can someone please help me?
    Thanks
    Manish

    Hi Anna
    Thanks for the reply.
    I used the function referred to earlier in my script, as below.
    It is still throwing an error. However, it does work in PowerShell v2.
    Is there any specific change required in the script to make it work with v3? I tried Invoke-WebRequest and it failed too.
    function Ignore-SSLCertificates
    {
        # Compile a small C# ICertificatePolicy that trusts every certificate.
        $Provider = New-Object Microsoft.CSharp.CSharpCodeProvider
        $Compiler = $Provider.CreateCompiler()
        $Params = New-Object System.CodeDom.Compiler.CompilerParameters
        $Params.GenerateExecutable = $false
        $Params.GenerateInMemory = $true
        $Params.IncludeDebugInformation = $false
        $Params.ReferencedAssemblies.Add("System.DLL") > $null
        $TASource = @'
namespace Local.ToolkitExtensions.Net.CertificatePolicy
{
    public class TrustAll : System.Net.ICertificatePolicy
    {
        public bool CheckValidationResult(System.Net.ServicePoint sp, System.Security.Cryptography.X509Certificates.X509Certificate cert, System.Net.WebRequest req, int problem)
        {
            return true;
        }
    }
}
'@
        $TAResults = $Provider.CompileAssemblyFromSource($Params, $TASource)
        $TAAssembly = $TAResults.CompiledAssembly
        ## Create an instance of TrustAll and attach it to the ServicePointManager
        $TrustAll = $TAAssembly.CreateInstance("Local.ToolkitExtensions.Net.CertificatePolicy.TrustAll")
        [System.Net.ServicePointManager]::CertificatePolicy = $TrustAll
    }
    $url = "https://server1/mywebservices/myser.asmx"
    $certs = Get-ChildItem Cert:\CurrentUser\My | where {
        $_.Thumbprint -eq "abcdefgh3333…..something" }
    $HTTP_Request = [System.Net.WebRequest]::Create($url)
    Ignore-SSLCertificates
    try
    {
        $HTTP_Request.ClientCertificates.Add($certs)
        # We then get a response from the site.
        $HTTP_Response = $HTTP_Request.GetResponse()
    }
    catch [System.Exception]
    {
        Write-Error $error[0].Exception
    }
    $HTTP_Status = [int]$HTTP_Response.StatusCode
    Manish

  • Saving test results for retrieval later.

    I have a program that works for testing semiconductors using GPIB control of some HP equipment. The problem is that although it looks like it saves properly (lets me choose directory and name the file), I cannot retrieve any files. When I try to retrieve, the program automatically brings up the last test saved. I am using Open/Create/Replace File, Write File, Close File and Read File. I have an event structure loop that has the options: Run Test, Save Test, Recall Data and Quit. As soon as I hit the recall button the last test saved appears on my graph and output array indicator, then the window comes up asking me for file name etc. I am using LabVIEW 7.0. Any ideas?

    I believe I am closing the refnum by using the FileClose VI. I have attached a simpler version of my program that is doing the same thing. I used local variables to create the data types for saving and recalling the files. Do I need to add the file extension when I name the file I am saving? Thank you for looking at this.
    Attachments:
    File_I-O_trial.vi 216 KB

  • Speed Tests Results for 802.11ac Wireless Connections

    Using the new Apple MacBook Air with 802.11ac wireless, I tested copying a file and a folder to both the new 802.11ac AirPort Extreme router housing a USB-connected hard disk and the less recent 802.11n Apple AirPort Extreme router housing a similar USB-connected hard disk.
    The results of the tests are summarized in the table below. The movie file was ripped from a DVD movie, and the Microsoft folder is simply the Microsoft Office 2011 folder in my Applications folder containing 14,231 items.
    The MacBook Air computer was located 6–8 feet away from each router with no intervening obstructions. While this was not a scientific test, it demonstrated to me that 802.11ac wireless is clearly superior to 802.11n in a real world setting. I assume that the lower relative performance of 802.11ac versus 802.11n for the large folder containing many files is due to overhead in copying and writing files from and to the hard disks. Ditto for the Gigabit Ethernet test.

    Great resource for speed testing: www.speedtest.net
    It will show you ping, upload, and download speeds for your connection. Try it for each connection, then post the results.

  • Load Test Results - time series request data for by URL in VS2013

    I am trying to figure out how to export and then analyze the results of a load test, but after the test is over it seems I cannot find the data for each individual request by url. This data shows during the load test itself, but after it is over it seems
    as if that data is no longer accessible and all I can find are totals. The data that I want is under the "Page response time" graph on the graphs window during the test. I know this is not the response time for every single request and is probably
    averaged, but that would suffice for the calculations I want to make. 
    I have looked in the database on my local machine (LoadTest2010, where all of the summary data is stored) and I cannot find the data I'm looking for. 
    My goal is to plot (probably in excel) each request url against the user load and analyze the slope of the response time averages to determine which requests scale the worst (and best). During the load test I can see this data and get a visual idea but when
    it ends I cannot seem to find it to export. 
    A) Can this data be exported from within visual studio? Is there a setting required to make VS persist this data to the database? I have, from under Run Settings, the "Results" section "Timing Details Storage" set to "All individual
    details" and the Storage Type set to "Database". 
    B) If this data isn't available from within VS, is it in any of the tables in the LoadTest2010 database where all of the summary data is stored?
    Thanks
    Luke

    Hi Luke,
    Since the load test is used to simulate many users accessing a server at the same time, it mainly verifies a web server under load stress.
    As you said, you want to find the data for each individual request by URL; generally we can analyze the URL requests from the Summary view.
    >> I have looked in the database on my local machine (LoadTest2010, where all of the summary data is stored) and I cannot find the data I'm looking for.
    I suggest you try adding the SQL Tracing Connect String in the Run Settings properties to trace the data.
    Reference:
    https://social.msdn.microsoft.com/Forums/en-US/74ff1c3e-cdc5-403a-b82f-66fbd36b1cc2/sql-server-tracing-in-visual-studio-load-test?forum=vstest
    In addition, you can create an Excel report to analyze the load test results; for more information:
    http://msdn.microsoft.com/en-us/library/dd997707.aspx
    Hope it helps!
    Best Regards,

  • AVK test results

    I am new to using AVK for my JSP web application.
    My Environment:
    jdk1.4
    Struts 1.2
    hibernate3
    OC4J 10g (9.0.4)
    SQL server2000
    LDAP
    Plum tree portal.
    I am new to using AVK.
    I need to generate AVK test results for a customer requirement.
    Where can I get information?
    What jars do I need to download?
    Raghu

    Ideally, I'd like to find a test plan and/or results from the test plan. I've found reference in the Java platform specification indicating that it
    "satisfies all testing requirements available from Sun relating to the most recently published version of the Specification six (6) months prior to any release of the clean room implementation or upgrade thereto..."
    but I have not found what the testing requirements are that have been satisfied. Any ideas?
    Many Thanks,
    -E

  • How to summarize all the DUT test results (serial number, high limit, test result, low limit, pass or fail) into one Excel table file, so I can just open the Excel file to see the whole DUT report? Any example would help.

    RT

    Here is an example of a sequence that opens an Excel document and creates a table. You will have to modify it slightly to save the serial number, high limit, test result, etc., but it should be a good start.
    Hope this helps
    SijinK
    National Instruments
    Attachments:
    Write_Table_to_XL_and_Create_Chart.zip 9 KB

  • Example of Test Bench for multiple FPGA I/O items

    Hi,
    I'm looking for an example on creating a Test Bench that simulates multiple FPGA I/O digital lines.  
    I've gone through the Creating Test Benches tutorial but it only uses a single I/O item.  Read I/O Item Name.vi is obviously an important part of this but I'm not sure how to structure the VI and assign the name (especially since things seem to get wired up magically for a single IO item).
    Any pointers to examples would be greatly appreciated.
    Thanks,
    Steve

    >>What kind of hardware are you using? 
    In this case, NI 9401, but we will be expanding to cover several different module types
    >>Did you look in the NI Example Finder? 
    Yes. There are two (in LabVIEW 2011) returned when searching for "testbenches", and they are very basic.
    >>Which tutorial are you talking about?
    http://zone.ni.com/reference/en-XX/help/371599H-01/lvfpgaconcepts/test_bench_tutorial/
    This shows how to use a Template from the "Execute VI on Development Computer with Simulated I/O" option on the Debugging tab of the FPGA Target Properties. The template covers a single input and it's not clear to me how to simulate more than a single input.
    Steve

  • Need information on open source testing tools for ADF web applications

    Hi experts,
    I need to investigate feasible open source Java testing tools for testing ADF web applications. I have googled a lot but am getting confused.
    My requirements are as under:
    1. The tool must be open source.
    2. It should be easy to understand and to work upon by the tester and developers.
    A Selenium-based testing approach is already in place for testing the application, but I need to find tools other than Selenium that would be suitable for testing ADF applications. Kindly let me know your inputs / suggestions.
    Thanks a lot in advance.
    Neelanand

    Hi,
    Have a look at JMeter http://jakarta.apache.org/jmeter/index.html
    1. The tool must be open source. It is.
    2. It should be easy to understand and to work upon by the tester and developers. I guess it is.
    There are some specifics in configuring it for ADF, but Chris Muir wrote a nice blog about how it's done, check it out http://one-size-doesnt-fit-all.blogspot.com/2010/04/configuring-apache-jmeter-specifically.html
    Pedja

  • Need training material / test scripts for MRP and Supply Chain module. Can anyone help?

    Can anyone out there provide me with sample test scripts they may have used for upgrading to Oracle Financials 11i? Need sample scripts for MRP and Supply Chain management. Looking for good test scenarios for testing these modules. Appreciate any help. Email at [email protected]


  • Need TEST DATA for CP_BD_DIRECT_INPUT_PLAN..

    Hi.
    I want to update a created routing with the help of the "CP_BD_DIRECT_INPUT_PLAN" function module. I am trying, but I am getting an error. Can anyone help me by providing TEST DATA for "CP_BD_DIRECT_INPUT_PLAN" for creating/changing operations?
    Or is CP_BD_DIRECT_INPUT_PLAN even a good way to update a routing?
    Please answer.
    Advance thanks,
    ~Guru
    Edited by: gtipturnagaraja on Nov 18, 2010 10:33 AM

    Hi,
    the cause of this message is a change number (rc271_di_imp-aennr) whose valid-from date differs from the valid-from date of the work plan (aenr-datuv was 01 September 2010 and plko-datuv 10 November 2010).
    After clearing the change number in the rc271_di_imp structure, this message no longer occurs.
    Regards
    Peter
    Edited by: Peter Loeff on Nov 10, 2010 4:33 PM

  • Need test documents for RAC failover Scenarios

    Hello friends...
    By the end of this week I have to produce some test documents for RAC and the database server, including Sun Cluster failover scenarios.
    Can someone guide me to a link where I can get enough help?
    I have already managed to get enough information, but I want to make sure I cover most of the topics.
    Thanks, Regards
    Monu Koshy

    Please check the following links.
    http://download-uk.oracle.com/docs/cd/B19306_01/rac.102/b14197/toc.htm
    http://download-uk.oracle.com/docs/cd/B19306_01/install.102/b14205/toc.htm
    -aijaz

  • Highly frustrated with Outlook 2013 Search People box bugs - Multiple Name Results for Same Contact & Inconsistent Results

    The Outlook 2013 "Search People" box does not function properly. It frequently displays incorrect results or a mess of duplicate results. I've reported previous issues about this and consolidating my posts into one (with screenshots this
    time). Hopefully this message will be forwarded to or seen by the Outlook programmers. It really needs to be fixed.
    Outlook 2010 and other prior versions worked perfectly. You search for name, you get ONE result with the info you're looking for. FAST AND EASY. But with Outlook 2013 Microsoft has created a heck of a mess resulting in huge frustration and productivity loss
    with such simple but important tasks.
    I have hundreds of contacts stored in my Outlook address book, and they all have COMPLETE contact info added. 
    One major issue that I'm experiencing in the new Outlook 2013 is that I now get an average of 4 or more duplicate name results appearing for the same contact. And each result contains different and incomplete contact info, making it impossible for me to quickly find the basic info I'm looking for. The cause of this issue is that Outlook 2013 now provides results not only from your local address book(s), but also shows results based on your email history and social media account setup.
    And there's no way to turn this off, or at least specify what folders and/or accounts the People Search box should use.
    To make matters worse, the Microsoft developers conveniently forgot to add some form of indicator (like a small icon beside each name result in the list) that clearly indicates which result is from which source. So you must manually click on each
    result one at a time and repeat the search until you locate the correct one.
    For one specific example, I have a contact stored in my local address book called
    Infusionsoft. When I type "Infusionsoft" in the People Search box to quickly find a phone number, Outlook  2013 shows me 7 results with the same name. See the screenshot below:
    As you can see in the screenshot above, every result just says "Infusionsoft", so I have to manually click on each name result one at a time and repeat the process until I find the correct one from my address book. This same thing happens with other
    random contacts.
    From what I can tell, Outlook is pulling results based on recent emails I've received from different people with "@infusionsoft.com" in their email address. So the first result shows "[email protected]" (just the email address), the second result shows "[email protected]", the third result shows "[email protected]" and so forth. I don't want Outlook to show all of that. I just want what's in my address book!
    And you would think that the last result would be the correct one from my address book, but no. Sometimes it's the 5th result, and other times it's the 3rd or 7th result. So there's no freaking order of things here.
    We simply need the ability to turn off searching of email history and other accounts when using the People Search box. Problem fixed.
    (And please don't tell me that I need to "link" every incorrect result to one main contact. You shouldn't expect everyone to have to tediously link any and all results that appear to a record. ESPECIALLY when 5+ results for each contact appear regularly.)
    ISSUE 2: Some names must be typed in a different way for the Search People to locate them
    Another big issue I'm having with the Search People box is that some name searches don’t show the correct result, unless I search for their names in a different way.
    For one specific example, I have a contact stored in my address book named "Dave Johnson". When I type "Dave Johnson" in the Search People box, one result appears, but it's just his email address, only. It's not the result that's stored in my Outlook address
    book with his phone number, addresses, etc. Screen shot below:
    If I type in Dave's name reverse order (Johnson Dave),  no results are found at all.
    Now if I just type in just"Johnson" all by itself, it finds Dave's correct result (the one stored in my Outlook Address Book). Along with everybody else that has "Johnson" in their name (see screenshot below)...
    I double-checked how I have Dave's name programmed in my address book, and it's in there as "Dave Johnson" for both the Full Name and File As fields.
    Also, the name order shouldn't make any difference when using the People Search Box anyway. Sometimes I can find people by Last Name, First Name or First Name, Last Name. Only with random contacts does it get difficult finding  their info and
    I have to do strange things like this to find them from the People Search box.
    ISSUE 3: Some Search People results only yield an email address only.
    For other random contacts, some search results only yield an email address with no other contact details. But I can open the persons contact card from the address book manually, with the same email address shown! Screenshot below...
    In the screenshot above, I have outlined the Search People box results in red, and the Address Book results in green. You can clearly see that "Robert White" is a contact stored in my local address book with full contact details, but the Search People result
    only shows his email address! Again, it's not consistent. It's hit or miss with different people.
    ISSUE 4: Some results just don't appear at all, but they are in the address book
    Another issue I'm experiencing with the People Search Box is that some people simply  cannot be found. But I can see their contact info just fine if I click on the "People" tab down at the bottom of the page and type in their name in the "Search Contacts"
    field. Why can't the People Search box find certain people? I opened up their contact details and cannot find a single thing  that would prevent them from showing up in results.
    These are clearly serious bugs that need to be fixed. And I'm shocked as to how this got missed--or ignored during alpha and beta testing. I see the "idea" behind the developers having the Search People box search everything outside of the
    address book, but in real world application this causes a heck of a lot of problems & confusion, and it needs to be fixed ASAP.
    For technical details, I have Outlook 2013 running on two computers using hosted Exchange 2010. One system is Windows 7 and other is Windows 8. The same problems occur on BOTH computers. As far as my Outlook account setup, I have all contacts stored in the
    main address book (no sub-folders or other folders).
    Can someone help communicate this message to the Outlook developers??? The "Frown" button limits me to 100 characters and one image. There's no way I can communicate this level of detail and steps to duplicate in 100 characters!

    Thanks for your reply.
    1) The instant search boxes in each individual page work just fine. If I am on the People page and type in a name in the "Search Contacts" field, it searches my contacts and displays the results that I want. But I should not have to leave whatever screen
    I'm in to find people now. In Outlook 2010 and earlier versions, I could be on the calendar page and then search for a contact without clicking off the calendar completely. For productivity-sake, it's a huge waste of time and hassle now.
    2) I'm familiar with how contact linking works, and quite frankly it's a huge mess in general. I NEVER create multiple contacts for the same person. I get that Outlook 2013 get confused now when it detects a LinkedIn or Facebook account for the same person
    already in my Outlook address book, but we need to have options that allow us to turn off results from some or all social networks. This is a big part of the problem.
    Think about it this way - The average person has 150+ LinkedIn connections, and more for Facebook. Many people today have accounts for both and they are setup with the same email address. When Outlook 2013 has to scan all the networks IN ADDITION to your
    local address book(s), it's a no brainer that it can get very confused trying to display results.
    Another big part of the problem is that Outlook's new search system also scans your email history. I receive emails from people who use multiple email addresses, or emails from companies with multiple reps or ticket systems that send you a unique ticket ID # ending in the same email address domain. Now Outlook displays people search results based on everything under the sun in my email history. This is beyond frustrating (see my "Infusionsoft" screenshot above in the first post).
    Again, I want to stress that for the search examples I referenced, I only have one entry in my Outlook address book for each person. And that's all I want to find when I search for people--what's already in my own address book! 
    In summary:
    We need an OPTION to turn off searching external networks when using the People Search box
    We need an option to tell Outlook to not scan email history for people search results (I think this needs to be disabled entirely actually. It's not helpful at all)
    There should be a fixed priority for displaying people search results, with local address book results FIRST, followed by social network results.
    There should be a clear icon/indicator next to each result that gives you a clue as to where the result is coming from. Your address book? Facebook? LinkedIn? We should not need to click on each result to get a hint as to where it's coming from.
    Work out the bugs in general with the new search system.
    One other thing that I didn't mention is that the Search People box also shows results for people I'm not even "friends" or connected with on the different social networks. But I've noticed that some people use the same email address for those networks that
    I already have programmed for them in my address book, which is why Outlook sometimes shows me these results. Does that make sense?
    I'll try rebuilding the index, but after testing Outlook 2013 on 3 different machines so far and seeing the same results (all slightly different results on each machine and very inconsistent), I doubt this will address the issue.

  • How to generate test data for all the tables in oracle

    I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum records for master tables and the minimum records for child tables. Data should be consistent in the columns that are used for constraints, i.e. using the same column values.
    I am planning to implement something like:
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl = minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to insert into each child table of each master table;
    all_tables where owner = schemaname;
    all_tab_columns and all_constraints - where owner = schemaname;
    using the dbms_random package.
    Does anyone have a better idea for doing this? Is this functionality already there in the Oracle DB?

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice define the basic hierarchy: table collection, tables, data; so start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. store the attribute combinations in Oracle tables
    5. Build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated (see the sketch after this list).
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
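    For illustration only, here is a minimal sketch of steps 5 and 8 combined, assuming two throwaway tables (demo_dept and demo_emp are illustrative names, not from the poster's schema) and using just dbms_random and a CONNECT BY row generator:
          -- Generate 10 parent rows with random department names.
          INSERT INTO demo_dept (deptno, dname)
          SELECT LEVEL * 10,
                 DBMS_RANDOM.STRING('U', TRUNC(DBMS_RANDOM.VALUE(5, 14)))
          FROM   dual
          CONNECT BY LEVEL <= 10;
          -- Generate 5 child rows per parent, reusing the parent keys so the
          -- foreign-key relationship stays consistent (the step 8 concern).
          INSERT INTO demo_emp (empno, ename, deptno)
          SELECT 7000 + ROWNUM,
                 DBMS_RANDOM.STRING('U', TRUNC(DBMS_RANDOM.VALUE(5, 10))),
                 d.deptno
          FROM   demo_dept d
          CROSS JOIN (SELECT LEVEL AS n FROM dual CONNECT BY LEVEL <= 5);
    A generic generator would build such statements dynamically from all_tab_columns and all_constraints instead of hard-coding the column lists.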
    The second approach I have used more often. I would it call the top-down approach and I use
    it when test data is needed for an existing system. The main use case here is to avoid
    having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    The second approach I use for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables. A BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns and a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
         TESTCASE NUMBER,
         DEPTNO NUMBER(2),
         DNAME VARCHAR2(14 BYTE),
         LOC VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns and a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look AFTER the
         test for that test case is performed.
    Each of these tables are a mirror image of the actual application table with one new column
    added that contains a value representing the TESTCASE_NUMBER.
    To create test case #3 identify or create the DEPT records you want to use for test case #3.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, add all the
    records from the BEFORE data set and add a new one for the new record.
    When you want to run test case #3 the process is basically (ignore for this illustration that
    there is a foreign key between DEPT and EMP):
    1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. insert the test data set records for SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. Compare the actual results with the expected results.
         This is done by a function that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3 (a sketch of such a comparison is given at the end of this reply).
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
    This second approach uses data the users (QA) are already familiar with, is scalable, and
    makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
    Which of these approaches makes the most sense for you?
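    As a rough illustration of the step 4 comparison (assuming the DEPT_TEST_EXPECTED layout above; in practice you would restrict the DEPT side to just the rows the test case touches), a MINUS in each direction reports the differences for test case #3:
          SELECT 'MISSING'    AS status, DEPTNO, DNAME, LOC
          FROM  (SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
                 MINUS
                 SELECT DEPTNO, DNAME, LOC FROM DEPT)
          UNION ALL
          SELECT 'UNEXPECTED' AS status, DEPTNO, DNAME, LOC
          FROM  (SELECT DEPTNO, DNAME, LOC FROM DEPT
                 MINUS
                 SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3);
    Rows flagged MISSING were expected but not found in DEPT; rows flagged UNEXPECTED are in DEPT but not in the expected set.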
