Unit testing? Help.

OK, I have tried to wrap my mind around this for a few days. Unit testing.
I understand the whole:

if (expected)
    System.out.println("OK");
else
    System.out.println("Error");

part of my class assignment, but I am still confused. Could someone explain how unit testing works? (*No, I don't want you to do it for me or anything like that.*) I am just confused about how exactly to test for these things in Java. I understand that I need to supply some kind of input and then check the return value against what is expected. However, I am not sure how to implement it.
Currently I tried something:

public static void lastIndexOfTest(MyString ms, int pos, char expected) {
     System.out.println("Testing lastIndexOf(" + pos + ") on MyString " + ms);
     System.out.println("Expected: " + expected);
     try {
          // compare what the method actually returns against the expected value
          char got = ms.charAt(pos);
          System.out.println("Got: " + got);
          if (got == expected)
               System.out.println("OK");
          else
               System.out.println("Error2");
     } catch (Exception e) {
          System.out.println("Error: " + e);
     }
}

That works in the sense that it gives me no errors. But because I don't understand exactly how the code works for everything, I am not at all sure whether it is even returning something that's correct.
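To drive it, what I'm picturing is calling it from main in the same class, using inputs where I already know the right answer (this assumes MyString can be built from a plain String, like in the assignment):

public static void main(String[] args) {
     MyString ms = new MyString("hello");
     // position 1 of "hello" should hold 'e'
     lastIndexOfTest(ms, 1, 'e');
     // position 4 should hold 'o'
     lastIndexOfTest(ms, 4, 'o');
}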
Here is my assignment:
http://www.csl.mtu.edu/cs1122/www/programs/prog1/desc.html
Are there any tutorials on how to do unit testing? I have been looking all over and can't seem to find any listed on Google.

shawnw wrote:
OK, I have tried to wrap my mind around this for a few days. Unit testing.

"Unit" == "one thing". You're testing each individual part of your program that can be considered a single functional unit (not necessarily a single class).
Although I've heard people say "unit testing" to apply to just about everything.
That works in the sense that it gives me no errors. But because I don't understand exactly how the code works for everything, I am not at all sure whether it is even returning something that's correct.

Well, obviously, knowing what's correct is necessary before you can test for correctness.
Typically, you introduce specific test data to the thing being tested, and you know the correct response for that test data.
For example, if you're testing a factorial program, you know that 3! == 6.
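In plain Java, that kind of check is nothing more than this sketch (factorial() here is a made-up method standing in for whatever you're actually testing):

public class FactorialTest {
    // hypothetical method under test
    static int factorial(int n) {
        return (n <= 1) ? 1 : n * factorial(n - 1);
    }

    public static void main(String[] args) {
        int expected = 6;
        int got = factorial(3);   // known input with a known answer
        System.out.println(got == expected ? "OK" : "Error: got " + got);
    }
}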
If you have a bit of (your own) code and you don't know what the correct behavior of that code is, then you're not done. Really, you shouldn't have started coding if you didn't know what you were trying to accomplish when you were done.
In test-driven development, you write the test before you write the code. Among other things, this ensures that you know what a bit of code is supposed to accomplish before you even start, which is a good thing.
Here is my assignment:
http://www.csl.mtu.edu/cs1122/www/programs/prog1/desc.html
Are there any tutorials on how to do unit testing?

Didn't your professor mention it in class?
I have been looking all over and can't seem to find any listed on Google.

I find that hard to believe. If I Google "unit testing tutorial" I find a bunch. But several of those are specific to particular environments, so maybe that's what's throwing you off.
The canonical unit testing framework for Java is JUnit. You might want to google for JUnit tutorials (I just did and found some).
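For a flavor of what that looks like, here is a minimal JUnit 4 style sketch (the MyString constructor and charAt behavior are assumptions based on the assignment, not anything JUnit itself provides):

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class MyStringTest {

    @Test
    public void charAtReturnsCharacterAtPosition() {
        MyString ms = new MyString("hello");   // assumes a String constructor
        assertEquals('e', ms.charAt(1));       // JUnit reports a failure if the values differ
    }
}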

Similar Messages

  • Help with XS unit tests

    We are working on some unit tests for HANA and we have two problems:
When running the tests in a browser, it uses the browser language as the language for the tests; since we are using text joins, the language may affect the results. Is it possible to force English to be used as the language independently of the browser language?
We need to do some tests that involve currency conversion. The conversions are made automatically by the HANA server. How can we isolate this behavior for use in the unit tests?
    Thanks and regards
    Jefferson

    Hello Arti,
from release 7.02 onwards there will be a special workbench tool that enables developers to execute single tests and measure their coverage. In releases < 7.02 there is NO way to provide a similar service to developers.
Measuring coverage for mass runs should be possible, however. I propose the following approach:
- create a dedicated user account (e.g. ABAPUNIT)
- create in SCOV a test group and assign the dedicated user
- schedule a mass run for the dedicated user with the help of the Code Inspector
- review the unit test results in the Code Inspector
- review the coverage in SCOV
- reset the coverage results in SCOV prior to each follow-up analysis
    Best Regards
      Klaus

  • Unit Testing, Null, and Warnings

    I have a Unit Test that includes the following lines:
    Dim nullarray As Integer()()
    Assert.AreEqual(nullarray.ToString(False), "Nothing")
    The variable "nullarray" will obviously be null when ToString is called (ToString is an extension method, which is the one I am testing). This is by design, because the purpose of this specific unit test is to make sure that my ToString extension
    method handles null values the way I expect. The test runs fine, but Visual Studio 2013 gives includes the following warning:
    Variable 'nullarray' is used before it has been assigned a value. A null reference exception could result at runtime.
    This warning is to be expected, and I don't want to stop Visual Studio 2013 from showing this warning or any other warnings, just this specific case (and several others that involve similar scenarios). Is there any way to mark a line or segment
    of code so that it is not checked for warnings? Otherwise, I will end up with lots of warnings for things that I am perfectly aware of and don't plan on changing.
    Nathan Sokalski [email protected] http://www.nathansokalski.com/

    Hi Nathan Sokalski,
    Variable 'nullarray' is used before it has been assigned a value. A null reference exception could result at runtime.
Was the warning above thrown when you built the test project, while the test itself ran successfully? I assume yes.
    Is there any way to mark a line or segment of code so that it is not checked for warnings?
There is no built-in way to exclude a specific code snippet or line from being checked during compilation, but we can configure specific warnings not to be reported during compilation in Visual Basic through project Properties -> Compile tab -> warning configurations box.
    For detailed information, please see: Configuring Warnings in Visual Basic
Another way is to correct your code logic so that the code does not generate the warning.
If I misunderstood you, please tell us, with a sample, which code you do not want checked for warnings so that we can look further into your issue.
    Thanks,

  • Unit Testing

    Hi All,
I would like to ask all of you: can anyone explain to me the concept of unit testing in FICO and how it should be drawn up? If possible, please send any documentation on this topic to my mail id: [email protected]
    Thanks with regards,
    Bala

    Hi
Refer below; reward if it helps.
Unit testing is done in bits and pieces. For example, in the SD standard order cycle we have 1 - create order, then 2 - delivery, then 3 - transfer order, then 4 - PGI and then 5 - invoice. So we will be testing 1, 2, 3, 4 and 5 separately, one by one, using test cases and test data. We will not be looking at or checking/testing any integration between order and delivery, delivery and TO, TO and PGI, and then invoice.
Whereas in system testing you will be testing the full cycle with its integration, and you will be testing using test cases which give a full cyclic test from order to invoice.
In security testing you will be testing different roles and functionalities and will check and sign off.
Performance testing refers to how much time (in seconds) it takes to perform some action, e.g. PGI. If the BPP definition says 5 seconds for PGI then it should be 5 and not 6 seconds. Usually it is done using software.
Regression testing refers to a test which verifies that some new configuration does not adversely impact existing functionality. This will be done in each phase of testing.
    User Acceptance Testing:  Refers to Customer testing. The UAT will be performed through the execution of predefined business scenarios, which combine various business processes. The user test model is comprised of a sub-set of system integration test cases.
We use different software during testing. The most commonly used are:
Test Director: used to record requirements, prepare the test plan and then record the progress. We will be logging defects that come up during these tests using different test cases.
Mercury LoadRunner: used for performance testing. This is an automated tool.
    What does the following terms means :
    - Technical Unit Testing
    - Functional Unit Testing
- Integration Testing
    - Volume Testing
    - Parallel Testing?
Technical Unit Testing = Test of some technical development such as a user exit, custom program, or interface. The test usually consists of a test data set that is processed according to the new program. A successful test only proves that the developed code works and that it performed the process as designed.
Functional Unit Testing = Test of configuration, system settings or a custom development (it may follow the technical unit testing). These usually use actual data, or data that is masked but essentially the same as a real data set. A successful test shows that the development or configuration works as designed and that the resulting data is accurate.
Integration Testing = Testing a process, development or configuration within the context of any other functions that the process, development or functionality will touch or integrate with. The test should examine all data involved across all modules and any data indirectly affected. A successful test indicates that the processes work as designed and integrate with other functions without causing any problems in any integrated areas.
Volume Testing = Testing a full data set that is either actual or masked to ensure that the full volume does not cause system problems such as network transmission problems, system resource issues, or any other systemic problem. A successful test indicates that the processes will not slow or crash the system due to a full data set being utilized.
    Parallel Testing= Testing the new system or processes with a complete data set while running the same processes in the legacy system. A successful test will show identical results when both the legacy system and new system results are compared.
I would also note that when a new implementation is being done, you will want to conduct at least one cut-over test from the old system to the new, and you should probably do several.
    What kind of testings that are carried out in testing server?
    1. Individual Testing ( Individually which we've created)
2. Regression Testing (entire process)
    3. Integration Testing ( Along with other integrated modules)
The 3 types of testing are as follows:
    1. Unit testing (where an individual process relevant to a SD or MM etc is tested)
    2. Integration testing (where a process is tested that cuts across all areas of SAP).
    3. Stress testing (where lots of transactions are run to see if the system can handle the data)
    http://www50.sap.com/businessmaps/6D1236712F84462F941FDE131A66126C.htm
    Unit test issues
The unit test tools test all SAP development work that handles business object processing for the connector. Also, the unit test tools enable you to test the interaction of your work with the ABAP components of the connector. The test tools allow you to test your development work as an online user (real-time) only.
    Note:
    It is important to understand the differences between testing the connector as an online user and testing the connector as if operating as a background user.
    The differences between testing the connector as an online user or as a background user are described as follows:
    Memory--When testing a business object, the connector must log into the SAP application.
          The connector runs as a background user, so it processes in a single memory space that is never implicitly refreshed until the connector is stopped and then restarted (therefore it is critical in business object development to clear memory after processing is complete). Since you are an online user, memory is typically refreshed after each transaction you execute.
          For more information, see Developing business objects for the ABAP Extension Module. Any problems that may occur because of this (for example, return codes never being initialized) are not detected using the test tool; only testing with the connector will reveal these issues.
    Screen flow behavior--Screen flow behavior is relevant only when using the Call Transaction API. The precise screen and sequence of screens that a user interacts with is usually determined at runtime by the transaction's code. For example, if a user chooses to extend a material master record to include a sales view by checking the Sales view check box, SAP queries the user for the specific Sales Organization information by presenting an additional input field. In this way, the transaction source code at runtime determines the specific screen and its requirements based on the data input by the user. While the test tool does handle this type of test scenario, there is a related scenario that the test tool cannot handle.
          SAP's transaction code may present different screens to online users versus background users (usually for usability versus performance). The test tool only operates as an online user. The connector only operates as a background user. Despite this difference, unit testing should get through most of the testing situations.

  • Unit testing or integration testing?

    Here it says that this example is more of an explanation of framework than best practice because of the conflation of integration/unit testing concepts: http://docs.flexunit.org/index.php?title=Using_Asynchronous_Startup
My question is, how do you draw the line here between unit testing and integration testing? What kinds of asynchronous tests of UI components are then considered to be unit tests rather than integration tests? Many books recommend having 100% code coverage with unit tests. What would this example look like if it were considered only a unit test and not an integration test?

The question about which levels of testing to do is really dependent upon you. Let me explain a bit more.
    If you are building an application, and only an application, then functional testing probably covers most of your needs. In an application environment, you have a relatively narrow pathway of options and interactions that a user might follow. So long as you test those options well, your app will probably work as well.
    Where I see the real value of unit testing is testing for the unknowns. If you have a class that will be used in multiple applications in multiple ways, or perhaps it is intended to be a reusable class where others in your organization will use it in the future, then you want lower level testing. You want to have the best coverage you can because you want to know that, regardless of the way someone chooses to use the code, it will likely function or at least fail gracefully.
The same is true for integration. If I am creating a Flex component, then I can unit test part of it, but not all. So, I want to use integration tests to ensure that it works in as many circumstances as possible. Further, Flex components have a life cycle, so testing them when they are not on the display list isn't really a valid test.
So, to summarize, I see unit and integration testing with Flex as being about testing possibilities and trying to achieve realistic code coverage for known and possible use cases. I see functional testing as testing those components in very specific use cases.
Which do you need? It depends. If you are developing reusable components across (n) projects, you likely need the first two and maybe the last. If you are an application developer testing that an application works in documented business cases, just functional testing will likely work.
    Hope that helps,
    Mike

  • Unit Testing  approach

    Hi,
We have done a technical migration of value-based roles to derived roles, and are facing problems designing the unit testing approach for the same. Can you please suggest what the unit testing approach should be and how to create test cases for authorizations, specifically for derived roles created from value-based roles?
The goal is that, after testing, end users should not notice any changes from the new roles approach.
    Thanks.
    Regards,
    Swapnil
    <removed_by_moderator>
    Edited by: Julius Bussche on Oct 7, 2008 3:40 PM

    Hi Swapnil,
The testing of security roles needs to be taken in a two-step approach:
    Step 1 Unit Testing in DEV
A. Prepare the test cases for each of the derived roles and ensure that your main focus is to see whether you are able to execute all the tcodes that have been derived from the parent role without authorization errors. You also need to verify that each of the derived roles is applicable to its respective org-level values.
B. Because there will not be enough data in DEV (except in some cases where you have a fresh refresh of PROD data), it is always advisable to do the actual testing of the roles in QA. The goal here is to see if you are able to perform a dry run of all tcodes/reports/programs that belong to the roles.
C. You may create fewer unit test IDs, as you only assign one ID to one role, and once that role is tested you can assign the same ID to another role.
    Step 2 Integration Testing in QA
    A. Prepare the Integration Test cases for each of the Derived roles. Here most likely the testing will be performed by the  end users/Business Analysts in that respective Business Process. Each test case must reflect the possible Org level Authorization Objects and Values that need to be tested.
B. As integration testing is a simulation of the actual production authorizations scenario, care must be taken when creating multiple integration test user IDs, assigning them the right roles, and sending the IDs to the end users to perform the testing in QA.
C. The objective here is that the end user must feel comfortable with the test cases and perform both positive and negative testing. Testing results must be captured and documented for any further analysis.
    D. In an event of any authorization errors from Integration testing, the authorization errors will be sent to the Security team along with SU53 screenshots. The roles will be corrected in DEV and transported back to QA and the testing continues.
E. Also, the main objective of integration testing would be to check whether the transactions reflect the right data when executed; any mismatch in the data will be a direct indication that the derived roles do not contain the right org-level values.
    Hope this helps you to understand how testing of Security roles (Derived) is done at a high level.
    Regards,
    Kiran Kandepalli.
    Edited by: Kiran Kandepalli on Oct 7, 2008 5:47 AM

  • Unit testing in J2EE environment

    Hi All:
We have been trying to use JUnit for creating unit test scripts, and have been a bit successful in unit testing DAOs and value objects - but the problem is testing components like EJBs and Servlets, or even classes like Actions or Commands. Since these components run under the application server environment, I am not sure how to unit test them without either deploying them on the actual server or simulating the app server and the rest of the system.
I was wondering if people could share their experience in writing unit test scripts, especially for J2EE components - like Servlets, JSPs and EJBs. On a similar note, is there any similar tool or API for creating integration test scripts?
    Thanks,

Well - we already have a couple of other servers, but those are for beta, QA and integration testing. The problem is with the unit testing. In unit testing, some piece of code is tested by itself. And I am not sure how I can unit test some of the classes like Servlets, EJBs, JSPs or even dependent classes like Commands or Actions. For example, consider the following Action class from Java Pet Store:
public final class CartHTMLAction extends HTMLActionSupport {

    public Event perform(HttpServletRequest request)
            throws HTMLActionException {
        // Extract attributes we will need
        String actionType = (String) request.getParameter("action");
        HttpSession session = request.getSession();
        // get the shopping cart helper
        CartEvent event = null;
        if (actionType == null) return null;
        if (actionType.equals("purchase")) {
            String itemId = request.getParameter("itemId");
            event = new CartEvent(CartEvent.ADD_ITEM, itemId);
        } else if (actionType.equals("remove")) {
            String itemId = request.getParameter("itemId");
            event = new CartEvent(CartEvent.DELETE_ITEM, itemId);
        } else if (actionType.equals("update")) {
            Map quantities = new HashMap();
            Map parameters = request.getParameterMap();
            for (Iterator it = parameters.keySet().iterator();
                 it.hasNext(); ) {
                String name = (String) it.next();
                String value = ((String[]) parameters.get(name))[0];
                final String ITEM_QTY = "itemQuantity_";
                if (name.startsWith(ITEM_QTY)) {
                    String itemID = name.substring(ITEM_QTY.length());
                    Integer quantity = null;
                    try {
                        quantity = new Integer(value);
                    } catch (NumberFormatException nfe) {
                        quantity = new Integer(0);
                    }
                    quantities.put(itemID, quantity);
                }
            }
            event = CartEvent.createUpdateItemEvent(quantities);
        }
        return event;
    }
}
In order to unit test the above class from, say, a JUnit test class, I will have to pass an HttpServletRequest object to its method - I will also have to set the corresponding command parameters in the request object - how would you write a unit test for the above class?
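A minimal sketch of one common way to do this, assuming a mocking library such as Mockito and JUnit are available on the classpath, is to hand the action a stubbed HttpServletRequest instead of running in a container:

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
import static org.junit.Assert.assertNotNull;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpSession;
import org.junit.Test;

public class CartHTMLActionTest {

    @Test
    public void purchaseActionProducesAddItemEvent() throws Exception {
        // stub the servlet request so no application server is needed
        HttpServletRequest request = mock(HttpServletRequest.class);
        when(request.getParameter("action")).thenReturn("purchase");
        when(request.getParameter("itemId")).thenReturn("EST-1");
        when(request.getSession()).thenReturn(mock(HttpSession.class));

        CartEvent event = (CartEvent) new CartHTMLAction().perform(request);

        assertNotNull(event);   // the action should translate the request into a CartEvent
    }
}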

  • Unit Testing with Microsoft Sharepoint Emulators and Fakes with Visual Studio 2013

    Hi All,
I have created a test project and am now creating test cases for SharePoint. I found a link on MSDN which suggests using the Fakes framework, but it supports VS2012 and I am using Visual Studio 2013.
So how can I use it with VS2013, or is there any other way in which I can implement the test cases with VS2013?
    Please suggest.
    Thanks in advance.
    Himanshu Nigam

    Hi HimanshuNigam,
According to your description, my understanding is that you want to use the Fakes framework to create test cases for a SharePoint project in Visual Studio 2013.
    If you want to test using Fakes Framework, you can use the codeplex extension to achieve it. It supports Visual Studio 2013.
    Here is a detailed article for your reference:
    Better Unit Testing with Microsoft Fakes
    About how to include the Nuget package, you can use the package with the link below:
    NuGet Package Manager for Visual Studio 2013
    Installing NuGet
If you still have questions about this issue, I suggest you create a post in the Visual Studio forum; more experts will help you there and you can get more detailed information:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/home?category=visualstudio%2Cvsarch%2Cvsdbg%2Cvstest%2Cvstfs%2Cvsdata%2Cvsappdev%2Cvisualbasic%2Cvisualcsharp%2Cvisualc
    Best Regards
    Zhengyu Guo
    TechNet Community Support

  • Unit Testing and APEX Global Variables

    We've recently started to unit test our database level PL/SQL business logic.
    As such we have a need to be able to simulate or provide output from PL/SQL APEX components in order to facilitate testing of these components.
    Some of the most obvious portions that need simulation are:
    1. The existence of a session
    2. The current application ID
    3. The current page ID.
    We currently handle requirement #1 by using apex_040100.wwv_flow_session.create_new
    We handle 2 and 3 using the apex_application.g_flow_id and g_flow_step_id global variables.
    I'm just wondering, how safe is it for us to use wwv_flow_session.create_new to simulate the creation of a session at testing time for those things which need a session?
    I've also noticed that there are apex_application.get_application_id and apex_application.get_page_id functions whose output is not tied to the global variables (at least in our current version).
    Is it safe for us to expect that we can set these global variables for use in testing or is apex moving to get_application_id and get_page_id functions away from global variables?
    Will there be corresponding set_application_id and set_page_id functions in the future?
    Sorry for the question bomb. Thanks for any help.

    My first question would be why do you need to establish a session to test your PL/SQL?
    wwv_flow_session is a package internal to APEX, and you should probably leave it be.
    The get_application_id procedure you refer to is in apex_application_install, which is used for scripting installation of applications - not get/set of page ID like you're describing.
    If you're uncomfortable using apex_application.g_flow_id, you can use v('APP_ID') or preferably pass the app_id/page_id as parameters to your procedures.
    Your question seems to have a few unknowns, so that's the best I can describe.
    Scott

  • Unit testing and integration testing

    hello 2 all,
What is the difference between unit and integration testing? In SAP, what does unit testing consist of, and what does integration testing consist of?
Is this the work of test engineers, or whose work is it?
    take care
    love ur parents

    Hi Sameer,
    Unit Testing
    A unit test is a procedure used to validate that a particular module of source code is working properly from each modification to the next. The procedure is to write test cases for all functions and methods so that whenever a change causes a regression, it can be quickly identified and fixed. Ideally, each test case is
    separate from the others; constructs such as mock objects can assist in separating unit tests. This type of testing is mostly done by the developers and not by end-users.
    Integration testing
    Integration testing can proceed in a number of different ways, which can be broadly characterized as top down or bottom up. In top down integration testing the high level control routines are tested first, possibly with the middle level control structures present only as stubs. Subprogram stubs were presented in Section 2 as incomplete subprograms which are only present to allow the higher level control routines to be tested. Thus a menu driven program may have the major menu options initially only present as stubs, which merely announce that they have been successfully called, in order to allow the high level menu driver to be tested.
    Top down testing can proceed in a depth-first or a breadth-first manner. For depth-first integration each module is tested in increasing detail, replacing more and more levels of detail with actual code rather than stubs. Alternatively breadth-first would proceed by refining all the modules at the same level of control
    throughout the application. In practice a combination of the two techniques would be used. At the initial stages all the modules might be only partly functional, possibly being implemented only to deal with non-erroneous data. These would be tested in breadth-first manner, but over a period of time each would be
replaced with successive refinements which were closer to the full functionality. This allows depth-first testing of a module to be performed simultaneously with breadth-first testing of all the modules. The other major category of integration testing is bottom up integration testing where an individual module is
    tested from a test harness. Once a set of individual modules have been tested they are then combined into a collection of modules, known as builds, which are then tested by a second test harness. This process can continue until the build consists of the entire application.
In practice a combination of top-down and bottom-up testing would be used, whether in a large software project being developed by a number of sub-teams or in a smaller project where different modules are built by individuals. The sub-teams or individuals would conduct bottom-up testing of the modules which they were
    constructing before releasing them to an integration team which would assemble them together for top-down testing.
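As a rough sketch of the stub idea in Java (all names here are invented for illustration), a high-level menu driver can be tested before the real module behind it exists:

// Module the menu depends on; the real implementation is not written yet.
interface ReportModule {
    void printMonthlyReport();
}

// Stub that only announces it was called, so the menu logic can be tested first.
class ReportModuleStub implements ReportModule {
    public void printMonthlyReport() {
        System.out.println("ReportModule stub called successfully");
    }
}

class MenuDriver {
    private final ReportModule reports;

    MenuDriver(ReportModule reports) {
        this.reports = reports;
    }

    void select(int option) {
        if (option == 1) {
            reports.printMonthlyReport();   // later replaced by the real module
        }
    }
}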
    I think this will help.
    Thanks ,
    Saptarshi

  • Unit testing and Issue on ODS

    Hi every one,
I was asked to do unit testing on all the objects I created, which includes data loading, checking the structure, etc. If anyone has any template for unit testing please forward it to me: [email protected]
My ODS is designed without checking "Activate ODS data immediately", and we are not using process chains here.
While testing data loads, whenever we send data to that ODS, how do you activate it through jobs automatically?
If we use Job selection it may not work out right, because we don't know when we load data. So can I make use of events, so that whenever data comes to the ODS it gets activated and pushes that data to the other ODS? Can anyone throw some light on events and what event to use for automatic activation?
If anyone has any template please forward it to the above ID.
    Thanks in advance

    BW
Doing a unit test is simple. Just document whatever steps you follow while loading your ODS. You could follow these steps:
1. Check the functionality of your InfoSource, whether it is full or delta - take a screenshot.
2. Check the delta load functionality if you have any delta extractors.
3. Schedule the InfoPackage.
4. Go to Manage on the data targets and see whether the ODS got loaded successfully or not.
5. Check whether it is activated or not. In your case it will automatically get activated since you selected automatic activation.
    Don’t forget to take screen shots of each Process
    Hope this helps
    Thanks
    Sat

  • Unit Testing and Oracle Sessions

    We have an issue regarding Oracle session and SQL Developer Unit Testing.
    Every time we run a Unit Test in SQLDeveloper a new Oracle session is created. When closing that Unit Test, the session, however, is not disconnected.
    And the only way to close that "Unit Test" session is to close SQLDeveloper.
    This is causing problems with the no. of sessions available to developers.
    Any help would be much appreciated.
    Subboss

    The focus of this forum is report design. The actual testing of reports would be subject to your own internal policies and procedures.
    Jason

  • Suggestions requested for Unit Testing process and build processes.

    Hi All,
We are using WebLogic WorkShop 8.1 SP2 to build our WebApp. One thing I am trying to get together is a "Best Practises" list for aspects of WorkShop development, particularly unit testing, continuous build methodology, source control management, etc. I have been through the "Best Practises Guide" that comes in WorkShop help, but it doesn't address these issues. This could help us all for future projects.
1) Could anyone give pointers on how to perform unit testing using either JUnit/JUnitEE in the WorkShop realm, given that Controls cannot be accessed directly from PO test classes?
2) For a project of, say, 5 developers, does it make sense to have a nightly build using tools like CruiseControl? We use CVS for our source control and it's working out pretty well, but we currently have no unit tests that can be run after the build and that can provide some reports on what broke/what didn't.
I am sure we all would appreciate any suggestions and more questions on this topic.
Thanks,
Vik.

Hi, Chris,
Can you perhaps explain your solution in greater detail? I am really curious to find a way to test controls.
"Chris Pyrke" <[email protected]> wrote:
>
I have written (well, it's a bit of a dirty hack really) something that lends itself to the name ControlTest (it unit tests controls). It's a blend of JUnit and Cactus with some of the source of each brutalised a bit to get things to work (not much - it was a couple of hours' work, when I was supposed to be doing something else).
To write a control test you code something like...

package com.liffe;

import com.liffe.controlunit.ControlTestCase;
import controls.Example;

public class TestExample extends ControlTestCase {

    private Example example = null;

    public void setUp() {
        example = (Example) getControl("example");
    }

    public void testExample() {
        this.assertNotNull(example);
        String result = example.getData();
        assertEquals(result, "Happy as Larry");
    }
}
Other tasks required to set up a test are creating a web project with a jpf which needs some cut-and-paste code (14 lines) in its begin method, and a jsp which is also cut and paste (5 lines) - i.e. create a standard web project and paste in 2 pieces of code.
In the web project you need to create a control (A) with an instance name of controlContainer (if it's called something else the pasted-in code will need changing to reflect that). In this control you need to put an instance of TestContainerImpl and any controls that need testing.
You then need to add a method to the control (A) that looks like…

/**
 * @common:operation
 */
public String controlTestRun(String theSuiteClassName, boolean xsl) {
    container.setControl("example", example);
    return container.controlTestRun(theSuiteClassName, xsl);
}
You need to call container.setControl for each control being tested, and the object 'container' is the instance name of the TestContainerImpl that was put in.
There are 4 jars (junit, cactus etc.) that go in the library. You will also need the ControlUnitBase project (or maybe just its jar).
To use it you call a URL like:
http://localhost:7001/TestWeb/Controller.jpf?test=com.liffe.TestExample
TestWeb is the name I gave to my web project - this will be different for each test project.
com.liffe.TestExample is the class above and will therefore be different for each test case.
You can also call
http://localhost:7001/TestWeb/Controller.jpf?test=com.liffe.TestExample&xsl=true
(note the extra &xsl=true) and the browser will (if it can) render it more prettily.
This seems to do the job quite nicely, but there are several caveats I would hope someone from BEA would be able to address before I start using it widely.
1) To access the control you need to create it (e.g. as a subcontrol in the control (A) above). To get it into the test case you need to pass it round as an Object (you can't return a Control from a control operation). As it's being passed around among plain Java (POJO) classes, I'm assuming that the control remains in the control container context, so this is OK. It seems to work, and the Object is some form of proxy, as any control seems to be reproxied before the control is invoked from the test case.
2) If I'm testing controls called from a JPD then they either need to be in a control project (with my test cases called from a web project), which makes for a large increase in project numbers (we already have this and are trying to resist it). To avoid this - as a process project is a brain-damaged web project - I simply perform some brain surgery and augment the process project with some standard files found in any old web project. This means I can call the test JPF from a browser. This seems nasty; is there a better way?
3) I would like to be able to deliver without the test code. At worst the code can be in place but it must be inaccessible in a production environment. I don't know how best to do this - any suggestions (without creating lots of projects, or lots of manual effort)?
If anyone has read this far I would ask the question: does this seem like the kind of thing that would be useful?
Hopefully a future version of WorkShop will have something to enable unit testing and this hacking will be unnecessary.
Could someone from BEA tell me if this is a dangerous way to do things?
Chris
    "vik" <[email protected]> wrote:
    Hi All,
    We are using WebLogic WorkShop 8.1 SP2 to build our WebApp. One thing
    I am trying
    to get together is a "Best Practises" list for aspects of WorkShop developement,
    particularly Unit Testing, Continous Build methodology, source control
    management
    etc.I have been through the "Best Practises Guide" that comes in WorkShop
    help,
    but it doesnt address these issues.This could help us all for future
    projects.
    1)Could anyone give pointers on how to perform Unit Testing using either
    JUnit/JUnitEE
    in the WorkShop realm, given that Controls cannot be accessed directly
    from PO
    Test classes.
    2)For a project of size say 5 developers ,does it make sense to have
    a nightly
    build using tools like CruiseControl?We use CVS for our source control
    and its
    working out pretty well, but given that we currently have no Unit Tests
    that can
    be run after the build and that can provide some reports on what broke/what
    didnt?
    I am sure we all would appreciate any suggestions and more questions
    on this topic.
    Thanks,
    Vik.

  • Problem creating unit testing repository (and workaround)

I tried to create the unit testing repository with a user that had all the required system privileges granted through a different role (other than the RESOURCE and CONNECT roles), and who wasn't explicitly granted CREATE VIEW, but inherited it through the custom role. However, the check in SQL Developer is explicitly for the RESOURCE and CONNECT roles (regardless of what privileges they have granted to them in whichever database version, past, present or future) and for the CREATE VIEW privilege granted directly to the user:
    SELECT DECODE (roles.required_role_count + privs.required_priv_count,
                   3, 1,
                   0)
              AS basic_privs_granted
      FROM (SELECT COUNT (*) AS required_role_count
              FROM user_role_privs
             WHERE granted_role IN ('CONNECT', 'RESOURCE')) roles,
           (SELECT COUNT (*) AS required_priv_count
              FROM USER_SYS_PRIVS
             WHERE privilege IN ('CREATE VIEW')) privs

    I found this SQL contained within the sqldeveloper/extensions/oracle.sqldeveloper.unit_test.jar in an XML file, oracle/dbtools/unit_test/manage_user/UserSql.xml, and replaced the SQL with one that actually checks for the specific system privileges:
    SELECT DECODE (required_sys_priv_count,
                   10, 1,
                   0)
              AS basic_privs_granted
      FROM
           (SELECT COUNT(*) AS required_sys_priv_count FROM
             (SELECT privilege
                FROM role_sys_privs
               WHERE privilege IN ('CREATE CLUSTER','CREATE INDEXTYPE','CREATE OPERATOR','CREATE PROCEDURE','CREATE SEQUENCE','CREATE SESSION','CREATE TABLE','CREATE TRIGGER','CREATE TYPE','CREATE VIEW')
              UNION
              SELECT privilege
                FROM user_sys_privs
               WHERE privilege IN ('CREATE CLUSTER','CREATE INDEXTYPE','CREATE OPERATOR','CREATE PROCEDURE','CREATE SEQUENCE','CREATE SESSION','CREATE TABLE','CREATE TRIGGER','CREATE TYPE','CREATE VIEW')))

    This enabled me to bypass that check when creating the repository. I don't know if all these privileges are actually required by the unit testing repository or not, so maybe the above list can be reduced...
    Edited by: RDB on Mar 1, 2013 2:24 PM

    Hi Sai Vineeth,
    First of all, I appreciate your elaborate way of framing the question. That itself helps others to comment.
    On the first two questions, I am sure you'll be able to follow the answers provided and take care of your concerns.
On your question 3), here is an additional piece of information that should bring a smile to your face.
    Whether the Test is to happen in DEV or QAS or whatever 'System Role' is dependent on the Test Plan.
    So, if you have two distinct Test Plans, one for Unit Tests and the other for Integration Tests, which should be the case, as usual, you can certainly point the System Role to DEV or QAS as required.
    Feel free to pose any other query as well.
    Best regards,
    Srini

  • Unit testing in nitrox

    I am new to eclipse and nitrox. Have been building webapps for years with IDEA.
1) I find info in the Eclipse help concerning JUnit-based testing, but I find nothing about unit testing in the NitroX help. The webapps I am trying to import and build in NitroX must support unit testing. The Eclipse JUnit-based help does not work for me in a NitroX webapp.
2) I have created several types of new webapps in NitroX, accepting the default parameters in the wizard, hoping to see how NitroX would lay out a new development environment. None of the results comes close to the development structure described in the Eclipse help. I find a src folder inside the WEB-INF folder. This does not make sense to me, and I can find no explanation of what NitroX expects with regard to project structure other than the existence of WEB-INF just below the webapp context folder. My ultimate goal is to use both NitroX and IDEA accessing the same CVS repositories. So far I have imported several existing webapps via 'folder import' of the source tree, but NitroX does not 'see' my testsrc or testbuild branches and I cannot run tests except via my own custom Ant scripts. Any suggestions?
    Thanks

    1) I find info in eclipse help concerning Junit based testing, I find nothing about unit testing in NitroX help. The webapps I am trying to import and build in NitroX must support unit testing. Eclipse Junit based help does not work for me in NitroX webapp.
    If you have a Java class that extends JUnit, then in Eclipse, when you're in the Java-perspective, you can right-click on that class, "Run As..." and select "JUnit Test." That'll give you the JUnit Green or Red bar based on whether the tests succeeded/failed, etc.
    NitroX = Eclipse + Struts functionality. Anything you could do in Eclipse, you can do using NitroX.
    You could, for example, switch to the Java-perspective and perform JUnit testing exactly as in Eclipse without NitroX. Or, if you're in NitroX's "Web" perspective, you can go into Package Explorer and run JUnit tests from there.
    Re: 2)
    NitroX puts a "src" folder in your WEB-INF and sets that to be a source code folder, probably so when the application is deployed on the server, the source code is also on that server (in the isolated-from-the-web WEB-INF directory), which can be beneficial for several reasons, including your development environment and CVS server crashing.
    You could delete that folder, though, and use a folder you created yourself as a source code folder for that project. You'd just have to make sure that when you designate that source code folder (via: right-click on the project -> Properties -> Java Build Path -> Source Folders), you designate the output target of that folder to be /WEB-INF/classes. That'll make sure that, regardless of where your .java files are, your .class files will be placed in the appropriate folder inside your web application structure.
    Regarding NitroX not "see"ing your testsrc or testbuild branches (I assume you mean "folders" instead of "branches" ...), I'm assuming you're using NitroX's "Web" perspective, and looking for those folders in the AppXplorer window. Since your JUnit tests are not a part of the Struts application, you won't find them there -- instead, click over to the Package Explorer view, or switch over to Eclipse's Java Perspective, where everything will behave exactly as you'd expect.
    I hope this helps ..
