Best practice for creating test classes

I am quite new to Java, using NetBeans 6.1 with JDK 1.6, and for some of the functions I created I want to write test classes that run automated tests to check consistency when I make changes.
I read up on this topic but got confused by the different approaches depending on the JDK version / NetBeans version used, and I found very different-looking examples on the net.
Now I am unsure how to continue. Any hint would be helpful,
Thanks,
Martin.

georgemc wrote:
Can you expand on that? I suspect you're talking about JUnit < 4 vs JUnit 4, since JUnit 4 only works with Java versions from 5 onwards.

Yes, I am talking about JUnit. In NetBeans the default was 3.8.2, if I remember right, and I installed 4.1.
The problem is where to start: creating a JUnit Test, a Test for Existing Class, or a Test Suite.
When I try to create tests for an existing class, NetBeans generates something like this with 4.1:
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.*;

public class TestsTest {

    public TestsTest() {
    }

    @Before
    public void setUp() {
    }

    @After
    public void tearDown() {
    }
}
but this with 3.8.2:
import junit.framework.TestCase;

public class TestsTest extends TestCase {

    public TestsTest(String testName) {
        super(testName);
    }

    protected void setUp() throws Exception {
        super.setUp();
    }

    protected void tearDown() throws Exception {
        super.tearDown();
    }
}
So first of all, the two versions seem to work completely differently. The older version seems more understandable to me as a beginner. Is it "safe/stable" to use the newer JUnit already? How good is the support in NetBeans for the newer version - I mean, do I have to expect problems if I don't go with the older one?
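
For reference, here is a minimal sketch of what a filled-in JUnit 4 test might look like, assuming a hypothetical Calculator class with an add(int, int) method (both names are placeholders, not part of the project above):

import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.*;

public class CalculatorTest {

    private Calculator calculator;

    @Before
    public void setUp() {
        // Runs before each test method, replacing the setUp() override of JUnit 3.
        calculator = new Calculator();
    }

    @Test
    public void addReturnsSumOfTwoNumbers() {
        // Any public void method annotated with @Test is run by JUnit 4;
        // no "test" name prefix and no TestCase superclass are required.
        assertEquals(5, calculator.add(2, 3));
    }
}

In JUnit 4 the annotations replace the naming conventions and the TestCase base class of 3.8.2; since the JUnit 4 jar still ships the junit.framework compatibility classes, old-style tests usually keep running alongside the new ones.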

Similar Messages

  • BEST PRACTICES FOR CREATING DISCOVERER DATABASE CONNECTION -PUBLIC VS. PRIV

    I have enabled SSO for Discoverer. So when you browse to http://host:port/discoverer/viewer you get prompted for your SSO
    username/password. I have enabled users to create their own private
    connections. I log in as portal and created a private connection. I then from
    Oracle Portal create a portlet and add a discoverer worksheet using the private
    connection that I created as the portal user. This works fine... users accessing
    the portal can see the worksheet. When they click the analyze link, the
    users are prompted to enter a password for the private connection. The
    following message is displayed:
    The item you are requesting requires you to enter a password. This could occur because this is a private connection or
    because the public connection password was invalid. Please enter the correct
    password now to continue.
    I originally created a public connection... and then followed the same steps from Oracle Portal to create the portlet and display the
    worksheet. The worksheet is displayed properly from Portal; when users click the
    analyze link they are taken to Discoverer Viewer without having to enter a
    password. The problem with this is that when a user browses to
    http://host:port/discoverer/viewer they enter their SSO information and then
    any user with an SSO account can see the public connection...very insecure!
    When private connections are used, no connection information is displayed to
    SSO users when logging into Discoverer Viewer.
    For the very first step, when editing the Worksheet portlet from Portal, I enter the following for Database
    Connections:
    Publisher: I choose either the private or public connection that I created
    Users Logged In: Display same data to all users using connection (Publisher's Connection)
    Users Not Logged In: Do not display data
    My question is what are the best practices for creating Discoverer Database
    Connections.
    Is there a way to create a public connection, but not display it at http://host:port/discoverer/viewer?
    Can I restrict access to http://host:port/discoverer/viewer to specific SSO users?
    So overall, I want roughly 40 users to have access to my Portal Page Group. I then want to
    display portlets with Discoverer worksheets. Certain worksheets I want to have
    the ability to display the analyze link. When the SSO user clicks on this they
    will be taken to Discoverer Viewer and prompted for no logon information. All
    SSO users will see the same data...there is no need to restrict access based on
    SSO username...1 database user will be set up in either the public or private
    connection.

    You can make it happen by creating a private connection for the 40 users via a CAPI script, and when creating the portlet select the second option in the Users Logged In section. That way the portlet uses their own private connection every time a user logs in, so it won't ask for a password.
    Another thing: there is an option to require entering the password or not in ASC, in the Discoverer section, if your version is 10.1.2.2. Let me know if you need more information.
    Thanks,
    Kiran

  • Best practice for the test environment & DBA plan activities documents

    Dears,
    In our company, we did the sizing for the hardware.
    We have three environments (Test/Development, Training, Production).
    But the test environment has fewer servers than the production environment.
    My question is:
    What is the best practice for the test environment?
    (Are there any recommendations from Oracle related to this, e.g. any PDF files that would help me?)
    Also, can I have a detailed document regarding the DBA plan activities?
    I appreciate your help and advice.
    Thanks
    Edited by: user4520487 on Mar 3, 2009 11:08 PM

    Follow your build document for the same steps you used to build production.
    You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin

  • Best practice for creating a new email address in Exchange Server 2010 for a SharePoint library

    Hi,
    Please advise if there is any best practice for the above issue?
    Thanks 
    srabon

    Hi Srabon,
    Hope these are what you want.
    Use a cmdlet to Create a User account and Mailbox in Exchange 2010
    http://technet.microsoft.com/en-us/magazine/ff381465.aspx
    Create a Mailbox for an Existing User
    http://technet.microsoft.com/en-us/library/aa998319(v=exchg.141).aspx
    Thanks
    Mavis
    Mavis Huang
    TechNet Community Support

  • What is the best practice for creating primary key on fact table?

    What is the best practice for a primary key on a fact table?
    1. Using composite key
    2. Create a surrogate key
    3. No primary key
    In the documentation, I can only find: "From a modeling standpoint, the primary key of the fact table is usually a composite key that is made up of all of its foreign keys."
    http://download.oracle.com/docs/cd/E11882_01/server.112/e16579/logical.htm#i1006423
    I also found a relevant thread that states a primary key on the fact table is necessary:
    Primary Key on Fact Table.
    But if no business requirement demands uniqueness of the records and there is no materialized view, do we still need a primary key? Is there any other bad effect if there is no primary key on the fact table? And are there any benefits from not creating one?

    Well, the natural combination of the dimensions connected to the fact would be a natural primary key, and it would be composite.
    Having an artificial PK might simplify things a bit.
    Having no PK leads to a major mess. A fact should represent a business transaction or some general event. If you're loading data, you want to be able to identify the records that are processed. Also, without a PK, if you forget to create a unique key, access to this fact table will be slow. Plus, having no PK means that if you want to use different tools, like the Data Modeller in JBuilder or OWB's insert/update functionality, they won't work, since there's no PK. Defining a PK for every table is good practice. Not defining a PK is asking for a load of problems, from performance to functionality and data quality.
    Edited by: Cortanamo on 16.12.2010 07:12

  • What is the best practice for creating master pages and styles with translated text?

    I format translated text all the time for my company. I want to create a set of master pages and styles for each language and then import those styles into future translated documents. That way, the formatting can be done quickly and easily.
    What are the best practices for doing this? As a company this has been tried in the past, but without success. I'd like to know what other people are doing in this regard.
    Thank you!

    I create a master template that is usually void of content, except that I define as many of the paragraph styles as I believe can/will be used, with examples of their use in the body of the document--a style guide for that client. When beginning a new document for that client, I import those styles from the paragraph styles panel.
    The exception to this is when, in a rush, I begin the documentation first and then start new work. In the new work, I still pull in the defined paragraph and/or object styles via their panels.
    There are times I need new styles. If they have broader applicability than a one-off instance or publication, then I open the style template for that client, import the new style(s) from the publication containing them, and create example paragraphs and usage instructions.
    Take care, Mike

  • Best Practices for creating reports/Dashboards from BW systems

    Hi Gurus,
    What are the best practices for creating BO dashboards / Xcelsius from BW systems?
    Prasad

    You can use the BICS connector that leverages BW queries directly.  It is listed in the Connection Manager as "SAP NetWeaver BW Connection".  You will need both the ABAP and Java stack and SSO configured between the two.  You will also need to have SAP GUI and BEx installed on the machine you are doing development on.  Note that dashboards using this connection can only be hosted in NW Portal for the time being until the next release of BI 4.x platform.
    Here are some links on getting started with the BICS connector:
    [Building Fast and Efficient Dashboards with BW and Xcelsius|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d0ab8cce-1851-2d10-d5be-b5147a651c58]
    [Requirements for BICS|http://wiki.sdn.sap.com/wiki/display/BOBJ/prerequisitestoXcelsiusandSAPNetWeaverBW+Connection]

  • Best practice for creating JCO destinations

    Hi All,
    I have a project which uses 10 to 12 BAPIs. Which is the best practice:
    1) Create 10 JCO destinations, one for each BAPI, or
    2) Create one JCO destination and use it for all BAPIs?
    Can someone tell me which is the best practice, and what are the advantages and disadvantages?
    Regards,
    Rajini.

    Hi,
    These docs help you get an idea of the JCo best practices (see also the small sketch after the links):
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/705f2b2e-e77d-2b10-de8a-95f37f4c7022
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/85a483cb-0d01-0010-2990-c5168f01ce8a
    Regards,
    ramesh
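    As a rough illustration of option 2 (not specific to the Web Dynpro JCo destinations covered in those docs), a standalone SAP JCo 3 sketch can reuse a single destination for several BAPIs; the destination name "SAP_BACKEND" and the BAPI calls below are placeholders only:

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;

    // Minimal sketch: one configured destination serves any number of BAPIs.
    public class SingleDestinationExample {
        public static void main(String[] args) throws JCoException {
            // One named destination, configured once outside the code
            // (for example in an SAP_BACKEND.jcoDestination properties file).
            JCoDestination destination = JCoDestinationManager.getDestination("SAP_BACKEND");

            // The same destination's repository can look up different function modules.
            JCoFunction list = destination.getRepository().getFunction("BAPI_COMPANYCODE_GETLIST");
            list.execute(destination);

            JCoFunction detail = destination.getRepository().getFunction("BAPI_COMPANYCODE_GETDETAIL");
            detail.getImportParameterList().setValue("COMPANYCODEID", "1000");
            detail.execute(destination);
        }
    }

    Whether one shared destination or several is better usually comes down to whether the BAPIs need different users, clients, or pool sizes; if they share those settings, a single destination keeps administration simpler.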

  • Best Practice for creating Excel report from SSIS.

    I have a requirement to create an Excel report on a daily basis which pulls data from SQL. I have attempted to resolve this by creating a stored procedure to save the results in SQL, a template in Excel to hold the graphs & pivot tables and an SSIS package
    to copy the data to the template.
    Problem 1: When the data turns up in Excel it is saved as text rather than numbers.
    Problem 2: When the data turns up in Excel it appends the data rather than overwriting it.
    I resolved problem 1 by having another sheet which converts the text to numbers (=int(sheet1!A1))
    I resolved problem 2 by adding some VB script to my SSIS package which clears the existing cells before copying the data
    The job runs fine, however when I schedule the job to run overnight it complains "System.UnauthorizedAccessException: Retrieving the COM class factory for component with CLSID". A little googling tells me that running the client side commands in
    my vb script (workSheet1.Range("A2:F9999").Clear(), workBook.Save(), workBook.Close() etc) from a server side task is bad practice.
    So, I am left wondering how people usually get around this problem; copy a SQL table into an existing Excel file and overwrite the data, without having the numbers turn up as text. My requirements are that the report must display pivot charts with selectable
    options and be automatically updated overnight.
    Help appreciated,
    Bish.
    Office 2013 on my PC, Office 2010 on the server, Windows Server 2008R2 Enterprise, SQL Server 2008R2.

    I think the best practice in a case like this is to link the Excel file to a view or directly to a table, so you don't have to struggle with changing the template, with overnight packages, etc. If the data are too complex and the requirements too demanding,
    then I tend to create a cube and that's it... dashboard, graphs and everyone is happy. In your case, if the request is not too complex, try not to use SSIS but build a view and point Excel directly at SQL.
    SSIS is really strong for ETL, for running stored procedures that are too heavy, for cut-off time scheduling, etcetera, etcetera, etcetera... I love it. But sometimes we need to find the easier solution...
    I hope this post helped you.

  • Best practice for creating RFC destination entries for 3rd parties(Biztalk)

    Hi,
    We are on SAP ECC 6 and we have been creating multiple RFC destination entries for the external 3rd party applications such as Biz-talk and others using TCP/IP connection type and sharing the programid.
    The RFC connections with IDoc as the data flow have been set up using synchronous mode for the few time-critical ones and asynchronous mode for the majority. The RFC destination entries have been created for many interfaces, each of which has a unique RFC destination with its corresponding port defined in SAP.
    We have both inbound and outbound connectivity. With the large number of RFC destinations being added, we wanted to review the setup. We wanted to check with others who had encountered a similar situation and were keen to learn from their experiences.
    We also wanted to know if there are any best practices to optimise on number of RFC destinations.
    Here were a few suggestions we had in mind to tackle the same.
    1. Create unique RFC destinations for every port defined in SAP for external applications as Biztalk for as many connections. ( This would mean one for inbound, one for outbound)
    2. Create one single RFC destination entry for the external host/application and the external application receiving the idoc control record to interpret what action to perform at its end.
    3. Create RFC destinations based on the modules it links with such as materials management, sales and distribution, warehouse management. This would ensure we can limit the number of RFC to be created and make it simple to understand the flow of data.
    I have done checks on SAP best practices website, sap oss notes and help pages but could not get specific information I was after.
    I do understand we can have an unlimited number of RFC destinations and maximum connections using the appropriate profile parameters for the gateway, RFC, client connections, and additional app servers.
    I would appreciate it if you could suggest the best architecture or practice to set up RFC destinations in an optimized manner.
    Thanks in advance
    Sam

    Not easy to give a perfect answer
    1. Create unique RFC destinations for every port defined in SAP for external applications as Biztalk for as many connections. ( This would mean one for inbound, one for outbound)
    -> Be careful if you have multiple clients (for example in acceptance): RFCs are client-independent but ports are not! You could run into trouble.
    2. Create one single RFC destination entry for the external host/application and the external application receiving the idoc control record to interpret what action to perform at its end.
    -> This could be the best solution... it's easier to create partner profiles, and the control record will contain the correct partner.
    3. Create RFC destinations based on the modules it links with such as materials management, sales and distribution, warehouse management. This would ensure we can limit the number of RFC to be created and make it simple to understand the flow of data.
    -> For this, consider option 2 instead.
    We send to a message broker with one RFC destination, sending multiple IDoc types, different partners, and different ports.

  • The best practice for creating reports and dashboard

    Hello guys
    I am trying to put together a list of best practices on how to create reports and dashboards using the OBIEE presentation service. I know a lot of those dos and don'ts are just corporate words that don't apply consistently in a real-world environment, but still I'd like to know whether Oracle has any officially defined best practices or not.
    the only best practice I can think of when it comes to building reports and dashboards is:
    Each subject area should contain only one star schema that holds data for a specific business information
    Is there anything else?
    Please advise.
    Thanks

    Read this book to understand what a dashboard is, and what it should do and look like to be used by the end users. Very enlightening:
    Information Dashboard Design: The Effective Visual Communication of Data by Stephen Few. (There are a couple of other books by Stephen and although I haven't read them yet, I anticipate them to be equally helpful.)
    This book was also helpful to me:
    http://www.amazon.com/Performance-Dashboards-Measuring-Monitoring-Managing/dp/0471724173
    I also found this book helpful in Best Practices...
    http://www.biconsultinggroup.com/knowledgebase.asp?CategoryID=337

  • Best Practices for creating PDFs using PLPDF?

    Does anyone have any suggestions for Best Practices in making PDF files using PLPDF?
    I have been using it for about a month now, and the best that I have come up with is to use MS Access to prototype the layout of a report. Once I have all the graphics areas and text areas lined up how I would want them, I then write PLSQL code to create a procedure which is called from an HTMLDB page. MS Access is handy in that it provides the XY coordinates for each text area and graphics area. It also provides the dimensions of the respective cells. So long as I call plpdf.Init('P', 'in', 'letter') at the beginning of the procedure, both my MS Access prototype and my plpdf code are both using inches - this makes the translation relatively easy.
    Has anybody found anything else easier/better?
    Regards,


  • Best practice for EBS Testing

    Hello Friends - We are doing the EBS configuration and it includes search strings as well. These changes are applicable to around 40-50 bank accounts.
    Should I ask the bank to send us test bank statements for all these accounts?
    Can you please share best practices for testing the EBS setup.
    Thanks

    Hello!
    You don't have to test EBS for each bank account. The best approach to testing is to identify the typical bank statement cases and test them. For instance, if you have bank statements from five banks with several bank accounts in each bank, you need to test one bank statement for one bank account from each bank. Similarly, if you have different types of bank accounts (e.g. current account, deposit account, transfer account etc.), you will have different operation types in the bank statements for these accounts. Therefore, you also have to test bank statements from the different account types.
    To sum up, test typical bank statements from each bank, and different bank statements from each account type if applicable.
    Hope this will help you!
    Best regards!

  • Best Practices for Creating eLearning Content With Adobe

    As agencies are faced with limited resources and travel restrictions, initiatives for eLearning are becoming more popular. Come join us as we discuss best practices and tips for groups new to eLearning content creation, and the best ways to avoid complications as you grow your eLearning library.
    In this webinar, we will take on common challenges that we have seen in eLearning deployments, and provide simple methods to avoid and overcome them. With a little training and some practice, even beginners can create engaging and effective eLearning content using Adobe Captivate and Adobe Presenter. You can even deploy content to your learners with a few clicks using the Adobe Connect Training Platform!
    Sign up today to learn how to:
    -Deliver self-paced training content that won't conflict with operational demands and unpredictable schedules
    -Create engaging and effective training material optimized for knowledge retention
    -Build curriculum featuring rich content such as quizzes, videos, and interactivity
    -Track program certifications required by Federal and State mandates
    Come join us Wednesday May 23rd at 2P ET (11A PT): http://events.carahsoft.com/event-detail/1506/realeyes/
    Jorma_at_RealEyes
    RealEyes Connect


  • Best practices for creating application schema

    All,
    Can anyone recommend best practices (or a pointer to a URL) for creating an application schema? A novice installer created a schema and the tablespace ran out of disk space in two days, and the system came to a halt at a production site. The tablespace was created with one datafile and with MAXSIZE specified. I am looking for do's and don'ts for a production system.
    Thanks for any help,
    Vissu

    I'm not sure that you can boil this down to a "Do's and Don'ts" list unless you want to get overly general...
    For example, do make sure that you provision space appropriately. "Appropriately" however, is going to be radically different in different environments. Some shops set all their data files to autoextend in production and monitor utilization at the OS level. Other shops specify exact file sizes and monitor utilization at the Oracle level. Each approach has its own advantages and disadvantages, you just need to make sure that your application uses the same approach that every other application in the organization uses.
    Do have an idea about the space utilization of the application, but don't go overboard. Running out of space in 2 days means someone failed to do a basic analysis. On the other hand, I've seen people spend way more time than they should making 5 year projections based on some relatively soft assumptions and getting worried about internal overheads that were much smaller than the error bars in their baseline estimates. Of course, the precision necessary also depends on the implications-- a 20% error in a multi-TB data warehouse is going to have a lot more impact than a 20% error in a 20 GB OLTP application.
    Justin
