Trouble with unit test code

This code compiles fine, but my unit tests in the inner class beginning with

    public static class AllTests extends TestCase {

are failing with a NullPointerException. I was advised:

> The testCreateAccount is failing because you're using a local variable, newAccount,
> in the setUp() method, which has the same name as the class instance variable, newAccount.
> Since testCreateAccount uses the class instance variable, it is null.
>
>     protected void setUp() {
>       anAccountServiceImpl = new AccountServiceImpl();
>       AccountEntryStruct newAccount = new AccountEntryStruct();

So I changed all instances of newAccount to newAccount2 in my unit tests (you will see this towards the bottom of my code). It looks like the testCreateAccount method is still using a null value. I'm not sure if I'm following the advice correctly.
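A minimal, self-contained sketch of the shadowing problem the advice describes (hypothetical names, not the account code): the local variable in setUp() hides the field of the same name, so the field is never assigned.

```java
public class ShadowDemo {
    private String value; // instance field: stays null if setUp() shadows it

    void setUp() {
        String value = "initialized"; // local variable SHADOWS the field
    }

    String read() {
        return value; // reads the field, which setUp() never assigned
    }

    public static void main(String[] args) {
        ShadowDemo d = new ShadowDemo();
        d.setUp();
        System.out.println(d.read()); // prints "null"
    }
}
```

Renaming the variable does not fix this by itself; the fix is to drop the type from the assignment in setUp() so it targets the field.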
The code is below:
// AccountServiceImpl.java: The AccountService Implementation
package com.kafein.accountServices;
import java.util.Hashtable;
import java.util.Calendar;
import java.util.GregorianCalendar;
import junit.framework.TestCase;
import junit.framework.Test;
import junit.framework.TestSuite;
import org.omg.PortableServer.*;
import com.kafein.idl.accountServices.AccountServicePOA;
import com.kafein.idl.accountServices.AccountEntryStruct;
import com.kafein.idl.accountServices.AccountStruct;
import com.kafein.idl.utilities.DateTimeStruct;
import com.kafein.idl.exceptions.NotFoundException;
import com.kafein.idl.exceptions.DataValidationException;
import com.kafein.idl.errorCodes.DataValidationErrorCodes;
import com.kafein.utils.ServiceHandler;
import com.kafein.utils.Log;
public class AccountServiceImpl extends AccountServicePOA {

  POA poa;
  private Hashtable accounts = new Hashtable(); // collection of Accounts
  private static int nextAccountID = 1; // global account ID

  /**
   * Construct an instance.
   */
  public AccountServiceImpl(POA aPOA) {
    super();
    poa = aPOA;
  }

  /**
   * Overloaded constructor for unit tests.
   */
  protected AccountServiceImpl() {
    poa = null;
  }
  /**
   * createAccount is used by the administrator to add a new Account
   * to the system.
   *
   * @param newAccount AccountEntryStruct containing data for the new account
   * @return int the new unique Account ID
   * @exception com.kafein.idl.exceptions.DataValidationException
   */
  public int createAccount(AccountEntryStruct newAccount) throws
      DataValidationException {
    validateData(newAccount); // throws DataValidationException
    int accountID = getNextID();

    // Create the new Account.
    Account anAccount = new Account(accountID,
                                    newAccount.userName,
                                    newAccount.userEmail,
                                    newAccount.creditCardType,
                                    newAccount.creditCardNumber,
                                    newAccount.creditCardExpirationDate.year,
                                    newAccount.creditCardExpirationDate.month,
                                    newAccount.userPassword,
                                    newAccount.initialBalance);
    accounts.put(accountID, anAccount);
    return accountID;
  }
  /**
   * isValidAccount is used to validate a user logon.
   *
   * @param accountID int the Account ID
   * @param userPassword String the password to check
   * @return boolean true to indicate an existing Account
   */
  public boolean isValidAccount(int accountID, String userPassword) {
    // Get the account with key equal to accountID.
    AccountStruct anAccount;
    try {
      anAccount = getAccount(accountID);
    } catch (NotFoundException e) {
      return false;
    }
    // Verify the password.
    return anAccount.userPassword.equals(userPassword);
  }
  /**
   * getAccount is used to retrieve an existing Account in the system.
   *
   * @param accountID int the Account ID
   * @return AccountStruct containing data for the existing Account
   * @exception com.kafein.idl.exceptions.NotFoundException
   */
  public AccountStruct getAccount(int accountID) throws
      NotFoundException {
    // Verify that accountID is within an appropriate interval.
    if (accountID < 1 || accountID > accounts.size()) {
      throw new NotFoundException(DataValidationErrorCodes.INVALID_ACCOUNT_ID,
                                  "Account ID not found");
    }
    // Get the Account and convert it to an AccountStruct (which is returned).
    Account anAccount = (Account) accounts.get(accountID);
    return anAccount.getAccountStruct();
  }
  /**
   * getAllAccounts is used to retrieve all existing Accounts in the system.
   *
   * @return AccountStruct[] containing all existing Accounts
   * @fyi returns an empty sequence if no Accounts exist
   */
  public AccountStruct[] getAllAccounts() {
    // Allocate the array of AccountStructs.
    int lastKey = accounts.size();
    AccountStruct[] accountSequence = new AccountStruct[lastKey];
    if (lastKey == 0) {
      return accountSequence;
    }
    // DAR: Sort accounts by accountID (int).
    // Create AccountStructs from the Accounts.
    for (int i = 1; i <= lastKey; i++) {
      Account anAccount = (Account) accounts.get(i);
      accountSequence[i - 1] = anAccount.getAccountStruct();
    }
    return accountSequence;
  }
  /**
   * validateData is used to check new account data.
   *
   * @param newAccount AccountEntryStruct containing data for the new account
   * @exception com.kafein.idl.exceptions.DataValidationException
   */
  protected void validateData(AccountEntryStruct newAccount) throws
      DataValidationException {
    // Check all of the member data in newAccount.
    if (newAccount.userName.equals("")) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_USER_NAME,
          "User Name must not be empty");
    }
    if (newAccount.userEmail.equals("")) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_USER_EMAIL,
          "User Email must not be empty");
    }
    if (newAccount.creditCardType.equals("")) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_CREDIT_CARD_TYPE,
          "Credit card type must not be empty");
    }
    if (newAccount.creditCardNumber.equals("")) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_CREDIT_CARD_NUMBER,
          "Credit card number must not be empty");
    }
    // Compare creditCardExpirationDate to the current date
    // (we only consider year and month).
    GregorianCalendar now = new GregorianCalendar();
    DateTimeStruct proposed = newAccount.creditCardExpirationDate;
    if (proposed.year < now.get(Calendar.YEAR) ||
        (proposed.year == now.get(Calendar.YEAR) &&
         proposed.month < now.get(Calendar.MONTH) + 1)) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_CREDIT_CARD_EXPIRATION_DATE,
          "Credit card has expired");
    }
    if (newAccount.userPassword.equals("")) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_USER_PASSWORD,
          "Password must not be empty");
    } else if (!newAccount.userPassword.equals(newAccount.userPasswordVerification)) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_USER_PASSWORD,
          "Password verification failure");
    }
    if (newAccount.initialBalance < 0.0F) {
      throw new DataValidationException(
          DataValidationErrorCodes.INVALID_BALANCE,
          "Account balance cannot be negative");
    }
  }
  /**
   * getNextID is used to generate a unique ID.
   * Needs a much better implementation that generates a globally unique ID!
   *
   * @return int an Account ID
   */
  protected synchronized int getNextID() {
    nextAccountID++;
    return nextAccountID - 1;
  }

  /**
   * Override defaultPOA to return this servant's POA, not the Root POA.
   */
  public POA defaultPOA() {
    return poa;
  }
  /**
   * AllTests is used for unit testing the AccountServiceImpl class. It
   * extends the JUnit testing framework's TestCase class.
   *
   * To execute in graphic mode:
   *   java junit.swingui.TestRunner com.kafein.accountServices.AccountServiceImpl$AllTests
   * To execute in text mode:
   *   java com.kafein.accountServices.AccountServiceImpl$AllTests
   */
  public static class AllTests extends TestCase {

    private AccountServiceImpl anAccountServiceImpl;
    private Hashtable accounts = new Hashtable();
    private Account anAccount;
    private AccountEntryStruct newAccount2;

    public AllTests(String name) {
      super(name);
    }

    protected void setUp() {
      anAccountServiceImpl = new AccountServiceImpl();
      AccountEntryStruct newAccount2 = new AccountEntryStruct();
      newAccount2.userName = "testName";
      newAccount2.userEmail = "test@email";
      newAccount2.creditCardType = "testCreditCardType";
      newAccount2.creditCardNumber = "0123456789";
      newAccount2.creditCardExpirationDate = new DateTimeStruct();
      newAccount2.creditCardExpirationDate.year = 2005;
      newAccount2.creditCardExpirationDate.month = 6;
      newAccount2.creditCardExpirationDate.day = -1; // not applicable data
      newAccount2.creditCardExpirationDate.hour = -1;
      newAccount2.creditCardExpirationDate.minute = -1;
      newAccount2.userPassword = "kafein";
      newAccount2.userPasswordVerification = "kafein";
      newAccount2.initialBalance = 0.0F;
    }

    protected void tearDown() {
    }

    public static void main(String[] args) {
      junit.textui.TestRunner.run(suite());
    }

    public static Test suite() {
      return new TestSuite(AllTests.class);
    }

    public void testGetAllAccountsSizeZero() {
      AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
      assertTrue(results != null);
      assertTrue(results.length == 0);
    }

    public void testCreateAccount() {
      try {
        int testAccountID = anAccountServiceImpl.createAccount(newAccount2);
        //fail(); // shouldn't get here
        assertTrue(testAccountID == 1);
        AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
        assertTrue(results[0].userName.equals("testName"));
        assertTrue(anAccountServiceImpl.isValidAccount(1, "kafein"));
      } catch (DataValidationException e) {
      }
    }

    public void testGetAllAccounts() {
      AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
      assertTrue(results[0].userName.equals("testName"));
    }

    public void testIsValidAccount() {
      assertTrue(anAccountServiceImpl.isValidAccount(1, "kafein"));
    }
  }
}

Thanks again, all. My code now looks like:
public static class AllTests extends TestCase {

  private AccountServiceImpl anAccountServiceImpl;
  private Hashtable accounts = new Hashtable();
  private Account anAccount;
  private AccountEntryStruct newAccount2;

  public AllTests(String name) {
    super(name);
  }

  protected void setUp() {
    anAccountServiceImpl = new AccountServiceImpl();
    newAccount2 = new AccountEntryStruct();
    newAccount2.userName = "testName";
    newAccount2.userEmail = "test@email";
    newAccount2.creditCardType = "testCreditCardType";
    newAccount2.creditCardNumber = "0123456789";
    newAccount2.creditCardExpirationDate = new DateTimeStruct();
    newAccount2.creditCardExpirationDate.year = 2005;
    newAccount2.creditCardExpirationDate.month = 6;
    newAccount2.creditCardExpirationDate.day = -1; // not applicable data
    newAccount2.creditCardExpirationDate.hour = -1;
    newAccount2.creditCardExpirationDate.minute = -1;
    newAccount2.userPassword = "kafein";
    newAccount2.userPasswordVerification = "kafein";
    newAccount2.initialBalance = 0.0F;
  }

  protected void tearDown() {
  }

  public static void main(String[] args) {
    junit.textui.TestRunner.run(suite());
  }

  public static Test suite() {
    return new TestSuite(AllTests.class);
  }

  public void testGetAllAccountsSizeZero() {
    AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
    assertTrue(results != null);
    assertTrue(results.length == 0);
  }

  public void testCreateAccount() {
    try {
      int testAccountID = anAccountServiceImpl.createAccount(newAccount2);
      assertTrue(testAccountID == 1);
      AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
      assertTrue(results[0].userName.equals("testName"));
      assertTrue(anAccountServiceImpl.isValidAccount(1, "kafein"));
      //fail(); // shouldn't get here
    } catch (DataValidationException e) {
    }
  }

  public void testGetAllAccounts() {
    AccountStruct[] results = anAccountServiceImpl.getAllAccounts();
    assertTrue(results[0].userName.equals("testName"));
  }

  public void testIsValidAccount() {
    assertTrue(anAccountServiceImpl.isValidAccount(1, "kafein"));
  }
}

It compiles cleanly. Unit tests testGetAllAccounts() and testIsValidAccount() are still failing, however. I'm not sure how to initialize the struct outside of the setUp() method and have the struct be available to the methods outside of setUp().
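For what it's worth, here is a minimal sketch (hypothetical names, plain Java rather than the account code) of the fixture pattern under discussion: assign to the instance field in setUp(), with no type in front, and every test method then sees the initialized object. Note also that JUnit 3 creates a fresh TestCase instance and re-runs setUp() before each test method, so state created inside one test (such as an account added in testCreateAccount) is not visible to the other tests unless setUp() recreates it.

```java
public class FixtureDemo {
    private String fixture; // instance field, visible to every "test" method

    // Correct pattern: assign to the FIELD (no type keyword in front),
    // so the object initialized here is the one the tests read.
    void setUp() {
        fixture = "testName";
    }

    boolean testFixtureVisible() {
        setUp(); // JUnit would call this automatically before each test
        return "testName".equals(fixture);
    }

    public static void main(String[] args) {
        System.out.println(new FixtureDemo().testFixtureVisible()); // prints "true"
    }
}
```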

Similar Messages

  • Unit Test code using wrong persistence unit

    In the midst of learning Maven, I created a simple application in which I am using JPA (Java Persistence 1.0.2) with EclipseLink implementation (2.0.2).
    Note: This is an Application Managed environment. So I manually control EntityManager's life cycle.
    The persistence.xml file used by the main source code is different from the one that unit test code uses. Main code uses an Oracle DB and the test code uses an in-memory Derby.
    Running unit tests was updating the Oracle DB (!) and I eventually managed to fix that by using two different persistence-units in the XML files.
    However, I don't understand why that fixed the problem. I manually create and shut down the entity managers and they are not running concurrently. I'm pretty sure Maven (or the way I set it up) doesn't mess up the resources (XML files). In fact by looking at Maven's debug output I can see it's using the right XML file for unit tests.
    Could someone enlighten me, please?

    Do you have both persistence.xml files on your classpath? If so, and they contain the same name for their respective persistence units, you should be getting a warning or error since they must have unique names. There is no way to tell which one you want to access otherwise.
    Best Regards,
    Chris

  • Unit test code problem

I'm doing my first proper unit test and I get an error; I think min_size> is the problem:
Element type "application" must be followed by either attribute specifications, ">" or "/>".
This is the generated code:
    <?xml version="1.0" encoding="utf-8"?>
    <!-- This file is automatically generated by Flash Builder to compile FlexUnit classes and is not intended for modification.
    Please click on the "Refresh" icon in "FlexUnit Results" view to regenerate this file. -->
    <application xmlns:fx="http://ns.adobe.com/mxml/2009"
                 xmlns:s="library://ns.adobe.com/flex/spark"
                 xmlns:mx="library://ns.adobe.com/flex/mx"min_size>
        <fx:Script>
            <![CDATA[
                import flexUnitTests.serviceTestSuit;
                private var flexUnitTests_serviceTestSuit_obj:flexUnitTests.serviceTestSuit;
            ]]>
        </fx:Script>
        <fx:Declarations>
            <!-- Place non-visual elements (e.g., services, value objects) here -->
        </fx:Declarations>
    </application>

    Hi Nikos,
    Which Flash Builder build are you using?
    FlexUnitApplication.mxml is generated from MXML Application/MXML Windowed Application(Flex/AIR project respectively) file templates configured in Window->Preference->File templates.
Can you take a look at the template and check whether 'min_size' appears anywhere in it?
    However, we are unable to reproduce the issue in the latest builds.
    Thanks,
    Balaji
    http://balajisridhar.wordpress.com

  • Trouble Running Unit Tests on Device

I am having trouble running multiple unit tests on my device. I am using JMUnit as part of NetBeans IDE 5.5.1. I can get the tests running in the emulator, but when I try them on the phone they don't work. Under my project's Properties->Application Descriptor->MIDlets->MIDlets in the Suite, 'jmunit.framework.cldc11.TestSuite' comes up in red with the error 'Some MIDlet classes are invalid'. I am using CLDC 1.1 and MIDP 2.0. Any ideas?

    wegunterjrASI, 
    I have a few follow up questions for you to gain a better understanding of your application: 
    1). What are you using to do the unit test (NI’s unit test framework or VI tester from JKI)?
    2). What are you conducting the unit test on (RT Driver or something on the host machine)? 
    3). Are the unit test VIs under the target in the LabVIEW project? 
    4). Are you able to run a simple unit test? 
    Screenshots are always appreciated. 
    Regards, 
     

  • How do you debug your unit tests?

    Hey folks,
    I'm obviously doing something wrong that I can't figure out. I've tried following Chris Hanson's instructions that he put on LJ. No matter what I do, I can't get the debugger to breakpoint in a unit test, no matter whether I put the breakpoint in the test code or the worker code. I can get it to breakpoint just fine in the worker code when I run the real app.
    How do you do this on your unit test code?
    Thanks,
    Pat

    Tried that. No joy so far. That's why I asked here; I hoped that someone here had done it successfully.
    Pat

  • Unit tests and QA process

    Hello,
    (disclaimer : if you agree that this topic does not really belong to this forum please vote for a new Development Process forum there:
    http://forum.java.sun.com/thread.jspa?forumID=53&threadID=504658 ;-)
    My current organization has a dedicated QA team.
    They ensure end-user functional testing but also run and monitor "technical" tests as well.
    In particular they would want to run developer-written junit tests as sanity tests before the functional tests.
    I'm wondering whether this is such a good idea, and how to handle failed unit tests:
1) Well, indeed, I think this is a good idea: even if all developers abide by the practice of ensuring 100% of their tests pass before promoting their code (which is unfortunately not the case), integration of independent development may cause regressions or interactions that make some tests fail.
    Any reason against QA running junit tests at this stage?
However, the next question is: what do they do with failed tests? QA has no clue how important a given unit test is with regard to the whole application.
Maybe a single failed unit test out of 3500 means a complete outage of a 24x7 application. Or maybe 20% of failed tests only means a few misaligned icons...
    2) The developer of the failed package may know, but how can he communicate this to QA?
    Javadocing their unit testing code ("This test is mandatory before entering user acceptance") seems a bit fragile.
    Are there recommended methods?
3) Even the developer of the failed package may not realize the importance of the failure. So what should be the process when unit tests fail in QA?
Block the process until 100% of the tests pass? Or start acceptance anyway but notify the developer through the bug tracking system?
    4) Does your acceptance process require 100% pass before user acceptance starts?
    Indeed I have ruled out requiring 100% pass, but is this a widespread practice?
    I rule it out because maybe the failed test indeed points out a bad test, or a temporary unavailability of a dependent or simulated resource.
    This has to be analyzed of course, as tests have to be maintained as well, but this can be a parallel process to the user acceptance (accepting that the software may have to be patched at some point during the acceptance).
    Thank you for your inputs.
    J.

> Any reason against QA running junit tests at this stage?
Actually running them seems pointless to me.
QA could be interested in the following:
- That unit tests actually exist.
- That the unit tests are actually being run.
- That the unit tests pass.
    This can all be achieved as part of the build process however. It can either be done for every cm build (like automated nightly) or for just for release builds.
    This would require that the following information was logged
    - An id unique to each test
    - Pass fail
    - A collection system.
    Obviously doing this is going to require more work and probably code than if QA was not tracking it.
> However the next question is, what do they do with failed tests: QA has no clue how important a given unit test is with regard to the whole application. Maybe a single unit test failed out of 3500 means a complete outage of a 24x7 application. Or maybe 20% of failed tests only means a few misaligned icons...
    To me that question is like asking what happens if one class fails to build for a release build.
    To my mind any unit test failure is logged as a severe exception (the entire system is unusable.)
> 2) The developer of the failed package may know, but how can he communicate this to QA? Javadocing their unit testing code ("This test is mandatory before entering user acceptance") seems a bit fragile. Are there recommended methods?

Automatic collection, obviously. This has to be planned for.
One way is to just log success and failure for each test, gathered in one or more files. Then a separate process munges the result file to collect the data.
I know that there is a Java build engine (an add-on to Ant, or a wrapper around Ant) which will do periodic builds and email reports to developers. I think it even allows for categorization so the correct developer gets the correct error.
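A minimal sketch of that collection idea (the one-line-per-test file format and all names are hypothetical): each test appends a "name PASS" or "name FAIL" line, and a separate pass munges the lines into a tally.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ResultMunger {
    // Counts passes and failures from lines like "com.example.FooTest PASS".
    static Map<String, Integer> tally(List<String> lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        counts.put("PASS", 0);
        counts.put("FAIL", 0);
        for (String line : lines) {
            // The status is the last space-separated token on the line.
            String status = line.substring(line.lastIndexOf(' ') + 1);
            counts.merge(status, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> log = Arrays.asList(
            "com.example.FooTest PASS",
            "com.example.BarTest FAIL",
            "com.example.BazTest PASS");
        System.out.println(tally(log)); // prints {PASS=2, FAIL=1}
    }
}
```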
> 3) Even the developer of the failed package may not realize the importance of the failure. So what should be the process when unit tests fail in QA? Block the process until 100% tests pass? Or, start acceptance anyway but notify the developer through the bug tracking system?
I would block it.
> 4) Does your acceptance process require 100% pass before user acceptance starts?
No. But I am not sure what that has to do with what you were discussing above. I consider unit tests and acceptance testing to be two separate things.
> Indeed I have ruled out requiring 100% pass, but is this a widespread practice? I rule it out because maybe the failed test indeed points out a bad test, or a temporary unavailability of a dependent or simulated resource.
Then something is wrong with your process.
    When you create a release build you should include those things that should be in the release. If they are not done, missing, have build errors, then they shouldn't be in the release.
    If some dependent piece is missing for the release build then the build process has failed. And so it should not be released to QA.
    I have used version control/debug systems which made this relatively easy by allowing control over which bug/enhancements are included in a release. And of course requiring that anything checked in must be done so under a bug/enhancement. They will even control dependencies as well (the files for bug 7 might require that bug 8 is added as well.)

  • Unit Testing and Code Coverage

    Is there any way to see graphs and charts of how much code was covered during Unit Tests in OBPM 10GR3 from the CUnit and PUnit tests?
    We use Clover Reports in Java Projects.
    Any such tool for OBPM 10GR3 projects?

    Here are some more
    Triggers and DB links are not available in Oracle Explorer - it would be great to have them in there - I found triggers under tables - but I would much prefer them to be broken out under their own node rather than nested under table
    I think others have mentioned this but when you query a table (Retrieve Data) - it would be great to be able to specify a default number of records to retrieve - the 30 second timeout is great - but more control would be nice - also a way to control the timeout would be nice as well
    I noticed that I get different behavior on the date format when I retrieve data (by selecting the option from the table menu when I right click on a table) versus getting the same data through the query window - why isn't it consistent?
    Also - with Intellisense - can you make the icons different for the type of objects that the things represent (like tables versus views versus functions)
    I noticed that I couldn't get dbms_output to show up with Intellisense - I had filtered out of Oracle Explorer all the System objects - does that somehow affect Intellisense as well? I know that the account I am using has access to those packages.
    Also - more control over collapsible regions would be nice - I love that feature of VS - but for ODT it seems to only work at the procedure level (not configurable with some kind of directive etc...)

  • Unit testing legacy code - best approach?

    Hi,
    I've got a project full of legacy code (by this I simply mean untested) and I'm experimenting with retrofitting unit testing to it. I'm sure this entails a great many problems, but here's the first one.
    We have a central class with the Singleton pattern - it's instantiated once and then fetched all over the code. It's a concrete class at the moment, not an interface. Let's call this CentralControl.
    Because the code I want to test relies on the presence of this class, and yet it's too complex/heavyweight to properly instantiate in a test scenario, I need to mock it.
    I guess the ideal way is to turn what's now CentralControl into an interface, and have both a real and mock implementation. The code that enforces the singleton behaviour could have a new method to instantiate the mock version rather than the real thing.
    Would you do this yourself rather than alternatives like extending it, and are there any tools to help automate the process?
    Thanks,
    Rob

> I guess the ideal way is to turn what's now CentralControl into an interface...
I think that's the better idea, but you can mock classes as well as interfaces; e.g., EasyMock 2.2.2 and higher.
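A hedged sketch (hypothetical names throughout) of the interface-extraction approach being discussed: CentralControl becomes an interface, and the singleton holder exposes a test-only hook so tests can install a lightweight mock in place of the heavyweight real implementation.

```java
// The extracted interface; production code depends only on this.
interface CentralControl {
    String lookup(String key);
}

// Stands in for the heavyweight production singleton.
class RealCentralControl implements CentralControl {
    public String lookup(String key) { return "real:" + key; }
}

// Cheap stand-in that tests can install.
class MockCentralControl implements CentralControl {
    public String lookup(String key) { return "mock:" + key; }
}

// Singleton holder: production code calls get(), tests call set().
class ControlLocator {
    private static CentralControl instance = new RealCentralControl();
    static CentralControl get() { return instance; }
    static void set(CentralControl c) { instance = c; } // test-only hook
}

public class MockingDemo {
    public static void main(String[] args) {
        ControlLocator.set(new MockCentralControl());
        System.out.println(ControlLocator.get().lookup("order-42")); // prints "mock:order-42"
    }
}
```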
    ~

  • Unit Test Variable Substitution in PL/SQL User Vailidation code not running

    Hi
    I am using new Unit Test Feature in SQL Developer 2.1.0.62.
    I have created a test implemented to test a function. The function has a VARCHAR2 parameter as input and returns a BINARY_INTEGER.
I would like to perform 'Process Validation' instead of specifying an explicit 'Result'. The check box 'Test Result' is unchecked.
    I have seen in the doc. that I can use substitution variables in my user defined PL/SQL code. I try
    IF {RETURNS$} < 0
    THEN ...
    but I always get the error
    ERROR
    <RETURN> : Expected: [Any value because apply check was cleared], Received: [-103]
    Validation User PL/Sql Code failed: Non supported SQL92 token at position: 181: RETURNS$
    Obviously, the program doesn't recognize {RETURN$}.
    Am I missing something?
    br
    Reiner

    Hi all,
    I have installed the latest version of SQL Developer (2.1.1) that fixed the problem - must have been a bug.
The only problem was that I got an ORA-904 TEST_USER_NAME... error. I exported my tests, dropped the repository, created a new one, and reimported everything. Now it works as it should.
    br
    Reiner

  • ABAP unit: test classes part of prod. code

    Hi all,
I've read Thomas Weiss's weblog: A Spotlight on ABAP Unit, Part 1.
I posted a question there, but apparently Thomas can't answer at the moment, and since I'm impatient to get someone's opinion, I reckon we can discuss that topic here :^)
As explained in the NetWeaver Developer's Guide: Using ABAP (http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/c1be1003-0701-0010-3795-f87160de6483), the test classes are part of the program under test (PUT):
"In ABAP Unit, test classes are part of the production code of the TU. This avoids problems arising from the test code being separate from the production code:
- Programs and tests must be kept synchronized.
- You have to ensure that the test and program code are transported together.
- External test code only enables black box tests with an outside perspective of the tested program.
Since the test code is part of the production code, it is easy to keep the unit tests and the production code up to date if the latter is changed.
Although the test code and the production code are transported through the system landscape, ABAP unit tests do not increase the load on the production system. By default, the test code is not compiled in the production system. Therefore, the test code can never be executed in the production system."
    On the other hand, this idea is quite different from what suggested by other unit testing frameworks.
For instance, the documentation of Test::Unit, the Ruby unit testing framework (http://www.ruby-doc.org/stdlib/libdoc/test/unit/rdoc/classes/Test/Unit.html), reads:
"It's handy to collect a bunch of related tests, each test represented by a method, into a common test class that knows how to run them.
The tests will be in a separate class from the code they're testing for a couple of reasons. First of all, it allows your code to stay uncluttered with test code, making it easier to maintain. Second, it allows the tests to be stripped out for deployment, since they're really there for you, the developer, and your users don't need them. Third, and most importantly, it allows you to set up a common test fixture for your tests to run against."
    Regarding the advantages outlined by NetWeaver Developer’s Guide:
- Programs and tests might be kept synchronized even if they don't belong to the same unit of code.
- Moreover, you could ensure that the test and program code are transported together by saving them in the same package.
- And quite frankly, I don't have a thorough understanding of why external tests only allow black box tests with an outside perspective.
    What are your opinions on this?
    Regards, Davide

    Hi,
    when I started unit testing with JUnit I was quite surprised that SAP groups the tests with the production code. But now I actually really like it. It has a few advantages:
    - Good tests serve as an excellent documentation so why not bundling them together.
    - If you want to look at the tests they are easy to find.
    - I write all my class tests as local classes. I found out that it really helps me to focus on writing tests only for the class under test. Sometimes it is quite easy to forget the "unit" and write integration tests again.
    Regarding your question:
> And quite frankly, I don't have a thorough understanding of the reason why external tests only allow black box tests with an outside perspective.
    Usually I would not use the term black box testing in this context. Maybe you mean that it is easy to access private attributes, methods when you group class and testclass together?
    Normally black box testing means that you cannot look at the implementation of the code under test. So your tests are based on the specifications or if there is not one, on your common sense.
    cheers
    Thomas

  • [svn:osmf:] 14261: Updated DRM unit tests to work with code review feedback .

Revision: 14261
    Author:   [email protected]
    Date:     2010-02-18 14:15:23 -0800 (Thu, 18 Feb 2010)
    Log Message:
    Updated DRM unit tests to work with code review feedback.
    Modified Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/elements/TestParallelElementWithDRMTrait.as
        osmf/trunk/framework/OSMFTest/org/osmf/elements/TestSerialElementWithDRMTrait.as
        osmf/trunk/framework/OSMFTest/org/osmf/traits/TestDRMTrait.as
        osmf/trunk/framework/OSMFTest/org/osmf/utils/DynamicDRMTrait.as

    Hello Alex,
    I don't have an answer for you.
But can you try to use http://drmtest2.adobe.com:8080/Content/anonymous.f4v with a locally hosted OSMF player? That content doesn't require user/password info.
I'm wondering whether Google TV's Flash Player doesn't support the prompt dialog.
http://drmtest2.adobe.com/AccessPlayer/player.html requires Flash Player 11. That's why it won't load with Flash Player 10.x.
    Thanks,
    -- Hiroshi

  • [svn:osmf:] 15979: Second code submission for bug FM-760, add unit tests to increase code coverage.

Revision: 15979
    Author:   [email protected]
    Date:     2010-05-09 16:26:09 -0700 (Sun, 09 May 2010)
    Log Message:
    Second code submission for bug FM-760, add unit tests to increase code coverage.
    Ticket Links:
        http://bugs.adobe.com/jira/browse/FM-760
    Modified Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/OSMFTests.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/MockHTTPNetStream.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/TestHTTPStreamingDVRCastDVRTrait.as
    Added Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/TestHTTPNetStreamMetrics.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/TestHTTPStreamRequest.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/TestHTTPStreamingFileHandlerBase.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/TestHTTPStreamingIndexHandlerBase.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/TestHTTPStreamingUtils.as

  • [svn:osmf:] 15956: First code submission for bug FM-760, add unit tests to increase code coverage.

    Revision: 15956
    Author:   [email protected]
    Date:     2010-05-07 10:42:13 -0700 (Fri, 07 May 2010)
    Log Message:
    First code submission for bug FM-760, add unit tests to increase code coverage.
    Ticket Links:
        http://bugs.adobe.com/jira/browse/FM-760
    Modified Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/OSMFTests.as
    Added Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/MockHTTPNetStream.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/TestDVRInfo.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/TestHTTPStreamingDVRCastDVRTrait.as
        osmf/trunk/framework/OSMFTest/org/osmf/net/httpstreaming/dvr/TestHTTPStreamingDVRCastTimeTrait.as

  • [svn:osmf:] 15984: FM-848: adding unit test that holds the code to reproduce the reported bug.

    Revision: 15984
    Author:   [email protected]
    Date:     2010-05-10 05:50:23 -0700 (Mon, 10 May 2010)
    Log Message:
    FM-848: adding unit test that holds the code to reproduce the reported bug.
    Ticket Links:
        http://bugs.adobe.com/jira/browse/FM-848
    Modified Paths:
        osmf/trunk/framework/OSMFTest/org/osmf/elements/compositeClasses/TestCompositeMetadata.as

    My apologies for the wall of text. After I made my original post, I thought it might have been better to go back and put it in a pastebin instead, but I was not able to edit the post once I sent it.
    Regarding your question, the permissions on the
    /Library/LaunchAgents/com.adobe.AAM.Updater-1.0.plist file are "read and write" for system, wheel, and everyone.

  • Unit Testing, Null, and Warnings

    I have a Unit Test that includes the following lines:
    Dim nullarray As Integer()()
    Assert.AreEqual(nullarray.ToString(False), "Nothing")
    The variable "nullarray" will obviously be null when ToString is called (ToString is an extension method, and it is the one I am testing). This is by design: the purpose of this specific unit test is to make sure that my ToString extension
    method handles null values the way I expect. The test runs fine, but Visual Studio 2013 issues the following warning:
    Variable 'nullarray' is used before it has been assigned a value. A null reference exception could result at runtime.
    This warning is to be expected, and I don't want to stop Visual Studio 2013 from showing this warning or other warnings in general, just in this specific case (and several others that involve similar scenarios). Is there any way to mark a line or segment
    of code so that it is not checked for warnings? Otherwise, I will end up with lots of warnings for things I am perfectly aware of and don't plan on changing.
    Nathan Sokalski [email protected] http://www.nathansokalski.com/

    Hi Nathan Sokalski,
    Variable 'nullarray' is used before it has been assigned a value. A null reference exception could result at runtime.
    Was the warning above raised when you built the test project, even though the test itself ran successfully? I assume so.
    Is there any way to mark a line or segment of code so that it is not checked for warnings?
    There is no built-in way to exclude a code snippet or a single line from checking during compilation, but you can configure specific warnings not to be reported in Visual Basic through the project's Properties->Compile
    tab->warning configurations box.
    For detailed information, please see: Configuring Warnings in Visual Basic
    Another option is to adjust your code so that it does not generate the warning in the first place.
    If I have misunderstood you, please tell us which code you want excluded from warning checks, with a sample, so that we can look into your issue further.
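    As a sketch of the project-level approach: VB project files accept a NoWarn list of warning IDs. BC42104 is the ID Visual Studio reports for the "used before it has been assigned a value" warning (verify the exact number against your own build output); the .vbproj entry uses the number without the BC prefix:

    ```xml
    <!-- In the .vbproj file (sketch): suppress only the
         "used before it has been assigned a value" warning (BC42104)
         for the whole project. Check the ID in your build output first. -->
    <PropertyGroup>
      <NoWarn>42104</NoWarn>
    </PropertyGroup>
    ```

    Note this disables the warning project-wide, which is coarser than the per-line suppression you asked for; the per-line `#Disable Warning` directive was only added with the Roslyn compilers (Visual Studio 2015 and later), so it is not available in VS2013.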
    Thanks,
