JavaServer Faces Unit Test Framework

As far as I know, there aren't any handy JSF unit-test tools. To make it easier for UI developers and QA engineers to test pages, I have created a prototype project to evaluate a solution.
Take a simple JSF-enabled JSP file:
<?xml version="1.0" encoding="UTF-8"?>
<jsp:root version="1.2" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:jsp="http://java.sun.com/JSP/Page">
   <jsp:directive.page contentType="text/html;charset=UTF-8" pageEncoding="UTF-8" />
     <f:view>
          <h:form>
            <h:outputText value="Please input:"/>
            <h:inputText id="input1"/>
            <br/>
            <h:outputText value="Please input an integer:"/>
            <h:inputText style="width:50px;" converter="javax.faces.Integer"/>
            <h:commandButton value="Submit"/>
          </h:form>
     </f:view>
</jsp:root>
Writing an HttpUnit test against this page is obviously difficult (see the sketch after the list below):
1) It is hard to pinpoint the HTML element rendered by input1, because the enclosing NamingContainer (the h:form) has no id.
2) It is even harder to find the second inputText, because two ids are missing.
3) There is no way to assert whether the converter really worked: HttpUnit cannot reach any object in the page bean.
4) It is much more difficult to find those HTML elements when there are several levels of NamingContainer, for example components nested within a "for-loop".
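For comparison, a hand-written HttpUnit test against the unmodified page has to hard-code whatever client ids the JSF runtime happens to generate. The sketch below illustrates this; the URL, the form index and the "_id0" prefix are only illustrative guesses, not part of the prototype:

import junit.framework.TestCase;
import com.meterware.httpunit.WebConversation;
import com.meterware.httpunit.WebForm;
import com.meterware.httpunit.WebResponse;

// A sketch only: "_id0" stands for whatever id the JSF runtime generates for the
// form, so this test breaks as soon as the page structure changes.
public class Test1PageTest extends TestCase {
     public void testInput1RoundTrip() throws Exception {
          WebConversation wc = new WebConversation();
          WebResponse page = wc.getResponse("http://localhost:8080/yourapp/faces/test1.jsp");
          WebForm form = page.getForms()[0];            // assumes the page has exactly one form
          form.setParameter("_id0:input1", "hello");    // hard-coded, generated client id
          WebResponse result = form.submit();
          // Only the rendered HTML is reachable from here; the converted value
          // inside the page bean cannot be asserted at all.
          assertEquals("hello", result.getElementWithID("_id0:input1").getAttribute("value"));
     }
}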
I prefer not to write an HttpUnit test plus a separate JUnit test for the page bean, so if the following JSP could be used in both the test and production environments, the work would be much easier:
* test1.jsp
<?xml version="1.0" encoding="UTF-8"?>
<jsp:root version="1.2" xmlns:test="http://www.yourcompany.com/jsf/test"  xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:jsp="http://java.sun.com/JSP/Page">
   <jsp:directive.page contentType="text/html;charset=UTF-8" pageEncoding="UTF-8" />
     <f:view>
        <test:parameter name="abc" value="def"/><!-- setting request parameter -->
          <h:form>
            <h:outputText value="Please input:"/>
            <h:inputText>
               <test:set value="hello"/> <!-- emulate typing "hello" into the text box -->
               <test:assert phase="afterUpdateModelValues" description="InputText1" var="input1" value="#{input1=='hello'}"/><!-- you can try other JSF phases as well; here var "input1" is the parent JSF component -->
               <test:assert description="InputText1" var="input1" value="#{input1=='hello'}"/><!-- uses the default phase "afterRenderResponse"; here var "input1" is the HtmlElement fetched from the rendered page by HttpUnit -->
            </h:inputText>
            <br/>
            <h:inputText style="width:50px;" converter="javax.faces.Integer">
               <test:set actions="all" value="123"/>
               <test:assert phase="afterUpdateModelValues" description="Input2 Class Name" var="input2CN" value="#{input2CN.class.name == 'java.lang.Integer'}"/>
               <test:assert description="InputText2 value in afterRenderResponse phase" var="input2" value="#{input2=='123'}"/>
               <test:assert description="InputText2 Style" var="input2Style" valueAttribute="style" value="#{input2Style=='width:20px;height:25px;'}"/>
            </h:inputText>
            <br/>
            <h:commandButton value="Submit">
               <test:action description="Test Submit"/><!-- multiple actions are supported -->
            </h:commandButton>     
       </h:form>
   </f:view>
</jsp:root>
For the production JSP you can use the normal URL, http://server:port/yourapp/faces/test1.jsp; there the test:xxx components don't do anything.
To test a single page, request http://server:port/yourapp/faces/TestAction?view=/test1.jsp
To test multiple pages, request http://server:port/yourapp/faces/TestAction?suite=/suite1.xml and write a suite file:
* suite1.xml
<?xml version='1.0' encoding='UTF-8'?>
<TestSuite>
   <page>/test1.jsp</page>
   <page>/test2.jsp</page>
</TestSuite>
After running this test suite on the Sun JSF 1.0 runtime, the result looks like this:
<?xml version="1.0" encoding="UTF-8" ?>
<Test suite="/suite1.xml">
  <Page id="/test1.jsp">
    <Action id="_id13" time="0 hour(s) 0 minute(s) 0 second(s) 31 millisecond(s)" description="Test Sumbit">
      <Assert id="_id5" description="InputText1">
         <afterUpdateModelValues pass="true" />
      </Assert>
      <Assert id="_id9" description="Input2 Class Name">
         <afterUpdateModelValues pass="true" />
      </Assert>
      <Assert id="_id6" description="InputText1">
         <afterRenderResponse pass="true" />
      </Assert>
      <Assert id="_id10" description="InputText2 value in afterRenderResponse phase">
         <afterRenderResponse pass="true" />
      </Assert>
      <Assert id="_id11" description="InputText2 Style">
         <afterRenderResponse pass="false" />
      </Assert>
    </Action>
  </Page>
  <Page id="/test2.jsp">
  </Page>
</Test>
If this idea looks useful to you, you can get the source code at the end of this post.
You are also welcome to improve it, for example by implementing concurrent and repeatable tests.
That said, don't expect a two-day prototype to be very robust, and coding is not my day-to-day work either.
If you have any questions or ideas, please drop me a line: [email protected], [email protected]
*  jsftest.tld
<?xml version="1.0"?>
<!DOCTYPE taglib PUBLIC "-//Sun Microsystems, Inc.//DTD JSP Tag Library 1.2//EN"
"http://java.sun.com/dtd/web-jsptaglibrary_1_2.dtd">
<taglib>
     <tlib-version>1.0</tlib-version>
     <jsp-version>1.2</jsp-version>
     <short-name>test</short-name>
     <uri>http://www.yourcompany.com/jsf/test</uri>
     <display-name>JSF Test Tag Library</display-name>
     <description>JSF test tag library</description>
     <tag>
          <name>action</name>
          <tag-class>com.yourcompany.jsf.ui.test.taglib.ActionTag</tag-class>
          <body-content>empty</body-content>
          <display-name>Action</display-name>
          <description>JSF test submit action</description>
          <attribute>
               <name>binding</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>id</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>description</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
     </tag>     
     <tag>
          <name>assert</name>
          <tag-class>com.yourcompany.jsf.ui.test.taglib.AssertTag</tag-class>
          <body-content>empty</body-content>
          <display-name>Assert</display-name>
          <description>JSF test assertion</description>
          <attribute>
               <name>binding</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>id</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>actions</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>The default value is "all", meaning all actions; use "," as the delimiter.</description>
          </attribute>
          <attribute>
               <name>phase</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>Valid values are any of: afterRestoreView,
                                beforeApplyRequestValues, afterApplyRequestValues,
                                beforeProcessValidations, afterProcessValidations,
                                beforeUpdateModelValues,  afterUpdateModelValues,
                                beforeInvokeApplication,  afterInvokeApplication,
                                beforeRenderResponse,     afterRenderResponse.
                              The default value is afterRenderResponse; in that case the assertion is applied to the rendered HtmlElement rather than to the JSF component.
                         </description>
          </attribute>
          <attribute>
               <name>value</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>Must evaluate to a boolean indicating whether the test passes.</description>
          </attribute>
          <attribute>
               <name>var</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>Variable name; defines a request attribute that can be referenced in the value-binding expression.</description>
          </attribute>
          <attribute>
               <name>valueAttribute</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                        <description>The default is "value".
                                     var will be bound to this attribute.
                                     var will be bound to the rendered HtmlElement if the phase="afterRenderResponse",
                                     otherwise var will be bound to parent component.
                        </description>
          </attribute>
          <attribute>
               <name>postback</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                        <description>The default is "true".
                                     Currently only support true;
                        </description>
          </attribute>
          <attribute>
               <name>description</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
     </tag>
     <tag>
          <name>parameter</name>
          <tag-class>com.yourcompany.jsf.ui.test.taglib.ParameterTag</tag-class>
          <body-content>empty</body-content>
          <display-name>Parameter</display-name>
          <description>JSF test request parameter setter</description>
          <attribute>
               <name>binding</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>id</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>actions</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>The default value is "all", meaning all actions; use "," as the delimiter.</description>
          </attribute>
          <attribute>
               <name>name</name>
               <required>true</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>value</name>
               <required>true</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
     </tag>
     <tag>
          <name>set</name>
          <tag-class>com.yourcompany.jsf.ui.test.taglib.SetTag</tag-class>
          <body-content>empty</body-content>
          <display-name>Set</display-name>
          <description>JSF test component value setter</description>
          <attribute>
               <name>binding</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>id</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
          <attribute>
               <name>actions</name>
               <required>false</required>
               <rtexprvalue>true</rtexprvalue>
                         <description>The default value is "all", meaning all actions; use "," as the delimiter.</description>
          </attribute>
          <attribute>
               <name>value</name>
               <required>true</required>
               <rtexprvalue>true</rtexprvalue>
          </attribute>
     </tag>
</taglib>
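The tag handler classes referenced above (ActionTag, AssertTag, ParameterTag, SetTag) are not included in this post. For JSF 1.x they are ordinary UIComponentTag subclasses; the following is only a sketch of what SetTag might look like, assuming the component type registered in faces-config.xml below:

package com.yourcompany.jsf.ui.test.taglib;

import javax.faces.component.UIComponent;
import javax.faces.webapp.UIComponentTag;

// Sketch only: creates the test Set component and copies the tag attributes onto it.
public class SetTag extends UIComponentTag {
     private String actions;
     private String value;

     public void setActions(String actions) { this.actions = actions; }
     public void setValue(String value) { this.value = value; }

     public String getComponentType() { return "com.yourcompany.jsf.ui.test.Set"; }

     public String getRendererType() { return null; } // the test components render nothing

     protected void setProperties(UIComponent component) {
          super.setProperties(component);
          if (actions != null) component.getAttributes().put("actions", actions);
          if (value != null) component.getAttributes().put("value", value);
     }

     public void release() {
          super.release();
          actions = null;
          value = null;
     }
}

The other three tag handlers would differ only in the attributes they copy and the component type they return.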
*  faces-config.xml
<?xml version="1.0"?>
<!DOCTYPE faces-config PUBLIC
  "-//Sun Microsystems, Inc.//DTD JavaServer Faces Config 1.1//EN"
  "http://java.sun.com/dtd/web-facesconfig_1_1.dtd">
<faces-config>
  <component>
    <component-type>com.yourcompany.jsf.ui.test.Action</component-type>
    <component-class>com.yourcompany.jsf.ui.test.component.Action</component-class>
  </component>
  <component>
    <component-type>com.yourcompany.jsf.ui.test.Assert</component-type>
    <component-class>com.yourcompany.jsf.ui.test.component.Assert</component-class>
  </component>
  <component>
    <component-type>com.yourcompany.jsf.ui.test.Parameter</component-type>
    <component-class>com.yourcompany.jsf.ui.test.component.Parameter</component-class>
  </component>
  <component>
    <component-type>com.yourcompany.jsf.ui.test.Set</component-type>
    <component-class>com.yourcompany.jsf.ui.test.component.Set</component-class>
  </component>
  <lifecycle>
      <phase-listener>com.yourcompany.jsf.ui.test.listener.TestPhaseListener</phase-listener>
      <phase-listener>com.yourcompany.jsf.ui.test.listener.TestAssertPhaseListener</phase-listener>
  </lifecycle>
</faces-config>
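The component classes registered above are not listed in the post either. They are essentially non-rendering value holders; below is only a minimal sketch of the Set component, with the "all" default taken from the TLD description. The other components follow the same pattern, and Assert would additionally evaluate its value attribute as a value binding at assertion time, which is how the #{...} boolean expressions in test1.jsp get resolved.

package com.yourcompany.jsf.ui.test.component;

import javax.faces.component.UIComponentBase;
import javax.faces.context.FacesContext;

// Sketch only: carries the "actions" and "value" attributes that
// TestPhaseListener.doSet() reads; it renders no markup of its own.
public class Set extends UIComponentBase {
     private String actions = "all";   // default documented in the TLD
     private String value;

     public String getFamily() { return "com.yourcompany.jsf.ui.test.Set"; }

     public String getActions() { return actions; }
     public void setActions(String actions) { this.actions = actions; }

     public String getValue() { return value; }
     public void setValue(String value) { this.value = value; }

     // keep the attributes across the postback that doAction() triggers
     public Object saveState(FacesContext context) {
          return new Object[] { super.saveState(context), actions, value };
     }

     public void restoreState(FacesContext context, Object state) {
          Object[] values = (Object[]) state;
          super.restoreState(context, values[0]);
          actions = (String) values[1];
          value = (String) values[2];
     }
}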
*  TestPhaseListener.java
package com.yourcompany.jsf.ui.test.listener;
import java.io.InputStream;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import javax.faces.application.ViewHandler;
import javax.faces.component.UIComponent;
import javax.faces.component.UIViewRoot;
import javax.faces.context.FacesContext;
import javax.faces.context.ResponseWriter;
import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.w3c.dom.Text;
import com.meterware.httpunit.HTMLElement;
import com.meterware.httpunit.PostMethodWebRequest;
import com.meterware.httpunit.WebConversation;
import com.meterware.httpunit.WebRequest;
import com.meterware.httpunit.WebResponse;
import com.sun.faces.renderkit.html_basic.HtmlResponseWriter;
import com.yourcompany.jsf.ui.component.UIIterate;
import com.yourcompany.jsf.ui.test.component.Action;
import com.yourcompany.jsf.ui.test.component.Assert;
import com.yourcompany.jsf.ui.test.component.Parameter;
import com.yourcompany.jsf.ui.test.component.Set;
import com.yourcompany.jsf.ui.test.report.Constants;
import com.yourcompany.jsf.ui.test.report.TestAction;
import com.yourcompany.jsf.ui.test.report.TestAssert;
import com.yourcompany.jsf.ui.test.report.TestPage;
import com.yourcompany.jsf.ui.test.report.TestReport;
import com.yourcompany.jsf.ui.util.ComponentUtil;
public class TestPhaseListener implements PhaseListener {
     public TestPhaseListener() {
          super();
          // TODO Auto-generated constructor stub
     }

     /* (non-Javadoc)
      * @see javax.faces.event.PhaseListener#afterPhase(javax.faces.event.PhaseEvent)
      */
     public void afterPhase(PhaseEvent event) {
          if(-1 != event.getFacesContext().getViewRoot().getViewId().indexOf("TestAction")){
               FacesContext context = event.getFacesContext();
               ViewHandler vh = context.getApplication().getViewHandler();
               HttpServletRequest request = (HttpServletRequest)context.getExternalContext().getRequest();
               TestReport report = new TestReport();
               request.getSession().setAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID, report);
               String suitefile = request.getParameter("suite");
               List pages = new ArrayList(5);
               if(suitefile==null){
                    pages.add(request.getParameter("view"));
               }else{
                    report.setSuiteId(suitefile);
                    try{
                         DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
                      factory.setValidating(false);
                      factory.setNamespaceAware(false);
                      DocumentBuilder builder = factory.newDocumentBuilder();
                      InputStream ins = context.getExternalContext().getResourceAsStream(suitefile);
                      Document doc = builder.parse(ins);
                      NodeList pageNodes = doc.getElementsByTagName("page");
                      for(int i=0; i<pageNodes.getLength(); i++){
                            pages.add(((Text)pageNodes.item(i).getFirstChild()).getData());
                       }
                     }catch(Exception e){
                          report.setException(e);
                     }
               }
               for(Iterator p=pages.iterator(); p.hasNext(); ){
                    String viewId = (String)p.next();
                    UIViewRoot view = vh.restoreView(context, viewId);
                    if(view==null){
                         try{
                            WebConversation wc = new WebConversation();
                            wc.putCookie("JSESSIONID", request.getSession().getId());
                            String url = getUrl(request, viewId);
                            WebResponse wr = wc.getResponse(url);
                         }catch(Exception e){
                              report.setException(e);
                         }
                         view = vh.restoreView(context, viewId);
                    }
                    try{
                         Method doActionMethod = TestPhaseListener.class.getDeclaredMethod("doAction", new Class[]{Action.class, FacesContext.class, HttpServletRequest.class, UIViewRoot.class, String.class});
                         ComponentUtil.iterateComponent(view, Action.class, this, doActionMethod, new Object[]{context, request, view, viewId});
                    }catch(Exception e){
                         report.setException(e);
                    }
               }
               //render report
               renderReport(context, report);
               request.getSession().removeAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID);
               context.responseComplete();
          }
     }

     /* (non-Javadoc)
      * @see javax.faces.event.PhaseListener#beforePhase(javax.faces.event.PhaseEvent)
      */
     public void beforePhase(PhaseEvent event) {
     }

     /* (non-Javadoc)
      * @see javax.faces.event.PhaseListener#getPhaseId()
      */
     public PhaseId getPhaseId() {
           return PhaseId.RESTORE_VIEW;
     }

     private String getUrl(HttpServletRequest request, String viewId){
          return request.getScheme()+"://" + request.getServerName()+":"+request.getServerPort()+request.getContextPath()+"/faces"+viewId;
     }

     protected void doAction(Action action, FacesContext context, HttpServletRequest request, UIViewRoot view, String viewId){
          try{
                  WebConversation wc = new WebConversation();
                  wc.putCookie("JSESSIONID", request.getSession().getId());
                  String url = getUrl(request, viewId);
                  WebRequest wrq = new PostMethodWebRequest(url);
                  wrq.setParameter(Constants.TEST_VIEW_PARAMETER, viewId);
                  wrq.setParameter(Constants.TEST_ACTION_PARAMETER, action.getId());
                  //set parameters
                  try{
                       Method doParameterMethod = TestPhaseListener.class.getDeclaredMethod("doParameter", new Class[]{Parameter.class, String.class, WebRequest.class});
                       ComponentUtil.iterateComponent(view, Parameter.class, this, doParameterMethod, new Object[]{action.getId(), wrq});
                  }catch(Exception e){
                        e.printStackTrace();
                   }
                   //set attributes
                   //set values
                   try{
                        Method doSetMethod = TestPhaseListener.class.getDeclaredMethod("doSet", new Class[]{Set.class, FacesContext.class, String.class, WebRequest.class});
                        ComponentUtil.iterateComponent(view, Set.class, this, doSetMethod, new Object[]{context, action.getId(), wrq});
                   }catch(Exception e){
                        e.printStackTrace();
                   }
                   //do test actions
                   String formId = ComponentUtil.getForm(action).getClientId(context);
                  wrq.setParameter(formId, action.getParent().getClientId(context));
                  long start = System.currentTimeMillis();
                  WebResponse wrp = wc.getResponse(wrq);
                  //System.out.println(wrp.getText());
                  //do assertions
                  try{
                       Method doAssertMethod = TestPhaseListener.class.getDeclaredMethod("doAssert", new Class[]{Assert.class, FacesContext.class, HttpServletRequest.class, String.class, String.class, String.class, WebResponse.class});
                       ComponentUtil.iterateComponent(view, Assert.class, this, doAssertMethod, new Object[]{context, request, viewId, action.getId(), action.getDescription(), wrp});
                  }catch(Exception e){
                        e.printStackTrace();
                   }
                   TestReport report = (TestReport)request.getSession().getAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID);
                   report.getPage(viewId).getAction(action.getId()).setTime(System.currentTimeMillis()-start);
              }catch(Exception e){
                   e.printStackTrace();
              }
     }

     protected void doParameter(Parameter parameter, String actionId, WebRequest wrq){
          if("all".equals(parameter.getActions()) || Arrays.asList(parameter.getActions().split(",")).contains(actionId))
               wrq.setParameter(parameter.getName(), parameter.getValue());
     }

     protected void doSet(Set set, FacesContext context, String actionId, WebRequest wrq){
          if("all".equals(set.getActions()) || Arrays.asList(set.getActions().split(",")).contains(actionId))
               wrq.setParameter(set.getParent().getClientId(context), set.getValue());
     }

     protected void doAssert(Assert ast, FacesContext context, HttpServletRequest request, String viewId, String actionId, String actionDescription, WebResponse wrp){
          if(("all".equals(ast.getActions()) || Arrays.asList(ast.getActions().split(",")).contains(actionId)) && "afterRenderResponse".equals(ast.getPhase())){
                TestReport report = (TestReport)request.getSession().getAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID);
                try{
                     HTMLElement targetElement = wrp.getElementWithID(ast.getParent().getClientId(context));
                     if(targetElement==null)targetElement = wrp.getElementsWithName(ast.getParent().getClientId(context))[0];
                     if(targetElement==null)request.setAttribute(ast.getVar(), null);
                     else request.setAttribute(ast.getVar(), targetElement.getAttribute(ast.getValueAttribute()));
                     //report assert
                     report.report(viewId, actionId, actionDescription, ast.getId(), ast.getDescription(), ast.getPhase(), (Boolean)ast.getValue(), null);
                }catch(Exception e){
                      report.report(viewId, actionId, actionDescription, ast.getId(), ast.getDescription(), ast.getPhase(), Boolean.FALSE, e);
                 }
          }
     }

     protected void renderReport(FacesContext context, TestReport report){
          try{
               HttpServletResponse response = (HttpServletResponse)context.getExternalContext().getResponse();
               response.setContentType("text/xml; charset=UTF-8");
               response.setHeader("Cache-Control", "no-cache");
               ResponseWriter writer = new HtmlResponseWriter(response.getWriter(), null, null);
               context.setResponseWriter(writer);
               writer.startDocument();
               writer.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\r\n");
               writer.startElement("Test", null);
               if(report.getSuiteId()!=null)writer.writeAttribute("suite", report.getSuiteId(), null);
               for(Iterator p=report.listPages().iterator(); p.hasNext(); ){
                    TestPage page = (TestPage)p.next();
                    writer.startElement("Page", null);
                    writer.writeAttribute("id", page.getViewId(), null);
                    for(Iterator i=page.listActions().iterator(); i.hasNext(); ){
                         TestAction action = (TestAction)i.next();
                         writer.startElement("Action", null);
                         writer.writeAttribute("id", action.getId(), null);
                         writer.writeAttribute("time", getElapsedTime(action.getTime()), null);
                         if(action.getDescription()!=null)writer.writeAttribute("description", action.getDescription(), null);
                         for(Iterator j=action.getAsserts().values().iterator(); j.hasNext(); ){
                              TestAssert ast = (TestAssert)j.next();
                              writer.startElement("Assert", null);
                              writer.writeAttribute("id", ast.getId(), null);
                              if(ast.getDescription()!=null)writer.writeAttribute("description", ast.getDescription(), null);
                              for(Iterator k=ast.getStatus().keySet().iterator(); k.hasNext(); ){
                                   String phase = (String)k.next();
                                   writer.startElement(phase, null);
                                   writer.writeAttribute("pass", ast.getStatus().get(phase).toString(), null);
                                    writer.endElement(phase);
                               }
                               if(ast.getException()!=null){
                                    writer.startElement("Exception", null);
                                    StackTraceElement[] stack = ast.getException().getStackTrace();
                                    for(int m=0; m<stack.length; m++){
                                         writer.write(stack[m].toString());
                                         writer.write("\r\n");
                                    }
                                    writer.endElement("Exception");
                               }
                               writer.endElement("Assert");
                          }
                          if(action.getException()!=null){
                               writer.startElement("Exception", null);
                               StackTraceElement[] stack = action.getException().getStackTrace();
                               for(int m=0; m<stack.length; m++){
                                    writer.write(stack[m].toString());
                                    writer.write("\r\n");
                               }
                               writer.endElement("Exception");
                          }
                          writer.endElement("Action");
                     }
                     if(page.getException()!=null){
                          writer.startElement("Exception", null);
                          StackTraceElement[] stack = page.getException().getStackTrace();
                          for(int m=0; m<stack.length; m++){
                               writer.write(stack[m].toString());
                               writer.write("\r\n");
                          }
                          writer.endElement("Exception");
                     }
                     writer.endElement("Page");
                }
                if(report.getException()!=null){
                     writer.startElement("Exception", null);
                     StackTraceElement[] stack = report.getException().getStackTrace();
                     for(int m=0; m<stack.length; m++){
                          writer.write(stack[m].toString());
                          writer.write("\r\n");
                     }
                     writer.endElement("Exception");
                }
                writer.endElement("Test");
                writer.endDocument();
                response.getWriter().flush();
                response.getWriter().close();
           }catch(Exception e){
                e.printStackTrace();
           }
      }

      public static String getElapsedTime(long millis) {
           long hours, minutes, seconds, ms;
           hours = millis / 3600000;
           millis = millis - (hours * 3600000);
           minutes = millis / 60000;
           millis = millis - (minutes * 60000);
           seconds = millis / 1000;
           ms = millis - (seconds * 1000);
           return hours
                + " hour(s) "
                + minutes
                + " minute(s) "
                + seconds
                + " second(s) "
                + ms
                + " millisecond(s)";
      }
}
*  TestAssertPhaseListener.java
package com.yourcompany.jsf.ui.test.listener;
import java.lang.reflect.Method;
import java.util.Arrays;
import javax.faces.component.UIViewRoot;
import javax.faces.context.FacesContext;
import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.beanutils.PropertyUtils;
import com.yourcompany.jsf.ui.test.component.Assert;
import com.yourcompany.jsf.ui.test.report.Constants;
import com.yourcompany.jsf.ui.test.report.TestReport;
import com.yourcompany.jsf.ui.util.ComponentUtil;
public class TestAssertPhaseListener implements PhaseListener {
     public TestAssertPhaseListener() {
          super();
          // TODO Auto-generated constructor stub
     }

     /* (non-Javadoc)
      * @see javax.faces.event.PhaseListener#afterPhase(javax.faces.event.PhaseEvent)
      */
     public void afterPhase(PhaseEvent event) {
          if(PhaseId.RENDER_RESPONSE.equals(event.getPhaseId()))return;
          iterateAssert(event, false);
     }

     /* (non-Javadoc)
      * @see javax.faces.event.PhaseListener#beforePhase(javax.faces.event.PhaseEvent)
      */
     public void beforePhase(PhaseEvent event) {
          if(PhaseId.RESTORE_VIEW.equals(event.getPhaseId()))return;
          iterateAssert(event, true);
     }

     public PhaseId getPhaseId() {
           return PhaseId.ANY_PHASE;
     }

     protected void iterateAssert(PhaseEvent event, boolean before){
          FacesContext context = event.getFacesContext();
          HttpServletRequest request = (HttpServletRequest)context.getExternalContext().getRequest();
          String viewId = request.getParameter(Constants.TEST_VIEW_PARAMETER);
          String actionId = request.getParameter(Constants.TEST_ACTION_PARAMETER);
          if(actionId!=null){
               TestReport report = (TestReport)request.getSession().getAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID);
               UIViewRoot view = context.getViewRoot();
               if(view==null)return;
               try{
                    Method doAssertMethod = TestAssertPhaseListener.class.getDeclaredMethod("doAssert", new Class[]{Assert.class, FacesContext.class, HttpServletRequest.class, String.class, String.class, String.class});
                    ComponentUtil.iterateComponent(view, Assert.class, this, doAssertMethod, new Object[]{context, request, viewId, actionId, this.getAssertPhase(event.getPhaseId(), before)});
               }catch(Exception e){
                     report.getPage(viewId).getAction(actionId).setException(e);
                }
          }
     }

     protected void doAssert(Assert ast, FacesContext context, HttpServletRequest request, String viewId, String actionId, String assertPhase){
           if(("all".equals(ast.getActions()) || Arrays.asList(ast.getActions().split(",")).contains(actionId)) && assertPhase.equals(ast.getPhase())){
                TestReport report = (TestReport)request.getSession().getAttribute(Constants.TEST_REPORT_SESSION_ATTRIBUTE_ID);
                try{
                     request.setAttribute(ast.getVar(), PropertyUtils.getProperty(ast.getParent(), ast.getValueAttribute()));
                     //report assert
                     report.report(viewId, actionId, null, ast.getId(), ast.getDescription(), ast.getPhase(), (Boolean)ast.getValue(), null);
                }catch(Exception e){
                      report.report(viewId, actionId, null, ast.getId(), ast.getDescription(), ast.getPhase(), Boolean.FALSE, e);
                 }
           }
      }

      protected String getAssertPhase(PhaseId phaseId, boolean before){
           String prefix = before ? "before" : "after";
           if(PhaseId.RESTORE_VIEW.equals(phaseId))return prefix+"RestoreView";
           if(PhaseId.APPLY_REQUEST_VALUES.equals(phaseId))return prefix+"ApplyRequestValues";
           if(PhaseId.PROCESS_VALIDATIONS.equals(phaseId))return prefix+"ProcessValidations";
           if(PhaseId.UPDATE_MODEL_VALUES.equals(phaseId))return prefix+"UpdateModelValues";
           if(PhaseId.INVOKE_APPLICATION.equals(phaseId))return prefix+"InvokeApplication";
           if(before && PhaseId.RENDER_RESPONSE.equals(phaseId))return "beforeRenderResponse";
           return "";
      }
}
*  UIIterate.java
// (The rest of this listing was garbled in the original post and could not be recovered.)

To Neoreborn:
If I understand correctly, Shale only provides some mock core JSF objects and extends JUnit so that you can write JUnit tests for your Java classes.
What I am concerned with is testing pages (JSPs) rather than Java classes: I want to assert component attribute values throughout the whole lifecycle, including the rendered HTML. Currently there aren't any good tools for this kind of JSP unit test, am I right?

Similar Messages

  • Unit Test Framework for 8.6

    Hi all,
    Do we have Unit Test Framework Toolkit for LabVIEW 8.6?
    We have unit Test Framework Toolkit for LabVIEW 8.6.1(Uit Test Framework Toolkit 1.0)
    Thanks,
    Suresh Kumar.G

    John Harby <[email protected]> wrote:
    Did you ever find anything? We are looking too ...I have been reading a great book that has given me some ideas, but I have not
    solidified any proofs of concept as of yet. Check out Vincent Massol book JUnit
    in Action. He has some great working examples of Mock Objects and Stubs using
    Cactus and Jetty. What I am thinking is that in a seperate "Java Project" within
    the application, we can extend JUnit and create whatever global objects a process
    needs and then make SOAP based calls to the JPD, since the JPD is derived from
    a web service. So the other piece to this is experience Unit Testing SOAP...
    - Noam

  • Test coverage in LabView Unit Test Framework

    Hi,
    can somebody from NI confirm the following two statements about the Unit Test Framework:
    1. The framework does not support "recursive coverage metrics", where the coverage considers sub-VIs that are executed in the VI under test.
    2. 100% coverage means something weaker that common "branch coverage". For example, an "if" VI is a branch in the program but it is not considered as a branch by LabView's test coverage metrics.
    Thanks,
    Peter

    Hello Johannes,
    I'm interested in branch coverage of a VI under test.
    Imagine a VI A that calls another VI B. If A is tested and LV's unit test framework reports 100% test coverage for A, it is possible that the test cases didn't visit all frames (branches) in B.
    Now my question is: is it possible that LV thinks of A as "flattened" so that all code in B is considered as code of A?
    Peter

  • Selecting the JavaServer Faces in the Frameworks list diables FINISH button

    I create a new Web Application in Netbeans 5.5.1 by clicking the new project item from the FILE pull down list. I complete steps 1 and 2 and continue to step 3, FRAMEWORKS. A list of possible frameworks are listed with unchecked checkboxes. I highlight the JavaServer Faces and put a check in the checkbox. As soon as the checkbox is checked, the FINISH button becomes disabled. If I uncheck the JavaServer Faces, the FINISH button is enabled.
    What do I do to allow JavaServer Faces when creating a new web application project?
    It is not only the JavaServer Faces checkbox that disables the FINIISH button, ALL of the frameworks listed disable the FINISH button when I check them.

    Resolution: In Netbeans 5, click on the TOOLS menu pull down list. In that list, select Module Manager. A node listing of modules is displayed. Scroll down to the WEB node and click on the plus (+) to list the web modules. Uncheck the GWT4NB in the Active column. In my case it was version 1.3.4. Also uncheck the Struts Support in the active column, my version was1.3.30.1

  • Unit Test Framework Crashing LabVIEW Project

    [Cross posted to LAVA]
    OS: Windows 7
    LabVIEW 2009, LabVIEW 2009 SP1. 
    Howdy
    If I create a project and add a VI and a unit test to it and save it, then delete the unit test, everything works.
    If I do the above but put the unit test in a virtual folder then delete it, the project hangs and LabVIEW crashes.
    I have tried this on multiple computers and get the same effect.
    If it is an issue I thought this may have been fixed in 2009 SP1, but it is not.
    I do not remember this happening in 8.6.1 but I was using VISTA at the time (if that is related?)
    I get the same problem if I open an existing project with unit tests in it. 
    Additionally I have noticed that the files from the unit test in Windows 7 have no "logo/icon" associated with them. I thought in VISTA they had the same icon as what is shown in the project - the green plus sign (but this was a while ago). 
    Is this a known issue / can anyone else confirm this? 
    Cheers
    -JG 
    Message Edited by jg-code on 04-13-2010 10:02 PM
    Certified LabVIEW Architect * LabVIEW Champion

    Hi Kyle 
    Thank you for your quick reply.
    I am having this problem on my work PC, home PC and a colleague's PC all running Windows 7 and 2009 SP1.
    I have found the problem occurs if I create a new unit test OR drag and drop an existing unit test into the project.
    Attached is a (jing) video of what is happening every time.
    You will see that at the root level of the project (My Computer), I can remove the unit test with no problems at all.
    But if it goes into a virtual folder and I try to remove it - then LabVIEW crashes. 
    The attachment is in LV2009.SP1 and is the sane project as used in the video.
    I look forward to your feedback on resolving this issue I am having.
    Cheers
    -JG 
    Certified LabVIEW Architect * LabVIEW Champion
    Attachments:
    Test Project.zip ‏4 KB

  • Unit Test Framework Bug: Removing a test vector file causes LV to crash

    Labview 2010 f2 crashes every time I try to remove a test vector file from a project.
    Repro:
    1. Start Labview and open a new project.
    2. Save the project.
    3. Right-click My Computer and select New -> Test Vectors
    4. Right-click the new file (Untitled.lvvect) and select Remove From Project
    I've tried various combinations of renaming the test vector file, trying multiple vector files, etc., but the end result is the same.  LV always crashes when I try to remove the vector file from the project.  LV crashes when I try to remove it even if I delete the vector file from disk before loading it from disk.  The only way I can remove a vector file from a project once the project has been saved is by editing the .lvproj file directly.

    Update:  I believe the last error is a different issue.  I was able to narrow the error down to a specific library and generate some error logs.  That library hasn't had any code changes in over a month.  The error logs point to something in the f2 patch, which I installed last Thursday.  Unfortunately there doesn't appear to be any way to uninstall the patch.
    I can send the guilty code via email or ftp if you need them.
    Attachments:
    lvlog2010-11-03-11-12-03.txt ‏6 KB
    lvlog2010-11-03-12-00-02.txt ‏4 KB

  • Unit testing? help.

    OK, I have tried to rap my mind around this for a few days. Unit testing.
    I understand the whole:
    if(expected)
    System.out.println("OK");
    else System.out.println("Error");Part of my class assignment but, I am still confused. Could someone explain how unit testing works? (*No, I don't want you to do it for me or anything like that.*) I am just confused on how exactly to test for these things in Java. I understand that I need to add some kind of input then check the return value with what is expected. However, I am not sure how to implement it.
    Currently I tried something:
    public static void lastIndexOfTest(MyString ms, int pos, char expected){
              System.out.println("Testing lastIndexOf("+pos+") on MyString " + ms);
              System.out.println("Expected: " + expected);
              try{
                   int got = ms.charAt(pos);
                   System.out.println("Got: " + got);
                   if(got == expected)
                        System.out.println("OK");
                   else
                        System.out.println("Error2");
              catch(Exception e) {
                   System.out.println("Error: " + e);
              }That works in the sense that it give me no errors. But, because I don't understand how the code works exactly for everything I am not sure at all if it is even returning something thats correct.
    Here is my assignment:
    http://www.csl.mtu.edu/cs1122/www/programs/prog1/desc.html
    Are there any tutorials on how to do unit testing? I have been looking all over and can't seem to find any listed on Google.

    shawnw wrote:
    OK, I have tried to rap my mind around this for a few days. Unit testing."Unit" == "one thing". You're testing each individual part of your program that can be considered a single functional unit (not necessarily a single class).
    Although I've heard people say "unit testing" to apply to just about everything.
    That works in the sense that it give me no errors. But, because I don't understand how the code works exactly for everything I am not sure at all if it is even returning something thats correct.Well, obviously, knowing what's correct is necessary before you can test for correctness.
    Typically, you introduce specific test data to the thing being tested, and you know the correct response for that test data.
    For example if you're testing a factorial program, you know that 3! == 6.
    If you have a bit of (your own code) and you don't know what correct behavior of that code is, then you're not done. Really you shouldn't have started coding if you didn't know what you were trying to accomplish when you were done.
    In test-driven development, you write the test before you write the code. Among other things, this ensures that you know what a bit of code is supposed to accomplish before you even start, which is a good thing.
    Here is my assignment:
    http://www.csl.mtu.edu/cs1122/www/programs/prog1/desc.html
    Are there any tutorials on how to do unit testing? Didn't your professor mention it in class?
    I have been looking all over and can't seem to find any listed on Google.I find that hard to believe. If I Google "unit testing tutorial" I find a bunch. But several of those are specific to particular environments, so maybe that's what's throwing you off.
    The canonical unit testing framework for Java is JUnit. You might want to google for JUnit tutorials (I just did and found some).

  • Duplicate control label preventing creation of unit test using UTF

    Hello Everyone,
                       I was trying to create a unit test for a VI using the unit test framework (evaluation version) to try out this tool.  I have two questions regarding that.
    1. Several sub-VIs are called by this VI. I wanted to create a unit test for each of these sub-VIs so that I could test each of them separately. But UTF provided me with the option to create a unit test for the main VI alone. The main VI is huge and getting 100% code coverage is going to be a challenging task. Is there a way by which I can create unit tests for the individual sub-VIs?
    2. When I tried to create the unit test for the main VI, I got an error saying: Cannot create a test from this VI. Contains the following duplicate control labels. This is because the same indicators are being used in different cases within a case structure.
    Is there anyway to resolve this issue?
    Thanks

    If I understand what the problem is you have to controls or indicators that have the same name. To fix this you need to give them unique names. If they need to say the same thing on the user interface then you can use the caption and change the name on that so they both have the same name.
    Tim
    Johnson Controls
    Holland Michigan

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
    I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at was in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuernstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
    At the moment, most people's experience of performance tuning OWB mappings is firstly to see if it runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc are being used ok. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mapping (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the sourceforge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that it'll still compile and run.
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
    OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested, and changes the test status of mappings when you make changes to ones that they are dependent on.
    OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimizing Oracle Performance", as a way of tuning our generated mapping code.
    Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and have results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
    Some initial thoughts on how this could be accomplished
    - Put in place some method for automatically / easily generating explain plans for OWB mappings (issue - this is only relevant for mappings that are set based, and what about pre- and post- mapping triggers)
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
    - at an instance level, come up with some stock recommendations for instance settings
    - identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
    - put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported
    - Incorporate any existing "performance best practices" for OWB development
    - define a lightweight regime for unit testing (as per agile methodologies) and a way of automating it (utPLSQL?) and of recording the results so we can check the status of dependent mappings easily
    - other ideas around testing?
    Suggested Approach
    - For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables (a rough sketch of the trace on/off procedures follows this list)
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, have an exception report that tells you when a mapping execution time varies by a certain amount
    - get the standard set of preferred initialisation parameters for a DW, use these as the start point for the stock recommendations. Identify which ones have an effect on an ETL job.
    - identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
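    As a first stab at the pre-/post-mapping part of the approach above, the two process operators could simply call a pair of procedures like these - a sketch only, with invented names, assuming the trace calls run in the same session as the mapping:
    -- Called from an OWB pre-mapping process operator; p_mapping_name is assumed
    -- to contain only simple characters suitable for a tracefile identifier
    CREATE OR REPLACE PROCEDURE trace_on (p_mapping_name IN VARCHAR2) IS
    BEGIN
      EXECUTE IMMEDIATE
        'ALTER SESSION SET tracefile_identifier = ''' || p_mapping_name || '''';
      EXECUTE IMMEDIATE
        'ALTER SESSION SET EVENTS ''10046 trace name context forever, level 12''';
    END trace_on;
    /
    -- Called from the matching post-mapping process operator
    CREATE OR REPLACE PROCEDURE trace_off IS
    BEGIN
      EXECUTE IMMEDIATE
        'ALTER SESSION SET EVENTS ''10046 trace name context off''';
    END trace_off;
    /
    Formatting the trace file with TKPROF and loading the output into repository tables would then happen outside the database, driven from the process flow.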
    Further reading / resources:
    - "Diagnosing Performance Problems Using Extended Trace" Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
    - Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
    Although this is something that I'll be progressing with anyway, I'd appreciate any comments from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework, and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, do you have any existing best practices for tuning or testing, have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
    If you have any feedback, add it to this forum posting or send it directly to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    interesting post, but I think you may be focusing on the trees, and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
    1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitely: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
    2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a Critical Path, and then I can visually inspect it for any bottleneck processes. I usually find that there are no more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage; they did not need tuning at all - just scrapping.
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
    Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
    That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none, and operating mode=set based, and sometimes I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my diktat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!) (OK, I'll accept MS Project.)
    Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment. This is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole (stuff like recovery/restart, late-arriving data, and so on).
    For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a Dimensional update. What I am trying to do now is to graft this onto a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it back then.
    All suggestions on how to do that grafting gratefully received!
    To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Unit testing PL/SQL used in APEX application

    Question from my customer:
    I am developing an application in Oracle Application Express and am working on unit tests for the PL/SQL stored procedures and packages that are stored in the underlying database and that are used by the APEX application. These unit tests should run within the SQL Developer Unit Test framework.
    The problem is that some of the PL/SQL code stored in the database uses functions like NV('APPLICATION_ITEM') to access items in the APEX application. These do not return any values when I try to execute the PL/SQL within the unit test framework, i.e. through the back end. While it is good that the NV function does not error, NULLs do not really work well in my scenario either (for example when the result of this function is inserted into a NOT NULL column of a table). I can think of a few workarounds, such as creating my own NV function inside the test schema to return desirable values, but none of them seems a really satisfactory solution. I just wonder if there is any best practice recommendation from Oracle for this scenario - how can I run code that uses APEX-specific functions through the back end? I could not find anything in the APEX documentation for this, but I'd be interested to know if there is any recommendation on how best to deal with this case.
    I am using SQL Developer version 4.0.0.13.80
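    For illustration, the "create my own NV" workaround mentioned above could look something like the sketch below; the mapping table is invented, and a local NV simply shadows the APEX public synonym for code compiled in the test schema - whether that shadowing is really satisfactory is the open question:
    -- Hypothetical table holding the values the tests want NV() to return
    CREATE TABLE test_apex_items (
      item_name  VARCHAR2(255) PRIMARY KEY,
      item_value NUMBER
    );
    -- A local NV in the test schema shadows the APEX public synonym;
    -- APEX's NV returns NUMBER, so the stub does too
    CREATE OR REPLACE FUNCTION nv (p_item IN VARCHAR2) RETURN NUMBER IS
      l_value NUMBER;
    BEGIN
      SELECT item_value INTO l_value
        FROM test_apex_items
       WHERE item_name = UPPER(p_item);
      RETURN l_value;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        RETURN NULL;  -- mimic NV when the item has no session state
    END nv;
    /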

    User[[:digit:]*
    Your PL/SQL Package APIs are poorly designed.
    You need to take Tom Kyte's quote to heart:
    "Application come and application go, but data remains forever"
    In short, you need to separate your database processing code (the stuff you need to unit test) from front-end/middle tier code.
    (The repetitiveness that follows is for effect, not rudeness.)
    As such, the PL/SQL code that you need to 'UNIT TEST' must work without needing to run in APEX.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in .NET.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in JSP.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in Jive.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in Ruby.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in Perl::CGI.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in P9.
    The PL/SQL code that you need to 'UNIT TEST' must work without needing to run in <place latest and greatest thing here>.
    Again, I don't mean to sound rude.  I'm just trying to reinforce the idea that you need to separate database code from middle-tier/front-end stuff.
    Basically, you will need to separate all of your packages into multiple parts (a rough sketch follows this list):
    a _CORE package (that will be unit tested) that does all the hard work
    an _APEX package for APEX infrastructure (this works with NV()/V(), etc.)
    a _NET package for .NET infrastructure when you need it
    a _JSP package for the JSP infrastructure when you need it
    a _JIVE package for the JIVE infrastructure when you need it
    a _<place latest and greatest thing here> for the <place latest and greatest thing here> when you need it.
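    A minimal sketch of that split (package, table and item names are all invented; the only point is that the _CORE body never touches APEX session state, so it can be unit tested directly):
    -- Hypothetical target table
    CREATE TABLE orders (customer_id NUMBER NOT NULL, amount NUMBER NOT NULL);
    -- The _CORE package knows nothing about APEX; everything arrives as parameters
    CREATE OR REPLACE PACKAGE order_core AS
      PROCEDURE create_order (p_customer_id IN NUMBER, p_amount IN NUMBER);
    END order_core;
    /
    CREATE OR REPLACE PACKAGE BODY order_core AS
      PROCEDURE create_order (p_customer_id IN NUMBER, p_amount IN NUMBER) IS
      BEGIN
        INSERT INTO orders (customer_id, amount) VALUES (p_customer_id, p_amount);
      END create_order;
    END order_core;
    /
    -- The _APEX package is a thin wrapper that resolves session state and delegates
    CREATE OR REPLACE PACKAGE order_apex AS
      PROCEDURE create_order;
    END order_apex;
    /
    CREATE OR REPLACE PACKAGE BODY order_apex AS
      PROCEDURE create_order IS
      BEGIN
        order_core.create_order(
          p_customer_id => NV('P10_CUSTOMER_ID'),  -- APEX items live only in this layer
          p_amount      => NV('P10_AMOUNT'));
      END create_order;
    END order_apex;
    /
    -- The SQL Developer unit tests then target ORDER_CORE.CREATE_ORDER directly.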
    MK

  • Better UI for Unit Tests

    I've been doing a lot of unit tests lately.  Right now I'm working on a code converter AIR app, and I'm constantly creating unit tests for bugs I'm finding, running them, and debugging them when they don't work.  When I debug, it's easiest to isolate the runner to run only one test.  I know I can do that by editing the source file and typing in the test name in the core.run call.  However, I can't help but think a better UI is needed for this form of Test-Driven Development.
    Basically, what I think would be perfect is to have the UI not run the tests initially.  It should provide you with a list of tests with checkboxes, then give you the opportunity to select which ones you want to run, and then run them.  It should also have the ability to re-run a test.  I realize that for any code changes to take effect you would need to re-run the entire app, but a lot of times I find myself running the same test over and over and stepping through the code with the debugger in order to figure out what's going on.  Then I eventually change the code.
    Also, the new UI should persist the last set of selected tests, and maybe have a way to store previous selections.  That would make it perfect for how I use it. 
    Does anyone agree or disagree with this?  Has someone done this already?
    BTW, the reason I'm using the standalone runner and not the one in Flash Builder 4 is because I've run into some bugs with the FlexUnit shipped with Flash Builder 4, and there doesn't seem to be a way to get the latest FlexUnit 4.1 to work properly with Flash Builder 4.  There seems to be a way to do it with Flash Builder 4.5, but I don't have that version.
    Thanks,
    Mark

    On 05/08/2012 03:56 PM, prakash jv wrote:
    > We have been looking for a unit test framework for unit testing SWT
    > components in our RCP application developed in eclipse galileo 3.5.
    >
    > We found that SWTBot supports UI testing well and wanted additional
    > details regarding its support for maven 2.2.1.
    >
    > Does SWTBot work with projects which are mavenized with maven 2.2.1?
    >
    > Our aim in adding the unit tests for UI components is better build
    > quality. So we would want these unit tests to be run every time we build
    > our assembly. As of now we use JUnit 4 for running our JUnits, and they
    > run with maven outside eclipse.
    >
    > Does SWTBot support running the UI unit tests outside eclipse using maven?
    Hi
    in
    http://code.google.com/a/eclipselabs.org/p/emf-components/
    we run swtbot tests with maven/tycho
    hope this helps
    cheers
    Lorenzo
    Lorenzo Bettini, PhD in Computer Science, DI, Univ. Torino
    ICQ# lbetto, 16080134 (GNU/Linux User # 158233)
    HOME: http://www.lorenzobettini.it MUSIC: http://www.purplesucker.com
    http://www.myspace.com/supertrouperabba
    BLOGS: http://tronprog.blogspot.com http://longlivemusic.blogspot.com
    http://www.gnu.org/software/src-highlite
    http://www.gnu.org/software/gengetopt
    http://www.gnu.org/software/gengen http://doublecpp.sourceforge.net

  • Unit Test component missing after upgrade to 3.2

    I recently upgraded SQL Developer from 3.1 to 3.2.20.09 (I just installed it over the previous version).
    Since the new installation the unit test component has been missing. After some searching I found this entry in the help/info/enhancements panel:
    Oracle SQL Developer - Unit Test    oracle.sqldeveloper.unit_test    11.2.0.09.87    Deaktiviert durch Benutzer (=>deactivated by user !?)
    It seems as if it was deactivated somehow. I certainly can't remember deactivating any components/enhancements, especially not the unit testing framework.
    How can I enable it again?

    JimSmith wrote:
    You should never install a new version over an old version. It always causes problems. The upgrade path as documented in the release notes is ALWAYS to install in a new directory.
    Hm... I just double checked the installation guide (http://docs.oracle.com/cd/E35137_01/doc.32/e35119/install.htm#CIHFGGJB) and I can't see a comment that you HAVE TO use a new directory. And even more that reusing the same directory would result in a corrupt installation.
    The only thing that it tells me is that IF I want to migrate user settings from a previous release, I need to install it in an empty folder.
    To migrate user settings from a previous SQL Developer release:
      Unzip the kit for the current release into an empty directory (folder). Do not delete or overwrite the directory into which you unzipped the kit for the previous SQL Developer release.
    If I don't want to migrate user settings, then I could install it into the same directory.
    Did I overlook something in the installation guide? Maybe there is a documentation gap?
    The comment in the release notes is easily overlooked, especially once you start looking through the installation guide, where the wording could be misleading.

  • Unit Test in SQL Developer 2.1: Automated Builds

    Hi,
    I am interested to know if the new Unit Testing framework can be accessed via an API, so that test execution can be initiated from an automated build process.
    Regards,
    Vadim

    I am having a problem with the unit testing command line.
    I am attempting to run the unit tests using the command-line interface.
    I can connect to the UNIT_TEST_REPOS schema in SQL Developer.
    I am successfully running unit tests and suites in SQL Developer.
    The UNIT_TEST_REPOS, RCSV1 and DEVER users are granted the UT_REPO_USER role, and UNIT_TEST_REPOS and DEVER are granted UT_REPO_ADMINISTRATOR.
    The following commands result in an error box saying "No Repository was found on the selected connection, you need to create a repository." (The HELP button apparently does nothing. The OK button closes the box.)
    C:\Program Files\sqldeveloper\sqldeveloper\bin>UtUtil -exp -test -name RCSV1_RCS_SECURITY.GET_LDAP_BASE -repo unit_test_repos -file c:\ut_xml\test.xml
    Unable to open repository
    C:\Program Files\sqldeveloper\sqldeveloper\bin>UtUtil -run -test -name RCSV1_RCS_SECURITY.GET_LDAP_BASE -repo unit_test_repos -db dever
    Unable to open repository
    C:\Program Files\sqldeveloper\sqldeveloper\bin>UtUtil -run -test -name RCSV1_RCS_SECURITY.GET_LDAP_BASE -repo dever -db dever
    Unable to open repository
    I would guess that I am not supplying the correct connection info.
    My last comment triggered an idea. It turns out that the connection names required are the connection names defined in SQL Developer. In my case, they are not the same as the schema names. The following command worked as advertised.
    C:\Program Files\sqldeveloper\sqldeveloper\bin>UtUtil -run -test -name RCSV1_RCS_SECURITY.GET_LDAP_BASE -repo UNIT_TEST -db DeverLocal
    The ANT target is:
    <target name="UnitTests">
      <exec executable="cmd" dir="${sqldev.bin.dir}">
        <arg value="/c"/>
        <arg value="UtUtil -run -suite -name RCSV1 -repo UNIT_TEST -db DeverLocal"/>
      </exec>
    </target>
    Regards,
    Bill

  • Unit test with Dynamic Value Query

    Hi,
    I am planning a function which should return a table name for a given ROWID. The ROWID will be selected from a view. Now I am trying to set up a unit test which consists of a Startup and Teardown process as well as a Dynamic Value Query. During the Startup process one dataset is inserted into a table which is part of a view definition. That dataset should be selected by the Dynamic Value Query. However, I am getting the following error message: "Unable to run the test because no rows were returned by the dynamic query." Is it possible that within a Dynamic Value Query I can't make use of datasets which were inserted by a Startup process? I am asking because the same selection outside the unit testing framework returns the dataset as expected.
    SQL Developer: 3.2.20.09 Build MAIN-09.87
    Java: 1.6.0_45
    Regards.

    Dynamic SQL is very tricky.  Since you can get the data outside your application, extract and test the dynamic SQL as it is generated.  When working with dynamic SQL I find it useful to first generate the SQL text as a string and then, if necessary, store it in a table (CLOB column) for later reference.  Because of the uncertainty of obtaining bind variable values later, I prefer to hard-code such values into the dynamic SQL - possibly slightly less efficient for Oracle at run time, but a lot easier to work with when debugging and tuning (again, apart from the built-in inefficiency of not using bind variables).  I usually find the overhead of the hard-coded values not too bad, and the ability to see the data values at a glance useful.  I also find it useful to write dynamic SQL in such a way that the resulting statement can be pasted into any normal SQL tool (SQL*Plus, SQL Developer) and run without any editing.
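    A rough illustration of that approach - build the statement as a string, keep a copy in a CLOB column, then execute it - might look like the following; the log table and procedure are invented for the sketch:
    -- Hypothetical log table for generated statements
    CREATE TABLE dyn_sql_log (
      logged_at DATE DEFAULT SYSDATE,
      sql_text  CLOB
    );
    CREATE OR REPLACE PROCEDURE run_dynamic_count (p_table_name IN VARCHAR2) IS
      l_sql   VARCHAR2(4000);
      l_count NUMBER;
    BEGIN
      -- Build the full statement first, so the exact text can be pasted into
      -- SQL*Plus / SQL Developer unchanged when something goes wrong
      l_sql := 'SELECT COUNT(*) FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
      -- Keep a copy for later reference and debugging
      INSERT INTO dyn_sql_log (sql_text) VALUES (l_sql);
      EXECUTE IMMEDIATE l_sql INTO l_count;
      DBMS_OUTPUT.PUT_LINE(p_table_name || ': ' || l_count || ' rows');
    END run_dynamic_count;
    /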
    There may be some quirk in the generated dynamic SQL preventing it from finding anything.
    Good luck.

  • Create Unit Test Vectors From The Unit Test Configuration Window

    I have recently been using more test vectors in the unit test framework.
    The principle works well but if you are in a unit test and decide you need a test vector you must:
    Close the unit test configuration
    Create a test vector
    Set up your vector (entering values and data types, which would be much easier if you could see the actual unit test case)
    Close the vector configuration window
    Open the unit test configuration
    Assign the new vector file to the unit test
    Now you can assign vectors to test inputs
    This seems convoluted and forces unnecessary context switches.
    I propose that, at a minimum, you should be able to create a new vector file and launch its configuration without leaving the unit test configuration window. I suspect that the whole process could be streamlined even further, though.
    James Mc
    ========
    CLA and cRIO Fanatic
    wiresmithtech.com/blog

    Oops, this was supposed to go in the Idea Exchange, sorry!
    James Mc
    ========
    CLA and cRIO Fanatic
    wiresmithtech.com/blog
