JUnit question: multiple input files

I'm not sure whether fixtures are what I need here, or how to use them in my case. I have a simple XMLUnit test case that compares a set of XML files using DetailedDiff and getAllDifferences(). This works just fine for a comparison of a single pair of XML files, but how should it be done for n pairs of XML files? I can't simply put the assertEquals inside testXXX in a for loop, since it will exit on the first failure.
I want to be able to run this XMLUnit test case on n pairs of XML files and output all failures for all n pairs. I hope all of that made sense :).

What you want is a data-driven test. They are quite common, and a lot has been written about them on the web (hint: Google ;-)).
This is one example using JUnit 3.x: http://twasink.net/blog/archives/2003/10/junit_and_datad.html
For 4.x there are more modern possibilities out there (e.g. the Parameterized runner), but I would have to Google them as well.
Also, the Yahoo JUnit group (might require registering: http://tech.groups.yahoo.com/group/junit/messages ) is pretty active and has plenty of information about this topic in its archive (don't post that question there, though; it's already been answered a lot).
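To sketch the core idea without committing to a particular JUnit version: run the comparison for every file pair yourself, collect the failures, and only assert at the end. The diff() method below is a hypothetical stand-in for the XMLUnit comparison (in the real test it would wrap DetailedDiff.getAllDifferences()); the rest is plain Java:

```java
import java.util.ArrayList;
import java.util.List;

public class CollectAllFailures {

    // Hypothetical stand-in for the XMLUnit comparison: returns a list of
    // difference descriptions; an empty list means the pair matched.
    static List<String> diff(String controlXml, String testXml) {
        List<String> diffs = new ArrayList<String>();
        if (!controlXml.equals(testXml)) {
            diffs.add("expected <" + controlXml + "> but was <" + testXml + ">");
        }
        return diffs;
    }

    // Compare every pair and collect every failure, instead of letting the
    // first assertEquals abort the loop.
    static List<String> compareAll(String[][] pairs) {
        List<String> failures = new ArrayList<String>();
        for (String[] pair : pairs) {
            List<String> diffs = diff(pair[0], pair[1]);
            if (!diffs.isEmpty()) {
                failures.add("pair [" + pair[0] + ", " + pair[1] + "]: " + diffs);
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        String[][] pairs = {
            { "<a/>", "<a/>" },   // matches
            { "<a/>", "<b/>" },   // differs
            { "<c/>", "<d/>" }    // differs
        };
        List<String> failures = compareAll(pairs);
        // In the real testXXX you would finish with something like:
        // assertTrue(failures.toString(), failures.isEmpty());
        System.out.println(failures.size() + " failing pair(s)");
    }
}
```

In JUnit 4 the same effect falls out of the @RunWith(Parameterized.class) runner, which turns each file pair into its own test instance and so reports every failing pair separately.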

Similar Messages

  • File bc: Multiple input files in one folder

I am working on an integration project which includes five BPEL processes. These processes are initiated whenever a file is copied into a given folder. The input-filename patterns for these BPEL processes are different (say, input1_%d.txt, input2_%d.txt etc.), but the files all arrive in one single folder.
To implement it, we created one more BPEL process, which was bound to the file BC. The first activity in this process is a 'pick' activity. Five operations were defined, one per input file (name pattern and schema), in the WSDL under a single port type, and the port type was bound to the folder where all messages are received. For all five operations, the polling interval was defined as 1000 ms.
For the pick activity, five 'onMessage' nodes were created, one per operation, each invoking the corresponding (one of five) BPEL process.
We tested this solution and it worked: it was able to pick up all five types of input files and invoke the BPEL processes.
However, the system log of GlassFish contains a number of exception messages. To me, it looks like all five operations are trying to get the lock on the folder at the same time and this exception is thrown. Though it does not create any problem in the execution of the process, it is a headache for the system administrator.
Is there any way to suppress these specific error messages, so that they are not sent to the system log?
The error message in the system log is:
    [#|2007-12-21T22:21:40.602-0500|SEVERE|sun-appserver9.1|sun-file-binding.com.sun.jbi.filebc.InboundMessageProcessor|_ThreadID=32;_ThreadName=filebc-ib-Thread-47;_RequestID=15c6d2fe-59e4-4f4f-970f-d13d5d7024ea;|FILBC-IMP0001: inonly message for operation {http://www.servicelive.com/wsdl/HPServiceOrder}AcceptCancelOrder failed processing, an exception was raised. null
    java.lang.IllegalMonitorStateException
         at java.util.concurrent.locks.ReentrantLock$Sync.tryRelease(ReentrantLock.java:125)
         at java.util.concurrent.locks.AbstractQueuedSynchronizer.release(AbstractQueuedSynchronizer.java:1102)
         at java.util.concurrent.locks.ReentrantLock.unlock(ReentrantLock.java:431)
         at com.sun.jbi.filebc.InboundMessageProcessor.execute(InboundMessageProcessor.java:343)
         at com.sun.jbi.filebc.InboundMessageProcessor.run(InboundMessageProcessor.java:184)
         at java.lang.Thread.run(Thread.java:595)
    |#]
    [#|2007-12-21T22:21:41.602-0500|SEVERE|sun-appserver9.1|sun-file-binding.com.sun.jbi.filebc.InboundMessageProcessor|_ThreadID=32;_ThreadName=filebc-ib-Thread-47;_RequestID=15c6d2fe-59e4-4f4f-970f-d13d5d7024ea;|FILBC-IMP0001: inonly message for operation {http://www.servicelive.com/wsdl/HPServiceOrder}AcceptCancelOrder failed processing, an exception was raised. null
    java.lang.IllegalMonitorStateException
         at java.util.concurrent.locks.ReentrantLock$Sync.tryRelease(ReentrantLock.java:125)
         at java.util.concurrent.locks.AbstractQueuedSynchronizer.release(AbstractQueuedSynchronizer.java:1102)
         at java.util.concurrent.locks.ReentrantLock.unlock(ReentrantLock.java:431)
         at com.sun.jbi.filebc.InboundMessageProcessor.execute(InboundMessageProcessor.java:343)
         at com.sun.jbi.filebc.InboundMessageProcessor.run(InboundMessageProcessor.java:184)
         at java.lang.Thread.run(Thread.java:595)
    |#]
    Thanks,
    Sanjay

    Sanjay,
Most of the folks who would normally reply are on winter holiday break, so there may be a delay in responses. In the meantime, I would really appreciate it if you could log this as a bug at http://open-jbi-components.dev.java.net. You need to join as an observer to log the bug. You may also want to subscribe to [email protected] and start posting questions there so that you get a fast response from the Open ESB dev team. FYI, Open JBI Components is a subproject of Open ESB, and the dev alias at Open ESB is very active.
    Suresh

  • Loading a table from multiple input files using sqlldr

    Hi,
For my project I need to load into a table by joining two input files using sqlldr. For example,
I have file1, which has the fields
name,salary,ssn
and file2, which has the fields
ssn,location,country
Now I need to load these values into a table, say employee_information, by joining both input files. The input files can be joined using ssn as the common field.
Any idea how to do this?
    Thanks in advance
    Satya.

    Hi,
What is the size of the files? If possible, mail me the sample files and the structure of the table. Will the ssn from the first file and the second file go into separate columns in the table, or do we have to merge them?
    SKM
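One thing worth noting: sqlldr itself cannot join two input files. The usual options are to load each file into its own staging table and join in SQL afterwards (or use external tables), or to pre-merge the two files into one before loading. As a sketch of the pre-merge step (the column layouts are taken from the question; everything else here is hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class SsnMerge {

    // Join lines from "file1" (name,salary,ssn) and "file2" (ssn,location,country)
    // on the shared ssn field, producing one merged line per matching ssn.
    static Map<String, String> merge(String[] file1Lines, String[] file2Lines) {
        // Index file2 by ssn for O(1) lookup.
        Map<String, String> bySsn = new HashMap<String, String>();
        for (String line : file2Lines) {               // ssn,location,country
            String[] f = line.split(",");
            bySsn.put(f[0], f[1] + "," + f[2]);
        }
        // Walk file1 and attach the matching location,country columns.
        Map<String, String> merged = new HashMap<String, String>();
        for (String line : file1Lines) {               // name,salary,ssn
            String[] f = line.split(",");
            String extra = bySsn.get(f[2]);
            if (extra != null) {
                merged.put(f[2], f[0] + "," + f[1] + "," + extra);
            }
        }
        return merged;
    }

    public static void main(String[] args) {
        String[] f1 = { "Scott,1000,111" };
        String[] f2 = { "111,NY,US" };
        System.out.println(merge(f1, f2).get("111")); // Scott,1000,NY,US
    }
}
```

The merged lines can then be written to a single flat file and loaded with a plain one-table control file.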

  • Multiple input files to multiple file outputs

    I have a workflow that can select multiple files and zip them into a single zip file.
    What modification should I do to get a workflow that zips each of the files separately into separate zip files?
    So I can select multiple files but each file is zipped into a separate file.

You can use a Run AppleScript action to do it - the following action can be used instead of the Finder's Create Archive action:
    -- this text can be pasted into an Automator 'Run AppleScript' action
    on run {input, parameters}
        -- create a PKZip archive of the selected Finder item(s)
        -- if no destination folder is specified, the archive will be placed in the same location
        -- input: a list of Finder items to archive
        -- output: a list of Finder items archived
        set output to {}
        set SkippedItems to {} -- this will be a list of skipped items
        set DestinationFolder to missing value -- a Finder path to a destination folder, if different
        repeat with SomeItem in the input -- step through each item in the input
            try
                set SomeItem to SomeItem as text
                if the last character of SomeItem is in {":", "/"} then set SomeItem to text 1 thru -2 of SomeItem
                set ArchiveSource to POSIX path of SomeItem
                if DestinationFolder is missing value then -- save the archive to the same location
                    set ArchiveName to ArchiveSource & ".zip"
                else -- save the archive to the specified folder
                    set TheName to name of (info for SomeItem as alias)
                    set ArchiveName to (POSIX path of DestinationFolder) & TheName & ".zip"
                end if
                do shell script "ditto -ck " & (quoted form of ArchiveSource) & space & (quoted form of ArchiveName)
                set the end of the output to (POSIX file ArchiveName) as alias -- success
            on error ErrorMessage number ErrorNumber
                log ErrorMessage
                set the end of SkippedItems to SomeItem
            end try
        end repeat
        if SkippedItems is not {} then -- handle skipped items
            set TheCount to (count SkippedItems) as text
            display alert "Error with Archive action" message TheCount & " item(s) were skipped - workflow will continue"
            choose from list SkippedItems with title "Error with Archive action" with prompt "The following items were skipped:" with empty selection allowed
            if result is false then error number -128 -- user cancelled
        end if
        return the output -- pass the result(s) to the next action
    end run

  • Multiple Input Files using File adapter

Hey experts,
I wanted to know how I could pick up two different XML files using the sender file adapter, which I want to process in my scenario. I'd appreciate your suggestions; I don't want to use BPM. Thanks

    When you use the "additional file" feature, you have to keep in mind that the additional file enters the Integration Engine as attachment of the primary file.
    Modifying this attachment (or merging the two files) is difficult, as you can't use graphical mapping. Accessing the attachment with the other mapping techniques is often cumbersome as well.
With a BPM, you could use two sender communication channels and a correlation, but that alternative is apparently not an option for you.
    Could you please tell us
    1. whether or not the two files have the same message type,
    2. which processing has to be performed afterwards and
    3. where does it go to when it leaves the XI?
    That would greatly help us.
    Kind regards,
    Dennis

  • Very basic compiling question: multiple c files

    Hi,
    I've been playing with alchemy for a few days, and hit a road block. I'm sure there is an easy answer, but I can't find it...
    I have a .c file "alchemy_project.c" that lets me call 3 functions from actionscript. I have the basic main() function that sets up the functions and calls
    AS3_LibInit(result);
    It is working well. However, I want to add another c file "support_functions.c" with a function I can call from "alchemy_project.c".
int testFunction(){
     int i;
     i = 2;
     return i;
}
    I made a header file "support_functions.h":
    int testFunction();
    I included this header in "alchemy_project.c" by placing it after my AS3.h include:
    #include "AS3.h"
    #include "support_functions.h"
    I compile with:
    alc-on
    gcc alchemy_project.c support_functions.c -Wall -swc alchemy_project.swc
    When I run my flash file, it gives:
    Undefined sym: _main
    If I add a main function to support_functions.c:
int main(){
    return 0;
}
    I get:
    [object AlchemyExit]
    Any tips?
    Thanks for the help!

    This helped me a little bit:
    http://forums.adobe.com/thread/465926
    I now compile the 2 c files with:
    gcc -c file1.c -o file1.o
    gcc -c file2.c -o file2.o
    then link them with
gcc file1.o file2.o -O3 -Wall -swc --combine lib.swc
    I don't get any errors about main(), but actionscript gives me the error:
    Undefined sym: _testFunction
    when alchemy_project.c tries to call the testFunction() function in support_functions.c

  • Track multiple input files in SQLLDR

    I'm loading many infiles, say "File1", "File2", "File3", etc with SQLLDR. Can I create a column in the table that loads the filename into a column? Maybe a better way to ask is: how can I reference the INFILE name in the control file, so that I would get the INFILE text/name for each record in that file?
    Edited by: Yura on Dec 23, 2009 3:29 PM

The only way to accomplish this is to hard-code the file name into the file or ... Years ago I wrote some code based on UTL_FILE.
You can find a version of it here:
http://www.morganslibrary.org/reference/utl_file.html
under "WRITE DEMO", which dynamically builds the control file.
A variation on that theme might solve the problem.

  • Any Tutorial / Sample to create Single PDF from multiple source files using PDF assembler in a watched folder process.

Is there any tutorial or sample for creating a single PDF from multiple source files using the PDF Assembler in a watched-folder process? I have a client application which will prepare a number of source files plus some metadata (in .XML) which will be used in the header/footer. Is it possible to put a runtime-generated DDX file in the watched folder and use it in the process? If so, how can I pass the file names in the DDX? Any sample process would be very helpful.

If possible, make use of the Assembler API in your client application instead of doing this with a watched folder. Here are the Assembler samples: LiveCycle ES2.5 * Programming with LiveCycle ES2.5
A watched folder can accept zip files (sample: Configuring a watched folder to handle multiple input files and write results to a single folder | Adobe LiveCycle Blog). You can also use an execute script to create the DDX at runtime: LiveCycle ES2 * Application Development Using LiveCycle Workbench ES2
    Thanks
    Wasil
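On generating the DDX at runtime: a DDX is just XML, so a script step can assemble it from the runtime file list. Below is a minimal sketch; the element names follow the standard concatenation form of the DDX grammar, but verify them against the DDX Reference for your LiveCycle version, and the file names are placeholders:

```java
import java.util.Arrays;
import java.util.List;

public class DdxBuilder {

    // Build a minimal DDX document that concatenates a runtime list of
    // source PDFs into one result PDF.
    static String buildDdx(String resultName, List<String> sourceFiles) {
        StringBuilder ddx = new StringBuilder();
        ddx.append("<DDX xmlns=\"http://ns.adobe.com/DDX/1.0/\">\n");
        ddx.append("  <PDF result=\"").append(resultName).append("\">\n");
        for (String f : sourceFiles) {
            ddx.append("    <PDF source=\"").append(f).append("\"/>\n");
        }
        ddx.append("  </PDF>\n</DDX>\n");
        return ddx.toString();
    }

    public static void main(String[] args) {
        // Placeholder file names; in the watched-folder process these would
        // come from the client application's metadata XML.
        System.out.println(buildDdx("merged.pdf",
                Arrays.asList("header.pdf", "body.pdf", "footer.pdf")));
    }
}
```

The resulting string is what you would hand to the Assembler (or drop into the watched folder) in place of a static DDX file.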

  • Flat File-to-RFC question, multiple RFC calls for one file.

    Hi guys,
    I'm quite new to XI / PI and I have a question regarding a File-to-RFC scenario we're trying in NW PI 7.1.
    We have a flat file which has two lines with two fields each, let's take this as an example :
    001,001
    002,002
The file needs to be converted to XML and then transferred to the RFC, which will update a table with the values above.
    In the ESR I've created 3 data types (z_row1,z_record1 and z_fileinput1), 1 message type (z_file2rfc_ob_mt), 1 message mapping (z_file2rfc_mm), 2 Service Interface (z_file2rfc_ob_si and z_file2rfc_ib_ztestpi) and 1 operation mapping (z_file2rfc_om).
    In the Integration Builder (ID) I've created everything required (The sender and receiver communication channels, sender and receiver agreement, receiver determination and interface mapping).
    We're also using content conversion to convert the flat file to XML, this does seem to work because I see all the lines/field in the message when it gets into PI. The problem is that the RFC can only accept two fields at a time and only the first line gets updated. I see that only the first line gets sent to the RFC.
    How can I make the RFC call for each and every line ?
    Thanks !!

Create the RFC with a table parameter, which takes multiple line items as input and updates the table in a single call.
If you want a response back, call the RFC synchronously; otherwise use asynchronous mode.
Done this way, a single call will update the complete table.
    Gaurav Jain
    Reward Points if answer is helpful

  • Multiple DB files clean migration (follow up to my decommission question)

So I have requested help standing up a new environment and phasing out an old one, since the old one was "all messed up". After spending some time digging into it, it looks like best practice was actually followed (I don't know how I missed this after reading through Mastering the Fundamentals by Kent Agerlund multiple times). Apparently they suggest you create multiple SQL files based on how many CPUs you have available, to help performance.
The question I have now is: when the backup site task runs, it only backs up one 2 GB MDF file and an 11 GB LDF file. Does that make sense? It doesn't look like the other SQL files that were created are being used at all. None of the SQL files have grown past their initial size (the first file started at 2 GB and is still that size).
Assuming everything is working correctly, my new plan would be to upgrade the existing setup and do a backup and restore to the new box using the same site code and server name. If I do that, will it bring all those other SQL files with it, or can I consolidate onto one file again? I don't think we need to worry about performance, as the DB is pretty small and I'm trying to keep things simple.
My other choice: I have the new server built with a new site code, so I could add the first site as a child site and migrate things over. I have multiple DPs out there, so I'm not sure of the best way to handle them. The plan was to blow them away and rebuild brand new ones for the new site, but if I can just throw an upgrade on them, that should work and be less effort.

    My other choice is I have the new server built with new site code so I could add the first site as a child site and migrate things over?
    That won't work in CM12.
    Just back up the database using SQL and use that for the restore (there's no need for the SMSBackup task in CM12).
    Torsten Meringer | http://www.mssccmfaq.de

  • The file size of selected file in input file control is shown as 0 for multiple file selection in Safari 5.1

The file size of a selected file in an input file control is shown as 0 for multiple file selection in Safari 5.1. If you select a single file, the file size is returned correctly. However, if you select multiple files, the file size of each selected file is always returned as 0 from JavaScript. This works correctly in Safari 4.0 but not in Safari 5.1.
    How do I get the correct file size in Safari 5.1 ?

    If you want to post (or send me) a link to the lrcat file, I'd take a look at it for you, and give you a break-down what's consuming all the bytes. But it might be fun to learn how to do that yourself (e.g. using SQL). I use SQLiteSpy, but other people have their favorites.. (or you can use a command-line client if you prefer..). One way: just run "drop table "{table-name}" on each table then look at filesize (do this to a copy, not the real thing).
    Anyway, it's hard to imagine keywords and captions etc. taking much of the space, since even if you had 1000 10-character words of text metadata per photo average that still only adds up to 117MB, which isn't a substantial portion of that 8G you're seeing occupied.
    Anyway, if you've painted the heck out of most of them and not cleared dev history, that'll do it - that's where I'd put my money too...
    One thing to consider to keep file-size down:
    ===================================
    * After reaching a milestone in your editing, take a snapshot then clear edit history, or the top part of it anyway (e.g. leave the import step), using a preset like:
    Clear Edit History.lrtemplate
s = {
    id = "E36E8CB3-B52B-41AC-8FA9-1989FAFD5223",
    internalName = "No Edit",
    title = "Clear Edit History",
    type = "Develop",
    value = {
        settings = {
            NoEdit = true,
        },
        uuid = "34402820-B470-4D5B-9369-0502F2176B7F",
    },
    version = 0,
}
    (that's my most frequently used preset, by far ;-})
    PS - I've written a plugin called DevHistoryEditor, which can auto-consolidate steps and reduce catalog size - it's a bit cumbersome to use a.t.m. but in case you're interested...
    Rob

  • Syndicating multiple Outputs for 1 Input file

    Hi SDners,
When I syndicate records, the output is split into 2 files for 1 input file.
Example: 10 records are in the input file. While importing, Import Manager identifies 2 records as duplicates and creates only 8 records in MDM. While syndicating, these 8 records are split into 2 output files. I want them in 1 output file.
This is not happening every time, only sometimes. If I process the same file again in MDM, then it syndicates into 1 output file only.
    I configured Parameters like this:
    Suppress unchanged records=Yes
    Max Records per Output file=0
    XML File Output=Single File (All Records)
    and the PORT Map also like this
    "Processing type=Automatic"
    "Processing Interval=Continues".
    Still I facing the Same problem.
Why is it happening only sometimes? If I had configured the parameters wrongly, it should split the output into 2 files every time, but it happens only about once a week, and to only 1 or 2 files.
The MDM version I am using is SAP MDM 5.5 AP06 (5.5.63.71). Any help ASAP would be appreciated.
    Thanks
    Kiran

    Hi Kiran,
Can you check the key mappings for the records? If you have multiple key mappings maintained and you have overridden the key mapping in the remote system's properties, then MDM will generate identical copies for each remote key.
    Thanks
    Vinay

  • One Input File Multiple output files.

    Hi,
How can I generate multiple output (target) files from a single input (source) file in a File-to-File scenario?
    Please help me to do this.
    With Regards,
    Mahesh

    Hi,
This is a 1:N transport scenario. For this you should use multi-mapping, but you may need to change your logic, because a multi-mapping can only be executed from within ccBPM; so use a BPM (integration process) in the IR to implement your logic in the File-to-File scenario.
Oops, I missed one thing:
In a simple file-to-file scenario you can use message patterns to duplicate the target fields if you are getting them from a single file. In the target field structure, right-click on the root tag and then click 'Duplicate Subtree', and you are done. The BPM approach above is the alternative, if you want to use a BPM.
    Hope this will help you,
    Regards
    Aashish Sinha
    PS : reward points if helpful

  • Question about creating multiple output  files from same query

    I have a query like this:
    select * from emp;
    ename empno deptno
    Scott 10001 10
    Tiger 10002 10
    Hanson 10003 20
    Jason 10004 30
    I need to create multiple output files in xml format for each dept
    example:
    emp_dept_10.xml
    emp_dept_20.xml
    emp_dept_30.xml
Each file will have the information for employees in a different department.
The reason I need to do this is to avoid executing the same query 200 times to generate the same output for different departments. Please let me know if it is practically possible to do this.
    Any input is greatly appreciated.
    Thanks a lot!!

You can write a shell script that spools a separate file per department from the same query. The script below may help you (adjust the credentials and the department list):
#====================
#!/bin/bash
for dept in 10 20 30
do
sqlplus -s system/manager <<EOF
spool emp_dept_$dept.xml
select * from emp where deptno = $dept;
spool off
EOF
done
#====================

  • Multiple replacements from an input file with 1.4 Regex

    hi,
I'm trying to make multiple replacements to a source file <source> using a second input file <patterns> to hold the regexes. The problem I'm having is that the output file only contains the last replacement listed in the pattern file. For example, if the pattern file is
a#123
b#456
only b is changed to 456, and a remains.
The second debug statement I have shows that all the replacements are in memory, but I'm not sure how to write them all to the file.
The syntax is: java MyPatternResolver <source> <patterns>
import java.util.regex.*;
import java.io.*;
import java.util.*;

public class MyPatternResolver {

     private File patternFile;
     private File sourceFile;
     private Vector patterns = new Vector(); // holds String[]{match, replace} pairs

     public MyPatternResolver(String sourceFile, String patternFile) {
          this.sourceFile = new File(sourceFile);
          this.patternFile = new File(patternFile);
          loadPatterns();
          resolve();
     }

     // Read one "match#replace" pair from each line of the pattern file.
     private void loadPatterns() {
          try {
               BufferedReader reader = new BufferedReader(new FileReader(patternFile));
               String s = null;
               while ((s = reader.readLine()) != null) {
                    StringTokenizer tokenizer = new StringTokenizer(s, "#");
                    String[] strArr = new String[2];
                    for (int i = 0; i < 2; i++) {
                         strArr[i] = tokenizer.nextToken();
                         // Debugging info (note: print strArr[i], not the array reference)
                         System.out.println("Token value " + i + " = " + strArr[i]);
                    }
                    patterns.add(strArr);
               }
               reader.close();
          } catch (IOException ioe) {
               ioe.printStackTrace();
          }
     }

     // Apply ALL patterns to each line before writing it out. The original
     // version wrote a fresh output file per pattern, so each pass overwrote
     // the previous one and only the last replacement survived.
     private void resolve() {
          try {
               BufferedReader in = new BufferedReader(new FileReader(sourceFile));
               BufferedWriter out = new BufferedWriter(new FileWriter(sourceFile.getName() + "_"));
               String aLine = null;
               while ((aLine = in.readLine()) != null) {
                    Iterator iterator = patterns.iterator();
                    while (iterator.hasNext()) {
                         String[] pair = (String[]) iterator.next();
                         Pattern p = Pattern.compile(pair[0]);
                         aLine = p.matcher(aLine).replaceAll(pair[1]);
                    }
                    out.write(aLine);
                    out.newLine();
               }
               in.close();
               out.close();
          } catch (IOException e) {
               e.printStackTrace();
          }
     }

     public static void main(String[] args) {
          new MyPatternResolver(args[0], args[1]);
     }
}

If your aim is to learn about regexes, then it's okay.
Otherwise you might want to check out the utility "sed" (stream editor), which does something similar to what you are up to. It is a POSIX (i.e. UNIX) utility, but it is available (in several versions) for other platforms (including Windows) too (cf. Cygwin or MinGW).
