OracleParameter type inference

The type inference described in section 3 of the production ODP.NET documentation (table 3.8) seems to be enforced when setting a value directly using OracleCommand.Parameters.Add(), but not when assigning OracleParameter.Value and then adding that parameter to the command. Am I missing something here?
DateTime thingDateTime = DateTime.Now;
OracleCommand cmd = new OracleCommand();
OracleParameter paramTemp = new OracleParameter();
// Parameter "one": value passed directly to Parameters.Add()
cmd.Parameters.Add("one", thingDateTime);
// Parameter "two": value assigned to OracleParameter.Value, then the parameter is added
paramTemp.ParameterName = "two";
paramTemp.Value = thingDateTime;
cmd.Parameters.Add(paramTemp);
Console.WriteLine("one - Value.GetType:{0}", cmd.Parameters["one"].Value.GetType());
Console.WriteLine("one - OracleDbType:{0}", cmd.Parameters["one"].OracleDbType.ToString());
Console.WriteLine("one - DbType:{0}", cmd.Parameters["one"].DbType.ToString());
Console.WriteLine("");
Console.WriteLine("two - Value.GetType:{0}", cmd.Parameters["two"].Value.GetType());
Console.WriteLine("two - OracleDbType:{0}", cmd.Parameters["two"].OracleDbType.ToString());
Console.WriteLine("two - DbType:{0}", cmd.Parameters["two"].DbType.ToString());
Console.WriteLine("");
Console.WriteLine("temp - Value.GetType:{0}", paramTemp.Value.GetType());
Console.WriteLine("temp - OracleDbType:{0}", paramTemp.OracleDbType.ToString());
Console.WriteLine("temp - DbType:{0}", paramTemp.DbType.ToString());
Result:
one - Value.GetType:System.DateTime
one - OracleDbType:TimeStamp
one - DbType:DateTime
two - Value.GetType:System.DateTime
two - OracleDbType:Varchar2
two - DbType:String
temp - Value.GetType:System.DateTime
temp - OracleDbType:Varchar2
temp - DbType:String

This issue has been identified as a bug in the production release and will be fixed in the next version.

Similar Messages

  • Odd scenario where type inference doesn't work

    I've come across an odd scenario where type inference doesn't work and instead I'm greeted with a message:
    incompatible types
    found : java.lang.Object
    required: java.util.Set
    Here's the scenario (built with JDK6, javac):
    public class GenericTest {
         public static void test() {
              Set testList = GenericTest.method1(new Class[] { ArrayList.class });
         }
         public static <I> I method1(Class<List>[] params) {
              return null;
         }
    }
    Yet when I remove the type information from the method1 params, the class builds.
    public class GenericTest {
         public static void test() {
              Set testList = GenericTest.method1(new Class[] { ArrayList.class });
         }
         public static <I> I method1(Class[] params) {
              return null;
         }
    }
    Why would removing the type information from the parameter have any effect whatsoever on the return type?

    ilmaestro wrote:
    Though casting the parameter as you mentioned resolves the error (because it eliminates the generic), it still doesn't explain why changes to the parameter effect an error message having to do with the return type.
    It is due to the call to the method with the "erased" type Class[] (not a Class<?>[], for example) and to return type resolution. Unchecked conversion (from the raw type to the parameterized type) is applied to the method argument, and the return type of the method is determined as the erasure of the declared return type (JLS 3rd edition, [Method Result and Throws Types|http://java.sun.com/docs/books/jls/third_edition/html/expressions.html#15.12.2.6], first bullet, second sub-bullet). The erasure of the return type is java.lang.Object, which is incompatible with java.util.Set.
    In the second example, the method parameter is already a raw type, and unchecked conversion is not required. The return type of the method is I and is correctly inferred as java.util.Set.
    If the method is called with a non-raw type, then return types are inferred correctly. For example, the following code compiles fine.
    public class GenericTest {
      public static void test() {
        Set testList = GenericTest.method1((Class<List>[]) (new Class[] { ArrayList.class }));
      }
      public static <I> I method1(Class<List>[] params) {
        return null;
      }
    }
    This example is not "correct" due to the cast to Class<List>[], because such an array can't contain Class<ArrayList> as an element (Class<ArrayList> is not a child of Class<List>). This makes the Class<List>[] declaration almost useless, because such an array can hold only List.class values.

  • Better type inference

    VS2015 is currently at CTP6, so I might be a bit late. But when I read about improved type inference, I immediately attempted this:
    public IList<string> GetStrings(bool b)
    {
        return b ? new List<string>() : new string[]{};
    }
    It still did not compile. Too bad: I often use an empty array as a fallback for a service call...

    "However, to me the case for the conditional operator behaving more like "if" is obvious. In my opinion, if all paths for an conditional statement unambiguously "work", the conditional statement should work like if:"
    You are making the (reasonable) assumption that the conditional and the if work the same but they don't.  If is a statement whereas conditional is an expression. While they perform similar functions they behave completely differently.  A statement
    will generate executable code to perform some action based upon the language requirements. But an expression must evaluate using the standard rules that expressions follow.  The net impact is that an expression must have a single, definite type that can
    be computed by the compiler.  A statement has no type.
    You keep going back to your assignment example but that isn't the correct example to consider.  Remember that assignment is evaluated in isolation so the lvalue and the rvalue are evaluated independently of each other. Once both types are known the
    compiler can determine if they are compatible.  As such your expression really boils down to this:
    isInternationalOrder ? db.GetCustomsDocuments() : docs
    But let's reduce this to what the compiler sees:
    expression1<bool> ? expression2<List> : expression3<Array>
    At this point the compiler can choose any type/interface that both expressions support including IList<T>, IList, IEnumerable and IEnumerable<T>.  You are choosing IList<T> in this case but the compiler doesn't know (or care)
    at this point.  It isn't until the compiler evaluates the assignment that it'll worry about that.  Note that this is why explicit casting is useful as you can "give" the compiler the info it needs.  Let's look at a modified example:
    var result = isInternationalOrder ? db.GetCustomsDocuments() : docs;
    What is the type of result? Here's another case to consider:
    Foo(isInternationalOrder ? db.GetCustomsDocuments() : docs);
    void Foo ( IList<T> items ) { }
    void Foo ( IEnumerable<T> items ) { }
    Which overload should be called?  It is these kinds of ambiguities that can cause unexpected runtime behavior.  The best way to prevent ambiguity is to keep the language tight and use explicit casts as needed.
    "I'm confused. I always thought the IsReadOnly property was supposed to indicate if an IList is mutable. But you're telling me an IList is mutable by definition? Why then does an array behave like it implements IList?"
    This is one of the ugly side effects of .NET v1. In the original version ICollection was the indexing interface and IList was the modification interface.  The IsReadOnly was supposed to allow for determining if a list could be modified as most times
    both interfaces were implemented.  We also had the IsFixedSize property for arrays.  However when generics were added IList<T> and ICollection<T> were basically combined to represent the same thing. At this point anything that implemented
    IList would implement IList<T> and the same for ICollection.  Arrays were part of this because indexing is still something that is useful to do. However the use of IsReadOnly and IsFixedSize has been mostly ignored in favor of new interfaces like
    IReadOnlyCollection. In theory arrays should implement IEnumerable<T> and IReadOnlyList<T> but backwards compatibility has to come into play and we have to make compromises for legacy types.
    "I think the types for these cases can be inferred and it would be an improvement if they worked:"
    In your specific case I agree but remember what the compiler sees for the conditional:
    expression<bool> ? expression<T1> : expression<T2>
    The compiler has to ensure that expression<T1> and expression<T2> are type compatible and they aren't.  A simple cast solves this problem without having to change the compiler.
    I'm not necessarily against a type coercion change.  I'm just trying to justify the rationale why some things work the way they do in languages.

  • Why is type inference failing  here?

    Hi
    I tested code with generics 2.2. It compiles, but I get a runtime error:
    Thx Gerhard
    public class Cell2 {
       <A> void doS(A[] h, A i){
            h[0] = i;   // throws ArrayStoreException at runtime
       }
       public static void main(String[] args) {
          Cell2 c = new Cell2();
          c.doS(new Integer[]{new Integer(5)}, new String("Hallo"));
       }
    }

    Actually, inference is not failing here. Some would say that it is just doing its job too well.
    A call to doS will be allowed if inference can come up with an actual type T to replace A with such that, in this case, String is a subtype of T and Integer[] is a subtype of T[]. Notice that this doesn't mean that the type of i has to be identical to the element type of h, they just have to both have a common supertype. In this case, Object works (as does Serializable and Comparable<?>). In fact, no matter which arguments are given to doS, the call will be allowed if the first argument is an array, because Object can always be used for A. So, though it may seem counterintuitive, the method signature exactly corresponds to this:
    void doS(Object[] h, Object i) { ... }
    which would also compile but fail at runtime.
    The method can be written safely, however. You just have to state explicitly that the second argument has to be a subtype of the element type of the first:
    <A, B extends A> void doS(A[] h, B i) { ... }
    With this signature, the call to doS with an Integer[] and a String is caught by the type system:
    Test.java:8: doS<A, B extends A>(A[],B) in Test cannot be applied to
    (java.lang.Integer[],java.lang.String); inferred type argument(s)
    java.lang.Integer,java.lang.String do not conform to bounds of type
    variable(s) A,B
            doS(new Integer[] {new Integer(5)}, "hello");
            ^
    1 error
    I hope that helped -- inference can be tricky (and hard to explain, too).
    -- Christian
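    For completeness, here is a minimal compilable sketch of that safer signature (the class name and argument values are assumed, not from the thread):
    class Cell2 {
       static <A, B extends A> void doS(A[] h, B i) {
          h[0] = i; // safe: B is guaranteed to be a subtype of the element type A
       }
       public static void main(String[] args) {
          Cell2.doS(new Integer[]{ Integer.valueOf(5) }, Integer.valueOf(7)); // compiles
          // Cell2.doS(new Integer[]{ Integer.valueOf(5) }, "Hallo");         // rejected at compile time
       }
    }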

  • Same Type Inference 2x...One Compiles, One Does Not.

    Sorry, wrong forum on first post:
    Consider:
    public interface Graph<T> extends ReadOnlyGraph<T>, WriteOnlyGraph<T> {}

    public interface Vertex<T> {
      public T getContents();
    }

    public class DefaultGraphFactory implements GraphFactory {
      private GraphFactory _factory = AdjacencyHashFactory.Singleton;
      public static final DefaultGraphFactory Singleton = new DefaultGraphFactory();
      private DefaultGraphFactory() {}
      public <T> Graph<T> makeGraph() {
        return _factory.makeGraph();
      }
    }

    public class DefaultSetFactory implements SetFactory {
      private SetFactory _factory = HashSetFactory.Singleton;
      public static final DefaultSetFactory Singleton = new DefaultSetFactory();
      private DefaultSetFactory() {}
      public <T> Set<T> makeSet() {
        return _factory.makeSet();
      }
    }
    and the usage:
    public class ScopeGraph<K,V> {
      Graph<Set<K>> _scopes;
      Vertex<Set<K>> _currentScope = null;
      public ScopeGraph() {
        _scopes = DefaultGraphFactory.Singleton.makeGraph();
        _currentScope = DefaultSetFactory.Singleton.makeSet(); // comment me to compile clean
      }
    }
    This seems like 2 applications of the same pattern. I don't understand why the compiler can infer the correct type for the _scopes assignment but not for _currentScope. Error is:
    incompatible types; no instance(s) of type variable(s) T exist so that java.util.Set<T> conforms to com.continental.datastructures.graph.Vertex<java.util.Set<K>>
    found : <T>java.util.Set<T>
    required: com.continental.datastructures.graph.Vertex<java.util.Set<K>>

    Sorry. This is simple lhs not matching rhs. I thought it was a binding issue on T. My mistake.

  • Constructor Type Inference

    I assume this is a FAQ and the answer is easily found, but I have been unable to find it anywhere (yes, I've googled extensively and even used that cursed Forum Search).
    Assuming the following code:
    public class ScratchMain {
         private static class Container<V> {
              V value;
              public Container(V value) {
                   this.value = value;
              }
              public static <T> Container<T> constructContainer(T value) {
                   return new Container<T>(value);
              }
         }
         public static void main(String[] args) throws Exception {
              Container<String> sc = new Container("string"); //1
              sc = Container.constructContainer("string");    //2
         }
    }
    The compiler can infer the type T at the line marked "//2", because it knows at compile time that the parameter "string" is of a type compatible with String. Fine, nice and working.
    For some strange reason the same inference is not applied when calling the constructor directly, so the line "//1" gives a warning about using a raw type. Why do I have to re-specify that I want a Container<String> here, when the compiler can figure out that information just fine one line down (in what I think is a very similar situation)?
    Can someone enlighten me? Even a simple "Sun said so, thus it is like that" (with a link to where Sun said that ;-)) would suffice.

    The way that I think about it is this:
    To call a constructor, you must use a fully specified concrete type after the 'new' on the right side of the assignment statement. You have some flexibility on the left side: you can use the same fully specified concrete type, or you may use a less specific type. This is why you could always do these:
    HashMap m0 = new HashMap();
    Map m1 = new HashMap();
    Object m2 = new HashMap();
    but not
    Collection c = new Collection();  // because Collection isn't a concrete type
    With generics, the specific instantiation types are an essential part of the type for construction purposes: they must be included in the constructor call on the right side of the assignment.
    Container<String> sc = new Container<String>("string");
    You can, if you are able to live with the associated limitations (or can accept warnings in your build), do any of the following:
    Container sc1 = new Container<String>("string");
    Container<?> sc2 = new Container<String>("string");
    Object sc3 = new Container<String>("string");
    Fully qualifying the type of an object being constructed should not be construed as a violation of the DRY (don't repeat yourself) principle. Without generics, I don't believe that anyone would argue that, in this statement...
    Integer i0 = new Integer(0);
    ... typing the word 'Integer' twice is a violation of DRY, where this statement ...
    Number n0 = new Integer(0);
    ... avoids it*. With generic code, the same thing applies: the generic parms are (for construction purposes) an essential part of the type description -- you must specify them to the constructor, you may not need them or you may relax them on the declaration.
    Dave Hall
    http://jga.sf.net/
    http://jroller.com/page/dhall
    * There's obviously another principle involved: depending on as abstract a type as possible
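    As a side note (not part of the thread above): since Java 7 the diamond operator gives constructor calls exactly this kind of inference, so the Container example from the question could be written as:
    Container<String> sc = new Container<>("string"); // type argument inferred from the declaration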

  • Ternary and type inference

    private Set<GenericValue> discounts;
    private Set<GenericValue> surcharges;
    public Price(GenericValue basePrice, Set<GenericValue> surcharges, Set<GenericValue> discounts) {
        if (surcharges == null) {
            this.surcharges = Collections.emptySet();
        } else {
            this.surcharges = surcharges;
        }
        // doesn't work without a cast. why?
        // this.surcharges = surcharges != null ? surcharges : Collections.emptySet();
    }
    cheers
    Colin.

    For a full explanation, I can't do better than to tell you to read the JLS.
    You can, however, avoid the cast:
    this.surcharges = surcharges != null ? surcharges : Collections.<GenericValue>emptySet();
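    A minimal compilable sketch of that fix (the GenericValue class here is just a stand-in for the OP's type, and the constructor is simplified):
    import java.util.Collections;
    import java.util.Set;

    class GenericValue { }

    class Price {
        private final Set<GenericValue> surcharges;

        Price(Set<GenericValue> surcharges) {
            // Without the explicit type witness, emptySet() is inferred as Set<Object>
            // inside the conditional expression, so the assignment fails to compile.
            this.surcharges = surcharges != null
                    ? surcharges
                    : Collections.<GenericValue>emptySet();
        }
    }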

  • Can we call a method by inferring a type ?

    Hi,
    Pardon my ignorance. Why doesn't type inference help in writing code like this ?
    public class TypeInferredMethodCall {
         private interface Base {
              public void x();
         }
         private static class Derived implements Base {
              public void x() {
              }
         }
         static <T> List<T> getList(Class<? extends T> type) {
              // T.x(); -> Error
              return new ArrayList<T>();
         }
         public static void main(String[] args) {
              List<Base> list2 = getList( Base.class );
         }
    }
    Thanks,
    Mohan

    Do you mean that you can't write T.x()? That is completely normal: you couldn't write Derived.x() either, because x() is not a static method. More importantly, here T is erased to Object, so you couldn't do this either:
    static <T> List<T> getList(Class<? extends T> type, T obj) {
        obj.x(); // does not compile: the compiler only knows obj is an Object
    }
    For that to work you would have to do this:
    static <T extends Base> List<T> getList(Class<? extends T> type, T obj) {
        obj.x(); // fine: the bound guarantees T is a Base
    }
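    Here is a small self-contained sketch of that bounded version (the names follow the OP's example; the main method is added only for illustration):
    import java.util.ArrayList;
    import java.util.List;

    public class TypeInferredMethodCall {
        private interface Base { void x(); }
        private static class Derived implements Base {
            public void x() { System.out.println("x() called"); }
        }

        static <T extends Base> List<T> getList(Class<? extends T> type, T obj) {
            obj.x(); // allowed because T is bounded by Base
            return new ArrayList<T>();
        }

        public static void main(String[] args) {
            List<Derived> list = getList(Derived.class, new Derived());
        }
    }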

  • Generic Method, How parameter type is determined

    For method
    <T> void fromArrayToCollection(T[] a, Collection<T> c) { ... }
    Why does
    fromArrayToCollection(sa, co);
    pass and
    fromArrayToCollection(oa, cs);
    fail?
    oa - Object Array, Object[]
    cs - Collection of String, Collection<String>
    sa - String Array, String[]
    co - Collection of Object, Collection<Object>
    What are the rules governing the type of T inferred by compiler?

    epiphanetic wrote:
    I think you still haven't fully understood the issue.
    I suggest you also read the same generics tutorial by Gilad Bracha, section 6, which is where I found this issue :).
    Ha! But I think it's misleading that that section uses arrays.
    In his words, "It will generally infer the most specific type argument that will make the call type-correct." It's also mentioned that the collection parameter type has to be a supertype of the array parameter type, but no reason is given. I wonder why it fails to infer the correct type in the second case.
    Assume you passed in an array of Objects and a Collection of Strings, and it was possible that T would then be Object. Using Bracha's example implementation:
    static <T> void fromArrayToCollection(T[] a, Collection<T> c) {
       for (T o : a) {
          c.add(o); // correct
       }
    }
    Now imagine you had this code making use of it:
    Object[] objects = {Integer.valueOf(1), "hi", new Object()};
    Collection<String> strings = new LinkedList<String>();
    fromArrayToCollection(objects, strings);
    String string = strings.iterator().next(); // get first String, which is actually an Integer
    Trying to get the first String would give a ClassCastException. So clearly that method cannot be safely invoked.
    The reason I think he's confusing things by using the array is because you might get it in your head that this would be OK:
    static <T> void fromCollectionToCollection(Collection<T> one, Collection<T> two) {
       for ( T t : one ) {
          two.add(t);
       }
    }
    Collection<Object> col1; Collection<String> col2;
    fromCollectionToCollection(col1, col2);
    When clearly it's unsafe, as now your Collection of Strings might have a non-String in it! That's why I said this is more a nuance of generic arrays than of type inference proper.

  • Collections.emptyMap() isn't picking up the correct type

    I've got the following code:
    public String toString(Map<String, String> args);

    public String toString()
    {
      return toString(Collections.emptyMap());
    }
    this gives me a compiler error:
    Cannot find symbol: toString(java.util.Map<java.lang.Object,java.lang.Object>)
    In other words, it is picking up the wrong generic types. If, however, I use the following code:
    public String toString()
    {
      Map<String, String> t = Collections.emptyMap();
      return toString(t);
    }
    it works fine. Any ideas why?
    Thanks,
    Gili

    Cannot find symbol: toString(java.util.Map<java.lang.Object,java.lang.Object>)
    In other words, it is picking up the wrong generic types.
    Apparently the compiler is not inferring the types from the method parameters. I'm not sure that this is the correct behavior, but it would seem to me that it would be difficult to implement this kind of type inference correctly.
    If it did infer the type from the method declaration, then if you later add a toString(Map<Object, Object>) what should the compiler do? Should it change which method is called? This would create a binary compatibility issue which I think was intentionally avoided.
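    Another workaround worth noting (an addition on my part, not from the thread): supply the type arguments to emptyMap() explicitly, so no temporary variable is needed:
    import java.util.Collections;
    import java.util.Map;

    class Example {
        public String toString(Map<String, String> args) {
            return args.toString();
        }

        @Override
        public String toString() {
            // Explicit type witness: forces Map<String, String> instead of the
            // inferred Map<Object, Object>.
            return toString(Collections.<String, String>emptyMap());
        }
    }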

  • Meaning of obj.<Type>methodCall();

    Hi,
    Is this syntax obj.<Type>methodCall(); used for type inference ?
    In what situations does this algorithm work ?
    If I have methods like these
    public static void method1 (Object o)
    public static void method1 ( Number n )
    public static <T> void call (T t)  {
       method1 (t);
    }
    and call like this:
    obj.<Number>call (new Integer (1));
    then due to the overloading rules the <Number> syntax doesn't have any effect.
    Why allow this syntax if it is ignored ?
    Thanks,
    Mohan

    That is not where the problem is. call() takes a parameter of type T, which erases to Object. Therefore call() is erased to this:
    public static void call (Object t)  {
       method1 (t);
    }
    Therefore method1 is determined to be method1(Object). This is done at compile time, which is why it doesn't matter what T actually ends up being.
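    A short sketch (hypothetical class name, not from the thread) of when the explicit type argument does matter: if T is bounded, overload resolution happens against the bound rather than against Object.
    public class Overloads {
        public static void method1(Object o) { System.out.println("method1(Object)"); }
        public static void method1(Number n) { System.out.println("method1(Number)"); }

        public static <T extends Number> void call(T t) {
            method1(t); // resolved at compile time against Number, the erasure of T
        }

        public static void main(String[] args) {
            call(new Integer(1)); // prints "method1(Number)"
        }
    }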

  • JavaFX Compiler - Limiting Inference?

    Inference is no doubt a love/hate feature depending on how much you want the compiler and/or runtime to 'guess' what/how your code is supposed to execute/compile.
    PhiLo's got a good example posted here:
    http://phi.lho.free.fr/serendipity/index.php?/archives/18-Beware-of-JavaFXs-type-inference!.html
    Are there any options/arguments that we can give javafxc (the javafx compiler) that limits inference?
    This means:
    - Specific assignment types, 'var myVar:SpecificType'
    - Function return types, 'public function getSpecificType():SpecificType'
    - Function returns a specific instance (i.e. has a return statement)
    There might be more that I can't think of off the top of my head.

    Hi. This may not be a direct answer to your question. Just an idea.
    The ComboBox uses a ListView as a method of display. A ComboBox has
    the following methods to limit visible items:
    public final int getVisibleRowCount()
    public final void setVisibleRowCount(int value)
    You may want to look at how these methods are implemented for a ComboBox.

  • := inference

    map := new HashMap<String,List<Object>>(); //What does the operator := do?
    // I think it has something to do with type inference.

    hunter9000 wrote:
    jbish wrote:
    hunter9000 wrote:
    su_penguin wrote:
    map := new HashMap<String,List<Object>>(); //What does the operator := do?
    // I think it has something to do with Pascal.
    Works in Oracle PL/SQL : )
    It's been so long since I've used a language with that operator, it just looks like a walrus now.
    :=
    I like it as a symbol. Using "=" for assignment seems wrong: why use a symmetric symbol for an operation that's not? I know in some pseudo code (but not any language that comes to mind) I've seen an arrow used:
    a <== b //actual arrow, not ascii art
    edit:
    a ← b

  • What is the scope of implicit loop variables?

    Hi,
    I'm facing some strange error from the ABSL editor (syntax checker).
    In ABSL the loop variables are implicit and don't have to be declared in the head section of the script.
    My question now is simple: How is the scope/visibility of such loop variables specified ?
    There's a complete code snippet below.
    In line no.9, there's the first time use of implicit loop variable 'task_inst'.
    Because of type inference, it will be typed as MasterDataWanneBe/Tasks (which is my own BO type).
    In line no.20, I want to use the same variable name in a different loop, outside the scope of the first use.
    Now the ABSL syntax checker complains about incompatible types (see code snippet).
    Thus the type inference should result in the (let's say 'local') type Project/Task, which is the one I was querying for.
    To me it looks like loop variables implicitly get a global scope (hopefully bound to this ABSL file only).
    I would like to see the scope/visibility of loop variables restricted to the enclosing braces.
    In other words, only inside the loop.
    Hint
    I heard (from little sparrows) that local variable scoping is not possible because of the underlying
    generated ABAP code. If so, then it would be helpful to print warnings in case types are compatible
    but used in different scopes. Think about the unintended side effects.
    import ABSL;
    import AP.ProjectManagement.Global;
    var query_tasks;
    var query_tasks_param;
    var query_tasks_result;
    foreach (empl_inst in this.Employees) {
         foreach (task_inst in empl_inst.Tasks) {
             //   ^^^^^^^^^  first time use
              task_inst.Delete();
         }
    }
    // ===========================================================================
    query_tasks = Project.Task.QueryByResponsibleEmployee;
    query_tasks_param = query_tasks.CreateSelectionParams();
    query_tasks_result = query_tasks.Execute(query_tasks_param);
    foreach (task_inst in query_tasks_result) {
          // ^^^^^^^^^ Error: 4
          // The foreach loop variable is already inferred to an incompatible type:
          // Node(MasterDataWanneBe/Tasks). Expected Node(Project/Task)
    }

    Yes, variable declarations in ByD Scripting Language indeed have (snippet) global visibility. In the FP 3.0 release the variables can be declared anywhere in the snippet (not only in the beginning, as with FP 2.6), however still not within code blocks, i.e. within curly braces ({}). Therefore variable name shadowing is still not supported and because of the global visibility of variables they cannot be reused for a different type, later in the same snippet. This is because of the statically typed nature of ByD Script, despite the type inference convenience.
    Kind regards,
    Andreas Mueller

  • Generic working in eclipse compiler but not through builds

    The following code snippet works fine in my Eclipse development environment (1.6.0_06), but the build system running 1.6.0_06 reports a compile error:
    MyClass:343: incompatible types
    found : java.util.List<C>
    required: java.util.List<B>
    entries = createListFromSmartCopy(myAList, new B(), true);
    Types:
    A is an interface
    B is an interface extending A
    C is an implementation of B
    List<A> aList = new ArrayList<A>();
    aList.add(new AImpl());
    List<B> bList = createListFromSmartCopy(aList, new C(), true);
        * <p>Creates a copy of a list where the source list could be an ancestor
        * type of the returned list. It also uses a reference object to actually
        * construct the objects that populate the return list.</p>
        * @param <T> - The ancestor type of the source list
        * @param <R> - The derived type of the destination list
        * @param <S> - The more derived type of the prototype object used to
        * construct the list of R's
        * @param sourceList - The source list
        * @param referenceObject - The object used to construct the return list
        * @param deepCopy - Deep copy serializable objects instead of just copying
        * the reference
        * @return a list of R's (as defined by the caller) of entries with the
        * object constructed as a copy of referenceObject with the properties of the
    * sourceList copied in after construction
    public static <T extends Serializable, R extends T, S extends R> List<R> createListFromSmartCopy(
                List<T> sourceList, S referenceObject, boolean deepCopy)
    {
          List<R> retr = new ArrayList<R>();
          for(int i = 0; i < sourceList.size(); i++)
          {
             retr.add(copyOf(referenceObject));
             copyInto(sourceList.get(i), retr.get(i), deepCopy);
          }
          return retr;
    }
    Any thoughts on either:
    1. How does this pass the compiler validation inside Eclipse, even though 'R' has not been determined? I believe the Eclipse compiler is either doing some sort of return-type inference to produce exactly the type of list expected by the assignment target, or it is simply ignoring the invariant capture of R altogether and silently dropping the error. The funny thing is that the code does work just fine in practice in my development system without an issue.
    or
    2. If the code is valid, why does the independent build system disallow this generic return-type 'inference'? Are there compiler flags I can use to permit this special case?

    Thanks for the response, I wasn't trying to show a full example but just my implementation's snippet. I'll list one now:
    package test;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.util.ArrayList;
    import java.util.List;
    public class TestMe
        * <p>This method performs a deep copy of an object by serializing and
    * deserializing the object in question.</p>
        * @param <T> - The type of data to copy
        * @param original - The original object to copy
    * @return The object whose state should be the same as the original. This
        * call uses serialization to guarantee this copy is fully separate from the
        * original, so all sub-object references are also deep copies of their
        * originals
       @SuppressWarnings("unchecked")
       public static <T extends Serializable> T clone(T original)
          T obj = null;
          try
             ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream out = new ObjectOutputStream(bos);
             out.writeObject(original);
             out.flush();
             out.close();
             ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(
                      bos.toByteArray()));
             obj = (T)in.readObject();
          catch(IOException e)
             e.printStackTrace();
          catch(ClassNotFoundException cnfe)
             cnfe.printStackTrace();
          return obj;
        * <p>Copies the properties from one object to another. The destined object
        * in this method must be derived from the source object. This allows for a
        * faster and smoother transition.</p>
        * @param <T> The type of source
        * @param <R> The type of destination
        * @param source - The source object
        * @param destination - The destination object
    * @param deepCopy - Copies the referenced objects instead of just passing
    * back the reference
       public static <T, R extends T> void copyInto(T source, R destination,
                boolean deepCopy)
       // Stubbed because it links into a ton of unnecessary methods
        * <p>Copies the values of a list of an ancestor class into the values of
    * another list whose value is derived from the ancestor.</p>
        * @param <T> - The ancestor type of the source list
        * @param <R> - The derived type of the destination list
        * @param sourceList - The source list
        * @param destinationList - The destination list
        * @param deepCopy - Deep copy serializable objects instead of just copying
        * the reference
       public static <T, R extends T> void copyIntoList(List<T> sourceList,
                List<R> destinationList, boolean deepCopy)
          if(sourceList.size() > destinationList.size())
             throw new IllegalArgumentException(
                      "Cannot copy entire source set into destination list");
          for(int i = 0; i < sourceList.size(); i++)
             copyInto(sourceList.get(i), destinationList.get(i), deepCopy);
        * <p>Creates a copy of a list where the source list could be an ancestor
        * type of the returned list. It also uses a reference object to actually
        * construct the objects that populate the return list.</p>
        * @param <T> - The ancestor type of the source list
        * @param <R> - The derived type of the destination list
        * @param <S> - The more derived type of the prototype object used to
        * construct the list of R's
        * @param sourceList - The source list
        * @param referenceObject - The object used to construct the return list
        * @param deepCopy - Deep copy serializable objects instead of just copying
        * the reference
        * @return a list of R's (as defined by the caller) of entries with the
        * object constructed as a copy of referenceObject with the properties of the
    * sourceList copied in after construction
       public static <T extends Serializable, R extends T, S extends R> List<R> createListFromSmartCopy(
                List<T> sourceList, S referenceObject, boolean deepCopy)
          List<R> retr = new ArrayList<R>();
          for(int i = 0; i < sourceList.size(); i++)
             retr.add(clone(referenceObject));
             copyInto(sourceList.get(i), retr.get(i), deepCopy);
          return retr;
       public static void main(String[] args)
          List<A> aList = new ArrayList<A>();
          aList.add(new AImpl());
          aList.add(new AImpl());
          List<B> bList = createListFromSmartCopy(aList, new C(), true);
          for(B bItem : bList)
             System.out.println("My String = "
                      + bItem.getString() + " and my number = " + bItem.getInt());
       public static interface A extends Serializable
          public void setString(String string);
          public String getString();
       public static class AImpl implements A
          private static final long serialVersionUID = 1L;
          @Override
          public void setString(String string)
          @Override
          public String getString()
             return null;
       public static interface B extends A
          public void setInt(int number);
          public String getInt();
       public static class C implements B
          private static final long serialVersionUID = 1L;
          public C()
          @Override
          public String getInt()
             return null;
          @Override
          public void setInt(int number)
          @Override
          public String getString()
             return null;
          @Override
          public void setString(String string)
    }
    In my Eclipse (20090920-1017), this compiles and runs just fine. I stripped out the functional pieces that weren't pertinent to the discussion.
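    For what it's worth, one workaround (an assumption on my part, not from the thread) that should compile under both javac and Eclipse is to spell out the type arguments instead of relying on return-type inference, e.g. in main:
    List<B> bList = TestMe.<A, B, C>createListFromSmartCopy(aList, new C(), true);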
