Casting Class instance of a generic class using wildcards


I currently really can't wrap my head around what is happening here. I have the following (minimal) code:

import java.util.ArrayList;
import java.util.List;

public class SillyCast {

    interface B {}
    interface D<T extends B> {}
    class C<T extends D<?>> {}
    class A<T extends B> extends C<D<T>> {}
    List<Class<? extends C<?>>> list = new ArrayList<>();

    void m() {
        list.add((Class<? extends C<?>>) A.class);
    }
}

When compiling (and running) this code in Eclipse everything works as expected but when doing it from the command line (Oracle JDK) I get the following error during compiling:

error: incompatible types: Class<A> cannot be converted to Class<? extends C<?>>
        list.add((Class<? extends C<?>>) A.class);

I know that the Oracle JDK does not always behave exactly the same as the Eclipse JDT, but this seems weird.

How can I add A.class to my ArrayList? Intuitively, it seems like this should be possible.

Why do the Oracle JDK and Eclipse JDT have different opinions about this cast? Why does the Oracle JDK have problems doing it?

At first I thought this might be a bug, but I tested it on different Java versions (8 and 11) and the problem occurs on all of them. Also, this kind of cast seems way too "simple" to go unnoticed over multiple versions, so I expect this behavior to be by design.

CodePudding user response:

Not sure what you're trying to achieve, but this works fine:

import java.util.ArrayList;
import java.util.List;

public class Main {
    static interface B {}
    static interface D<T extends B> {}
    static class C<T extends D<?>> {}
    static class A<T extends B> extends C<D<T>>{}

    static List<Class<? extends C>> list = new ArrayList<>();
    public static void main(String[] args) {
        list.add(A.class);
        list.add(C.class);
    }
}
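
If the intent is to instantiate the stored classes later via reflection (my assumption; the question doesn't say), the raw-bounded list can be consumed like this. A minimal sketch, repeating the declarations from above so it stands alone:

import java.util.ArrayList;
import java.util.List;

public class Main {
    static interface B {}
    static interface D<T extends B> {}
    static class C<T extends D<?>> {}
    static class A<T extends B> extends C<D<T>> {}

    // Raw C in the bound means the compiler no longer tracks C's type argument.
    static List<Class<? extends C>> list = new ArrayList<>();

    public static void main(String[] args) throws Exception {
        list.add(A.class);
        list.add(C.class);

        // Every entry is some subtype of (raw) C, so reflective
        // instantiation yields an object usable as a C.
        for (Class<? extends C> cls : list) {
            C instance = cls.getDeclaredConstructor().newInstance();
            System.out.println(instance.getClass().getSimpleName());
        }
    }
}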

This is really confusing; what are you trying to achieve? You are parameterizing by interfaces extending other interfaces (and then by classes extending classes parameterized by interfaces extending interfaces). This level of complexity suggests that a different, simpler approach might be needed :D

EDIT: Does this work for you? (This compiles and runs with no errors):

public static void main(String[] args) {
    List<Class<? super A<?>>> list = new ArrayList<>();
    list.add(A.class);
    list.add(C.class);
}

I might be wrong, since this is a bit confusing, but from what I understand C<?> is a superclass of C<D<T>>, so List<? extends C<?>> accepts objects of type C<?> and all of its children (variations of C parameterised by any type T), which is why sticking an instance of A in it gives errors.

A being a subclass of C has no effect on the relationship between List<A> and List<C>; their only relationship is that they are both children of List<?>.

This is why I think we have been running into errors: while A is a C<D<T>>, List<A> has no relationship to List<C>. I agree it would feel intuitive if the relationship were preserved.
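
That invariance is not specific to these classes. A minimal sketch with everyday types (my own illustration, not from the original question) shows why the compiler has to behave this way:

import java.util.ArrayList;
import java.util.List;

public class InvarianceDemo {
    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<>();
        ints.add(42);

        // Does not compile: List<Integer> is not a List<Number>,
        // even though Integer is a Number.
        // List<Number> nums = ints;

        // If that assignment were allowed, the following would silently
        // put a Double into a List<Integer> and fail later at runtime:
        // nums.add(3.14);
        // Integer broken = ints.get(1);

        // A wildcard restores the relationship for reading only.
        List<? extends Number> readOnly = ints;
        System.out.println(readOnly.get(0));
    }
}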

CodePudding user response:

I think that, as @ferrouskid pointed out, the reason this isn't working is probably that C<?> is not a supertype of C<D<T>>, even though it is generally assumed to be.

Although this is more of a confusion tactic than a real solution, my current workaround is to distract the Oracle compiler by assigning A.class to a variable before casting it (this may require an @SuppressWarnings("unchecked")):

Class<?> aa = A.class;
list.add((Class<? extends C<?>>) aa);
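
For completeness, here is a minimal sketch of how this workaround might sit inside the original SillyCast example (the @SuppressWarnings placement is my assumption):

import java.util.ArrayList;
import java.util.List;

public class SillyCast {

    interface B {}
    interface D<T extends B> {}
    class C<T extends D<?>> {}
    class A<T extends B> extends C<D<T>> {}

    List<Class<? extends C<?>>> list = new ArrayList<>();

    @SuppressWarnings("unchecked")
    void m() {
        // Widening to Class<?> first hides the concrete type Class<A>,
        // so javac sees only an unchecked cast instead of rejecting it outright.
        Class<?> aa = A.class;
        list.add((Class<? extends C<?>>) aa);
    }
}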

Disclaimer: This kind of operation is very dangerous because the cast may fail at runtime. So only do this if you have complete test coverage of the impacted methods.
