Monday, August 1, 2011

Java 7 and diamond operator

Java 7 has finally been released. After five years we've been given a new toy to play around with. Better, faster, lighter... One would hope for a breath of modern features in the language after reading statements like "type inference" and so on.

What I'd like to make sure everyone understands is that the so-called "diamond operator", further specified by Oracle as "type inference for generic instance creation", is nothing more than a lie sold to us once again. Let me show you what I mean.

Java 1.6 code:
List<String> names = new ArrayList<String>();
In this case there's nothing to infer, because the code is handed to the compiler with all the details. Now what actually happens when the code gets compiled?
The right-hand side (as well as the left-hand side) is stripped of the generic parameter because of type erasure, one of the most incredible pieces of nonsense in Java. So in fact it's no different from saying
List names = new ArrayList();
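Erasure is easy to demonstrate for yourself — a minimal sketch showing that a list of Strings and a list of Integers share the exact same runtime class, because the generic parameter is simply gone after compilation:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<String>();
        List<Integer> numbers = new ArrayList<Integer>();

        // After erasure both are plain ArrayLists -- at runtime
        // there is no trace of <String> or <Integer>:
        System.out.println(names.getClass() == numbers.getClass()); // prints true
    }
}
```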
In light of the last statement, here's what the creators of Java 1.7 came up with to ease our pain:
List<String> names = new ArrayList<>();
In this case, instead of forcing you to specify the generic argument on both sides, they make you specify it only once (where the compiler actually needs it). That's it. Type erasure still takes place, so the compiler couldn't care less what kind of generic freak show is being assigned to the variable, as long as it satisfies the assignment (which in this case has nothing to do with generics, but simply with the fact that an ArrayList is a List).
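To be fair, the one place the diamond saves real typing is with nested generic signatures — a minimal before/after sketch:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DiamondDemo {
    public static void main(String[] args) {
        // Java 1.6: the full generic signature repeated on both sides
        Map<String, List<Integer>> scores6 = new HashMap<String, List<Integer>>();

        // Java 1.7: the diamond lets the compiler fill in the right-hand side
        Map<String, List<Integer>> scores7 = new HashMap<>();

        scores7.put("alice", new ArrayList<Integer>());
        System.out.println(scores7.get("alice").isEmpty()); // prints true
    }
}
```

Less typing, yes — but as argued above, it's cosmetic: both declarations erase to exactly the same raw HashMap.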

Now let's look at what type inference looks like in other languages. Let's start with Groovy:
def names = new ArrayList<String>();
Here we see that the type of the names variable has been inferred to match the type being assigned to it. In Scala the situation is almost identical:
var names = new ArrayList[String]()
Again, here's real type inference in action.
Let's switch platforms for a moment and see how it's done in C#, a statically typed language for .NET:
var names = new List<String>();
What you see here is once again an additional keyword that marks the variable as being subject to type inference.

At the end of the day, what matters more to me is the variable name — and that's what is put front and center in C#, Groovy and Scala, but pushed to the background by all the type-declaration fuzz in Java.

We could go on and on with examples from other languages where type inference goes beyond the simple fact that you don't need to specify the generic type of the variable twice. There is, however, one thing that can make you wonder what Java-style type inference is actually for. Consider the following example:
List<String> names = new ArrayList<String>();
names.trimToSize();
This code obviously will not compile, but this snippet (in Scala) will:
var names = new ArrayList[String]()
names.trimToSize()
Why is that the case? In the Java snippet we are explicitly saying that the variable is of type List<String>, not ArrayList<String>, which in turn means that the method trimToSize is not available. In the Scala example we are saying that names should be of type ArrayList[String], because that's what the type inference will ultimately figure out. But do we really want a List in the Java version? If so, why do we specify ArrayList as the class we want to instantiate? Shouldn't we have that instance injected from somewhere else and code against an interface instead? And if we are instantiating an ArrayList, do we really need to strip ourselves of the very thing we have oh so obviously specified and play cripples just for the fun of it? Or better yet, here's how the Java code could have been written:
List<String> names = new ArrayList<String>();
((ArrayList<String>) names).trimToSize();
Cute, isn't it? And so damn readable!!!
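If ArrayList-specific behavior is what we're after, the honest alternative is to declare the variable with the concrete type in the first place and skip the casting circus — a minimal sketch, using trimToSize as an example of a method that exists on ArrayList but not on the List interface:

```java
import java.util.ArrayList;

public class ConcreteTypeDemo {
    public static void main(String[] args) {
        // Declared as ArrayList, so ArrayList-only methods like
        // trimToSize() are available directly, no cast needed:
        ArrayList<String> names = new ArrayList<>();
        names.add("foo");
        names.trimToSize();
        System.out.println(names.size()); // prints 1
    }
}
```

Of course this gives up the "code against the interface" advice — which is exactly the tension the Java version forces you to resolve by hand, while the inferring languages above sidestep it entirely.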

Again, we could go on and on with examples and theories about what type inference actually is, and whether what Oracle is feeding us is a trick to make us believe Java is still evolving. For my money, there's no point in coding in pure Java, a language that hasn't seen a major change for seven years (since September 30, 2004, when generics were introduced). Or do you think that underscores in integer literals deserve to be hailed as a major breakthrough? :D

Have fun!
