Java Grab Bag 2
Fri May 03 15:38:00 EDT 2019
Following in the vein of "Java Hiccups", I've had a couple things floating around my head lately that I think collectively make for a good post for Java developers, particularly those working in the Domino arena.
Without further ado:
Map#computeIfAbsent
This is a method that was added in Java 8 and, while it's not as big a deal as the addition of streams, it's one of my favorite additions and something I use very frequently. To give a point of reference, consider this common idiom from an imagined XPages app:
Map<String, Object> applicationScope = ExtLibUtil.getApplicationScope();
if(!applicationScope.containsKey("someVal")) {
applicationScope.put("someVal", someExpensiveOperation());
}
String someVal = (String)applicationScope.get("someVal");
Essentially, using a Map as a cache for a complex computed value. Java 8 added the #computeIfAbsent method (alongside several similar ones) to do this in one go:
String someVal = (String)ExtLibUtil.getApplicationScope().computeIfAbsent("someVal", key -> someExpensiveOperation());
The second parameter is (usually) a lambda, like the ones used in streams, that takes the provided key as an argument and is only executed if the value does not already exist. Due to the way this was added, most implementations do pretty much the same thing as the first block of code, but you don't have to care about that. Your code gets a bit smaller, the intent is much clearer, and it's less prone to small bugs like changing the key and forgetting to change it in all three places.
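To make that behavior concrete, here's a minimal sketch (expensiveLookup is a hypothetical stand-in for whatever costly work you'd be caching):
Map<String, String> cache = new HashMap<>();
// First call: "someVal" is absent, so the function runs and its result is stored in the map
String first = cache.computeIfAbsent("someVal", key -> expensiveLookup(key));
// Second call: the cached value is returned directly and expensiveLookup is never invoked
String second = cache.computeIfAbsent("someVal", key -> expensiveLookup(key));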
Arrays Are Weird
Java's built-in array type is loosely based on C's, and that's reflected in the syntax:
int[] foo = new int[4];
foo[0] = 1;
foo[1] = 2;
foo[2] = 3;
foo[3] = 4;
Like C, they are zero-based, declared with the capacity and not the max index, and cannot be resized. Unlike C, arrays aren't just syntactical sugar on top of pointers, and this manifests immediately in bounds checking. Take this line:
foo[4] = 10;
In C, this will (famously) just write an integer 10 value into whatever memory happens to be just beyond the bounds of your array. In Java, you'll get an ArrayIndexOutOfBoundsException, saving you from that insidious class of bug. But, since Java arrays are (probably) implemented internally much as they are in C - likely contiguous blocks of memory sized to the type - they're still extremely efficient, and so they show up in a lot of speed-critical code.
As speedy and safe as they are, though, they're still pretty unfriendly. For starters, they can't be resized. When you do new int[4] (or use literal syntax like new int[] { 1, 2, 3, 4 }), you carve out that much memory and can't shrink or expand it in-place. You can change the values inside an array, just not its size. To "resize" efficiently, you have to make a new array and then use System.arraycopy to populate the new array with the contents of the old.
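For example, "growing" a four-element array to eight slots looks something like this rough sketch (Arrays.copyOf from java.util wraps up the same dance for you):
int[] original = new int[] { 1, 2, 3, 4 };
// Allocate a bigger array and copy the old contents into the front of it
int[] expanded = new int[8];
System.arraycopy(original, 0, expanded, 0, original.length);
// Equivalently, let the utility method handle the allocation and copy
int[] alsoExpanded = Arrays.copyOf(original, 8);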
This is all why the List interface (with its predecessor class Vector) exists: they serve the same function of "ordered collection of stuff", but allow for dynamic resizing. Because these objects are usually "efficient enough" (ArrayList uses "true" arrays under the covers) while having numerous additional benefits, you should use them as your first go-to and only use arrays if you have a reason.
That's in part because the weirdness of arrays doesn't end with their inconvenience. Array types are actually implicitly-created classes, even when they contain primitive types. So:
int foo = 3; // primitive value
foo = null; // compile error - a primitive can't be null
int[] bar = new int[] { 3 }; // an Object
bar = null; // legal!
List<int> fooList = new ArrayList<>(); // compile error - primitives can't be used in generics
List<int[]> barList = new ArrayList<>(); // legal!
boolean sameClass = Object.class == Object[].class; // false
boolean isArray = int[].class.isArray(); // not only legal, but true
Java provides two main utility classes for working with arrays: java.util.Arrays (extremely useful for Arrays.asList) and java.lang.reflect.Array (usually only useful in edge cases).
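As a rough illustration of the sort of job each one is for:
// java.util.Arrays: wrap an array in a fixed-size List view, or get a readable string out of it
List<String> names = Arrays.asList("foo", "bar", "baz");
System.out.println(Arrays.toString(new int[] { 1, 2, 3 })); // prints "[1, 2, 3]"

// java.lang.reflect.Array: create and inspect arrays whose type is only known at runtime
Object dynamic = Array.newInstance(int.class, 4);
Array.setInt(dynamic, 0, 10);
int length = Array.getLength(dynamic); // 4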
Java Has No Library Versioning System
If you've worked with Java in Designer or Eclipse, you've likely run across the Java compiler compliance preferences pane or its per-project equivalent.
These settings affect two things:
- The syntax allowed in your source (e.g. new ArrayList<>() requires 1.7 or higher)
- The class file format version (you can think of this like an NSF ODS)

You've likely seen the latter in play by receiving an UnsupportedClassVersionError when trying to run Java 7 or 8 code on a pre-9.0.1FP8 Domino server.
Conspicuously absent from this short list is anything to do with classes or methods added to the runtime in newer versions. For example, the String class gained a static String.join method in Java 8 to conveniently concatenate strings with a given delimiter. If you're targeting an older Java version but compiling against a newer Java library (as Designer 9.0.1FP10 does, with a default target of 1.5 and a JVM of 1.8), you can write a line of code using that method without issue - the syntax doesn't require anything above 1.5, so all is clear as far as the compiler is concerned. But if you then try to run that code on an older JVM (such as an older Domino server), you'll get an error at runtime, since the method doesn't exist there.
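As a sketch of the trap: the line below compiles cleanly against a Java 8 class library even with a 1.5 source level, since nothing about the syntax is new, but on a pre-Java-8 JVM it fails when executed because that method simply isn't in the String class it finds at runtime.
// Legal 1.5-level syntax, so this compiles fine when the compiler sees a Java 8 String class
String joined = String.join(", ", "alpha", "beta", "gamma");
// On a pre-Java-8 JVM, the line above dies with java.lang.NoSuchMethodError at runtime
System.out.println(joined);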
Unfortunately, the only true answer to this is to tell your IDE about a JRE for each specific Java version you're targeting, something that doesn't happen by default, and which will gradually get more difficult as Java 6 becomes harder and harder to come across.
This is one of the things that OSGi aims to fix - you could, for example, have many versions of Guava installed, and you could declare that your plugin works specifically with version 18. Then, when loading, the runtime will either bind to a matching version or give you an error that no version could be resolved. No mystery involved. Unfortunately, OSGi is a niche thing losing ground, and the module system introduced in Java 9 consciously does not address this.
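As a rough sketch of what that looks like (the plugin name here is made up, and the exact headers depend on how the dependency is packaged), the constraint lives in the plugin's MANIFEST.MF and the OSGi runtime enforces it when resolving the bundle:
Bundle-SymbolicName: com.example.someplugin
Bundle-Version: 1.0.0
Require-Bundle: com.google.guava;bundle-version="[18.0.0,19.0.0)"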
In a pinch, you can use the file tool on most Unix systems to check the version of an individual class file:
$ file Foo.class
Foo.class: compiled Java class data, version 52.0 (Java 1.8)
Licensing
A little while ago, Oracle raised a bit of a stink by declaring that, as of this year, commercial use of their Java runtimes would require paid licensing. Historically, you could get support for Java for money, and certain additional components had their own licensing requirements, but it was pretty normal otherwise to install Java from java.sun.com and not give it a second thought.
This naturally caused a few questions when it comes to Domino and other Java-incorporating IBM products, and IBM released a statement that basically amounted to "you don't have to worry about it". IBM has maintained their own variant of the JVM (called J9 for Smalltalk-related reasons, not to be confused with Java 9) and anyway has always had arrangements with Sun/Oracle such that IBM's customers don't have to worry about dealing with Oracle directly.
But what about using Java outside of a licensed product, such as if you just want to run Tomcat on some server? The short answer there is that you're still fine, but you just have to know a little about the difference between "a JDK" and "Oracle's JDK". Java and the surrounding JDK have been progressively open-sourced in fits and spurts over the years, and are now at a point where the project called OpenJDK is basically the real Java environment, and then Oracle's JDK is just one implementation of it. It's similar to Linux: the core parts are open-source, and many distributions are entirely free, but there also exist commercial variants for pay.
So, if you want to run a Java stack, you can do so without putting forth a single cent by using an OpenJDK build. Oracle, IBM, and others will still be happy to take your money if you want a commercially-supported Java environment, of course.
For a longer explanation, this blog post from @javachampions is pretty much the definitive word.