State of my Workspace 2021

Jan 15, 2021, 10:59 AM

Tags: sotu
  1. State of my Workspace 2017
  2. State of my Workspace 2021

A few years back, I wrote a post about the State of my Workspace, and I figure now is as good a time as any to write an updated version.

That said, looking through my old post, it's a little surprising how much remains the same, considering how many of the categories were allegedly in transition at the time.

Eclipse is Eternal, Apparently

Over the years, I've spent varying amounts of time in IntelliJ, and it's remarkably good in a lot of ways. I love how well-integrated its features are, and the revamped Endpoints pane in the latest releases is outstanding. It's speedy, consistently-updated, and well-considered. It particularly shines when working with a single app (either an individual thing or a small multi-module project), such as working on this blog.

That said, I still use Eclipse daily and always return to it when I try to switch away. Eclipse itself has been improving steadily too. Recent quarterly releases have focused a lot on performance, and it's paid off - the switch to non-blocking completion proposals in particular has brought some of the editing snappiness that other editors revel in. In addition, the Wild Web Developer project, though still jankier than, say, VS Code, does a good-enough job bringing fully-modern JS editing to Eclipse by way of the same underlying language servers that other editors use (the fact that it's also the backbone of NSF ODP's Eclipse support helps too). Eclipse also now has some of the inline Java hints that IntelliJ does, dubbed "Code Minings".

There are a few things that keep me coming back to Eclipse beyond just "it's speedy like the others now", too. Most of my work involves multiple sprawling Maven trees from distinct repositories. While IntelliJ can handle that by way of importing modules, Eclipse's UI just makes it easier to manage. There's also the eternal back-and-forth about Build Automatically, and I come down thoroughly on Eclipse's side there. While IntelliJ has some options to do something similar, it's not as consistent as Eclipse. With Eclipse, I can be confident that the Problems pane will always show everything applicable. In general, I just feel more in control of a large set of projects when working in Eclipse, and that goes a long way.

Issue Trackers Are Even Worse Off Now

Keeping track of open issues for my various tasks - both open-source projects and client work - remains a real thorn in my side, and the situation has degraded further over the years. I quite liked using the app Ship years ago, but then the company behind it shut down. The Mylyn Bitbucket connector, too, kept breaking and I gave up on it entirely a while ago. This has left me back in the situation of manually checking various browser tabs to follow everything, and that stinks.

Every once in a while, I've tinkered with writing my own little coordinated issue-tracker app to bridge all the differences, but it never really gets off the ground. It's just one of those things where it never really makes sense to sink a bunch of non-billable time into slightly improving the way I manage actually-billable time. Maybe one day it'll tip over the mental threshold and I'll actually solve the problem. We'll see.

My Computer Is Due For A Change

I ended up buying the iMac Pro I mentioned in my last post, and it's been pretty decent. However, I've hit the same trouble that Marco Arment of ATP ran into, which is that the fans on this thing go constantly. It crept up over time, with them just spinning up when I would e.g. compile a bunch of stuff at once, but now they're going basically any time I use it. It's still under AppleCare, but, between the pandemic and the fact that there's no time when it's convenient for me to be without my development environment for a week, I haven't taken it in. It's just led to me getting more and more annoyed over time.

Fortunately, the M1 Macs should be an opportunity to cut the Gordian knot: my old MacBook was due for a replacement, so I'm swapping that out for an Air. In theory, that Air will be equivalent to or better than the iMac Pro for the type of work I do anyway, and I'd long ago moved my Windows VM to a PC in the basement. I plan to give it a shot as my main desktop once it arrives, and that'll also give me some room to take this thing in to be fixed. I'm looking forward to that, since it'll be nice to not glare at my computer in annoyance all day.

Oh, and, since I mentioned Storage Spaces last time: since then, I took an old tower Mac Pro and installed FreeNAS in it, and it's a delight. FreeNAS is cool and I legitimately missed working with FreeBSD. So now I have that thing handling mass storage, while my gaming PC runs Windows Server and hosts my various development VMs using Hyper-V.

Other Misc. Tools

In my last post, I mentioned Franz as a coordinated tool for the ridiculous number of chat apps I had. I ended up settling on it and, other than some consistent problems with notifications and audio, it does a good-enough job. It's certainly a much better experience than running all of those stupid apps individually.

I also did indeed make the switch from SourceTree to Tower. While I appreciated the $0 price of SourceTree, its general crashiness and tendency to hang on complex operations got to me and I've been using Tower ever since. It does exactly what it's supposed to and does it well. Nice job, Tower.

Browser-wise, I used Firefox Developer Edition for a while, but switched back to Safari when Firefox developed a tendency to somehow hard-crash my computer. I don't know if it's a computer-specific thing (likely related to whatever the fan trouble is, I'd guess) or trouble with Firefox, but it's not exactly the sort of thing I want to have to diagnose. Safari is speedy and I'm getting used to the developer tools, so it's a fine replacement.

For mail, I (and my company) have been using Spark for a while, and it's great. The ability to share emails and have inline threads among people in your organization is wonderful.

For calendars, I use Fantastical. I miss the days when I didn't have to really care about calendars at all, but, given that I regularly have meetings, this does a splendid job dealing with them. I quite appreciate the recent additions of weather reports and the automatic detection of meeting URLs, too.

For blogging, I use MarsEdit, both because it's good on its own and because writing an API for my blog meant I didn't need to bother making a good editing UI on my own. I think it's also just conceptually good to use tools that work with open protocols like that.

Getting Started with Hotwire in a Java Webapp

Jan 12, 2021, 5:19 PM

Whenever I have a great deal of discretion over how a web app is made these days, I like to push to see how simple I can make the front end portion. I spend some of my client time writing heavy client-JS front ends in React and Angular and what-have-you, and, though I get why they are good, I kind of hate them all.

One of the manifestations of my desires has been this very blog, where I set out to try not only some interesting current tools on the Java side, but also challenged myself heavily to use little to no JavaScript. On that front, I was tremendously successful - and, in fact, the only JavaScript on here is the Turbolinks library, which intercepts same-app links and updates the changed parts inline, without the server knowing about the "partial refresh" going on.

Since then, Turbolinks merged with its cousin Stimulus and apotheosized into Hotwire, which is somewhere in between a JavaScript framework and a manifesto. Specifically, it's a manifesto to my liking, so I've been champing at the bit to use it more.

Hotwire Overview

The "Hotwire" name is a cheeky truncation of HTML-over-the-wire, which itself is a neologism for how the web has historically worked: your server sends HTML, and then your browser does stuff with that. It "needs" a new name to set it apart from full-JS apps, which amount to basically sending an application to the browser, having it initialize the app, and then having the app do what would otherwise be the server's job by way of shuttling JSON around.

Turbo is the part that subsumed Turbolinks, and it focuses on enhancing existing HTML and providing a few web components to bring single-page-application niceties to server-rendered apps. The "Drive" part is Turbolinks, so that was familiar to me. What interested me next was Turbo Frames.

Turbo Frames

If you've ever used the XPages Dojo Tab Container's partialRefresh property before, Turbo Frames will be familiar. There are two main ways you can go about using it: making a "frame" that contains some navigable content (say, a form) that will then refresh in-place, or making a lazy-loaded frame that pulls from another URL. The latter is what interested me here, and it carries similar benefits to the Tab Container. It lets you serve the main page and then defer the complex computation of an inner part without having to write your own JavaScript to do an API call or otherwise populate the section.

In my case, I wanted to do something very similar to the example. I have my main page, then a sidebar that can be potentially complicated to generate. So, I set up a Turbo Frame using this bit of JSP:

<turbo-frame id="links" src="${pageContext.request.contextPath}/links"></turbo-frame>

The only difference from the example, really, is the bit of EL in ${...}, which just makes sure that the final URL adapts to wherever the app is hosted.

The "links" resource there is another MVC controller that renders a different JSP page, truncated like:

<html>
    <head>
        <script type="text/javascript" src="${pageContext.request.contextPath}/webjars/hotwired__turbo/7.0.0-beta.2/dist/turbo.es5-umd.js"></script>
    </head>
    <body>
        <turbo-frame id="links">
            <!-- expensive content here -->
        </turbo-frame>
    </body>
</html>

The <turbo-frame id="links"> on the initiating page matches up with the one in the embedded page to figure out what to extract and render.

One little side note here is my use of WebJars to bring in Turbo. This isn't an NPM-based project, so there's no package.json bringing the dependency in, but I also didn't want to just paste the JS into my project. Fortunately, WebJars does yeoman's work: it makes various JS libraries available in Servlet-friendly Java JAR format, giving you a JAR with the JS from whatever the library is in META-INF/resources. In turn, an at-least-reasonably-modern servlet container will serve files up from there as if they're part of your main app. That way, you can just use a Maven dependency and not have to worry.
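
As a sketch of what that looks like in the pom - assuming the org.webjars.npm flavor of the artifact, which matches the resource path used above:

<dependency>
    <groupId>org.webjars.npm</groupId>
    <artifactId>hotwired__turbo</artifactId>
    <version>7.0.0-beta.2</version>
</dependency>

The files then live within the JAR at META-INF/resources/webjars/hotwired__turbo/7.0.0-beta.2, which lines up with the script path in the JSP above.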

A Hitch: 406 Not Acceptable

Edit 2021-01-13: Thanks to a new release of Turbo, this workaround is no longer needed.

When I first put this together, I saw that Turbo was doing its job of fetching from the remote URL, but it was getting a 406 Not Acceptable response from the server. It took me a minute to figure out why - the URL was correct, it was just a normal GET request, and nothing immediately stood out as a problem in the headers.

It turned out that the trouble was in the Accept header. To work with other Turbo components, Frames makes a request with a header like Accept: text/html; turbo-stream, text/html, application/xhtml+xml. That first one - text/html; turbo-stream - is problematic. I'm not sure if it's the presence of a qualifier at all on text/html, the space, or the lack of an = (as in text/html;charset=UTF-8), but Liberty didn't like it.

Since I'm not (yet, at least) using Turbo Streams, I decided to filter this out on the server. Since MVC is built on JAX-RS, I wrote a JAX-RS request filter to find any Accept values of this type and strip them out:

import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.container.PreMatching;
import javax.ws.rs.core.HttpHeaders;
import javax.ws.rs.core.MultivaluedMap;
import javax.ws.rs.ext.Provider;

@Provider
@PreMatching
public class TurboStreamAcceptFilter implements ContainerRequestFilter {
    @Override
    public void filter(ContainerRequestContext requestContext) throws IOException {
        MultivaluedMap<String, String> headers = requestContext.getHeaders();
        if(headers.containsKey(HttpHeaders.ACCEPT)) {
            List<String> cleaned = headers.get(HttpHeaders.ACCEPT).stream()
                .map(accept -> {
                    // Split the header into its individual media types
                    String[] vals = accept.split(",\\s*"); //$NON-NLS-1$
                    // Keep only the types without a ";" qualifier
                    List<String> localClean = Arrays.stream(vals)
                        .filter(val -> val.indexOf(';') < 0)
                        .collect(Collectors.toList());
                    return String.join(", ", localClean); //$NON-NLS-1$
                })
                .collect(Collectors.toList());
            headers.put(HttpHeaders.ACCEPT, cleaned);
        }
    }
}

Since those filters happen before almost anything else, this cleared up the trouble.

Summary

Setting the Accept quirk aside, this was a pleasant success, and I look forward to using this more. I've found the modern Java stack of JAX-RS + CDI + MVC + simple JSP to be a delight, and Hotwire slots smoothly into it. I still quite enjoy rendering HTML on the server and the associated perk of not having to duplicate business logic on both sides. Next time I have an app that requires a bit of actual JavaScript, I'll likely throw Stimulus into the mix here.

The Difficulties of Domino Project Dependencies

Dec 31, 2020, 9:38 AM

Tags: java maven

This post is a drum I've been banging for a long time, from nagging the dev team in the IBM days through to formally requesting it in HCL's Ideas Portal. That idea there has been "Likely to implement" for a little while now, which is heartening, and either way I figured it'd be useful to have a proper blog post explaining the trouble and what a better version would look like.

The Core Trouble

The main thing I'm talking about here is the act of having a third-party or (particularly) open-source project that depends on Domino artifacts - namely, Notes.jar, the NAPI, and the XPages UI components. I have more than a few such projects, so it's something I deal with pretty much daily.

When you're dealing with an XPages app in an NSF, this isn't really an issue: all the parts you need are there and are part of the classpath. You just reference lotus.domino.Database or com.ibm.xsp.extlib.util.ExtLibUtil and don't even give it a second thought. When you have a project outside of an NSF or Designer, though, you start to have to worry about this.

OSGi Projects

For OSGi-based projects, this means that you need to have a Target Platform that points to the XPages artifacts and then either have a variant of it that includes a packaged Notes.jar or include Notes.jar in your classpath another way. In Eclipse, this might be accomplished by adding Notes.jar to your active JVM and referencing a Notes or Domino installation's OSGi directories - this is something the XPages SDK helps with.

The immediate trouble this involves is if you want to build this project outside of Eclipse - most commonly now with Maven. This is where the IBM Domino Update Site for Build Management came in, which is a cleanly-packaged p2 update site of the XPages artifacts and Notes.jar, suitable for use with Maven+Tycho and any other tool (like Eclipse) that gets its dependencies out of a p2 repository. Unfortunately, it hasn't been updated since its initial release, and contains just the original 9.0.1 versions.

To aid with creating updated versions of that, I created the generate-domino-update-site tool a while back. Since no one outside HCL can legally share update sites themselves, the tool is the next-best thing: point it at Notes or Domino and it'll make one for you in a consistent way.

With either of those routes, though, there's still a gotcha: you still need to have each developer set up the update site for themselves, and it's only consistent across projects because the community settled on the notes-platform Maven property as a URI pointing to the update site. This is as opposed to something like Eclipse-the-IDE's repositories, which (by virtue of being open-source) are publicly available and can be referenced freely.

Overall, it's a drag having to bring-your-own-site, but at least the use of notes-platform as a pseudo-standard smooths it out.
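
For reference, that idiom looks something like this in a Tycho project's pom, with each developer defining the notes-platform property - usually in a profile in their settings.xml - to point at their own copy of the update site:

<repositories>
    <repository>
        <id>notes-platform</id>
        <layout>p2</layout>
        <url>${notes-platform}</url>
    </repository>
</repositories>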

Non-OSGi Projects

Things get stickier with non-OSGi projects, though. With OSGi projects, the dependency mechanism lines up with the way the artifacts are delivered from the vendor: they all have OSGi metadata (or have a ready-made hook for it, like Notes.jar) and so just making a p2 site out of them makes them ready to go. They don't, though, have Maven metadata, and so referencing them that way takes extra processing.

I've gone about this two ways to date:

  • The aforementioned update site project also has a mechanism for "Mavenizing" update sites. You point the tool at an existing p2 site (like one created by the first step), pick a groupId for it, and it'll install the files into your local repository.
  • The P2 Maven Resolver plugin, which cuts out that middle step and uses a p2 repository as a source of Maven dependencies directly. This route is more "clever", but some tools get a little shaky with it.

Either way, the experience is okay but not perfect. There are some oddities to do with the different dependency mechanisms between OSGi and Maven, but overall it gets the job done.

The core trouble with it is that it's even less consistent across developers/projects than the Tycho notes-platform idiom. I've personally gone through a couple iterations of the Mavenized layout, with different inter-dependency schemes and groupIds. That leads to drift and incompatibility among projects. For example, I use the xpages-runtime project for client work to do my lingering XPages development, and there's some friction in keeping the dependency schemes between that and the client project in line, even though I'm the only developer.

What I'd Like

What I'd really like would be an official HCL-provided or -sanctioned repository for p2 and Maven use for these artifacts. I've pitched the idea of OpenNTF hosting this, since I already have the tools and servers on hand, though we'd have to come up with a way to agree about who is legally allowed to access it. All the better would be consistently-updated HCL-hosted repositories, where they could link access to one of the various HCL accounts we tend to have.

The best route would be to publish it on a repository that doesn't require authentication. While I'm making wishes, attaching Javadoc would be a classy touch too.

Anyway, that's the gist of it. It's one of the two main thorns in my side when doing Domino-targeted development (the other being initializing the runtime itself in the process), and it'd save me a whole lot of heartache if it had a proper solution.

Quick Tip: JDK Null Annotations for Eclipse

Dec 10, 2020, 3:17 PM

  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse

A few years back, I more-or-less found the religion of null analysis, and I've stuck with it with at least my larger projects.

One of the sticking points all along, though, has been Eclipse's lack of knowledge about what code not annotated with nullness annotations does, with the biggest blind spot being the JDK itself. For example, take this bit of code:

BigDecimal foo = BigDecimal.valueOf(10).add(BigDecimal.ONE);

That will never throw a NullPointerException, but, since BigDecimal#valueOf isn't annotated at all, Eclipse doesn't know that for sure, and so it may flag it as a potential problem. To deal with this, Eclipse has the concept of external annotations, where you can associate a specially-formatted file with a set of classes and Eclipse will act as if those classes had nullness annotations already.
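
For a sense of the format, an external annotation file is just plain text; a rough sketch of an entry covering BigDecimal#valueOf might look like this (illustrative, written from memory rather than lifted from a real annotation set):

class java/math/BigDecimal
valueOf
 (J)Ljava/math/BigDecimal;
 (J)L1java/math/BigDecimal;

The "1" there marks the return type as known non-null, while a "0" would mark it as nullable.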

Core JDK Annotations

Unfortunately - and as opposed to things like IntelliJ - Eclipse for some reason doesn't ship with this knowledge out of the box. For a while, I just dealt with it, throwing in technically-unnecessary checks around things like Optional#get that are guaranteed to return non-null. The other day, though, I decided to look into it more and found lastNPE.org, which is a community-driven project to provide such external annotations.
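
As an example of the kind of technically-unnecessary check I mean - findName and process here being hypothetical stand-ins:

Optional<String> name = findName();
if(name.isPresent()) {
    String value = name.get(); // guaranteed non-null by contract
    if(value != null) { // unnecessary, but quiets the warning
        process(value);
    }
}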

Better still, they also provide an Eclipse plugin (404 expected on that link - Eclipse knows what to do with it) to apply rules from your project's Maven configuration to the IDE. This not only covers applying external annotations, but also synchronizing compiler configurations.

Sidebar: The Eclipse Compiler

By default, a Java project is compiled with javac, the stock Java compiler. Eclipse maintains its own compiler, variously called ECJ or (as shorthand) JDT. Eclipse's compiler is, unsurprisingly, well-geared towards IDE use, and part of that is that it can flag and process a great deal of semantic and stylistic issues that the stock compiler doesn't care about. This includes null annotations.

Maven Configuration

With this information in hand, I went to configure my project's Maven build. The first step was to change it to use Eclipse's compiler, since I had recently switched the project away from being Tycho-based (which uses ECJ by default). This can be done by configuring maven-compiler-plugin:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <failOnWarning>false</failOnWarning>
        <compilerId>jdt</compilerId>
        <compilerArguments>
          <properties>${project.basedir}/../../config/org.eclipse.jdt.core.prefs</properties>
        </compilerArguments>
      </configuration>
      <dependencies>
        <dependency>
          <groupId>org.eclipse.tycho</groupId>
          <artifactId>tycho-compiler-jdt</artifactId>
          <version>1.7.0</version>
        </dependency>
      </dependencies>
    </plugin>
  </plugins>
</build>

I use an inline dependency on Tycho's tycho-compiler-jdt to provide the compiler. I stuck with version 1.7.0 for now because Tycho 2.0+ uses the newer core runtime that requires Java 11, which this project can't yet adopt for platform-lag reasons. I also find it useful to set <failOnWarning>false</failOnWarning> here because ECJ throws many more (legitimate) warnings than javac. Long-term, it's cleanest to clear those warnings and keep failOnWarning enabled.

I also configured Eclipse's compiler settings like I wanted for one of the project's modules, then copied the settings file to a common location. That's where the compilerArguments bit comes from.
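
For reference, the null-analysis-related entries in that org.eclipse.jdt.core.prefs file look something like this (a representative subset; the severities are whatever you pick in the UI):

org.eclipse.jdt.core.compiler.annotation.nullanalysis=enabled
org.eclipse.jdt.core.compiler.annotation.nonnull=org.eclipse.jdt.annotation.NonNull
org.eclipse.jdt.core.compiler.annotation.nullable=org.eclipse.jdt.annotation.Nullable
org.eclipse.jdt.core.compiler.problem.nullReference=error
org.eclipse.jdt.core.compiler.problem.potentialNullReference=warning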

Then, I went through the available libraries from lastNPE.org, found the ones that match the libraries we use, and added them as dependencies in my root project:

<properties>
  <lastnpe-version>2.2.1</lastnpe-version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.lastnpe.eea</groupId>
    <artifactId>jdk-eea</artifactId>
    <version>${lastnpe-version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- and so on -->
</dependencies>

Once I updated the project configurations, Eclipse churned for a while, and then I got to work cleaning up the giant pile of new errors and warnings it brought up. As usual with null checks, this was a mix of "oh, nice catch" and "okay, sure, technically, but come on". For example, it flags System.out.println as a potential NPE because System.out is assignable - this is true, but realistically my app's code is going to be the least of my concerns when System.out is set to null.

In any event, I was pleased as punch to find this. Now, not only do I have a way to properly check nullness against core classes and common libraries, but it's shared among the whole project team automatically and enforced at compile time.

The Intricate Work of OSGi Dependencies on Domino

Dec 2, 2020, 3:03 PM

Tags: domino osgi
  1. Converting Tycho Projects to maven-bundle-plugin, Initial Phase
  2. Winter Project #2: Maven P2 Repository Resolver
  3. OpenNTF Fork of p2-maven-plugin
  4. The Intricate Work of OSGi Dependencies on Domino

One of the main goals of OSGi is proper runtime dependency management, not only allowing a bundle to declare what its dependencies are to ensure that they're there, but even to select one of multiple available versions. For example, you might have one bundle that expects Guava 15 but not higher and another that expects 18 or above, and both can be loaded successfully if you have multiple Guava versions installed. The goal is that you're supposed to be able to just throw a whole bunch of bundles into a pot and the system will figure it out.

Before I get into why this system is hobbled on Domino, I'll add a little more background.

Mechanisms

For our purposes here, bundles have two main ways to declare what they are (there are more than this, but they're not relevant right now):

  • The bundle's symbolic name combined with its bundle version. For example, the core bundle for ODA is named "org.openntf.domino" and has a version like "11.0.1.202006091416".
  • The bundle's exported packages, which are Java class packages and might also individually have versions. Using ODA again as an example, it exports a slew of packages, such as org.openntf.domino and org.openntf.domino.nsfdata, though it doesn't specify versions for any of these.

The versions of the bundle and the exported packages don't need to be the same, nor do all the exported packages need to have the same version. This shows up a lot in "spec bundles", where a vendor will wrap a standard spec API in their own bundle for various reasons. For example, Apache Geronimo has a bundle called "org.apache.geronimo.specs.geronimo-jaxrs_2.1_spec", which provides the JAX-RS 2.1 spec. The bundle itself has a version of "1.1.0", while the exported packages are all version "2.1".

To require a bundle by name, you use the Require-Bundle header, like Require-Bundle: org.openntf.domino;bundle-version="11.0.0", which requires specifically the ODA bundle, version 11 or above. To require a package, you use Import-Package, like Import-Package: javax.ws.rs;version="2.1.0". The two methods have some implications when it comes to how the ClassLoaders work, which I touched on a bit earlier this year.
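
Put together, a hypothetical consumer bundle's MANIFEST.MF might mix both styles like so (the symbolic name here is made up):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.consumer
Bundle-Version: 1.0.0
Require-Bundle: org.openntf.domino;bundle-version="11.0.0"
Import-Package: javax.ws.rs;version="2.1.0"

Unless you specify a range, those version attributes act as minimums.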

Culture Differences

The package-based mechanism is generally preferred nowadays over the bundle-based one, largely for the kind of flexibility and division-of-responsibilities it provides. For example, if I want version 2.1.0 of javax.ws.rs, my code shouldn't care at all whether it comes from Geronimo's bundle, JBoss's, or anywhere else - it's all the same thing (in theory). This is extremely common for projects created with bnd and tools based on it, which can generate imported and exported packages automatically based on your code and some small configuration. For example, the bundles that make up Open Liberty use bnd config files that have some loose configuration, which then is processed out to full listings of packages with version information. This is also often paired with OSGi's "Capability" system, but that's one of the "not important for now" things.

Domino - and I believe this is inherited from Eclipse, which does the same thing - is largely based on bundle requirements. For example, the org.eclipse.jdt.ui bundle (part of the Java development tools in Eclipse and Designer) requires a bevy of bundles by name and version and doesn't tag any of its exported packages with versions. This is similarly reinforced in the tools. Tycho, unsurprisingly, brings the "Eclipse PDE" style to the Maven world, where the MANIFEST.MF file is more hand-crafted and the tools don't do as much for you automatically. You can go full Import-Package and attach versions to your exported packages with Eclipse, but the tooling doesn't encourage it.

Conflicts in Practice

That brings us back to how this all contributes to making working with OSGi a little extra annoying on Domino. This is something that comes up constantly in my XPages Jakarta EE Support project: I want to bring in an implementation component for a JEE spec, but it will either already have or will have generated for it OSGi rules that lean towards the "package-style" of doing things, versions and all. Because there are common packages (like, say, javax.activation) that are supplied by bundles present in the Domino runtime, I want to use those. However, since the packages from Domino don't have versions specified, I need to re-wrap the bundles to import the packages without versions. Thus begins this ongoing nightmare.

Another approach I could take would be to bring my own, nicely-OSGi-ified versions of the afflicted packages, and options abound. However, that leads to a sneaky other trouble: because some of the Domino bundles are multi-spec monsters - with com.ibm.designer.lib.javamail being the absolute worst culprit - some other bundle on the system might casually import javax.activation and javax.mail. If I have this other spec implementation floating around, then it could get matched to my bundle for the former and IBM's for the latter... and crash right into the "exposed to a package via two dependency chains" problem. That wouldn't necessarily be an issue if everything used versions on packages, but leaving them off means it's kind of up to the container to match bundles to each other, and it's entirely happy creating impossible conflicts.

Ways Around It

When working through this sort of trouble, I've found a few ways around the things I've run into. The first is what I mentioned above: using p2-maven-plugin to re-wrap bundles with instructions that make them more Domino-friendly - chiefly, importing the Domino-provided packages without version constraints.
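
In p2-maven-plugin configuration terms, such a re-wrap looks roughly like this (the artifact ID is a made-up example):

<artifact>
    <id>com.example:some-jaxrs-impl:1.0.0</id>
    <override>true</override>
    <instructions>
        <!-- Import the Domino-provided package with no version constraint -->
        <Import-Package>javax.activation,*</Import-Package>
    </instructions>
</artifact>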

Aside from reworking existing bundles, I have a few times ended up creating fragment bundles that glom onto one bundle to tie it to another one after the fact, usually for the components that bridge different JEE specs.

The Wildcard: The System Bundle

In general, with OSGi, you can expect the packages you need to be provided in bundles, but it also allows for the core JDK to be implicitly available - that is, things like java.lang don't need to be imported, and are always present. That's fine, since it's generally well-enough-defined what makes up the JDK and what you'd need to instead bring in.

However, the Domino core classpath doesn't contain just the JDK, but also anything in jvm/lib/ext and (as a curveball) ndext from the Domino directory. Above, I specifically pointed out the javax.activation package, and that's because it suffers the most from both the javamail bundle and this classpath. If you run tell http osgi packages javax.activation on Domino's console, you should see that bundles get this from two places: some use com.ibm.designer.lib.javamail, while others use the system bundle, org.eclipse.osgi. That "system bundle" concept is not only the thing in charge, but it also passively provides access to classes found in the lower-level classpath. In a fully OSGi environment, that system classpath will be pretty clean, but Domino's isn't that.

The good news here is that it isn't usually a big problem. If you import packages with a non-zero version, they'll never match from the system bundle this way. Still, it's always there, lurking, and the problems increase if you start adding more JAR files to jvm/lib/ext to avoid policy or amgr-memory issues. We've seen this periodically with ODA, where we used to support the use case of putting the core files there and using just a shim at the XSP layer, before it became too much of a hassle to manage. I ran into it again more recently when Guava showed up there: because I hadn't specified a version, it was able to match from the system bundle, and I ended up with classes available at compile time but missing at runtime.

The Upshot

The upshot here is that there's no simple advice for dealing with this. Cultural and implementation factors make bringing third-party code into Domino unusually difficult, but it can generally be dealt with via various patching mechanisms. The XPages JEE project's dependency module ended up turning into a trove of such workarounds, so perhaps it can be useful if you ever run into this sort of thing yourself.

Java Discontinuities in Practice

Nov 23, 2020, 11:04 AM

Tags: java

Earlier this year, I wrote a post about the lay of the Java land, and in it I mentioned the oddities of post-8 Java releases as well as the then-oncoming namespace conversion in Jakarta EE. Those changes are a bit more "real" now, so I think it's worth taking the opportunity to expand on them and how they relate to Java with Domino.

Jakarta EE 9

With Jakarta EE 9 officially out now, I think it's all the more important to keep an eye on what these changes are. For Jakarta's part, there's a convenient post up on Eclipse's site detailing the specifics of what's going on, and most of what I say here is really just going to rehash that.

The "namespace conversion" in question is the switch from javax.* to jakarta.* for EE-related packages like Servlet, due to Oracle not granting rights to the "javax" term. This has involved a lot of fiddly work internally for the Jakarta project as a whole, and all of the included specs have received a major-version bump to reflect the break. In general, these new versions are functionally equivalent to the previous release, but use the different package names - so Servlet 5 has the same capabilities as 4, JSF 3 as 2.3, and on down the line.

A bit of a quirk in this is that not all classes in the javax.* namespace will be moving to jakarta.*, because not everything in there was part of Java EE. For example, Swing is in javax.swing, but it's not going anywhere. It gets fiddlier, too, especially when it comes to XML. The JDK traditionally (more on that in a bit) contained a couple distinct technologies wrapped up under the javax.xml package space, but some of those are actually part of Java EE and make the transition to Jakarta. For example, the javax.xml.transform package (covering XSLT) is part of what was originally termed "Java API for XML Processing", or "JAX-P", and is still part of the Java SE core. The javax.xml.bind package (covering mapping between XML and Java objects) was part of the "Java Architecture for XML Binding" API, or "JAX-B", and is not part of Java SE anymore. It's now "Jakarta XML Binding" and is receiving a package change to jakarta.xml.bind. I think it's cases like these that will be hairy for a lot of people not doing full Jakarta EE 9 work.

For the most part, this won't have an effect on Java development on Domino for a good while. Domino has never tracked changes in the EE world - XPages was a partial fork of Java EE 5 and that's been about it. I think that the ways it will affect Domino development (other than if you just outright do Jakarta EE development, which you should) is that code examples and third-party libraries are going to gradually transition over to the new namespaces, making them incompatible with code in the Domino stack. This will certainly affect things like my XPages Jakarta EE Support project, where future versions of the implementation components won't be usable directly if they use the Servlet spec, even if they don't require Servlet 3+ functionally.

So I think it's worth being aware of what's going on, even if there's not (yet) anything you need to do about it. The same applies to the changes in the core Java runtime itself.

Java 11 and Beyond

After 8, Java switched to a peculiar numbering system, where new major-version-numbered releases come out every six months, but only the ones that come out every three years are Long-Term-Support releases. As of right now, the current version of Java is 15, but 11 is the active LTS one, and so 11 is effectively the "real" current version for concerns like platform vendors. Java 8 is now in the same spot that Java 6 was for a while, where it's been the baseline expectation for a long time, and it's a slog of a process to move the full community past it.

Still, Java 11 is certainly hitting critical mass now. Eclipse-the-IDE started requiring it in the 2020-09 release, and the various app servers have either supported it for a while or are on the cusp of doing so.

There are a lot of nice things added to the language in the releases past 8, but they've also gotten more aggressive about removing things from the core Java SE runtime, and those changes are the things likely to be immediately noticeable Domino-wise. As I mentioned above, JAX-B was always technically an EE specification, but it was shipped with Java SE for a good long time. As of Java 11, though, it's gone, and instead must be either provided by the app server or brought in as an explicit dependency. The same goes for some less-important packages, such as org.omg - though that package sounds fun, it stands for "Object Management Group" and it just included some classes used for CORBA.

I imagine that few Domino developers use JAX-B or CORBA directly, but our old nemesis Notes.jar sure does! If you're doing any project builds outside of Domino that make use of the Notes.jar API, you likely already have or will soon run into this. For Tycho, I made a patch fragment that provides the required API to the com.ibm.notes.java.api bundle a good while back. For non-Tycho projects, your best bet is generally to include a dependency on the GlassFish-packaged variant and a pre-3.0 version of the Jakarta XML Bind API.
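
For that non-Tycho route, the pair of dependencies would look something like this, with 2.3.3 as a representative pre-3.0 version:

<dependency>
    <groupId>jakarta.xml.bind</groupId>
    <artifactId>jakarta.xml.bind-api</artifactId>
    <version>2.3.3</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>2.3.3</version>
    <scope>runtime</scope>
</dependency>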

There will be some further removals down the line, like RMI Activation, but I don't think any currently on the horizon will be as pertinent as those.

Java With Domino Roundtable Recording

Nov 17, 2020, 4:37 PM

Tags: java

I hosted my "Java With Domino" roundtable earlier today, and I think it went pretty well! We ended up having just about the ideal number of participants, and it was not only great hearing how people feel on the topic, but also seeing and hearing from everyone.

I've put the video up on YouTube.

I'm thinking of doing more of these, and kind of making them a looser, more-casual companion to OpenNTF's webinar series. I don't know whether they'd all be on similar topics or what, but it seems like it'll be worth continuing.

OpenNTF Fork of p2-maven-plugin

Nov 14, 2020, 1:56 PM

Tags: maven tycho
  1. Converting Tycho Projects to maven-bundle-plugin, Initial Phase
  2. Winter Project #2: Maven P2 Repository Resolver
  3. OpenNTF Fork of p2-maven-plugin
  4. The Intricate Work of OSGi Dependencies on Domino

It's been one of my long-running goals to reduce my use of Tycho for my work. While Tycho does what it says on the tin, the way PDE works in Eclipse means it's an ongoing nightmare to deal with when I want to do simple things like add a new dependency. This isn't really Tycho's fault as such, and the project itself is making major steps to alleviate some issues, but it's the nature of the surrounding tooling. Even beyond that, the shaky support in IntelliJ and total lack of support in Visual Studio Code and similar editors makes it a real thorn in my side.

Still, though, it brings a lot to the table, particularly when dealing with Domino-targeted projects. Because Domino's OSGi layout is... fiddly, it's often safest to use the "Manifest-first" approach for dependencies, and it's definitely important to still be able to do feature projects and p2 repositories for importing into Designer and Domino.

But I've still been trying to whittle away at the constraints over time, and I got fed up enough yesterday to make some major strides.

The Original Project

One of the major tools in my toolbelt for years has been the p2-maven-plugin, which does a lot of heavy lifting when it comes to taking non-Tycho or non-OSGi-focused projects and making them palatable for an OSGi environment. Even when I don't use it as the backbone of a project, I tend to use it to gather third-party dependencies and process them to make them Domino-friendly.

The Fork

It has its limitations, though, that have kept me from using it to replace the final steps of a Tycho build, and those are the ones that I set out to improve. Yesterday, I forked the project and got to work. Most of my work centered around letting it pull more information out of existing p2 repositories. While it already has some knowledge of such repos, it was still geared heavily towards only using them to pick up a bundle here or there. The big annoyance for me there was that I wanted to bring in entire existing p2-housed features into the final update site.

For example, one of my big projects consumes and redistributes a bunch of upstream projects, such as ODA and the XPages Jakarta EE support. While the p2-maven-plugin made it possible to reference those projects as Maven artifacts or individual bundles, I couldn't do what I wanted and just say "bring X and Y features in, including all their bundles".

I also went in and added a few other niceties needed for Domino: generation of the antiquated "site.xml" file for the NSF Update Site, archiving of the final site for distribution, and so forth.

The Implications

With my changes, I was able to delete all of the feature projects in the tree, which lowers the mental complexity a bit. That also means that the only parts "controlled" by Tycho now are the actual bundle projects, and those have a clear path to de-Tycho-ization. Though doing that will make it a little more difficult to know when dependencies are Domino-suitable ahead of time, the conversion should save a ton of hassle overall.

So now, I have a toolchain that should be able to work together to replace Tycho while still working with the Equinox-heavy target:

  • maven-bundle-plugin to generate the OSGi metadata in META-INF/MANIFEST.MF. I could also use bnd-maven-plugin directly for this and bndtools in Eclipse, but I'm not sure that it'd gain me much in practice
  • generate-domino-update-site to create p2 repositories from post-9.0.1 Domino releases' XPages framework, which remains damnably non-Mavenized
  • p2-layout-provider to resolve p2-housed artifacts like those from above and OpenNTF projects and make them available as normal-enough Maven dependencies on the fly
  • The forked p2-maven-plugin to generate features and update sites, as well as to repackage existing bundles to be more Domino-friendly

What's missing now is an ability to run compile-time test suites in a true Equinox environment. I'm hemming and hawing on how important that really is, though. The tests I write only rarely expect the presence of OSGi - the main way it comes into play is for extensions, which are papered over by IBM Commons anyway. I've had a delightful time lately running tests of JAX-RS resources with Liberty's dev mode, and I'm pretty sure I saw some examples somewhere of building up and tearing down a scaffolding to run them during compilation, so maybe I'll switch to that anyway.

In any event, just having a tool to do this stuff is a huge weight off my back, and now the goal of a fully-normal-enough Maven project tree is tantalizingly in sight.

Upcoming Event: Java With Domino Roundtable

Nov 12, 2020, 3:31 PM

Tags: java

The other day, I floated the idea of running an unstructured roundtable discussion of working with Java either on or accessing Domino, and I think it'll be worth giving a shot.

Since Java with Domino is in a weird place, the goal would be to discuss the various ways that people are or want to use it. So that can include XPages, OSGi, REST services generally, Jakarta EE, Spring, Vert.x, and so forth. I'd also like it to be open generally. I imagine I'll have some preliminary remarks, but otherwise the goal is to be less like a webinar and more like a free-flowing discussion, in the vein of the "happy hour" and "coffee break" rooms from CollabSphere and Digital Week.

My current plan is to run it on short notice, next week:

Tuesday, November 17th
2:00 PM US Eastern (19:00 UTC)
https://zoom.us/j/99514285138
Password: Computers!

I'll also share this info on Twitter on the day of the event, so look for it there.

CollabSphere 2020 Slides and Video

Oct 29, 2020, 4:09 PM

One of the nice bonuses of an all-online conference is that session recording comes built-in, so I was able to snag that and put it up on YouTube for posterity.

Additionally, I uploaded my slides to SlideShare, though that loses out on the extremely-fancy 5-second videos I used.