Quick-Tip Thursday: Avoid Future Base64 Trouble In Java

Nov 14, 2019 9:35 AM

Tags: java sntt

TL;DR: Don't use com.ibm.misc.BASE64Encoder or com.ibm.misc.BASE64Decoder.

If you've been doing Java development for a while, you've likely run into a situation where you need to Base64-encode or -decode something. It's used commonly in MIME documents, data URIs, and various other areas typically related to data transfer.

Unfortunately, using Base64 in Java was awkward for a long time, with no natural choice in the core JDK until Java 8. The trouble over the years, though, is that earlier releases did contain unofficial options, and it became somewhat common in the community to use sun.misc.BASE64Encoder for this need. The problem with that is that sun.* packages are officially not part of the JDK and can't be relied upon to exist in every Java implementation. Still, a combination of lax enforcement of this in the tooling and the fact that those classes are in practice pretty universal has let them creep into Java code.

On the Domino side, we have a second problem: com.ibm.misc.BASE64Encoder.

Sidebar: The Different Flavors of Java

For the most part, setting aside Android, it's safe to think of "Java" as a monolithic entity, where having a JVM of a given version number on one system will be equivalent to the same on another. There are subtle differences, though, and several vendors have long maintained their own variants of Sun/Oracle's JVM (which is named HotSpot).

IBM has maintained one such variant, called J9, and infused all of their Java-using products, Domino included, with it. J9 has since been open-sourced and donated to Eclipse and, renamed to OpenJ9, has carved out a niche for itself by being particularly lean and speedy to spin up.

The Snag We Hit With J9

For the most part, the differences in JVM flavors don't matter to developers, but IBM played a bit of a trick on us by making duplicates of some sun.* classes under the com.ibm.* package. Unlike sun.*, which at least has a little community knowledge of being internal-only (and also just looks odd compared to standard classes), com.ibm.* has no such connotation, and there's nothing in Designer or the classes themselves to indicate that com.ibm.misc.BASE64Encoder (internal and non-portable) is any different from, say, com.ibm.commons.util.io.base64.Base64 (portable via IBM Commons).

So, over the years, use of that class has crept into XPages and Java agent code - it does the job and it's always been safe to assume that Domino-the-IBM-product would use J9-the-IBM-JVM. Domino isn't an IBM product anymore, however, and, to add to potential future trouble, even OpenJ9 builds from AdoptOpenJDK don't come with those com.ibm.misc.* classes.

The upshot is that, if your Java code runs on a JVM other than a fully-IBM-style one (whether intentionally or otherwise), you're liable to run into trouble wherever it casually used those JVM-internal classes.

The Options

If you're running a version of Domino using Java 8 (9.0.1FP8+), which you'd darn well better be at this point, you're in luck: the built-in java.util.Base64 class has a nice API and is guaranteed to be present on every JVM. This is your best bet, since it doesn't require any other dependencies beyond the core JDK and it has some nice features like variant encoders for MIME- and URL-safe use.
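For example, a quick sketch of the common cases:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Example {
	public static void main(String[] args) {
		// Basic encoding and decoding
		String encoded = Base64.getEncoder().encodeToString("Hello Domino".getBytes(StandardCharsets.UTF_8));
		byte[] decoded = Base64.getDecoder().decode(encoded);
		System.out.println(encoded); // SGVsbG8gRG9taW5v
		System.out.println(new String(decoded, StandardCharsets.UTF_8));

		// Variant encoders for URL-safe and MIME use
		String urlSafe = Base64.getUrlEncoder().encodeToString(decoded);
		String mime = Base64.getMimeEncoder().encodeToString(decoded);
	}
}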

If you're targeting a version of Domino that still uses Java 6 (don't), you do have one other safe-enough option that's available in both XPages and Java agents: javax.xml.bind.DatatypeConverter. This class is technically part of the JAX-B specification but is included in JVMs from version 6 through version 10. This is less of a clean choice than the java.util.Base64 class both because it's not as nice of an API and because use on Java 11+ will require bringing in an external dependency. Still, if you're stuck on an old release, it'll at least get you through any potential rough patches in the near future.
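Its use is similarly terse - a minimal sketch:

import javax.xml.bind.DatatypeConverter;

public class LegacyBase64Example {
	public static void main(String[] args) throws Exception {
		// Works on Java 6 through 10 with no extra dependencies
		String encoded = DatatypeConverter.printBase64Binary("Hello Domino".getBytes("UTF-8"));
		byte[] decoded = DatatypeConverter.parseBase64Binary(encoded);
		System.out.println(encoded);
	}
}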

Finally, in XPages, you have the aforementioned com.ibm.commons.util.io.base64.Base64 class, which will continue to be present due to being included in a base library of the XPages stack, but which is rendered obsolete by java.util.Base64.

So I recommend taking some time to look through any running Java code you have for this potential hiccup and making the switch. It's admittedly a little awkward to do, but it's still a good idea.

Writing Domino JEE Apps Outside Domino

Nov 8, 2019 1:47 PM

In my last post, I mentioned that I decided to write the File Store app as a Jakarta EE application running in a web server alongside Domino, instead of as an app running on the server itself. In the comments, Fredrik Norling asked the natural question of whether it'd have been difficult to run this on the server, which in turn implies a lot of questions about deployment, toolkits, and other aspects.

Why Not

My immediate answer is that it would certainly be doable to run this on Domino, but the targeted nature of the app meant that I had some leeway in how I structured it. There are a couple reasons why I went this route, but most of them just boil down to not having to deal with all the gotchas, big and small, of doing development on top of Domino.

As a minor example, I wanted to add some configuration parameters to the app, and for this I used MicroProfile Config. MicroProfile Config is a small spec that standardizes the process of doing key-value-based configuration, allowing me to just say that I want a named value and let the runtime pick it up from the environment, system properties, or a config file as necessary. It wouldn't be difficult to write a configuration checker, but why bother reinventing the wheel when it's right there?
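As a sketch of what that looks like in practice (the property name here is just an illustration, not the app's real one):

import javax.inject.Inject;
import org.eclipse.microprofile.config.Config;
import org.eclipse.microprofile.config.ConfigProvider;
import org.eclipse.microprofile.config.inject.ConfigProperty;

public class FileStoreConfig {
	// CDI route: picked up from system properties, environment variables,
	// or META-INF/microprofile-config.properties, in that order of precedence
	@Inject
	@ConfigProperty(name = "filestore.nsf.path", defaultValue = "filestore.nsf")
	String nsfPath;

	// Programmatic route, for code outside a CDI bean
	public static String lookupNsfPath() {
		Config config = ConfigProvider.getConfig();
		return config.getOptionalValue("filestore.nsf.path", String.class).orElse("filestore.nsf");
	}
}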

Same goes for having the SFTP server load at startup and be running consistently. If I did this on Domino, I could either put it in an NSF-based XPages app with an ApplicationListener and depend on the preload notes.ini parameter, or I could go my usual route and use an HttpService that manages the lifecycle. Neither route is difficult either, but they're both weirder and more fiddly than using servlet context listeners in a normal JEE app.
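In the JEE app, that's just a standard listener (class name and method bodies here are illustrative stand-ins, not the project's actual code):

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

@WebListener
public class SshServerLifecycle implements ServletContextListener {
	@Override
	public void contextInitialized(ServletContextEvent sce) {
		// Start the SFTP/SCP server when the web app spins up
	}

	@Override
	public void contextDestroyed(ServletContextEvent sce) {
		// Stop it cleanly when the app is shut down
	}
}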

And then there's just the death by a thousand cuts: needing to use Tycho to build, having to deal with Eclipse Target Platforms, making sure anything using reflection is wrapped in an AccessController block to avoid Java policy issues, the nightmare of dependency management, the requirement to use either Designer or the okay-but-still-cumbersome Domino HTTP restart development cycle, and on and on. With a JEE app, all of those problems disappear into thin air.

The Main Hurdle

All that said, there's still the hurdle of actually implementing Notes native API access outside of Domino. At its core, what I'm doing is the same as what has been possible for years and years, initializing a Notes/Domino runtime in a secondary process. The setup for this varies platform by platform but generally involves either setting a couple environment variables for your process or (as is the case with the Domino Open Liberty Runtime) spawning the process from Domino itself.

Beyond setting up your process's environment, there's also the matter of initializing each thread of your app. On Domino, all threads are inherently NotesThreads, but outside of that there's some specific management to be done. You can call NotesThread.sinitThread() and NotesThread.stermThread() manually or run your code in specifically-spawned NotesThreads. I largely do the latter, making use of an ExecutorService to handle maintaining a thread pool for me. I then added some supporting code on top of that to let me run blocks of code as an arbitrary named user while retaining a cached set of sessions. That part wouldn't really be necessary if you're writing an app that doesn't need to act as different user names, but it's handy for something like this, where incoming connections must run as a user to enforce security fields properly.
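The gist of that supporting code is something like this (a simplified sketch; the real version also caches sessions and supports running as arbitrary names):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;

public class NotesExecutor {
	// Each pooled thread is a NotesThread, so the Notes runtime is
	// initialized when the thread starts and terminated when it ends
	private final ExecutorService exec = Executors.newCachedThreadPool(NotesThread::new);

	public <T> T run(NotesWork<T> work) throws Exception {
		return exec.submit(() -> {
			Session session = NotesFactory.createSession();
			try {
				return work.apply(session);
			} finally {
				session.recycle();
			}
		}).get();
	}

	@FunctionalInterface
	public interface NotesWork<T> {
		T apply(Session session) throws Exception;
	}
}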

Philosophical Advantages

Beyond my desire to avoid hassles and get to use modern Jakarta EE goodies, I think there are also some more philosophical advantages to writing applications this way, and specifically advantages that line up with some of HCL's stated long-term goals as well. Domino has long been a monolith, and that has largely served it well, but keeping everything from the DB all the way up through to the app-dev stack in the same bag means you're often constrained in your toolkits and deployment choices. By moving things just outside of the main Domino tower, you're freed up to use different languages and techniques that don't have to be integrated and maintained in the core. This could be a much larger jump than I'm doing here, and that's just what HCL has been pushing with the AppDev Pack and the associated Node.js domino-db module.

I think it's beneficial to picture Domino more as a dominant central core with ancillary servers and apps running just adjacent to it - not a full-on Microservices architecture, but just a little decentralized to keep areas of concern nice and separate. Done well, this setup is a lot more flexible and fault-tolerant, while still being fairly straightforward and performant. It's also a perfect match for a project like this that's geared towards implementing a new protocol - it doesn't even have to worry about HTTP SSO or reverse proxies yet.

So I think that this is where things are heading anyway, and it's just a nice cherry on top that it also happens to be a much (much, much) more pleasant way to write Java apps than OSGi plugins.

Another Side Project: NSF SFTP File Store

Nov 5, 2019 4:12 PM

When I Know Some Guys kicked off, we bought a couple of Transporter devices to handle our file-syncing needs without having to rely on Dropbox or another hosted service. Unfortunately, Nexsan killed off Transporters a couple years back, and, though they still kind of work, it's been a back-burnered project for us to find a good replacement.

Ideally, we'd find something that would handle syncing data from our various locations transparently while also allowing for normal file access through some common protocol. Aside from the various hosted commercial services, there are various software packages you can run locally, like OwnCloud and NextCloud. I even got a Raspberry Pi with a USB hard drive to tinker with those, though I never got around to actually doing so.

Yesterday, though, I realized that we already have a fleet of privately-owned servers that replicate seamlessly in the form of our Domino domain. They also, conveniently, have nice capabilities for blob storage, shared user authentication, and fine-grained access control. What they didn't have, though, was any good form of file protocol. I'm pretty sure that Domino still has WebDAV built-in, but that's just for design elements. Years ago, Stephan Wissel created a project that works with file attachments, but that didn't cover all the bases I wanted and I didn't want to adopt the code base to extend it myself. There's also Karsten Lehmann's Mindoo FTP Server from around the same time, but that was non-SSH FTP and targeted at the local filesystem.

So that meant it was time for a new project!

The Plan

I initially looked at WebDAV, since it's commonly supported, but it's also very long in the tooth, and that has led to the projects implementing it being pretty old and cumbersome as well.

Then, I found the Apache Mina project, which implements a number of server protocols, SSH included, and is actively maintained. Looking into how its SFTP support works, I found that it's shockingly simple and well-designed. All of the filesystem access is based on the Java NIO packages added in Java 7, which is a pluggable system for making arbitrary filesystems.
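The core server setup is only a handful of lines. Here's a rough sketch against the Mina SSHD 2.x API (package names shift between versions, and the authenticator and filesystem parts are simplified placeholders for the NSF-backed versions):

import java.nio.file.Paths;
import java.util.Collections;

import org.apache.sshd.common.file.virtualfs.VirtualFileSystemFactory;
import org.apache.sshd.server.SshServer;
import org.apache.sshd.server.keyprovider.SimpleGeneratorHostKeyProvider;
import org.apache.sshd.server.scp.ScpCommandFactory;
import org.apache.sshd.server.subsystem.sftp.SftpSubsystemFactory;

public class SftpLauncher {
	public static void main(String[] args) throws Exception {
		SshServer server = SshServer.setUpDefaultServer();
		server.setPort(8022);
		server.setKeyPairProvider(new SimpleGeneratorHostKeyProvider(Paths.get("hostkey.ser")));
		// SFTP subsystem plus SCP command support
		server.setSubsystemFactories(Collections.singletonList(new SftpSubsystemFactory()));
		server.setCommandFactory(new ScpCommandFactory());
		// A stock filesystem here; the real project plugs in a custom
		// NIO FileSystemProvider backed by the NSF instead
		server.setFileSystemFactory(new VirtualFileSystemFactory(Paths.get("/tmp/filestore")));
		server.setPasswordAuthenticator((username, password, session) -> checkDominoCredentials(username, password));
		server.start();
		Thread.sleep(Long.MAX_VALUE); // keep the sketch's process alive
	}

	private static boolean checkDominoCredentials(String username, String password) {
		// Placeholder: verify against the Domino directory
		return false;
	}
}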

Using SFTP and SCP means that it'll work with common tools like Transmit and - critically - rsync. That means that, even in the absence of a custom app like Dropbox, mobile access and syncing with a local filesystem come "for free".

The Project

So out of this was born a new project, NSF File Server. Thanks to how good Mina is, I was able to get a NIO filesystem implementation and SFTP+SCP server up and running in very little time:

Screen shot of the SFTP server in Transmit

In its current form, there aren't a lot of tricks: the files are stored as attachments to normal documents in a "filestore.nsf" database with two views, which allow for directory-contents and individual-file lookup while also being pretty self-explanatory to a Notes client. I have some ideas about other ways to structure this, but there's an advantage to having it be pretty basic:

Screen shot of the File Store NSF

Similarly simple are the authentication mechanisms, which allow for both password and public-key authentication based on the HTTPPassword and sshPublicKey fields in a person document, respectively (and maybe via LDAP in directory assistance? I never remember the mechanics of @NameLookup).

The App

Because this is a scratch-our-itch project and I'm personally tired of dealing with Domino's OSGi environment, the app itself is implemented as a WAR file, expected to be deployed to a modern Jakarta EE server like my precious Open Liberty. Conveniently, I have just the project for that as well, making deploying NSF-accessing WARs to Domino a bit more reasonable.

For now, the app is faceless: the only "web app" bits are some listeners that initialize the Notes environment and then spawn the SSH server. I plan to add at least a basic web UI, though.

Future Plans

My immediate plan is to kick the tires on this enough to get it to a point where it can serve as its original goal of a syncing SFTP server. I do have other potential ideas in mind for the future, though, if I feel so inspired. Most of my current logged issues are optional enhancements like POSIX attribute support, more-efficient handling with the C API, and better security handling.

It's also a good foundation for any number of other interfaces. A normal web UI is the natural next step, but it could easily provide, for example, S3 API compatibility.

For now, though I haven't gotten around to uploading a build to OpenNTF yet, feel free to poke around the code and let me know if any ideas strike your fancy.

CollabSphere 2019 Slides

Nov 2, 2019 9:04 AM

I've uploaded my slides for my session at CollabSphere: "Preparing Your XPages Applications for a Changing Future":

CollabSphere 2019 - Preparing Your XPages Applications for a Changing Future

While working on the presentation, I realized that pretty much all of the slides in there warrant blog posts of their own, so, as I have the time and wherewithal, I plan to expand on the topics a good deal in the weeks to come.

Additionally, Howard Greenberg uploaded the slides from our shared presentation ("shared" in the sense that he did 90% of the work), "Let's Calendar That":

DEV112: "Let's Calendar That" - from Howard Greenberg

CollabSphere 2019

Oct 22, 2019 2:41 PM

Tags: mwlug

CollabSphere 2019 is just around the corner, and I'll be taking part in three sessions:

DEV115: Preparing Your XPages Applications for a Changing Future
Monday, October 28th at 5:45 PM
Brooks Room

DEV103: Developer Tools for the Next Decade (Roundtable)
With Graham Acres
Tuesday, October 29th at 1:00 PM
Brooks Room

DEV111: Let's Calendar That! - Add Calendars to your Domino web applications
With Howard Greenberg
Wednesday, October 30th at 9:00 AM
Brooks Room

The first one will be a continuation of our collective descent into madness when it comes to XPages, the second will be an open-ended discussion of how we can or do use OpenNTF projects in our work, and the last will be a practical example of such projects in action, in particular using them to write cleaner and more-portable XPages code.

Concept Proven: A Complex Production XPages App Running Outside Domino

Oct 7, 2019 1:13 PM

Tags: javaee xpages
  1. Letting Madness Take Hold: XPages Outside Domino
  2. XPages on Android
  3. Concept Proven: A Complex Production XPages App Running Outside Domino

Around the start of the year, I took a little time to see if I could get XPages as it exists today running outside of a Domino server, specifically inside Open Liberty. I met with a good deal of success, finding that I could run stripped-down apps "freely" inside a normal WAR web app, and I could run full-fledged NSF-hosted apps using the LCDEnvironment container-within-a-container technique that normal XPages uses.

Since then, the prospect has been percolating in my mind, and I came back around a few months later to get XPages running on iOS and Android by way of Darwino. At the end of that post, I mused a bit about what a proper setup would look like: essentially, taking XPages, custom controls, and supporting resources and putting them into a Maven-structured webapp.

This past week, I decided to put WoW Classic aside for a bit and put some evening-and-weekend time into seeing if I could get my client's huge, 500-XPages-and-CC, plugin-backed, JAX-RS-heavy app working in this setup. And, like before:

Short Answer

Yep!

Long Answer

Building on what I had before, there were some medium- and large-scale hurdles I had to overcome:

  • Hook up the remaining Servlets and listeners at the right times
  • Figure out what to do about transpiling XSP source
  • Add and adapt the remaining required IBM Commons Extensions
  • Tweak the OSGi bridge for more bundle-like behavior and remove Equinox dependencies in our code
  • Make the stack have a Notes runtime, but not think it's too much like Domino
  • Adapt Java EE standards use to ensure it'd work with different implementations

A lot of this work ended up being finding the right little bit to tweak or add - an environment variable here, a META-INF/services file there - but a couple of the topics warrant expansion.

App Legwork

The core of what made this possible was a combination of coincidental and intentional work I had been doing in the main app for a good while. The first big part was that I had started bringing in components of my separate XPages Jakarta EE Support project, in particular bean validation and JAX-RS 2.1. We had previously used an older version of Hibernate Validator, and we were using the ExtLib-provided Wink for a good while to build out our REST services. Moving to RESTEasy's JAX-RS 2.1 implementation not only let us use the newer features it provided, but also let us set aside the Wink-isms we had been using in a couple places in favor of what became standard after Wink went moribund.

I also stripped out as many Equinox-isms as I could that I had made assumptions about over the years, such as removing home-brewed Equinox extension points in favor of portable IBM Commons extensions and reducing our use of Require-Bundle in favor of Import-Package, which pointed out all the areas where we depended on a specific implementation of a standard as opposed to just the cross-server spec.

Notes Runtime

One of the pleasant things I found out in my original experiments was that, while a lot of the newer and more-complex components of the XPages stack end up having some assumptions about the presence of a Notes or Domino environment, there's no dependency on nHTTP specifically. As long as the stack can call into the lotus.domino and NAPI classes, it's happy.

Really, the only thing I had to tiptoe around was making sure that I didn't include the LCDEnvironment stuff and its associated platform assumptions. At one point, I did investigate whether I could use that and make the running web app just another "module" like an NSF - there are even partial support classes like FileModule for this kind of thing in a "mashupmaker" package - but I ran into too many assumptions about the environment and class hierarchies that way. It's a bit of a shame, since that would have allowed transparently referencing NSF-housed apps in the same server, but that was only a "nice to have" anyway. Plus, it would have had the downside of keeping everything within the LCD wrappers, which do things like report the environment as Servlet 2.5 no matter how they're running (which is, horrifically, an upgrade over the underlying 2.4 on Domino).

XSP Source

My original experiment and my bootstrapping here involved copying the generated Java xsp.* files from Designer into the src/main/java build path of the Maven project, which are then picked up by the CompiledPageDriver used by the XPages runtime. This works, but it's not exactly developer-friendly. To make it in any way practical, I'd need to be able to continue working with the XML source like in Designer.

The trouble here is conceptually similar to the impediment to incremental compilation in the ODP Compiler: due to the way XSP Libraries work, they require not only the presence of the full classpath to know what components are available, but also an active running Java application. They can't just be derived statically - this is why you have to install library plugins into Designer-the-application.

However, I had a distinct advantage here: though I didn't have a running app at editing/compile time, I sure would have a running app while the app is, uh, running. And that is a problem I did solve in the ODP Compiler, thanks to the Bazaar's hooks into the XSP interpreter inside the XPages stack. I realized that I could implement a FacesPageDriver that received requests for pages and compiled them on the fly. The interface is very simple: it contains only one method, which takes a FacesContext and a page name (like "/foo.xsp") and returns a FacesPageDispatcher, which is the obliquely-named interface implemented by the translated xsp.* Java classes.

Really, this could be anything; it's not actually tied to the Java translation and compilation process at all, and could be something like a live interpreter that turns XSP into UIComponent objects on the fly, which is something I considered. Though the use of compiled classes is an implementation detail, I ended up deciding to still piggyback on that, since a dynamic interpreter would have extra legwork to do to make sure the behavior was the same, whereas using the translator guarantees the same results as if the pages were compiled the normal way. In practice, the main differences between my code here and the code in the ODP Compiler are that I could use the existing component registry from the running app and that I ended up compiling the classes as Groovy source instead of Java. I did the latter because the Groovy in-memory compiler doesn't require the same kind of classpath crawling that the Java compiler does, which had ended up being a major performance sink in practice. Groovy is almost a strict superset of Java syntax, so it only took a little bit of tweaking to the generated source to make it work - tweaking that could be done simply since the problem domain is so small.
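Conceptually, the driver ends up shaped something like this (a bare-bones sketch: the method name is an assumption on my part based on the description above, and the actual translation/compilation step is elided):

import javax.faces.context.FacesContext;

import com.ibm.xsp.page.FacesPageDispatcher;
import com.ibm.xsp.page.FacesPageDriver;

public class TranslatingPageDriver implements FacesPageDriver {
	public FacesPageDispatcher loadPage(FacesContext context, String pageName) {
		// pageName arrives as something like "/foo.xsp". This is where the
		// Bazaar-backed translator reads the XSP source, compiles the result
		// (as Groovy, in my case), and returns an instance of the page class,
		// which itself implements FacesPageDispatcher.
		throw new UnsupportedOperationException("Translation step omitted from this sketch");
	}
}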

Liberty-Specific Gotcha

Though I'm still smitten with Open Liberty as my Java host of choice, I did pull my hair out a bit over some of the side effects of it. Specifically, I ran into trouble when having the JSP feature enabled (which is on by default), where it would load up com.sun.faces.config.ConfigureListener much earlier in the process than it's supposed to be. It's something to do with an optimization for when you have real JSF enabled as well, but, since IBM didn't rename the core packages, Liberty picked up on the presence of it and kicked it off on server start, instead of during app load when it's expected.

Fortunately, Liberty's architecture is such that, if you don't choose to enable a feature, it's entirely absent at runtime, so there was a clear workaround. It's a bit of a shame, since going from XSP to JSP is a legitimate potential path, but it's not the end of the world. Additionally, there's nothing Liberty-specific in my approach here. I haven't tried it, but the same app should work in Tomcat (if you bring in some standards implementations) or in another JEE server like Glassfish.

The Results

After wrangling with this stuff enough, though, the results were ideal: the full app, with all its dynamic content, runtime component additions, use of home.xsp/foo/bar path info shenanigans (its own hurdle in the Servlet world), dependency on ODA and half a dozen other XPages Libraries, and its plugin-served resources works just as it does on Domino. And, though some of the runtime setup was a bit hairy, the app itself is pretty svelte and straightforward. All the Java code is where it's expected to be, resources in src/main/webapp show up like you'd want, and it all just acts like it should.

The Future

So now I have an odd conundrum. I prefaced my post in January by saying that I didn't plan to do anything with the experiment, but now I'm looking over a precipice where I could theoretically plop the WAR file into the Domino Open Liberty Runtime, add a redirection rule to point to it, and call it a day. I certainly don't plan to, since there could be any number of weird problems with this and the app still points back to normal classic Domino web resources in some places, but now there's a dark voice whispering into my mind. "No more Designer," it says. "No more Tycho, or writing OSGi plugins just to get a third-party library to work nicely. No more Java policy nonsense. No more Servlet 2, or missing Javadoc, or the server not loading pages because the wrong logging JAR went into jvm/lib/ext. You could be free!" The dark voice makes a good case, I'll give it that. We'll see.

Medium-Term Ways to Improve XPages

Aug 28, 2019 1:18 PM

Tags: xpages
  1. What Makes XPages "Not Modern"?
  2. Medium-Term Ways to Improve XPages

Following up on my jeremiad the other day, I've been thinking about some of the short- to medium-term ways that XPages could be improved. I'm not fully sold on the idea that it should be massively improved in-place due to some of the systemic decisions and the general goal of future-proofing, but some improvements that HCL could make in the near future would make everyday XPages development better and would help people like me in our tasks of bringing in other tools.

The big caveat here is that I'm labeling these as small-scale tasks in theory. Since I don't know about all the workings of the lowest layers, in particular the way the XPages stack interacts with traditional nHTTP, this could easily fall into "this should be easy!" territory for reasons I can't know.

The other caveat is that I'm largely going to leave out bringing in new runtimes as a stated goal: no integrating Open Liberty, no cramming in a Node runtime, or so forth. The most expedient fix for some of this may actually be to do something like that, depending on how the "WebSphere" parts of Domino work to begin with, but that would be an implementation detail for the purposes of this list.

With those out of the way, let's proceed!

Servlet API

This has been a bugbear for a long time, and being able to bump up the Servlet API version would be huge for both its immediate features and for compatibility with newer standards and third-party libraries. I'm not sure what the source of the limitation is - it could be primarily to do with the deprecated web app part of Equinox Domino uses or it could be due to nHTTP-side C code.

Remove or Don't Export Apache Commons Logging From the JSF Libs Plugin

This may seem overly specific at first, but it's been the source of an outsized amount of hair-pulling for me and others.

I'm referring here to the "com.ibm.designer.lib.jsf" plugin that's a central part of the XPages stack. In addition to the obligatory "jsf-api" and "jsf-impl" JARs, it also contains very old versions of Apache Commons BeanUtils, Collections, Digester, and Logging. The trouble is that the Logging packages are exported from this bundle and so end up automatically on the classpath of every XPages application and plugin. In a vacuum, that's not terrible, but logging packages are a source of tremendous heartache when developing XPages and Java agents, and in particular they can often end up in jvm/lib/ext and cause ClassLoader trouble by mixing with the version in this plugin.

Ideally, it'd be removed outright or taken out of the exported packages and instead included just where it's needed. Next best would be to include newer versions as proper OSGi bundles with package-version information (more on that next).

Add version information to exported packages

What I'm referring to here is the OSGi mechanism for assigning versions not just to a bundle as a whole, but also to individual packages. As an example, here's a trimmed-down and cleaned-up version of the export list from the aforementioned JSF libs plugin:

Export-Package: javax.faces,
 org.apache.commons.logging,
 org.apache.commons.logging.impl

With version information (including a made-up "1.3" version to differentiate XPages JSF), it'd be more like:

Export-Package: javax.faces;version="1.3.0",
 org.apache.commons.logging;version="1.0.3",
 org.apache.commons.logging.impl;version="1.0.3"

Doing this across the board would make the XPages runtime much more OSGi-friendly.

Provide Public or OpenNTF-Permission-Gated p2 and Maven Access (With Javadoc!)

I've been on this soap box for a long while, and it comes up pretty frequently: because the XPages runtime exists solely as a component of Notes and Domino, you have to jump through hoops to depend on it as part of an external project. The "Update Site for Build Management" was a good step, in that it became one place to get a nicely-formatted p2 repository, but that was never updated past 9.0.1 and even then was just a ZIP download, not an HTTP-accessible p2 site like you'd get with things like Eclipse. Additionally, there's never been a Mavenized version of these artifacts, meaning anyone wanting to use XPages JARs without PDE or Tycho has to reinvent the wheel.

I made tools to automate these processes, but it'd be up to HCL to give the legal approval to make the results of that available generally (or to authenticated OpenNTF users who agreed to an EULA).

Shared-Source the Stack

Open-sourcing XPages would likely be a large task just because of the legal side of it, but perhaps just giving read-only access to licensed customers would be an easier step to take. There's so much about the inner workings of XPages that are effectively a black box, and it's very difficult to figure out how to accomplish a lot of advanced tasks. Having source access would make this sort of thing much, much smoother.

Support WABs or Similar

Alongside XPages came the ability to deploy servlets and JEE web apps via OSGi, but these use the deprecated Equinox extension points and an IBM-Expeditor-specific point, respectively. From what I gather, the more-normal way to do this in OSGi is via Web Application Bundles. That wouldn't necessarily change the capabilities of development for Domino dramatically, but it'd make it a smaller step to go from "normal web app" to deploying on Domino, and wouldn't necessarily involve a full revamp of the Expeditor layers.


That's it for this list, at least for now. I may end up thinking of enough for another one of these posts, but I wouldn't want to load HCL up with too much homework right after the summer break.

What Makes XPages "Not Modern"?

Aug 23, 2019 1:53 PM

Tags: xpages
  1. What Makes XPages "Not Modern"?
  2. Medium-Term Ways to Improve XPages

A big part of figuring out how to move past XPages is discussing what makes it no longer modern in the first place. Some of these reasons will be gimmes - out-of-date or outright-missing features - while some will be less about what XPages is strictly capable of and instead more about how XPages development and deployment works. I'm going to leave out some of the ones that are just from recent lack of development attention, such as Bootstrap 4 support remaining in alpha state.

Admittedly, I suspect that this post will come across as an enumeration of complaints, but I don't really mean it that way. XPages served us well for a while, but it's one thing to say "ah, XPages, it's a bit old and creaky" and another to think about just how pervasive that is.

Technical Limitations

The most straightforward reasons are the Java technologies that are included in the XPages stack, but are significantly out of date. This includes big-ticket core standards like Servlet 2.4 (faux 2.5), the forked JSF 1.x, JavaMail 1.3, pre-Unified EL, and JAX-RS 1.1; lesser-known standards like Activation 1.1; and even third-party libraries that show up in the fixed XPages classpath like Apache BeanUtils 1.6, Apache Collections 2.1, Apache Logging 1.0, and Guava 19 (a recent addition).

Some of these can be worked around relatively simply (including a newer JAR in your app/plugin or adding an OSGi bundle), but others are either more pernicious (like Logging and Guava, causing runtime ClassLoader trouble) or are outright showstoppers. I've done a lot of work to bring newer and missing Jakarta EE standards to XPages, but the ancient Servlet version is a hard limit on a lot of things. In addition to its own limitations like the lack of Web Sockets, third-party libraries and implementations often expect Servlet 3 as a bare minimum, and nowadays make use of Servlet 4.

Self-Imposed Hurdles

Beyond the technical limitations that largely come from outdated implementations, XPages is afflicted by what I'll deem "self-imposed hurdles": difficulties that spring from choices that Lotus/IBM/HCL actively made or chose to not make that resulted in things being a bit more difficult than in other environments. These decisions were often even good decisions in the aggregate, but they have tradeoffs.

The first big one is the use of the NSF container. Overall, this is a positive: not only did it create a more unified development experience with classic Notes elements, but the NSF itself brings attributes like external-to-the-webapp ACLs and seamless replication, which are extra addons at best with other frameworks. However, because the NSF is essentially a proprietary binary blob, it means that it can't readily participate with other tooling. You can't crack one open with a ZIP file tool, inside-JVM reflection/classpath tools can stumble over it, and it can't inherently participate in source control or automated builds. As above, some of that can be worked around with immense effort, but even then not nearly as smoothly as elsewhere.

Along the lines of the NSF, due to Designer's lineage and (presumably) the state of technology at the time that XPages was integrated, the normal XPages development process has no participation in the larger unified Java development world that Maven largely ushered in. Even besides not fitting into the build system, this introduces just a bit more friction in normal development. Yes, an XPages app can use many third-party Java libraries by just adding them to the Code/Jars section, but they don't benefit at all from version management or automated fetching of source and Javadoc. It may sound like a small thing, but it makes a world of difference once you're used to it.

That plays into a general Domino/Designer issue that stretches back to the introduction of Java to the platform: the libraries include no parameter info or Javadoc. Designer has its help sidebar for Notes.jar classes and the Javadoc exists for a subset of the XPages stack as of 9.0.1, but they're not integrated into the IDE, and large swaths of the platform don't have any Javadoc at all. This leads to stumbling blocks where you'll seek out documentation for a class that nominally exists in XPages, but you have to be wary of the "Since" note for every method (and even then you may be thrown for a loop, like UIViewRoot#getViewMap() that's "since" 2.0 but does exist in XPages).

Then there's a big sticking point that's created a weird cargo cult within the community: the default Java policy. While the file itself does indeed come from stock Java (including the adorable "properies" typo), the way it interacts with the SecurityManager and ClassLoader setup in XPages has an outsized negative impact. XPages apps butt heads with this policy constantly, in particular when using reflection - which is something that third-party libraries do often as a matter of course. It's had the effect of making many people shy away from using third-party libraries or code that uses normal Java features like this out of a partially-justified desire to not tweak the settings.

The last one in this category is the use of OSGi generally and Equinox specifically. I can't knock the original choice too hard: OSGi is a logical pick for making a large, pluggable system and IBM institutionally had tremendous experience with it by way of Eclipse and WebSphere. However, while it's a solid choice for system architecture, it's often very awkward for app architecture. And, hypothetically, XPages is designed to avoid this. Like the various Java EE servers that are implemented with OSGi but run plain-old WAR-based apps, a basic XPages app isn't meant to care about the OSGi layer at all. However, a combination of the above security-policy troubles, Designer's limitations as an IDE, explicit blocks on what classes an NSF can access, and the severely-limited EE services provided by the XPages runtime means that OSGi plug-in development came immediately to the fore. And developing plug-ins for XPages is hard. It involves all sorts of ceremony and incantations just to get started, and maintaining even a well-structured plug-in project is much more difficult than a normal Maven or Gradle project. Some of this is OSGi, some of it is Equinox specifically, and some of it is unique to XPages, but it all combines to make for a daunting process even for people who aren't put off by Java-the-language. So, much like with the security policy, you have a large subset of the XPages community that for either developer or admin reasons doesn't develop XPages libraries and often doesn't bring in any beyond what HCL ships in the box.

Development Model

And, finally, I'd like to cover the development model created by XPages even when everything is firing on all cylinders.

The first one is the lack of MVC (or similar) structure. XPages was introduced as bringing MVC to Domino development, in large part thanks to the underlying JSF nature, but this was something of a dirty trick. While it's theoretically possible to make a cleanly-structured XPages app, the framework provides no useful model layer, and Designer is almost actively hostile to this kind of separation. And, because the framework doesn't give any guidance and few of us in the community came in as seasoned web developers, the discussion about XPages and MVC is largely about each person's individual version of "MVC-style" development, and Lord knows I'm as guilty as anyone. This, like the lack of dependency management, is something that you don't fully appreciate until you work with a system with real structure and clear guidance on how to do it.

The second point in this section is one that I actually debated putting on this list, and that's XPages's use of server-side state. Depending on who you talk to, this alone is something that makes XPages dead in the water, an ancient relic of a forgotten era. Personally, I'm not so universally down on the notion: I feel that the worries about scalability are usually based on hypotheticals, and being able to make complicated dynamic apps maintained by the server cuts down on the work of writing and securing REST APIs for absolutely everything. Still, it's pretty clear that the server-state model has fallen out of favor, and that's worth at least considering when determining "not modern".

So What About Workarounds?

A running theme through all of this has been "X is behind the times or weird, but can be overcome with a bunch of work". And this shows up a ton in practice: a lot of XPages developers have gone to Herculean lengths to bring in other technologies or carve out a way to do structured programming, and it sometimes works. Others eschew XPages UI components entirely and build JavaScript apps backed by REST services written with the ExtLib components, JAX-RS, or other mechanisms. Especially since the Java 8 switch, you can do a lot to drag XPages forward.

But that doesn't make it "modern" really. This is all work fighting against the platform, work that doesn't need to be done with other tooling. In other environments, if you want to use the latest version of a spec, you make sure your container is up-to-date and then you just use it. If you want to bring in a dependency, you declare it in your "pom.xml" and it shows up, source and all. And then there are the things that other environments provide for you, like OpenAPI documentation, and concepts that don't even exist in XPages, like fault tolerance. Maybe some of those things could be brought to XPages too, but, again, it'd be a struggle every step of the way. There'd always be some weird thing to do with the Servlet version, or Tycho, or ClassLoaders, or loggers, or some of the many other little asterisks that accompany the platform.

The Work To Move Forward

My plan for future entries in this series is to discuss some of the specific steps that I've been taking with my largest active client project to prepare it for the future. That future is as-yet-undefined - it may stay within the Domino web container, it may not - but the good part is that a lot of the work is in common regardless of where we take it, and I think that it will prove useful to others facing similar situations.

Converting Tycho Projects to maven-bundle-plugin, Initial Phase

Aug 22, 2019 3:27 PM

Tags: maven osgi tycho
  1. Developing an Open/WebSphere Liberty UserRegistry with Tycho
  2. Developing Open Liberty Features, Part 2
  3. Converting Tycho Projects to maven-bundle-plugin, Initial Phase

To date, Tycho has been my tool of choice for developing Domino-targeted Maven projects. However, it's not without protest. Unlike most Maven plugins, Tycho inserts itself at the very start of the build process and takes over dependency management. Purely in Maven, you can use normal Maven dependencies, but only so long as you're pointing to a dependency that already has OSGi metadata (which, fortunately, most do), and only then to satisfy a Require-Bundle or Import-Package that also has to be present. This gets more annoying, though, when dealing with Eclipse, which removes the notion of Maven dependencies entirely when using Tycho and forces you to jump through hoops to do what you want. And, as a final kicker, Tycho's p2 repository support is completely broken in the latest release version of Maven.

So why do I keep using it, anyway?

Well, it brings a couple major benefits that are of particular importance for Domino:

  • It can use p2 repositories for dependencies. This matters because the XPages runtime plugins are not (yet?) available as normal Maven dependencies. Years back, IBM provided a "Build Management" update site (https://openntf.org/main.nsf/project.xsp?r=project/IBM Domino Update Site for Build Management), which is helpful, but it's still an Eclipse-style p2 repository, not a Maven repository. Tycho can use p2 repositories natively, though, just as Eclipse does.
  • It constructs a true Equinox environment. This matters both when compiling your project and when running automated tests. The environment created by Tycho is the same Equinox OSGi runtime that Domino uses, and so it supports the same styles of bundle resolution and extensions that you get in Domino. Without this happening during the build, you lose some assurance that things at runtime will match your expectations.
  • It spawns tests in a separate process. This is a little esoteric, but it matters because launching a Notes environment on a non-Windows platform more-or-less requires setting up environment variables for the Notes/Domino directory and others, and these variables are not successfully set when using the normal maven-surefire-plugin runtime. This means that reliably running tests requires setting up the environment ahead of time, which is fiddlier and less automated.
  • It can generate new- and old-style Eclipse Update Sites. To be used in Designer and NSF-based Update Sites, an OSGi project has to be packaged up into a p2 repository along with an old-style "site.xml" file. Tycho can generate these (and can be assisted with "site.xml" when using the newer style), and it can also auto-generate source bundles, features, and repositories.

Alternatives and Workarounds

Some of the "hard" requirements for Tycho can be at least worked around.

Years ago, I wrote a Ruby script that would take a p2 site like IBM's or one generated from a newer version and "Mavenize" it by creating artifact information based on each bundle's OSGi manifest. I since converted it to Java and included it in Darwino's Studio plugins, and yesterday added it to the generate-domino-update-site Maven plugin. Using that lets you declare dependencies on any of the bundles or embedded JARs in a normal Maven project:

        <dependency>
            <groupId>com.ibm.xsp</groupId>
            <artifactId>com.ibm.notes.java.api.win32.linux</artifactId>
            <version>[10.0.0,)</version>
            <classifier>Notes</classifier>
            <scope>provided</scope>
        </dependency>

This isn't perfect, since it's neither standardized nor generally available (go vote for the aha idea!), but at least it's reproducible and can be something of a de-facto standard if used enough.

Then there's the matter of generating appropriate OSGi metadata. Outside of the Tycho-using world, the main way of generating it is via a tool called bnd and its related tools. bnd is kind of a parallel world and there's even an alternate tooling set for Eclipse instead of the default PDE. There are a couple ways to use bnd in a Maven build, but the one I'm familiar with to date is the maven-bundle-plugin. I've used this with Darwino to incidentally create OSGi metadata for the otherwise non-OSGi core modules, and I suspect that it gets used heavily this way. It's more powerful than that, though, and is a nice wrapper for bnd under the hood, supporting Declarative Services annotations and all the other OSGi goodies. In my case, I used it to generate the MANIFEST.MF with most of the defaults, but then added in some specifics to play nice in my Domino Equinox target.
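For reference, the relevant chunk of the pom looks roughly like this (version and instruction values are illustrative, and the project uses the "bundle" packaging type):

        <plugin>
            <groupId>org.apache.felix</groupId>
            <artifactId>maven-bundle-plugin</artifactId>
            <version>4.2.1</version>
            <extensions>true</extensions>
            <configuration>
                <instructions>
                    <Bundle-SymbolicName>org.openntf.example.bundle;singleton:=true</Bundle-SymbolicName>
                    <Export-Package>org.openntf.example.api.*</Export-Package>
                    <!-- Anything referenced but not listed gets a generated Import-Package entry -->
                    <Import-Package>*</Import-Package>
                </instructions>
            </configuration>
        </plugin>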

I suspect that these bnd-based tools can also be a route to solving my automated-testing woes. For the Open Liberty Runtime project, I don't have to worry about that, since it's so dependent on running in actual Domino that the return-on-investment for setting up JUnit tests wouldn't be worth it. However, I recall seeing some Maven testing plugin that let you spawn an OSGi environment of your choice, and I think that something like that may be able to replace Tycho for me there.

Since p2 repositories/update sites are entirely an Eclipse-ism, most OSGi tooling doesn't care about them. That's where p2-maven-plugin comes in. Not only will it allow you to create p2 repositories, but it lets you define features in the configuration, meaning they don't have to be separate modules like in Tycho. And not only that, but it will also auto-OSGi-ify any Maven dependencies you bring in if they don't already have OSGi bundle information. It also lets you override existing bundle data on the fly if needed, such as if the dependencies and imports conflict with something on Domino.
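A minimal configuration is along these lines (coordinates and version are placeholders for illustration):

        <plugin>
            <groupId>org.reficio</groupId>
            <artifactId>p2-maven-plugin</artifactId>
            <version>1.3.0</version>
            <executions>
                <execution>
                    <id>generate-p2-site</id>
                    <phase>package</phase>
                    <goals>
                        <goal>site</goal>
                    </goals>
                    <configuration>
                        <artifacts>
                            <!-- Plain Maven dependencies; non-OSGi ones get bnd-generated manifests -->
                            <artifact>
                                <id>com.example:some-library:1.0.0</id>
                            </artifact>
                        </artifacts>
                    </configuration>
                </execution>
            </executions>
        </plugin>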

Eclipse Friendliness

Since I still use Eclipse to develop these projects, I want to be able to make use of the XPages SDK's ability to run Domino's HTTP stack pointed at my active workspace. For that to work, I need to be able to get Eclipse to recognize my projects as functional PDE-compatible bundles even if I'm not using PDE for them. Fortunately, that process isn't difficult: once I set the location for MANIFEST.MF to be in "META-INF" in the project root, maven-bundle-plugin started generating the files there instead of within "target", and Eclipse started working with the projects as OSGi bundles. The only thing left to do then was to gitignore the generated files, since they don't need to be checked into source control anymore.
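That amounts to one configuration element in the plugin (sketch):

        <plugin>
            <groupId>org.apache.felix</groupId>
            <artifactId>maven-bundle-plugin</artifactId>
            <configuration>
                <!-- Emit MANIFEST.MF into the project root so Eclipse treats the project as a bundle -->
                <manifestLocation>META-INF</manifestLocation>
            </configuration>
        </plugin>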

Future Improvements

The big thing that is still an open problem is dealing with testing. I have some ideas for taking a swing at it, but for now it's the main thing preventing me from doing this for all of my Tycho projects.

Beyond that, I want to look a bit into bnd-maven-plugin. This diverges from maven-bundle-plugin in that it's geared towards using bnd configuration files directly. During the build process, I think the results would be the same, since maven-bundle-plugin can already pass through whatever configuration I want, but it would be a better match for the Eclipse bndtools tooling. Additionally, externalizing the bnd config files would mean they'd be the same if I decided to switch to Gradle, as Open Liberty uses.

Finally, and specific to this Open Liberty project, I may want to consider using bnd to generate Liberty Feature manifests, as Liberty itself does. These features are implemented as OSGi "subsystems" packaged as .esa files. Currently, I'm using esa-maven-plugin to generate their specialized manifests, but I've already hit some limitations in the area of cross-feature dependencies. Apparently, bnd takes some wrangling to suit this, but it's worth it. I'll consider that one a "stretch goal", though.

For now, I'm pretty pleased with the new setup. The projects still work on Domino, I can run them on there from the workspace, I was able to eliminate the p2 feature projects outright, and now I don't have to worry about packaging up a dependencies site just to have something to point at in Eclipse. Heck, I can even use Visual Studio Code now! It's pretty nice.

Developing Open Liberty Features, Part 2

Aug 18, 2019 9:14 AM

  1. Developing an Open/WebSphere Liberty UserRegistry with Tycho
  2. Developing Open Liberty Features, Part 2
  3. Converting Tycho Projects to maven-bundle-plugin, Initial Phase

In my earlier post, I went over the tack I took when developing a couple extension features for Open Liberty, and specifically the way I came at it with Tycho.

Shortly after I posted that, Alasdair Nottingham, the project lead for Open Liberty, dropped me a line to mention how programmatic service registration isn't preferred, and instead the idiomatic way is to use Declarative Services. I had encountered DS while fumbling my way through to getting these things working, but I had run into some bit of trouble or another, and I ended up settling on what I got working and not revisiting it.

Concepts

This was a perfect opportunity to go back and do things the right way, though, so I set out to do that this morning. In my initial reading up, I ran across a blog post from the always-helpful Vogella Blog that talks about coming at OSGi DS from essentially the same perspective I have: namely, having been used to Equinox and the Eclipse plugin/extension mechanism. As it turns out, when it comes to generic OSGi, Equinox can kind of poison your brain. The whole term "plug-in" instead of "bundle" comes from earlier Eclipse; "features", "update sites", and all of p2 are entirely Equinox-specific; and the "plugin.xml" extension mechanism is of a similar vintage. However, unlike some other vestiges that were tossed aside, "plugin.xml" is still in active use.

At its core, the Declarative Services system is generally similar to that route, in that you write classes to implement a given interface and then declare that your bundle provides that using XML files. The specifics are different - DS is more type-safe and it uses individual XML files in the "OSGI-INF" directory for each service - but the concept is similar. DS also has an annotation-based mechanism for this, which allows you to annotate your service classes directly and not worry about maintaining XML files. It's something of a compiler trick: the XML files still exist, but your tooling of choice (PDE, bnd, etc.) will generate the files based on your annotations. It took a bit for Eclipse to get on board with this, but, as of Neon, you can enable this processing in the preferences.

Implementation

Fortunately for me, my needs are simple enough that making the change was pretty straightforward. The first step was to delete the Activator class outright, as I won't need it for this. The second was to add an optional import for the org.osgi.service.component.annotations package in my Liberty extension bundle. I suspect that this is a bit of a PDE-ism: the annotations aren't even retained at runtime (and the package isn't present in the Liberty server), but this is the only mechanism Eclipse has to add a dependency for a plug-in project.
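In MANIFEST.MF terms, that's a one-liner:

Import-Package: org.osgi.service.component.annotations;resolution:=optional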

The annotation for the user registry was as straightforward as can be, needing a single line in this heavily-clipped version of the class:

@Component(service=UserRegistry.class, configurationPid="dominoUserRegistry")
public class DominoUserRegistry implements UserRegistry {
}

With that, Eclipse started generating the associated XML file for me, and the registry showed up at runtime just as it had before.

The TrustAssociationInterceptor was slightly more complicated because it had some extra initialization properties set, in particular the one to mark it as executing before normal SSO. This was a little tricky in two ways: Java annotations don't have any mechanism for specifying a literal Map for properties, and the before-SSO property is a boolean, but I could only write a string. It turned out that the property, uh, property on the annotation has a little mini-DSL where you can mark a property with its type. The result was:

@Component(
	service=TrustAssociationInterceptor.class,
	configurationPid=DominoTAI.CONFIG_PID,
	property={
		"invokeBeforeSSO:Boolean=true",
		"id=org.openntf.openliberty.wlp.userregistry.DominoTAI"
	}
)
public class DominoTAI implements TrustAssociationInterceptor {
}

Further Features

This is proving to be a pretty fun side project within a side project, and I think I'll take a crack at developing some more features when I have a chance. In particular, I'd like to try developing some API-contribution features so that they can be deployed to the server once and then used by web apps without having to package them (similar in concept to XPages Libraries). This is how Liberty implements its Jakarta EE specifications, and I could see making some extra ones. That's also exactly what CrossWorlds does, and so I imagine I'll crib a bunch of that work.