Showing posts for tag "java"

XPages JEE 2.15.0 and Plans for JEE 10 and 11

Fri Feb 16 15:30:40 EST 2024

  1. Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue
  2. JSP and MVC Support in the XPages JEE Project
  3. Migrating a Large XPages App to Jakarta EE 9
  4. XPages Jakarta EE Support 2.2.0
  5. DQL, QueryResultsProcessor, and JNoSQL
  6. Implementing a Basic JNoSQL Driver for Domino
  7. Video Series On The XPages Jakarta EE Project
  8. JSF in the XPages Jakarta EE Support Project
  9. So Why Jakarta?
  10. XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall
  11. Adding Concurrency to the XPages Jakarta EE Support Project
  12. Adding Transactions to the XPages Jakarta EE Support Project
  13. XPages Jakarta EE 2.9.0 and Next Steps
  14. XPages JEE 2.11.0 and the Javadoc Provider
  15. The Loose Roadmap for XPages Jakarta EE Support
  16. XPages JEE 2.12.0: JNoSQL Views and PrimeFaces Support
  17. XPages JEE 2.13.0
  18. XPages JEE 2.14.0
  19. XPages JEE 2.15.0 and Plans for JEE 10 and 11

Today, I released version 2.15.0 of the XPages Jakarta EE project. As is often the case lately, this version contains bug fixes but also a few notable features:

  • You can now specify Servlets in WEB-INF/web.xml (as opposed to just via the @WebServlet annotation). This is helpful for defining a Servlet when the actual implementation is in a JAR or when following non-annotation-based examples (see the sketch below)
  • You can now specify context-param values in WEB-INF/web.xml in the NSF and META-INF/web-fragment.xml in JAR design elements, which will be available to JSP, JSF, JAX-RS, @WebServlet-annotated Servlets, and web.xml-defined Servlets
  • Added @BooleanStorage annotation for NoSQL entities to define how boolean values are converted to note items
  • Added CRUD operations for calendar events to NoSQL, around a few new methods on Repository. This exposes some of the capabilities of NotesCalendar and can be used for, for example, providing an iCalendar feed based on a mail database. To go with that, XPages JEE also re-exports iCal4J as included in the Domino stack for NSF use, though this API is... not smooth

The first two here are focused around bringing NSFs more in line with "normal" Jakarta EE applications, while the latter two are some nice improvements for the NoSQL driver. I hope to put the last one in particular to good use - for example, OpenNTF's site will be able to provide a calendar of webinars and other events that we can manage internally using a normal Notes calendar, and that sounds nice to me.
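
As a rough illustration of those first two items, here's a minimal sketch of what such a web.xml might look like. The servlet class, URL pattern, and parameter names here are hypothetical placeholders, not anything shipped with the project:

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="https://jakarta.ee/xml/ns/jakartaee" version="5.0">
	<!-- Hypothetical context-param, visible to JSP, JSF, JAX-RS, and Servlets -->
	<context-param>
		<param-name>app.exampleSetting</param-name>
		<param-value>some value</param-value>
	</context-param>

	<!-- Hypothetical Servlet whose implementation could live in a JAR design element -->
	<servlet>
		<servlet-name>ExampleServlet</servlet-name>
		<servlet-class>com.example.ExampleServlet</servlet-class>
	</servlet>
	<servlet-mapping>
		<servlet-name>ExampleServlet</servlet-name>
		<url-pattern>/exampleServlet</url-pattern>
	</servlet-mapping>
</web-app>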

Next Versions

I still have the 3.x branch of the project chugging along, and I think it'll be ready for a real release before too long. Since it'll be a breaking-changes release thanks to upstream changes, I'm using it as an opportunity to consolidate the sprawl of features and XPages Libraries. Currently, my plan is:

  • One for "core", covering most things in the Jakarta EE Core Profile, plus the other utility specs I've implemented: Transactions, Bean Validation (which really should be in Core in my estimation), Concurrency, Servlet, and so forth, plus Data and NoSQL
  • One for "UI", covering Jakarta Pages, Jakarta Faces, and MVC - basically, the stuff you could use to replace XPages to make an HTML-generating app in your NSF
  • One for MicroProfile, or at least the specs I've implemented so far. I'm a little tempted to wrap this into Core, since things like OpenAPI are useful almost all the time, but it's a clean-enough separation that it'll be fine

This will require Domino 14, since Jakarta EE 10 requires at least Java 11.

That brings me to some unexpected good news: though Jakarta EE 11 was long planned to use Java 21 as its minimum version (since 21 is the current LTS), it looks like they've switched to making Java 17 the baseline. For me, this is a little sad in an idealistic sense, since it pushes things like Virtual Threads out of the realm of being a core part of JEE, but I'm very happy that I'll be able to use all JEE 11 specs in Domino 14. Even if Domino 15 used Java 21, it'd still be a long while before that came, and we'd lag behind the standard for at least a year. Instead, this puts the project back in line with upstream, and allows me personally to potentially resume committing to Jakarta NoSQL - I'd been out of the loop for a very long time when it moved to 11 and then 17 as its required version.

I don't know right now whether JEE 11 will be the same sort of breaking change for the project (which would mean a 4.x release) or if I'll be able to make it a 3.x one - the specs aren't out yet, so time will tell. The big focus of 11 will be further centralization on CDI instead of EJB, and I'm all for it.

My plan is to get 3.x out for Domino 14, based on JEE 10, as soon as time allows, and then I'll start looking into bumping to JEE 11 when it releases in the summer.

Notes/Domino 14 Fallout

Fri Dec 15 11:53:44 EST 2023

  1. AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Domino 11's Java Switch Fallout
  3. fontconfig, Java, and Domino 11
  4. Notes/Domino 12.0.2 Fallout
  5. Notes/Domino 14 Fallout

Notes and Domino 14 are out now and, as I discussed back in June, the big deal for me is the move to Java 17. This also came with a refresh of the Eclipse innards, from Neon (circa 2016) to 2021-12 (circa, uh, 2021). The Eclipse update is welcome, but so far it's been less impactful than the Java update - at some point, I'll want to see if some of the current-era Eclipse plugins work here, but that's for the future.

In the mean time, there's a bunch to know, so let's get to it! I've broken this one down into "critical" and "less critical" sections, since this post will likely have a similar life to my target platform one.

Critical To Know

ndext

I mentioned this in June, but an important thing to know about Java 17 is that "jvm/lib/ext" no longer exists. Fortunately, Notes and Domino have long had a secondary location for this sort of thing: "ndext" in the program directory. Any JARs in this folder are available at runtime on both platforms, and so the quick fix is to move anything you had in "jvm/lib/ext" to "ndext".

...but.

When it comes to Notes, there's a distinct difference between "available at runtime" and "on the build path". While Java agents here are fine - they modified the editor to include "ndext" in the build path - there's no such accommodation for XPages. So far, the best workaround I've thought of is to add any such JARs manually to the JRE definition in Eclipse's preferences:

Screenshot of Designer's JRE preferences, showing the addition of an ndext JAR

When doing this, there's a huge caveat: do not just add everything from this directory to the JRE. Most of it will be redundant, like the SWT stuff, but some will be actively harmful, in particular "jsdk.jar". That's an even-older Servlet version than comes with XPages, and including that will cause Designer to think that methods added in Servlet 2 aren't present. Only add the JARs you're extending it with.

Ideally, this will be remedied one way or another in the future, but for now we'll have to do this to account for it. The silver lining may be that it's a good impetus to OSGi-ify your dependencies if possible.

Poi Remains

This isn't new, but it's worth mentioning while we're on the topic of "ndext". Since Notes 11, the client ships with an old version of Apache Poi, but this is painfully distributed right in the JVM and not at the OSGi layer. Newer versions of Poi 4 XPages deal with this, but it's important to know that, while Poi is in Notes, it's not in Domino. If you write agents using these classes, or add them to your JRE and use them in XPages, you'll also need to deploy them to Domino.

Java Policy Location

This one's more of a note, and it's reiterating something from June: the location of "java.policy" and (if you add it) "java.pol" changed since Java 8. They used to be in "jvm/lib/security" but they're in "jvm/conf/security" now. They work the same as before, as does putting a file named ".java.policy" in the Domino user's home dir, so the other characteristics haven't changed.

sun.misc.BASE64Decoder

Back in Domino 11, the move from IBM J9 to OpenJ9 meant that some IBM internal forks of Sun internal classes were no longer available. The most notable of these were the BASE64 classes com.ibm.misc.BASE64Encoder and BASE64Decoder. These were very popular to use in the pre-Java-8 days, before java.util.Base64 existed, and so it was worth noting that they were gone then.

Well, sun.misc.BASE64Encoder has now met a similar fate - it's probably still in there somewhere, but it's no longer accessible by user code. If you haven't made the switch to java.util.Base64, do so now.
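
For reference, the replacement is just the stock JDK API and needs nothing Domino-specific; a quick sketch:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Example {
	public static void main(String[] args) {
		// Encode bytes to a Base64 string
		String encoded = Base64.getEncoder()
			.encodeToString("Hello, Domino".getBytes(StandardCharsets.UTF_8));

		// Decode back to the original text
		String decoded = new String(Base64.getDecoder().decode(encoded), StandardCharsets.UTF_8);

		System.out.println(encoded + " -> " + decoded);
	}
}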

Target Platform Bug

This is another note, but this classic bug that's been with us since 9.0.1FP10 is... fixed, I think! I'd thought at first that it remained, since the Target Platform config still just lists the Eclipse home dir, but installing and using an XPages library seemed to work properly without further change. Good!

Java Compiler Level

This is a fiddly one. Though Designer uses Java 17 by default now, it has the ability to compile down to older versions for past compatibility, such as running on a pre-14 server. Java agents have had their own way of dealing with this for a while, though it seems like maybe they default to Java 8 now, which is good. With XPages, it seems like it's a little less obvious.

From my experience, this is what I've found:

  • Existing projects may have a specific compiler setting specified (similar to agents in the above-linked post) and so will remain as Java 1.8
  • New projects seem to use the active workspace setting, which is where the next two points come in
  • Upgrades of Designer in place will keep the default compiler level for the workspace at 1.8
  • A fresh installation sets the compiler level to 17

This is all... fine. It'd be nice if it was a little more obvious to the developer (like if it was controlled by the xsp.properties setting for minimum version), but this is more or less the Eclipse way of doing things. We'll just have to be aware for a while of these settings and their interactions when developing for pre-14 deployment. If you're targeting an older server, you should make sure that you go to Project Properties for your NSF and set the Java Compiler level to fit:

Screenshot of the Java Compiler settings for an NSF

Less Critical

Alright, that covers the big-ticket stuff, but there are a few more things to know.

Java Deprecation Warning On HTTP Start

This one is documented, though oddly in the "no longer included" page: when you start HTTP on Domino, you'll see a message like this:

WARNING: A terminally deprecated method in java.lang.System has been called
WARNING: System::setSecurityManager has been called by lotus.notes.AgentSecurityManager (file:/C:/Domino/ndext/Notes.jar)
WARNING: Please consider reporting this to the maintainers of lotus.notes.AgentSecurityManager
WARNING: System::setSecurityManager will be removed in a future release

This is actually totally fine. As it says, the whole SecurityManager apparatus is gone in future versions, but Notes and Domino still use it for agents (and, unfortunately, XPages). While I long for the day when I never have to think about it again, this is a reasonable-enough compromise for 14 as a "transitional" version in its Java journey. So... you can ignore this and not worry.

"Java Main Sources" and "Java Test Sources" Working Sets

If you use Working Sets in Designer, you may notice two entries not of your creation:

Screenshot of the 'Select Working Set' dialog in Designer 14

These showed up in Eclipse somewhere along the line, presumably to make it easier for people to select those types of projects without manually managing their working sets. Designer inherits this and HCL didn't remove them, but you're free to delete them if you want.

"AbstractCompiledPage cannot be resolved"

This is another one that came up back in 9.0.1FP10 and it's still here, but it's less of an issue: every once in a while, when building a project, Designer will complain about the AbstractCompiledPage class not being found. In my experience, this only shows up temporarily during a build, but it's worth noting that, if it does stick around for you, a Project - Clean should fix it.

JAX-B and CORBA

After Java 8, a few Java EE components were removed from the normal JRE distribution in favor of developing them in Java EE, then Jakarta EE. JAX-B is one of them (now Jakarta XML Binding) - we don't normally use this directly, but it comes up sometimes, either directly (likely as one of the other old-timey BASE64 workarounds) or as a dependency. It shouldn't matter in Designer, but, if you're writing Domino-targeting code outside Designer, you may need to be aware. One way or another, you should add this as a dependency - in a basic case, you can add "jakarta.xml.bind-api.jar" and "jaxb-impl.jar" from "ndext" to your build path.

Similarly, CORBA support classes ("org.omg" stuff, usually) used to come with the JRE and now don't. To work around this, HCL did what I've done in the past and included "glassfish-corba-omgapi.jar", in their case putting it in "ndext". If you're using Notes.jar with Java > 8 outside Designer, you'll need this one too.

Conclusion

I think that covers it for now. Considering how much changed with Java from 8 to 17, this could have been a lot rougher, though I fear that some of these workarounds will plague us for a long time.

In the mean time, I'd appreciate it if you vote for this Aha idea to keep Domino on a good Java update cadence. It'd be a shame if we sit on 17 right up until the very end of its life, as we did with 6 and 8, in part since these multi-version moves are more painful than single-LTS updates.

Overdue CollabSphere Followup

Wed Oct 11 16:59:29 EDT 2023

Though it's been over a month since CollabSphere 2023, I've somehow not yet gotten around to talking about it here. Time to remedy that!

Webinar and Workshop

I presented a couple sessions at CollabSphere, but the meatiest was my workshop on the XPages Jakarta EE project. A bit before the show, I wrote a post discussing modes of development with it and that formed the structure of the workshop.

It also formed the structure of August's OpenNTF webinar. Fortunately, unlike the workshop, the webinar was recorded, and that is up on OpenNTF's YouTube channel. Though the workshop was longer and had some refinements and audience discussion, they both worked from the same original slide deck and so the webinar did a pretty good job of covering the same material.

As part of the presentations, I created four versions of the same basic to-do tracker app, written in each of the four modes I talked about. I put them up in the project repository with a quick README introducing each of them. With the project installed (they were written to 2.13.0 and above), you should be able to sync the ODPs with NSFs in Designer and poke around yourself. They're all intentionally-similarly structured and basic, with all of their elements in either Java classes, file resources, or stylesheets. They're also all intentionally under-developed, so you're not allowed to make fun of them.

JNX

This one will be a doozy! In fact, it's such a doozy that I'm going to keep kicking the can down the road as far as properly talking about it.

In short, Domino JNX is a more-modern API for Domino access - it was initiated and is used heavily by the Domino REST API (DRAPI), but also stands on its own. It started out essentially as another swing at Karsten Lehmann's Domino JNA and shares a lot of the same ideas and approaches (and code - the committer info in the GitHub repo is deceptive, as I got to paste a whole mountain of existing code from another repo into this one and thus got credit for it).

Now that it's open source, there's some significant work to do as far as documentation, examples, integration, and distribution go. I plan to find time to improve a lot of these aspects, including blogging here, though it's the sort of thing that always ends up below other priorities.

In any event, it's quite neat and has a lot of good capabilities. I plan to write a driver for JNoSQL for it, either to be alongside or to wholly supplant the Notes.jar-based one currently used by the project. The neat thing there is that users of XPages JEE won't have to care much: it'll just get a bit faster in parts but should largely work the same, except maybe with a change to the way you point at other databases.

Modes Of App Development With XPages Jakarta EE

Fri Jul 28 11:46:50 EDT 2023

I've been working on my workshop for this year's CollabSphere, and one of the main decisions I have to make is what I'm going to focus on. The idea of the workshop is to give a bit more brass-tacks information about how to use the project: rather than just a list of features, it'll be about the specific business of building an app using it.

But how does one build an app in it? There's certainly no lack of tools available, but that leads to the opposite problem: what's the right one for your project? What's likely to be the most common path people take?

The Types

As I've been working on it, I've grouped things into four main categories, and I figured it'd be useful to enumerate them here to coordinate my thoughts and provide some general information. There aren't hard lines between these: you can use any mixture of some or all of the parts in an app, and do different mixes in different apps. These are just what I expect to be the main groupings:

  • "XPages Plus", using some new capabilities in existing or new apps with XPages-based UIs
  • REST services, focusing on providing REST endpoints for JavaScript-based apps or other servers
  • MVC and JSP, focusing on clean, lightweight UIs for document-based apps, but less ideal for complex business logic
  • JSF, building the same sorts of apps XPages is adept at, but using newer technology

"XPages Plus"

The first route is how the project got started: you keep building XPages apps but sprinkle in a few new capabilities to improve them.

For example, you could replace your managed beans defined in faces-config.xml with CDI beans, allowing you to get the quick benefit of annotation-based definitions and then the bigger benefits of @Inject, producer methods, and interceptors.
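
For example, a bean that would previously have been declared in faces-config.xml could become a CDI bean along these lines - the class and property names here are hypothetical, but the annotations are standard CDI:

package example;

import java.io.Serializable;

import jakarta.enterprise.context.SessionScoped;
import jakarta.inject.Inject;
import jakarta.inject.Named;

@Named("userInfo")  // available in EL as #{userInfo}, much like a classic managed bean
@SessionScoped      // replaces the managed-bean scope from faces-config.xml
public class UserInfoBean implements Serializable {
	private static final long serialVersionUID = 1L;

	@Inject
	private AppConfigBean config; // hypothetical other bean, injected rather than looked up manually

	public String getDisplayName() {
		// Compute whatever the page needs, using the injected bean as desired
		return config.getAppName();
	}
}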

You could also start using newer EL features, like the long-desired ability to pass parameters to methods.

This path wouldn't necessarily require a lot of reworking of your app or changing the way you think about XPages development, but would still be something of a minor development refresh and can set you up well for future improvements.

Your data access will likely still be through the traditional xp:dominoDocument and xp:dominoView components, but you could also write beans that access data with lotus.domino or ODA, or switch to using the NoSQL driver.

REST Services

Alternatively, you could decide you want to focus your apps around REST services with either a JavaScript app in, for example, React as the front end, or providing services to remote servers.

With this, you'd largely stop using XPages design elements entirely, instead defining your services in Java classes with JAX-RS annotations. This brings huge advantages over other ways to write REST services on Domino, with the JAX-RS annotations allowing for clear, logical definition of services, their parameters, and their output. Moreover, the ancillary tooling brings things like automatic OpenAPI definitions, which would be annoying to maintain using things like the XPages-side REST controls.
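
As a minimal sketch of what that looks like in practice (the class and item names here are made up for illustration):

package rest;

import java.util.LinkedHashMap;
import java.util.Map;

import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("employees")
public class EmployeeResource {
	@GET
	@Path("{unid}")
	@Produces(MediaType.APPLICATION_JSON)
	public Map<String, Object> getEmployee(@PathParam("unid") String unid) {
		Map<String, Object> result = new LinkedHashMap<>();
		result.put("unid", unid);
		// In practice, look these up via lotus.domino, ODA, or the NoSQL driver
		result.put("firstName", "Foo");
		result.put("lastName", "Fooson");
		return result;
	}
}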

This path is good if you're specifically aiming to build a JavaScript-based app, either because you just like it, because your organization decided to go that route, or if you have a larger team that splits the duties of front-end and back-end developers. It can also naturally blend into the next one.

Your data access here won't be through the XPages components, but you could still use lotus.domino or ODA classes, or switch to the NoSQL driver. That actually goes for the next two, too, so we'll just count that as assumed.

MVC and JSP

I'll admit that part of the reason I want to consider this a top-tier route is because I just personally really like it. I've had a blast writing apps like this blog and the OpenNTF site using this path, with its much-cleaner code and back-to-basics approach to HTML.

Regardless of my personal enjoyment of it, though, this has some nice advantages. The fact that MVC builds on top of JAX-RS means that it melds well with the REST-services approach above. For example, you might primarily write REST services for a JS app, but then do a set of "admin" pages with MVC. Or you might use this as part of the prototype phase: structure your app the same way you will when you expand to a multi-tier team, but start out by doing a quick UI with MVC on top of the same or related endpoints.

With this path, your app will start with Java classes with JAX-RS annotations, and then you'd mix it with JSP files inside WebContent/WEB-INF. One downside to this approach is that Designer doesn't provide much help for writing JSP files. In the tooling, I bind .jsp and .tag files to the HTML editor, so you at least get normal HTML assistance, but that won't help you with specific JSP tags and EL. Fortunately, the set of tools you'll likely use in JSP is comparatively small, so you'll eventually memorize things like <c:forEach items="..." var="...">...</c:forEach> in much the same way that you could eventually write out an <xp:repeat/> in your sleep in XPages.
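
A controller in this style is just a JAX-RS resource with a couple extra annotations; here's a rough sketch, with the paths and names being placeholders:

package controller;

import java.util.Collections;

import jakarta.inject.Inject;
import jakarta.mvc.Controller;
import jakarta.mvc.Models;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;

@Path("people")
@Controller
public class PeopleController {
	@Inject
	private Models models; // MVC's bag of values, exposed to the JSP via EL

	@GET
	public String list() {
		// In practice, this would come from a repository or other data-access bean
		models.put("people", Collections.emptyList());
		// Resolved to a JSP in the app, e.g. WEB-INF/views/people.jsp
		return "people.jsp";
	}
}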

JSF

This one, technically tricky though it may be, is conceptually straightforward: write the same sort of apps you do with XPages, but do it with modern JSF instead. This makes a lot of sense, since JSF shares XPages's acumen with complicated forms with partial refreshes and changing state data, but has benefited from some development that didn't happen on the XPages side.

It's not a direct replacement: in particular, JSF has no knowledge of Domino data sources, so there's no xp:dominoDocument or xp:dominoView. You'd still need to do your data access via beans, as in the previous two options, likely using either lotus.domino/ODA or the NoSQL driver. Additionally, Designer really doesn't help you here - again, I map .xhtml and .jsf files to the HTML editor, but JSF components have a lot of properties to set, and so you'll be spending a lot of time referencing documentation.

Still, it's clear why this is proving to be a popular path. The development model is the same as in XPages, while the JSF stack (especially including PrimeFaces) brings a lot of amenities that aren't in XPages and are also more portable to other environments.

Conclusion

So, for now, I'm thinking of splitting up the workshop to cover each of these paths a bit. That runs the risk of feeling like too much of a grab bag, but I don't want to give the opposite impression, that the project only allows for some specific path. It's a broad platform update, accommodating many development approaches, and I want to keep that clear. Fortunately, each path has a pretty-clean pitch, and the shared components (CDI, bean validation, the REST client, etc.) build on each other well, so the idea that it's a pool of features that you can swim in is, I think, compelling.

Kicking the Tires on Domino 14 and Java 17

Fri Jun 02 13:50:53 EDT 2023

Tags: domino java

As promised, HCL launched the beta program for Domino 14 the other day. There's some neat stuff in there, but I still mostly care about the JVM update.

I quickly got to downloading the container image for this to see about making sure my projects work on it, particularly the XPages JEE Support project. As expected, I had a few hoops to jump through. I did some groundwork for this a while back, but there was some more to do. Before I get to some specifics, I'll mention that I put up a beta release of 2.13.0 that gets almost everything working, minus JSP.

Now, on to the notes and tips I've found so far.

AccessController and java.policy

One of the changes in Java 17 specifically is that our old friend AccessController and the whole SecurityManager framework are marked as deprecated for removal. Not a minute too soon, in my opinion - that was an old relic of the applet days and was never useful for app servers, and has always been a thorn in the side of XPages developers.

However, though it's deprecated, it's still present and active in Java 17, so we still have to deal with it. One important thing to note is that java.policy moved from the "jvm/lib/security" directory to "jvm/conf/security". A while back, I switched to putting .java.policy in the Domino user's home directory and this remains unchanged; I suggest going this route even with older versions of Domino, but editing java.policy in its new home will still work.
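
If you go the home-directory route, the file format itself is unchanged; a typical (and intentionally very permissive) example looks like this - scope the grants down to suit your actual needs:

grant {
	permission java.security.AllPermission;
};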

Extra JARs and jvm/lib/ext

Another change is that "jvm/lib/ext" is gone, as part of a general desire on Java's part to discourage installing system-wide libraries.

However, since so much Java stuff on Domino is older than dirt, having system-wide custom libraries is still necessary, so the JVM is configured to bring in everything from the "ndext" directory in the Domino program directory in the same way it used to use "jvm/lib/ext". This directory has actually always been treated this way, and so I started also using this for third-party libraries for older versions, and it'll work the same in 14. That said, you should ideally not do this too much if it's at all avoidable.

The Missing Java Compiler

I mentioned above that everything except JSP works on Domino 14, and the reason for this is that V14 as it is now doesn't ship with a Java compiler (JSP, for historical reasons, does something much like XPages in that it translates to Java and compiles, but this happens on the server).

I originally offhandedly referred to this as Domino no longer shipping with a JDK, but I think the real situation was that Domino always used a JRE, but then had tools.jar added in to provide the compiler, so it was something in between. That's why you don't see javac in the jvm/bin directory even on older releases. However, tools.jar there provided a compiler programmatically via javax.tools.ToolProvider.getSystemJavaCompiler() - on a normal JRE, this API call returns null, while on a JDK (or with tools.jar present) it'd return a usable compiler. tools.jar as such doesn't exist anymore, but the same functionality is bundled into the newer-era weird runtime archives for efficiency's sake.
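
You can see this situation for yourself with the standard javax.tools API - this little check is plain JDK code, nothing Domino-specific:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompilerCheck {
	public static void main(String[] args) {
		JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
		if (compiler == null) {
			// This is the current situation on the Domino 14 beta: no system compiler available
			System.out.println("No system Java compiler available");
		} else {
			System.out.println("Found compiler: " + compiler.getClass().getName());
		}
	}
}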

So this is something of a showstopper for me at the moment. JSP uses this system compiler, and so does the XPages Bazaar. The NSF ODP Tooling project uses the Bazaar to do its XSP -> Java -> bytecode compilation of XPages, and so the missing compiler will break server-based compilation (which is often the most reliable compilation currently). And actually, NSF ODP Tooling is kind of double broken, since importing non-raw Java agents via DXL is currently broken on Domino 14 for the same reason.

I'm pondering my options here. Ideally, Domino will regain its compiler - in theory, HCL could switch to a JDK and call it good. I think this is the best route both because it's the most convenient for me but also because it's fairly expected that app servers run on a JDK anyway, hence JSP relying on it in the first place.

If Domino doesn't regain a compiler, I'll have to look into something like including ECJ, Eclipse's redistributable Java compiler. I'd actually looked into using that for NSF ODP Tooling on macOS, since Mac Notes lost its Java compiler a long time ago. The trouble is that its mechanics aren't quite the same as the standard one, and it makes some more assumptions about working with the local filesystem. Still, it'd probably be possible to work around that... it might require a lot more of extracting files to temporary directories, which is always fiddly, but it should be doable.

Still, returning the built-in one would be better, since I'm sure I'm not the only one with code assuming that that exists.

Overall

Despite the compiler bit, I've found things to hold together really well so far. Admittedly, I've so far only been using the container for the integration tests in the XPages JEE project, so I haven't installed Designer or run larger apps on it - it's possible I'll hit limitations and gotchas with other libraries as I go.

For now, though, I'm mostly pleased. I'm quite excited about the possibility of making more-dramatic updates to basically all of my projects. I have a branch of the XPages JEE project where I've bumped almost everything up to their Jakarta EE 10 versions, and there are some big improvements there. Then, of course, I'm interested in all the actual language updates. It's been a very long time since Java 8, and so there's a lot to work with: better switch expressions, text blocks, some pattern matching, records, the much-better HTTP client, and so forth. I'll have a lot of code improvement to do, and I'm looking forward to it.

The Loose Roadmap for XPages Jakarta EE Support

Thu May 04 10:29:44 EDT 2023

  1. Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue
  2. JSP and MVC Support in the XPages JEE Project
  3. Migrating a Large XPages App to Jakarta EE 9
  4. XPages Jakarta EE Support 2.2.0
  5. DQL, QueryResultsProcessor, and JNoSQL
  6. Implementing a Basic JNoSQL Driver for Domino
  7. Video Series On The XPages Jakarta EE Project
  8. JSF in the XPages Jakarta EE Support Project
  9. So Why Jakarta?
  10. XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall
  11. Adding Concurrency to the XPages Jakarta EE Support Project
  12. Adding Transactions to the XPages Jakarta EE Support Project
  13. XPages Jakarta EE 2.9.0 and Next Steps
  14. XPages JEE 2.11.0 and the Javadoc Provider
  15. The Loose Roadmap for XPages Jakarta EE Support
  16. XPages JEE 2.12.0: JNoSQL Views and PrimeFaces Support
  17. XPages JEE 2.13.0
  18. XPages JEE 2.14.0
  19. XPages JEE 2.15.0 and Plans for JEE 10 and 11

At Engage, HCL officially announced Java 17 in Domino 14 (I'm sure they announced other things too, but I have my priorities). This will allow me to do a lot in pretty much all of my projects, but it's particularly pertinent to XPages JEE.

Currently, the project generally targets Jakarta EE 9, which came out in late 2020 and was "just" a switch from javax.* to jakarta.*, with no official new features. However, Jakarta EE 10 came out a year ago - in addition to bringing a raft of new features, it also bumped the minimum Java version to Java 11, pushing it outside of Domino's realm. Accordingly, I've had to hold off on a lot of major- and minor-version bumps in the XPages JEE project as new releases started being compiled for Java 11. Once V14 is out, though, I'll be able to move to the current JEE platform... at least until JEE 11 comes out next year and requires Java 21, anyway.

So I've been working on how I'm going to approach this, and what I'm thinking is that I'll do it in two phases: first, a final 2.x release that provides Java 17/Domino 14 compatibility for existing components, and then a new 3.x breaking-changes release to bring in Jakarta EE 10 components.

The Final 2.x Release

I currently have this penciled in for the next release, 2.12.0, but that may change if I decide I want to get a real 2.12.0 release out before Domino 14 is at least in stable beta form. Let's call it "2.99.0" for now.

The idea here will be that I'll want to make sure all existing code in NSFs continues to work unchanged: upgrade your server to V14, install 2.99.0, and your apps keep working. In theory, this shouldn't be too complex. There's some shimming needed for Weld (the CDI implementation) to account for changes from Project Jigsaw in Java 9 and later, and there might be some stuff around AccessController, but in general I expect it'll just be some tweaks here and there. Time will tell, of course.

Once that's out, I plan to not look back (unless there's demand, I suppose). The switch to Java 17 is a huge deal, and I don't think it'll be worth spending much more time with Java 8 once it's no longer required. The 2.x branch is already, I feel, in a pretty good place, so I'll feel comfortable having a stable final version.

The Breaking 3.0 Release

Then, the plan will be to start down the path of 3.x with breaking changes - not everything, but some. For one, JEE 10 has a handful of backwards-incompatible changes. Those are mostly for legacy true-JEE code, though, and the main ones that XPages JEE code will likely want to be aware of will be the switch of XML namespaces to shorter representations. That will affect JSP and JSF code, but the old URIs (the jcp.org ones) will continue to work, at least for a while.

Most of the breaking changes will probably happen internally. I've talked for a long while now about my desire to do some reorganization of the project. The big one is wrangling the proliferation of Eclipse Features and XPages Libraries. Anyone who has installed the project in Designer is well aware of just how many times you have to click "yes, I want to install the thing I'm installing", and that alone is enough to warrant a reorganization. Beyond that, though, I've had to take care to try to make it so that the individual components don't depend on each other unnecessarily. There's a certain amount of good discipline that provides, but it eventually wears a bit thin.

I'm not quite sure what form the consolidation will take, but it'll probably be something like three features: "core", "extended", and "MicroProfile". "Core" would probably roughly map to the actual Jakarta Core Profile, plus things that I find essentially obligatory like Bean Validation. "Extended" would be all the things like JSP and JSF, the "leaves" on the dependency tree: they depend on core features, but nothing depends on them. Then "MicroProfile" would be, well, MicroProfile features. The only thing still giving me pause is that there's not too much case for not installing all of these all the time anyway - if you don't want to use, say, JSF, you don't have to; additionally, it's not like Domino is a svelte cloud-native mini server meant to be deployed a thousand times in a cluster, so having the extra bundles sitting there isn't really onerous. We'll see. I hem and haw a lot on this, but eventually I'll have to make a decision.

Regardless of what form that takes, I expect that the changes to in-NSF code will be either minimal or none - for users of the project, it'll mostly be a matter of making sure to fully uninstall the old plugins before an upgrade and then tweaking Xsp Properties to select whatever the new form of the XPages Libraries ends up being.

Side Note: Jakarta NoSQL and Data

One interesting aspect of this move will be the path Jakarta NoSQL has been on. Though I've included it in the XPages JEE project for a little while now (and continue to heavily expand on it), it's always been technically a beta release. It's clearly proven itself stable even in its beta form, but it's going through a shift in the run-up to Jakarta EE 11. Specifically, the higher abstraction levels - the Repository interface and friends - are moving to a new project, Jakarta Data. The idea of that project will be that it will be able to sit on top of Jakarta NoSQL and other storage types, namely JPA.

It's going to be very neat, but it's created a bit of a pickle for me. Since it's targeting Jakarta EE 11, that means the release of it and NoSQL are going to require at least Java 21, and there's no word on when Domino will support that.

One option would be to stick with what I have now for the foreseeable future: a mildly-forked version of Jakarta NoSQL 1.0.0-b4. It's a workhorse and has been doing a good job, and it'd mean that app code wouldn't have to change. I'm not crazy about this for obvious reasons: I don't want to have one component stuck way behind while all the other parts get a nice jump forward, even if it works.

The other main option I'm considering is sliding forward to another beta release and landing there until Java 21 support shows up. The current development versions of the Data spec and JNoSQL with its implementation target Java 17, so I'll probably go with whatever the last beta is before the official switch to 21. Though it's tough to predict the future, that will probably end up being API-wise similar enough to the release forms of them that future jumps won't be difficult. We shall see, anyway.

Timeline

Anyway, the timeline for this is a little vague, and will mostly depend on when the Domino 14 betas come out and whether they contain anything show-stopping. My hope is to be able to have something that passes all the test cases ASAP with betas and then to have it continue to be stable through to the actual release.

I'm looking forward to leaving Java 8 behind for good, though, that much is certain.

Integrating External Java Apps With Keep And Keycloak

Wed May 03 09:43:59 EDT 2023

Last year, I wrote a post describing some early work on a Jakarta NoSQL driver for the Domino REST API (hereafter referred to as "Keep" to avoid ambiguity with the various other Domino REST APIs).

I've since picked back up on the project and similar aspects, and I figured it'd be useful to return to provide some more details.

OpenAPI

For starters, I mentioned in passing my configuration of the delightful openapi-generator tool, but didn't actually detail my configuration. It's changed a little since my first work, since I found the option to specify the use of the jakarta.* namespace.

I use a config.yaml file like:

additionalProperties:
  library: microprofile
  dateLibrary: java8
  apiPackage: org.openntf.xsp.nosql.communication.driver.keep.client.api
  invokerPackage: org.openntf.xsp.nosql.communication.driver.keep.client
  modelPackage: org.openntf.xsp.nosql.communication.driver.keep.client.model
  useBeanValidation: true
  useRuntimeException: true
  openApiNullable: false
  microprofileRestClientVersion: "3.0"
  useJakartaEe: true

That will generate client interfaces that will mostly compile in a plain Jakarta EE project. The files have some references to an implementation-specific MIME class to work around JAX-RS's historical lack of one, but those imports can be safely deleted.
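
For reference, a generation run with that config might look something like this - the spec file name and output directory are placeholders, and this assumes the standalone openapi-generator-cli launcher:

openapi-generator-cli generate \
	-i keep-openapi.json \
	-g java \
	-c config.yaml \
	-o generated-client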

Keycloak/OIDC in Keep

I also mentioned only in passing that you could configure Keep to trust the Keycloak server's public keys with a link to the documentation. Things on the Keep side have expanded since then, and you can now configure Keep to reference Keycloak using Vert.x's internal OIDC support, and also skip the step of creating special fields in your person docs to house the Notes-format DN. For example, in a Keep JSON config file:

{
	"oidc": {
		"my-keycloak": {
			"active": true,
			"providerUrl": "https://my.keycloak.server/auth/realms/myrealm",
			"clientId": "keep-app",
			"clientSecret": "<my secret>",
			"userIdentifierInLdapFormat": true
		}
	}
}

That will cause Keep to fetch much of the configuration information from the well-known endpoint Keycloak exposes, and also to map names from Keycloak from the LDAP-style format of "cn=Foo Fooson,o=SomeOrg" to Domino-style "CN=Foo Fooson/O=SomeOrg". This is useful even when using Domino as the Keycloak LDAP backend, since Domino does the translation in the other direction first.

Keycloak/OIDC in Jakarta EE

In the original post in the series, talking about configuring app authentication for the AppDev Pack, I talked about Open Liberty's openidConnectClient feature, which lets you configure OIDC at the server level. That's neat, and I remain partial to putting authentication at the server level when it makes sense, but it's no longer the only game in town. The version of Jakarta Security that comes with Jakarta EE 10 supports OIDC inside the app in a neat way, and so I've switched to using that.

To do that, you make a CDI bean that defines your OIDC configuration - this can actually be on a class that does other things as well, but I like putting it in its own place:

package config;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.security.enterprise.authentication.mechanism.http.OpenIdAuthenticationMechanismDefinition;
import jakarta.security.enterprise.authentication.mechanism.http.openid.ClaimsDefinition;

@ApplicationScoped
@OpenIdAuthenticationMechanismDefinition(
	clientId="${oidc.clientId}",
	clientSecret="${oidc.clientSecret}",
	redirectURI="${baseURL}/app/",
	providerURI="${oidc.domain}",
	claimsDefinition = @ClaimsDefinition(
		callerGroupsClaim = "groups"
	)
)
public class AppSecurity {
}

There are a couple EL references here. baseURL is provided for "free" by the framework, allowing you to say "wherever the app is hosted" without having to hard-code it. oidc here refers to a bean I made that's annotated with @Named("oidc") and has getters like getClientId() and so forth. You can make a class like that to pull in your OIDC config and secrets from outside, such as a resource file, environment variables, or so forth. providerURI should be the same base URL as Keep uses above.
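
That "oidc" bean can be quite small. Here's a hypothetical sketch that pulls the values from environment variables, though a resource file or MicroProfile Config would work just as well - the variable names are arbitrary:

package config;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Named;

@ApplicationScoped
@Named("oidc") // referenced by the ${oidc.*} expressions in the annotation above
public class OidcConfigBean {
	public String getClientId() {
		return System.getenv("OIDC_CLIENT_ID");
	}

	public String getClientSecret() {
		return System.getenv("OIDC_CLIENT_SECRET");
	}

	public String getDomain() {
		// The same base provider URL that Keep uses, e.g. the Keycloak realm URL
		return System.getenv("OIDC_PROVIDER_URL");
	}
}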

Once you do that, you can start putting @RolesAllowed annotations on resources you want protected. So far, I've been using @RolesAllowed("users"), since my Keycloak puts all authenticated users in that group, but you could mix it up with "admin" or other meaningful roles per endpoint. For example, inside a JAX-RS class:

@Path("superSecure")
@GET
@Produces(MediaType.TEXT_PLAIN)
@RolesAllowed("users")
public String getSuperSecure() {
	return "You're allowed in!";
}

When accessing that endpoint, the app will redirect the user to Keycloak (or your OIDC provider) automatically if they're not already logged in.

Accessing the Token

In my previous posts, I mentioned that I was able to access the OIDC token that the server used by setting accessTokenInLtpaCookie in the Liberty config, and then getting oidc_access_token from the Servlet request object's attributes, and that that only showed up on requests after the first.

The good news is that, with the latest Jakarta Security, there's a standardized way to do this. In a CDI bean, you can inject an OpenIdContext object to get the current user's token:

package bean;

import jakarta.enterprise.context.RequestScoped;
import jakarta.inject.Inject;
import jakarta.security.enterprise.identitystore.openid.OpenIdContext;

@RequestScoped
public class OidcContextBean {
  
	@Inject
	private OpenIdContext context;
  
	public String getToken() {
		// Note: if you don't restrict everything in your app, do a null check here
		return context.getAccessToken().getToken();
	}
}

There are other methods on that OpenIdContext object, providing access to specific claims and information from the token, which would be useful in other situations. Here, I only really care about the token as a string, since that's what I'll send to Keep.

With that token in hand, you can build a MicroProfile Rest Client using the generated API interfaces. For example:

public class SomeClass {
	/* snip */
	@Inject
	private OidcContextBean oidcContext;

	/* snip */

	private DataApi getDataApi() {
		return RestClientBuilder.newBuilder()
			.baseUri(URI.create("http://your.keep.server:8880/api/v1/"))
			.register((ClientRequestFilter) (ctx) -> {
				ctx.getHeaders().add(HttpHeaders.AUTHORIZATION, "Bearer " + oidcContext.getToken()); //$NON-NLS-1$
			})
			.build(DataApi.class);
	}
}

That will cascade the OIDC token used for your app login over to Keep, allowing your app to access data on behalf of the logged-in user smoothly.

I've been kicking the tires on some example apps and fleshing out the Jakarta NoSQL driver using this, and it's been going really smoothly so far. Eventually, my goal will be to make it so that you can take code using the JNoSQL driver for Domino inside an NSF using the XPages JEE project and move it with minimal changes over to a "normal" JEE app using Keep for access. There'll be a bit of rockiness in that the upstream JNoSQL API is changing a bit to adapt to Jakarta Data and will do so in time for JEE to require Java 21, but at least the code on the two sides should stay closely analogous.

XPages JEE 2.11.0 and the Javadoc Provider

Thu Apr 20 09:47:58 EDT 2023

  1. Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue
  2. JSP and MVC Support in the XPages JEE Project
  3. Migrating a Large XPages App to Jakarta EE 9
  4. XPages Jakarta EE Support 2.2.0
  5. DQL, QueryResultsProcessor, and JNoSQL
  6. Implementing a Basic JNoSQL Driver for Domino
  7. Video Series On The XPages Jakarta EE Project
  8. JSF in the XPages Jakarta EE Support Project
  9. So Why Jakarta?
  10. XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall
  11. Adding Concurrency to the XPages Jakarta EE Support Project
  12. Adding Transactions to the XPages Jakarta EE Support Project
  13. XPages Jakarta EE 2.9.0 and Next Steps
  14. XPages JEE 2.11.0 and the Javadoc Provider
  15. The Loose Roadmap for XPages Jakarta EE Support
  16. XPages JEE 2.12.0: JNoSQL Views and PrimeFaces Support
  17. XPages JEE 2.13.0
  18. XPages JEE 2.14.0
  19. XPages JEE 2.15.0 and Plans for JEE 10 and 11

Yesterday, I put two releases up on OpenNTF, and I figure it'd be worth mentioning them here.

XPages Jakarta EE Support

The first is a new version of the XPages Jakarta EE Support project. As with the last few, this one is mostly iterative, focusing on consolidation and bug fixes, but it added a couple neat features.

The largest of those is the JPA support I blogged about the other week, where you can build on the JDBC support in XPages to add JPA entities. This is probably a limited-need thing, but it'd be pretty cool if put into practice. This will also pay off all the more down the line if I'm able to add in Jakarta Data support in future versions, which expands the Repository idiom currently in the NoSQL build I use to cover both NoSQL and RDBMS databases.

I also added the ability to specify a custom JsonbConfig object via CDI to customize the output of JSON in REST services. That is, if you have a service like this:

@GET
@Produces(MediaType.APPLICATION_JSON)
public SomeCustomObject get() {
	return findSomeObject();
}

In this case, the REST framework uses JSON-B to turn SomeCustomObject into JSON. The defaults are usually fine, but sometimes (either for personal preference or for migration needs) you'll want to customize it, particularly changing the behavior from using bean getters for properties to instead use object fields directly as Gson does.
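
As a rough sketch of that customization, exposing a JsonbConfig via a CDI producer could look like this, switching to Gson-style field access:

package config;

import java.lang.reflect.Field;
import java.lang.reflect.Method;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;
import jakarta.json.bind.JsonbConfig;
import jakarta.json.bind.config.PropertyVisibilityStrategy;

@ApplicationScoped
public class JsonConfigProvider {
	@Produces
	public JsonbConfig getJsonbConfig() {
		return new JsonbConfig()
			// Serialize object fields directly, Gson-style, instead of going through bean getters
			.withPropertyVisibilityStrategy(new PropertyVisibilityStrategy() {
				@Override
				public boolean isVisible(Field field) {
					return true;
				}

				@Override
				public boolean isVisible(Method method) {
					return false;
				}
			});
	}
}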

I also expanded view support in NoSQL by adding a mechanism for querying views with full-text searches. This is done via the ViewQuery object that you can pass to a repository method. For example, you could have a repository like this:

public interface EmployeeRepository extends DominoRepository<Employee, String> {
	@ViewEntries("SomeView")
	Stream<Employee> listFromSomeView(Sorts sorts, ViewQuery query);
}

Then, you could perform a full-text query and retrieve only the matching entries:

Stream<Employee> result = repo.listFromSomeView(
	Sorts.sorts().asc("lastName"),
	ViewQuery.query()
		.ftSearch("Department = 'HR'", Collections.singleton(FTSearchOption.EXACT))
);

Down the line, I plan to add this capability for whole-DB queries, but (kind of counter-intuitively) that would get a bit fiddlier than doing it for views.

XPages Javadoc Provider

The second one is a new project, the XPages Javadoc Provider. This is a teeny-tiny project, though, not even containing any Java code. This is a plugin for either Designer or normal Eclipse and it provides Javadoc for some standard XPages classes - specifically, those covered in the official Javadoc for Designer and the XPages Extensibility APIs. This covers things like com.ibm.commons and the core stuff from com.ibm.xsp, but doesn't cover things like javax.faces.* or lotus.domino.

The way this works is that it uses Eclipse's Javadoc extension point to tell Designer/Eclipse that it can find Javadoc for a couple bundles via the hosted version, really just linking the IDE to the public HTML. I went this route (as opposed to embedding the Javadoc in the plugin) because the docs don't explicitly say they're redistributable, so I have to treat them as not. Interestingly, the docs are actually still hosted at public.dhe.ibm.com. If HCL publishes them on their site or makes them officially redistributable, I'll be able to update the project, but for now it's relying on nobody at IBM remembering that they're up there.

In any event, it's not a huge deal, but it's actually kind of nice. Being able to have Javadoc for things like XspLibrary removes a bit of the guesswork in using the API and makes the experience feel just a bit better.

The Big Apple Silicon Switch, Followup

Thu Feb 16 13:17:49 EST 2023

Tags: docker java

Yesterday, I wrote a post detailing my recent switch to Apple Silicon, and in it I noted that the main remaining problem I was butting heads with was running Domino in Docker containers, specifically in the context of working with Java.

While my general solution of connecting to Docker running remotely is good, it's all the better to have this stuff working locally. I gave it another shot today and found that the solution was already in that post: disable the JIT.

Java uses the JIT - its just-in-time compiler - to speed up execution dynamically when running a program, and it's part of what makes Java surprisingly fast in practice. It's a whole rabbit hole of performance implications and comparisons, but it will suffice to say "JIT = good". However, as I found when running test cases that load libnotes.dylib directly, J9's JIT doesn't play well with the x64-to-ARM JIT that these Macs use, either Rosetta's or qemu's.

So, long story short, the solution was to modify my testing container to disable the JIT when in such an environment (the referenced commit has since been followed up a few times).

First off, I added a bit in my Domino One-Touch Config file to point to a Java options file in the notes.ini:

{
	/* ... */
	"notesINI": {
		/* ... */
		"JavaUserOptionsFile": "/local/JavaOptionsFile.txt"
	}
	/* ... */
}

I modified the Dockerfile to bring this in from the build environment:

# ...
COPY --chown=notes:notes staging/JavaOptionsFile.txt /local/

Finally, I modified the class I use to build the container to optionally populate this file with the flag to disable the JIT:

String arch = DockerClientFactory.instance().getInfo().getArchitecture();
if(!"x86_64".equals(arch)) {
	withFileFromTransferable("staging/JavaOptionsFile.txt", Transferable.of("-Djava.compiler=NONE"));
} else {
	withFileFromTransferable("staging/JavaOptionsFile.txt", Transferable.of(""));
}

(Transferable is a Testcontainers-ism for any available source of a file to put into the build environment. Often, this will be something from the project classpath, but it can also be arbitrary data like this.)

With that in place, the test suite works! Eventually!

The thing to know about this is that it's slow. It's going to be naturally slow due to emulation, and I imagine the lack of Java's JIT doesn't help anything. Running the XPages JEE test suite takes about 520 seconds in this setup, as compared to 180 seconds when run via my remote native VM. Such a dramatic drop in speed is enough that I'm likely to continue using the remote VM for most things, and only use local emulation for things that significantly benefit from filesystem binds. For what it's worth, it seems like multithreading is what really kills it: most tests are slower by roughly 50%, while a big multithread test balloons from 20s to 160s. That's consistent with my experience with local unit tests, where the multithread ones were the ones that crashed the process.

For the record, I found that this currently only works at all with Docker Desktop's default (qemu-based) emulation. When I switch the emulator to Rosetta, I hit a StackOverflowError from the HTTP JVM during init without any other useful information. That's fair enough, since that emulation type is still flagged as Experimental in Docker Desktop anyway.

Anyway, it's good to know that there's a way to make it work. It's still no replacement for a native container, but at least it works at all. Similar techniques should work with other tasks, like setting the above option in the JAVA_OPTS environment variable.

The Big Apple Silicon Switch

Wed Feb 15 12:01:19 EST 2023

Tags: docker java

Almost exactly two years ago now, I wrote a post describing using Apple Silicon for work while my iMac was in the shop. That was an interesting experience, but too thorny for me to want to really stick with. Some of that was due to the limitations of the hardware - it was a pretty low-end MacBook Air - but most of that was due to the very-much-in-progress world of Java tooling for ARM Macs.

Since then, I'd been itching to find a machine to replace the iMac Pro and the recently-released Mac mini finally fit the bill. It, now dubbed Tethys, arrived on Friday, and I've been going through the process of re-carving-out my working environment, this time starting clean.

Eclipse

Compared to my first dive into ARM Macs, things have gotten a lot smoother. The biggest one for me is the stability of Java generally and Eclipse specifically. Last time, I was on the bleeding edge of that, and I was fortunate enough to at least help diagnose and report some of the things barring the port from being complete. The experience was very janky but the sheer snappiness of Eclipse on ARM really stuck with me, and it's still there in spades.

And, really, there's not even a lot of special things to know. If you download the aarch64 build of Eclipse, it just works like you'd want it to. I had to make sure to download some x64 builds of Java for when I'm running legacy native code, but that wasn't really different from my current collection of different Java versions anyway.

This is all a huge distinction from before, where I was going to weird lengths to try to run Eclipse via X11 remotely just to get something responsive.

Notes and libnotes

For my needs, Notes works fine. It's a shame that it's not native, but, as long as I run my test suites with an x64 JVM, things work about as well as they do on x64.

One immediate problem I ran into, though, was that some test cases that push multithreading would hard crash with a native error to do with the x64-to-ARM translation and JIT. Running with an OpenJ9-based JVM, I found that specifying -Djava.compiler=NONE in the launch args avoided this trouble. I'm sure that the tests run a bit slower now, but they were already emulated anyway, so it's not noticeable.

Domino and Docker

One of the sticking points with my half-transition years ago was the need to run my Domino Docker containers in emulation. Though HCL will have to port Domino to ARM eventually, they haven't yet released such a thing, and so that situation remains the same.

And, unfortunately, the segfault I encounter during my use remains. I had hoped that changes in qemu would fix it, but it appears to be unchanged. Recent versions of Docker can also take advantage of Apple's support for Rosetta in Linux, but for my needs that just creates a different segfault earlier in the process.

So, on this front, I'm doing the same thing I started doing before: running Docker on a VM on actual x64 hardware and using the DOCKER_HOST environment variable to point to it. Fortunately, I've taken steps in the intervening years to make this more practical. The main thing to work around with such a setup is the inability to use filesystem binds, and so I've changed a lot of my Dockerfiles and Testcontainers setups to instead copy more into the image at build. I still haven't solved every problem there, in particular a huge in-container build I'd like to do that points at a giant pool of dependencies that would be onerous to put into the image, but I'm working on that.

Conclusion

Overall, I'm pleased as punch with this thing. It's zippy all around, I haven't once heard the fan even when really hitting the CPU and GPU, and the state of the software ecosystem is such that I only use a very few x64-compiled processes during my daily work. I'd expected to have more of a checklist to work down to clean them all up, but it's really just the for-now-required bits. Not bad at all.

XAgents to Jakarta REST Services

Sun Feb 05 15:16:48 EST 2023

  1. Code-First REST APIs With XPages Jakarta EE Support
  2. Code-First REST APIs Followup: OpenAPI
  3. XAgents to Jakarta REST Services

For a good long time now, XAgents have been one of the common ways to do non-HTML output in an XPages environment - JSON, mostly. I think the technique was codified and the term coined by Stephan Wissel back in 2008 and the idea has been the same since.

Effectively, an XAgent lets you write a Servlet but with a bit more scaffolding. Though XPages has a path to use Servlets officially, that method is more out-of-the-way than XAgents and doesn't (without further hoop jumping) give you some niceties like sessionAsSigner.

However, though they're venerable and sort of convenient, XAgents are lacking in a number of ways. For one, stuffing them inside an XPage is ungainly: even if the XPage just calls out to some Java code, you're polluting your UI-element space with something kind of unrelated, forcing you to name your XPages stuff like "apiEmployees.xsp". More importantly, though, it doesn't provide a lot of affordances for the sort of work you'd actually want to do when writing a REST API. Though XPages has a couple mechanisms for generating JSON, little of that is exposed by the XPage editor environment, and I suspect that many or most XAgents writing JSON commit the cardinal sin of doing so through direct string concatenation, likely often without much escaping. Further, there's no built-in mechanism for picking apart path segments or multi-branched routing.

So I figured today is a good opportunity to talk a bit about one of the XPages Jakarta EE Support project's flagship features: writing REST services that consume and emit JSON. This post covers some of the same ground I've talked about before, but sometimes it's useful to re-contextualize this sort of thing.

The XAgent Way

To level-set the discussion, I'll give a starting example in an XAgent, and we'll keep it simple. The idea here will be that you have Person documents in a database and you want to write an API that will take a UNID and emit the corresponding person's first and last name. This is very similar to the Employees example in the "Code-First REST API" example project from the JEE repo, but without diving into more-complex parts or the Jakarta NoSQL data layer.

In an XAgent, you might do something like this:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core" viewState="nostate" rendered="false">
	<xp:this.afterRenderResponse><![CDATA[#{javascript:
		var resp = facesContext.getExternalContext().getResponse();
		resp.setContentType("application/json");
		
		var writer = facesContext.getResponseWriter();
		
		var doc = database.getDocumentByUNID(param.unid);
		var result = toJson({
			"firstName": doc.getItemValueString("FirstName"),
			"lastName": doc.getItemValueString("LastName")
		});
		writer.write(result);
		
		writer.endDocument();
		facesContext.responseComplete();
	}]]></xp:this.afterRenderResponse>
</xp:view>

You'd put that in an XPage and call it like:

$ curl http://your.server/someapp.nsf/apiPeople.xsp?unid=68D1CAA80F65780D8525894D006B1CE7
{"lastName":"Fooson","firstName":"Foo"}

That does the job well enough. However, this will get gangly very fast. If you want to expand on the types of data the Person document contains, add more operations to the API, add other query parameters, or so forth, you'll have to either have a giant blob of code here, spin it off to an SSJS script library, or move it to Java classes. And, all along the way, the dev environment won't be giving you any help with the process - it knows nothing about writing REST APIs, and so it's all "manual". Better than with a classic agent, but not great.

Jakarta Way, Take 1

So let's move this over to Jakarta EE. We'll start by doing basically a concept-for-concept move: just take the above and make a REST class out of it.

Such a class could look something like this:

package rest;

import com.ibm.xsp.extlib.util.ExtLibUtil;

import jakarta.json.Json;
import jakarta.json.JsonObject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.core.MediaType;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;

import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesException;

@Path("v1/person")
public class PersonResourceV1 {
	
	@Path("{unid}")
	@GET
	@Produces(MediaType.APPLICATION_JSON)
	public JsonObject get(@PathParam("unid") String unid) throws NotesException {
		Database database = ExtLibUtil.getCurrentDatabase();
		Document doc = database.getDocumentByUNID(unid);
		
		return Json.createObjectBuilder()
			.add("firstName", doc.getItemValueString("FirstName"))
			.add("lastName", doc.getItemValueString("LastName"))
			.build();
	}
}

This one you'd call in a similar way, and get similar results:

$ curl http://your.server/someapp.nsf/xsp/app/v1/person/68D1CAA80F65780D8525894D006B1CE7
{"firstName":"Foo","lastName":"Fooson"}

There's a bit more boilerplate - it is Java - but you can already see some of the benefits. The URL is a little more REST-like and the code is much more task-focused. Instead of disabling all the normal features of the XPage and manually emitting text, a large amount of the code is explicitly describing what you intend to do in a REST service. The @Path annotations describe the components of the URL, and the use of @Path("{unid}") lets us name one of those parts without having to do a query string. Similarly, the @GET and @Produces(MediaType.APPLICATION_JSON) lines directly say what's going on: you can do a GET request to the URL and get JSON back.

These annotations pay off in a couple ways. First of all, the code will be a lot easier to understand when you come back later. Imagine if your API also handled new documents, deletions, and modifications - with an XAgent, you'd have branching paths and would have to rely on either good naming or commenting to mentally traverse it. With this, you would have other methods with similar annotations - @PUT, @POST, @DELETE - and would see exactly where requests are going to go.
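For example, a delete operation added to the class above might look like the following sketch (with imports for jakarta.ws.rs.DELETE and jakarta.ws.rs.core.Response added); the hard-removal behavior is just an assumption for illustration:

	@Path("{unid}")
	@DELETE
	public Response delete(@PathParam("unid") String unid) throws NotesException {
		Database database = ExtLibUtil.getCurrentDatabase();
		Document doc = database.getDocumentByUNID(unid);
		doc.remove(true);
		// 204 No Content is a common response for a successful delete
		return Response.noContent().build();
	}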

And it's not just for you, the programmer (or for the next human replacing you when you retire to a beach somewhere): these annotations mean something to tools that process them as well. One such tool comes with the XPages JEE project: the MicroProfile OpenAPI generator. With this class present, you automatically get a useful OpenAPI spec:

$ curl http://your.server/someapp.nsf/xsp/app/openapi.yaml
---
openapi: 3.0.3
info:
  title: XAgent Comparison
servers:
- url: http://your.server/someapp.nsf/xsp/app
paths:
  /v1/person/{unid}:
    get:
      parameters:
      - name: unid
        in: path
        required: true
        schema:
          type: string
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                type: object

This sort of thing pays off tremendously once you start working with a client JS app, especially with a larger development team.

Jakarta Way, Take 2

But we can do better than this. As it is, the direct translation from the XAgent still left the actual output pretty vague. We at least know it's emitting a JSON object, but that's the extent of it. Moreover, the app code wastes some conceptual time actually building the JSON object, which is okay but unnecessary. Let's bring JSON Binding into the mix. This will increase the size of the code, but will pay off in conceptual cleanliness and in future work.

That could look like:

package rest;

import com.ibm.xsp.extlib.util.ExtLibUtil;

import jakarta.validation.constraints.NotEmpty;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.core.MediaType;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;

import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesException;

@Path("v2/person")
public class PersonResourceV2 {
	
	// Normally, this class would be in a separate file
	public static class Person {
		private String firstName;
		private @NotEmpty String lastName;
		
		public String getFirstName() {
			return firstName;
		}
		public void setFirstName(String firstName) {
			this.firstName = firstName;
		}
		public String getLastName() {
			return lastName;
		}
		public void setLastName(String lastName) {
			this.lastName = lastName;
		}
	}
	
	@Path("{unid}")
	@GET
	@Produces(MediaType.APPLICATION_JSON)
	public Person get(@PathParam("unid") String unid) throws NotesException {
		Database database = ExtLibUtil.getCurrentDatabase();
		Document doc = database.getDocumentByUNID(unid);
		
		Person result = new Person();
		result.setFirstName(doc.getItemValueString("FirstName"));
		result.setLastName(doc.getItemValueString("LastName"));
		return result;
	}
}

Calling this will work the same way as last time, but with a "v2" to indicate our new-and-improved back end:

$ curl http://your.server/someapp.nsf/xsp/app/v2/person/68D1CAA80F65780D8525894D006B1CE7
{"firstName":"Foo","lastName":"Fooson"}

What does this get us? Well, for one, we're no longer explicitly working with JSON, which is nice. We could change the media type to XML and the output type will automatically adapt, and the same will go for any future formats we write an adapter for. Additionally, this will work better with a more-structured database access layer. I'll leave that part out here, but the aforementioned "Code-First REST API" example shows that.
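To illustrate that format flexibility: assuming an XML message body writer is available and the Person class picks up JAXB annotations (assumptions on my part - I'll keep the running example JSON-only so the spec below stays as-is), advertising both types is just a wider @Produces, with the client's Accept header picking the representation:

	@Path("{unid}")
	@GET
	@Produces({ MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML })
	public Person get(@PathParam("unid") String unid) throws NotesException {
		Database database = ExtLibUtil.getCurrentDatabase();
		Document doc = database.getDocumentByUNID(unid);
		
		Person result = new Person();
		result.setFirstName(doc.getItemValueString("FirstName"));
		result.setLastName(doc.getItemValueString("LastName"));
		return result;
	}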

Some more of the payoff shows up when we check the generated OpenAPI spec. If we make the same call as above, the output will be more descriptive now:

---
openapi: 3.0.3
info:
  title: XAgent Comparison
servers:
- url: http://your.server/someapp.nsf/xsp/app
paths:
  /v2/person/{unid}:
    get:
      parameters:
      - name: unid
        in: path
        required: true
        schema:
          type: string
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Person'
components:
  schemas:
    Person:
      required:
      - lastName
      type: object
      properties:
        firstName:
          type: string
        lastName:
          minLength: 1
          type: string

For someone consuming this spec, this is starting to get really good. No longer does the output just say it's a generic JSON object: now it describes it as "Person", shows the two properties that will be output, and declares that lastName will always be at least one character both for input and output. As you add more operations and types, this spec will continue to grow, naturally piggybacking on the annotations and types you write.

Conclusion

Strictly speaking, you can do the same stuff with an XAgent that you can with the XPages JEE project. You could manage the internal complexity and you could manually write the OpenAPI spec, but it would be much, much harder to do, and I think it'd be a safe bet to say that very few XAgent-based APIs have specs to go with them anyway.

Especially as the scope of your app grows, the development and maintenance experience with the JEE approach is night-and-day compared to classical mechanisms like XAgents. If you're still writing APIs that way or similar, I definitely recommend you give this path a try.

XPages Jakarta EE 2.9.0 and Next Steps

Tue Nov 22 12:53:21 EST 2022

  1. Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue
  2. JSP and MVC Support in the XPages JEE Project
  3. Migrating a Large XPages App to Jakarta EE 9
  4. XPages Jakarta EE Support 2.2.0
  5. DQL, QueryResultsProcessor, and JNoSQL
  6. Implementing a Basic JNoSQL Driver for Domino
  7. Video Series On The XPages Jakarta EE Project
  8. JSF in the XPages Jakarta EE Support Project
  9. So Why Jakarta?
  10. XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall
  11. Adding Concurrency to the XPages Jakarta EE Support Project
  12. Adding Transactions to the XPages Jakarta EE Support Project
  13. XPages Jakarta EE 2.9.0 and Next Steps
  14. XPages JEE 2.11.0 and the Javadoc Provider
  15. The Loose Roadmap for XPages Jakarta EE Support
  16. XPages JEE 2.12.0: JNoSQL Views and PrimeFaces Support
  17. XPages JEE 2.13.0
  18. XPages JEE 2.14.0
  19. XPages JEE 2.15.0 and Plans for JEE 10 and 11

Keeping with my productive week off, today I release version 2.9.0 of the XPages Jakarta EE Support project. Similar to the previous release, this one contains new features primarily related to Jakarta NoSQL, but also has some improvements for JSF and a bunch of bug fixes and compatibility improvements.

Jakarta NoSQL

The improvements to the JNoSQL driver come from some needs I came across when moving older lotus.domino/ODA-based code to using JNoSQL repositories. In particular, I added the remaining applicable view entry properties as available fields to map, improved support for reading note IDs, and added the ability to fetch documents by note ID.

JSF

While JSF support remains limited by not having a proper way to add in third-party component libraries like PrimeFaces, it's still a potentially-compelling tool in an NSF as an alternative to XPages in some cases. Accordingly, I fixed a few bugs I had run into when loading pages after modifying the NSF design. Additionally, I fixed up support for JSF as an MVC view engine. It now properly joins JSP as a mechanism for rendering your output with an MVC structure, and I think there's some real potential there.

Bug Fixes and Compatibility

Most of the other closed issues deal with a few bugs here and there, and in particular involve some improvements for running apps in XPiNC and on a server with Domino Leap also installed. I don't use XPiNC anymore and haven't tried Leap, so I greatly appreciate bug reports specific to these and the assistance in tracking down the trouble.

The Future and Next Steps

I'm pondering now what the next release of the project will focus on. I have no shortage of feature ideas, and there are a few potentially-disruptive changes I'd like to make.

Unfortunately, those changes will be largely confined to improving the support for the specs that are already present and not advancing to new versions. The predicted Java-version wall arrived: Jakarta EE 10 is out and requires Java 11 and above. Since Domino remains mired in Java 8, that means that new versions of the specs and implementations are hard-incompatible until that changes.

On the plus side, there's still a lot of improvements I can make with Jakarta EE 9 as the baseline.

Reorganization

One big one I've been thinking about is a reorganization of the individual libraries that make up the project. The way it's been designed, almost every spec has its own Equinox Feature and XPage Library to go with it. This was fine early on when it was just CDI, EL, and JAX-RS, but it's grown annoying: installing the project in Designer is a seemingly-endless process of approving each plug-in at a time and the list of libraries to check in Xsp Properties is interminable. More critically, being able to selectively turn on and off specs like this doesn't make sense anymore. CDI has grown so important to Jakarta EE in general and this spec in particular that it doesn't make sense to not have it present if you're going to use this project at all. It's a foundational component of so many other parts and is essentially The Way to do Jakarta-based development.

So I'm thinking I'm going to reorganize the projects into fewer features and libraries, which will be a breaking change that will necessitate a bump to 3.0 - fortunately, the numbers line up well for that. I have a few potential options here:

  1. Just lump them all into one. You'd have basically one big switch to say "this is a JEE project" and everything would be on. The virtue here is that this is how I already work and is essentially the recommended way to do things. Additionally, as far as I know, while having additional components may slow first load (though not as much as other parts), I don't think they have a significant impact if enabled but unused during runtime.
  2. Try to line the specs up with one of the existing Jakarta Profiles. Those profiles are meant to be curated selections of useful specs, and this project has enough to implement what in newer versions is deemed the Core Profile. The trouble with this, though, is that the Core Profile is very much geared to be the shared subset with MicroProfile and similar and is a bit thin for Domino's monolith-focused development style. The Web and Full profiles, on the other hand, require "traditional" APIs like EJB that are not present in this project.
  3. Break them apart into my own "core" and "optional" features. For example, it doesn't make sense to use this without JAX-RS, CDI, and Bean Validation enabled, but JSF is entirely independent of the other specs and is among the least likely to be used in practice for now. This would also allow me to establish a running flow where "experimental" features start out as optional add-ins and then eventually make their way to core.

I'm currently waffling between #1 and #3, with a slight lean towards #1. If I can be sure that either everything or nothing is present, I could get rid of some weird hedges and workarounds, like how the JAX-RS implementation doesn't "officially" know about the CDI library yet references CDI classes explicitly by name.

New Application Types

Currently, to use this project, you can either put your code into an NSF and use the automatic behavior of the libraries or you can put your code in OSGi-based webapps or Servlets and then manually manage integration with these specs.

Both of these are limited by their reliance on the many assumptions IBM built in to how these apps should work. In-NSF apps require that all Jakarta code come from a request including "xsp" in the URL or to a file ending in ".jsp", ".jsf", or ".xhtml". If you're writing, say, an MVC-based app, all of your URLs are going to have to start with something like "foo.nsf/xsp/app/...", which is okay but ugly. Additionally, the way these apps are implemented - NSFComponentModule - severely limits my hooks for listening for things like application and session expiration, which hampers CDI's lifecycle handling a bit.

For a good while, I've pondered the notion of adding another ComponentModule type to handle the case where you want to go all-in on Jakarta EE. With this idea, the new module implementation would have full control over incoming requests, allowing URLs without the xsp/app bit in there, and would have better handling of lifecycles. In this way, I could make it so that your code would look more like (or be identical to) a "normal" .war-based webapp, with fewer workarounds for the existing XPages stuff. This would also allow me to do things like lessen the amount of Servlet 2.5-to-5.0 bridging and could assist tremendously in improving JSF support.

Along similar lines, I've been considering doing something similar for OSGi-based webapps, and I've made some progress along those lines in a feature branch. The idea here would be to do something similar to how you can deploy web.xml-based webapps via OSGi now, but with built-in support for Jakarta EE 9 features (with web.xml then being optional). With this setup, you'd be able to write an app that does an Import-Package for the various jakarta.* packages you want and add a bit in your MANIFEST.MF to signal to this project that it should participate. This could either be a variant of the extension point used by the existing OSGi webapp support or using the Web-ContextPath directive from the OSGi spec. One of the goals here would be to make it so that you would be able to write a Jakarta EE 9 app using normal development tools - Eclipse/IntelliJ/VS Code, Maven, etc. - and then just use maven-bundle-plugin to add the OSGi info you need without having any specific dependencies on Domino bits, especially the nightmare of depending on the non-redistributable XPages OSGi artifacts.
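To make that concrete, such a bundle's MANIFEST.MF might end up looking something like the sketch below. Web-ContextPath and Import-Package are standard OSGi headers; the symbolic name, versions, and package list are illustrative, and whatever header ends up signaling participation to this project is still undecided:

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.someapp
Bundle-Version: 1.0.0
Web-ContextPath: /someapp
Import-Package: jakarta.ws.rs;version="3.0.0",
 jakarta.ws.rs.core;version="3.0.0",
 jakarta.enterprise.context;version="3.0.0",
 jakarta.inject;version="2.0.0"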

Other Options

And, in the mean time, I have a bunch of other tasks I could work on. Slowly converting my client project to Jakarta NoSQL instead of direct ODA use has turned up a whole slew of things that would be useful to add (for example, stampAll support), so I can slowly burn down that feature-request list.

There's also the notion of documentation! While a lot of the behavior of this project is in theory documented by virtue of the upstream specs and the general world of Jakarta blogs, videos, and courses, there's enough to know about the specifics of the interactions with Domino that more documentation is in order. Historically, I've just done this by expanding the README, but it's gotten pretty unwieldy at this point. It would probably make sense to break the specifics and examples out into at least wiki pages, if not a format that can be built into a PDF/etc. and included in the distribution.

So yep, I'll have my hands busy with this thing for a good while more, I figure.

More Open-Source Updates for Notes/Domino 12.0.2

Mon Nov 21 13:27:51 EST 2022

The other day, I talked about some changes/workarounds for Notes/Domino 12.0.2. Today, I made a few updates to some of the open-source projects I maintain, including another update to the generate-domino-update-site Maven plugin.

Domino Update Site Generator

In the 4.2.0 release, I added code to (mostly, as it turns out) account for HCL moving the NAPI implementation JAR down to jvm/lib/ext. In subsequent use, I found that, while that will suffice for building applications that use the OSGi dependencies, it didn't work for launching applications using it as a baseline - namely, the NSF ODP Tooling Maven plugin.

Today, I released a 4.2.1 version that improves this behavior by re-adding dependencies in the implementation bundle.

I also created a project page for it on OpenNTF. Though the project has always been hosted at the OpenNTF org on GitHub, I hadn't created a project page for it due to it just being a standalone Maven plugin. I figured it'd be useful to create an official page there for it, though.

NSF ODP Tooling

Speaking of the NSF ODP Tooling, I also found that local operations once again started crashing on macOS. Due to changes in macOS and the very weird ways that Notes works, local operations on there are a very moving target, and I have to do a lot of work in the project to account for changes to the embedded JVM and whether specific Notes versions work better with HotSpot or OpenJ9 JVMs.

Long story short, I released 3.10.0 today to account for this. Though I've found that the spawned JVM will still sometimes crash, it does so after completing its work, so I considered that fine for now.

OpenNTF Domino API

Finally, I came to the OpenNTF Domino API. This project has admittedly been neglected for a little while: I'm the only active maintainer, and the client project I use it in targets Domino 11.0.1, so the 12.x builds have remained in an incomplete state for a while.

With the release of 12.0.2, I decided I should finish the wrappers for the new classes added in 12.x, so I did so and uploaded a build for 12.0.2. This primarily adds those wrappers, but also included a contributed fix and changes the distribution packaging to combine the XSP and non-XSP versions.

Notes/Domino 12.0.2 Fallout

Thu Nov 17 13:45:21 EST 2022

Tags: designer java
  1. AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Domino 11's Java Switch Fallout
  3. fontconfig, Java, and Domino 11
  4. Notes/Domino 12.0.2 Fallout
  5. Notes/Domino 14 Fallout

Notes and Domino 12.0.2 came out today. Generally, there are some neat features in development and on the server, but there are also a couple things you may run into depending on your workflow and installation type.

Update Site NSF

The update site NSF that ships with Domino uses SWT for some of its GUI elements when importing contents. This still works fine in the 32-bit client, but is broken in the 64-bit client. My guess on that front is that the 64-bit client doesn't come with a 64-bit native SWT JAR, probably because the SWT version used for this likely pre-dates x64's popularity on the desktop.

For now, the workaround is to use a 32-bit client if you're working with the Update Site NSF. Karsten said he's going to patch the OpenNTF version of the NSF to deal with this, so you can also wait for that one.

Domino Update Site Generator

I maintain the generate-domino-update-site Maven plugin that can be used to generate update sites for OSGi development against the Domino stack. These sites are replacements for the IBM Domino Update Site for Build Management, which was released for 9.0.1 and never updated since. Only HCL can make a new distributable version of that, so my tool lets you generate one for yourself from a Notes or Domino installation.

In 12.0.2, HCL shunted the NAPI implementation JAR down to jvm/lib/ext to support the shared JARs between XPages and agents feature. As a side effect, existing versions of my plugin would lack the NAPI classes.

Today, I released version 4.2.0, which fixes this and also contains improvements to let the plugin work on current Java versions and generate sites based on 12+ macOS clients.

Along these lines, I made Aha idea DDXP-I-352 a couple years ago to request that HCL provide such sites themselves or give OpenNTF permission to provide them, so I'd appreciate it if you voted for that.

The Target Platform Bug

This isn't a new thing, but it's worth mentioning here since it comes up frequently: the target platform bug from 9.0.1FP10 remains. As of earlier this year, defect article KB0086688 mentions this, though the status is "Deferred". If this afflicts you, it may help to bring it up with Support and reference that article.

The Myriad Idioms For Finding Implementations In Java

Tue Oct 18 10:25:06 EDT 2022

Tags: jakartaee java
  1. Java Services (Not the RESTful Kind)
  2. Java ClassLoaders
  3. Managed Beans to CDI
  4. The Myriad Idioms For Finding Implementations In Java

A few years ago, I wrote a post about Java service location, which covered things like META-INF/services and OSGI extensions. Today, I'd like to discuss a similar concept: code in a top-level API that finds a specific implementation. For reasons that will become clear shortly, I'll call this the "FactoryFinder pattern".

Background

Not all Java code uses this kind of thing and, while service loading is related, the overlap isn't complete. Where this does come up a lot is in a framework like Jakarta EE, which is very intentionally split between vendor-neutral specification classes/interfaces (the ones starting with jakarta.*) and specific implementations.

For example, the Jakarta REST (née JAX-RS) specification only defines various classes and interfaces within the jakarta.ws.rs package space, but doesn't include any actual implementation. That's left to various vendors. The number of implementations varies by spec, and JAX-RS is particularly prolific on this front. In the XPages Jakarta EE project, we use RESTEasy, whose classes are all in the org.jboss.resteasy package space.

There's a (usually) hard wall between these layers: the spec declares an API that programmers can use, and then the implementation has to allow itself to be called by those class names and obey the specification's rules. When writing JAX-RS resources in an NSF, the fact that it's using RESTEasy does not enter into your experience. That raises the question, though, of how this works. How does the vendor-neutral specification locate the implementation classes to hand off the work? Well, that question has a number of different answers.

Entrypoint Classes and Locating Implementations

In general, each spec accomplishes this using one or more entrypoint classes. For example, JAX-RS uses RuntimeDelegate and its static getInstance() method to locate server implementations and ClientBuilder and its newBuilder() method to load client implementations. Outwardly, these methods just promise that they'll find and provide an implementation, but the actual way that specs do this varies.
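Used directly, those entrypoints look like this minimal sketch (which assumes an implementation such as RESTEasy is actually on the classpath - otherwise both calls will throw):

import jakarta.ws.rs.client.Client;
import jakarta.ws.rs.client.ClientBuilder;
import jakarta.ws.rs.ext.RuntimeDelegate;

public class EntrypointDemo {
	public static void main(String[] args) {
		// Locates a server-side implementation
		RuntimeDelegate delegate = RuntimeDelegate.getInstance();
		System.out.println("Server-side impl: " + delegate.getClass().getName());
		
		// Locates a client-side implementation
		Client client = ClientBuilder.newBuilder().build();
		System.out.println("Client-side impl: " + client.getClass().getName());
		client.close();
	}
}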

One of the most common ways to coordinate this loading is to have a class named FactoryFinder. This idiom and specific name proved very popular over at Sun as they built up the JEE specs:

Eclipse Open Type dialog for FactoryFinder

Despite their identical names, each of these classes is a different implementation, and they have different characteristics. There are routines in common, and each spec uses a subset of these. I'll go over the common ones here, in no particular order other than that I'll start with the ones found in the JAX-RS API first.

ServiceLoader

This one is used in basically every spec up until the latest era. This uses the java.util.ServiceLoader class to find implementations by way of text files in META-INF/services named after the spec class and containing implementation class names. For example, RESTEasy contains a file named META-INF/services/jakarta.ws.rs.ext.RuntimeDelegate that references the class org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl. That looks like this:

Iterator<T> iterator = ServiceLoader.load(service, FactoryFinder.getContextClassLoader()).iterator();

if(iterator.hasNext()) {
	return iterator.next();
}

FactoryFinder.getContextClassLoader() there is a utility method that just uses an AccessController block to work with Java policy limitations like we see on Domino all the time.
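For reference, the services file itself is about as simple as a file format gets: comment lines start with "#" and every other non-blank line is an implementation class name. RESTEasy's entry for RuntimeDelegate boils down to this (the comment line here is just to show the file name):

# META-INF/services/jakarta.ws.rs.ext.RuntimeDelegate
org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl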

This is simple enough in the normal case, but can get a little tricky when you add in something like OSGi. By default, ServiceLoader will look in the thread-context class loader, which will usually be where your application code lives. Inside an app container, like an NSF, the implementation class may not actually be visible, though. Accordingly, many of these finders fall back to using the class loader of the spec class, which has a higher chance of seeing the implementation. That looks similar:

Iterator<T> iterator = ServiceLoader.load(service, FactoryFinder.class.getClassLoader()).iterator();

if(iterator.hasNext()) {
	return iterator.next();
}

In the XPages Jakarta EE project, neither of these calls will tend to work by default, since neither the app nor the API bundle can see the implementation bundle. In some cases, I deal with this via the methods below, but in others I will do so by re-packaging the implementation as an OSGi fragment bundle. Fragment bundles attach themselves onto their host's classloader fully, and this allows ServiceLoader to find the implementation.

Configuration Properties

A handful of these specs, JAX-RS included, will also look for the name of an implementation class using an external properties file. The placement of this in the priority order - as a fallback after ServiceLoader - and the classes used in the implementation make me figure that these are quite often relics of earlier habits.

JAX-RS, for its part, will look within the java.home system property, which points to the JVM's installation directory. In there, it looks for a properties file named lib/jaxrs.properties:

String javah = System.getProperty("java.home");
configFile = javah + File.separator + "lib" + File.separator + "jaxrs.properties";
File f = new File(configFile);
if (f.exists()) {
	Properties props = new Properties();
	inputStream = new FileInputStream(f);
	props.load(inputStream);
	String factoryClassName = props.getProperty(factoryId);
	return newInstance(factoryClassName, classLoader);
}

This tries to use the thread-context class loader only, so it wouldn't work for a complex app server situation. Likely, it's meant for either an older type of application or a standalone special-purpose JAR.

System Properties

Similar to reading a designated properties file, these specs will often then fall back to looking for a Java system property of a given name. These properties may be dynamically set at runtime or may be set during the JVM launch. Often, this property will be the name of the interface/abstract class being looked up, like so:

String systemProp = System.getProperty(factoryId);
if (systemProp != null) {
	return newInstance(systemProp, classLoader);
}

This one can actually come in handy sometimes - though not ideal, I've used similar cases where I set the name of an implementation or delegation class in a property before initializing the spec. It's best to avoid that when possible, but I'm often glad it's there.
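A sketch of that kind of use, with an entirely hypothetical delegate class name, is as simple as setting the property before the spec's entrypoint is first touched:

public class DelegateBootstrap {
	public static void init() {
		// Hypothetical class name; the property key is the spec class being looked up
		System.setProperty("jakarta.ws.rs.ext.RuntimeDelegate", "com.example.MyRuntimeDelegate");
	}
}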

OSGi Escape

Next up is one that JAX-RS doesn't use, but shows up periodically. Though Jakarta EE isn't based around OSGi, a good number of the implementations historically have used (and still use) it, and OSGi always sits in a "not standard, but too popular to consistently ignore" limbo.

To account for this, there's a similarly semi-standard library called the OSGi resource locator. This library provides a class named org.glassfish.hk2.osgiresourcelocator.ServiceLoader that does its own search and loading for ServiceLoader-compatible META-INF/services files within OSGi bundles in the current platform. The idea is that, if you have an OSGi-based platform that you want to work with this type of loading, you will provide the Resource Locator class and let any loaders written to use it fall back to it.

Because this class is not normally present even when actually in OSGi, APIs that make use of it have to be careful and indirect about trying to load it at all. We'll use JAX-B as our example here. They'll generally try to load the bridge class reflectively, which avoids having OSGi-wrapping tools like bnd create a potentially-undesired dependency on the presence of the bridge. Then, they'll reflectively ask it to load service implementations. That tends to look like this:

// Use reflection to avoid having any dependency on ServiceLoader class
Class serviceClass = Class.forName(factoryId);
Class target = Class.forName(OSGI_SERVICE_LOADER_CLASS_NAME);
Method m = target.getMethod(OSGI_SERVICE_LOADER_METHOD_NAME, Class.class);
Iterator iter = ((Iterable) m.invoke(null, serviceClass)).iterator();
if (iter.hasNext()) {
	Object next = iter.next();
	logger.fine("Found implementation using OSGi facility; returning object [" +
		next.getClass().getName() + "].");
	return next;
} else {
	return null;
}

That's also generally wrapped in a big try/catch block to avoid gumming up the works if any pieces are missing.

The XPages Jakarta EE project actually contains a reimplementation of this that avoids some hurdle or other that I found with the stock version. I avoided doing something like that for a while, but it ended up being the most practical way to get some of these specs working.

Default Implementation

Back outside the realm of OSGi, a handful of these specifications will also include a hard-coded default provider class name. These are generally the classes from what used to be dubbed reference implementations and which are largely components of GlassFish by virtue of that being Sun's version.

For example, the JSON-P API has a final fallback of trying to look for org.glassfish.json.JsonProviderImpl by name:

Class<?> clazz = Class.forName(DEFAULT_PROVIDER);
return (JsonProvider) clazz.getConstructor().newInstance();

Though these implementations generally also declare themselves via ServiceLoader files, this is presumably useful in historical or edge cases where there's still a decent chance that the RI will be available. This does have an unfortunate effect on error messages, though, where the case of "I can't find any implementation at all" ends up being reported as e.g. "Provider org.glassfish.json.JsonProviderImpl not found". That's not really a problem with the approach as such, though, but rather just the way it shakes out in practice.

Manually-Set Implementation or Locator

The final mechanism I'm going to discuss is sort of an escape hatch. Sometimes, the provider class will have a method that lets you set an arbitrary implementation yourself, without having the API do any of these lookups at all. Some, like MicroProfile Config and CDI, even go one step further and provide a method that configures not just a specific implementation but rather an implementation locator. These APIs are my friends and I love them.

This mechanism works well for my needs in the XPages Jakarta EE project, where either it's easier to just set one implementation for the whole server or, like with CDI, there's complex logic that requires inspecting the active Servlet request to see what NSF I'm in.

APIs of this style will usually have a method named like setInstance or setProvider on either their core entrypoint class or on the provider locator. For example, MicroProfile Config provides the former on its ConfigProviderResolver class:

public static void setInstance(ConfigProviderResolver resolver) {
	instance = resolver;
}

instance here is a static property. Once it's set - either by this method or by a dynamic lookup - the main instance() method will use it:

if (instance == null) {
	synchronized (ConfigProviderResolver.class) {
		if (instance != null) {
			return instance;
		}
		instance = loadSpi(ConfigProviderResolver.class.getClassLoader());
	}
}

return instance;

The XPages JEE project makes use of this method at HTTP start, setting a provider resolver that includes some stock config sources as well as some classes that know how to read properties from the Notes environment and from the xsp.properties file.
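As a rough sketch of that registration (MyDominoConfigProviderResolver here is imaginary, standing in for a subclass of ConfigProviderResolver that knows about those Domino-specific sources), the call itself is a one-liner made once at startup:

import org.eclipse.microprofile.config.spi.ConfigProviderResolver;

public class ConfigBootstrap {
	// Called once at HTTP start, before anything asks for a Config object
	public static void init() {
		ConfigProviderResolver.setInstance(new MyDominoConfigProviderResolver());
	}
}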

Though this mechanism seems like the crudest out of the bunch, I'm extremely happy whenever it's there.

Conclusion

That was a lot! And there's not really a lesson to be learned here, but rather more that it's often useful to know about all these different mechanisms. When working in the XPages JEE project, I've had to use almost all of them at one time or another, and I've had to familiarize myself with which APIs use which and adapt them individually. For some, I've altered the implementation to be a fragment bundle; for others, I've created my own fragment to provide services and implementations; and so forth. It's a bit of a shame that there's no grand unified system for this, but at least it can be interesting to see the messy path that these specs have taken as Java technologies and the ecosystem evolved.

Jakarta NoSQL Driver for the AppDev Pack, Part 2

Mon Sep 26 09:55:05 EDT 2022

Tags: java jnosql
  1. Jakarta NoSQL Driver for the AppDev Pack, Part 1
  2. Jakarta NoSQL Driver for the AppDev Pack, Part 2

In my last post, I talked about how I implemented a partial Jakarta NoSQL driver using the AppDev Pack as a back end instead of the Notes.jar classes used by the primary implementation. Though the limitations in the ADP mean that it lacks a number of useful features compared to the primary one, it was still an interesting experiment and has the nice side effect of working with essentially any Java app server and Java version 8 or above.

Beyond the Proton API calls, the driver brought up the interesting topic of handling authentication. Proton has three ways of working in this regard:

  • Anonymous, which is what you might expect based on how that works elsewhere in Domino. This is easy but not particularly useful except in specific circumstances.
  • Client certificate authentication, where you create a TLS keychain for a given user and associate it with a Directory user (e.g. CN=My Proton App/O=MyOrg), and then your app performs all operations as that user. This is basically like if you ran a remote app with NRPC using a client Notes ID.
  • Act-as-User, which builds on the above authentication by configuring an OAuth broker service that can hand out OIDC tokens on behalf of named users. This is sort of like server-to-server communication with the "Trusted Servers" config field in the server doc, but different in key ways.

Client Certificate Authentication

When doing app development, the middle route makes sense as your starting point, since most of your actual work will likely be the same regardless of whether you later then add on act-as-user support. For that, you'll follow the guide to set up your TLS keychain and then feed those files to the com.hcl.domino.db.model.Server object:

Path base = Paths.get(BASE_PATH);
		
File ca = base.resolve("rootcrt.pem").toFile();
File cert = base.resolve("clientcrt.pem").toFile();
File key = base.resolve("clientkey.pem").toFile();

Server server = new Server("ceres.frostillic.us", 3003, ca, cert, key, null, null, Executors.newSingleThreadExecutor());

The use of java.io.File classes here is a bit of a shame, but not the end of the world. In practice, you'd likely store your keychain somewhere on the filesystem anyway and then feed the BASE_PATH property to your app via an environment variable. Otherwise, if they were pulled from some other source, you could use Files.createTempFile to store them on the filesystem while your app is running. Those nulls are for the passphrases for the certificates, so they might be populated for you. For the last parameter, making a new executor is fine, but you might want to hand it a ManagedExecutorService.
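To expand on the Files.createTempFile option mentioned above, here's a small sketch; the byte-array source is hypothetical, standing in for wherever you keep the keychain material:

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class PemFiles {
	// pemBytes would come from a secrets store, classpath resource, etc.
	public static File toTempFile(String prefix, byte[] pemBytes) throws IOException {
		Path temp = Files.createTempFile(prefix, ".pem");
		temp.toFile().deleteOnExit();
		return Files.write(temp, pemBytes).toFile();
	}
}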

You can initialize this connection basically anywhere - I put it in a ServletRequestListener to init and term the object per-request, but I think it would actually be fine to do it in a ServletContextListener and keep it app-wide. I did it out of habit from Notes.jar and its heavy requirements on threads and the way Session objects are different per-user, but that's not how Proton connections work.

This form of authentication is a prerequisite for Act-as-User below, but it might suffice for your needs anyway. For example, if you have a "utility" app, like a bot that looks up data and posts messages to Slack or something, you can call it good here.

Act-as-User

But though the above may suffice sometimes, the integration of user identity with data access is one of the hallmarks of Domino, so Domino apps that wouldn't need Act-as-User support are few and far between.

Act-as-User is a bit daunting to set up, though. In traditional server-to-server communication in Domino, it suffices to just add a server's name or group to the "Trusted Servers" list in the server doc of the server being accessed. Then, it will trust any old name that the app-housing server sends along. Generally, this will be a user that was already authenticated with Domino, like using an XPages-supplied Session object to access a remote server, but it doesn't have to be.

Act-as-User, though, uses OpenID Connect with an authentication server to do the actual authentication, and then the Proton task is told to accept those tokens as legal for acting on behalf of a given user. While you could in theory write your own OIDC server that dispenses tokens for any user name willy-nilly, in practice you'll almost definitely use an existing implementation. In the default case, that implementation will in turn almost definitely be IAM, which is an OAuth broker service with the AppDev Pack that stores its configuration data in an NSF and reads users from Domino (or elsewhere) via LDAP.

IAM, though, isn't special in this regard. It's packaged with the ADP, sure, but the way it deals with tokens is entirely standards-based. That means that any compatible implementation can fill this role, and, since I'd heard great things about Keycloak, I figured I'd give that a shot. With some gracious assistance from Heiko Voigt, I was able to get this working - I don't want to steal his future thunder by going into too much detail, but honestly the main hurdles for me were just around learning how Keycloak works. Once you have the concepts down, you basically plug in the Keycloak client details in for the IAM ones in the same configuration.

With that set up, you can feed your user token from the web app into your Proton API calls, and then your actions will be running as your user in the same way as if you were authenticated in an on-server XPages or other app. The way this manifests in the Java API is a little weird, but it works well enough: almost all Proton calls have a varargs portion at the end of their method signature that takes OptionalArg instances. One such type is OptionalAccessToken, which takes your auth token as a String. I have a method that will stitch in an access token when present. That gets passed in when making calls, such as to read documents:

OptionalItemNames itemNamesArg = new OptionalItemNames(itemNames);
OptionalStart startArg = new OptionalStart((int)skip);
OptionalCount countArg = new OptionalCount(limit < 1 ? Integer.MAX_VALUE : (int)limit);
			
List<Document> docs = database.readDocuments(
	dql,
	composeArgs(
		itemNamesArg,
		startArg,
		countArg
	)
).get();
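The composeArgs method there isn't shown; a minimal sketch of the idea (plus an import of java.util.Arrays) would be a helper inside the manager class along these lines - the accessToken field is an assumption for however the driver holds onto the user's token:

	private OptionalArg[] composeArgs(OptionalArg... args) {
		String token = this.accessToken; // assumed to be set per-request when Act-as-User is in play
		if (token == null || token.isEmpty()) {
			return args;
		}
		// Append an OptionalAccessToken so the operation runs on behalf of the user
		OptionalArg[] result = Arrays.copyOf(args, args.length + 1);
		result[args.length] = new OptionalAccessToken(token);
		return result;
	}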

App Authentication

Okay, so that's what you do when you have a setup and a token, but that leaves the process of the user actually acquiring the token. From the user's perspective, this will generally take the form of doing an "OAuth dance" where, when the user tries to access a protected resource, they're sent over to Keycloak to authenticate, which then sends them back to the app with token in hand.

There are a lot of ways one might accomplish this, varying language-to-language, framework-to-framework, and server-to-server. You will be shocked to learn that I'm using Open Liberty for my app here, and that comes with built-in support for OIDC.

Before I go further, I should put forth a big caveat: I'm really muddling through with this one for the time being. The setup I have only kind of works, and is clearly not the ideal one, but it was enough to make the connection happen. I'm not sure if the right path long-term is to keep using this built-in feature or to switch to either a different built-in option or another library entirely. So... absolutely do not take anything here as advice in the correct way to do this.

Anyway, with that out of the way, you can configure your Liberty server to talk to your Keycloak server (or IAM, probably, but I didn't do that):

<openidConnectClient id="client01"
    clientId="liberty-tester"
    clientSecret="some-client-secret"
    discoveryEndpointUrl="https://some.keycloak.server/auth/realms/master/.well-known/openid-configuration"
    signatureAlgorithm="RS256"
    sslRef="httpSsl"
    accessTokenInLtpaCookie="true"
    userIdentifier="preferred_username"
    groupIdentifier="groups">
</openidConnectClient>
<ssl id="httpSsl" trustDefaultCerts="true" keyStoreRef="myKeyStore" trustStoreRef="OIDCTrustStore" />
<keyStore id="myKeyStore" password="super-secure-password1" type="PKCS12" location="${server.config.dir}/BasicKeyStore.p12" />
<keyStore id="OIDCTrustStore" password="super-secure-password2" type="PKCS12" location="${server.config.dir}/OIDCTrustStore.p12" />

The keyStores there contain appropriate certificate chains for the TLS connection to your Keycloak server, while the clientId and clientSecret match what you configure/generate on Keycloak for this new client app.

What got me able to actually use this token for downstream access was the accessTokenInLtpaCookie property. If you set that, then your HttpServletRequest objects after the initial one will have an oidc_access_token property on them containing your token in the format that Proton needs. So that's where the ContextDatabaseSupplier in the previous post got it:

@Produces
public AccessTokenSupplier getAccessToken() {
	return () -> {
		HttpServletRequest request = CDI.current().select(HttpServletRequest.class).get();
		return (String)request.getAttribute("oidc_access_token");
	};
}

This is also one of the parts that makes me think I'm not quite doing this ideally. It's weird that the token shows up only in requests after the first, though that wouldn't be an impediment in a lot of app types. It's also very unfortunate to have the app use a server-specific property like that.

Fortunately, Jakarta Security 3.0 sprouted official OIDC support, though the build of Open Liberty I was using didn't quite have all the pieces in place for that - reasonable, considering Jakarta EE 10 only officially came out yesterday and this was weeks ago. It looks like that may provide the token in a contextual object, so I'll have to give that a shot once support settles in.

With my setup in place, janky as it may be, I'm able to access a resource (e.g. a JAX-RS endpoint) marked with @RolesAllowed("uma_authorization") and the server will automatically kick me over to Keycloak and then accept the token when I get back. Then, I can pick that up from the request attributes and use it for Domino data access. Keycloak is getting its user directory from Domino via LDAP in the same way as IAM usually would, but, like IAM with AD, it could be configured to use different user directories. I don't know that I'll want to do that, but it's good to know.
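For completeness, the protected resource on the Liberty side is ordinary Jakarta REST plus common annotations - a small sketch, with the path and output invented for illustration and the role name matching the setup above:

import jakarta.annotation.security.RolesAllowed;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.Context;
import jakarta.ws.rs.core.MediaType;

@Path("secure/whoami")
public class WhoAmIResource {
	@GET
	@Produces(MediaType.TEXT_PLAIN)
	@RolesAllowed("uma_authorization")
	public String get(@Context HttpServletRequest request) {
		// Liberty has already performed the OIDC dance by this point; the token for
		// downstream Proton calls rides along as a request attribute
		String token = (String)request.getAttribute("oidc_access_token");
		return "Authenticated as " + request.getUserPrincipal().getName()
			+ (token == null ? ", no access token yet" : ", access token present");
	}
}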

Conclusion

Like the original driver itself, this was mostly an educational exercise for me. I don't currently have any requirements to use the AppDev Pack or OIDC/Keycloak, but I'd wanted to dip my toes in both for a while now, and I'm pleased that I came out successful. I imagine that I'll have an occasion to implement something like this eventually. It may not be the same specific parts, but the core concepts are common, like in Keep's JWT and OAuth support. It's a neat setup, and it's definitely worth doing something similar if you have some experimentation time on your hands.

Jakarta NoSQL Driver for the AppDev Pack, Part 1

Fri Sep 09 10:20:28 EDT 2022

Tags: java jnosql
  1. Jakarta NoSQL Driver for the AppDev Pack, Part 1
  2. Jakarta NoSQL Driver for the AppDev Pack, Part 2

Though the bulk of the work I've been doing for the XPages Jakarta EE project is to bring JEE technologies to Domino, the NoSQL driver has been designed to lead a double life: it works in an XPages context, but it's written to not have any XPages dependencies. One reason for this is that I want it to be usable if you use, for example, the Open Liberty runtime project to side-car apps on a Domino server but still use Notes.jar for data access.

Another reason for its organization, though, is that I intended for the driver to be portable across implementations. The driver itself is split into a main bundle and a ".lsxbe" implementation bundle. My original thought was to make that ready for a JNX or Domino JNA implementation, but it's pretty flexible.

Proton

And that brings me to the topic of this post: the AppDev Pack. I'd actually not used the AppDev Pack at all until I started in on this little project. While I like the idea, it at first only had a JS client, and then even with a Java client lacked some important capabilities I'd need for most of what I do. That's still the case, unfortunately, but my general annoyance with the problems of developing on top of a local Notes runtime tipped the scales to get me to investigate it.

In essence, the core part of the AppDev Pack - the Proton addin - is kind of like a modern take on the CORBA driver for Notes.jar, in that it provides a remote way to make API calls on the server that are "low-level-ish" and ideally higher-performance than REST APIs. Since it's remote, it imposes no requirements on the environment of the app making the calls, and could in theory work with any language that has a gRPC library (though in practice HCL has only ever shipped JS and Java clients without providing the gRPC spec to generate others).

The API that Proton provides is heavily geared around batch operations, focusing on using DQL for querying. This suits my needs well, as Jakarta NoSQL essentially assumes you'll have something like DQL available to do arbitrary document queries. It's also geared around requesting specific items from documents to return, which further suits me well - for efficiency purposes, I wrote code into the LSXBE driver to only read desired items anyway, so that adapted naturally.

The Driver

So, over the long weekend here, I set out to write a driver suitable for use in standalone JEE applications to access Domino via Proton. The goal here is to make it so that code written to target the LSXBE driver will work the same way with this Proton-based one, with the only differences being limitations in what Proton provides and a switch from providing a lotus.domino.Database context to providing a com.hcl.domino.db.model.Database one.

And, though the limitation list there is a bit lengthy, I accomplished the main part of my goal there. Since it shares a lot of code in common with the LSXBE driver, the implementation here is pretty small. The bulk of the code happens in ProtonDocumentCollectionManager, which is the implementation of all the raw JNoSQL primitives - insert, update, select, etc. - and then ProtonEntityConverter, which is the utility class that translates between Proton's concept of Document and JNoSQL's concept of DocumentEntity.

The Proton API is... very weird from a Java perspective. Some of it bears a little similarity to the NIO Files API if you squint, but otherwise it's just an odd duck. Fortunately, a consumer of this driver doesn't have to care about that beyond the initial connection to the server. The specifics of actually connecting to a server and opening a database are the same as with normal Proton use - you do that and then provide a CDI bean to hand off the Database object to the driver:

@RequestScoped
public class ContextDatabaseSupplier {
	// This type of supplier is required
	@Produces
	public DatabaseSupplier get() {
		// Here, the `Database` object is set in a `ServletRequestListener`, though it likely could be app-wide
		return () -> {
			HttpServletRequest request = CDI.current().select(HttpServletRequest.class).get();
			return (Database)request.getAttribute(DatabaseContextListener.ATTR_DB);
		};
	}
	
	// This supplier is optional - leaving it out or returning null will cause the driver to skip performing Act-As-User
	@Produces
	public AccessTokenSupplier getAccessToken() {
		return () -> {
			HttpServletRequest request = CDI.current().select(HttpServletRequest.class).get();
			return (String)request.getAttribute("oidc_access_token");
		};
	}
}

Next Steps

Now, admittedly, I'm not sure how much more time I'm going to put into this driver. Though there are a few enhancements that could be made - attachments and rich text, namely - I'm probably not going to use this myself in the near future. The immediate cases I have where I'd want to use this would end up involving things like views that aren't currently accessible over Proton, and so it'd really just be speculative development. Still, I wanted to make this to kick the tires a bit and see what's possible, and it's got my gears turning.

There'll also be a bit more I want to go over in the next post. In the above code block, I mention Act-As-User tokens; the driver will use those when provided, and the driver itself doesn't care how you acquire them, but it's an interesting topic on its own. During my testing, I was able to get a Liberty-hosted app performing OIDC authentication for its own login and then using that token for access to Proton, and I think that that will deserve some expansion on its own.

In the mean time, if you're curious, the driver is available up on GitHub:

https://github.com/OpenNTF/jnosql-driver-proton

The artifact is published to OpenNTF's Maven repository, though you'll need to follow the instructions in the README there to add the non-redistributable Proton client library to your local Maven repo.

WebAuthn/Passkey Login With JavaSapi

Tue Jul 05 14:05:23 EDT 2022

Tags: java webauthn
  1. Poking Around With JavaSapi
  2. Per-NSF-Scoped JWT Authorization With JavaSapi
  3. WebAuthn/Passkey Login With JavaSapi

Over the weekend, I got a wild hair to try something that had been percolating in my mind recently: get Passkeys, Apple's term for WebAuthn/FIDO technologies, working with the new Safari against a Domino server. And, though some aspects were pretty non-obvious (it turns out there's a LOT of binary-data stuff in JavaScript now), I got it working thanks to a few tools:

  1. JavaSapi, the "yeah, I know it's not supported" DSAPI Java peer
  2. webauthn.guide, a handy resource
  3. java-webauthn-server, a Java implementation of the WebAuthn server components

Definition

First off: what is all this? All the terms - Passkey, FIDO, WebAuthn - for our purposes here deal with a mechanism for doing public/private key authentication with a remote server. In a sense, it's essentially a web version of what you do with Notes IDs or SSH keys, where the only actual user-provided authentication (a password, biometric input, or the like) happens locally, and then the local machine uses its private key combined with the server's knowledge of the public key to prove its identity.

WebAuthn accomplishes this with the navigator.credentials object in the browser, which handles dealing with the local security storage. You pass objects between this store and the server, first to register your local key and then subsequently to log in with it. There are a lot of details I'm glossing over here, in large part because I don't know the specifics myself, but that covers the gist of it.

While browsers have technically had similar cryptographic authentication for a very long time by way of client SSL certificates, this set of technologies makes great strides across the board - enough to make it actually practical to deploy for normal use. With Safari, anything with Touch ID or Face ID can participate in this, while on other platforms and browsers you can use either a security key or similar cryptographic storage. Since I have a little MacBook Air with Touch ID, I went with that.

The Flow

Before getting too much into the specifics, I'll talk about the flow. The general idea with WebAuthn is that the user gets to a point where they're creating a new account (where password doesn't yet matter) or logged in via another mechanism, and then have their browser generate the keypair. In this case, I logged in using the normal Domino login form.

Once in, I have a button on the page that will request that the browser create the keypair. This first makes a request to the server for a challenge object appropriate for the user, then creates the keypair locally, and then POSTs the results back to the server for verification. To the user, that looks like:

WebAuthn key creation in Safari

That keypair will remain in the local device's storage - in this case, Keychain, synced via iCloud in upcoming versions.

Then, I have another button that performs a challenge. In practice, this challenge would be configured on the login page, but it's on the same page for this proof-of-concept. The button causes the browser to request from the server a list of acceptable public keys to go with a given username, and then prompts the user to verify that they want to use a matching one:

WebAuthn assertion in Safari

The implementation details of what to do on success and failure are kind of up to you. Here, I ended up storing active authentication sessions on the server in a Map keyed by the SessionID cookie value and pointing to the user name, but all options are open.
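
As a rough sketch of what that association can look like (the class name here is hypothetical, though the method names mirror the WebauthnManager calls used later in this post), it really just needs to be a thread-safe map:

import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

public class AuthenticatedSessionStore {
	private final Map<String, String> sessionsToUsers = new ConcurrentHashMap<>();
	
	// Called after a successful assertion to associate the browser's cookie value with a user name
	public void registerAuthenticatedUser(String sessionId, String userName) {
		sessionsToUsers.put(sessionId, userName);
	}
	
	// Called later (for example, from the JavaSapi service below) to look up a trusted user
	public Optional<String> getAuthenticatedUser(String sessionId) {
		return Optional.ofNullable(sessionsToUsers.get(sessionId));
	}
}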

Dance Implementation

As I mentioned above, my main tool on the server side was java-webauthn-server, which handles a lot of the details of the process. For this experiment, I cribbed their InMemoryRegistrationStorage class, but a real implementation would presumably store this information in an NSF.

For the client side, there's an npm module to handle the specifics, but I was doing this all on a single HTML page and so I just borrowed from it in pieces (particularly the Base64/ByteArray stuff).

On the server, I created an OSGi plugin that contained a com.ibm.pvc.webcontainer.application webapp as well as the JavaSapi service: since they're both in the same OSGi bundle, that meant I could share the same classes and memory space, without having to worry about coordination between two very-distinct parts (as would be the case with DSAPI).

The webapp part itself actually does more of the work, and does so by way of four Servlets: one for the "create keys" options, one for registering a created key, one for the "I want to start a login" pre-flight, and finally one for actually handling the final authentication. As an implementation note, getting this working involved removing guava.jar from the classpath temporarily, as the library in question made heavy use of a version slightly newer than what Domino ships with.

Key Creation

The first Servlet assumes that the user is logged in and then provides a JSON object to the client describing what sort of keypair it should create:

Session session = ContextInfo.getUserSession();
			
PublicKeyCredentialCreationOptions request = WebauthnManager.instance.getRelyingParty()
	.startRegistration(
		StartRegistrationOptions.builder()
			// Creates or retrieves an in-memory object with an associated random "handle" value
		    .user(WebauthnManager.instance.locateUser(session))
		    .build()
	);

// Store in the HTTP session for later verification. Could also be done via cookie or other pairing
req.getSession(true).setAttribute(WebauthnManager.REQUEST_KEY, request);
	
String json = request.toCredentialsCreateJson();
resp.setStatus(200);
resp.setHeader("Content-Type", "application/json"); //$NON-NLS-1$ //$NON-NLS-2$
resp.getOutputStream().write(String.valueOf(json).getBytes());

The client side retrieves these options, parses some Base64'd parts to binary arrays (this is what the npm module would do), and then sends that back to the server to create the registration:

fetch("/webauthn/creationOptions", { "credentials": "same-origin" })
	.then(res => res.json())
	.then(json => {
		json.publicKey.challenge = base64urlToBuffer(json.publicKey.challenge, c => c.charCodeAt(0));
		json.publicKey.user.id = base64urlToBuffer(json.publicKey.user.id, c => c.charCodeAt(0));
		if(json.publicKey.excludeCredentials) {
			for(var i = 0; i < json.publicKey.excludeCredentials.length; i++) {
				var cred = json.publicKey.excludeCredentials[i];
				cred.id = base64urlToBuffer(cred.id);
			}
		}
		navigator.credentials.create(json)
			.then(credential => {
				// Create a JSON-friendly payload to send to the server
				const payload = {
					type: credential.type,
					id: credential.id,
					response: {
						attestationObject: bufferToBase64url(credential.response.attestationObject),
						clientDataJSON: bufferToBase64url(credential.response.clientDataJSON)
					},
					clientExtensionResults: credential.getClientExtensionResults()
				}
				fetch("/webauthn/create", {
					method: "POST",
					body: JSON.stringify(payload)
				})
			})
	})

The code for the second call on the server parses out the POST'd JSON and stores the registration in the in-memory storage (which would properly be an NSF):

String json = StreamUtil.readString(req.getReader());
PublicKeyCredential<AuthenticatorAttestationResponse, ClientRegistrationExtensionOutputs> pkc = PublicKeyCredential
	.parseRegistrationResponseJson(json);

// Retrieve the request we had stored in the session earlier
PublicKeyCredentialCreationOptions request = (PublicKeyCredentialCreationOptions) req.getSession(true)
	.getAttribute(WebauthnManager.REQUEST_KEY);

// Perform registration, which verifies that the incoming JSON matches the initiated request
RelyingParty rp = WebauthnManager.instance.getRelyingParty();
RegistrationResult result = rp
	.finishRegistration(FinishRegistrationOptions.builder().request(request).response(pkc).build());

// Gather the registration information to store in the server's credential repository
DominoRegistrationStorage repo = WebauthnManager.instance.getRepository();
CredentialRegistration reg = new CredentialRegistration();
reg.setAttestationMetadata(Optional.ofNullable(pkc.getResponse().getAttestation()));
reg.setUserIdentity(request.getUser());
reg.setRegistrationTime(Instant.now());
RegisteredCredential credential = RegisteredCredential.builder()
	.credentialId(result.getKeyId().getId())
	.userHandle(request.getUser().getId())
	.publicKeyCose(result.getPublicKeyCose())
	.signatureCount(result.getSignatureCount())
	.build();
reg.setCredential(credential);
reg.setTransports(pkc.getResponse().getTransports());

Session session = ContextInfo.getUserSession();
repo.addRegistrationByUsername(session.getEffectiveUserName(), reg);

Login/Assertion

Once the client has a keypair and the server knows about the public key, the client can ask the server for what it would need to log in as a given name, and then use that information to make a second call. The dance on the client side looks like:

var un = /* fetch the username from somewhere, such as a login form */
fetch("/webauthn/assertionRequest?un=" + encodeURIComponent(un))
	.then(res => res.json())
	.then(json => {
		json.publicKey.challenge = base64urlToBuffer(json.publicKey.challenge, c => c.charCodeAt(0));
		if(json.publicKey.allowCredentials) {
			for(var i = 0; i < json.publicKey.allowCredentials.length; i++) {
				var cred = json.publicKey.allowCredentials[i];
				cred.id = base64urlToBuffer(cred.id);
			}
		}
		navigator.credentials.get(json)
			.then(credential => {
				const payload = {
					type: credential.type,
					id: credential.id,
					response: {
						authenticatorData: bufferToBase64url(credential.response.authenticatorData),
						clientDataJSON: bufferToBase64url(credential.response.clientDataJSON),
						signature: bufferToBase64url(credential.response.signature),
						userHandle: bufferToBase64url(credential.response.userHandle)
					},
					clientExtensionResults: credential.getClientExtensionResults()
				}
				fetch("/webauthn/assertion", {
					method: "POST",
					body: JSON.stringify(payload),
					credentials: "same-origin"
				})
			})
	})

That's pretty similar to the middle code block above, really, and contains the same sort of ferrying to and from transport-friendly JSON objects and native credential objects.

On the server side, the first Servlet - which looks up the available public keys for a user name - is comparatively simple:

String userName = req.getParameter("un"); //$NON-NLS-1$
AssertionRequest request = WebauthnManager.instance.getRelyingParty()
	.startAssertion(
		StartAssertionOptions.builder()
			.username(userName)
			.build()
	);

// Stash the current assertion request
req.getSession(true).setAttribute(WebauthnManager.ASSERTION_REQUEST_KEY, request);

String json = request.toCredentialsGetJson();
resp.setStatus(200);
resp.setHeader("Content-Type", "application/json"); //$NON-NLS-1$ //$NON-NLS-2$
resp.getOutputStream().write(String.valueOf(json).getBytes());

The final Servlet handles parsing out the incoming assertion (login) and stashing it in memory as associated with the "SessionID" cookie. That value could be anything that the browser will send with its requests, but "SessionID" works here.

String json = StreamUtil.readString(req.getReader());
PublicKeyCredential<AuthenticatorAssertionResponse, ClientAssertionExtensionOutputs> pkc =
	    PublicKeyCredential.parseAssertionResponseJson(json);

// Retrieve the request we had stored in the session earlier
AssertionRequest request = (AssertionRequest) req.getSession(true).getAttribute(WebauthnManager.ASSERTION_REQUEST_KEY);

// Perform verification, which will ensure that the signed value matches the public key and challenge
RelyingParty rp = WebauthnManager.instance.getRelyingParty();
AssertionResult result = rp.finishAssertion(FinishAssertionOptions.builder()
	.request(request)
	.response(pkc)
	.build());

if(result.isSuccess()) {
	// Keep track of logins
	WebauthnManager.instance.getRepository().updateSignatureCount(result);
	
	// Find the session cookie, which they will have by now
	String sessionId = Arrays.stream(req.getCookies())
		.filter(c -> "SessionID".equalsIgnoreCase(c.getName()))
		.map(Cookie::getValue)
		.findFirst()
		.get();
	WebauthnManager.instance.registerAuthenticatedUser(sessionId, result.getUsername());
}

Trusting the Key

At this point, there's a dance between the client and server that results in the client being able to perform secure, password-less authentication with the server and the server knowing about the association, and so now the remaining job is just getting the server to actually trust this assertion. That's where JavaSapi comes in.

Above, I used the "SessionID" cookie as a mechanism to store an in-memory association between a browser cookie (which is independent of authentication) to a trusted user. Then, I made a JavaSapi service that looks for this and tries to find an authenticated user in its authenticate method:

@Override
public int authenticate(IJavaSapiHttpContextAdapter context) {
	Cookie[] cookies = context.getRequest().getCookies();
	if(cookies != null) {
		Optional<Cookie> cookie = Arrays.stream(context.getRequest().getCookies())
			.filter(c -> "SessionID".equalsIgnoreCase(c.getName())) //$NON-NLS-1$
			.findFirst();
		if(cookie.isPresent()) {
			String sessionId = cookie.get().getValue();
			Optional<String> user = WebauthnManager.instance.getAuthenticatedUser(sessionId);
			if(user.isPresent()) {
				context.getRequest().setAuthenticatedUserName(user.get(), "WebAuthn"); //$NON-NLS-1$
				return HTEXTENSION_REQUEST_AUTHENTICATED;
			}
		}
	}
	
	return HTEXTENSION_EVENT_DECLINED;
}

And that's all there is to it on the JavaSapi side. Because it shares the same active memory space as the webapp doing the dance, it can use the same WebauthnManager instance to read the in-memory association. You could in theory do this another way with DSAPI - storing the values in an NSF or some other mechanism that can be shared - but this is much, much simpler to do.

Conclusion

This was a neat little project, and it was a good way to learn a bit about some of the native browser objects and data types that I haven't had occasion to work with before. I think this is also something that should be in the product; if you agree, go vote for the ideas from a few years ago.

Rewriting The OpenNTF Site With Jakarta EE: UI

Mon Jun 27 15:06:20 EDT 2022

  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: UI

In what may be the last in this series for a bit, I'll talk about the current approach I'm taking for the UI for the new OpenNTF web site. This post will also tread ground I've covered before, when talking about the Jakarta MVC framework and JSP, but it never hurts to reinforce the pertinent aspects.

MVC

The entrypoint for the UI is Jakarta MVC, which is a framework that sits on top of JAX-RS. Unlike JSF or XPages, it leaves most app-structure duties to other components. This is due both to its young age (JSF predates it and gave rise to several of the things we've discussed so far) and its intent. It's "action-based": you define an endpoint that takes an incoming HTTP request and produces a response, generally without any server-side UI state. This is as opposed to JSF/XPages, where the core concept is the page you're working with, and the page state generally exists across multiple requests.

Your starting point with MVC is a JAX-RS REST service marked with @Controller:

package webapp.controller;

import java.text.MessageFormat;

import bean.EncoderBean;
import jakarta.inject.Inject;
import jakarta.mvc.Controller;
import jakarta.mvc.Models;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;
import model.home.Page;

@Path("/pages")
public class PagesController {
    
    @Inject
    Models models;
    
    @Inject
    Page.Repository pageRepository;
    
    @Inject
    EncoderBean encoderBean;

    @Path("{pageId}")
    @GET
    @Produces(MediaType.TEXT_HTML)
    @Controller
    public String get(@PathParam("pageId") String pageId) {
        String key = encoderBean.cleanPageId(pageId);
        Page page = pageRepository.findBySubject(key)
            .orElseThrow(() -> new NotFoundException(MessageFormat.format("Unable to find page for ID: {0}", key)));
        models.put("page", page); //$NON-NLS-1$
        return "page.jsp"; //$NON-NLS-1$
    }
}

In the NSF, this will respond to requests like /foo.nsf/xsp/app/pages/Some_Page_Name. Most of what is going on here is the same sort of thing we saw with normal REST services: the @Path, @GET, @Produces, and @PathParam are all normal JAX-RS, while @Inject uses the same CDI scaffolding I talked about in the last post.

MVC adds two things here: @Inject Models models and @Controller.

The Models object is conceptually a Map that houses variables you can populate to be accessible via EL on the rendered page. You can think of it like viewScope or requestScope in XPages, populated in something like the beforePageLoad phase. Here, I use the Models object to store the Page object I look up with JNoSQL.

The @Controller annotation marks a method or a class as participating in the MVC lifecycle. When placed on a class, it applies to all methods on the class, while placing it on a method specifically allows you to mix MVC and "normal" REST resources in the same class. Doing that would be useful if you want to, for example, provide HTML responses to browsers and JSON responses to API clients at the same resource URL.
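
As a hypothetical sketch of that mixing (reusing the Page.Repository from above, with the lookup simplified), the same path can serve HTML to browsers via MVC and JSON to API clients via plain JAX-RS, with the container picking a method based on the request's Accept header:

import jakarta.inject.Inject;
import jakarta.mvc.Controller;
import jakarta.mvc.Models;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;
import model.home.Page;

@Path("/mixedPages")
public class MixedPagesController {
	@Inject
	Models models;
	
	@Inject
	Page.Repository pageRepository;
	
	// MVC: renders page.jsp for browsers
	@GET
	@Path("{pageId}")
	@Produces(MediaType.TEXT_HTML)
	@Controller
	public String getHtml(@PathParam("pageId") String pageId) {
		Page page = pageRepository.findBySubject(pageId)
			.orElseThrow(() -> new NotFoundException(pageId));
		models.put("page", page);
		return "page.jsp";
	}
	
	// Plain JAX-RS: no @Controller, so the Page is serialized to JSON
	@GET
	@Path("{pageId}")
	@Produces(MediaType.APPLICATION_JSON)
	public Page getJson(@PathParam("pageId") String pageId) {
		return pageRepository.findBySubject(pageId)
			.orElseThrow(() -> new NotFoundException(pageId));
	}
}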

When a resource method is marked for MVC use, it can return a string that represents either a page to render or a redirection in the form "redirect:some/resource". Here, it's hard-coded to use "page.jsp", but in another situation it could programmatically switch between different pages based on the content of the request or state of the app.

While this looks fairly clean on its own, it's important to bear in mind both the strengths and weaknesses of this approach. I think it will work here, as it does for my blog, because the OpenNTF site isn't heavy on interactive forms. When dealing with forms in MVC, you'll have to have another endpoint to listen for @POST (or other verbs with a shim), process that request from scratch, and return a new page. For example, from the XPages JEE example app:

@Path("create")
@POST
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Controller
public String createPerson(
        @FormParam("firstName") @NotEmpty String firstName,
        @FormParam("lastName") String lastName,
        @FormParam("birthday") String birthday,
        @FormParam("favoriteTime") String favoriteTime,
        @FormParam("added") String added,
        @FormParam("customProperty") String customProperty
) {
    Person person = new Person();
    composePerson(person, firstName, lastName, birthday, favoriteTime, added, customProperty);
    
    personRepository.save(person);
    return "redirect:nosql/list";
}

That's already fiddlier than the XPages version, where you'd bind fields right to bean/document properties, and it gets potentially more complicated from there. In general, the more form-based your app is, the better a fit XPages/JSF is.

JSP

While MVC isn't intrinsically tied to JSP (it ships with several view engine hooks and you can write your own), JSP has the advantage of being built in to all Java webapp servers and is very well fit to purpose. When writing JSPs for MVC, the default location is to put them in WEB-INF/views, which is beneath WebContent in an NSF project:

Screenshot of JSPs in an NSF

The "tags" there are the general equivalent of XPages Custom Controls, and their presence in WEB-INF/tags is convention. An example page (the one used above) will tend to look something like this:

<%@page contentType="text/html" pageEncoding="UTF-8" trimDirectiveWhitespaces="true" session="false" %>
<%@taglib prefix="t" tagdir="/WEB-INF/tags" %>
<%@taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core" %>
<%@taglib prefix="fn" uri="http://java.sun.com/jsp/jstl/functions" %>
<t:layout>
    <turbo-frame id="page-content-${page.linkId}">
        <div>
            ${page.html}
        </div>
        
        <c:if test="${not empty page.childPageIds}">
            <div class="tab-container">
                <c:forEach items="${page.cleanChildPageIds}" var="pageId" varStatus="pageLoop">
                    <input type="radio" id="tab${pageLoop.index}" name="tab-group" ${pageLoop.index == 0 ? 'checked="checked"' : ''} />
                    <label for="tab${pageLoop.index}">${fn:escapeXml(encoder.cleanPageId(pageId))}</label>
                </c:forEach>
                    
                <div class="tabs">
                    <c:forEach items="${page.cleanChildPageIds}" var="pageId">
                        <turbo-frame id="page-content-${pageId}" src="xsp/app/pages/${encoder.urlEncode(pageId)}" class="tab" loading="lazy">
                        </turbo-frame>
                    </c:forEach>
                </div>
            </div>
        </c:if>
    </turbo-frame>
</t:layout>

There are, by shared lineage and concept, a lot of similarities with an XPage here. The first four lines of preamble boilerplate are pretty similar to the kind of stuff you'd see in an <xp:view/> element to set up your namespaces and page options. The tag prefixing is the same idea, where <t:layout/> refers to the "layout" custom tag in the NSF and <c:forEach/> refers to a core control tag that ships with the standard tag library, JSTL. The <turbo-frame/> business isn't JSP - I'll deal with that later.

The bits of EL here - all wrapped in ${...} - are from Expression Language 4.0, which is the current version of XPages's aging EL. On this page, the expressions are able to resolve variables that we explicitly put in the Models object, such as page, as well as CDI beans with the @Named annotation, such as encoderBean. There are also a number of implicit objects like request, but they're not used here.
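
For reference, the encoder variable there resolves to a @Named CDI bean. A minimal sketch of what such a bean could look like - the method bodies are illustrative assumptions, not the app's real implementation - is:

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Named;

@Named("encoder")
@ApplicationScoped
public class EncoderBean {
	// Assumption: normalize a page ID into a URL- and display-friendly key
	public String cleanPageId(String pageId) {
		return pageId == null ? "" : pageId.trim().replace(' ', '_');
	}
	
	public String urlEncode(String value) {
		try {
			return URLEncoder.encode(value, "UTF-8"); //$NON-NLS-1$
		} catch (UnsupportedEncodingException e) {
			throw new RuntimeException(e);
		}
	}
}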

In general, this is safely thought of as an XPage where you make everything load-time-bound and set viewState="nostate". The same sorts of concepts are all there, but there's no concept of a persistent component that you interact with. Any links, buttons, and scripts will all go to the server as a fresh request, not modifying an existing page. You can work with application and session scopes, but there's no "view" scope.

Hotwired Turbo

Though this app doesn't have much need for a lot of XPages's capabilities, I do like a few components even for a mostly "read-only" app. In particular, the <xe:djContentPane/> and <xe:djTabContainer/> controls have the delightful capability of deferring evaluation of their contents to later requests. This is a powerful way to speed up initial page load and, in the case of the tab container, skip needing to render parts of the page the user never uses.

For this and a couple other uses, I'm a fan of Hotwired Turbo, which is a library that grew out of 37 Signals's Rails-based development. The goal of Turbo and the other Hotwired components is to keep the benefits of server-based HTML rendering while mixing in a lot of the niceties of JS-run apps. There are two things that Turbo is doing so far in this app.

The first capability is dubbed "Turbo Drive", and it's sort of a freebie: you enable it for your app, tell it what is considered the app's base URL, and then it will turn any in-app links into "partial refresh" links: it downloads the page in the background and replaces just the changed part on the page. Though this is technically doing more work than a normal browser navigation, it ends up being faster for the user interface. And, since it also updates the URL to match the destination page and doesn't require manual modification of links, it's a drop-in upgrade that will also degrade gracefully if JavaScript isn't enabled.

The second capability is <turbo-frame/> up there, and it takes a bit more buy-in to the JS framework in your app design. The way I'm using Turbo Frames here is to support the page structure of OpenNTF, which is geared around a "primary" page as well as zero or more referenced pages that show up in tabs. Here, I'm buying in to Turbo Frames by surrounding the whole page in a <turbo-frame/> element with an id using the page's key, and then I reference each "sub-page" in a tab with that same ID. When loading the frame, Turbo makes a call to the src page, finds the element with the matching id value, and drops it in place inside the main document. The loading="lazy" parameter means that it defers loading until the frame is visible in the browser, which is handy when using the HTML/CSS-based tabs I have here.

I've been using this library for a while now, and I've been quite pleased. Though it was created for use with Rails, the design is independent of the server implementation, and the idioms fit perfectly with this sort of Java app too.

Conclusion

I think that wraps it up for now. As things progress, I may have more to add to this series, but my hope is that the app doesn't have to get much more complicated than the sort of stuff seen in this series. There are certainly big parts to tackle (like creating and managing projects), but I plan to do that by composing these elements. I remain delighted with this mode of NSF-based app development, and look forward to writing more clean, semi-declarative code in this vein.

Rewriting The OpenNTF Site With Jakarta EE, Part 1

Sun Jun 19 10:13:32 EDT 2022

Tags: jakartaee java
  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: UI

The design for the OpenNTF home page has been with us for a little while now and has served us pretty well. It looks good and covers the bases it needs to. However, it's getting a little long in the tooth and, more importantly, doesn't cover some capabilities that we're thinking of adding.

While we could potentially expand the current one, this provides a good opportunity for a clean start. I had actually started taking a swing at this a year and a half ago, taking the tack that I'd make a webapp and deploy it using the Domino Open Liberty Runtime. While that approach would put all technologies on the table, it'd certainly be weirder to future maintainers than an app inside an NSF (at least for now).

So I decided in the past few weeks to pick the project back up and move it into an NSF via the XPages Jakarta EE Support project. I can't say for sure whether I'll actually complete the project, but it'll regardless be a good exercise and has proven to be an excellent way to find needed features to implement.

I figure it'll also be useful to keep something of a travelogue here as I go, making posts periodically about what I've implemented recently.

The UI Toolkit

The original form of this project used MVC and JSP for the UI layer. Now that I was working in an NSF, I could readily use XPages, but for now I've decided to stick with the MVC approach. While it will make me have to solve some problems I wouldn't necessarily have to solve otherwise (like file uploads), it remains an extremely-pleasant way to write applications. I am also not constrained to this: since the vast majority of the logic is in Java beans and controller classes, switching the UI front-end would not be onerous. Also, I could theoretically mix JSP, JSF, XPages, and static HTML together in the app if I end up so inclined.

In the original app (as in this blog), I made use of WebJars to bring in JavaScript dependencies, namely Hotwire Turbo to speed up in-site navigation and use Turbo Frames. Since the NSF app in Designer doesn't have the Maven dependency mechanism the original app did, I just ended up copying the contents of the JAR into WebContent. That gave me a new itch to scratch, though: I'd love to be able to have META-INF/resources files in classpath JARs picked up by the runtime and made available, lowering the number of design elements present in the NSF.

The Data Backend

The primary benefit of this project so far has been forcing me to flesh out the Jakarta NoSQL driver in the JEE support project. I had kind of known hypothetically what features would be useful, but the best way to do this kind of thing is often to work with the tool until you hit a specific problem, and then solve that. So far, it's forced me to:

  • Implement the view support in my previous post
  • Add attachment support for documents, since we'll need to upload and download project releases
  • Improve handling of rich text and MIME, though this also has more room to grow
  • Switch the returned Streams from the driver to be lazy-loading, meaning that not all documents/entries have to be read if the calling code stops reading the results partway through
  • Add the ability to use custom property types with readers/writers defined in the NSF

Together, these improvements have let me have almost no lotus.domino code in the app. The only parts left are a bean for formatting Notes-style names (which I may want to make a framework service anyway) and a bean for providing access to the various associated databases used by the app. Not too shabby! The app is still tied to Domino by way of using the Domino-specific extensions to JNoSQL, but the programming model is significantly better and the amount of app code was reduced dramatically.

Next Steps

There's a bunch of work to be done. The bulk of it is just implementing things that the current XPages app does: actually uploading projects, all the stuff like discussion lists, and so forth. I'll also want to move the server-side component of the small "IP Tools" suite I use for IP management stuff in here. Currently, that's implemented as Wink-based JAX-RS resources inside an OSGi bundle, but it'll make sense to move it here to keep things consolidated and to make use of the much-better platform capabilities.

As I mentioned above, I can't guarantee that I'll actually finish this project - it's all side work, after all - but it's been useful so far, and it's a further demonstration of how thoroughly pleasant the programming model of the JEE support project is.

Per-NSF-Scoped JWT Authorization With JavaSapi

Sat Jun 04 10:35:05 EDT 2022

Tags: domino dsapi java
  1. Poking Around With JavaSapi
  2. Per-NSF-Scoped JWT Authorization With JavaSapi
  3. WebAuthn/Passkey Login With JavaSapi

In the spirit of not leaving well enough alone, I decided the other day to tinker a bit more with JavaSapi, the DSAPI peer tucked away undocumented in Domino. While I still maintain that this is too far from supported for even me to put into production, I think it's valuable to demonstrate the sort of thing that this capability - if made official - would make easy to implement.

JWT

I've talked about JWT a bit before, and it was in a similar context: I wanted to be able to access a third-party API that used JWT to handle authorization, so I wrote a basic library that could work with LS2J. While JWT isn't inherently tied to authorization like this, it's certainly where it's found a tremendous amount of purchase.

JWT has a couple neat characteristics, and the ones that come in handy most frequently are a) that you can enumerate specific "claims" in the token to restrict what the token allows the user to do and b) if you use a symmetric signature key, you can generate legal tokens on the client side without the server having to generate them. "b" there is optional, but makes JWT a handy way to do a quick shared secret between servers to allow for trusted authentication.

It's a larger topic than that, for sure, but that's the quick and dirty of it.

Mixing It With An NSF

Normally on Domino, you're either authenticated for the whole server or you're not. That's usually fine - if you want to have a restricted account, you can specifically grant it access to only a few NSFs. However, it's good to be able to go more fine-grained, restricting even powerful accounts to only do certain things in some contexts.

So I had the notion to take the JWT capability and mix it with JavaSapi to allow you to do just that. The idea is this:

  1. You make a file resource (hidden from the web) named "jwt.txt" that contains your per-NSF secret.
  2. A remote client makes a request with an Authorization header in the form of Bearer Some.JWT.Here
  3. The JavaSapi interceptor sees this, checks the target NSF, loads the secret, verifies it against the token, and authorizes the user if it's legal

As it turns out, this was actually not that difficult in practice at all.

The main core of the code is:

public int authenticate(IJavaSapiHttpContextAdapter context) {
    IJavaSapiHttpRequestAdapter req = context.getRequest();

    // In the form of "/foo.nsf/bar"
    String uri = req.getRequestURI();
    String secret = getJwtSecret(uri);
    if(StringUtil.isNotEmpty(secret)) {
        try {
            String auth = req.getHeader("Authorization"); //$NON-NLS-1$
            if(StringUtil.isNotEmpty(auth) && auth.startsWith("Bearer ")) { //$NON-NLS-1$
                String token = auth.substring("Bearer ".length()); //$NON-NLS-1$
                Optional<String> user = decodeAuthenticationToken(token, secret);
                if(user.isPresent()) {
                    req.setAuthenticatedUserName(user.get(), "JWT"); //$NON-NLS-1$
                    return HTEXTENSION_REQUEST_AUTHENTICATED;
                }
            }
        } catch(Throwable t) {
            t.printStackTrace();
        }
    }

    return HTEXTENSION_EVENT_DECLINED;
}

To read the JWT secret, I used IBM's NAPI:

private String getJwtSecret(String uri) {
    int nsfIndex = uri.toLowerCase().indexOf(".nsf"); //$NON-NLS-1$
    if(nsfIndex > -1) {
        String nsfPath = uri.substring(1, nsfIndex+4);
        
        try {
            NotesSession session = new NotesSession();
            try {
                if(session.databaseExists(nsfPath)) {
                    // TODO cache lookups and check mod time
                    NotesDatabase database = session.getDatabase(nsfPath);
                    database.open();
                    NotesNote note = FileAccess.getFileByPath(database, SECRET_NAME);
                    if(note != null) {
                        return FileAccess.readFileContentAsString(note);
                    }
                }
            } finally {
                session.recycle();
            }
        } catch(Exception e) {
            e.printStackTrace();
        }
    }
    return null;
}

And then, for the actual JWT handling, I use the auth0 java-jwt library:

public static Optional<String> decodeAuthenticationToken(final String token, final String secret) {
	if(token == null || token.isEmpty()) {
		return Optional.empty();
	}
	
	try {
		Algorithm algorithm = Algorithm.HMAC256(secret);
		JWTVerifier verifier = JWT.require(algorithm)
		        .withIssuer(ISSUER)
		        .build();
		DecodedJWT jwt = verifier.verify(token);
		Claim claim = jwt.getClaim(CLAIM_USER);
		if(claim != null) {
			return Optional.of(claim.asString());
		} else {
			return Optional.empty();
		}
	} catch (IllegalArgumentException | UnsupportedEncodingException e) {
		throw new RuntimeException(e);
	}
}
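
For completeness, the counterpart that mints a token with the same shared secret - whether in a trusted client app or a test harness - is a near mirror image, assuming the same ISSUER and CLAIM_USER constants as the verifier above:

public static String createAuthenticationToken(final String userName, final String secret) {
	try {
		Algorithm algorithm = Algorithm.HMAC256(secret);
		return JWT.create()
			.withIssuer(ISSUER)
			.withClaim(CLAIM_USER, userName)
			.sign(algorithm);
	} catch (IllegalArgumentException | UnsupportedEncodingException e) {
		throw new RuntimeException(e);
	}
}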

And, with that in place, it works:

JWT authentication in action

That text is coming from a LotusScript agent - as I mentioned in my original JavaSapi post, this authentication is trusted the same way DSAPI authentication is, and so all elements, classic or XPages, will treat the name as canon.

Because the token is based on the secret specifically from the NSF, using the same token against a different NSF (with no JWT secret or a different one) won't authenticate the user:

JWT ignored by a different endpoint

If we want to be fancy, we can call this scoped access.

This is the sort of thing that makes me want JavaSapi to be officially supported. Custom authentication and request filtering are much, much harder on Domino than on many other app servers, and JavaSapi dramatically reduces the friction.

XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall

Wed May 25 14:41:52 EDT 2022

  1. Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue
  2. JSP and MVC Support in the XPages JEE Project
  3. Migrating a Large XPages App to Jakarta EE 9
  4. XPages Jakarta EE Support 2.2.0
  5. DQL, QueryResultsProcessor, and JNoSQL
  6. Implementing a Basic JNoSQL Driver for Domino
  7. Video Series On The XPages Jakarta EE Project
  8. JSF in the XPages Jakarta EE Support Project
  9. So Why Jakarta?
  10. XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall
  11. Adding Concurrency to the XPages Jakarta EE Support Project
  12. Adding Transactions to the XPages Jakarta EE Support Project
  13. XPages Jakarta EE 2.9.0 and Next Steps
  14. XPages JEE 2.11.0 and the Javadoc Provider
  15. The Loose Roadmap for XPages Jakarta EE Support
  16. XPages JEE 2.12.0: JNoSQL Views and PrimeFaces Support
  17. XPages JEE 2.13.0
  18. XPages JEE 2.14.0
  19. XPages JEE 2.15.0 and Plans for JEE 10 and 11

Earlier today, I published version 2.5.0 of the XPages Jakarta EE Support project. It's mostly a consolidation and bug-fix release, but there are a few interesting features and notes about the implementation. Plus, as teased in the post title up there, there's a looming problem for the project.

New Features

There are two main new features in this version.

First, I added some configurable CORS support for REST services. Fortunately for me, RestEasy comes with a CORS filter by default, and it just needs to be enabled. I wired it up using MicroProfile Config to read some values out of xsp.properties:

rest.cors.enable=true                   # required for CORS
rest.cors.allowCredentials=true         # defaults to true
rest.cors.allowedMethods=GET,HEAD       # defaults to all
rest.cors.allowedHeaders=Some-Header    # defaults to all
rest.cors.exposedHeaders=Some-Header    # optional
rest.cors.maxAge=600                    # optional
# allowedOrigins is required, and can be "*"
rest.cors.allowedOrigins=http://foo.com,http://bar.com

I also added support for using the long-standing @WebServlet annotation. Though REST services will generally do what you want, sometimes it's handy to use the lower-level Servlet capability, and now you can do so inline:

@WebServlet(urlPatterns = { "/someservlet", "/someservlet/*", "*.hello" })
public class ExampleServlet extends HttpServlet {
	private static final long serialVersionUID = 1L;
	
	@Inject
	ApplicationGuy applicationGuy;

	@Override
	protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
		resp.setContentType("text/plain");
		resp.getWriter().println("Hello from ExampleServlet. context=" + req.getContextPath() + ", path=" + req.getServletPath() + ", pathInfo=" + req.getPathInfo());
		resp.getWriter().println("ApplicationGuy: " + applicationGuy.getMessage());
		resp.getWriter().flush();
	}
}

Consolidation

There were a couple specs where I had previously either copied the source into the repository (CDI, Mail) or had maintained a local branch fork (NoSQL). Those were always uncomfortable concessions to reality, but I decided to look further into ways to handle that.

For NoSQL, part of it was what I talked about in my last post: using Eclipse Transformer to make use of javax.* compiled binaries and source converted to jakarta.* automatically. But beyond that, it had the same problem that I had forked Mail for. Namely, it hits the same trouble that lots of non-OSGi code does in an OSGi context, where it uses ServiceLoader in a non-extensible way. Though I have an open PR to make use of the pseudo-standard "HK2" ServiceLoader provider, waiting for that would mean continuing the local-build trouble.

Instead, for all of these cases I made use of OSGi's Weaving capability to re-write those parts of the class files on the fly. While this is a bit unfortunate, it works well in practice. The only real down side for now is having to be a bit more careful when bumping the versions in the future, but this type of code changes very rarely.
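
For those unfamiliar with the mechanism, a weaving hook is just an OSGi service that gets a chance to modify a class's bytes as it's loaded. A bare-bones sketch - with a hypothetical target package and the actual bytecode rewriting stubbed out - looks like:

import org.osgi.framework.hooks.weaving.WeavingHook;
import org.osgi.framework.hooks.weaving.WovenClass;

// Registered as an OSGi service, e.g. from a bundle Activator
public class ServiceLoaderFixupHook implements WeavingHook {
	@Override
	public void weave(WovenClass wovenClass) {
		// Hypothetical target: only touch classes from the spec package in question
		if(wovenClass.getClassName().startsWith("jakarta.mail.")) { //$NON-NLS-1$
			byte[] rewritten = rewriteServiceLoaderUse(wovenClass.getBytes());
			wovenClass.setBytes(rewritten);
		}
	}
	
	private byte[] rewriteServiceLoaderUse(byte[] classBytes) {
		// In practice, this would use a bytecode library such as ASM to patch the
		// non-extensible ServiceLoader calls; elided here
		return classBytes;
	}
}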

The Looming Wall

While this has been going swimmingly, I've started to hit some real impediments with Domino's Java version. The next release of Jakarta EE, version 10, requires Java 11 as a minimum. This is similar to the move Equinox (Domino's OSGi framework of choice) made just under two years ago, which has itself bitten me by blocking an upgrade to Tycho 2.0 and above. Java 11 is about four years old now, and is no longer even the latest LTS release, so this all makes sense.

I've known this was coming for a while, but incompatible versions of JEE specs and implementations started to trickle in over the past year, leading to me leaving notes for myself about maximum versions. JEE 10 itself is fairly imminent now, so I'll be capped at the ones released with JEE 9 a while ago.

So I've been pondering my options here.

In one sense, I solved this problem years ago. The Domino Open Liberty Runtime project has had the ability to download any version of open-source Java that you want, and I expanded it last year to let you pick from several common flavors. Liberty maintains a breathless pace of advancement, adding official support for Java 18 the month after it came out. If one wants to run JEE apps on Domino, that's the most complete way. However, though it does its job technologically well, it's not exactly a natural fit for Domino developers in its current state.

But I've been considering anew a notion I had years ago, which is to write an extension for Liberty so that it reads class files and resources out of an NSF directly. In some early investigation a bit ago, this started to appear quite doable. In theory, I could write an adapter that would take an incoming request for "foo.nsf" and then read files out of the NSF in the same way XPages does, but instead feeding them to Liberty's runtime. Doing this would essentially implement all remaining JEE and MicroProfile specs in one fell swoop on top of the "any Java version" support, but would add the fault-prone attribute of running a separate process and proxying requests to it. In practice, that setup has proven itself good, but it's certainly more complicated than the "single process on port 80" deal that Domino's HTTP is now.

That route also wouldn't inherently support XPages, which would be something of an impediment to the XPages JEE project's original remit. That's something I've also pondered, and in theory I could make an auto-vivifying version of the XPages Runtime project that grabs all the pertinent XPages bundles from the current server and patches them into the Liberty server as an extension feature, similar to how all the built-in Liberty features work. This could be done, but I'll admit that I balk a bit at the prospect. Though I run XPages outside Domino constantly, it's with full knowledge of the tradeoffs and special considerations. Getting a normal NSF-based XPages app to run in this way would take some additional work.

Anyway, those options could work, but none of them are great. The true fix would naturally be for HCL to move to a newer Java version in Domino's HTTP stack, but I don't control that, so I'll content myself with considering what to do in the meantime. Admittedly, pondering this sort of thing is enjoyable in its own right. Also fortunately, even without tackling this, there's still plenty of stuff in the pile for me to tackle as the fancy strikes me.

Putting Eclipse Transformer To Use In Dependency Wrangling

Tue May 24 15:46:15 EDT 2022

Tags: jakartaee java

Setting code aside, the backbone of the XPages Jakarta EE Support project is its dependency pool. In it, I use my fork of the p2-maven-plugin to wrangle all the spec and implementation dependencies. Aside from just collecting them, this file does a ton of work to create and reconfigure their OSGi bundle rules to get everything working on Domino.

There have been limitations, though, and some of them have to do with the Jakarta NoSQL project. Though there are side branches of that project using the jakarta.* namespace, the main master branch is still on javax.* for a couple Jakarta dependencies. Historically, I've dealt with this by running a build locally and deploying it to OpenNTF's Maven server. However, this adds a bit of randomness to the mix: if a snapshot build of NoSQL goes out to the main repository that happens to be newer, then building the dependency repository locally might pick up on that instead, since it's named the same thing.

Transformer

Fortunately, IBM wrote the solution for me: Eclipse Transformer. Transformer is a rules engine that translates files (Java classes and related resources, namely) based on configuration - and, while it's generic, it's really designed for the transition from javax.* to jakarta.* namespaces.

It allows you to do these transformations at runtime or (as I'll be doing here) ahead of time, even if you don't have access to the original source. Though I do have access to the source, it's more useful at the moment to act like I don't.

I'd known about the tool and have seen how it's used heavily by both app servers and implementation vendors to be able to support both old- and new-style uses, and so I've kept it in mind for in case the need ever came up. It's a perfect fit for this.

p2-maven-plugin

I considered a couple ways to handle this, but realized the cleanest for now would be to integrate it into the dependency pool generator that I already have, since it fits right in with the OSGi transformations I'm doing.

So I went on over to the p2-maven-plugin fork and got to work. When defining Maven artifacts to bring in, the format looks like this:

<artifact>
    <id>jakarta.servlet:jakarta.servlet-api:4.0.4</id>
    <source>true</source>
</artifact>

Now, Servlet already has a jakarta.* version, but it'll be useful here as an example that avoids the other transformations I'm doing.

My addition is to add a transform configuration option here, with jakarta as the only value for now:

<artifact>
    <id>jakarta.servlet:jakarta.servlet-api:4.0.4</id>
    <source>true</source>
    <transform>jakarta</transform>
</artifact>

...and that'll be it! When that is specified, the code will now run the artifact and its source JAR transparently through Transformer and the version you get in your p2 repository will reflect the transition. And, well, it works perfectly in my case. The resultant NoSQL spec and dependencies are functionally equivalent to the ones in the jakarta.* source branch, but without having to actually change the source files yet. Neat.

Implementation

Though it took a bit to track down the best way to do it, it turned out that Transformer is quite easy to embed into a Java app like the Maven plugin. The majority of the code ends up being effectively Java boilerplate to provide the default values for Jakarta transformation. Truncated, it looks like this:

String inputFileName = t.getAbsolutePath(); // the artifact in ~/.m2/repository
File dest = File.createTempFile(t.getName(), ".jar"); //$NON-NLS-1$
String outputFileName = dest.getAbsolutePath();

Map<String, String> optionDefaults = JakartaTransform.getOptionDefaults();
Function<String, URL> ruleLoader = JakartaTransform.getRuleLoader();
TransformOptions options = /* build TransformOptions object that reads the above variables */

Transformer transformer = new Transformer(logger, options);
ResultCode result = transformer.run();
switch(result) {
case ARGS_ERROR_RC:
case FILE_TYPE_ERROR_RC:
case RULES_ERROR_RC:
case TRANSFORM_ERROR_RC:
	throw new IllegalStateException("Received unexpected result from transformer: " + result);
case SUCCESS_RC:
default:
	return dest;
}

There are plenty of options to specify, but that's really about it. Once given the Jakarta defaults, it will do the right thing in the normal case, both for the compiled class files as well as the source JAR.

I'm not sure if I'll need it in other cases (NoSQL will move over in the main branch eventually), but it's sure handy here and should be useful in a pinch. From time to time, I've run across dependencies that would be useful to include but use old JEE specs, and this could do the trick in those cases too.

Poking Around With JavaSapi

Thu May 19 16:49:22 EDT 2022

Tags: dsapi java
  1. Poking Around With JavaSapi
  2. Per-NSF-Scoped JWT Authorization With JavaSapi
  3. WebAuthn/Passkey Login With JavaSapi

Earlier this morning, Serdar Basegmez and Karsten Lehmann had a chat on Twitter regarding the desire for OAuth on Domino and their recollections of a not-quite-shipped technology from a decade ago going by the name "JSAPI".

Seeing this chat go by reminded me of some stuff I saw when I was researching the Domino HTTP Java entrypoint last year. Specifically, these guys, which have been sitting there since at least 9.0.1:

JavaSapi class files in com.ibm.domino.xsp.bridge.http

I'd made note of them at the time, since there's a lot of tantalizing stuff in there, but had put them back on the shelf when I found that they seemed to be essentially inert at runtime. For all of IBM's engineering virtues (and there are many), they were never very good at cleaning up their half-implemented experiments when it came time to ship, and I figured this was more of the same.

What This Is

Well, first and foremost, it is essentially a non-published experiment: I see no reference to these classes or how to enable them anywhere, and so everything within these packages should be considered essentially radioactive. While they're proving to be quite functional in practice, it's entirely possible - even likely - that the bridge to this side of thing is full of memory leaks and potential severe bugs. Journey at your own risk and absolutely don't put this in production. I mean that even more in this case than my usual wink-and-nod "not for production" coyness.

Anyway, this is the stuff Serdar and Karsten were talking about, named "JavaSapi" in practice. It's a Java equivalent to DSAPI, the API you can hook into with native libraries to perform low-level alterations to requests. DSAPI is neat, but it's onerous to use: you have to compile down to a native library, target each architecture you plan to run on, deploy that to each server, and enable it in the web site config. There's a reason not a lot of people use it.

Our new friend JavaSapi here provides the same sorts of capabilities (rewriting URLs, intercepting requests, allowing for arbitrary user authentication (more on this later), and so forth) but in a friendlier environment. It's not just that it's Java, either: JavaSapi runs in the full OSGi environment provided by HTTP, which means it swims in the same pool as XPages and all of your custom libraries. That has implications.

How To Use It

By default, it's just a bunch of classes sitting there, but the hook down to the core level (in libhttpstack.so) remains, and it can be enabled like so:

set config HTTP_ENABLE_JAVASAPI=1

(strings is a useful tool)

Once that's enabled, you should start seeing a line like this on HTTP start:

[01C0:0002-1ADC] 05/19/2022 03:37:17 PM  HTTP Server: JavaSapi Initialized

Now, there's a notable limitation here: the JavaSapi environment isn't intended to be arbitrarily extensible, and it's hard-coded to only know about one service by default. That service is interesting - it's an OAuth 2 provider of undetermined capability - but it's not the subject of this post. The good news is that Java is quite malleable, so it's not too difficult to shim in your own handlers by writing to the services instance variable of the shared JavaSapiEnvironment instance (which you might have to construct if it's not present).
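
Purely as an illustration of that shimming - and with the caveat that this is an undocumented internal class whose layout could change between versions - the reflective poke might look something like this, assuming the field really is a mutable collection named services:

// Hypothetical reflective registration; the field name comes from the class's
// current shape, but the collection type is an assumption
@SuppressWarnings("unchecked")
public static void registerJavaSapiService(Object javaSapiEnvironment, Object service) throws ReflectiveOperationException {
	java.lang.reflect.Field servicesField = javaSapiEnvironment.getClass().getDeclaredField("services"); //$NON-NLS-1$
	servicesField.setAccessible(true);
	java.util.Collection<Object> services = (java.util.Collection<Object>)servicesField.get(javaSapiEnvironment);
	services.add(service);
}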

Once you have that hook, it's just a matter of writing a JavaSapiService instance. This abstract class provides fairly-pleasant hooks for the triggers that DSAPI has, and nicely wraps requests and responses in Servlet-alike objects.

Unlike Servlet objects, though, you can set a bunch of stuff on these objects, subject to the same timing and pre-filtering rules you'd have in DSAPI. For example, in the #rawRequest method, you can add or overwrite headers from the incoming request before they get to any other code:

public int rawRequest(IJavaSapiHttpContextAdapter context) {
    context.getRequest().setRequestHeader("Host", "foo-bar.galaxia");
        
    return HTEXTENSION_EVENT_HANDLED;
}

If you want to, you can also handle the entire request outright:

public int rawRequest(IJavaSapiHttpContextAdapter context) {
    if(context.getRequest().getRequestURI().contains("foobar")) {
        context.getResponse().setStatus(299);
        context.getResponse().setHeader("Content-Type", "text/foobar");
        try {
            context.getResponse().getOutputStream().print("hi.");
        } catch (IOException e) {
            e.printStackTrace();
        }
        return HTEXTENSION_REQUEST_PROCESSED;
    }
    
    return HTEXTENSION_EVENT_HANDLED;
}

You probably won't want to, since we're not lacking for options when it comes to responding to web requests in Java, but it's nice to know you can.

You can even respond to tell http foo commands:

public int processConsoleCommand(String[] argv, int argc) {
    if(argc > 0) {
        if("foo".equals(argv[0])) { //$NON-NLS-1$
            System.out.println(getClass().getSimpleName() + " was told " + Arrays.toString(argv));
            return HTEXTENSION_SUCCESS;
        }
    }
    return HTEXTENSION_EVENT_DECLINED;
}

So that's neat.

The fun one, as it usually is, is the #authenticate method. One of the main reasons one might use DSAPI in the first place is to provide your own authentication mechanism. I did it years and years ago, Oracle did it for their middleware, and HCL themselves did it recently for the AppDev Pack's OAuth implementation.

So you can do the same here, like this super-secure implementation:

public int authenticate(IJavaSapiHttpContextAdapter context) {
    context.getRequest().setAuthenticatedUserName("CN=Hello From " + getClass().getName(), getClass().getSimpleName());
    return HTEXTENSION_REQUEST_AUTHENTICATED;
}

The cool thing is that this has the same characteristics as DSAPI: if you declare the request authenticated here, it will be fully trusted by the rest of HTTP. That means not just Java - all the classic stuff will trust it too:

Screenshot showing JavaSapi authentication in action

Conclusion

Again: this stuff is even further from supported than the usual components I muck around in, and you shouldn't trust any of it to work more than you can actively observe. The point here isn't that you should actually use this, but more that it's interesting what things you can find floating around the Domino stack.

Were this to be supported, though, it'd be phenomenally useful. One of Domino's stickiest limitations as an app server is the difficulty of extending its authentication schemes. It's always been possible to do so, but DSAPI is usually prohibitively difficult unless you either have a bunch of time on your hands or a strong financial incentive to use it. With something like this, you could toss Apache Shiro in there as a canonical source of user authentication, or maybe add in Soteria - the Jakarta Security implementation - to get per-app authentication.

There's also that OAuth 2 thing floating around in there, which does have a usable extension point, but I think it's fair to assume that it's unfinished.

This is all fun to tinker with, though, and sometimes that's good enough.

So Why Jakarta?

Thu Apr 28 16:10:11 EDT 2022

Tags: jakartaee java

I've spent a lot of time over the last while tinkering with the XPages Jakarta EE Support project in particular and Jakarta technologies in general, and I figured it'd be worth discussing a bit why I like this stack and why I think it's worth putting work into.

There are a couple facets to this, I think. Why is it good on its own? Why is it good as a complement or replacement for XPages? And why is it good compared to the other roads offered for Domino developers?

Quick Aside: Spring and Others

Before I get much further, I should mention early on that this isn't so much about Jakarta as opposed to technologies like Spring. Spring is good! It's similar in concept, both because it started from a JEE-aligned mindset and now because Jakarta and MicroProfile have been adopting a lot of the best concepts. It's kind of a "D&D and Pathfinder" situation. While there are some philosophical differences, and Jakarta is (now) run by an open-source organization as opposed to an individual company, the distinction for our purposes isn't important.

This also goes for some other technologies that could potentially be slotted in for server-based app dev, like Vert.x. Vert.x, for its part, often serves different purposes, and so that discussion is also separate.

Technical Reasons

Going into all the specific things that I think are good about JEE technologies would be quite an ordeal, so I'll stick to summarizing some overarching themes that I appreciate.

Presumably as a sign of my own ever-increasing age, I appreciate the staid nature of many aspects of it. While some of that comes from the near-stagnation the stack suffered from towards the end of Oracle's sole stewardship of it, it's good that things like Servlet have remained consistent in important ways since the very beginning. Some aspects have come and some will soon go, but the main aspects have remained pleasantly consistent because they were designed to be simple and largely adaptable. Servlet has its limitations, but they're limitations that don't generally show up for normal use.

I also quite appreciate how annotation-based most of the specs are. This was a good way of moving away from the original "pile of XML" configuration process of the early versions of Java EE while still retaining introspection abilities. What I mean by that is the ability of programs (like a server or an IDE) to look at a Jakarta app and glean important information without having to actually execute the code. As a point of comparison, take this hypothetical version of a REST server, where you declare endpoints programmatically:

public void initServer(ServerConfig config) {
    config.addHandler("/foo", new FooHandler());
    config.addHandler("/foo/bar", new BarHandler());
}

...and then compare that to the annotation-based way of doing it:

@Path("/foo")
public class FooHandler {
	/* snip */

	@GET
	@Path("bar")
	public Object getBar() { /* ... */ }
}

Both could be functionally the same at runtime, but the latter allows tools to inspect the classes statically to provide summaries and capabilities in the UI in a way that would be technically possible but much more difficult otherwise. This is certainly not unique to Jakarta, but it's an important feature of it nonetheless.

Moreover, I think that the stack is morphing itself nicely into a cleaner, modern form. It's been a rocky process, but a lot of the individual specs are either adapting themselves onto CDI or using it as the baseline. As much as I sang the praises of Servlet in the earlier paragraph, you can write a thoroughly-capable app using CDI and JAX-RS without ever caring about much else beyond a data layer.

This adaptability is also paying off with newer-era work like Quarkus. Quarkus is an intriguing project that combines slices of Jakarta, MicroProfile, and others with the native-compilation capabilities of GraalVM to provide a toolchain that lets you write quite-efficient compiled apps, targeted primarily for Kubernetes deployments where the startup and response time of a single node is very important. This is really solving a lot of problems I don't have, but it's interesting to watch, and to see how these goals feed back into Jakarta with things like CDI Lite.

Jakarta As An XPages Extension Or Successor

XPages was (and is) a fork of a subset of Java EE, with the split happening somewhere just before 2006's Java EE 5. It's a small subset, but you can look down that list of technologies and see a few that remain to this day: Servlet 2.4/2.5, JSF 1.1/1.2 by way of the XPages fork, JavaMail, and a few other miscellaneous packages. DAS and the Extension Library brought in JAX-RS 1.1, so you can add a dash of 2009's Java EE 6 to the pot.

The XPages Jakarta EE Support project started as a mechanism for bringing in a newer JAX-RS version, followed by CDI to replace managed beans and EL 3 to replace XPages's primordial Expression Language support - essentially, as a slowly-growing platform update. In its current form, it brings in a wide slate of technologies, but the fact that it was starting as an extension of an existing ancient Java EE fork made it possible to do this gradually, piece by piece. Really, up until the move to the jakarta.* namespace, it was a process of just glomming compatible parts onto the existing Servlet baseline.

Even after that switch, the historical alignment with the older parts of the stack makes building on it comparatively straightforward. That applies both to me as the person adapting the spec implementations and (hopefully) to a developer actually using them. While XPages predated the annotation-heavy push in JEE as well as CDI entirely, a lot of the core concepts are in common, and I expect that it'd be an easier transition from XPages alone to Jakarta EE than, say, classic Domino web dev to XPages was. It certainly was for me, anyway.

Jakarta As A Cultural Match

This topic covers both my general appreciation for a thoroughly-open-source platform and why I specifically like it in relation to other roads open to Domino developers.

Java EE had for a long time been kind of open source: though Sun and then Oracle held the reins, the specs grew to be free to implement, and over time there flourished a slew of compatible servers, many of which are now or have always been fully open-source.

I like this for a lot of reasons. For one, it's just good as a programmer to have source access. Normally, you can just go by the spec, but having full access to the server's source lets you debug thorny problems when you hit an edge case. While closed-source software certainly has its place, there's just a layer of "all else being equal, source access is better".

Beyond being able to see and debug the source, it's valuable that the platform is open source and the implementations I use are as well, and the whole thing is guided by the extremely-established Eclipse Foundation. While a company handing something over to an open-source organization can sometimes just be a way to usher it to a plausibly-deniable death, the activity around Jakarta EE shows that isn't the case here. While Oracle and IBM still tend to naturally top the charts, it has a diverse pool of contributors, and its fate isn't tied to the interests of a single company. As with closed-source software, sometimes being shepherded by a single company has advantages, but it leaves you more exposed to the winds of their financial incentives.

This all contributes to a platform where I can be comfortable writing a bunch of code with the knowledge that, while it may not be good forever, the path will be at least clear. While there's something of an industry for modernizing old Java EE applications (one that our old friend Niklas Heidloff is involved in), it's a task shared by a lot of companies and, indeed, a lot of the work is "replace old vendor-specific code with standards-based code". While nothing can truly prevent you from having a pile of obsolete code other than not writing it in the first place, following a path like this that's shared with a broad slice of the industry is a good way to mitigate the trouble.

And I think that paragraph is what a lot of it comes down to for me. As much as I can be, I want to be out of the business of writing code that doesn't have a built-in "plan B". If Domino magically stopped existing tomorrow, code written in this way wouldn't necessarily directly work elsewhere, but it'd be a much shorter journey than for the Notes-client and classic Domino web code I wrote in my early career. And, really, some of it would directly work elsewhere. The stuff that's just describing a REST entrypoint or a page layout? The stuff that describes the interaction between those and an abstracted data layer? That code doesn't care what your server is, and there's a whole ecosystem of servers ready to do the job. That, there, is what makes this worthwhile for me.

Intercepting Class Loading in OSGi, A Travelogue

Mon Jan 10 10:36:08 EST 2022

Tags: java osgi xpages

Yesterday, I had a problem. I was trying to get MicroProfile Config working inside an NSF to add to the XPages Jakarta EE project, and I was severely blocked by odd behavior.

To describe that, I'll lightly cover what MP Config is. It's a CDI extension that allows you to annotate properties on a bean to indicate that they're intended to come from an available configuration source - often a .properties file in the project, but it's a pluggable system. Your bean will look like this:

package example;

// ...

@ApplicationScoped
public class ConfigExample {
    @Inject
    @ConfigProperty(name="java.version", defaultValue="unknown")
    private String javaVersion;
    
    /* other methods go here */
}

The idea is that you'll then have a properties file or environment variable to fill in the value, allowing you to separate your configuration from the implementation in a consistent way. Here, I'm making use of the fact that a default provider looks up Java system properties, so I could just get it working before investigating adding providers.

Since I'd already added CDI and a CDI-based extension in the form of MVC, I figured this would be easy.

The Problem

The problem I hit, though, was bizarre. CDI would identify the bean above, but would hit this problem:

org.jboss.weld.exceptions.DeploymentException: WELD-001408: Unsatisfied dependencies for type String with qualifiers @Default
  at injection point [UnbackedAnnotatedField] @Inject private example.ConfigExample.javaVersion
  at example.ConfigExample.javaVersion(ConfigExample.java:0)
WELD-001475: The following beans match by type, but none have matching qualifiers:
  - Producer Method [String] with qualifiers [@Any @ConfigProperty] declared as [[UnbackedAnnotatedMethod] @Dependent @Produces @ConfigProperty protected io.smallrye.config.inject.ConfigProducer.produceStringConfigProperty(InjectionPoint)]

The gist of this is that it noticed that the javaVersion property is supposed to be an injected property, but it had no idea what the source should be. It did know about the MicroProfile provider, which handles @ConfigProperty, but it couldn't put two and two together.

I banged my head against this for a while, and eventually determined that the class as loaded from the NSF is stripped of the @ConfigProperty annotation outright. Other annotations, such as @Inject and even custom annotations, would remain, but not @ConfigProperty. I wrestled with OSGi dependency chains for a while, to no avail.

The Enemy

Eventually, I found the core, and it was an old nemesis of mine. It's this method in com.ibm.xsp.util.ClassLoaderUtil:

ClassLoaderUtil.checkProhibitedClassNames

This method is called by the ClassLoader used in an NSF to ensure that certain classes, by name prefix, cannot be loaded by code coming from an NSF. The last three lines there make a sort of sense: Domino is supposed to be an app container for XPages apps, and ideally it's not a simple process for an app to break out of its container to muck about in the parent environment. Fair enough. The NAPI line is presumably there because IBM wanted to protect developers from themselves, even though Notes devs had been making unauthenticated calls to C APIs for freaking ever.

It's the first two, and specifically the first, that are the source of my trouble. Those prohibitions are presumably meant to isolate XPages apps from the fact that they live in an OSGi world, with the assumption that anything beginning with org.eclipse. refers to something like org.eclipse.core.runtime, the OSGi system bundle.

And this is the issue. MicroProfile is not in any way related to OSGi, but it sure is an Eclipse project. Accordingly, the class name of @ConfigProperty is org.eclipse.microprofile.config.inject.ConfigProperty, and thus cannot be loaded from an NSF.

Attempted Workarounds

So I considered my options.

One was to fork MP Config and rename the packages. That would work, but it would defeat the portability goals of the XPages JEE project, and would also just be a hassle - I've already had to fork a few specs, and each new one adds to the maintenance burden. That remained an option, but it would be a last resort.

My next idea was to wrap around the ModuleClassLoader class used by NSFComponentModule for class loading purposes. This class is blessedly non-final, and so in theory I could look at the instance in a module and swap it out with a replacement. I tinkered with this a bit, but the trouble became the way it's layered, with a DynamicClassLoader private class within - something harder to subclass. In theory, I could reproduce the behavior of it wholesale, but that would be both fragile (if the implementation ever changes) and verging on, if not outright, illegal (it's one thing to be API-compatible, and another to reproduce the internal functionality). After some wrangling, I decided to look elsewhere.

The True Workaround

I realized eventually that I don't really care about ModuleClassLoader as such: it does its job fine, and it's only the response that it gets from ClassLoaderUtil that is the problem. If I could change that, I would be set.

I've used the Javassist project here and there for a long time, ever since its inclusion in ODA for one reason or another. It's a handy toolkit, and notably includes the capability to alter a method implementation on the fly. There's my loophole.

The reason this kind of thing can work is related to how Java handles classes and calls between them. For all intents and purposes, you can consider a method call from one bit of Java to another to be a string-based lookup, saying "find a class named X and a method named Y, and then execute it". The "find" part there is much looser than you might think. It's easy to think of class references like C static linking, but they're really not. When code asks for a class, it asks the context ClassLoader, and that object can do basically whatever the heck it wants to find it, as long as it eventually emits a Class that the runtime can deal with.
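
To make that concrete, here's a minimal sketch of my own (not code from the project) showing how loose that "find" step really is - a ClassLoader that reifies a class from an arbitrary byte[], wherever those bytes came from. The class and field names here are hypothetical:

public class BytesClassLoader extends ClassLoader {
    private final String className;
    private final byte[] bytecode;

    public BytesClassLoader(ClassLoader parent, String className, byte[] bytecode) {
        super(parent);
        this.className = className;
        this.bytecode = bytecode;
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        if(className.equals(name)) {
            // The bytes could have come from a file, the network, or a tool like
            // Javassist - as long as they're valid class data, defineClass reifies them
            return defineClass(name, bytecode, 0, bytecode.length);
        }
        return super.findClass(name);
    }
}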

Javassist's manipulation makes use of the fact that classes are generally eventually just a bunch of bytes, and you can do whatever you want with a bunch of bytes. Using Javassist, it's fairly simple to, once you have a handle on the class, alter the method. Truncated, that's:

ClassPool pool = /* build a ClassPool that can load the class */;
CtClass cc = /* get the class from the pool */;
cc.defrost();
CtMethod m = cc.getDeclaredMethod("checkProhibitedClassNames");
m.setBody("{ return false; }");
Class<?> result = cc.toClass();

And this works, as far as it goes: I now have a Class version of ClassLoaderUtil that skips the onerous check.

The trouble now was to get this to be actually used by other classes. Generally, once a ClassLoader loads a class, it's difficult to feed it another version unless it's designed to do so: most ClassLoader implementations, including those used here, are designed to read and emit classes by their own rules, not have new data fed into them.

I tried digging through the Eclipse OSGi ModuleClassLoader (distinct from the NSF ModuleClassLoader) for entrypoints and had some initially-promising work with Eclipse's internal ClassLoaderHook type, but eventually determined that this would require more patching than I'd want, if it was possible at all.

I also considered using Java's instrumentation capabilities to intercept class loading, but that would require setting up a special Java agent in the launch parameters, which would be too onerous.

But then I remembered something I had heard about when looking into getting ServiceLoader to work with OSGi: a concept in the OSGi spec called "weaving".

OSGi Weaving

I had noted that this concept existed, but set it aside in large part due to how esoteric it sounds: the term "weaving" makes it sound like it's a way to interact with the threads of fate or something, which is evocative but not something that seems immediately useful.

What it really is, though, is an OSGi-friendly version of the above: when the OSGi runtime goes to load a class from a bundle, it reads the data but then gives any such listeners an opportunity to manipulate the code before it's actually reified into a class. This is how the ServiceLoader mediator does its thing: it looks for ServiceLoader calls during loading and re-"weaves" them to run through OSGi instead.

This was perfect: it provides exactly the hook I want and it does it in a clean, spec-based way, without having to do weird reflection to reassign object properties or anything.

The Implementation

So I went about writing such an implementation. All the pieces are there on Domino, and the mechanism for registering a WeavingHook is something I'd done before in Open Liberty: it's a type of OSGi service that you can register and manage in an Activator class. It's also the sort of thing that would work well with Declarative Services, but Domino doesn't have a DS handler installed and I figured I didn't need to solve that quite yet.

So I wrote a WeavingHook implementation:

public class UtilWeavingHook implements WeavingHook {
    @Override
    public void weave(WovenClass c) {
        if("com.ibm.xsp.util.ClassLoaderUtil".equals(c.getClassName())) {
            ClassPool pool = new ClassPool();
            pool.appendClassPath(new LoaderClassPath(ClassLoader.getSystemClassLoader()));
            CtClass cc;
            try(InputStream is = new ByteArrayInputStream(c.getBytes())) {
                cc = pool.makeClass(is);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
            cc.defrost();
            try {
                CtMethod m = cc.getDeclaredMethod("checkProhibitedClassNames");
                m.setBody("{ return false; }");
                c.setBytes(cc.toBytecode());
            } catch(NotFoundException | CannotCompileException | IOException e) {
                new RuntimeException("Encountered exception when weaving ClassLoaderUtil replacement", e).printStackTrace();
            }
        }
    }
}

This builds on the above Javassist usage to now load the class from the byte array provided by OSGi, transform it, and then write the new version back. Since this happens while OSGi is reading the class to begin with, there's never a time when there's an older, less-permissive version of the class running around, as long as I get my service in early enough.

This service is registered in the Activator without too much fuss:

public class JakartaActivator implements BundleActivator {
    private final List<ServiceRegistration<?>> regs = new ArrayList<>();

    @Override
    public void start(BundleContext context) throws Exception {
        regs.add(context.registerService(WeavingHook.class.getName(), new UtilWeavingHook(), null));
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        regs.forEach(ServiceRegistration::unregister);
        regs.clear();
    }
}

The final bit to get right was the "get the service in early enough" aspect. The main task was making sure that this bundle was activated before any XPages apps loaded, and that was a job for my old friend IServiceFactory, which is the extension point that's intended to add handlers for incoming URLs but has the desirable attribute of being initialized right at the very start of HTTP loading.
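
For illustration, a do-nothing factory along these lines is enough to get the bundle activated early. This is a sketch from memory of that extension point, so treat the exact class and method names as assumptions rather than the project's actual code:

// A no-op IServiceFactory: it contributes no HttpServices, but registering it
// forces this bundle's Activator to run at HTTP startup, before any NSF loads.
// Package and signature are from memory and may not match exactly.
public class EarlyInitFactory implements com.ibm.designer.runtime.domino.adapter.IServiceFactory {
    @Override
    public com.ibm.designer.runtime.domino.adapter.HttpService[] getServices(
            com.ibm.designer.runtime.domino.adapter.LCDEnvironment env) {
        // Nothing to contribute; bundle activation is the useful side effect
        return new com.ibm.designer.runtime.domino.adapter.HttpService[0];
    }
}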

With this in place, I now have a fix automatically applied to that fiendish class on load, and MicroProfile Config (and future MP specs) works like a charm.

Conclusion

This was an arduous one, and I think the FTL victory jingle actually physically played when I got it working. I've hated this restriction for a long time, and I'm glad to finally have a workaround.

It was also enlightening to properly learn about OSGi's weaving capability. As mentioned above, this is what the ServiceLoader bridge does, and I'd tinkered with that at one point, but never got it working. I suspect now that it should be entirely doable to make it work, most likely also involving bringing in an implementation of the Declarative Services OSGi capability. That should be a fun project in its own right.

Moreover, the fact that I now have a system in place to do this weaving on the fly means that I may be able to un-fork some of the specs I had to fork to get working previously, which specifically required altering ServiceLoader calls. Even if I don't get the official service bridge in, perhaps I can use this technique to just alter the parts I need to on the fly, and otherwise use stock implementations from Maven.

But, for now, the way is cleared for further progress, and a bizarre mystery is solved. I call that a good day.

JSP and MVC Support in the XPages JEE Project

Mon Dec 20 11:20:06 EST 2021

Tags: jakartaee java

Over the weekend, I wrapped up the transition to jakarta.* for my XPages JEE Support project and uploaded it to OpenNTF.

With that in the bag, I decided to investigate adding some other things that I had been itching to get working for a while now: JSP and MVC.

JSP? Isn't That, Like, A Billion Years Old?

Okay, first: shut up.

Expanding on that point, it is indeed pretty old - arriving in 1999 - and its early form was pretty bad. It was designed as an answer to things like PHP and ASP and bore all those scars: it used actual Java syntax on the page to control output, looping, conditionals, and the like. It even had special directives to import Java classes for the page! All that stuff is still in there, too, which isn't great.

However, JSP used judiciously - focusing on JSTL tags for control/looping and EL references to CDI beans for data access - is a splendid little thing, and it has the advantage that it remains part of the JEE spec.

Domino flirted with JSP for a long time. It's what Garnet was all about and was part of how OpenNTF got off the ground. IBM did eventually ship the custom tags, and they ship with Domino to this day, sitting in the data/domino/java directory, gathering dust. Domino also inherited JSP from WebSphere as part of XPages... kind of. It has hooks for using JSP files in Expeditor-container webapps, but the implementation is conspicuously missing - present only in Notes, presumably for some sort of Social nonsense reason.

For better or for worse, none of that matters now anyway: it's all crusty and old and, critically, uses javax.*. I had to go a different route.

JSP Implementation

From what I gather, there's basically only one real open-source JSP implementation: Jasper, which is a part of Tomcat. Basically everyone just uses that, and that works well enough. There are various re-bundlings of it to remove the Tomcat dependencies, and I went with the GlassFish one, since it was pretty clean.

Diving into it, there were a few things that were potential and actual problems.

First, JSP files aren't evaluated directly. Instead, they're compiled into Servlet class implementations, either on the fly or ahead of time. This process is basically the same as how XPages work: the JSP is translated into a Java file, which is then compiled into a class, which is then reused by the runtime for subsequent requests. Jasper has a dependency on Eclipse JDT, which worried me: when I looked into this in the past, I found that JDT (at least how it was used for JSP) makes a lot of assumptions about working with the actual filesystem. I lucked out here, though: Jasper actually uses the JavaCompiler API, which is more flexible. The JDT dependency seems like either a vestige of an older version or a fallback option.

However, despite the fact that JavaCompiler can work purely in memory, Jasper does do a lot of filesystem-bound work when it comes to loading tag libraries, such as JSTL. I ended up having to deploy a bunch of stuff to the filesystem. Ideally, I'll find a better way around this.

Hooking It Up To Domino

Having a JSP interpreter is one thing, but having it respond to URLs like "http://example.com/foo.nsf/bar.jsp" is another, especially if that should also participate in the XPages class space of the NSF.

I originally considered an HttpService implementation that would accept incoming *.jsp URLs. This could work, but it would be less than ideal: the HttpService, while working in the XPages OSGi layer, wouldn't know about the internal layout of the NSF. I'd have to either reinvent it or wrangle my way over to the active NSFService (the one that runs XPages), find or load the NSF's module, and root around in there. Possible, but not ideal.

Fortunately, I lucked out tremendously: the NSFService class has an addHandledExtensions static method that I can just call to tell it that incoming ".jsp" requests should go to the XPages runtime. This looks like it was added for more Social-nonsense reasons, but I'm happy it's there regardless. Better still, when the runtime finds a URL it was told to handle, it queries IServletFactory implementations like those you can use in an NSF for custom servlets. I already had one in place for JAX-RS, so I made another one (refactored since that commit) for JSP.
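
In practice, that amounts to a one-line call at plugin initialization time. The class and signature here are from memory, so consider them approximate rather than the project's exact code:

// Tell the XPages NSF service to route *.jsp requests into the runtime
// (NSFService here is the XPages runtime's NSF service class; package name from memory)
NSFService.addHandledExtensions(".jsp");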

Once that was in place (plus some other fiddly details), I got to what I wanted: writing JSPs inside an NSF and having them share the XPages class space:

Screenshot of Designer and a browser showing an in-NSF JSP

Next Up: MVC

Once I had JSP in place (and after some troublesome fiddling with JSF), I decided to take a swing at adding my beloved MVC to the stack.

This had its own complications, this time for the inverse problem as JSP. While Jasper is a creature of the early 2000s and uses older, less-flexible Java APIs that I had to write around, MVC is the opposite. It's a pure child of the modern, CDI-based world and thus does everything through CDI and ServiceLoaders. However, though I've had CDI support in the project for a long time, actually tying together anything to do with CDI or ServiceLoaders in OSGi is eternally difficult, especially on Domino.

Service Loading

I had to wrangle for this for a while, but I eventually came up with a functional-but-odd workaround: I made use of my own custom ServiceParticipant extension capability that lets me have an object perform pre/post behavior around each JAX-RS request in order to futz with the ClassLoader. I had trouble where the NSF ClassLoader didn't find classes from the MVC implementation, though it should have, so I ended up overriding the ClassLoader to first look explicitly there. It's not pretty, but it works and at least it doesn't require filesystem stuff.

Servlets and Request Dispatchers

Another aspect of being a more-modern child than Jasper is that Krazo makes ready use of Servlet capabilities that have been there for a while but which don't exist on Domino.

For example, Krazo uses a ServletContainerInitializer instance to do pre-research in the app to find classes that should get MVC behavior. Without this scan, MVC won't be applied. This is a Servlet 3.0 feature dating to 2009 and Domino doesn't support it - or any kind of annotation-based classpath scanning, for that matter.

Fortunately, I didn't really need to fully support this concept - I really just needed to make sure this ran whenever the JAX-RS support was being loaded for an NSF. So I made it possible to contribute these via an extension point and added my own scanning implementation to gather the applicable types. Essentially, a backport of this feature to apply in an NSF. With that in place, I was able to register the initializer and have it do its work.

My next hurdle was to do with the way Krazo delegates to JSPs. Specifically, it queries the ServletContext (essentially, the app container) for Servlet registrations that can handle the desired extensions (".jsp" and ".jspx" here) and routes to that using a RequestDispatcher. Well, Domino supports none of this. Trying to get a RequestDispatcher is hard-coded to throw an exception saying "Domino doesn't support this" and the bit about getting ServletRegistrations was new in 3.0. Originally, I stubbed these out, but I decided to give a swing at backporting this as well.

While an NSF doesn't have "Servlet registrations" as such, it does have a list of the aforementioned IServletFactory instances, so I decided to write my own. I wrote a getRequestDispatcher implementation that queries the current module's Servlet factories for a match and, when found, returns a basic implementation. Then, I wrote a custom subtype of IServletFactory to provide additional information and made use of that to emulate the Servlet 3+ behavior, at least well enough to let Krazo do what it needs.

Seeing It Together

Once I figured out all these hurdles, I got to what I wanted: I can make a JAX-RS service in an NSF that acts as an MVC controller:

Screenshot of Designer and a terminal showing an MVC controller in an NSF

Neat! There are still some rough edges to clean, but it's great to see in action.

Conclusion and Next Steps

So why is this good? Well, there's a certain amount of box-checking going on: the more JEE specs I can get going, the better.

But beyond that, this is helping to crystallize some of my thinking about what Domino (web) developers are even supposed to freaking do nowadays. This remains an extremely-vexing problem, but I know the answer isn't XPages as it exists now. Maybe the answer is to move XPages to a better container or maybe it's to add a better container to Domino (or both of those, I guess). This is another option, one that preserves the "just fire up Designer and edit some code" niceties of the XPages experience while gaining better, more modern capabilities. I could see writing an app with this, doing all my work in CDI beans and using JSP as the front end - pure open-source solutions with active developers - all inside the NSF. Is it the real best answer? I don't know. Maybe. It's something, though, and specifically something worth trying.

Updating The XPages JEE Support Project To Jakarta EE 9, A Travelogue

Tue Dec 14 16:41:59 EST 2021

I think it's been a little while since I talked about the XPages Jakarta EE Support project of mine. The goal of that is sort of the inverse of the XPages Runtime project: rather than bringing XPages to a proper modern app server, the JEE Support project brings a handful of current Jakarta EE specs to XPages. It started out a few years ago as a sort of proof-of-concept, but I've since been using it for client work to do things like use newer Jakarta REST (née JAX-RS), CDI, and newer EL in XPages and OSGi bundles.

The Specification Move

Originally, I targeted a set of specifications from Java/Jakarta EE 8. Some of these were new to Domino outright, while some (such as JAX-RS) existed in the XPages stack already but in very old forms. I implemented those and for a good while just used the project as a source of parts for client work, tweaking it here and there as needed.

However, the long-prophesied package-name switch from javax.* to jakarta.* came to fruition in Jakarta EE 9, released a bit over a year ago. In the intervening year, most implementations of the specs made the switch, and the versions I was using started to show their age (for example, I was using RESTEasy 3, which was already old when I adopted it, and it's going to 6 now). Beyond just the philosophical sadness of my project being behind, I started to grow specific needs to upgrade components: we switched to JSON-B a while ago, but then some new bug fixes in Yasson were coming only to post-jakarta.* builds.

The Initial Work

I first gave a shot to this in August, initially planning to move only JSON-P and JSON-B over to the new namespace. However, I quickly hit the limits of that, since a lot of these specs are interdependent. JAX-RS uses JSON-P and JSON-B to emit JSON content, Yasson has some ties to CDI, and so forth. I realized that it was going to have to be all-or-nothing.

So I rolled up my sleeves and assessed the task ahead of me. At a basic level, there was the job of updating my dependencies, which immediately had some good aspects and bad aspects:

  • Previously, I was using a hodgepodge of spec packages like the JBoss bundling of JAX-RS in order to get something that would work and be license-friendly. Now that it was all over at Eclipse, I could switch to the nice, clean official versions and have no license worries.
  • I also used to have all sorts of OSGi rule overrides to account for Domino-specific oddities like ancient versions of various specs being supplied by the default classpath or other, conflicting bundles, all with no versioning. Once I was looking for e.g. jakarta.annotation instead of javax.annotation, I was no longer bound to that particular nightmare.
  • Not all of my dependencies were ready. When I first started, RESTEasy (my JAX-RS provider of choice) had not yet uploaded a JEE-9-compatible version. My main choices were to try using Eclipse Transformer, which would add a whole new layer to the task, or to switch to another provider.

Then there's the elephant in the room: the freaking Servlet API, which much of this depends on. Since the Servlet API is the job of the web container, I can't realistically upgrade it. Fortunately, that's only half true: I can't give it new capabilities (like Web Sockets), but I can wrap the old stuff with the new. And, like the other specs, the switch of the package name was a tremendous blessing, allowing me to deploy the official Servlet 5 API unchanged. Then, I did the tedious work of writing a slew of adapter classes, half wrapping a javax.servlet component and pretending it's jakarta.servlet and half going the other direction. Since the methods are either direct analogs, optional features, or can be emulated, this was actually much easier than I thought it would be. And there: Servlet 5 on Domino! Kind of!
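
As a rough sketch of what those adapters look like (heavily truncated, and not the project's actual classes), each one is just a thin delegating wrapper:

// Wraps a javax.servlet request and presents it under the jakarta.servlet API.
// Only a couple of delegating methods are shown; the real adapters cover the
// full interfaces, in both directions.
public class JakartaServletRequestWrapper /* implements jakarta.servlet.http.HttpServletRequest */ {
    private final javax.servlet.http.HttpServletRequest delegate;

    public JakartaServletRequestWrapper(javax.servlet.http.HttpServletRequest delegate) {
        this.delegate = delegate;
    }

    public String getRequestURI() {
        return delegate.getRequestURI();
    }

    public String getHeader(String name) {
        return delegate.getHeader(name);
    }
}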

The Showstopper

However, I soon hit what seemed to be a show-stopper: a LinkageError problem when using CDI that didn't show up previously. My search for the topic found only one hit: an issue in Open Liberty referencing almost exactly the same problem. My heart sank when I read that their fix was to upgrade the Equinox runtime - something that's outside my powers on Domino (probably).

So, disheartened, I set it aside for a couple months. I figured there was a small chance that Weld (the CDI implementation at the heart of the trouble) would put out an update that fixed it - after all, an older version worked.

Resuming Work

After setting it aside, it kept eating away at the back of my mind, and two things kept pushing me to go back to it:

  • I'll need to do it eventually. I (and my client projects) can't just be stuck at the old style forever.
  • I really didn't want to admit defeat and switch back to Gson for JSON processing.

So I went back to it. My initial hope - that a new version of Weld would magically fix the problem - proved to not have come to fruition. Still, though, I wasn't sure that it was the exact same problem Liberty encountered. For one, my use of CDI studiously avoids actually telling it about OSGi, since I've had little luck making use of that with Domino's OSGi stack. That was enough cause to make me think I could work around it.

And work around it I did! The trouble turned out to be, unsurprisingly, a bit esoteric, but boiled down to the runtime re-registering proxy classes for the same core components. My guess is that, somewhere along the line, Weld changed some sort of internal cache in a way that would break when using a bunch of ephemeral per-NSF containers as I do. I implemented my own (since it's an intended extension point) and added a bit of a cache, and I was back to the races.

As a convenient blessing, RESTEasy released 6.0.0.Beta1 just days before I got back to it, a major release targeted at JEE 9. That meant that I could save a ton of work by not having to re-work everything for another JAX-RS implementation. I had been looking into Jersey, which I'm sure would have done the job, but it's fiddly work trying to put all these pieces together on Domino, and I was all the happier to not have to re-do it all.

JavaMail

But then I hit a new problem: the javax.mail API, now jakarta.mail. The first part of this is easy enough: bring in the new spec bundle and everything will point to it. Great! I hit an immediate problem, and one I had been dreading dealing with. Though the spec changed package names, the implementation didn't. That brought me face-to-face again with an old nemesis of mine, sitting there in Domino's classpath, corrupting it:

A screenshot of Domino's ndext directory

The way the Mail API works is that there's a file, called "mailcap", that lists implementations for common data types, like:

text/plain;;		x-java-content-handler=com.sun.mail.handlers.text_plain
text/html;;		x-java-content-handler=com.sun.mail.handlers.text_html
text/xml;;		x-java-content-handler=com.sun.mail.handlers.text_xml
multipart/*;;		x-java-content-handler=com.sun.mail.handlers.multipart_mixed; x-java-fallback-entry=true

So, while all the entrypoint classes are jakarta.mail.* now, the implementations remain com.sun.mail.*, all with the same class names. And, since this little jerk of a JAR is sitting in the system classpath, it has a way of showing up all the time, complaining that com.sun.mail.handlers.text_plain is incompatible with jakarta.activation.DataContentHandler.

This is extremely fiddly to deal with, potentially involving writing a special ClassLoader implementation that blocks calls down to the lower-level JAR. While maybe possible, I'm not sure it'd be possible in a way that would be practical for normal use in an app.

And so, with a heavy heart, I forked the thing and added an "org.openntf" in front of all the package names. And that... works! It works just fine. It means that I'm on the hook for manually integrating any upstream changes, but at least it works without having to worry about any conflicts.

That wasn't the end of my trouble with this spec, though. The spec package itself, in jakarta.mail.Session, uses ServiceLoader to look for services, and it does so in the form that looks them up with the current thread's ClassLoader. Because I'm working in OSGi, that ClassLoader - the XPage app's loader - won't know about the implementation classes directly, and this call fails. And, while there's a whole sub-spec in OSGi for dealing with this, I've never had success actually getting it working in Domino.

So I forked that freaking thing too and modified the calls to use its own ClassLoader, which could find the implementation by way of it being a fragment bundle attached to it.
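
To show the difference in the abstract (with a placeholder SomeService interface, not the actual jakarta.mail types), the two lookup styles are:

// Looks services up via Thread.currentThread().getContextClassLoader() -
// in an XPages app, that loader can't see the OSGi fragment with the implementation
ServiceLoader<SomeService> viaThreadLoader = ServiceLoader.load(SomeService.class);

// Looks services up via the API's own ClassLoader, which can see an attached
// fragment bundle that supplies the implementation
ServiceLoader<SomeService> viaOwnLoader =
    ServiceLoader.load(SomeService.class, SomeService.class.getClassLoader());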

And, with that, finally, I had Jakarta Mail properly hooked up and working without having to jump through too many hoops. I'd still prefer to not have forked the source, but it was the best of a bad lot of choices.

The Final Tally

That brings the specs updated/added in this project to:

  • Expression Language 4.0
  • Contexts and Dependency Injection 3.0
    • Annotations 2.0
    • Interceptors 2.0
    • Dependency Injection 2.0
  • RESTful Web Services (JAX-RS) 3.0
  • Bean Validation 3.0
  • JSON Processing 2.0
  • JSON Binding 2.0
  • XML Binding 3.0
  • Mail 2.1
    • Activation 2.1

Not too shabby, if I say so myself. Technically, Servlet 5.0 is in there, but it doesn't actually bring any newer-than-2.4 powers to the Servlet container, so it's really just infrastructural details.

Now I'll just have the work of updating my client project and finally getting to use whatever that Yasson bug fix was that prompted this in the first place.

Java's Shakier Old APIs

Fri Dec 10 11:24:25 EST 2021

Tags: java

In my last post, I sang the praises of InputStream and OutputStream: two classes from Java 1 that, while not perfect, remain tremendously useful and used everywhere.

Then, a tweet by John Curtis got me thinking about the opposite cases: APIs from the early Java days that are still with us, are still used relatively frequently, but which are best avoided or used very sparingly.

There are a handful of APIs from the early days that may or may not still exist, but which aren't regularly encountered in most of our work: the Applet API, for example, was only recently actually removed, and it was clear for a long time that it wasn't something to use. Some other APIs are more insidious, though. They're right there alongside newer counterparts, and they're not marked as @Deprecated, so you just have to kind of magically know why you shouldn't use them.

Old Collections

One of these troublesome holdovers is a "freebie" for Domino developers: java.util.Vector. This is paired with other "first revision" collection classes like Hashtable, classes that predate the Collections Framework in 1.2 and which were retrofitted into it.

These classes aren't incorrect as such: they do what they're supposed to do and function as working implementations of List and Map. The trouble comes in that they're sub-optimal compared to other options. In particular, they're very-heavily synchronized in a way that hurts performance in the normal cases and isn't even really ideal in the complex multi-threaded case.
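
For new code, the usual swap is straightforward - something along these lines:

List<String> names = new ArrayList<>();        // instead of Vector
Map<String, String> props = new HashMap<>();   // instead of Hashtable

// If real thread safety is needed, reach for java.util.concurrent instead
Map<String, String> shared = new ConcurrentHashMap<>();
List<String> syncNames = Collections.synchronizedList(new ArrayList<>());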

Unfortunately, since these classes aren't deprecated, an IDE would only warn you about it if it's using some stylistic validation above normal compilation. Such classes are identified best by looking for a warning paragraph like this at the bottom of their Javadoc:

Javadoc 'old class' warning for Hashtable

java.util.Date

The java.util.Date class has a simple concept: represent a point in time. However, it's a neverending font of limitations and caveats:

  • It's essentially a wrapper for a Unix timestamp in milliseconds precision, and doesn't get more precise
  • It's not immutable even though it'd make sense to be. Effective Java includes repeated examples of why this is bad
  • Though it's called "Date", it's always a single timestamp, and can't represent a day in the abstract
  • In Java 1, it also was responsible for parsing date strings, and this functionality remains (though at least deprecated)
  • As mentioned in the prompting tweet, the DateFormat classes that go with this are not thread-safe, even though one could reasonably assume they would be based on their job
  • There's no concept of time zone, though the string representation would lead one to think there might be
  • The related Calendar class is a little more structured, but in a weird way and having a lot of the same limitations

Nonetheless, Date is the obvious go-to for date/time-related operations due to its age and alluring name. And, in fact, it wasn't even until Java 8 that there was a first-party better option. That's when Java basically adopted Joda Time outright and brought it into Java as the java.time package. This system has what's required: the notion of dates and times as separate entities, time zones both as named entities (like "America/New_York") and just as offsets (like UTC-5:00), and full immutability and thread-safety, and tons else.

Unfortunately, it will be a long time for old habits to die and longer for older code to fade away, so we're stuck with Date for a while, even if only to always call #toInstant on it.
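
As a small example of that bridging, converting a legacy Date into the java.time world looks like this (the Date here is just a stand-in for one handed back by an older API):

Date legacy = new Date();                           // stand-in for a Date from an older API
Instant instant = legacy.toInstant();               // immutable point on the timeline
ZonedDateTime zoned = instant.atZone(ZoneId.of("America/New_York"));
String display = zoned.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);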

java.io.File

The java.io.File class is kind of similar to Date: it was created in Java 1 as a basic way to work with files on the filesystem. It still does that, and (as far as I know) it's not as outright bad as the above, but it's limited and non-optimal.

In Java 7, the NIO Path API was added, which replaces File in a more-generic and -adaptable way. Whereas File refers specifically to the filesystem, the Path API is adaptable to whatever you'd like while sharing the same semantics. It can also participate in the NIO ecosystem properly.

Much like how Date has a #toInstant method, File has a #toPath method to work with the transition. I make a habit of doing this almost all the time when I'm working with existing code that still uses File. And there is... a lot of this code. Even APIs that can take Path arguments will potentially turn them into Files internally to keep working with their older implementation.
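
The conversion is a one-liner, so it's easy to do at the boundary whenever older code hands you a File:

File legacy = new File("/foo/bar/baz.txt");   // handed to us by some older API
Path path = legacy.toPath();                  // bridge into NIO
boolean present = Files.exists(path);         // now usable with java.nio.file.Files utilities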

There are also a bunch of related APIs where the replacements exist but aren't quite as straightforward. ZipFile is a perfect example of this: it (and its child class JarFile) has constructors that take either a File or a String representing a file path, and that's alarming. However, the ZIP File System Provider that works with Paths is neat, but it's not as clear of a replacement for ZipFile as Path is for File. That's actually one of the reasons I use ZipInputStream even in a case where ZipFile would also work.

Conclusion

I'm sure there are other similar traps around, but those are the main ones I can think of off the top of my head. It's a bit of a shame that Sun/Oracle have been so historically reticent to mark classes wholesale as deprecated. While IDEs and toolchains have gotten better at providing "stylistic" recommendations like this, it's been slow going, and it's not universal. The best thing you can do for now is to just know about the newer alternatives and use them enough that the old kinds immediately read as "code smell" when you come across them.

Generating Archive Files On The Fly In Java

Thu Dec 09 10:30:36 EST 2021

Tags: java liberty

When working on version 3.0 of the Domino Open Liberty Runtime, I had occasion to do something I've done in other situations, but it occurred to me that it'd make a good post on its own. Specifically, part of one of the new features involved creating archive data on the fly, purely in-memory, and that's something that comes in handy quite a bit.

Background: The Task

The task at hand in that project involved the way the runtime will deploy custom extension features for Liberty when creating the server. There are a few of these, all centered around adding integration with Domino in one way or another. For the previously-existing Liberty features, this was done in three parts:

  • The actual Liberty extension code, which is a Java project that produces a Liberty-compatible OSGi bundle.
  • A "subsystem" module, which is a code-less Maven project that uses esa-maven-plugin to embed the above bundle and generate a "SUBSYSTEM.MF" file to describe it. This ESA/subsystem bit is a mechanism for distributing packaged features from the OSGi spec.
  • A "deployment" module, which is a small Java project that provides an extension for the Domino-side runtime to house and deploy the above ESA file.

For 3.0, I wanted to make a feature that would provide Notes.jar and the NAPI to applications. Since those files are proprietary and non-distributable, I couldn't include them in the actual runtime distribution and would instead have to look them up from Domino's environment at runtime. Additionally, since all I wanted to do was provide the existing API and not add any new code, there was no particular need to make a code project like the first one above.

More Background: Java Streams

The way these extensions are registered to Domino is by classes that provide some metadata about the feature and then a method called getEsaData() that returns an InputStream. Though InputStreams aren't the only way to represent arbitrary binary blobs like this, they're used everywhere by virtue of them arriving with Java 1, and they're extremely adaptable.

Basically, the idea of an InputStream is that it's just a mechanism to read a sequence of bytes from somewhere. In Domino terms, they're like NotesStream, but good.

Their utility comes from their simplicity and adaptability. Because the abstract class only deals with reading bytes and a few operations for skipping around, they can be used for all sorts of things. The prototypical use is for reading a file. For example:

Path someFile = Paths.get("/foo/bar/baz.txt");
try(InputStream is = Files.newInputStream(someFile)) {
  // read file data from the stream
}

They're not limited to that, though: the JDK comes with all sorts of InputStream variants like ByteArrayInputStream, which lets you read from a byte[] in memory.
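
For instance, a stream can just as easily be backed by memory:

byte[] data = "hello".getBytes(StandardCharsets.UTF_8);
try(InputStream is = new ByteArrayInputStream(data)) {
  // reads the same way as a file-backed stream, no filesystem involved
  int firstByte = is.read();
}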

In addition to being arbitrary as to where the byes are coming from, streams are also very composable. Many types of streams either must or may wrap an existing stream to alter it in some way. One of the more-common cases where you'd do this is when reading ZIP file data. Taking something similar to above:

Path someZipFile = Paths.get("/foo/bar/baz.zip");
try(
  InputStream is = Files.newInputStream(someZipFile);
  ZipInputStream zis = new ZipInputStream(is, StandardCharsets.UTF_8)
) {
  ZipEntry entry = zis.getNextEntry();
  // work with the ZIP entries
}

The thing to note here is that, while this happens to be coming from a ZIP file on disk, it doesn't have to be: that first is could just as easily be a stream coming from HttpURLConnection or a ByteArrayInputStream.

Along with InputStream, Java also has OutputStream. Luckily, OutputStream is similarly simply designed, and has uses that are a direct mirror for everything above: there exist ByteArrayOutputStream, ZipOutputStream, and all sorts of others.
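
Mirroring the reading examples above, that means you can build ZIP data entirely in memory:

ByteArrayOutputStream buffer = new ByteArrayOutputStream();
try(ZipOutputStream zos = new ZipOutputStream(buffer, StandardCharsets.UTF_8)) {
  zos.putNextEntry(new ZipEntry("hello.txt"));
  zos.write("hi there".getBytes(StandardCharsets.UTF_8));
}
byte[] zipData = buffer.toByteArray();   // a complete ZIP file that never touched disk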

Putting It Together

Back to the original goal, my task was to create a class that would provide an InputStream containing ESA data - that is to say, a ZIP file - to the runtime, which could then deploy it as a Liberty feature. The previous extensions did this by embedding the ESA in their JAR and then returning an InputStream to that. Now, though, I wanted to do it all dynamically.

Now, I talked a big game above about how streams didn't have to have anything to do with files, and it could all be done in memory. That's still all true, but technically here I ended up using files for caching purposes. The above is still good to know, though!

So anyway, my goal was to deliver an InputStream to the runtime that represented an ESA that looks like this:

Contents of a generated ESA file

Of those entries, "corba.jar" is the CORBA API from Maven Central to make Notes.jar work on Java 9+, while "Notes.jar" comes from jvm/lib/ext and "lwpd.commons.jar" and "lwpd.domino.napi.jar" come from the OSGi framework in the running Domino server. The remaining entries - the two "MF" files and the embedded JAR - are composed on the fly.

The starting point here is that I identify a cache location within my working directory based on the current Domino build number, naming that path out (the variable referenced in the code below). Then, I open it up as a ZIP to fill with contents, like above:

try(
  OutputStream os = Files.newOutputStream(out, StandardOpenOption.CREATE);
  ZipOutputStream zos = new ZipOutputStream(os, StandardCharsets.UTF_8)
) {
  // work happens here
}

SUBSYSTEM.MF

The next part is to build the "SUBSYSTEM.MF" file. As implied by the extension, this file has the same syntax as "MANIFEST.MF" files, and so I can use the java.util.jar.Manifest class to handle encoding and formatting. I start out by loading a template from the current bundle's resources:

Manifest subsystem;
try(InputStream is = getClass().getResourceAsStream("/subsystem-template.mf")) { //$NON-NLS-1$
  subsystem = new Manifest(is);
}

There, I'm using the constructor from Manifest that reads from an existing stream. Often, that would be reading a "MANIFEST.MF" from an existing JAR, but it'll work with any stream.

Then, I fill it in with some details, with lines like:

Attributes attrs = subsystem.getMainAttributes();
attrs.putValue("Subsystem-Name", getShortName()); //$NON-NLS-1$
String featureName = getShortName() + "-" + getFeatureVersion(); //$NON-NLS-1$
attrs.putValue("IBM-ShortName", featureName); //$NON-NLS-1$
// etc.

Finally, I create an entry in the ZipOutputStream and write the contents. The way ZipOutputStream works is that its "stream-iness" counts towards whatever the most-recently-added entry is.

zos.putNextEntry(new ZipEntry("OSGI-INF/SUBSYSTEM.MF")); //$NON-NLS-1$
subsystem.write(zos);

Embedded Bundle

Alright, so far, so good. Up until now, this is the "normal" case for working with ZIP files, where you make a new entry and pour in some text data. What's neat, though, is that the encapsulation capabilities of these streams can be stacked, which is what comes up next.

Specifically, I wanted to put a ZIP file (the .jar) within this surrounding ZIP (the .esa). The way this is done is by just composing the same tools we've been working with again. Here, esa is what zos was above: the outermost package ZIP contents. I just renamed it in this method for clarity inside the code itself.

// This will be a shell bundle that in turn embeds the API JARs
esa.putNextEntry(new ZipEntry(BUNDLE_NAME + "_" + dominoVersion + ".0.0.jar")); //$NON-NLS-1$ //$NON-NLS-2$
		
// Build the embedded JAR contents
try(ZipOutputStream zos = new ZipOutputStream(esa, StandardCharsets.UTF_8)) {
  Manifest manifest;
  try(InputStream is = getClass().getResourceAsStream("/manifest-template.mf")) { //$NON-NLS-1$
    manifest = new Manifest(is);
  }
  
  // Finish manifest
  // More work here
}

So there, I'm doing basically the same thing as I did originally, to make a ZipOutputStream. Since the ZipOutputStream really doesn't care what the stream it's writing to is, it works just as well when writing to a file as when writing to another ZIP stream - the cascading streams handle their own encoding and it works out in the end.

Once I write the manifest, I can make use of the Files utility class to embed each of the JARs from the filesystem:

for(Path jar : embeds) {
  zos.putNextEntry(new ZipEntry(jar.getFileName().toString()));
  Files.copy(jar, zos);
}

Finally, I download the CORBA JAR on the fly, so for that one I use a utility function to download from the remote URL:

OpenLibertyUtil.download(new URL(URL_CORBA), is -> {
  zos.putNextEntry(new ZipEntry("corba.jar")); //$NON-NLS-1$
  IOUtils.copy(is, zos);
  return null;
});

Here, I use IOUtils from Apache Commons IO because it's not copying from a filesystem path, but the idea is basically the same, and exactly the same as far as the destination ZIP is concerned.

The Final Result

Once this is all written to the cached file on the filesystem, the final result is just to return a stream from it:

return Files.newInputStream(out);

Since the job of this extension class is only to return an InputStream, the consuming code doesn't care that the extension did all this work, as opposed to the other ones that just return a stream of an embedded resource: everything else is the same.
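
Strung together, the overall shape of the extension ends up being roughly the following - a sketch where getWorkingDirectory(), getDominoBuildNumber(), and buildEsa() are hypothetical stand-ins for the real helpers:

// Hypothetical sketch; buildEsa(out) stands in for the Manifest/ZipOutputStream work above
public InputStream getEsaStream() throws IOException {
  Path out = getWorkingDirectory().resolve("notes-feature-" + getDominoBuildNumber() + ".esa");
  if(!Files.exists(out)) {
    buildEsa(out);
  }
  return Files.newInputStream(out);
}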

So, all in all, this isn't a groundbreaking new technique, but that's the point: the way these lower-level JDK components work, you get a tremendous amount of flexibility from just a few common parts.

Journeys Debugging Open Liberty and MVC

Tue Nov 30 16:40:24 EST 2021

I mentioned in my last post that I've been tinkering with a modern structure for OpenNTF's web site as a side project. In that, I talked about how I've been going with Jakarta MVC for the front end, but ran into an odd problem with the latest versions in Open Liberty, and that was the impetus for tinkering with ERB.

Well, I decided to go back and take a swing at trying to make JSP work in this case, since it's (still) a good engine for this purpose, and it could be a fun experiment. I was indeed able to do it, and I think the path I took is worth chronicling here.

Context

In previous projects, such as this blog, I've used older versions of the software stack involved - basically, Jakarta 8, which is before the "big bang" switch from the javax.* to jakarta.* package namespace. Since this is a clean new app, I really want to lean to the newest versions across the board, so I pegged my plans to that.

Though Jakarta EE 9 and 9.1 (the Java 11 official version) have been out for a bit, the switchover comes with the sort of turbulence one would expect. Until just this past week, Open Liberty supported JEE 9 only in beta releases - these have historically been plenty stable for me, but it's always asking for trouble. Even with that non-beta version out, I found myself still on the beta track: I'm addicted to using MicroProfile Config, and MicroProfile's move to JEE 9 support is still itself in the RC stage.

So, okay, betas it is.

The Problem

Once I set everything up on JEE 9 and MP 5, I hit this exception when trying to render a JSP via an MVC Controller object:

java.lang.RuntimeException: SRV.8.2: RequestWrapper objects must extend ServletRequestWrapper or HttpServletRequestWrapper
  at com.ibm.wsspi.webcontainer.util.ServletUtil.unwrapRequest(ServletUtil.java:89)
  at [internal classes]
  at org.eclipse.krazo.engine.ServletViewEngine.forwardRequest(ServletViewEngine.java:135)
  at org.eclipse.krazo.engine.JspViewEngine.processView(JspViewEngine.java:58)
  at org.eclipse.krazo.core.ViewableWriter.writeTo(ViewableWriter.java:159)
  at org.eclipse.krazo.core.ViewableWriter.writeTo(ViewableWriter.java:1)
  at org.jboss.resteasy.core.interception.jaxrs.ServerWriterInterceptorContext.lambda$writeTo$1(ServerWriterInterceptorContext.java:79)
  ... 4 more

The short of it is that Krazo (the MVC implementation) passes JSP rendering along to the app container, rather than doing its own JSP work, which makes perfect sense. This, however, hits trouble within Liberty's ServletUtil class, which attempts to "unwrap" the incoming HttpServletRequest object to find the core Liberty-specific object to use extended methods on.

Normally, this sort of thing would work fine: every app server has its own variant of HttpServletRequest for its own uses, and it's perfectly reasonable to do this kind of unwrapping. However, for some reason, this was going awry.

The specific code from Krazo that calls down into Liberty code does do new HttpServletRequestWrapper(request), but that's also legal: the unwrapRequest method is intended specifically to unwrap spec-standard HttpServletRequestWrapper objects like that. So that's not our culprit, and I had to dig deeper.
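
For context, the unwrapping that happens under the hood is conceptually along these lines - a simplified sketch of the general pattern (using the jakarta.servlet types), not Liberty's actual code:

static ServletRequest unwrapTo(ServletRequest request, Class<?> target) {
  ServletRequest current = request;
  // Peel off spec-standard wrappers until we hit the container's own request type
  while(!target.isInstance(current) && current instanceof ServletRequestWrapper) {
    current = ((ServletRequestWrapper)current).getRequest();
  }
  if(!target.isInstance(current)) {
    throw new RuntimeException("RequestWrapper objects must extend ServletRequestWrapper or HttpServletRequestWrapper");
  }
  return current;
}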

Investigation

To start my investigation, I knew I'd need to work with the Krazo layer. Fortunately, though many of the moving parts here are baked into the Liberty server, Krazo is not - I include it as a Maven dependency in my app. So I cloned the Krazo source, added it to my workspace, and set my dependency on the SNAPSHOT version, allowing me to do my work inside Krazo's classes.

Context

So what was going on? At first, I thought that maybe something had snuck in a javax.* class somewhere - old code that wasn't fully migrated to JEE 9. That would certainly cause the trouble: javax.servlet.ServletRequestWrapper and jakarta.servlet.ServletRequestWrapper are, to the JVM's eyes, entirely-unrelated classes with no compatibility whatsoever. And, indeed, looking at the source of ServletUtil could give one that impression right away, since the code uses javax.*.

That's not the trouble, however. Though the class is written to javax.*, I gather that it's run through Eclipse Transformer during packaging of the app server, and the actual class that's involved uses jakarta.*. Okay, that's good to know and makes sense, but it also doesn't get us any closer to the root problem.

For my next step, I wanted to figure out what, specifically, it was looking for. The unwrapRequest method takes a Class<?> parameter to find the needed request type, but the stack trace above hid the path it took to get there. By attaching a debugger to the server, I gleaned that it was being called by the unwrapRequest variant above it that looks specifically for a com.ibm.wsspi.webcontainer.servlet.IExtendedRequest.

Okay, so I have the name of the interface it's looking for - I can work with this. My next step was to try to get a programmatic handle on it. The basic approach, when you don't have the class as part of your project, is to look it up via:

Class<?> requestClass = Class.forName("com.ibm.wsspi.webcontainer.servlet.IExtendedRequest");

That doesn't work here, though: though the app server definitely has that class, it (properly) shields the running application from accessing the class directly, so that the app runtime isn't contaminated by the surrounding server code.

What I needed to do next was find a ClassLoader that does know about it, and the best way to do that is to find a class provided from outside the running app and ask that. Fortunately, the incoming request is exactly that. So:

Class<?> requestClass = Class.forName("com.ibm.wsspi.webcontainer.servlet.IExtendedRequest", false, request.getClass().getClassLoader());

What that does is ask whatever ClassLoader the request object comes from - that is to say, the container's loader - to find the class. And it worked! Now I could test to verify whether the core request matches the type needed:

Class<?> requestClass = Class.forName("com.ibm.wsspi.webcontainer.servlet.IExtendedRequest", false, request.getClass().getClassLoader());
System.out.println("does it match? " + requestClass.isInstance(request));

As expected, that resolved to false. In this case, that's good, since it'd have been a much-worse problem if it hadn't. But what is the request object, anyway? Well:

System.out.println(request.getClass());
// output: jdk.proxy15.$Proxy68

Right, okay, that makes sense: all sorts of stuff uses proxy objects in Java, not the least of which being the CDI environment running the whole show.

Cracking Open Proxies

So I had some good information at this point: the HttpServletRequest that Krazo is handed is a proxy object, but Liberty has a hard requirement that its dispatcher is given an instance of IExtendedRequest, which this is not. That means that something in the stack is taking the original Liberty request object and making a proxy for it - fair enough, but inconvenient for me.

My next thought was that maybe I could track down the type of proxy object it is and, with that knowledge, get the underlying delegate request. That's a common-enough pattern: have an instance property in your proxy class that contains the delegate, and (if I'm lucky) have it accessible via a getter. Java's java.lang.reflect.Proxy class has a static method for determining the object that actually handles called methods:

System.out.println(Proxy.getInvocationHandler(request));
// output: org.jboss.resteasy.core.ContextParameterInjector$GenericDelegatingProxy@fc04422d

This was starting to come together all the more. Liberty recently switched from Apache CXF to RESTEasy for its JAX-RS implementation, and that could explain why this is trouble now when it wasn't before. More importantly for my immediate needs, that also gave me a lead to track down the proxy class.

However, though it was easy enough to find, my heart sank a bit at what I found: rather than having an easy instance property to get the real request, it uses an object from its container class and in turn asks that for the request. Maybe I'd be able to get to that via reflection, but the prospect of figuring out how to work with nested class contexts caused me to try to look around elsewhere instead.

CDI

Another potential answer came to me in a flash: CDI! Access to the CDI environment is standardized, and maybe I could fetch the original request from there. It'd be extremely likely that it'd just hand me back a similar proxy, but it'd be worth a shot. So here we go:

HttpServletRequest cdiReq = CDI.current().select(HttpServletRequest.class).get();
System.out.println(cdiReq);
// output: com.ibm.ws.webcontainer40.srt.SRTServletRequest40@1a22712c

Oh! Good! That's one of Liberty's internal types! Is it the object I need, though? Well... no. Crap. requestClass.isInstance(cdiReq) is false, so this didn't really get me very far. That's a shame, too, since that solution could have involved no implementation-specific code at all.

Internal Liberty Classes

My next thought was that I should try to find another way to get around to finding the true request object. I looked back through the debug stack to find where it was originally calling unwrapRequest to get a bit more context:

WebContainerRequestState reqState = WebContainerRequestState.getInstance(false);

wasReq = (IExtendedRequest) ServletUtil.unwrapRequest(request);

Okay, so what's this with WebContainerRequestState? That sure smells like an object that's meant to be a request-wide object to get to all sorts of state. If I were to write such a class, I'd use it to stash the incoming request as well as any other incidental data that I wouldn't want to ferret away in a way that might leak into the app. I was a little wary that WebAppRequestDispatcher didn't use it to get the IExtendedRequest, but maybe I'd luck out.

And boy, did I! Looking down the source of the file, I found my mark: public IExtendedRequest getCurrentThreadsIExtendedRequest().

The (Provisional) Solution

Now, I had all the tools I needed. Towards the bottom of Krazo's ServletViewEngine class, I conjured up this reflective incantation:

try {
  Class<?> requestStateClass = Class.forName("com.ibm.wsspi.webcontainer.WebContainerRequestState", false, request.getClass().getClassLoader());
  Method getInstance = requestStateClass.getDeclaredMethod("getInstance", boolean.class);
  Object requestState = getInstance.invoke(null, false);
  Method getCurrentThreadsIExtendedRequest = requestStateClass.getDeclaredMethod("getCurrentThreadsIExtendedRequest");
  request = (HttpServletRequest)getCurrentThreadsIExtendedRequest.invoke(requestState);
} catch (Throwable e1) {
  // Not on WAS
}
rd.forward(new HttpServletRequestWrapper(request), new HttpServletResponseWrapper(response));

So what I'm doing here is breaking into the container's ClassLoader in order to get a handle on WebContainerRequestState. From there, I'm able to call the method to get the current instance, and then in turn call the method to get the IExtendedRequest. I overwrite the request variable we're working with, and then pass that along to the dispatcher. If any of that were to fail, I just throw up my hands, assume I'm not on Liberty, and continue on as before.

And... it works! It actually works! The pages now render properly, with all the niceness of modern JSP at my fingertips! It was fun to toy with the idea of ERB, but I like this better for an otherwise-pure-Java app for sure.

Next Steps

So I have a solution that works for me, but it's so ugly and implementation-specific that I can't exactly be comfortable with it. Still, the trouble comes from an implementation-specific source, so that may be required. Maybe I'll have to leave it like this.

More responsibly, though, what I should do is narrow this down into a reproducible case without all the other moving parts in the app to make sure that it's actually a bug/incompatibility, and thus something that I can report. This is all open-source software, after all, and it'd do nobody any good for me to let a potential actual problem linger. I'll just have to properly identify where the true culprit is first. Is it because Krazo uses RequestDispatcher in a somewhat-unusual way? Is it because RESTEasy is too aggressive about wrapping requests with no proper way to get to the delegate? Is it that Liberty should handle this case better internally? Or maybe it's just some side effect of the other things I have going on. Research is warranted.

In the mean time, that was a fun one. I don't know that I'll have a need for this specific solution again, but it was good to find, and it's always good to get some troubleshooting practice like this for sure.

Writing A Custom ViewEngine For Jakarta MVC

Tue Nov 02 14:31:20 EDT 2021

One of the very-long-term side projects I have going on is a rewrite of OpenNTF's site. While the one we have has served us well, we have a lot of ideas about how we want to present projects differently and similar changes to make, so this is as good a time as any to go whole-hog.

The specific hog in question involves an opportunity to use modern Jakarta EE technologies by way of my Domino Open Liberty Runtime project, as I do with my blog here. And that means I can, also like this blog, use the delightful Jakarta MVC Spec.

However, when moving to JEE 9.1, I ran into some trouble with the current Open Liberty beta and its handling of JSP as the view template engine. At some point, I plan to investigate to see if the bug is on my side or in Liberty's (it is beta, in this case), but in the mean time it gave my brain an opportunity to wander: in theory, I could use ERB (a Ruby-based templating engine) for this purpose. I started to look around, and it turns out the basics of such a thing aren't difficult at all, and I figure it's worth documenting this revelation.

MVC ViewEngines

The way the MVC spec works, you have a JAX-RS endpoint that returns a string or is annotated with a view-template name, and that's how the framework determines what page to use to render the request. For example:

@Path("home")
@GET
@Produces(MediaType.TEXT_HTML)
public String get() {
  models.put("recentReleases", projectReleases.getRecentReleases(30)); //$NON-NLS-1$
  models.put("blogEntries", blogEntries.getEntries(5)); //$NON-NLS-1$

  return "home.jsp"; //$NON-NLS-1$
}

Here, the controller method does a little work to load required model data for the page and then hands it off to the view engine, identified here by returning "home.jsp", which is in turn loaded from WEB-INF/views/home.jsp in the app.

Internally, the framework looks through instances of ViewEngine to find one that can handle the named page. The default spec implementation ships with a few of these, and JspViewEngine is the one that handles view names ending with .jsp or .jspx. The contract for ViewEngine is pretty simple:

public interface ViewEngine {
  boolean supports(String view);
  void processView(ViewEngineContext context) throws ViewEngineException;
}

So basically, one method to check whether the engine can handle a given view name and another one to actually handle it if it returned true earlier.

Writing A Custom Engine

With this in mind, I set about writing a basic ErbViewEngine to see how practical it'd be. I added JRuby to my dependencies and then made my basic class:

@ApplicationScoped
@Priority(ViewEngine.PRIORITY_APPLICATION)
public class ErbViewEngine extends ViewEngineBase {

  @Inject
  private ServletContext servletContext;

  @Override
  public boolean supports(String view) {
    return String.valueOf(view).endsWith(".erb"); //$NON-NLS-1$
  }

  @Override
  public void processView(ViewEngineContext context) throws ViewEngineException {
    // ...
  }
}

At the top, you see how a custom ViewEngine is registered: it's done by way of making your class a CDI bean in the application scope, and then it's good form to mark it with a @Priority of the application level stored in the interface. Extending ViewEngineBase gets you a handful of utility methods, so you don't have to, for example, hard-code WEB-INF/views into your lookup code. The bit with ServletContext is there because it becomes useful in the implementation below - it's not part of the contractual requirement.

And that's basically the gist of hooking up your custom engine. The devil is in the implementation details, for sure, but that processView is an empty canvas for your work, and you're not responsible for the other fiddly details that may be involved.

First-Pass ERB Implementation

Though the above covers the main concept of this post, I figure it won't hurt to discuss the provisional implementation I have a bit more. There are a couple ways to use JRuby in a Java app, but the way I'm most familiar with is using JSR 223, which is a generic way to access scripting languages in Java. With it, you can populate contextual objects and settings and then execute a script in the target language. The Krazo MVC implementation actually comes with a generic Jsr223ViewEngine that lets you use any such language by extension.

In my case, the task at hand is to read in the ERB template, load up the Ruby interpreter, and then pass it a small script that uses the in-Ruby ERB class to render the final page. This basic implementation looks like this:

@Override
public void processView(ViewEngineContext context) throws ViewEngineException {
  Charset charset = resolveCharsetAndSetContentType(context);

  String res = resolveView(context);
  String template;
  try {
    // From Apache Commons IO
    template = IOUtils.resourceToString(res, StandardCharsets.UTF_8);
  } catch (IOException e) {
    throw new ViewEngineException("Unable to read view", e);
  }

  ScriptEngineManager scriptEngineManager = new ScriptEngineManager();
  ScriptEngine scriptEngine = scriptEngineManager.getEngineByExtension("rb"); //$NON-NLS-1$
  Object responseObject;
  try {
    Bindings bindings = scriptEngine.createBindings();
    bindings.put("models", context.getModels().asMap());
    bindings.put("__template", template);
    responseObject = scriptEngine.eval("require 'erb'\nERB.new(__template).result(binding)", bindings);
  } catch (ScriptException e) {
    throw new ViewEngineException("Unable to execute script", e);
  }

  try (Writer writer = new OutputStreamWriter(context.getOutputStream(), charset)) {
    writer.write(String.valueOf(responseObject));
  } catch (IOException e) {
    throw new ViewEngineException("Unable to write response", e);
  }
}

The resolveCharsetAndSetContentType and resolveView methods come from ViewEngineBase and do basically what their names imply. The rest of the code here reads in the ERB template file and passes it to the script engine. This is extremely similar to the generic JSR 223 implementation, but diverges in that the actual Ruby code is always the same, since it exists just to evaluate the template.

If I continue down this path, I'll put in some time to make this more streamable and to provide access to CDI beans, but it did the job to prove that it's quite doable.

All in all, I found this exactly as pleasant and straightforward as it should be.

Adding Selenium Browser Tests to My Testcontainers Setup

Tue Jul 20 11:20:42 EDT 2021

  1. Tinkering With Testcontainers for Domino-based Web Apps
  2. Adding Selenium Browser Tests to My Testcontainers Setup
  3. Building a Full Domino Image for JUnit Tests

Yesterday, I talked about how I dove into Testcontainers for my app-testing needs. Today, I decided to use this to close another bit of long-open business: automated browser testing. I've been very much a dilettante when it comes to that, but we have a handful of browser-ish tests just to make sure the login page, the main page, and some utility pages load up and include expected content, and those can serve as a foundation for much more.

Background

In general, when you think "automated browser testing", that means Selenium. As a toolkit, Selenium has hooks for the browsers you want and has essentially universal support, working smoothly in Java with JUnit. However, the actual act of loading a real browser is miserable, mostly on account of needing you to install the browser and point to it programmatically, which is doable but is another potential system-specific configuration that I'd much, much rather avoid in my automated builds.

Accordingly, and because my needs have been simple, I've used HtmlUnit, which is a portable Java browser-like library that does the yeoman's work of letting you perform basic Selenium tests without having to configure actual native OS installations. It's neat, imposes basically no strictures on your workflow, and I recommend it for lots of uses. Still, it's not the same as real browsers, and I had to do things like disable JavaScript processing to avoid it tripping up on some funky JS that full-grown browsers can deal with.
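
For reference, that HtmlUnit-based setup amounts to something like the following - a sketch rather than my exact test code, with the second constructor argument controlling whether JavaScript is enabled:

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;
import com.gargoylesoftware.htmlunit.BrowserVersion;

// Emulates Firefox in-JVM, with no native browser install required
WebDriver driver = new HtmlUnitDriver(BrowserVersion.FIREFOX, false);
driver.get("http://localhost:8080/app"); // hypothetical URL
String title = driver.getTitle();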

Enter Webdriver Containers

So, now that I had Testcontainers configured to run the web app, my eye turned to Webdriver Containers, an ancillary capability of Testcontainers that lets you run these full-fledged browsers via their Docker images, and even has cool abilities like letting you record the screen interactions over VNC. Portability and full production representation? Sign me up.

The initial setup was pretty easy, just adding some dependencies for the Selenium remote driver (replacing my HtmlUnit driver) and the Testcontainers Selenium module:

<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-remote-driver</artifactId>
    <version>3.141.59</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>selenium</artifactId>
    <version>1.15.3</version>
    <scope>test</scope>
</dependency>

Programmatic Container Setup

After that, my next task was to configure the containers. I'll skip over some of my troubleshooting and just describe where I ended up. Basically, since both the webapp and browsers are in Docker containers, I had to coordinate how they communicate with each other. There seem to be a few ways to do this, but the route I went was to build a Docker network in my container orchestration class, bind all of the containers to it, and then reference the app via a network alias.

With that addition and some containers for Chrome and Firefox, the class looks more like this:

public enum AppTestContainers {
    instance;
    
    public final Network network = Network.builder()
        .driver("bridge") //$NON-NLS-1$
        .build();
    public final GenericContainer<?> webapp;
    public final BrowserWebDriverContainer<?> chrome;
    public final BrowserWebDriverContainer<?> firefox;
    
    @SuppressWarnings("resource")
    private AppTestContainers() {
        webapp = new GenericContainer<>(DockerImageName.parse("client-webapp-test:1.0.0-SNAPSHOT")) //$NON-NLS-1$
                .withExposedPorts(8080)
                .withNetwork(network)
                .withNetworkAliases("client-webapp-test"); //$NON-NLS-1$
        
        chrome = new BrowserWebDriverContainer<>()
            .withCapabilities(new ChromeOptions())
            .withNetwork(network);
        firefox = new BrowserWebDriverContainer<>()
            .withCapabilities(new FirefoxOptions())
            .withNetwork(network);

        webapp.start();
        chrome.start();
        firefox.start();
        
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            webapp.close();
            chrome.close();
            firefox.close();
            network.close();
        }));
    }
}

Now that they're all on the same Docker network, the browser containers are able to refer to the webapp like "http://client-webapp-test:8080".
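
On the host side, meanwhile, the JUnit JVM reaches the app via whatever random local port Testcontainers mapped. The URL-helper pair used in the tests below could look something like this - the method names match later usage, but the bodies are my guesses rather than the actual project code:

public static String getContainerRootUrl() {
    // Inside the Docker network, the alias and internal port work directly
    return "http://client-webapp-test:8080";
}

public static String getRootUrl() {
    // On the host, ask Testcontainers which local port 8080 was mapped to
    GenericContainer<?> webapp = AppTestContainers.instance.webapp;
    return "http://" + webapp.getHost() + ":" + webapp.getMappedPort(8080);
}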

Adding Parameterized Tests

The handful of UI tests I'd set up previously had lines like WebDriver driver = new HtmlUnitDriver(BrowserVersion.FIREFOX, true) to create their WebDriver instance, but now I want to run the tests with both real Firefox and real Chrome. Since I want to test that the app works consistently, I'll want the same tests across browsers - and that's a call for parameterized tests in JUnit.

The way parameterized tests work in JUnit is that you declare a test as being parameterized, and then feed it your parameters via one of a number of mechanisms - "all values of an enum", "this array of strings", and a handful of others. The one to use here is to make a class implementing ArgumentsProvider and configure that:

import java.util.stream.Stream;

import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.ArgumentsProvider;
import org.testcontainers.containers.BrowserWebDriverContainer;

public class BrowserArgumentsProvider implements ArgumentsProvider {
    @Override
    public Stream<? extends Arguments> provideArguments(ExtensionContext context) throws Exception {
        return Stream.of(
            AppTestContainers.instance.chrome,
            AppTestContainers.instance.firefox
        )
        .map(BrowserWebDriverContainer::getWebDriver)
        .map(Arguments::of);
    }
}

This class will take my configured browser containers, get the WebDriver instance for each, and provide that as parameters to a test method. In turn, the test method looks like this:

@ParameterizedTest
@ArgumentsSource(BrowserArgumentsProvider.class)
public void testDefaultLoginPage(WebDriver driver) {
    driver.get(getContainerRootUrl());
    assertEquals("Expected App Title", driver.getTitle());

    // Other tests follow
}

Now, JUnit will run the test twice, once for each browser, and I can add any other configurations I want smoothly.

Minor Gotcha: Container vs. Non-Container URLs

Though some of my tests were using Selenium already, most of them just use the JAX-RS REST client from the testing JVM directly, which is not containerized in this setup. That meant that I had to start worrying about the distinction between the URLs - the containers can't access "localhost:(some random port)", while the JUnit JVM can't access "client-webapp-test:8080".

For the most part, that's not too tough: I added some more utility methods named to suit and changed the UI tests to use those. However, there was one tricky bit: one of the UI tests uses Selenium to fetch the page and process the HTML, but then uses the JAX-RS client to make sure that a bunch of references on the page resolve to non-404 resources properly. Stuff like this:

driver.findElements(By.xpath("//link[@rel='stylesheet']"))
    .stream()
    .map(link -> link.getAttribute("href"))
    .map(href -> rootUri.resolve(href))
    .forEach(uri -> checkUrlWorks(uri, jaxRsClient));

(It's highly likely that there's a better way to do this in Selenium, but hey, it's still a useful example.)

The trouble with the above was that the URLs coming out of Selenium included the full container URL, not the host-accessible one.

Fortunately, that's not too tricky - it's really just string substitution, since the host and container URLs are known at runtime and won't conflict with anything. So I added a "decontainerize" method and run my URLs through it in the stream:

public URI decontainerize(URI uri) {
    String url = uri.toString();
    if(url.startsWith(getContainerRootUrl())) {
        return URI.create(getRootUrl() + url.substring(getContainerRootUrl().length()));
    } else {
        return uri;
    }
}

// later

driver.findElements(By.xpath("//link[@rel='stylesheet']"))
    .stream()
    .map(link -> link.getAttribute("href"))
    .map(href -> rootUri.resolve(href))
    .map(this::decontainerize)
    .forEach(uri -> checkUrlWorks(uri, jaxRsClient));

With that, all the results came back green again.

Overall, this was a little fiddly, but mostly in a way that helped me learn a little bit more about how this sort of thing works, and now I'm prepped to do real, portable full test suites. Neat!

Java Object Proxies (Not the Networking Kind)

Fri Jul 09 14:39:23 EDT 2021

Tags: java

Though my previous post was also about proxies, I've had this topic percolating for a while, and it's related mostly by name and very-loose concept. Specifically, I'd like to talk about dynamic proxy classes in Java, which is a mechanism to allow you to create "fake" classes that programmatically intercept method calls.

This is something that I didn't properly realize was possible in Java until diving into CDI (though its mechanism is slightly different). In retrospect (and going by the "@since" tag), it's obvious that this has been present for a long time, since Java 1.3, but outside of my realm of experience.

Definition, the Roundabout Way

So, to begin with, I'll have to define what I even mean by this, or at least an example of how it works in practice.

Normally, you just have classes and objects based on them - class Foo gets instantiated via new Foo() and there's a pretty clear direct relationship, "Object-Oriented Programming 101" sort of stuff. Let's add a little indirection by way of our old friend Interfaces. Say you have this setup:

interface Person {
    String getName();
}

class PersonImpl implements Person {
    @Override
    public String getName() {
        return "Joe Schmoe";
    }
}

// ...

Person foo = new PersonImpl();
System.out.println("Hello from " + foo.getName());

This is essentially the same kind of thing that you're doing in the original concept - instantiating an object that you can call methods on - but you're taking a step back. You know here that you're just calling new PersonImpl(), but that's less of a hard requirement: you could instead do Person foo = lookupPerson("Joe Schmoe") and that method could return any implementation of the interface it likes.

So let's do just that, and here's where we see proxies:

class PersonProxy implements InvocationHandler {
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        if("getName".equals(method.getName())) {
            return "Proxy-Eyed Joe";
        }
        throw new UnsupportedOperationException("I don't know how to handle " + method);
    }
}

// ...

public Person lookupPerson(String name) {
    return (Person)Proxy.newProxyInstance(Thread.currentThread().getContextClassLoader(), new Class<?>[] { Person.class }, new PersonProxy());
}

For the code calling lookupPerson, this will function largely identically to if you used a normal class, but it's got this weirdness going on beneath. When code calls getName (or any other method), rather than calling a directly-implemented method like in the normal class, it instead calls invoke on this InvocationHandler, which can then handle it any way it'd like. If you've used more-dynamic languages, this may look similar to method_missing in Ruby or invocation forwarding in Objective-C.

Non-Thorough Mention of Other Proxy Types

Before continuing into why you'd want to use this, I think it's important to note that the Proxy object in question above is java.lang.reflect.Proxy, which is a built-in mechanism that ships with the JVM. However, it's both limited and not the only game in town. The main way it's limited is that you can only proxy to interfaces with it, not normal classes. If Person above were class Person instead of interface Person, then the stock Proxy class would be out of luck.

There are other implementations, though - the ones that spring to mind are cglib and Javassist, but I believe there are others. These differ in their implementation (often doing things like bytecode manipulation), capabilities (these generally allow you to proxy to full classes), and performance characteristics. The concepts are largely the same, though, so from now on I'll use "proxying" to refer to the concept generally and not solely to the specific capability that comes with Java.
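
As a rough illustration of that - not code from any project mentioned here, just a sketch reusing the PersonImpl class from above - proxying a concrete class with Javassist looks something like this:

import javassist.util.proxy.MethodHandler;
import javassist.util.proxy.ProxyFactory;

public class JavassistProxyExample {
    public static void main(String[] args) throws Exception {
        ProxyFactory factory = new ProxyFactory();
        factory.setSuperclass(PersonImpl.class); // a concrete class, not just an interface

        MethodHandler handler = (self, thisMethod, proceed, methodArgs) -> {
            if("getName".equals(thisMethod.getName())) {
                return "Javassist-Eyed Joe";
            }
            // Fall through to the real superclass implementation for everything else
            return proceed.invoke(self, methodArgs);
        };

        PersonImpl person = (PersonImpl)factory.create(new Class<?>[0], new Object[0], handler);
        System.out.println(person.getName()); // prints "Javassist-Eyed Joe"
    }
}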

So What's the Use?

Okay, so you can make a proxy object that allows for programmatic handling of method calls. How would this actually be useful in practice?

In my work, I've had a couple cases where I implement proxies for specific behaviors, and they're also extremely common in Jakarta EE and Spring development. Some of these uses can get pretty arcane in concept or implementation, but others will hopefully be simpler to demonstrate.

Example 1: Counting Method Calls

For this case, say you want to count how many times a method is called in practice (and also say that YourKit doesn't exist for some reason). Taking our example InvocationHandler above, we can expand it to keep a running total of calls:

class PersonProxy implements InvocationHandler {
    private final Map<Method, AtomicLong> counter = new HashMap<>();

    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        this.counter.computeIfAbsent(method, key -> new AtomicLong()).incrementAndGet();

        if("getName".equals(method.getName())) {
            return "Proxy-Eyed Joe";
        }
        throw new UnsupportedOperationException("I don't know how to handle " + method);
    }
}

(I use AtomicLong here because it's a convenient number holder, but it's also incidentally a step in the direction of making this thread-safe)

Now, as any method is called on your object, you'll get a count of invocations. You can imagine elsewhere having an admin console that lists totals for each object, giving you an idea of where the performance-sensitive parts of your code likely are.

This is, in fact, what MicroProfile Metrics does, albeit in a more-flexible and -complete way than this example. That spec uses annotations to define what you want tracked and how, and then the CDI-based proxy objects can keep track of counts and execution times.
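
For comparison, the annotation-driven equivalent looks roughly like this with MicroProfile Metrics (assuming an MP Metrics-enabled runtime; the metric names here are arbitrary):

import javax.enterprise.context.ApplicationScoped;
import org.eclipse.microprofile.metrics.annotation.Counted;
import org.eclipse.microprofile.metrics.annotation.Timed;

@ApplicationScoped
public class PersonBean {
    @Counted(name = "getNameCalls") // how many times this was called
    @Timed(name = "getNameTimer")   // how long each call took
    public String getName() {
        return "Joe Schmoe";
    }
}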

Example 2: Performance Improvements

For this case, imagine you have a model framework in Domino where you have classes representing back-end documents. The loading of these documents can get pretty expensive, especially if you implemented it in a traditional way where your code loads up all values for the front-end Java class from the document at once. However, say you also have a mechanism to look up these documents in bulk via views, where values from the document might be already indexed and readable faster without having to crack the whole thing open. You might end up with a proxy class like this:

class PersonModelProxy implements InvocationHandler {
    private final ViewEntry viewEntry;
    private PersonImpl realDoc;

    public PersonModelProxy(ViewEntry viewEntry) {
        this.viewEntry = viewEntry;
    }

    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        if("getId".equals(method.getName())) {
            return viewEntry.getUniversalID();
        }

        PersonModel obj = someExpensiveObjectLoad();
        return method.invoke(obj, args);
    }
}

PersonModel person = fetchModelWithProxy();
String unid = person.getId(); // Fast!
String firstName = person.getFirstName(); // Slow, but could be made fast

Here, because the UNID will already be present in the view entry, we can just return that immediately rather than loading the full backend document. If you expand this to apply to other properties that can come from view columns, you can do efficient batch lookups and tables while not having to have a separate "read from view instead of doc" mechanism on the front end.

This is something I'm doing (using Javassist's proxies) with a client project to put view entries over existing model objects that had accrued over the years. This allows us to keep the same logic while massively speeding up operations that don't actually need the document to be loaded.

Example 3: Repetitive but Predictable Code

If you have behavior that can be reliably derived from some non-algorithmic source (say, from annotations, app configuration, or so forth), proxies can help you write adaptive code once that avoids the need to write tons of boilerplate methods or classes.

As an example, I'll expand a bit on the first notion above, which is profiling. A long time ago, the erstwhile XPages team released the XPages Toolbox, which provides a reasonably-fine-grained view into what takes up the time during an XPages request. It also provides a way for your code to opt into this profiling, but the idiom is extremely repetitive. You can see it in the Extension Library, and it tends to look something like this:

public class SomeBean {
    private static final ProfilerType profilerType = new ProfilerType("SomeBean");

    public void doFoo() {
        if(Profiler.isEnabled()) {
            ProfilerAggregator agg = Profiler.startProfileBlock(profilerType, "doFoo");
            long ts = Profiler.getCurrentTime();
            try {
                _realDoFoo();
            } finally {
                Profiler.endProfileBlock(agg, ts);
            }
        } else {
            _realDoFoo();
        }
    }

    private void _realDoFoo() {
        // Actual code goes here
    }
}

That's certainly explicable, but imagine writing that for all methods, or even for a large chunk of methods you want to optimize. That could be done a little more cleanly now with Java 8+ features, but it'd still be a drag.

If we go back to the idea handled by MicroProfile Metrics, it would instead look more like:

@Profiler(name="SomeBean")
public class SomeBean {
    @Timed(name="doFoo")
    public void doFoo() {
        // Actual code goes here
    }
}

That's a little nicer! In practice, this is really done with CDI, which provides its own nice layer around proxies, but you could see implementing it with IBM's profiler and a proxy like so (forgive the fragility of the code):

class ProfilerProxy implements InvocationHandler {
    private final Object obj; // the real bean being wrapped
    private final Map<Class<?>, ProfilerType> profilers = new HashMap<>();

    ProfilerProxy(Object obj) {
        this.obj = obj;
    }

    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        ProfilerType profilerType = profilers.computeIfAbsent(obj.getClass(), c -> new ProfilerType(c.getAnnotation(Profiler.class).name()));

        if(Profiler.isEnabled()) {
            String metricName = method.getAnnotation(Timed.class).name();
            ProfilerAggregator agg = Profiler.startProfileBlock(profilerType, metricName);
            long ts = Profiler.getCurrentTime();
            try {
                return method.invoke(obj, args);
            } finally {
                Profiler.endProfileBlock(agg, ts);
            }
        } else {
            return method.invoke(obj, args);
        }
    }
}

Now you only write the boilerplate once and it'll work on anything, or at least anything you're proxying.

Summary

In general, proxies are the sort of thing where it's somewhat rare that you'll have occasion to write one "directly" yourself, but they're immediately useful in the few times when that matters. They're also vital to know about for our brave new CDI world, since they explain so much of how newer Java standards do what they do - I didn't even go into things like how JNoSQL determines desired behavior just from method name, for example. This one's a deep rabbit hole, but it's extremely useful to know it's even possible, even before you get to putting it to use.
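
To give a quick taste of that last bit - hedged, since the exact package and interface names vary by JNoSQL version - a repository like this has no implementation class anywhere; a proxy parses the method name and builds the matching query:

import java.util.stream.Stream;
import jakarta.nosql.mapping.Repository;

public interface PersonRepository extends Repository<Person, String> {
    // "findBy" plus a property name is enough for the proxy to derive the query
    Stream<Person> findByName(String name);
}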

Codicil: Performance

Using these proxy objects quickly brings a curious thought to one's mind: what kind of overhead is there? Do I have to worry about performance degradation if there's this extra layer of indirection?

The long answer is that it's complicated. The short answer, though, is that you will almost definitely not have to worry. Certainly, there's going to be some inherent overhead just because more stuff is happening, but in general that extra work is dwarfed by the actual business of your business logic. If you have a method that computes a complicated value, or fetches something from a database, or (lord help your profiler) makes a remote network call, the overhead of the proxy is going to be several orders of magnitude smaller than the action being performed.

That said, it doesn't necessarily hurt to keep the idea of this impact in mind. While very little in a web application will be harmed in any perceptible way by proxying, there's always a chance that you'll run into a situation where a simple method is called thousands and thousands of times more often than more-complex ones, and that's where you may want to either move it to a non-proxied object or comb through the latest performance metrics for different proxy libraries. As with most optimization, though, that's something to do after you've found that there's a performance problem, not (usually) before.

My Tortured Relationship With libnotes

Sat May 22 12:31:10 EDT 2021

Tags: c java

A tremendous amount of my work lately involves wrangling the core Notes library, variously "libnotes.dylib", "libnotes.so", or (for some reason) "nnotes.dll" (edit: see the comments!). I do almost all of my daily work in Liberty servers on the Mac loading this library, my first major use of the Domino Docker image was to use it as an overgrown carrier for this native piece, and in general I spend a lot of time putting this to use.

It ain't easy, though! Unlike a lot of native libraries that you can just kind of load from wherever, libnotes is extremely picky about its loading environment. There are a few main ways that this manifests.

Required-Library References

libnotes isn't standalone, and it doesn't just rely on standard OS stuff like libc. It also relies on other Notes-specific libraries like libxmlproc and libgsk8iccs, and some of those refer back to each other. This all shakes out in normal practice, since they're all next to the Notes/Domino executable and each other, but it makes things finicky when you're running from outside there.

This seems to be the most finicky on macOS, where the references to each library are marked with @executable_path relative paths, meaning relative to the running executable.

I've wrangled with this quite a bit over the years, but the upshot is that, on macOS and Linux, you really want to set some environment variables before you ever load your program. Naturally, since your running program can't do anything about what happened before it was loaded, this means you have to balance the teacups in your environment beforehand.

Non-Library Files

Beyond the dynamic libraries, libnotes also needs some program-support executable files, things like string resource files and whatnot. And, very importantly, it needs an active data directory with an ID, notes.ini, and names.nsf at least. The data-directory contents bits are at least a little more forgiving: though there's still a hard requirement that they be present on the filesystem (since libnotes loads them by string path, not as configurable binary streams), you could at least bundle them or copy them around. For example, for my Dockerized app runners, I tend to have some basic versions of those in the repo that get copied into the Docker container's notesdata directory during build.

Working Around It

As I mentioned, the main way I go about dealing with this is by telling whatever is running my program to use applicable environment variables, ideally in a reproducible config-file-based way. This works perfectly with Docker and with tycho-surefire-plugin in Maven, but doesn't work so well for maven-surefire-plugin (the normal unit test runner) or Eclipse's JUnit tools. In Eclipse's case, I can at least fill in the environment variables in the Run Configuration. It hampers me a bit (I can't just right-click a test case and run it individually without first running the whole suite or setting up a configuration specially), but it works.

I gave a shot recently to copying the Mac dylibs and programmatically fiddling with otool and install_name_tool to adjust their dependency paths, and I got somewhere with that, but that somewhere was an in-libnotes fatal panic, so I'm clearly missing something. And besides, even if I got that working, it'd be a bit of a drag.

What Would Be Nice

What would be really nice would be a variant of this that's just more portable - something where I can just System.load a library, call NotesInitExtended to point to my INI and ID, and be good to go. I'm not really sure what this would entail, especially since I'd also want libinotes, libjnotes, and liblsxbe. I do know that I don't have the tools to do it, which fortunately frees me up to idly speculate about things I'd like to have delivered to me.

As long as I'm wishing for stuff, I'll say what would be even cooler would be a WebAssembly library usable with Wasmer, which is a multi-language WebAssembly runtime. The promise there is that you'd have a single compiled library that would run on any OS on any processor that supports WebAssembly, from now until forever. I'm not sure that this would actually be doable at the moment - for one, I don't know if callback parameters work, which would be fairly critical. Still, my birthday is coming up, and that would be a nice present.

More Notes on Filesystem and Charset Portability

Tue May 18 15:39:40 EDT 2021

Tags: java
  1. Java Hiccups
  2. Bitwise Operators
  3. Java Grab Bag 2
  4. Java Travelogue: The Care and Feeding of Locales
  5. More Notes on Filesystem and Charset Portability

A few months back, I talked about some localization troubles in the NSF ODP Tooling and how it's important to be explicit in your handling of this sort of thing to make sure your code will work in an environment that isn't specifically "Linux or macOS in an en-US environment".

Well, after making a bunch of little tweaks over the last few days, I have two additional tips in this arena! Specifically, my foes this round came from three sources: Windows, my use of a ZIP file filesystem, and the old reliable charset.

Path Separators

The first bit of trouble had to do with how those two things interact. For a long time, I've been in the (commonly-held) habit of using File.separator and File.separatorChar to get the default path separator for the system - that is, \ on Windows and / on most other platforms. Those work well enough - no real trouble there.

However, my problem came from using the Java NIO ZIP filesystem on Windows. Take this bit of code:

public static String toJavaClassName(Path path) {
	String name = path.toString();
	if(name.endsWith(".java")) {
		return name.substring(0, name.length()-".java".length()).replace(File.separatorChar, '.');
	}
	/* Other conditions here */
}	

When Path is a path on the local filesystem, that works just fine, taking a path like "com/example/Foo.java" and turning it into "com.example.Foo". It also works splendidly on macOS and Linux in all cases, the two systems I actually use. However, when path represents a path within a ZIP file and you're working on Windows, it fails, returning a "class name" like "com/example/Foo".

This is exactly what happens when compiling an ODP using a remote Domino server running on Windows. For the portability reasons mentioned in my previous post, the client sends a ZIP of the ODP to the server and then the compilation pulls directly out of that ZIP instead of writing it out to the filesystem. The way the ZIP filesystem driver in Java is written, it uses / for its path separator on all platforms, which is consistent with dealing with ZIP files generally. But, when mixed with the native filesystem separator, that line resolved to:

return "com/example/Foo".replace('\\', '.');

...and there's the problem. The fix is to change the code to instead get the directory separator from the contextual filesystem in question:

public static String toJavaClassName(Path path) {
	String name = path.toString();
	if(name.endsWith(".java")) {
		return name.substring(0, name.length()-".java".length()).replace(path.getFileSystem().getSeparator(), ".");
	}
	/* Other conditions here */
}

A little more verbose, sure, but it has the advantage of functioning consistently in all environments.

This also has significant implications if you use static properties to store filesystem-dependent elements. This came into play in my OnDiskProject class, which contains a bunch of path matchers to find design elements to import from the ODP. Originally, I kept these in a static property that was generated by writing them Unix-style, then running them through a generator to use the platform-native separator character. This had to change, since the actual ODP store may or may not be the platform-native filesystem. This sort of thing is pervasive, and it'll take me a bit to get over my long-standing habit.
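
One way to handle that is to build the matcher against the contextual filesystem at the point of use, so the separator always matches. This is a sketch of the idea rather than the actual OnDiskProject code, and the glob is just an example:

public static PathMatcher xspMatcher(FileSystem fs) {
	String sep = fs.getSeparator();
	// In glob syntax a literal backslash must be escaped, so Windows' "\" becomes "\\"
	String glob = "XPages" + ("\\".equals(sep) ? "\\\\" : sep) + "*.xsp";
	return fs.getPathMatcher("glob:" + glob);
}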

Over-Interpreting Character Sets

This one is similar to the charset troubles in my previous post, but ran into subtle trouble in the ODP compiler. Here was the sequence of events:

  1. The ODP compiler reads the XSP source of a page or custom control using ODPUtil, which reads in the string as UTF-8
  2. It then passes that string to the Bazaar's DynamicXPageBean
  3. That method uses StringReader and an IBM Commons ReaderInputStream to read the content
  4. That content is then read in by FacesReader, which uses the default DOM parser to read the XML

In general, that flow worked just fine - but that's because, in general, I write US-ASCII markup. When the page contains, say, Czech diacritics, this goes off the rails: somewhere in the interpretation and re-interpretation of the file, the UTF-8-iness of it breaks.

Fortunately, this one was a clean one: XML has its own mechanism for declaring its encoding (and it's almost always UTF-8 anyway), so my code doesn't actually need to be responsible for interpreting the bytes of the file before it gets to the DOM parser. So I added a version of the Bazaar method that takes an InputStream directly and modified NSF ODP to use it, with no extra interpretation in between.
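
The underlying principle is general: hand the parser the raw bytes and let it honor the encoding declared in the XML prolog itself. As a generic illustration (this is not the Bazaar's actual API, just the plain JAXP equivalent):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public static Document parseXml(Path xmlFile) throws Exception {
	DocumentBuilderFactory fac = DocumentBuilderFactory.newInstance();
	try(InputStream is = Files.newInputStream(xmlFile)) {
		// The parser reads the <?xml ... encoding="..."?> declaration itself,
		// so no Reader/charset guesswork happens in between
		return fac.newDocumentBuilder().parse(is);
	}
}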

Implementing Custom Token-Based Auth on Liberty With Domino

Sat Apr 24 12:31:00 EDT 2021

This weekend, I decided to embark on a small personal side project: implementing an RSS sync server I can use with NetNewsWire. It's the delightful sort of side project where the stakes are low and so I feel no pressure to actually complete it (I already have what I want with iCloud-based syncing), but it's a great learning exercise.

Fair warning: this post is essentially a travelogue of not-currently-public code for an incomplete side app of mine, and not necessarily useful as a tutorial. I may make a proper example project out of these ideas one day, but for the moment I'm just excited about how smoothly this process has gone.

The Idea

NetNewsWire syncs with a number of services, and one of them is FreshRSS, a self-hosted sync tool that uses PHP backed by an RDBMS. The implementation doesn't matter, though: what matters is that that means that NNW has the ability to point at any server at an arbitrary URL implementing the same protocol.

As for the protocol itself, it turns out it's just the old Google Reader protocol. Like Rome, Reader rose, transformed the entire RSS ecosystem, and then crumbled, leaving its monuments across the landscape like scars. Many RSS sync services have stuck with that language ever since - it's a bit gangly, but it does the job fine, and it lowers the implementation toll on the clients.

So I figured I could find some adequate documentation and make a little webapp implementing it.

Authentication

My starting point (and all I've done so far) was to get authentication working. These servers mimic the (I assume antiquated) Google ClientLogin endpoint, where you POST "Email" and "Passwd" and get back a token in a weird little properties-ish format:

POST /accounts/ClientLogin HTTP/1.1
Content-Type: application/x-www-form-urlencoded

Email=ffooson&Passwd=secretpassword

Followed by:

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8

SID=null
LSID=null
Auth=somename/8e6845e089457af25303abc6f53356eb60bdb5f8

The format of the "Auth" token doesn't matter, I gather. I originally saw it in that "name/token" pattern, but other cases are just a token. That makes sense, since there's no need for the client to parse it - it just needs to send it back. In practice, it shouldn't have any "=" in it, since NNW parses the format expecting only one "=", but otherwise it should be up to you. Specifically, it will send it along in future requests as the Authorization header:

GET /reader/api/0/stream/items/ids?n=1000&output=json&s=user/-/state/com.google/starred HTTP/1.1
Authorization: GoogleLogin auth=somename/8e6845e089457af25303abc6f53356eb60bdb5f8

This is pretty standard stuff for any number of authentication schemes: often it'll start with "Bearer" instead of "GoogleLogin", but the idea is the same.

Implementing This

So how would one go about implementing this? Well, fortunately, the Jakarta EE spec includes a Security API that allows you to abstract the specifics of how the container authenticates a user, providing custom user identity stores and authentication mechanisms instead of or in addition to the ones provided by the container itself. This is as distinct from a container like Domino, where the HTTP stack handles authentication for all apps, and the only way to extend how that works is by writing a native library with the C-based DSAPI. Possible, but cumbersome.

Identity Store

We'll start with the identity store. Often, a container will be configured with its own concept of what the pool of users is and how they can be authenticated. On Domino, that's generally the names.nsf plus anything configured in a Directory Assistance database. On Liberty or another JEE container, that might be a static user list, an LDAP server, or any number of other options. With the Security API, you can implement your own. I've been ferrying around classes that look like this for a couple of years now:

/* snip */

import javax.security.enterprise.credential.Credential;
import javax.security.enterprise.credential.UsernamePasswordCredential;
import javax.security.enterprise.identitystore.CredentialValidationResult;
import javax.security.enterprise.identitystore.IdentityStore;

@ApplicationScoped
public class NotesDirectoryIdentityStore implements IdentityStore {
    @Inject AppConfig appConfig;

    @Override public int priority() { return 70; }
    @Override public Set<ValidationType> validationTypes() { return DEFAULT_VALIDATION_TYPES; }

    public CredentialValidationResult validate(UsernamePasswordCredential credential) {
        try {
            try(DominoClient client = DominoClientBuilder.newDominoClient().build()) {
                String dn = client.validateCredentials(appConfig.getAuthServer(), credential.getCaller(), credential.getPasswordAsString());
                return new CredentialValidationResult(null, dn, dn, dn, getGroups(dn));
            }
        } catch (NameNotFoundException e) {
            return CredentialValidationResult.NOT_VALIDATED_RESULT;
        } catch (AuthenticationException | AuthenticationNotSupportedException e) {
            return CredentialValidationResult.INVALID_RESULT;
        }
    }

    @Override
    public Set<String> getCallerGroups(CredentialValidationResult validationResult) {
        String dn = validationResult.getCallerDn();
        return getGroups(dn);
    }

    /* snip */
}

There's a lot going on here. To start with, the Security API goes hand-in-hand with CDI. That @ApplicationScoped annotation on the class means that this IdentityStore is an app-wide bean - Liberty picks up on that and registers it as a provider for authentication. The AppConfig is another CDI bean, this one housing the Domino server I want to authenticate against if not the local runtime (handy for development).

The IdentityStore interface definition does a little magic for identifying how to authenticate. The way it works is that the system uses objects that implement Credential, an extremely-generic interface to represent any sort of credential. When the default implementation is called, it looks through your implementation class for any methods that can handle the specific credential class that came in. You can see above that validate(UsernamePasswordCredential credential) isn't tagged with @Override - that's because it's not implementing an existing method. Instead, the core validate method looks for other methods named validate that can accept the incoming credential class. UsernamePasswordCredential is one of the few stock implementations that come with the API and is how the container will likely ask for authentication if using e.g. HTTP Basic auth.

Here, I use some Domino API to check the username+password combination against the Domino directory and inform the caller whether the credentials match and, if so, what the user's distinguished name and group memberships are (with some implementation removed for clarity).

Token Authentication

That's all well and good, and will allow a user to log into the app with HTTP Basic authentication with a Domino username and password, but I'd also like the aforementioned GoogleLogin tokens to count as "real" users in the system.

To start doing that, I created a JAX-RS resource for the expected login URL:

@Path("accounts")
public class AccountsResource {
    @Inject TokenBean tokens;
    @Inject IdentityStore identityStore;

    @PermitAll
    @Path("ClientLogin")
    @POST
    @Consumes(MediaType.APPLICATION_FORM_URLENCODED)
    @Produces(MediaType.TEXT_HTML)
    public String post(@FormParam("Email") @NotEmpty String email, @FormParam("Passwd") String password) {
        CredentialValidationResult result = identityStore.validate(new UsernamePasswordCredential(email, password));
        switch(result.getStatus()) {
        case VALID:
            Token token = tokens.createToken(result.getCallerDn());
            String mangledDn = result.getCallerDn().replace('=', '_').replace('/', '_');
            return MessageFormat.format("SID=null\nLSID=null\nAuth={0}\n", mangledDn + "/" + token.token()); //$NON-NLS-1$ //$NON-NLS-2$
        default:
            // TODO find a better exception
            throw new RuntimeException("Invalid credentials");
        }
    }

}

Here, I make use of the IdentityStore implementation above to check the incoming username/password pair. Since I can @Inject it based on just the interface, the fact that it's authenticating against Domino isn't relevant, and this class can remain blissfully unaware of the actual user directory. All it needs to know is whether the credentials are good. In any event, if they are, it returns the weird little format in the response and the RSS client can then use it in the future.

The TokenBean class there is another custom CDI bean, and its job is to create and look up tokens in the storage NSF. The pertinent part is:

@ApplicationScoped
public class TokenBean {
    @Inject @AdminUser
    Database adminDatabase;

    public Token createToken(String userName) {
        Token token = new Token(UUID.randomUUID().toString().replace("-", ""), userName); //$NON-NLS-1$ //$NON-NLS-2$
        adminDatabase.createDocument()
            .replaceItemValue("Form", "Token") //$NON-NLS-1$ //$NON-NLS-2$
            .replaceItemValue("Token", token.token()) //$NON-NLS-1$
            .replaceItemValue("User", token.user()) //$NON-NLS-1$
            .save();
        return token;
    }

    /* snip */
}

Nothing too special there: it just creates a random token string value and saves it in a document. The token could be anything; I could have easily gone with the document's UNID, since it's basically the same sort of value.

I'll save the @Inject @AdminUser bit for another day, since we're already far enough into the CDI weeds here. Suffice it to say, it injects a Database object for the backing data DB for the designated admin user - basically, like opening the current DB with sessionAsSigner in XPages. The @AdminUser is a custom annotation in the app to convey this meaning.
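
For reference, a qualifier like that is just an annotation marked with @Qualifier. A rough sketch of what it could look like (the class in the actual app may differ slightly):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.inject.Qualifier;

// Marker qualifier so CDI can tell "the admin-user Database" apart from any other Database bean
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER, ElementType.TYPE })
public @interface AdminUser {
}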

Okay, so great, now we have a way for a client to log in with a username and password and get a token to then use in the future. That leaves the next step: having the app accept the token as an equivalent authentication for the user.

Intercepting the incoming request and analyzing the token is done via another Jakarta Security API interface: HttpAuthenticationMechanism. Creating a bean of this type allows you to look at an incoming request, see if it's part of your custom authentication, and handle it any way you want. In mine, I look for the "GoogleLogin" authorization header:

@ApplicationScoped
public class TokenAuthentication implements HttpAuthenticationMechanism {
    @Inject IdentityStore identityStore;
    
    @Override
    public AuthenticationStatus validateRequest(HttpServletRequest request, HttpServletResponse response,
            HttpMessageContext httpMessageContext) throws AuthenticationException {
        
        String authHeader = request.getHeader("Authorization"); //$NON-NLS-1$
        if(StringUtil.isNotEmpty(authHeader) && authHeader.startsWith(GoogleAccountTokenHandler.AUTH_PREFIX)) {
            CredentialValidationResult result = identityStore.validate(new GoogleAccountTokenHeaderCredential(authHeader));
            switch(result.getStatus()) {
            case VALID:
                httpMessageContext.notifyContainerAboutLogin(result);
                return AuthenticationStatus.SUCCESS;
            default:
                return AuthenticationStatus.SEND_FAILURE;
            }
        }
        
        return AuthenticationStatus.NOT_DONE;
    }

}

Here, I look for the "Authorization" header and, if it starts with "GoogleLogin auth=", then I parse it for the token, create an instance of an app-custom GoogleAccountTokenHeaderCredential object (implementing Credential) and ask the app's IdentityStore to authorize it.
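
The GoogleAccountTokenHeaderCredential class itself isn't shown here, but Credential is essentially a marker interface, so a minimal sketch (with the accessor name inferred from how it's used below) could be as simple as:

import javax.security.enterprise.credential.Credential;

public class GoogleAccountTokenHeaderCredential implements Credential {
    private final String headerValue;

    public GoogleAccountTokenHeaderCredential(String headerValue) {
        this.headerValue = headerValue;
    }

    // The full "GoogleLogin auth=..." header value, parsed further downstream
    public String headerValue() {
        return headerValue;
    }
}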

Returning to the IdentityStore implementation, that meant adding another validate override:

@ApplicationScoped
public class NotesDirectoryIdentityStore implements IdentityStore {
    /* snip */

    public CredentialValidationResult validate(GoogleAccountTokenHeaderCredential credential) {
        try {
            try(DominoClient client = DominoClientBuilder.newDominoClient().build()) {
                String dn = client.validateCredentialsWithToken(appConfig.getAuthServer(), credential.headerValue());
                return new CredentialValidationResult(null, dn, dn, dn, getGroups(dn));
            }
        } catch (NameNotFoundException e) {
            return CredentialValidationResult.NOT_VALIDATED_RESULT;
        } catch (AuthenticationException | AuthenticationNotSupportedException e) {
            return CredentialValidationResult.INVALID_RESULT;
        }
    }
}

This one looks similar to the UsernamePasswordCredential one above, but takes instances of my custom Credential class - automatically picked up by the default implementation. I decided to be a little extra-fancy here: the particular Domino API in question supports custom token-based authentication to look up a distinguished name, and I made use of that here. That takes us one level deeper:

public class GoogleAccountTokenHandler implements CredentialValidationTokenHandler<String> {
    public static final String AUTH_PREFIX = "GoogleLogin auth="; //$NON-NLS-1$
    
    @Override
    public boolean canProcess(Object token) {
        if(token instanceof String authHeader) {
            return authHeader.startsWith(AUTH_PREFIX);
        }
        return false;
    }

    @Override
    public String getUserDn(String token, String serverName) throws NameNotFoundException, AuthenticationException, AuthenticationNotSupportedException {
        String userTokenPair = token.substring(AUTH_PREFIX.length());
        int slashIndex = userTokenPair.indexOf('/');
        if(slashIndex >= 0) {
            String tokenVal = userTokenPair.substring(slashIndex+1);
            Token authToken = CDI.current().select(TokenBean.class).get().getToken(tokenVal)
                .orElseThrow(() -> new AuthenticationException(MessageFormat.format("Unable to find token \"{0}\"", token)));
            return authToken.user();
        }
        throw new AuthenticationNotSupportedException("Malformed token");
    }

}

This is the Domino-specific one, inspired by the Jakarta Security API. I could also have done this lookup in the previous class, but this way allows me to reuse this same custom authentication in any API use.

Anyway, this class uses another method on TokenBean:

@ApplicationScoped
public class TokenBean {    
    @Inject @AdminUser
    Database adminDatabase;

    /* snip */

    public Optional<Token> getToken(String tokenValue) {
        return adminDatabase.openCollection("Tokens") //$NON-NLS-1$
            .orElseThrow(() -> new IllegalStateException("Unable to open view \"Tokens\""))
            .query()
            .readColumnValues()
            .selectByKey(tokenValue, true)
            .firstEntry()
            .map(entry -> new Token(entry.get("Token", String.class, ""), entry.get("User", String.class, ""))); //$NON-NLS-1$ //$NON-NLS-2$ //$NON-NLS-3$ //$NON-NLS-4$
    }
}

There, it looks up the requested token in the "Tokens" view and, if present, returns a record indicating that token and the user it was created for. The latter is then returned by the above Domino-custom GoogleAccountTokenHandler as the authoritative validated user. In turn, the JEE NotesDirectoryIdentityStore considers the credential validation successful and returns it back to the auth mechanism. Finally, the TokenAuthentication up there sees the successful validation and notifies the container about the user that the token mapped to.

Summary

So that turned into something of a long walk at the end there, but the result is really neat: as far as my app is concerned, the "GoogleLogin" tokens - as looked up in an NSF - are just as good as username/password authentication. Anything that calls httpServletRequest.getUserPrincipal() will see the username from the token, and I also use this result to spawn the Domino session object for each request.

Once all these pieces are in place, none of the rest of the app has to have any knowledge of it at all. When I implement the API to return the actual RSS feed entries, I'll be able to just use the current user, knowing that it's guaranteed to be properly handled by the rest of the system beforehand.

Bonus: Java 16

This last bit isn't really related to the above, but I just want to gush a bit about newer techs. My plan is to deploy this app using my Open Liberty Runtime, which means I can use any Open Liberty and Java version I want. Java 16 came out recently, so I figured I'd give that a shot. Though I don't think Liberty is officially supported on it yet, it's worked out just fine for my needs so far.

This lets me use the features that have come into Java in the last few years, a couple of which moved from experimental/incubating into finalized forms in 16 specifically. For example, I can use records, a specialized type of Java class intended for immutable data. Token is a perfect case for this:

public record Token(String token, String user) {
}

That's the entirety of the class. Because it's a record, it gets a constructor with those two properties, plus accessor methods named after the properties (as used in the examples above). Neat!
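
For comparison, that two-line record is roughly equivalent to writing this out by hand (records also generate equals, hashCode, and toString, which I've omitted here):

public final class Token {
    private final String token;
    private final String user;

    public Token(String token, String user) {
        this.token = token;
        this.user = user;
    }

    public String token() { return token; }
    public String user() { return user; }
}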

Another handy new feature is pattern matching for instanceof. This allows you to simplify the common idiom where you check if an object is a particular type, then cast it to that type afterwards to do something. With this new syntax, you can compress that into the actual test, as seen above:

@Override
public boolean canProcess(Object token) {
    if(token instanceof String authHeader) {
        return authHeader.startsWith(AUTH_PREFIX);
    }
    return false;
}

Using this allows me to check the incoming value's type while also immediately creating a variable to treat it as such. It's essentially the same thing you could do before, but cleaner and more explicit now. There's more of this kind of thing on the way, and I'm looking forward to the future additions eagerly.

The Joyful Utility of Optionals in Java

Fri Apr 23 11:11:22 EDT 2021

Tags: java
  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse
  3. The Joyful Utility of Optionals in Java

A while back, I talked about how I had embraced nullness annotations in several of my projects. However, that post predated Domino's laggardly move to Java 8 and so didn't discuss one of the tools that came to Java core in that version: java.util.Optional.

The Concept

Java's Optional is a passable implementation of the Option type concept that's been floating around programming circles for a good long time. It's really come to the fore with the proliferation of large-platform languages like Swift and Kotlin that have the concept built in to the syntax.

Java's implementation doesn't go that far - there's no special syntax for them, at least not yet - but the concept remains the same. The idea is that, when you embrace Optional use, your code will no longer return null, with the goal of cutting down on the pernicious NullPointerException. While you may still return an empty value, you will be doing so in a way that allows (and forces) the downstream programmer to check for that case more cleanly and adapt it into their code.

In Practice

For an example of where this sort of thing is well-suited, take a look at this snippet of code, which is likely to be pretty universal in Domino code:

View users = db.getView("Users");
Document user = users.getDocumentByKey(userName, true);
if(user != null) {
	// ...
}

Most of the time, this will run fine. However, if, say, the "Users" view is unavailable (if it was replaced in a design refresh, or another developer removed it, or it's reader-inaccessible to the current user), you'll end up with a NullPointerException in the second line. When you have the code in front of you, the problem is obvious quickly, but that will require you to crack open the app and look into what's going on before you can even start actually fixing the trouble. That's also the "good" version of the case - if you're using code that separates the #getView from the call to #getDocumentByKey with a bunch of other code, it'll be harder to track down.

Imagine instead if the Domino API used Optional, and returned an Optional<View> in #getView and similar for #getDocumentByKey. That could look more like this:

View users = db.getView("Users")
	.orElseThrow(() -> new IllegalStateException("Unable to open view \"Users\""));
users.getDocumentByKey(userName, true).ifPresent(user -> {
	// ...
});

The idea is the same, and you'll still get an exception if "Users" is unavailable, but it will be immediately obvious in the error message what it is that you need to fix.

This also forces the programmer to conceptualize that case in a way that they wouldn't necessarily have without the need to "unwrap" the Optional. Maybe it's actually okay if "Users" doesn't exist - in that case, you could just return early and not even run the risk of an exception at all. Or maybe there's a way to recover from that - maybe look up the user another way, or create the view on the fly.

Implementing It

When spreading Optional across an existing codebase or writing a new one around it, I've found that there are some important things to keep in mind.

First, since Optional is implemented as just a class and not special syntax, I've found that the best way to implement it in your code is to go all or nothing: if you decide you want to use Optional, do it everywhere. The trouble if you mix-and-match is that you'll run into some cases where you still do if(foo != null) { ... }; since an empty Optional is non-null, that habit will bite you.

Usually, though, that's not too much trouble: when you start to change your code, you'll run into tons of type-related problems around code like that, so you'll be cued in to change it while you're working anyway. Just make sure to not leave yourself null-returning method traps elsewhere.

Another fun gotcha you'll hit early is that Optional.of(foo) will throw a NullPointerException if foo is null. That's the JDK being (reasonably) pedantic: if you want to wrap a potentially-null value, you have to instead do Optional.ofNullable(foo). While irksome at first, it drives home the point that one of the virtues of Optional is that it forces you, the programmer, to consider the null case much more than you did previously.
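
A quick illustration of the difference:

String maybeNull = System.getProperty("some.property.that.may.not.exist"); // quite possibly null

Optional<String> safe = Optional.ofNullable(maybeNull); // empty Optional when the value is null
Optional<String> risky = Optional.of(maybeNull);        // throws NullPointerException when the value is null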

Unwrapping

Optional also provides a number of ways to "unwrap" the value or deal with it, and it's useful to know about them for different situations.

The first one is just someOptional.get(), which will return the contained value or immediately throw a NoSuchElementException if it's empty. I've found that this is best for when you're very confident that the value is present, either because you already checked with isPresent or because it being empty is a sign that the system is so fubar already that there's no virtue in even customizing the exception.

Somewhat safer than that, though, is what I had above: someOptional.orElseThrow(() -> ...), which will either return the wrapped value or throw a customized exception if it's empty. This is ideal for either halting execution with a message for how the developer/admin can fix it or for throwing a useful exception declared in your documentation for a downstream programmer to catch.

There's also someOptional.orElse(someOtherValue). For example, take this case where you have a configuration API that returns an Optional<String> for a given lookup key:

String emailTarget = getConfig("EmailTarget").orElse("admin@company.com");

That's essentially the "get-or-default" idiom. Now, you can actually do .orElse(null) if you want, though that's often not the best idea. Still, that can be handy if you're adapting existing code that does a null check immediately already, or if you're writing to another existing API that does use null.

Optional also has a #map method, which may seem a bit weird at first, but can be useful on its own, and is particularly well-suited to use with results from the Stream API like #findFirst. It lets you transform a non-null value into something else and thereby change your Optional to then be an Optional of whatever the transformed value type is. For example:

String userName = findLoggedInUser()
	.map(UserObj::getUserName)
	.orElse("Anonymous");

In this case, findLoggedInUser returns an Optional<UserObj>. Then, the #map call gets the username if present and returns an Optional<String>, which is thereby either unwrapped or turned into "Anonymous".

Optional As A Parameter

Up until this point, I've been talking about Optional as a method return value, but what about using it as a parameter? Well, you can, but the consensus is that you probably shouldn't.

At first, I chafed against this advice - after all, Optional is finally an in-JDK way to express an optional or nullable parameter, so why not use it? The arguments about how it's less efficient for the compiler, while true, didn't sway me much - after all, compilers improve, and it's generally better to do something correct than something microscopically faster.

However, the real thing that convinced me was realizing that, if you have Optional as a method parameter, you're still going to have to null-check it anyway, since you don't control who might be calling your code. So, not only will you have to check and unwrap the Optional, you'll still have to have a null guard for the Optional itself, defeating much of the point.

I do think that there's some utility in Optional parameters in your own in-implementation code, not exposed to the outside. It can be useful to indicate that a parameter value is intended to be ignored outright, for example. As one comment on that SO thread mentions, an Optional parameter allows for three states: null, an empty value, or a present value. You could use null to indicate that you don't want that value checked at all, and then have an empty value mean something particular in the code. But that kind of fiddly hair-splitting is exactly why it should be of limited use and even then very-clearly documented for yourself.

If Java ever gets syntax sugar for Optional (say, declaring the parameter as Object? and having calls passing null auto-wrap them into Optional or something), then this could change.

Interaction With Null Analysis

Finally, I'll mention how using Optional interacts with null annotation analysis.

To begin with, if you're using Eclipse, the immediate answer will be "poorly". Because Eclipse doesn't ship with nullness hints for the core JDK, it won't know that Optional.of returns a non-null value, defeating the entire point. For that, you'll want the lastNPE.org nullness annotations, which will provide such hints. I've found that they're still not perfect here, causing Eclipse to frequently complain about someOptional.orElse(null), but the experience becomes good enough.

IntelliJ, for the record, has such hints built-in, so you don't need to worry there.

Once you have that sorted out, though, they go together really well. For example, pairing the two can help you find cases where, in your Optional translation journey, you're checking whether the Optional itself is null: with null-annotated code, the compiler can see that it will never be null as such and will tell you to change it.

So I advise pairing the two: use Optional for return values everywhere, especially if you're making an API for downstream consumption, and then pair them with null annotations to make sure your own implementation code is correct and to provide hints for opting-in users.

Using Server-Sent Events on Domino

Tue Mar 30 08:57:20 EDT 2021

Tags: jakartaee java

Though Domino's HTTP stack infamously doesn't support WebSocket, WebSocket isn't the only game in town when it comes to getting push-type information to HTTP clients. HTML5 also brought with it the less-famous Server-Sent Events standard, which is basically half of WebSocket: it allows the server to push events to the client, but it's still a one-way communication channel.

The Standard

The technique that SSE uses is almost ludicrously simple: the client makes a request and the server replies that it will provide text/event-stream content and keeps the connection open. Then, it starts emitting events delimited by blank lines:

HTTP/1.1 200 OK
Content-Type: text/event-stream;charset=UTF-8



event: timeline
data: hello

event: timeline
data: hello

Unlike WebSocket, there's no Upgrade header, no two-way communication, and thereby no special requirements on the server. It's so simple that you don't even really need a server-side library to use it, though it still helps.
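
To illustrate how little is required, here's a rough sketch of emitting events from a plain Servlet, assuming a Servlet 3+ container (this is not the code from the example project below, and it ties up a request thread for the duration):

import java.io.IOException;
import java.io.PrintWriter;
import java.util.concurrent.TimeUnit;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/events")
public class PlainSseServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("text/event-stream");
        resp.setCharacterEncoding("UTF-8");
        PrintWriter w = resp.getWriter();
        try {
            for (int i = 0; i < 5; i++) {
                // Each event is a series of "field: value" lines followed by a blank line
                w.write("event: timeline\n");
                w.write("data: hello " + i + "\n\n");
                w.flush();
                TimeUnit.SECONDS.sleep(2);
            }
        } catch (InterruptedException e) {
            // Bail out promptly if the container shuts down
            Thread.currentThread().interrupt();
        }
    }
}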

In Practice

I've found that, though SSE is intentionally far less capable than WebSocket, it actually provides what I want in almost all cases: the client can receive messages instantaneously from the server, while the server can receive messages from the client by traditional means like POST requests. Though this is less efficient and flexible than WebSocket, it suits perfectly the needs of apps like server monitors, chat rooms, and so forth.

Using SSE on Domino

JAX-RS, the Java REST service framework, provides a mechanism for working with server-sent events pretty nicely. Baeldung, as usual, has a splendid tutorial covering the API, and a chunk of what I say here will be essentially rehashing that.

However, though Domino ships with JAX-RS by way of the ExtLib, the library only implements JAX-RS 1.x, which predates SSE support. Fortunately, newer JAX-RS implementations work pretty well on Domino, as long as you bring them in in a compatible way. In my XPages Jakarta EE Support project, I did this by way of RESTEasy, and there did the legwork to make it work in Domino's OSGi environment. For our example today, though, I'm going to skip that and build a small webapp using the com.ibm.pvc.webcontainer.application extension point. In theory, this should also work XPages-side with my project, though I haven't tested that; it might require messing with the Servlet response cache.

The Example

I've uploaded my example to GitHub, so the code is available there. I've aimed to make it pretty simple, though there's always some extra scaffolding to get this stuff working on Domino. The bulk of the "pom.xml" file is devoted to two main things: packaging an app as an OSGi bundle (with RESTEasy embedded) and generating an update site with site.xml to import into Domino.

Server Side

The real work happens in TimeStreamResource, the JAX-RS resource that manages client connections and also, in this case, happens to emit the messages as well.

This resource, when constructed, spawns two threads. The first one monitors a BlockingQueue for new messages and passes them along to the SseBroadcaster:

try {
    String message;
    while((message = messageQueue.take()) != null) {
        // The producer below may send a message before setSse is called the first time
        if(this.sseBroadcaster != null) {
            this.sseBroadcaster.broadcast(this.sse.newEvent("timeline", message)); //$NON-NLS-1$
        }
    }
} catch(InterruptedException e) {
    // Then we're shutting down
} finally {
    this.sseBroadcaster.close();
}

Here, I'm using the Sse#newEvent convenience method to send a basic text message. In practice, you'll likely want to use the builder you get from Sse#newEventBuilder to construct more-complicated events with IDs and structured data types (usually JSON).

A BlockingQueue implementation (such as LinkedBlockingDeque) is ideal for this task, as it provides a simple API to add objects to the queue and then wait for new ones to arrive.
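
For reference, the queue feeding that loop can be as simple as a field like this (both classes are from java.util.concurrent; a String payload is assumed here):

private final BlockingQueue<String> messageQueue = new LinkedBlockingDeque<>();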

The second one emits a new message every 10 seconds. This is just for the example's sake, and would normally be actually looking something up or would itself be a listener for events it would like to broadcast.

try {
    while(true) {
        String eventContent = "- At the tone, the Domino time will be " + OffsetDateTime.now();
        messageQueue.offer(eventContent);

        // Note: any sleeping should be short enough that it doesn't block HTTP restart
        TimeUnit.SECONDS.sleep(10);
    }
} catch(InterruptedException e) {
    // Then we're shutting down
}

Browsers can register as listeners just by issuing a GET request to the API endpoint:

@GET
@Produces(MediaType.SERVER_SENT_EVENTS)
public void get(@Context SseEventSink sseEventSink) {
    this.sseBroadcaster.register(sseEventSink);
}

That will register them as an available listener when broadcast events are sent out.

Additionally, to simulate something like a chat room, I added a POST endpoint to send new messages beyond the periodic ten-second broadcast:

@POST
@Produces(MediaType.TEXT_PLAIN)
public String sendMessage(String message) throws InterruptedException {
    messageQueue.offer(message);
    return "Received message";
}

That's really all there is to it as far as "business logic" goes. There's some scaffolding in the Servlet implementation to get RESTEasy working nicely and to manage the ExecutorService, plus the obligatory "plugin.xml" to register the app with Domino and "web.xml" to account for Domino's old Servlet spec, but that's about it.

Client Side

On the client side, everything you need is built into every modern browser. In fact, the bulk of "index.html" is CSS and basic HTML. The JavaScript involved is blessedly slight:

function sendMessage() {
    const cmd = document.getElementById("message").value;
    document.getElementById("message").value = "";
    fetch("api/time", {
        method: "POST",
        body: cmd
    });
    return false;
}
function appendLogLine(line) {
    const output = document.getElementById("output");
    output.innerText += line + "\n";
    output.scrollTop = output.scrollHeight;
}
function subscribe() {
    const eventSource = new EventSource("api/time");
    eventSource.addEventListener("timeline",  (event) => {
        appendLogLine(event.data);
    });
    eventSource.onerror = function (err) {
        console.error("EventSource failed:", err);
    };
}

window.addEventListener("load", () => subscribe());

The EventSource object is the core of it and is a standard browser component. You give it a path to watch and then listen for events and errors. fetch is also standard and is a much-nicer API for dealing with HTTP requests. In a real app, things might get a bit more complicated if you want to pass along credentials and the like, but this is really it.

Gotchas

The biggest thing to keep in mind when working with this is that you have to be very careful to not block Domino's HTTP task from restarting. If you don't keep everything in an ExecutorService and account for InterruptedExceptions as I do here, you're highly likely to run into a situation where a thread will keep chugging along indefinitely, leading to the dreaded "waiting for session to finish" loop. The ExecutorService's shutdownNow method helps you manage this - as long as your threads have escape hatches for the InterruptedException they'll receive, you should be good.
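
As a sketch of that pattern (the field and lifecycle method here are hypothetical, not the example project's exact code), the Servlet owns the ExecutorService and interrupts its tasks on shutdown:

// Worker threads are spawned through an ExecutorService owned by the webapp
private final ExecutorService exec = Executors.newCachedThreadPool();

@Override
public void destroy() {
    // shutdownNow() interrupts running tasks, so loops blocked on
    // messageQueue.take() or a sleep get an InterruptedException and can exit
    exec.shutdownNow();
    try {
        exec.awaitTermination(10, TimeUnit.SECONDS);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}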

I also, admittedly, have not yet tested this at scale. I've tried it out here and there for clients, but haven't pulled the trigger on actually shipping anything with it. It should work fine, since it's using standard JAX-RS stuff, but there's always the chance that, say, the broadcaster registry will fill up with never-ending requests and will eventually bloat up. The stack should handle that properly, but you never know.

Beyond any worries about the web container, it's also just a minefield of potential threading and duplicated-work trouble. For example, when I first wrote the example, I found that messages weren't shared, and then that the time messages could get doubled up. That's because JAX-RS, by default, creates a new instance of the resource class for each request. Moving the declaration from the Application class's getClasses() method (which creates new objects) to getSingletons() (which reuses single objects) fixed the first problem. After that, I found that the setSse method was called multiple times even for the singleton, and so I moved the thread spawning to the constructor to ensure that they're only launched once.

Once you have the threading sorted out, though, this ends up being a pretty-practical path to accomplishing the bulk of what you would normally do with WebSocket, even with an aging HTTP stack like Domino's.

Domino HttpService and the NSF Router Project

Thu Mar 18 15:27:04 EDT 2021

Tags: domino java

In my last post and its predecessor, I talked about my tinkering at the XspCmdManager level of Domino's HTTP stack and then more specifically about the com.ibm.designer.runtime.domino.adapter.HttpService class.

The Stack

Now, HttpService is about as generic a name as you can get for this sort of thing, and it doesn't really tell you what it represents. You can think of Domino's HTTP stack since at least the 8.5 era as having two cooperating parts: the core native portion that handles HTTP requests in basically the same way as Domino always did, plus the Java layer as organized by XspCmdManager. The Java layer gets "right of first refusal" for any incoming request that wasn't handled by a DSAPI plugin: before routing the request to the legacy HTTP code, Domino asks XspCmdManager if it'd like to handle it, and only takes care of it at the native layer if Java says no.

XspCmdManager on its own doesn't do much. It accepts the JNI calls from the native side, but otherwise quickly passes the buck to LCDEnvironment (I assume the "LCD" here stands for "Lotus Component Designer"). LCDEnvironment, in turn, really just aggregates registered handlers and dispatches requests. It does a little work to handle exception cases more cleanly than XspCmdManager would, but it's mostly just a dispatcher.

The things that it dispatches to, though, are the HttpServices. These are registered by using the com.ibm.xsp.adapter.serviceFactory IBM Commons extension point, such as here in the plugin.xml form:

<extension point="com.ibm.commons.Extension">
  <service type="com.ibm.xsp.adapter.serviceFactory" class="org.openntf.nsfrouter.NSFRouterServiceFactory" />
</extension>

The class you register there is an implementation of IServiceFactory, which supplies zero or more HttpService implementations on request.

As a side note, I've been using this extension point for years and years, but never before to actually handle HTTP requests. It's extremely convenient in that it's something you can register that is loaded up immediately when the HTTP task starts and is notified as it's terminating, giving you a useful lifecycle without having to wait for a request to come in. I learned about it from the OpenNTF Domino API team and it's been a regular part of my toolkit since.

The HttpService

So that brings us to the HttpService implementation classes themselves. Once LCDEnvironment has gathered them all together, it asks each one in turn (via #isXspUrl) if it can handle a given URL. If any of them say that they can, then it calls the #doService method on each in turn (based on the #getPriority method's return value) until one says that it handled it.

There are a few main HttpService implementations in action on Domino:

  • com.ibm.domino.xsp.module.nsf.NSFService, which handles in-NSF XPages and resources
  • com.ibm.domino.xsp.adapter.osgi.OSGIService, which handles OSGi-registered servlets and webapps
  • com.ibm.domino.xsp.module.nsf.StaticResourcesService, which helps serve static resources

These services also tend to go another layer deeper, passing actual requests off to ComponentModule implementations like NSFComponentModule. That's beyond the scope of what I'm talking about today, but it's interesting to see just how much the Domino stack is basically one giant webapp that contains progressively smaller bounded webapps, like a Matryoshka doll.

For those keeping track, we're about here on a typical XPages call stack:

     at com.ibm.domino.xsp.module.nsf.NSFComponentModule.doService(NSFComponentModule.java:1336)
     at com.ibm.domino.xsp.module.nsf.NSFService.doServiceInternal(NSFService.java:662)
     at com.ibm.domino.xsp.module.nsf.NSFService.doService(NSFService.java:482)
     at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.doService(LCDEnvironment.java:357)
     at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.service(LCDEnvironment.java:313)
     at com.ibm.domino.xsp.bridge.http.engine.XspCmdManager.service(XspCmdManager.java:272)

For our purposes this week, the #isXspUrl and #doService methods on HttpService are our stopping points.

NSF Router Service

In a Twitter conversation yesterday, Per Lausten gave me the idea of using this low level of access to implement improved in-NSF routing. That is to say, if you want "foo.nsf/some/nice/url/here" to actually load up "index.xsp?path=nice/url/here" or the like. Generally, if you want to do this, you either have to set up Web Site rules in names.nsf or settle for next-best options like "index.xsp/nice/url/here".

Since an HttpService comes in at a low-enough level to tackle this, though, it's entirely doable to improve this situation there. So, this morning, I did just that. This new project is a pretty simple one, with all of the action going on in one class.

The way it works is that it looks for a ".nsf" URL and, when it finds one, attempts to load a file or classpath resource named "nsfrouter.properties". That resource is a Java Properties file enumerating the regex-based routes you'd like. For example:

foo/(\\w+)=somepage.xsp?bit=bar
baz=somepage.xsp

When found, the class loads up the rules and then uses them to check incoming URLs.

The #doService method then picks up that URL, does a String#replaceAll call to map it to the target, and then redirects the browser over:

(Screenshot: NSF Router in action)

The user still ends up at the "uglier" URL, but that's the safest way to do it without breaking on-page references.
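
As a rough, standalone illustration of that mapping step (simplified, not the project's actual code), the rules from nsfrouter.properties boil down to a matches/replaceAll pass over the in-NSF path:

Properties routes = new Properties();   // in practice, loaded from nsfrouter.properties
routes.setProperty("foo/(\\w+)", "somepage.xsp?bit=bar");
routes.setProperty("baz", "somepage.xsp");

String pathInfo = "foo/12345"; // the portion of the URL after "foo.nsf/"
for (String pattern : routes.stringPropertyNames()) {
    if (pathInfo.matches(pattern)) {
        String target = pathInfo.replaceAll(pattern, routes.getProperty(pattern));
        // the service then sends a redirect to the NSF base path plus this target
        break;
    }
}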

I felt like that was a neat little exercise, and one that's not only potentially useful on its own but also serves as a good way to play around with these somewhat-lower-level Domino components.

Carving Out A Workspace On Apple Silicon

Wed Feb 17 11:24:19 EST 2021

Last month, I mentioned my particular computer trouble, in that my trusty iMac Pro has been afflicted by an ever-worsening fan noise problem. I'd just been toughing it out, since there's never a good time to lose your main machine for a week or two, and my traveler MacBook Escape wasn't up to the task of being a full replacement.

After about a month's delay, my fresh new M1 MacBook Air arrived a few weeks ago and I've been putting it through its paces.

The Basics

As pretty much anyone who has one of these computers has said, the performance is outstanding. For the most part, even with emulation, most of the tasks I do during the day feel the same as they did on my wildly-more-expensive iMac Pro. On top of that, the fact that this thing doesn't even have a fan is both a technical marvel and a godsend as far as ambient room noise is concerned.

For continuity's sake, I used Migration Assistant to bring over my iMac's environment, and everything there went swimmingly. The good-citizen apps I use like MarsEdit and Tower were already ported to ARM, while the laggards (unsurprisingly, the ones made by larger companies with more resources) remain Intel-only but run just fine in emulation.

Hardware

For a good while now, I've had the iMac screen flanked by a pair of similarly-sized but far-inferior Asus screens. With the iMac's lovely screen out of the setup for now, I've switched to using those two Asus screens as my primary ones, with the pretty-but-tiny laptop screen sitting beneath them. It works well enough, though I do miss the retina resolution and general brightness of the iMac.

The second external screen itself was a bit of an issue. On their own, these M1 Macs, either for good reason or to mark them as low end, support only two screens total, the laptop screen included. So I ended up ordering one of the StarTech DisplayLink adapters. I expected it to be a crappy experience overall, with noticeable lag, but it actually works much more smoothly than I'd have expected. Other than the fact that it doesn't support Night Shift and some wake-from-sleep slowness that I attribute to it, it actually feels just like a normally-attached monitor.

I also, in order to regain my precious Ethernet connection and (sort of) clean up the dongle situation, got one of these Anker USB-C docks. I've only had it for a day, but it seems to be working like you'd want so far. So that's nice.

Eclipse and Java

Here's where I've hit my first bout of jankiness, though it's not too surprising. In general, Eclipse and Java work just fine through emulation, and I can even keep running tests and web servers using the libnotes.dylib from the Notes client as I want.

I've found times where tests lag or fail now when they didn't before, though, and that's a little ominous. Compiling locally with NSF ODP, which spawns a sub-process that loads the Notes libraries, usually works, though now I've set up another Domino server on my network to handle that reliably.

I've also noticed some trouble in one of my Eclipse workspaces where it periodically spends a long time (10+ minutes) "Building" without explaining what exactly it's doing, and this is new behavior since the switch. I can't say what the core trouble is there. It's my largest active workspace, so it could be that file polling or other system-call-intensive work is just slower, or it could be an artifact of moving it from machine to machine. I'll probably scrap it and make a new workspace with the same projects to see if it alleviates it.

This all should improve in time, though, when Eclipse, AdoptOpenJDK, and HCL all release macOS ARM ports. IntelliJ has an experimental ARM port out, and I'm curious how that does its thing. I'll probably spend some time kicking the tires on that, though I still find Eclipse's UI much more conducive to the "lots of semi-related projects" working style I have. Visual Studio Code is in a similar boat, so that'll be good for the JavaScript development I do (under protest).

In the mean time, I've done some tinkering with how I could get a fully-native Eclipse environment running and showing up on my Mac, including firing up the venerable XQuartz to run Eclipse as an X client from a Linux VM in the basement. While that technically works, the experience is... well, I'll charitably call it "not Mac-like". Still, it's kind of neat and would in theory push aside any number of concerns.

Docker

Here's the real trouble I'm butting my head against. I've taken to using Docker more and more for various reasons: running app servers with a Domino runtime, running Domino outright, and (where my trouble is now) performing cross-compilation and other native-specific compilation tasks. For example, for one of my clients, I have a script that mounts the project directory to a Docker container to perform a full Maven build with NSF compilation and compile-time tests, without having to worry about the user's particular Notes or Domino installation.

However, while Docker is doing Herculean work to smooth the process, most of the work I do ends up hitting one of the crashing snags in poor qemu, which crop up particularly with Java compilation tasks. Since compiling Java is basically all I do all day, that leaves me hoping either for improvements in future versions or a Linux/aarch64 port of Domino (or at least libnotes.so).

In the mean time, I'm making use of Docker's network transparency to run Docker on an x64 VM and set DOCKER_HOST locally to point to it. For about half of what I need, this works great: I can run Domino servers and Notes-enabled webapps this way, and I just change which address I'm pointing to to interact with them. However, it naturally removes the possibility of connecting with the local filesystem, at least without pairing it with some file-share jankiness, so it's not a replacement all around. It also topples quickly into the bizarre inner Docker world: for example, I wanted to set up Codewind to work remotely, but the instructions I found for getting started with your own server were not helpful.

Future Use

Still, despite the warts, I'd say this laptop is performing admirably, and better than one would normally expect. Plus, it's a useful exercise in finding more ways to make my workflow less machine-specific. Though I still bristle at the thought of going full Eclipse Che and working out of a web browser, at least moving some more aspects of my workspace to float above the rough seas is just good practice.

I'll probably go back to using the iMac Pro as my main machine once I get it fixed, even if only for the display, but this humble, low-end M1 has planted its flag more firmly than a MacBook Air normally has any right to.

Java Travelogue: The Care and Feeding of Locales

Sun Feb 14 13:37:14 EST 2021

Tags: java
  1. Java Hiccups
  2. Bitwise Operators
  3. Java Grab Bag 2
  4. Java Travelogue: The Care and Feeding of Locales
  5. More Notes on Filesystem and Charset Portability

Over time, people using the NSF ODP Tooling project have periodically hit troubles with files using non-ASCII filenames, as well as some related encoding issues.

Now, I know what you're thinking: why don't people hitting this trouble just be Americans and not use languages with accents? And yes, obviously, that's the optimal solution. However, given that, apparently, most people on the planet are not American, it's for the best to not write software that completely falls apart when encountering an umlaut.

When working to fix this, I found some areas where the fix was pretty obvious, and others where the trouble was a bit more insidious. I figure it'll be potentially useful to write these down, either for others running into similar trouble or my own future self next time I write overly-American code.

Early Encounters: ZIP Files

The earliest place people encountered trouble was with the handling of ZIP files when transferring packages around. When compiling remotely, the local Maven plugin ZIPs up the ODP and related support files (OSGi sites, etc.) for transfer to the server, which then unzips them. This led to a problem because the handling of file names in ZIP files is wildly inconsistent across platforms and locales.

Fortunately, this one has a clean fix: when using ZipOutputStream and ZipInputStream (which were my preferred mechanisms), you can specify your encoding:

try(OutputStream fos = Files.newOutputStream(packageZip, StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)) {
    try(ZipOutputStream zos = new ZipOutputStream(fos, StandardCharsets.UTF_8)) {
        // Add entries to the ZIP here
    }
}

// And to read:
try(InputStream is = Files.newInputStream(zipFilePath)) {
    try(ZipInputStream zis = new ZipInputStream(is, StandardCharsets.UTF_8)) {
        // Iterate over entries here
    }
}

Since I control both sides of the operation in this case, I can then be confident that it will use UTF-8 across the board.

Next Problem: Filesystem Restrictions

The next problem I ran into actually happened when I was setting up a compiler server in a Docker container. One of the design elements in the example projects is an agent containing umlauts, based on a reported problem. When I tried compiling this project in a Docker-housed Domino server, I ran into this trouble:

java.nio.file.InvalidPathException: Malformed input or input contains unmappable characters: Code/Agents/Example Agent with ref?r?ns.fa
    at sun.nio.fs.UnixPath.encode(UnixPath.java:147)
    at sun.nio.fs.UnixPath.<init>(UnixPath.java:71)
    at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
    at sun.nio.fs.AbstractPath.resolve(AbstractPath.java:53)
    at org.openntf.nsfodp.compiler.servlet.ODPCompilerServlet.expandZip(ODPCompilerServlet.java:241)

Basically, it was trying to write out what it considered an illegal filename and choked on it.

I first spent some time double-checking my ZIP handling, since I was assuming that the trouble was that the name it got out of the ZIP file was corrupted, hence the "?" in place of the expected accented character. This search brought me to this Stack Overflow question, which asks about the same exception and talks about the locale of the underlying system. The gist of it is that Java uses a semi-standard property (sun.jnu.encoding) to interpret a lot of things, filename mapping included, and it derives this from the system locale.

I hopped into the Domino container to see what locale it uses (by way of echo $LANG) and saw that it's "C.utf8". I like the sound of that "utf8" part, but the "C" part is different from the comfy "en_US" that I'm used to, and likely causes Java to be more restrictive. Uncharacteristically, the typical "en_US" setup actually avoids this trouble, causing Java NIO to allow all sorts of characters in filenames.

So I started seeing what I could do by way of setting ENV variables as part of the Dockerfile, but then realized that it'd be better to fix this in a way that doesn't depend on external configuration like that.

Java NIO

Here I realized that I didn't actually need to write these files out to the filesystem at all. Over a year ago, I wrote part 1 of an unfinished series talking about the Java NIO filesystem API from Java 7. That API exists for a number of reasons, and the best way to dive into it is to replace your uses of java.io.File, java.io.FileInputStream, etc. with it, which I did in the NSF ODP Tooling a while ago.

What struck me, then, was that this earlier work also separated out the specifics of filesystem access. And, critically, Java ships with a ZIP file system provider that lets you point at a ZIP or JAR file and treat it like any old filesystem. The on-disk project representation I wrote for the compiler uses this NIO API as its entrypoint. By skipping the step of extracting the ODP from the ZIP to the filesystem, I could remove that entire problem from my view.
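
As a quick sketch of what that looks like in practice (the path here is hypothetical), you open the ZIP as its own FileSystem and then work with Paths created from it:

Path zipPath = Paths.get("/tmp/odp-package.zip");
URI uri = URI.create("jar:" + zipPath.toUri());
Map<String, String> env = Collections.singletonMap("create", "true");
try (FileSystem zipFs = FileSystems.newFileSystem(uri, env)) {
    // Paths must come from the ZIP filesystem itself, not from Paths.get
    Path odpRoot = zipFs.getPath("/");
    try (Stream<Path> contents = Files.walk(odpRoot)) {
        contents.forEach(System.out::println);
    }
}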

The Fiddly Parts

This process was mostly smooth, but there are a few fiddly parts that I had to account for:

  1. You have to use newFileSystem when you crack open a ZIP this way, rather than trying to open it by "jar:file" URL directly. Additionally, you have to pass a Map of options including "create":"true" to make it work.
  2. Paths.get, which is a common mechanism for creating either a full or relative path, is a bit insidious. Since those paths are created using the default system filesystem, you can't just pass them to methods like resolve for paths created from another filesystem type. Accordingly, I replaced uses of that with methods based on a context filesystem.
  3. Nested ZIPs aren't supported. That is, they exist like other files in there, but you can't reach further inside of them with a "jar:jar:file" URL. So, when building the classpath for compilation, I have to extract them. I suppose this part is technically a bug if those files have non-ASCII names, but that's rare enough to hopefully not be an issue.

Once I dealt with those, though, things went surprisingly smoothly. I even refactored earlier code to use this, replacing more-complicated streaming logic with conceptually-simpler file-copying logic. My guess is that this new route is slower, but the difference is negligible for my needs, so I'll take the higher abstraction here.

Stream Locales

Unfortunately, while that helped a bit and is definitely conceptually neat, it didn't solve all my trouble. If I recall correctly, at this point, I was able to get the file imported, but the agent name itself was mangled in Notes, something that didn't happen when I compiled it locally.

This brought me to looking into locales used when reading and writing XML from the ZIP or filesystem. Hypothetically, I had done this cleanly. My file-reading utility methods were very similar, just opening up an InputStream (which is too low-level to care about encoding) and passing it along to IBM Commons utilities to interpret it:

public static String readFile(Path path) {
    try(InputStream is = Files.newInputStream(path)) {
        return StreamUtil.readString(is);
    } catch(IOException e) {
        throw new RuntimeException(e);
    }
}

public static Document readXml(Path file) {
    try(InputStream is = Files.newInputStream(file)) {
        return DOMUtil.createDocument(is);
    } catch(IOException | XMLException e) {
        throw new RuntimeException(e);
    }
}

However, I realized that these were insidious traps, too. By not handling encoding on my side, I was leaving it up to the internals to pick a default encoding, which isn't guaranteed to be UTF-8 (even though it really should be for XML). StreamUtil.readString there has a variant that takes an encoding as the second argument, but I decided to instead handle this one step earlier. Rather than using InputStream, which deals with bytes directly, I decided to switch to Readers, which are more specialized for dealing with character sequences. The Files class provides methods to do this:

public static String readFile(Path path) {
    try(Reader r = Files.newBufferedReader(path, StandardCharsets.UTF_8)) {
        return StreamUtil.readString(r);
    } catch(IOException e) {
        throw new RuntimeException(e);
    }
}

public static Document readXml(Path file) {
    try(Reader r = Files.newBufferedReader(file, StandardCharsets.UTF_8)) {
        return DOMUtil.createDocument(r);
    } catch(IOException | XMLException e) {
        throw new RuntimeException(e);
    }
}

This way, it's explicit what I'm doing, and it allows for extra optimization at the NIO level if possible.

Writing Back Out

These rules also applied to writing back out. For the most part, Files.newBufferedWriter(..., StandardCharsets.UTF_8) was the way to go, though I did find one extra insidious bit:

try(PrintWriter writer = new PrintWriter(os)) {
    // ...
}

Here, PrintWriter doesn't have a character-set argument at all, and so one could be forgiven (hopefully) for just kind of assuming it'll use Unicode. However, delving into the implementation, it uses OutputStreamWriter's no-charset constructor, which in turn calls Charset.defaultCharset(), and there's your potential bug. Since I didn't actually need PrintWriter as such, I replaced this with a charset-specific call and all was well:

try(Writer writer = new OutputStreamWriter(os, StandardCharsets.UTF_8)) {
    // ...
}

Overall

I felt that this was a pretty good exercise to perform, not just because it'll be immediately useful for NSF ODP, but also because it's a good reminder to be more diligent about character encoding. And it's also just a good lesson for two critical parts of programming: take the higher abstraction when you can and be as explicit as possible in your intent.

By switching to using the ZIP filesystem implementation, I was able to remove an entire step and problem domain from my plate. Now, the code that reads and writes filenames server-side should be able to run on basically any locale setting, without concern for the restrictions of the filesystem (within reason). The code is simpler, the operations are the same whether it's working with the filesystem directly or not, and reading the ZIP'ed ODP should actually be slightly more efficient.

And for the rest, explicitly picking your character set is just good practice. Even in a case where the documentation says that it will default to UTF-8, I think it's better to do it this way, so anyone reading your code can see what you're doing without resting on implied behavior. Certainly, you can be too explicit in places where relying on natural behavior makes sense, but this highlighted that character sets aren't one of those cases.

fontconfig, Java, and Domino 11

Fri Jan 29 12:09:35 EST 2021

Tags: java
  1. AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Domino 11's Java Switch Fallout
  3. fontconfig, Java, and Domino 11
  4. Notes/Domino 12.0.2 Fallout
  5. Notes/Domino 14 Fallout

In my last post, I quickly mentioned some trouble I had run into with fontconfig and Poi, in the context of configuring a Docker-based Domino server. However, I think it deserves its own post, so I have something to point to if others run into the same trouble down the line.

The Upshot

The upshot of the issue is that, if you're going to use Poi or other graphics-adjacent Java libraries in Domino 11 on Linux, you'll need fontconfig and potentially some other support files installed on your system. If you have any GUI stuff installed, they'll probably be there, but it's common for them to be missing on headless servers.

For the official Domino Docker image, which uses Red Hat's package system, I wrote this Dockerfile for my derivative version:

FROM domino-docker:V1101FP2_10202020prod
USER root
RUN yum install --assumeyes fontconfig urw-fonts
USER notes

On Debian-based systems, I believe you just need apt install fontconfig.
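For example, if you're building a derivative of a Debian-based image, a minimal sketch might look like the following - note that the base image name here is hypothetical, since the official image is Red Hat-based:

FROM some-debian-based-domino-image
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends fontconfig && \
    rm -rf /var/lib/apt/lists/*
USER notes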

The Details

AdoptOpenJDK builds of Java apparently don't include the same font-related support files that the Oracle ones did, which results in calls to the AWT layer throwing NullPointerExceptions at various points related to getting font information. This has shown up in a couple of issues in the AdoptOpenJDK projects on GitHub, with two representative ones being:

https://github.com/AdoptOpenJDK/openjdk-support/issues/80

java.lang.NullPointerException
  at sun.awt.FcFontManager.getDefaultPlatformFont(FcFontManager.java:76)
  at sun.font.SunFontManager$2.run(SunFontManager.java:433)
  ...

https://github.com/AdoptOpenJDK/openjdk-build/issues/693

java.lang.NullPointerException
  at sun.awt.FontConfiguration.getVersion(FontConfiguration.java:1264)
  at sun.awt.FontConfiguration.readFontConfigFile(FontConfiguration.java:219)
  at sun.awt.FontConfiguration.init(FontConfiguration.java:107)
  ...

Domino 11 switched from IBM's proprietary variant of J9 to OpenJ9, and this is another one of the little fiddly details that isn't quite the same between the two.

Most commonly, I've found this crop up when using Poi, specifically when calling autoSizeColumn while generating a spreadsheet, but in theory any number of things like that will trip across this. Unfortunately, the internal JVM classes aren't terribly helpful in their error reporting, since they get several method calls in, just assuming that all is well with the world, before bailing with NPEs like the ones above.
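For reference, this is the kind of innocuous-looking code that exercises that font-metrics path - a minimal Poi sketch (file name and cell contents are arbitrary):

import java.io.FileOutputStream;
import java.io.OutputStream;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class AutosizeExample {
    public static void main(String[] args) throws Exception {
        try (Workbook workbook = new XSSFWorkbook()) {
            Sheet sheet = workbook.createSheet("Example");
            Row row = sheet.createRow(0);
            row.createCell(0).setCellValue("Some reasonably long cell value");

            // This measures rendered text width via the AWT font machinery, which is
            // where the FontConfiguration NPEs surface on a font-less server
            sheet.autoSizeColumn(0);

            try (OutputStream os = new FileOutputStream("example.xlsx")) {
                workbook.write(os);
            }
        }
    }
}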

It's a mild annoyance to deal with, but fortunately one with a straightforward fix, at least once you know what the trouble is.

Getting Started with Hotwire in a Java Webapp

Tue Jan 12 17:19:11 EST 2021

Whenever I have a great deal of discretion over how a web app is made these days, I like to push to see how simple I can make the front end portion. I spend some of my client time writing heavy client-JS front ends in React and Angular and what-have-you, and, though I get why they are good, I kind of hate them all.

One of the manifestations of my desires has been this very blog, where I set out not only to try some interesting current tools on the Java side, but also to challenge myself to use little to no JavaScript. On that front, I was tremendously successful - and, in fact, the only JavaScript on here is the Turbolinks library, which intercepts same-app links and updates the changed parts inline, without the server knowing about the "partial refresh" going on.

Since then, Turbolinks merged with its cousin Stimulus and apotheosized into Hotwire, which is somewhere in between a JavaScript framework and a manifesto. Specifically, it's a manifesto to my liking, so I've been champing at the bit to use it more.

Hotwire Overview

The "Hotwire" name is a cheeky truncation of HTML-over-the-wire, which itself is a neologism for how the web has historically worked: your server sends HTML, and then your browser does stuff with that. It "needs" a new name to set it apart from full-JS apps, which amount to basically sending an application to the browser, having it initialize the app, and then having the app do what would otherwise be the server's job by way of shuttling JSON around.

Turbo is the part that subsumed Turbolinks, and it focuses on enhancing existing HTML and providing a few web components to bring single-page-application niceties to server-rendered apps. The "Drive" part is Turbolinks, so that was familiar to me. What interested me next was Turbo Frames.

Turbo Frames

If you've ever used the XPages Dojo Tab Container's partialRefresh property before, Turbo Frames will be familiar. There are two main ways you can go about using it: making a "frame" that contains some navigable content (say, a form) that will then refresh in-place, or making a lazy-loaded frame that pulls from another URL. The latter is what interested me now, and it carries similar benefits to the Tab Container. It lets you serve the main page and then defer the potentially-expensive generation of an inner part without having to write your own JavaScript to do an API call or otherwise populate the section.

In my case, I wanted to do something very similar to the example. I have my main page, then a sidebar that can be potentially complicated to generate. So, I set up a Turbo Frame using this bit of JSP:

<turbo-frame id="links" src="${pageContext.request.contextPath}/links"></turbo-frame>

The only difference from the example, really, is the bit of EL in ${...}, which just makes sure that the final URL adapts to wherever the app is hosted.

The "links" resource there is another MVC controller that renders a different JSP page, truncated like:

<html>
    <head>
        <script type="text/javascript" src="${pageContext.request.contextPath}/webjars/hotwired__turbo/7.0.0-beta.2/dist/turbo.es5-umd.js"></script>
    </head>
    <body>
        <turbo-frame id="links">
            <!-- expensive content here -->
        </turbo-frame>
    </body>
</html>

The <turbo-frame id="links"> on the initiating page matches up with the one in the embedded page to figure out what to extract and render.

One little side note here is my use of WebJars to bring in Turbo. This isn't an NPM-based project, so there's no package.json bringing the dependency in, but I also didn't want to just paste the JS into my project. Fortunately, WebJars does yeoman's work: it makes various JS libraries available in Servlet-friendly Java JAR format, giving you a JAR with the JS from whatever the library is in META-INF/resources. In turn, an at-least-reasonably-modern servlet container will serve files up from there as if they're part of your main app. That way, you can just use a Maven dependency and not have to worry.
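In Maven terms, that amounts to a single dependency. These coordinates are what I'd expect for the NPM-sourced WebJar of Turbo, based on the path served above, so double-check them against whichever artifact you actually pull in:

<dependency>
  <groupId>org.webjars.npm</groupId>
  <artifactId>hotwired__turbo</artifactId>
  <version>7.0.0-beta.2</version>
</dependency>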

A Hitch: 406 Not Acceptable

Edit 2021-01-13: Thanks to a new release of Turbo, this workaround is no longer needed.

When I first put this together, I saw that Turbo was doing its job of fetching from the remote URL, but it was getting a 406 Not Acceptable response from the server. It took me a minute to figure out why - the URL was correct, it was just a normal GET request, and nothing immediately stood out as a problem in the headers.

It turned out that the trouble was in the Accept header. To work with other Turbo components, Frames makes a request with a header like Accept: text/html; turbo-stream, text/html, application/xhtml+xml. That first one - text/html; turbo-stream - is problematic. I'm not sure if it's the presence of a qualifier at all on text/html, the space, or the lack of an = (as in text/html;charset=UTF-8), but Liberty didn't like it.

Since I'm not (yet, at least) using Turbo Streams, I decided to filter this out on the server. Since MVC is built on JAX-RS, I wrote a JAX-RS request filter to find any Accept values of this type and strip them out:

@Provider
@PreMatching
public class TurboStreamAcceptFilter implements ContainerRequestFilter {
    @Override
    public void filter(ContainerRequestContext requestContext) throws IOException {
        MultivaluedMap<String, String> headers = requestContext.getHeaders();
        if(headers.containsKey(HttpHeaders.ACCEPT)) {
            List<String> cleaned = headers.get(HttpHeaders.ACCEPT).stream()
                .map(accept -> {
                    String[] vals = accept.split(",\\s*"); //$NON-NLS-1$
                    List<String> localClean = Arrays.stream(vals)
                        .filter(val -> val.indexOf(';') < 0)
                        .collect(Collectors.toList());
                    return String.join(", ", localClean); //$NON-NLS-1$
                })
                .collect(Collectors.toList());
            headers.put(HttpHeaders.ACCEPT, cleaned);
        }
    }
}

Since those filters happen before almost anything else, this cleared up the trouble.

Summary

Setting the Accept quirk aside, this was a pleasant success, and I look forward to using this more. I've found the modern Java stack of JAX-RS + CDI + MVC + simple JSP to be a delight, and Hotwire slots perfectly-smoothly into it. I still quite enjoy rendering HTML on the server and the associated perk of not having to duplicate business logic on both sides. Next time I have an app that requires a bit of actual JavaScript, I'll likely throw Stimulus into the mix here.

The Difficulties of Domino Project Dependencies

Thu Dec 31 09:38:22 EST 2020

Tags: java maven

This post is a drum I've been banging for a long time, from nagging the dev team in the IBM days through to formally requesting it in HCL's Ideas Portal. That idea there has been "Likely to implement" for a little while now, which is heartening, and either way I figured it'd be useful to have a proper blog post explaining the trouble and what a useful better version would be.

The Core Trouble

The main thing I'm talking about here is the act of having a third-party or (particularly) open-source project that depends on Domino artifacts - namely, Notes.jar, the NAPI, and the XPages UI components. I have more than a few such projects, so it's something I deal with pretty much daily.

When you're dealing with an XPages app in an NSF, this isn't really an issue: all the parts you need are there and are part of the classpath. You just reference lotus.domino.Database or com.ibm.xsp.extlib.util.ExtLibUtil and don't even give it a second thought. When you have a project outside of an NSF or Designer, though, you start to have to worry about this.

OSGi Projects

For OSGi-based projects, this means that you need to have a Target Platform that points to the XPages artifacts and then either have a variant of that that includes a packaged Notes.jar or also include Notes.jar in your classpath another way. In Eclipse, this might be accomplished by adding Notes.jar to your active JVM and referencing a Notes or Domino installation's OSGi directories - this is something the XPages SDK helps with.

The immediate trouble this involves is if you want to build this project outside of Eclipse - most commonly now with Maven. This is where the IBM Domino Update Site for Build Management came in, which is a cleanly-packaged p2 update site of the XPages artifacts and Notes.jar, suitable for use with Maven+Tycho and any other tool (like Eclipse) that gets its dependencies out of a p2 repository. Unfortunately, it hasn't been updated since its initial release, and contains just the original 9.0.1 versions.

To aid with creating updated versions of that, I created the generate-domino-update-site tool a while back. Since no one outside HCL can legally share update sites themselves, the tool is the next-best thing: point it at Notes or Domino and it'll make one for you in a consistent way.

With either of those routes, though, there's still a gotcha: you still need to have each developer set up the update site for themselves, and it's only consistent across projects because the community settled on the notes-platform Maven property as a URI pointing to the update site. This is as opposed to something like Eclipse-the-IDE's repositories, which (as a virtue of being open-source) are publicly available and can be referenced freely.

Overall, it's a drag having to bring-your-own-site, but at least the use of notes-platform as a pseudo-standard smooths it out.
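For illustration, the Tycho side of that convention usually amounts to a p2 repository entry like this in the root pom.xml, with each developer pointing the notes-platform property (typically in their settings.xml) at their own copy of the update site - a sketch of the idiom, not an exact recipe:

<repositories>
  <repository>
    <id>notes-platform</id>
    <layout>p2</layout>
    <url>${notes-platform}</url>
  </repository>
</repositories>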

Non-OSGi Projects

Things get stickier with non-OSGi projects, though. With OSGi projects, the dependency mechanism lines up with the way the artifacts are delivered from the vendor: they all have OSGi metadata (or have a ready-made hook for it, like Notes.jar) and so just making a p2 site out of them makes them ready to go. They don't, though, have Maven metadata, and so referencing them that way takes extra processing.

I've gone about this two ways to date:

  • The aforementioned update site project also has a mechanism for "Mavenizing" update sites. You point the tool at an existing p2 site (like one created by the first step), pick a groupId for it, and it'll install the files into your local repository.
  • The P2 Maven Resolver plugin, which cuts out that middle step and uses a p2 repository as a source of Maven dependencies directly. This route is more "clever", but some tools get a little shaky with it.

Either way, the experience is okay but not perfect. There are some oddities to do with the different dependency mechanisms between OSGi and Maven, but overall it gets the job done.

The core trouble with it is that it's even less consistent across developers/projects than the Tycho notes-platform idiom. I've personally gone through a couple iterations of the Mavenized layout, with different inter-dependency schemes and groupIds. That leads to drift and incompatibility among projects. For example, I use the xpages-runtime project for client work to do my lingering XPages development, and there's some friction in keeping the dependency schemes between that and the client project in line, even though I'm the only developer.

What I'd Like

What I'd really like would be an official HCL-provided or -sanctioned repository for p2 and Maven use for these artifacts. I've pitched the idea of OpenNTF hosting this, since I already have the tools and servers on hand, though we'd have to come up with a way to agree about who is legally allowed to access it. All the better would be consistently-updated HCL-hosted repositories, where they could link access to one of the various HCL accounts we tend to have.

The best route would be to publish it on a repository that doesn't require authentication. While I'm making wishes, attaching Javadoc would be a classy touch too.

Anyway, that's the gist of it. It's one of the two main thorns in my side when doing Domino-targeted development (the other being initializing the runtime itself in the process), and it'd save me a whole lot of heartache if it had a proper solution.

Quick Tip: JDK Null Annotations for Eclipse

Thu Dec 10 15:17:18 EST 2020

  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse
  3. The Joyful Utility of Optionals in Java

A few years back, I more-or-less found the religion of null analysis, and I've stuck with it with at least my larger projects.

One of the sticking points all along, though, has been Eclipse's lack of knowledge about what code not annotated with nullness annotations does, with the biggest blind spot being the JDK itself. For example, take this bit of code:

BigDecimal foo = BigDecimal.valueOf(10).add(BigDecimal.ONE);

That will never throw a NullPointerException, but, since BigDecimal#valueOf isn't annotated at all, Eclipse doesn't know that for sure, and so it may flag it as a potential problem. To deal with this, Eclipse has the concept of external annotations, where you can associate a specially-formatted file with a set of classes and Eclipse will act as if those classes had nullness annotations already.

Core JDK Annotations

Unfortunately - and as opposed to things like IntelliJ - Eclipse for some reason doesn't ship with this knowledge out of the box. For a while, I just dealt with it, throwing in technically-unnecessary checks around things like Optional#get that are guaranteed to return non-null. The other day, though, I decided to look into it more and found lastNPE.org, which is a community-driven project to provide such external annotations.

Better still, they also provide an Eclipse plugin (404 expected on that link - Eclipse knows what to do with it) to apply rules from your project's Maven configuration to the IDE. This not only covers applying external annotations, but also synchronizing compiler configurations.

Sidebar: The Eclipse Compiler

By default, a Java project is compiled with javac, the stock Java compiler. Eclipse maintains its own compiler, varyingly called ECJ or (as shorthand) JDT. Eclipse's compiler is, unsurprisingly, well-geared towards IDE use, and part of that is that it can flag and process a great deal of semantic and stylistic issues that the stock compiler doesn't care about. This includes null annotations.

Maven Configuration

With this information in hand, I went to configure my project's Maven build. The first step was to change it to use Eclipse's compiler, since I had recently switched the project away from being Tycho-based (which uses ECJ by default). This can be done by configuring maven-compiler-plugin:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <failOnWarning>false</failOnWarning>
        <compilerId>jdt</compilerId>
        <compilerArguments>
          <properties>${project.basedir}/../../config/org.eclipse.jdt.core.prefs</properties>
        </compilerArguments>
      </configuration>
      <dependencies>
        <dependency>
          <groupId>org.eclipse.tycho</groupId>
          <artifactId>tycho-compiler-jdt</artifactId>
          <version>1.7.0</version>
        </dependency>
      </dependencies>
    </plugin>
  </plugins>
</build>

I use an inline dependency on Tycho's tycho-compiler-jdt to provide the compiler. I stuck with version 1.7.0 for now because Tycho 2.0+ uses the newer core runtime that requires Java 11, which this project does not yet for platform lag reasons. I also find it useful to set <failOnWarning>false</failOnWarning> here because ECJ throws many more (legitimate) warnings than javac. Long-term, it's cleanest to keep this enabled.

I also configured Eclipse's compiler settings like I wanted for one of the project's modules, then copied the settings file to a common location. That's where the compilerArguments bit comes from.
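For reference, that shared org.eclipse.jdt.core.prefs file is just a plain properties file. A few representative lines might look like the following - the exact keys depend on what you enable in the Errors/Warnings preferences, so treat this as a sketch:

eclipse.preferences.version=1
org.eclipse.jdt.core.compiler.annotation.nullanalysis=enabled
org.eclipse.jdt.core.compiler.problem.nullReference=error
org.eclipse.jdt.core.compiler.problem.potentialNullReference=warning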

Then, I went through the available libraries from lastNPE.org, found the ones that match the libraries we use, and added them as dependencies in my root project:

<properties>
  <lastnpe-version>2.2.1</lastnpe-version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.lastnpe.eea</groupId>
    <artifactId>jdk-eea</artifactId>
    <version>${lastnpe-version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- and so on -->
</dependencies>

Once I updated the project configurations, Eclipse churned for a while, and then I got to work cleaning up the giant pile of new errors and warnings it brought up. As usual with null checks, this was a mix of "oh, nice catch" and "okay, sure, technically, but come on". For example, it flags System.out.println as a potential NPE because System.out is assignable - this is true, but realistically my app's code is going to be the least of my concerns when System.out is set to null.

In any event, I was pleased as punch to find this. Now, I have a way to not only properly check nullness with core classes and common libraries, but it's a way that's shared among the whole project team automatically and enforced at compile time.

Java Discontinuities in Practice

Mon Nov 23 11:04:22 EST 2020

Tags: java

Earlier this year, I wrote a post about the lay of the Java land, and in it I mentioned the oddities of post-8 Java releases as well as the then-oncoming namespace conversion in Jakarta EE. Those changes are a bit more "real" now, so I think it's worth taking the opportunity to expand on them and how they relate to Java with Domino.

Jakarta EE 9

With Jakarta EE 9 officially out now, I think it's all the more important to keep an eye on what these changes are. For Jakarta's part, there's a convenient post up on Eclipse's site detailing the specifics of what's going on, and most of what I say here is really just going to rehash that.

The "namespace conversion" in question is the switch from javax.* to jakarta.* for EE-related packages like Servlet, due to Oracle not granting rights to the "javax" term. This has involved a lot of fiddly work internally for the Jakarta project as a whole, and all of the included specs have received a major-version bump to reflect the break. In general, these new versions are functionally equivalent to the previous release, but use the different package names - so Servlet 5 has the same capabilities as 4, JSF 3 as 2.3, and on down the line.

A bit of a quirk in this is that not all classes in the javax.* namespace will be moving to jakarta.*, because not everything in there was part of Java EE. For example, Swing is in javax.swing, but it's not going anywhere. It gets fiddlier, too, especially when it comes to XML. The JDK traditionally (more on that in a bit) contained a couple distinct technologies wrapped up under the javax.xml package space, but some of those are actually part of Java EE and make the transition to Jakarta. For example, the javax.xml.transform package (covering XSLT) is part of what was originally termed "Java API for XML Processing", or "JAX-P", and is still part of the Java SE core. The javax.xml.bind package (covering mapping between XML and Java objects) was part of the "Java Architecture for XML Binding" API, or "JAX-B", and is not part of Java SE anymore. It's now "Jakarta XML Binding" and is receiving a package change to jakarta.xml.bind. I think it's cases like these that will be hairy for a lot of people not doing full Jakarta EE 9 work.
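To make that concrete, here's a tiny, purely illustrative class showing which imports move and which stay put (the class itself is just a stand-in):

// Jakarta EE 8 and earlier used: import javax.xml.bind.JAXBContext;
import jakarta.xml.bind.JAXBContext;   // Jakarta XML Binding 3.0+ (Jakarta EE 9)
import jakarta.xml.bind.JAXBException;

// JAX-P stays where it is, since it remains part of Java SE
import javax.xml.transform.TransformerFactory;

public class NamespaceExample {
    public static void main(String[] args) throws JAXBException {
        JAXBContext context = JAXBContext.newInstance(NamespaceExample.class);
        TransformerFactory factory = TransformerFactory.newInstance();
        System.out.println(context + " / " + factory);
    }
}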

For the most part, this won't have an effect on Java development on Domino for a good while. Domino has never tracked changes in the EE world - XPages was a partial fork of Java EE 5 and that's been about it. I think that the ways it will affect Domino development (other than if you just outright do Jakarta EE development, which you should) is that code examples and third-party libraries are going to gradually transition over to the new namespaces, making them incompatible with code in the Domino stack. This will certainly affect things like my XPages Jakarta EE Support project, where future versions of the implementation components won't be usable directly if they use the Servlet spec, even if they don't require Servlet 3+ functionally.

So I think it's worth being aware of what's going on, even if there's not (yet) anything you need to do about it. The same applies to the changes in the core Java runtime itself.

Java 11 and Beyond

After 8, Java switched to a peculiar numbering system, where new major-version-numbered releases come out every six months, but only the ones that come out every three years are Long-Term Support (LTS) releases. As of right now, the current version of Java is 15, but 11 is the active LTS one, and so 11 is effectively the "real" current version for concerns like platform vendors. Java 8 is now in the same spot that Java 6 was for a while, where it's been the baseline expectation for a long time, and it's a slog of a process to move the full community past it.

Still, Java 11 is certainly hitting critical mass now. Eclipse-the-IDE started requiring it in the 2020-09 release, and the various app servers have either supported it for a while or are on the cusp of doing so.

There are a lot of nice things added to the language in the releases past 8, but they've also gotten more aggressive about removing things from the core Java SE runtime, and those changes are the things likely to be immediately noticeable Domino-wise. As I mentioned above, JAX-B was always technically an EE specification, but it was shipped with Java SE for a good long time. As of Java 11, though, it's gone, and instead must be either provided by the app server or brought in as an explicit dependency. The same goes for some less-important packages, such as org.omg - though that package sounds fun, it stands for "Object Management Group" and it just included some classes used for CORBA.

I imagine that few Domino developers use JAX-B or CORBA directly, but our old nemesis Notes.jar sure does! If you're doing any project builds outside of Domino that make use of the Notes.jar API, you likely already have or will soon run into this. For Tycho, I made a patch fragment that provides the required API to the com.ibm.notes.java.api bundle a good while back. For non-Tycho projects, your best bet is generally to include a dependency on the GlassFish-packaged variant and a pre-3.0 version of the Jakarta XML Bind API.
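As a sketch of that latter route, a dependency pair along these lines tends to do the trick - the versions here are just examples of a pre-3.0 line, which keeps the javax.xml.bind namespace:

<dependency>
  <groupId>jakarta.xml.bind</groupId>
  <artifactId>jakarta.xml.bind-api</artifactId>
  <version>2.3.3</version>
</dependency>
<dependency>
  <groupId>org.glassfish.jaxb</groupId>
  <artifactId>jaxb-runtime</artifactId>
  <version>2.3.3</version>
  <scope>runtime</scope>
</dependency>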

There will be some further removals down the line, like RMI Activation, but I don't think any currently on the horizon will be as pertinent as those.

Java With Domino Roundtable Recording

Tue Nov 17 16:37:35 EST 2020

Tags: java

I hosted my "Java With Domino" roundtable earlier today, and I think it went pretty well! We ended up having just about the ideal number of participants, and it was not only great hearing how people feel on the topic, but also seeing and hearing from everyone.

I've put the video up on YouTube:

I'm thinking of doing more of these, and kind of making them a looser, more-casual companion to OpenNTF's webinar series. I don't know whether they'd all be on similar topics or what, but it seems like it'll be worth continuing.

Upcoming Event: Java With Domino Roundtable

Thu Nov 12 15:31:37 EST 2020

Tags: java

The other day, I floated the idea of running an unstructured roundtable discussion of working with Java either on or accessing Domino, and I think it'll be worth giving a shot.

Since Java with Domino is in a weird place, the goal would be to discuss the various ways that people are or want to use it. So that can include XPages, OSGi, REST services generally, Jakarta EE, Spring, Vert.x, and so forth. I'd also like it to be open generally. I imagine I'll have some preliminary remarks, but otherwise the goal is to be less like a webinar and more like a free-flowing discussion, in the vein of the "happy hour" and "coffee break" rooms from CollabSphere and Digital Week.

My current plan is to run it on short notice, next week:

Tuesday, November 17th
2:00 PM US Eastern (19:00 UTC)
https://zoom.us/j/99514285138
Password: Computers!

I'll share the password I come up with on Twitter on the day of the event, so look for it there.

Weekend Domino-Apps-in-Docker Experimentation

Sun Jun 28 18:37:19 EDT 2020

  1. Weekend Domino-Apps-in-Docker Experimentation
  2. Executing a Complicated OSGi-NSF-Surefire-NPM Build With Docker
  3. Getting to Appreciate the Idioms of Docker

For a couple of years now, first IBM and then HCL have worked on and adapted community work to get Domino running in Docker. I've observed this for a while, but haven't had a particular need: while it's nice and all to be able to spin up a Domino server in Docker, it's primarily an "admin" thing. I have my suite of development Domino servers in VMs, and they're chugging along fine.

However, a thought has always gnawed at the back of my mind: a big pitch of Docker is that it makes not just deployment consistent, but also development, taking away a chunk of the hassle of setting up all sorts of associated tools around development. It's never been difficult, per se, to install a Postgres server, but it's all the better to be able to just say that your app expects to have one around and let the tooling handle the specifics for you. Domino isn't quite as Docker-friendly as Postgres or other tools, but the work done to get the official image going with 11.0.1 brought it closer to practicality. This weekend, I figured I'd give it a shot.

The Problem

It's worth taking a moment to explain why it'd be worth bothering with this sort of setup at all. The core trouble is that running an app with a Notes runtime is extremely annoying. You have to make sure that you're pointing at the right libraries, they're all in the right place to be available in their internal dependency tree, you have to set a bunch of environment variables, and you have to make sure that you provide specialized contextual info, like an ID file. You actually have the easiest time on Windows, though it's still a bit of a hurdle. Linux and macOS have their own impediments, though, some of which can be showstoppers for certain tasks. They're impediments worth overcoming to avoid having to use Windows, but they're impediments nonetheless.

The Setup

But back to Docker.

For a little while now, the Eclipse Marketplace has had a prominent spot for Codewind, an IBM-led Eclipse Foundation project to improve the experience of development with Docker containers. The project supplies plugins for Eclipse, IntelliJ, and VS Code / Eclipse Che, but I still spend most of my time in Eclipse, so I went with the former.

To begin with, I started with the default "Open Liberty" project you get when you create a new project with the tooling. As I looked at it, I realized with a bit of relief that there's not too much special about the project itself: it's a normal Maven project with war packaging that brings in some common dependencies. There's no Maven build step that expects Docker at all. The specialized behavior comes (unsurprisingly, if you use Docker already) in the Dockerfile, which goes through the process of building the app, extracting the important build results into a container based on the open-liberty runtime image, bringing in support files from the project, and launching Liberty. Nothing crazy, and the vast majority of the code more shows off MicroProfile features than anything about Docker specifically.

Bringing in Domino

The Docker image that HCL provides is a fully-fledged server, but I don't really care about that: all I really need is the sweet, sweet libnotes.so and associated support libraries. Still, the easiest way to accomplish that is to just copy in the whole /opt/hcl/domino/notes/11000100/linux directory. It's a little wasteful, and I plan to find just what's needed later, but it works to do that.

Once you have that, you need to do the "user side" of it: the ID file and configuration. With a fully-installed Domino server, the data directory balloons in size rapidly, but you don't actually need the vast majority of it if you just want to use the runtime. In fact, all you really need is an ID file, a notes.ini, and a names.nsf - and the latter two can even be massively trimmed down. They do need to be custom for your environment, unfortunately, but at least it's much easier to provide just a few files than spin up and maintain a whole server or run the Notes client locally.

Then, after you've extracted the juicy innards of the Domino image and provided your local resources, you can call NotesInitExtended pointing to your data directory (/local/notesdata in the HCL Docker image convention) and the notes.ini, and voila: you have a running app that can make local and remote Notes native API calls.

Example Project

I uploaded a tiny project to demonstrate this to GitHub: https://github.com/jesse-gallagher/domino-docker-war-example. All it does is provide one JAX-RS resource that emits the server ID, but that shows the Notes API working. In this case, I used the Darwino Domino NAPI (which I really need to refresh from upstream), but Domino JNA would also work. Notes.jar would too, but I think you'll need one of those projects to do the NotesInitExtended call with arguments.

The Dockerfile for the project goes through the steps enumerated above, based on how the original example image does it, and was tweaked to bring in the Domino runtime and support files. I stripped the Liberty-specific stuff out of the pom.xml - I think that the original route the example did of packaging up the whole server and then pulling it apart in Docker image creation has its uses, but isn't needed here.

Much like the pom.xml, the code itself is slim and doesn't explicitly refer to Docker at all. I have a ServletContextListener to init and term the Notes runtime, as well as a Filter implementation to init/term the request thread, but otherwise it just calls the Notes API with no fuss.

Larger Projects

I haven't yet tried this with larger projects, but there's no reason it shouldn't work. The build-deploy-run cycle takes a bit more time with Docker than with just a Liberty server embedded in Eclipse normally, but the consistency may be worth it. I've gotten used to running a killall -KILL java whenever an errant process gloms on to my Notes ID file and causes the server to stop being able to init the runtime, but I'd be glad to be done with that forever. And, for my largest project - the one with the hundreds of XPages and CCs - I don't see why that wouldn't work here too.

Normal Domino Projects

Another route that I've considered for Domino in Docker is to use it to deploy NSFs and OSGi projects. This would involve using the Domino image for its intended purpose of running a full server, but configuring the INI to just serve HTTP, and having the Dockerfile place the built OSGi plugins and NSFs in their right places. This would certainly be much faster than the build-deploy-run cycle of replacing NSF designs and deploying the plugins to an Update Site NSF, though there would be a few hurdles to get over. Not impossible, though.


I figure I'll kick the tires on this some more this week - maybe try deploying the aforementioned giant XPages .war project to it - to see if it will fit into my workflow. There's a chance that the increased deployment times won't be worth it, and I won't really gain the "consistent with production" advantages of Docker when the way I'm developing the app is already a wildly-unsupported configuration. It might be worth it if I try the remote mode of Codewind, though: I have some Liberty servers that Jenkins deploys to, but it'd be even better to be able to show my running app to co-developers to work on something immediately, instead of waiting for the full build. It's worth some investigation, anyway.

Managed Beans to CDI

Fri Jun 19 13:50:44 EDT 2020

  1. Java Services (Not the RESTful Kind)
  2. Java ClassLoaders
  3. Managed Beans to CDI
  4. The Myriad Idioms For Finding Implementations In Java

When I was getting familiar with modern Java server development, one of the biggest conceptual stumbling blocks by far was CDI. Part of the trouble was that I kind of jumped in the deep end, by way of JNoSQL's examples. JNoSQL is a CDI citizen through and through, and so the docs would just toss out things like how you "create a repository" by just making an interface with no implementation.

Moreover, CDI has a bit of the "Maven" problem, where, once you do the work of getting familiar with it, the parts that are completely baffling to newcomers become more and more difficult to remember as being unusual.

Fortunately, like how coming to Maven by way of Tycho OSGi projects is "hard mode", coming to CDI by way of a toolkit that uses auto-created proxy objects is a more difficult path than necessary. Even better, XPages developers have a clean segue into it: managed beans.

JSF Managed Beans

XPages inherited the original JSF concept of managed beans, where you put definitions for your beans in faces-config.xml like so:

<managed-bean>
	<managed-bean-name>someBean</managed-bean-name>
	<managed-bean-class>com.example.SomeBeanClass</managed-bean-class>
	<managed-bean-scope>application</managed-bean-scope>
	<managed-property>
		<property-name>database</property-name>
		<value>#{database}</value>
	</managed-property>
</managed-bean>

Though the syntax isn't Faces-specific, the fact that it is defined in faces-config.xml demonstrates what a JSF-ism it is. Newer versions of JSF (not XPages) let you declare your beans inline in the class, skipping the XML part:

package com.example;
// ...
@ManagedBean(name="someBean")
@ApplicationScoped
public class SomeBeanClass {
	@ManagedProperty(value="#{database}")
	private Database someProp;
}

These annotations were initially within the javax.faces package, highlighting that, while they're a new developer convenience, it's still basically the same JSF-specific thing.

While all this was going on (and before it, really), the Enterprise JavaBeans (EJB) spec was chugging along, serving some similar purposes, but it's really kind of its own all-consuming beast. I won't talk about it much here, in large part because I've never used it, but it has an important part in this history, especially when we get to the "dependency injection" parts.

Move to CDI

Since it turns out that managed beans are a terrifically-useful concept beyond just JSF, Java EE siphoned concepts from JSF and EJB to make the obtusely named Contexts and Dependency Injection spec, or CDI. CDI is paired with some associated specs like Common Annotations and Inject to make a new bean system. With a switch to CDI, the bean above can be tweaked to something like:

package com.example;
// ...
@Named("someBean")
@ApplicationScoped
public class SomeBeanClass {
	@Inject @Named("database")
	private Database someProp;
}

Not wildly different - some same-named annotations in a different package, and some semantic switches, but the same basic idea. The difference here is that this is entirely divorced from JSF, and indeed from web apps in general. CDI specifically has a mode that works outside of a JEE/Servlet container and could work in e.g. a command-line program.

Newer versions of JSF (and other UI engines) deprecated their own version of this to allow for CDI to be the consistent pool of variable resolution and creation for the UI and for the business logic.

The Conceptual Leap

One of the things blocking me from properly grasping CDI at first was that @Inject annotation on a property. If it's just some Java object, how would that property ever be set? Certainly, CDI couldn't be so magical that I could just do new SomeBeanClass() and have someProp populated, right? Well, yes, that's right. No matter how gussied up your class definition is with CDI annotations, constructing an instance with new will pay no attention to any of it.

What got me over the hurdle is realizing that, in a modern web app in particular, almost everything you do runs through CDI. JSP request? That can resolve CDI. JAX-RS resource? That's managed by CDI. Filters? CDI. And, because those objects are all being instantiated by CDI, the CDI runtime can do whatever the heck it wants with them. That's why the managed property in the original example is so critical: it's the same idea, just managed by the JSF runtime instead of CDI.

That's how you can get to a class like the controller that manages the posts in this blog. It's annotated with all sorts of stuff: the JAX-RS @Path, the MVC spec @Controller, the CDI @RequestScoped, and, importantly, the @Inject'ed properties. Because the JAX-RS environment instantiates its resource classes through CDI in a JEE container, those will be populated from various sources. HttpServletRequest comes from the servlet environment itself, CommentRepository comes from JNoSQL as based on an interface in my non-JEE project (more on that in a bit), and UserInfoBean is a by-the-numbers managed bean in the CDI style.

There's certainly more indirect "magic" going on here than in the faces-config.xml starting point, but it's a clear line from there to here.

The Weird Stuff

CDI covers more ground, though, and this is the sort of thing that tripped me up when I saw the JNoSQL examples. Among CDI's toolset is the creation of "proxy" objects, which are dynamic objects that intercept normal method calls with new behavior. This is a language-level Java feature that I didn't even know was a thing, but it's been there since 1.3.

Dynamic scripting languages do this sort of thing as their bread and butter. In Ruby, you can define method_missing to be called when code calls a method that wasn't already defined, and that can respond however you'd like. Years ago, I used this to let you do doc.foo to get a document item value, for example. In Java, you get a mildly-less-loosey-goosey version of this kind of behavior with a proxy's InvocationHandler.

CDI does this extensively, even when you might think it's not. With CDI, all instances are dynamic proxy objects, which allows it to not only inject field values, but also add wrapper code around method calls. This allows tools like MicroProfile Metrics to do things like count invocations, measure timings, and so forth without requiring explicit code beyond the annotations.

And then there are the whole-cloth new objects, like the JNoSQL repositories. To take one of the examples from jnosql.org, here's a full definition of a JNoSQL repository as far as the app developer is concerned:

public interface PersonRepository extends Repository<Person, Long> {

  List<Person> findByName(String name);

  Stream<Person> findByPhones(String phone);
}

Without knowledge of CDI, this is absolute madness. How could it possibly work? There's no code! The trick to it is that CDI ends up creating a dynamic proxy implementation of the interface, which is in turn backed by an InvocationHandler instance. That instance receives the incoming method call as a string and array of parameters, parses the method to look for a concept it handles, and either generates a result or throws an exception. Once you see the capabilities the stack has, the process to get from a JAX-RS class using @Inject PersonRepository foo to having that actually work makes more sense:

  • The JAX-RS servlet receives a request for the resource
  • It asks the CDI environment to create a new instance of the resource class
  • CDI runs through the fields and methods of the class to look for annotations it can handle, where it finds @Inject
  • It looks through its contributed extensions and finds JNoSQL's ServiceLoader-provided extension
  • One of the beans from that extension can handle creating Repository instances
  • That bean creates a proxy object, which handles method calls via invoke

Still pretty weird, but at least there's a path to understanding.
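To tie those steps together, here's a hypothetical javax-era JAX-RS resource using the PersonRepository from the example above. The repository never exists as written code, but the injected CDI proxy handles the findByName call:

import java.util.List;

import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;

@Path("people")
@RequestScoped
public class PersonResource {
    @Inject
    private PersonRepository personRepository; // backed by a CDI-generated proxy at runtime

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<Person> findByName(@QueryParam("name") String name) {
        return personRepository.findByName(name);
    }
}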

The Overall Importance

The more I use modern JEE, the more I see CDI as the backbone of the whole development experience. It's even to the point where it feels unsafe to not have it present, managing objects, like everything is held together by shoestring. And its importance is further driven home by just how many specs depend on it. In addition to many existing technologies either switching to or otherwise supporting it, like JSF above, pretty much any new Jakarta EE or MicroProfile technology at least has it as the primary mechanism of interaction. Its importance can't be overstated, and it's worth taking some time either building an app with it or at least seeing some tutorials of it in action.

The RuntimeEnvironment Idiom

Thu Jun 18 09:16:48 EDT 2020

Tags: java xpages
  1. XPages: The UI Toolkit and the App Framework
  2. The RuntimeEnvironment Idiom
  3. NSF ODP Tooling 3.1.0: Dynamically Including Web Resources

One of the specific problems that we encountered with my aforementioned client app first when expanding it to include REST services and then later to be portable outside an NSF entirely is dealing with varying mechanisms for interacting with the surrounding environment.

The Problem to Solve

The immediate way this distinction comes up when adding JAX-RS services or other OSGi servlets is trying to get a handle on the current Domino user session or context database. In an XPages app (including in code called in a plugin-based library), you can just do:

Session s = ExtLibUtil.getCurrentSession();

However, this will return null if called while processing an OSGi servlet. Instead, servlet code should call:

Session s = ContextInfo.getUserSession();

Same idea - they both return a session based on the current authenticated user from the HTTP stack - but they have different backing implementations. So my first pass was to coordinate these inside an AppUtil class in a method like this:

public static Session getSession() {
	if(FacesContext.getCurrentInstance() != null) {
		return ExtLibUtil.getCurrentSession();
	} else {
		return ContextInfo.getUserSession();
	}
}

This worked pretty well, until I added Tycho-based compile-time unit tests, which is an OSGi environment where neither of those paths would return a session. So I had to add a fallback that would just eventually spawn a new NotesFactory.createSession() if it couldn't find another one.
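Roughly, the expanded version of that method ended up along these lines - a sketch from memory rather than the verbatim code, assuming the usual ExtLibUtil, ContextInfo, and lotus.domino imports:

public static Session getSession() {
	if(FacesContext.getCurrentInstance() != null) {
		// XPages request: resolve the session via the Faces runtime
		return ExtLibUtil.getCurrentSession();
	}
	Session session = ContextInfo.getUserSession();
	if(session != null) {
		// OSGi servlet request: session for the authenticated HTTP user
		return session;
	}
	try {
		// No HTTP context at all (e.g. Tycho compile-time tests): spawn a native session
		return NotesFactory.createSession();
	} catch(NotesException e) {
		throw new RuntimeException(e);
	}
}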

It's one thing for a getSession() method to balloon in logic, but Notes runtime access isn't the only problem like this. Take the case of validating model objects as part of the "save" process. In an XPages environment, validation errors should be reported as FacesMessages on the view root or, ideally, attached directly to the form control that represents the invalid field. In a REST service, though, the ConstraintViolationException should bubble right up to the top and be returned as an appropriately-formatted JSON object with a corresponding HTTP status code. Originally, we handled this similarly: we moved the FacesMessage stuff out of the model objects and into the AppUtil class and handled it with an if tree.

The RuntimeEnvironment Class

Eventually, though, there was enough customizable behavior that these branching methods in one class got out of hand, and that's even before getting into cases where a class (like FacesContext) may not even be available at runtime at all. So I implemented a RuntimeEnvironment class as a service. It started out like this:

public interface RuntimeEnvironment {
	static final List<RuntimeEnvironment> knownEnvironments = AppUtil.findExtensions(RuntimeEnvironment.class).stream()
			.sorted((a, b) -> Integer.compare(b.getWeight(), a.getWeight()))
			.collect(Collectors.toList());

	public static RuntimeEnvironment current() {
		return knownEnvironments.stream()
			.filter(RuntimeEnvironment::isCurrent)
			.findFirst()
			.orElseGet(UnknownEnvironment::new);
	}

	boolean isCurrent();
	int getWeight();
}

The AppUtil.findExtensions method is a simplified wrapper around the IBM Commons ExtensionManager call to find services in a type-safe way.

This allows me to define a series of RuntimeEnvironment implementations that may or may not be included in a given packaging of the app, while the isCurrent() and getWeight() methods allow me to distinguish between multiple valid environments to find the most specific. To get an idea of what I mean, here is the current suite of environment implementations:

RuntimeEnvironment type hierarchy

These run a wide gamut. XPagesEnvironment and OSGiServletEnvironment are the big ones that kicked it off, but TychoEnvironment is there to handle compile-time tests, while NotesEnvironment lets the same code work in some utilities we launch from within the Notes client - and SWTRuntimeEnvironment allows those same tools to run outside of OSGi.

Once I broke ground on these classes, the number of situations where they're useful began to become obvious. Take, for example, resolving a variable. The on-Domino XPages implementation looks like what you'd expect:

@Override
public <T> @Nullable T resolveVariable(final String varName) {
	FacesContext context = FacesContext.getCurrentInstance();
	return (T) context.getApplication().getVariableResolver().resolveVariable(context, varName);
}

In the OSGiServletEnvironment and JakartaRuntimeEnvironment cases, though, I can use CDI instead:

@Override
public <T> @Nullable T resolveVariable(final String varName) {
	Instance<Object> instance = CDI.current().select(NamedLiteral.of(varName));
	return instance.isResolvable() ? (T)instance.get() : null;
}

It gets down to little things, too, like how the POST destination for form-based login can be /names.nsf?Login on Domino but /j_security_check on other webapp servers.

Seeing It Elsewhere

This sort of idiom is by no means anything I came up with. You can see it pretty frequently - in fact, I highlighted the way the IBM Commons stack does a very similar thing when running XPages outside Domino:

IBM Commons Platform hierarchy

This serves essentially the same purpose, being filled with mechanisms for getting output streams, finding resource locations, and retrieving named objects.

Regular Use

Should you implement something like this for most apps? Probably not, no - for most even-moderately-complex XPages applications, having some if tests in a central util to distinguish between XPages and OSGi servlets should be enough. I think it's a useful instructional example, though, and it sure was critical in getting this massive thing working outside Domino. As we make our apps more portable, this is the sort of technique we should keep in mind.

XPages: The UI Toolkit and the App Framework

Wed Jun 17 21:19:40 EDT 2020

Tags: java xpages
  1. XPages: The UI Toolkit and the App Framework
  2. The RuntimeEnvironment Idiom
  3. NSF ODP Tooling 3.1.0: Dynamically Including Web Resources

Lately, one of my client projects has been picking up the pace on the years-long effort of taking a giant XPages app, making the business logic portable, and incrementally cutting down on the "XPage-iness" of it all. I expect that this will be a recurring source of blog posts, and this one is about distinguishing between "XPages the UI toolkit" and "XPages the web app framework".

XPages in an NSF

Coming from a Domino perspective, there is no distinction between the two, and that's largely because of the path we took to get here. Other than existing inside the same NSF, the distinction between "classic" Notes apps (web or client) and XPages couldn't be more stark. Legacy design elements are only shared in ways that are completely divorced from their original UI presentation (for the better), and the runtimes - as much as legacy elements can be said to have a "runtime" - are entirely distinct.

An XPages app can be thought of as conceptually a "normal" Java WAR-based webapp housed inside an NSF, and it has a lot of the trappings: classes in WEB-INF/classes, libraries in WEB-INF/lib, and an OSGi-style WebContent folder for miscellaneous files. It's not technically a normal webapp - there's no "web.xml" and the XPages outer "LCD" runtime is actually more like one giant webapp that acts like many - but it's close.

Critically, though, the Domino HTTP router only routes requests for ".xsp" files or "/xsp/" folders to your app's XPages environment, and this is the biggest technical and conceptual impediment. You can't (within an NSF) intercept just any incoming request and process it as you would in a normal webapp. You can kind of shim your way into it with servletFactory, but it's a fiddly process and limited to "/xsp/..." URLs.

Additionally, a running XPages app only exists in a very constrained way between requests. While the JSF-level "Application" and the Servlet-level session exist, you don't work with a per-app ServletContext the way you do in a Servlet webapp. You can hook in with ApplicationListeners and similar constructs, but they're still based on the lifecycle of the XPages app, which comes into existence only on the first request and dies (usually) half an hour after the last.

These, plus the specifics of Domino data access, combine to make the "XAgent" - an abomination of a concept - the catchall replacement for specialized rendering, batch processing, and even scheduled tasks.

XPages the View Engine

These are all accidents of history, though. They stem from the firm requirement that existing Domino HTTP behavior remain intact even with its brain transplant, as well as the "soft" requirement that XPages in an NSF pretend to be "forms with repeats and partial refresh".

At its core, XPages is "just" a web view engine: its only job is to accept a request from an HTTP client and return some HTML. The concepts it uses to accomplish this - components, renderers, managed beans, themes - are all incidental to the main task. This is the "V" part of MVC. Admittedly, even without the NSF compromises, XPages bleeds beyond its assigned third of the triad, and it inherited this from JSF. JSF is also billed as MVC, but it completely subsumes the "Controller" part and partially eats the "Model" part with its bean management.

Still, though, even a domineering framework like JSF slots in as just one component of a normal webapp, rather than being the whole thing as XPages is in an NSF. For example, take the app behind this blog, which partially looks like this:

Java and JSP resources in the blog

It uses JSP as its view template engine, but is it a "JSP app"? Not really. The fact that it uses MVC 1.0 is more important to understanding it, but that's really an extension to JAX-RS. You could make a strong case that it's a "JAX-RS app", especially when you expand the "Services" section in Eclipse:

REST services in the blog

That covers more of it, but still leaves parts out. It has application-wide beans by way of CDI, entirely-UI-free scheduled tasks kicked off from a ServletContextListener, and core business logic and model objects that are kept in a module that doesn't even know about the Servlet API.

It's layered, but the layers are explicable and the distinctions create a tremendous amount of flexibility. I could, if I wanted, change to ThymeLeaf for the front end with essentially no friction, JSF or Vaadin with only mildly more, or to a client JS REST UI by chopping off the top two layers outright.

Okay, So?

This description isn't a call to action - there's nothing inherently wrong about an XPages app in an NSF, especially a small-to-medium one - but this will be an important part of the conceptual groundwork in the months to come. To figure out what to do with all these piles of XSP markup and framework-specific business logic we have, we'll have to do a lot of deconstruction.

Java ClassLoaders

Fri Jun 05 10:47:55 EDT 2020

Tags: java osgi xpages
  1. Java Services (Not the RESTful Kind)
  2. Java ClassLoaders
  3. Managed Beans to CDI
  4. The Myriad Idioms For Finding Implementations In Java

In my last post, I casually mentioned the concept of ClassLoaders a couple times, and I think that they deserve their own post. ClassLoaders are exactly the kind of thing where, once you do Java long enough, you start to take for granted, but which aren't necessarily immediately obvious for people not as immersed.

The Basics

The core job of a ClassLoader is what it says on the tin: it loads classes. Say you have this bit of code using a class from the core Java library:

long now = System.currentTimeMillis();

This uses two types: long, which is a built-in primitive type and not a class at all, and java.lang.System. long doesn't have to come from anywhere, but java.lang.System does, and that's the job of a ClassLoader. In this case, the Java VM will ask the contextual ClassLoader for a class by that name, and the ClassLoader will (at least in Java 8 - things got weird later) look into the core library and find a file named "java/lang/System.class" within "rt.jar", parse its binary contents into an executable class, and hand it back to the VM.

ClassLoaders are also the source of two problem reports you've likely seen: ClassNotFoundException and NoClassDefFoundError. These two basically mean the same thing: the running app tried to load a class by name, but it wasn't found - they just differ in context (the former generally when a class is asked for dynamically, the latter when it's referenced as part of compiled code). This sort of thing can occur when you write code using a class that's present in your development environment but is not present when run later - among XPages developers, this happens quite a bit when people drop some JARs into jvm/lib/ext in their Designer installation but don't do the same on Domino.

Resource Loading

In addition to finding classes, ClassLoaders have a few other tasks, the main one of which of interest to us is loading resources. In my previous post, I talked about how ServiceLoader looks for service files by a given name, like META-INF/services/com.sprockets.data.FizzBuzzConverter. It does this by checking with the current ClassLoader and calling cl.getResources("META-INF/services/com.sprockets.data.FizzBuzzConverter"), which will return a listing of resources from JARs (and JAR-like sources, like an NSF) that it knows about matching that name. In that way, multiple JARs can declare services with the same name without conflicting.
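As a quick illustration, you can enumerate those copies yourself - this uses the hypothetical service name from before:

import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

public class ServiceFileLister {
    public static void main(String[] args) throws IOException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Enumeration<URL> serviceFiles = cl.getResources("META-INF/services/com.sprockets.data.FizzBuzzConverter");
        while (serviceFiles.hasMoreElements()) {
            // Each URL points into a different JAR (or JAR-like source) that declares the service
            System.out.println(serviceFiles.nextElement());
        }
    }
}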

ClassLoader Trees

Though conceptually your running program has "a ClassLoader", in reality it's almost definitely a chained series of ClassLoaders, rooted in the core system ClassLoader and then drilling down more specifically to your app's code. For example, take an application running in Apache Tomcat. In that case, Tomcat's documentation describes four basic tiers:

  • The core JVM ("bootstrap") ClassLoader that comes with any running Java program. As Tomcat's docs note, this implementation may vary
  • The central ("system") ClassLoader that contains the "just above the metal" classes, such as those you may add in the "CLASSPATH" environment variable
  • The Tomcat-specific ("common") ClassLoader, containing classes shared among all running applications. For example, javax.servlet.Servlet would be found here
  • Your app's ClassLoader, containing classes you write as well as any third-party JARs you bundled into your WAR file in WEB-INF/lib

When your code executes and requests a new class, the runtime will check first with your app's local ClassLoader and return what it finds there if present - if the class isn't present there, then that ClassLoader will delegate up to its parent, and so forth until it either finds a class or hits the root and throws a NoClassDefFoundError.

The way that each app has its own ClassLoader is also how you can have multiple apps on the same server that each know about common core classes, but don't step on each other's toes with their own custom classes. Though javax.servlet.Servlet is the same class for two running apps, one app could have an internal class named "com.example.SomeBusinessLogic" and it wouldn't be visible to other running apps.

Dynamic ClassLoaders

Though the normal case of ClassLoaders is that sort of "do I have this class? If not, ask my parent" chain, the fact that a ClassLoader is itself a custom Java class means that its behavior can be pretty arbitrary. This is present in a normal web app ClassLoader: it knows to look in the WEB-INF/classes path within the WAR file instead of the normal behavior of checking from the root of a JAR, and it knows how to look in WEB-INF/lib for additional JARs to search.
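
As a toy illustration of how arbitrary that behavior can be, here's a minimal sketch of a ClassLoader that finds classes in a WEB-INF/classes directory on disk (assuming an expanded WAR; real web app ClassLoaders do considerably more):

public class ToyWebAppClassLoader extends ClassLoader {
	private final java.nio.file.Path webInfClasses;

	public ToyWebAppClassLoader(java.nio.file.Path warRoot, ClassLoader parent) {
		super(parent);
		this.webInfClasses = warRoot.resolve("WEB-INF").resolve("classes");
	}

	@Override
	protected Class<?> findClass(String name) throws ClassNotFoundException {
		// Called only after the parent chain comes up empty
		// e.g. "com.example.Foo" -> "WEB-INF/classes/com/example/Foo.class"
		java.nio.file.Path classFile = webInfClasses.resolve(name.replace('.', '/') + ".class");
		try {
			byte[] bytecode = java.nio.file.Files.readAllBytes(classFile);
			return defineClass(name, bytecode, 0, bytecode.length);
		} catch(java.io.IOException e) {
			throw new ClassNotFoundException(name, e);
		}
	}
}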

In an XPages application, the active ClassLoader is roughly similar to Tomcat's app ClassLoader example, but with a couple additional capabilities. The main one is that the NSF's ClassLoader - an instance of com.ibm.domino.xsp.module.nsf.ModuleClassLoader - has knowledge of how to treat an NSF as if it were a WAR file. In Designer's "Package Explorer" pane, you get a view of the NSF that makes it look basically like a normal WAR, where classes go in WEB-INF/classes and JARs go in WEB-INF/lib. However, it's still really a nebulous pool of notes floating around, and so the ModuleClassLoader does design-collection lookups for file resources of various types and loads the class bytecode or resource data from there.

It also, in a move presumably designed to inconvenience me personally, has explicit restrictions on what classes it can load: even though it knows about, for example, org.eclipse or com.ibm.domino.napi classes, it has a check to explicitly bar loading these. That's why, even if you configure Designer to see those classes and compile XPages code that references them, they won't be available at runtime.

OSGi ClassLoaders

OSGi ClassLoaders are a particular kind of dynamic ClassLoader. In addition to the normal hierarchical view of the world, they take on special responsibilities for ensuring that your OSGi module (which an XPages app kind of is) sees classes from other bundles based on its dependency rules, but not necessarily their resources. For example, take rules like this in an OSGi bundle's META-INF/MANIFEST.MF:

Require-Bundle: com.ibm.xsp.core
Import-Package: com.ibm.commons.util

These simple lines hide some beguiling complexity. With this definition, a running class in your bundle will be able to see:

  • All classes at the system level, such as java.lang.System
  • All classes contained within and exported by "com.ibm.xsp.core", such as com.ibm.xsp.FacesExceptionEx and com.ibm.xsp.url.UrlHandler
    • There's also special behavior going on here, because those classes are contained within an embedded JAR in the bundle, referenced as Bundle-ClassPath: lwpd.xsp.core.jar - this is an OSGi-ism
    • Though this bundle lists all of its packages in its Export-Package header, this is not a requirement: it's common for an OSGi bundle to have classes internally that are not accessible from outside
  • All classes exported by the dependencies that "com.ibm.xsp.core" itself marks as visibility:=reexport: "com.ibm.pvc.servlet" and "com.ibm.designer.lib.jsf"
    • This is why you can have a dependency on just "com.ibm.xsp.core" and access javax.faces.context.FacesContext even though it's not in the core XSP bundle
    • This is also transitive, though neither of those re-exported dependencies themselves re-export any dependencies
  • The classes from the "com.ibm.commons" bundle in the "com.ibm.commons.util" package. This means that com.ibm.commons.util.StringUtil is visible, but com.ibm.commons.extension.ExtensionManager is not, despite both being within the same bundle JAR

There are also tons of weird visibility and dependency details as well in OSGi, but that's the gist of it. Note that I specifically mentioned that the resources aren't visible. Though the Require-Bundle: com.ibm.xsp.core line makes all classes exported from the XSP core visible to your code, calling ServiceLoader.load(com.ibm.xsp.acf.HtmlFilteringFactory.class) will not find the DefaultHtmlFilteringFactory implementation declared in there, even though it's done in a ServiceLoader-compatible way. This is why IBM Commons papers over that difference with its "plugin.xml" extension declarations. OSGi actually contains a Service Loader Mediator specification to bridge this gap, but Domino doesn't include an implementation of that part.

Fragment Bundles

There's one special case with OSGi bundles that's worth highlighting: fragments. Normally, each bundle effectively has its own ClassLoader space, walled off from all others by OSGi's broker. However, if you declare your bundle as having a Fragment-Host of another active bundle, your code acts as if it's within the parent, gaining access to not just all of the parent bundle's classes, but also its non-class resources. Moreover, this works in reverse: the parent also gains access to the fragment's classes and resources, though it generally won't "know" about them at the time of development.

This is a technique that's come in handy for me many times, in particular in cases like the XPages Jakarta EE Support project, where API bundles will use ServiceLoader to find their implementations. In those cases, one of the ways I get it to work in OSGi is to create a fragment bundle out of the implementation, meaning that the bundles remain distinct but now the API can find the META-INF/services files and classes it needs to operate.

This has a good number of other uses, too, such as providing platform-specific native code to an otherwise-platform-independent core bundle. The Notes.jar wrapper used in XPages land uses this type of technique. Though, to my knowledge, Notes.jar doesn't contain any actual native code, it's still delivered in two pieces:

  1. The "com.ibm.notes.java.api" bundle, which lists all of the exported packages but holds no code itself
  2. The "com.ibm.notes.java.api.win32.linux" bundle, which contains the actual Notes.jar and declares Fragment-Host: com.ibm.notes.java.api
    • I'm not sure why this is the case, but maybe Notes.jar is different on System i or something

If you have a bundle that needs access to lotus.domino classes, you then can either do Require-Bundle: com.ibm.notes.java.api or Import-Package: lotus.domino and it'll be resolved out of the fragment. There's also an Eclipse-ism in here: the first bundle has Eclipse-ExtensibleAPI: true, which is a tip-off to the IDE that it should specifically allow fragments to contribute available classes to the development environment. This is generally required when developing with Eclipse's plug-in tooling (shared with Designer), but it's not actually enforced one way or the other by the runtime.

Wrapping It Up

This is all definitely in the category of "you don't normally need to worry about it, but it's very helpful to know", like the previous ServiceLoader topic. Until you're implementing some low-level stuff, you're not likely to interact with the ClassLoader directly, especially beyond Thread.currentThread().getContextClassLoader() or Foo.class.getClassLoader(). Knowing about it can help make clear what's going on in situations where a class shows up in development but not at runtime, or when the XPages ClassLoader tries to get too fancy and throws up on itself.

Java Services (Not the RESTful Kind)

Thu Jun 04 16:42:28 EDT 2020

Tags: java
  1. Java Services (Not the RESTful Kind)
  2. Java ClassLoaders
  3. Managed Beans to CDI
  4. The Myriad Idioms For Finding Implementations In Java

The concept of "services" in Java is fairly critical, but, especially with the XPages stack we've grown used to, the term covers quite a few different technologies.

Definition

Before I continue on, I want to make clear what I mean by "service" in this context. It's unrelated to REST services or even remote access of any kind; instead, it's about how an app can find implementations of some kind of class or interface within its runtime.

A very-common type of this sort of thing is a data adapter or converter. Say you have your own object FizzBuzz that you use within your app, one that represents data storable in multiple ways. One way to handle converting from various types to FizzBuzz would be a giant if tree, like:

public FizzBuzz convert(Object input) {
	if(input instanceof String) {
		// ...
	} else if(input instanceof JsonObject) {
		// ...
	} else if(input instanceof org.w3c.dom.Document) {
		// ...
	} else {
		throw new IllegalArgumentException("Cannot convert to FizzBuzz: " + input);
	}
}

That'd work well enough, especially for a small app. You can imagine, though, how this might get out of hand in an even moderately-complicated case, with the if tree turning into a tangled mess. Moreover, this doesn't allow for any extensibility without directly modifying the convert method - any new type will have to go into this, making management of a large team more cumbersome and completely cutting off the possibility of third-party additions.

So, to keep things scalable, it'd make sense to create an interface that would specify a generic way to convert some type of object to a FizzBuzz:

package com.sprockets.data;

public interface FizzBuzzConverter {
	boolean canConvert(Object o);
	FizzBuzz convert(Object o);
}

Then the code that actually needs to convert would look more like this:

public FizzBuzz convert(Object input) {
	Stream<FizzBuzzConverter> converters = moreOnHowToFetchLater();
	return converters
		.filter(converter -> converter.canConvert(input))
		.findFirst()
		.map(converter -> converter.convert(input))
		.orElseThrow(() -> new IllegalArgumentException("Cannot convert to FizzBuzz: " + input));
}

In a small case like this, that's not necessarily going to be a big deal, but it doesn't take too long for it to become desirable to break it apart. Take the case of JAX-RS providers, which do exactly this kind of entity conversion when processing HTTP requests. Everything over HTTP comes in as plain text (more or less), but programmers want to be able to accept an int input parameter, or to automatically convert their custom business-logic object to JSON. Without a separation like this, the code to handle all known types would be impossible to manage all in one place, and there'd be no way to handle custom types that didn't exist when the code was written.

Types

There are quite a few distinct types of services that I've run across, and I'll list them here in rough order of how likely a programmer coming from an XPages background is to encounter them.

ServiceLoader Services

This is the most-common kind of service you're likely to encounter in a Java application, and you can generally identify it by its use of the META-INF/services directory inside a JAR. java.util.ServiceLoader itself was added in Java 1.6, but it was designed to codify habits that had become common beforehand.

The way this works is designed to be simple: you create a plain-text file within META-INF/services named after the service class you're implementing, and then put the names of your implementing classes within it, one on each line. So, in our above example, you'd create a file named META-INF/services/com.sprockets.data.FizzBuzzConverter and fill it with something like:

com.sprockets.data.impl.StringFizzBuzzConverter
com.sprockets.data.impl.JsonObjectFizzBuzzConverter
com.sprockets.data.impl.DomDocumentFizzBuzzConverter

Code that calls ServiceLoader.load(FizzBuzzConverter.class) will find all of those files within the current ClassLoader space (more fun with that down the line) and instantiate the named classes, returning an Iterator to loop through them.
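
For example, the moreOnHowToFetchLater() placeholder from the earlier snippet could plausibly be backed by something like this (a sketch; on Java 8 there's no ServiceLoader.stream(), so spliterator() does the job):

ServiceLoader<FizzBuzzConverter> loader = ServiceLoader.load(FizzBuzzConverter.class);
Stream<FizzBuzzConverter> converters = StreamSupport.stream(loader.spliterator(), false);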

IBM Commons Services

Within the XPages stack, the bulk of service interactions are managed by the IBM Commons ExtensionManager class, which is a generic way to ask for a service type by String name.

In the normal case, this acts as a slightly-old-timey variant of the now-standard ServiceLoader mechanism, likely by dint of preceding the standard's introduction. Like ServiceLoader, it looks for files with the name you pass it in the META-INF/services directory in your app and adds instances of all the names it finds within.

What makes it important (and what gives it longevity in non-XPages OSGi apps on Domino) is that it also bridges into the Equinox OSGi service infrastructure when available and looks for services registered there by the com.ibm.commons.Extension name. The reason this is important is that, in an OSGi context, one bundle can't by default see the files in another bundle in its ClassLoader, which means that services registered via META-INF/services in one won't be picked up by a ServiceLoader call in another.

Since XPages's life spanned a pre-OSGi era and the 8.5.2 "Extensibility API" era, it bears the signifiers of both, smoothly papered over by IBM Commons:

com.ibm.xsp.core bundle services

The Equinox loader looks in both places, in fact, which is why you can declare XPages services within an application using META-INF/services as well as within an OSGi bundle's plugin.xml file.

Equinox plugin.xml Extensions

I mentioned above that IBM Commons bridges the difference between ServiceLoader and Equinox, but now I'd better go into a little more detail about the latter.

"Equinox" refers to the particular OSGi implementation that underlies both Eclipse-the-IDE (and thus Notes) and Domino's web stack. While Equinox is the fully-fledged reference implementation of OSGi, plugin.xml is specific to it and I believe pre-dates Eclipse's migration to OSGi (which we still see reflected in 9.0.1FP10+'s plugin trouble).

plugin.xml used to house a lot of information that was moved over to META-INF/MANIFEST.MF, but its primary remaining function is to declare services for the Equinox environment. Eclipse itself uses this extensively, and it remains the primary way to extend the IDE's capabilities.

One important thing to note here is that plugin.xml's extensions aren't limited to just providing a service class implementation. While many do that, it's also used heavily to provide configuration information without executable classes at all.

Multi-type "FactoryFinder" style

This type of service locator is similar to the IBM Commons ExtensionManager, but is usually confined to an individual domain, like a specific Jakarta EE spec. The way this idiom works is that there's a central coordinating class, usually named FactoryFinder, whose job it is to locate implementations of services from one or more sources, and often using a known fallback implementation.

I encountered one of these when diving deep into the XPages stack. javax.faces.FactoryFinder is responsible for finding implementations of very-low-level entities, like the services that spit out JSF applications at the start of initialization, or those that create FacesContext objects.

These will often have specialized behavior. For example, the standard SOAP API looks through a system property, then an external "jaxm.properties" file, then ServiceLoader, then an older META-INF/services name, then OSGi, and finally falls back to a default class name.
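
The exact order varies by spec, but the shape of the idiom looks roughly like this - a hypothetical sketch with made-up helper methods, not the real SOAP code:

public static Object find(String factoryId, String fallbackClassName) throws Exception {
	String className = System.getProperty(factoryId);              // 1. system property
	if(className == null) {
		className = lookUpInPropertiesFile(factoryId);             // 2. external properties file
	}
	if(className == null) {
		className = lookUpInServicesFile(factoryId);               // 3. META-INF/services entry
	}
	if(className == null) {
		className = fallbackClassName;                             // 4. known default implementation
	}
	return Class.forName(className).getDeclaredConstructor().newInstance();
}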

Java 9 Modules

I have to admit that I haven't actually used this, but it's too important to skip. Java 9 and above include a module system that is sort of like an OSGi bundle in that it lets you declare what your module exports and other characteristics about its interactions with the outside world.

Along with this support came a new way to declare services. Since this is also baked in to Java itself, it gets the advantage of also working with ServiceLoader. In this case, instead of writing a text file in META-INF/services, you declare the type of service you're providing and the class implementing it in the module definition. This not only unifies the service with other module information, but it also makes it more type-safe and programmatically clear. It's neat-looking.
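
For example, a module providing one of the FizzBuzzConverter implementations from earlier might declare something like this in its module-info.java (a sketch with assumed module names; as mentioned, I haven't run this through a real build):

module com.sprockets.data.impl {
	requires com.sprockets.data;

	provides com.sprockets.data.FizzBuzzConverter
		with com.sprockets.data.impl.StringFizzBuzzConverter;
}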

OSGi

I already mentioned that Equinox can use the "plugin.xml" file to do cross-bundle services in OSGi, but I also mentioned that it's specific to that one implementation and not actually part of the OSGi spec.

Instead, OSGi has a couple (for some reason) standard mechanisms for providing and consuming services. I encountered these mechanisms in practice when I created a UserRegistry implementation for Open Liberty.

In my first version, I declared my services programmatically in the bundle's activator (which is a class that you can write to run when your bundle is loaded/unloaded). In that way, you can dynamically tell the runtime that your bundle provides any number of services.
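
In rough terms, that programmatic style looks like this - a sketch reusing the FizzBuzzConverter example rather than the actual UserRegistry code:

public class Activator implements org.osgi.framework.BundleActivator {
	private org.osgi.framework.ServiceRegistration<FizzBuzzConverter> registration;

	@Override
	public void start(org.osgi.framework.BundleContext context) throws Exception {
		// Tell the OSGi runtime that this bundle provides a FizzBuzzConverter implementation
		registration = context.registerService(FizzBuzzConverter.class, new StringFizzBuzzConverter(), null);
	}

	@Override
	public void stop(org.osgi.framework.BundleContext context) throws Exception {
		registration.unregister();
	}
}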

In my second revision, I changed to using what's dubbed Declarative Services. These do basically the same thing, but are defined for the runtime in a combination of the META-INF/MANIFEST.MF file and some service-definition files in the bundle - essentially, like a re-thought version of plugin.xml.

Summary

Okay! So, what's the upshot? Well, in my work, I use the first two all the time: inside a non-Domino app, META-INF/services is king; when working with Domino, IBM Commons ExtensionManager handles everything I need.

As far as implementing your own services, it's definitely a critical concept to keep in your pocket (I can only assume it's somewhere in Design Patterns). You could certainly go crazy with it and make a real mess of incomprehensible indirection, but it's probably useful more often than you'd think at first. Give it a shot next time you find yourself writing a big if tree with complicated branches.

My Active Open-Source Projects

Fri May 08 11:01:42 EDT 2020

Over the years, I've spawned a number of open source projects, both in my personal GitHub account and in OpenNTF's, but it'd be fair to say that not all of them are actively updated or see common use.

Nowadays, I have a set of tools that I actively develop (either solo or with a team) and which make up critical parts of my development infrastructure, and I figured it'd be useful to give an overview of them.

NSF ODP Tooling

This is my current favorite project by virtue of how much time it saves me every day and for its future potential. I wrote a series on this project a while ago, so I won't go over all the details of it here. The gist of it, though, is that this project lets me have a Maven tree for one of my big client projects that includes an array of OSGi bundles and have a Maven install build all of those, assemble an update site with them and a bevy of dependencies, compile over a dozen NSFs (most with complicated Java code), and end up with a distribution ZIP containing importable update sites and deployable NTFs, all from my Mac with no Designer involved.

I have visions of this project forming the central infrastructure for a post-Designer world, and that's shaping up in a couple ways so far. One of those ways is the DXL and XPages LSP contributor component that allows for pretty-solid editing of, uh, DXL and XPages in tools that use the XML Language Server, such as Eclipse and Visual Studio Code. And that plays in to the other project I use daily, the XPages JEE Runtime.

XPages JEE Runtime

This is the project that started as a frenzied descent into madness and which I eventually hammered into shape enough to run real apps (with a side path where I also got XPages running on Android and iOS).

Now, this is the main way I do development on that client app. I have an Open Liberty server set up in Eclipse and a webapp variant of the XPages app that points to the same XPages, Custom Controls, and Java code from the NSF's ODP representation, and I have some hooks to direct all database references to the DB running in my dev VM. Since it's not a 100% perfect representation of the Domino environment, I still need to periodically sync it back to the NSF and test how it runs in there (and with the OSGi environment that I'm not using in the webapp), but I'm experienced enough at this point to generally know the potential pitfalls.

There's also a dark part of me that keeps being tempted to actually use this for production at some point, since it works so well now, and pushes aside so many hassles of loading and deploying on Domino itself. That would play in to the next project, the one that's hosting this very blog right now.

Domino Open Liberty Runtime

This is my project where I set up a sidecar Open Liberty instance alongside Domino, which allows for using native local NSF access while also having a full, modern Jakarta EE server with all the bells and whistles.

Though this project is a bit more staid than some of the others, I've gone in and made some interesting improvements lately. One was my journey into RunJava the other month, which I still think is a little too cute to put into production, but which actually should do the job just fine.

The other improvement, though, has some more immediate benefits. I added the ability to specify and auto-download AdoptOpenJDK Java runtimes to use instead of Domino's provided JVM. These runtimes still gain the same benefit of running with local Domino NSF access, but aren't constrained by Domino's once-again-long-in-the-tooth JVM. So you can, for example, specify that you'd rather bring in Java 14 and the runtime will auto-download it for you and launch Liberty using that. I haven't quite rolled that one out to this blog server yet, but it's on the docket. I'd love to bring in Java records, for example, and now there's nothing stopping me from doing so.

XPages Jakarta EE Support

I didn't have a good segue for this one.

This is a project I started a couple years ago initially as a way to expand on Martin Pradny's original plugin to make writing JAX-RS resources inside an NSF easy. It's grown into my project to essentially try to bring the XPages runtime up to code, at least in the parts that I want to use for work. Though it's constrained by the hard limit of the ancient Servlet API Domino's container provides, I've been able to bring in some important updates for EL and JAX-RS, and also allow for using CDI for managed beans and JAX-RS resources.

CDI is actually a whole huge topic that I have some draft posts for. As far as Java development is concerned, CDI is Important with a capital "I".

ODA

There's not a lot of fanfare with the OpenNTF Domino API, but that's largely intentional: as an improvement on the normal lsxbe API, it does its job and doesn't currently need any radical changes. I'm mostly including it here because, though it doesn't change much, it's periodically updated to cover the sprinkling of new Java methods HCL adds with each release.

generate-domino-update-site

While I don't use this project as such daily, I sure do benefit from its output. This is the Maven plugin that generates new update sites, which is required for up-to-date OSGi development for Domino in lieu of IBM/HCL ever updating their own release.

Other than being something I run with every new Domino release, I've also made some improvements recently. Some of those just relate to improving behavior in edge cases, but a nice one I added the other week was downloading of source components from Eclipse Neon. Though the source for the XPages runtime and the whole Expeditor scaffolding remains unavailable, I am able to look up and download the source for the unmodified Eclipse components, and this results in a more-pleasant development experience in Eclipse.


I have a few other projects that I use periodically, such as the NSF File Server, but those are the big-ticket ones.

Lessons From Fiddling With RunJava

Tue Mar 03 09:49:30 EST 2020

Tags: java websphere

The other day, Paul Withers wrote a blog post about RunJava, which is a very-old and very-undocumented mechanism for running arbitrary Java tasks in a manner similar to a C-based addin. I had vaguely known this was there for a long time, but for some reason I had never looked into it. So, for both my sake and general knowledge, I'll frame it in a time line.

History

I'm guessing that RunJava was added in the R5 era, presumably to allow IBM to use existing Java code or programmers for writing server addins (with ISpy being the main known one), and possibly as a side effect of the early push for "Java everywhere" in Domino that fell prey to strategy tax.

Years later, David Taib made the JAVADDIN project as a "grown up" version of this sort of thing, bringing the structure of OSGi to the idea. Eventually, that morphed into DOTS, which became more-or-less supported in the "Social Edition" days before meeting a quiet death in Domino 11.

The main distinction between RunJava and DOTS (other than RunJava still shipping with Domino) is the thickness of the layer above C. DOTS loads an Equinox OSGi runtime very similar to the XPages environment, bringing in all of the framework support and dependencies, as well as services of its own for scheduled tasks and other options. RunJava, on the other hand, is an extremely-thin layer over what writing an addin in C is like: you use the public static void main structure from runnable Java classes and are given a runNotes method, directly equivalent to the main and AddinMain functions used by C/C++ addins.

Utility

Reading back up on RunJava got my brain ticking, and it primarily made me realize that this could be a perfect fit for the Open Liberty Runtime project. That project uses the XPages runtime's HttpService class to load immediately at HTTP start and remain resident for the duration of the lifecycle, but it's really a parasite: other than an authentication-helper servlet, the fact that it's running in nHTTP is just because that's the easiest way to run complicated, long-running Java code. For a while, I considered DOTS for this task, but it was never a high priority and has aged out of usefulness.

So I decided to roll up my sleeves and give RunJava a shot. Fortunately, I was pretty well-prepared: I've been doing a lot of C-level stuff lately, so the concepts and functions are familiar. The main run loop uses a message queue, for which Notes.jar provides an extremely-thin wrapper in the form of lotus.notes.internal.MessageQueue. And, as Paul reminded me, I had actually done basically this same thing before, years ago, when I wrote a RunJava addin to maintain a Minecraft server alongside Domino. I'd forgotten about that thing.

Lessons

Getting to the thrust of this post, I think it's worth sharing some of the steps I took and lessons I learned writing this, since RunJava is in a lot of ways much more hostile a place for code than the cozy embrace of Equinox.

#1: Don't Do This

The main lesson to learn is that you probably don't want to write a RunJava task. It was already the case that DOTS was too esoteric to use except for those with particular talent and needs, and that one at least had the advantage of being kind-of documented and kind-of open source. RunJava gives you almost no affordances and imposes severe restrictions, so it's really just meant for a situation where you were otherwise going to write an addin in C but don't want to have to set up a half-dozen compiler toolchains.

#2: Lower Your Dependencies Dramatically

The first big general thing to keep in mind is that RunJava tasks, if they're not just a single Java class file, are deployed right to the main Domino JRE, either in jvm/lib/ext or in ndext. What this means is that any class you include in your package will be present in absolutely everything Java-related on Domino, so you're in a minefield if you want to bring in any logging packages or third-party frameworks that could conflict with something present in the XPages stack or in your own higher-level Java code.

This is a fiddlier problem than you'd think. A release or so ago, IBM or HCL added a version of Guava to the ndext folder and it wreaked havoc on the version my client's app was using (which I think came along for the ride from ODA). You can easily get into situations where one class for a library is loaded from XPages-level code and another is loaded from this low level, and you'll end up with mysterious errors.

Ideally, you want no possible class conflicts at all. I took the approach of outright white-labeling some (compatibly-licensed) code from Apache and IBM Commons to avoid any possibility of butting heads with other code on the server. I was also originally going to use the Darwino NAPI or Domino JNA for a nicer Message Queue implementation, but scuttled that idea for this reason. It's Notes.jar or bust for safe API access, unfortunately.

#3: Use the maven-shade-plugin

This goes along with the above, but it's more a good tool than a dire warning. The maven-shade-plugin is a standard plugin for a Maven build that lets you blend together the contents of multiple JARs into one, so you don't have to have a big pool of JARs to copy around. That on its own is handy for deployment, but the plugin also lets you rename classes and aggregate and transform resources, which can be indispensable capabilities when making a safe project.

#4: Make Sure Static Initializers and Constructors are Clean

What I mean by this one is that you should make sure that your JavaServerAddin subclass does very little during class loading and instantiation. The reason I say this is that, until your class is actually loaded and running, the only diagnostic information you'll get is that RunJava will say that it can't find your class by name - a message indistinguishable from the case of your class not even being on the server at all. So if, for example, your class references another class that's missing or unresolvable at load time (say, pointing at a class that implements org.osgi.framework.BundleActivator, to pick one I hit), RunJava will act like your code isn't even there. That can make it extremely difficult to tell what you're doing wrong. So I found it best to make very little static other than JVM-provided classes and to delay creation/lookup of other objects and resources (say, translation bundles) until it was in the runNotes method. Once the code reaches that point, you'll be able to get stack traces on failure, so debugging becomes okay again.
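
As a loose sketch of the shape I mean - assuming the JavaServerAddin/runNotes structure described above, and not intended as a complete working addin:

public class ExampleAddin extends lotus.notes.addins.JavaServerAddin {
	// Keep fields and static members limited to JVM-provided types; no eager lookups here
	private java.util.ResourceBundle messages;

	@Override
	public void runNotes() {
		// Defer resource and class lookups until now, when failures produce visible stack traces
		this.messages = java.util.ResourceBundle.getBundle("messages");

		// ... main message-queue loop goes here ...
	}
}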

#5: Take Care With Threads When Terminating

The Open Liberty runtime makes good use of java.util.concurrent.ExecutorServices to run NotesThread code asynchronously, and I'll periodically execute even a synchronous task in there to make sure I'm working with a properly-initialized thread.

However, when terminating, these services will start to shut down and reject new tasks. So if, for example, you have code that executes on a separate thread and might be run during shutdown, it will likely fail silently and can cause your addin to choke the server.

#6: That Said, It's a Good Idea to Use Threads

A habit I picked up from writing Darwino's cluster replicator is to make your addin's main Message Queue loop very simple and to send messages off to a worker thread to handle. Doing this means that, for complex operations, the server console and the user won't sit waiting on a reply while your code churns through an individual message.

In my case, I created a single-thread ExecutorService and have my main loop immediately pass along all incoming commands to it. That way, the command runner is itself essentially synchronous, but your queue watcher can resume polling immediately. This keeps things responsive and avoids the potential case of the message queue filling up if there's a very-long-running task (though that's less likely here than if you're drinking from the EM fire hose).
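
Sketched out, that pattern is something like this - with the queue-reading and command-handling methods as hypothetical placeholders for however you wrap MQGet and your own logic:

ExecutorService commandRunner = Executors.newSingleThreadExecutor();

// Inside the main message-queue loop:
String command = readNextCommandFromQueue(); // hypothetical: read and decode the next queue message
commandRunner.submit(() -> handleCommand(command)); // hand off and get back to polling immediately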

#7: Really, Don't Do This

My final tip is that you should scroll back up and heed my advice from #1: it's almost definitely not worth writing a RunJava addin. This is a special case because a) the goal of the project is to essentially be a server addin anyway and b) I was curious, but normally it's best to use the HttpService route if you need a persistent task.

It's kind of fun, though.

Targeting Domino for Webapps Incidentally

Tue Feb 11 17:26:38 EST 2020

Tags: java maven

I recently had occasion to break ground on a new web project that uses a Notes runtime and has a web front end, and I figured it would be a perfect occasion to structure it in a way that is clean, portable, and, while it will run on Domino, doesn't have to use Tycho.

I ended up coming up with a setup that I'm pretty happy with, and so I put up an example on GitHub for anyone else to use as a reference for similar cases.

What Is This, Specifically?

This is an application that consists of a couple main concepts:

  • Maven for project structure and dependencies
  • Core "plain Java" module that contains code that's intended to be portable and doesn't even know it's in a web app
  • JAX-RS-based REST API
  • Client JS web UI written in Stencil and transpiled with Node
  • Standard webapp project for JEE containers such as Liberty
  • Domino project to wrap the app up as an OSGi bundle

What this is specifically not is an XPages project. And, while it can use a Notes runtime and access NSFs, it's also not something that will be stashed inside an NSF, and the "Notes" part is optional and really only included here to show it's possible. The idea is that this is a standard web app first and a Domino thing second.

Project Structure

The project is organized as a Maven module tree like so:

  • domino-webapp: The parent container project just for configuration
    • core
      • webapp-core: This is the main place for UI-independent business logic
    • web
      • webapp-api-jaxrs: This contains the JAX-RS-based REST API, which exposes the core business logic to the web
      • webapp-webui: This contains a Stencil-based JavaScript app. It doesn't need to be Stencil specifically, or even NPM-based at all, but I find Stencil to be a pretty good choice for this
      • webapp-jee: This is the JEE-container web app, containing very little code of its own and just intended to output a WAR
    • domino
      • webapp-domino: This is the Domino equivalent to the previous project, but contains a chunk of adapter code to get things working, plus some Maven configuration to generate an appropriate OSGi bundle
      • webapp-dist-domino: This is a distribution project that pulls in the Domino OSGi bundle and creates a p2 repository, and then a "site.xml" file for the benefit of importing into an NSF Update Site

How the OSGi Part Works

In going deeper into what's going on, I'm going to start at the end: how to go from a normal web app to a Domino-friendly OSGi bundle. If you're not familiar with what I mean by "web app" in general and in a Domino plugin in particular, it's the sort of thing that Sven Hasselbach wrote a series about a few years back: a Java/Jakarta EE Servlet application using the "WebContainer" extension point in the Domino HTTP runtime.

Traditionally, these projects are built as plain-old Eclipse projects, where you drop a bunch of JARs for your framework of choice into a plug-in project and write your code in there, using Eclipse's Plug-in Development Environment. This works well enough as far as it goes, but puts constraints on how you do development, in particular pretty much requiring Tycho if transitioned to a Maven structure, which would then have massive penalties for the rest of your project.

Fortunately, the thing about an OSGi bundle is that it's really just a JAR file with special metadata, and so it doesn't actually have to be created with a toolchain that has full knowledge of OSGi. As long as the required files end up in the right places inside the JAR (which is in turn just a ZIP file), you're good to go.

In this case, I used the maven-bundle-plugin to decorate the "MANIFEST.MF" file with appropriate OSGi metadata and, importantly, to embed all the compile-scoped project dependencies for me. That second part means that Maven will handle the job of steps 7-10 in Sven's example: it'll bring in the dependencies from Maven, copy them into the right place in the final JAR, and set up the Bundle-ClassPath header to point to them.

It's important to note the "compile-scoped" qualifier there. The Maven projects themselves also depend on a couple things that I know will be present on Domino already, namely IBM Commons, Apache Wink, the Web Container adapter, and Notes.jar. Though it'd probably work if I copied those into the JAR, that would be asking for trouble unnecessarily, so I mark them as "provided" in Maven, and then the bundling process knows to skip over them.

The other OSGi-specific element is the "plugin.xml" file, used by Domino's Equinox framework to identify that the bundle provides a web app. In this case, I put that file in "src/main/resources", where it ends up being copied to the root of the JAR. One down side here is that you have to know ahead of time what the syntax for this file is: since Eclipse won't know this is a plug-in project, you won't get the GUI shown in Sven's example.

There are some other Domino-specific considerations, but I'll return to them later. For now, those parts will cover the OSGi "bridge".

Core: Using the Notes API

The core project doesn't have a lot going on, and that's intentional. It does, though, demonstrate how you can use the JSON-B API for JSON serialization and the Notes API for accessing NSFs and other Notes stuff.

The important parts happen in the project dependencies. The first one is simple: I want to use the JSON-B API, but I want to declare that it will be provided one way or another by the environment. The second one includes Notes.jar by way of my P2 Repository Provider since it's still not available as a normal Maven dependency.

This project contains a single class, which just gathers a bit of information about the runtime environment to be shown as a JSON object. The important part here is my use of NotesThread when calling the Notes API. Since this project can run on non-Domino containers, I can't assume that all threads will already be Notes-friendly, so I use that route. You can also call NotesThread.sinitThread() or go other ways, but I like containing the calls into a separate thread outright in simple cases.
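
The shape of that pattern is roughly like this - a small sketch rather than the project's actual code, and it assumes the NotesThread(Runnable) constructor:

java.util.concurrent.atomic.AtomicReference<String> userName = new java.util.concurrent.atomic.AtomicReference<>();
lotus.domino.NotesThread notes = new lotus.domino.NotesThread(() -> {
	try {
		lotus.domino.Session session = lotus.domino.NotesFactory.createSession();
		userName.set(session.getUserName());
		session.recycle();
	} catch(lotus.domino.NotesException e) {
		throw new RuntimeException(e);
	}
});
notes.start();
notes.join(); // InterruptedException handling elided for brevity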

JAX-RS

The JAX-RS project is intended to contain JAX-RS configuration and resource classes, and the immediate part to note is once again the dependency set. Here, I targeted specifically JAX-RS 1.1, which is quite old, but is provided by Apache Wink on all Domino installations. I could theoretically bring in RESTEasy for a newer spec version, but 1.1 is capable enough for now and it keeps things simpler.

In the Application implementation class, I enumerate all of the resource classes used in the app. This is equivalent to the text-file-based method common in Wink apps, but it's portable across JAX-RS implementations and has the side benefit of being compiler-checked. However, though it's a step up from the old Wink way, it's a big step down from the modern JAX-RS way: in newer containers, you can just let the container find your resources by looking for classes with annotations automatically. However, that doesn't fly on Domino and, while you can hack in something roughly equivalent, it's simpler for now to just enumerate the classes explicitly and remember to add them to this list.

There are only two resources here: a Hello World resource and one to ferry the ServerInfo object out using the JAX-RS environment's JSON serializer (more on that in a bit).
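
Put together, the Application subclass for a case like this looks roughly as follows - a sketch with hypothetical class names rather than the repository's exact code:

public class ExampleApplication extends javax.ws.rs.core.Application {
	@Override
	public java.util.Set<Class<?>> getClasses() {
		java.util.Set<Class<?>> classes = new java.util.HashSet<>();
		classes.add(HelloWorldResource.class);
		classes.add(ServerInfoResource.class);
		return classes;
	}
}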

The Web UI

The web UI project is complicated, but mostly because NPM-based JavaScript development is complicated. This example uses Stencil, which I quite like, but you can use whatever you'd like: React, Angular, just plain ol' HTML, or whatever.

The important parts here are the use of frontend-maven-plugin to create a Node+NPM environment and build the app and the specific configuration to put the output into "src/main/resources/META-INF/resources". Doing this means that, when this project is wrapped up into a Java-less JAR file, the web resources will be in the "META-INF/resources" directory, which is special on Servlet 3 and above. Any files in there in dependency JARs like this will be visible as if they were in the main web content of your web app.

JEE App

The Jakarta EE app is the simplest of the bunch, and the only actual class in there only exists for example purposes.

The work, such as it is, all happens in the Maven configuration. I declare it to be war-packaged, to not complain if there's no "web.xml" file, to bring in the project dependencies, and to specifically include IBM Commons. It also brings in Notes.jar as a compile-time dependency.

The Domino Shims

Back in the Domino module, it's time to talk about the non-OSGi parts. I've mentioned a few things above that require no configuration in a modern web container, but which will require a bit of legwork in Domino. These are generally related to the fact that Domino's servlet container is version 2.4 and it has no idea about newer standards.

  • I bring in a Eclipse Yasson dependency to provide JSON-B support.
    • To bind that to JAX-RS, I wrote a Provider class that knows how to turn any Java object into JSON when a resource says it wants to output JSON.
    • To register that provider (since it can't be picked up automatically), I subclass the Application class to include it specifically.
  • The ResourcesServlet servlet mimics the Servlet 3 behavior of serving resources out of "META-INF/resources". This specific implementation isn't the best, since it doesn't provide any caching, but it gets the job done and means that the web UI JAR will work the same way on both targets.
  • The RootServlet servlet extends the Wink default REST servlet to shim the ClassLoader around, which avoids a lot of trouble with threads used for web app requests that had previously been used for XPages requests (it's annoying, trust me).
  • I have to include an explicit reference to Wink's JAX-RS provider for some reason to do with bundle class loading.
  • Unlike in the normal web app project, I have to include a "web.xml" file, and this one registers the two servlets above.

Domino Update Site

The second part of the Domino target is the distribution project, which uses the p2-maven-plugin to create a P2 repository. That plugin is a splendid tool for your toolbox and has a lot of capabilities for auto-OSGi-ifying otherwise-non-OSGi projects. In this case, I just want to include the Domino project from the previous step, but I also want to generate an Eclipse feature for it so that it can be imported into an NSF Update Site and with some proper metadata.

I also use the p2sitexml-maven-plugin, which takes the newer-style P2 site generated by the previous step and adds a "site.xml" file, which is needed by the NSF Update Site import process if you want to include categories, which I think are nice.

Seeing It In Action

To run the app on Domino, you can do a Maven install on the root, install the update site from the distribution project onto Domino, and then visit "/exampleapp/". You'll be greeted by a vision of beauty like this:

Example Webapp Screenshot

Placeholder garishness aside, it shows the Stencil app loading, using the custom favicon, and making a call to the System Info service. That, in turn, shows using the Notes runtime to get the server's distinguished name. It's left as an exercise for the reader to then put in the thousands of hours of work to make a world-class application.

Caveats!

Since this is a Domino thing, there are important caveats.

The first is one I mentioned earlier: because we're restricted to Servlet 2.4/2.5ish, a lot of things just won't work. Indeed, not even all of the 2.4 spec works, as Filters aren't implemented for some reason. Additionally, outside of Servlet and JAX-RS 1.1, you're pretty much in "BYOB" territory when it comes to other JEE specs. In this example, I brought in Yasson for JSON-P and JSON-B and that was pretty simple, but others (say, CDI) would require a lot more fiddly work.

There's also an extra-special caveat when it comes to JSP. Domino's web container knows about JSP, but requires what it calls a "JSP compiler bridge": a special extension that allows for interpreting JSPs inside the special environment it creates. However, it doesn't actually ship with such a bridge. Notes does (and MyFaces too) for what I assume are "social" reasons, but Domino doesn't. You could probably nab the JSP stuff from Notes and drop it onto Domino, but you'd be getting into weird territory. I tried dropping Jasper into the app, but it ran into ClassLoader-casting trouble... hence the bridge, I guess.

Usefulness

Phew! Admittedly, it's a long walk to get to the point where you can just run a web app, and there are quicker ways to get there. However, I do think this is worth it. With this setup, I have a set of Maven projects that work swimmingly in Eclipse and any other Java IDE, an NPM project that acts like any other, and a JEE container front-end for rapid development. No Designer, no NSF syncing, no Plug-in Development Environment, no Tycho. And, though I don't have the full breadth of JEE available to me, JAX-RS is the main one you need for a client-JS app anyway. It's not an appropriate setup for every app, but it's really nice when it fits.

Domino 11's Java Switch Fallout

Tue Jan 07 10:50:48 EST 2020

Tags: java
  1. AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Domino 11's Java Switch Fallout
  3. fontconfig, Java, and Domino 11
  4. Notes/Domino 12.0.2 Fallout
  5. Notes/Domino 14 Fallout

In Notes and Domino 11, HCL switched from using IBM's J9 Java distribution to using the OpenJ9 variant of AdoptOpenJDK. This is a lateral move technically - it's still Java 8 - and it's one presumably made in the short term to avoid licensing costs from IBM and in the long term to align better with AdoptOpenJDK.

However, OpenJ9 is not the same as J9, and AdoptOpenJDK is not the same distribution as the previous one, so there are some minor gotchas to look out for.

BASE64 and Other Internal Classes

A couple months back, I wrote a post describing this situation: namely, that some XPages and agents grew to depend on the presence of JVM-internal classes in the com.ibm namespace, particularly com.ibm.misc.BASE64Encoder and its decoder sibling.

The true fix for this is to ferret out uses of these classes in your code base, but that can be difficult. If you have to maintain legacy code, I made a small shim Jar you can drop on your server to map the two BASE64 classes to their sun.misc versions. I intentionally use those classes, even though they're also not for public use, both because they have the same semantics as the IBM ones and to reinforce that the best solution is to use the vendor-independent java.util.Base64 class.
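
For reference, the vendor-neutral approach looks like this (java.util.Base64 has been part of the JDK since Java 8):

String encoded = Base64.getEncoder().encodeToString("hello".getBytes(StandardCharsets.UTF_8));
byte[] decoded = Base64.getDecoder().decode(encoded);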

java.pol

It's been fairly-common practice for a little while now to create a file named "java.pol" in the Java installation directory to loosen the security policy and get around Domino's bizarrely-strict interpretation of the rules. This came into vogue in favor of editing "java.policy" because this file was (usually) not overwritten during Notes/Domino version upgrades.

However, as Per Lausten discovered, AdoptOpenJDK's distribution does not reference this file, and so its policy changes won't take effect. The upshot of this is that there are three main options to loosen the policy:

  • As Per mentions (via Daniele Vistalli), you can create a file named ".java.policy" in the home directory of the user running Domino and it will be honored.
  • You can go back to editing the "java.policy" file, and re-editing it with each new release
  • You can modify "java.security" to reference "java.pol" again. This is kind of a wash, though, since you'll need to re-edit "java.security" every update anyway

Different Implementation Jars

This last one is much more limited in scope, and may actually be limited in effect to just the NSF ODP Tooling project. In that project, in order to create a Domino-compatible runtime environment for local compilation, I included a couple expected Jars from the Notes/Domino installation in the runtime's classpath. One of these was "ibmpkcs.jar", which covers some security stuff as well as the aforementioned BASE64 classes.

The fix in my case was to just make the resolution of that Jar optional, which should work for the normal case, but it'll be something to keep an eye on in the future.

Small Aside: Writing Agents With Java 5+ Features

Tue Nov 26 10:46:44 EST 2019

Tags: designer java

The topic of using Java 5+ language features in agents came up recently in the OpenNTF Slack room (use this invitation link if you haven't already joined!), and I think that it's one of those topics that's worth making a post about for posterity's sake.

For historical and compatibility reasons, the default compiler language level for newly-created agents, at least in the absence of the JavaCompilerTarget notes.ini setting, is Java 1.3:

Java Agent Compiler Properties

This is laughably out-of-date, and doesn't even support 15-year-old features like generics. Accordingly, if you take a piece of code from my last post, you'll end up with a couple errors:

Java agent compilation problems

It recognizes the Java 7+ classes, since it's still backed by a Java 8 JDK, but it complains about try-with-resources and the use of varargs.

Setting the Compiler Level

The fix for this is to change the compiler compliance level for your agent project. Depending on the type of problem, you may be given an Eclipse quick fix option if you hover over the red-underlined text:

Eclipse quick fix

If you don't have that option (for example, there's no such option for the Files.createTempFile vararg problem), you can alternatively go to the "Package Explorer" view, find the temporary project for the agent (named something like "foo.nsf.Some Agent.ja"), and go to Properties. In there, you can go to the "Java Compiler" settings, make sure the "Compiler compliance level" is the level you want, and then check "Use default compliance settings":

Project compiler at 1.8

When you do this and next save the agent, you'll be greeted with this Designer-specific message box:

Java compiler backward compatibility warning

This is good to see, since it shows that Designer picked up the change and will write it into the agent note in items named $JavaCompilerSource and $JavaCompilerTarget. That's the part that really counts, since it's what Designer uses to re-construct the Eclipse project in the future.

Note, though, that the title of the message box is a reminder about an important aspect: if you set the target compiler level to 1.7 or above, then your agent will not run on Domino servers before 9.0.1 FP8. So, if you're using one of those for some reason, take care about the language features you're using. The other breakpoints, if you're working with some really-legacy servers, are Java 1.4 being added in 7, 1.5 added in 8, and 1.6 added in 8.5.

Dealing With Errors

Depending on your Notes version (I suspect this problem cropped up in 9.0.1FP10 and seems to be gone in the V11 beta), changing your compiler level may result in "forbidden reference" errors for basic things like the lotus.domino classes.

If you hit this, the quickest fix is to modify your settings in Designer's preferences, in the "Java" → "Compiler" → "Errors/Warnings" section:

Setting to ignore forbidden references

If you set "Forbidden reference (access rules)" in the "Deprecated and restricted API" section to "Ignore", the problems should go away. I'm not actually sure why this problem crops up (perhaps something to do with the structure of Notes.jar), but it was fairly consistent for me for a while.

With all that set, though, you should be free to use up to Java 8 features in agents (and web services, probably) to your heart's content.

Writing a Java NIO Filesystem, Part 1

Mon Nov 25 10:09:25 EST 2019

Tags: java java-nio
  1. Writing a Java NIO Filesystem, Part 1

My recent project, the NSF SFTP File Store consists of two main parts: the SFTP server itself (powered by Apache Mina) and a Java NIO filesystem implementation. I casually mentioned the latter in my introductory post, but I think it's an interesting topic that warrants a post or two of its own.

Introduction to NIO

The term "Java NIO" refers to the non-blocking IO package added in two parts across Java 6 and 7 and can be considered a refresh of previous capabilities in a similar way to the Collections API in Java 2 replaced the handful of original collection classes.

The initial (and larger) part of it, added in Java 1.4, brought better capabilities for dealing with all sorts of "byte stuff": buffers of arbitrary types, smoother character-set handling, more-flexible streams, and so forth. I dealt with these constructs a while ago with the "nsfdata" package in ODA. In that case, it proved very useful for dealing with in-memory representations of Notes Composite Data, which is best dealt with as a navigable array of differently-sized structures.

The java.nio.file package was added in Java 7 and is a refresh of the older java.io.File/java.io.FileOutputStream/etc. system. It interacts well with the previous NIO stuff, but can actually be thought of as a distinct thing, despite its shared naming. This package changes around the mechanics of the original filesystem API, and I remember finding it kind of grating when I first encountered it. It's all in service of being more flexible, though, and it's one of those things where repeated exposure ended up making the older API feel weird and wrong.

Using the NIO Filesystem API

Before I get to the actual implementation of a backing filesystem, I think it'll make sense to show some examples of using the NIO filesystem API, especially since these can (and should) be used in any Java-based application today.

When the new classes were added, the older File class was augmented with a method to convert it to a java.nio.file.Path and vice-versa:

File foo = new File("/some/path/to/file");
Path fooPath = foo.toPath();
File fooFile = fooPath.toFile(); // for older-API interoperability

The Path interface is a very-lightweight and implementation-neutral representation of a path. Unlike the File class, it doesn't have methods for actually interacting with the filesystem. In fact, pretty much all you can do with it specifically is get other Paths based on it:

Path foo = Paths.get("/some/path/to/file");
Path parent = foo.getParent();   // "/some/path/to"
Path child = foo.resolve("bar"); // "/some/path/to/bar"

The primary way you use Path objects is via the Files utility class, which provides not only the methods you may be familiar with from File but also some additional ones like getting input and output streams:

Path foo = Files.createTempFile("test", ".tmp");
try(OutputStream os = Files.newOutputStream(foo, StandardOpenOption.TRUNCATE_EXISTING)) {
  os.write("hello".getBytes());
}
try(BufferedReader r = Files.newBufferedReader(foo)) {
  System.out.println("file contains " + r.readLine());
}

Here, you can see a couple things going on:

  • Files.newOutputStream replaces FileOutputStream, and so the object you assign it to should be declared as just OutputStream.
  • Many of the Files methods take zero or more open/move/etc. options, and they can be a little odd at first. The method parameters are declared as e.g. OpenOption, but the actual options are available in the StandardOpenOption enum. This is to allow for arbitrary extensions for custom filesystems. For example, you might write a custom option to force creating a backup version of the file before writing.
  • I'm using the try-with-resources syntax from Java 7 here. That isn't actually related to NIO except in that it came in the same release, but it's great and you should use it.

The Files class also contains methods for listing files and walking file trees, which operate with Streams (as of Java 8) and callbacks. For example, this bit from the NSF ODP Tooling walks a file tree using the callback method and stores it in a ZIP file:

try(OutputStream fos = Files.newOutputStream(result)) {
  try(ZipOutputStream zos = new ZipOutputStream(fos, StandardCharsets.UTF_8)) {
    zos.setLevel(Deflater.BEST_COMPRESSION);
    Files.walkFileTree(path, new SimpleFileVisitor<Path>() {
      @Override
      public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
        if(attrs.isRegularFile()) {
          Path relativePath = path.relativize(file);
          String unixPath = StreamSupport.stream(relativePath.spliterator(), false).map(String::valueOf).collect(Collectors.joining("/")); //$NON-NLS-1$
          ZipEntry entry = new ZipEntry(unixPath);
          zos.putNextEntry(entry);
          Files.copy(file, zos);
        }
        return FileVisitResult.CONTINUE;
      }
    });
  }
}
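
The same class also offers Stream-based variants like Files.list and Files.walk. As a minimal sketch (the directory path here is hypothetical), note that the returned Stream holds an open directory handle and so should be closed:

Path dir = Paths.get("/some/directory");
try(Stream<Path> children = Files.list(dir)) {
  children.filter(Files::isRegularFile)
    .forEach(p -> System.out.println(p.getFileName()));
}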

Next Up: Implementation

In my next post, I'll get into some specifics of implementing a filesystem using this framework as well as some of the implications of how flexible and straightforward it is.

Options for the Future of the Domino Open Liberty Runtime

Mon Nov 18 10:57:24 EST 2019

Tags: java liberty
  1. Options for the Future of the Domino Open Liberty Runtime
  2. Next Steps With the Open Liberty Runtime
  3. Rapid Progress in Open-Liberty-Runtime Land

I've been thinking lately about what I can do with the Open Liberty Runtime project. If you're not familiar with it, the gist of it is that it gloms the Open Liberty Java/Jakarta EE app server onto Domino to allow deploying up-to-date WAR-based apps "to Domino", sharing the server's access to NSFs.

In its current form, it's serving me reasonably well: this blog is running on it, for example, and writing the SFTP server to target it has been a delight. I've been tracking a set of issues about it, mostly relating to improving the administration/deployment side as it currently stands, in ways that would improve my current use. There are some more-out-there ideas like running apps from NSFs, but for the most part it's been focused on "a JEE server attached to Domino".

Slightly-Tweaked Model

But what I've been thinking about lately is tweaking the model a bit to be less like "a new HTTP runtime" and more like a series of related servers, which would then be proxied to. This is in line with what Liberty has been trending towards anyway. Though it started out as essentially a cleaned-up WebSphere monolith, the fact that it's been targeted at cloud-service use has meant that it's been aggressively tuned to allow you to include only the specific features you need, with the idea that there will be many instances of Liberty running, one per app. This is also seen in their relentless push for faster startup times, something that only matters if you plan to start up instances of the server very frequently.

My Runtime project technically supports this model currently. Though it uses a single Liberty runtime download, the main Servers list is geared towards creating multiple running servers with their own app sets, which end up being distinct processes:

Open Liberty Admin Servers view

So one could hypothetically put dozens of servers in here and use it as I've been describing. Other than providing that capability, though, the tooling doesn't really help you much: in particular, it would do nothing to alleviate the problems of conflicting port mappings or unifying all of these apps into a single front. I did a little work making a reverse-proxy webapp, but it's really meant for the case of having one Liberty server with all of your apps, and wouldn't have any meaning in a many-server setup.

One option I've been considering here is giving the server side of this some knowledge of and control over the HTTP ports used by the individual Liberty servers, so that it could assign random ones automatically. Then, I'd also be able to add on a dynamically-configured reverse proxy that would know about these different running servers and could route requests automatically.

Other Runtimes

On top of that, there's nothing about the project that's really inherently tied to Open Liberty as such. It currently assumes that that's the runtime, but the only way it interacts with it is by pouring files out to the filesystem and then running some known shell scripts. There's nothing stopping such a setup from also gaining a bit of knowledge about, say, Node, Swift, or any number of other runtimes. Things would get progressively weirder the further you get from Java when you want to access the Domino runtime, but hey, that's what the C API is for!

Reinventing the Wheel

While mulling this over, though, I did realize one thing: I'm essentially describing a jankier version of Kubernetes. For the most part, the problem of running disparate apps with their own runtimes and needs managed by a central server is solved by that and Docker. This project would be a bit different in that the apps would automatically inherit a Domino runtime and would also (usefully) maintain clustered/replicated configuration by way of the Domino server backbone, so it's not exactly the same thing. And, while HCL is pushing for a Dockerized future, the pieces aren't there yet and won't fully be for a while: while Domino-on-Docker is a thing technically, "many Notes-runtime apps with many server IDs" for NRPC use is a licensing minefield, and the gRPC bindings are currently painfully limited in both capability and language support.

So I likely would be best off just letting time (which is to say, HCL) solve the fancier problems and just focusing on the Runtime's original job of being a better Java app server for Domino. I do think it would be useful to better support the one-server-per-app approach, especially for a hypothetical case of wanting to deploy an XPages app in one Liberty server but then a JSP or JSF app in another, and that'll take a better reverse proxy.

I think it's good to mull over, though. This already provides a good path to better app dev with Domino, and smoothing that out more will make my life all the better and will hopefully be useful to others.

Quick-Tip Thursday: Avoid Future Base64 Trouble In Java

Thu Nov 14 09:35:21 EST 2019

Tags: java sntt

TL;DR: Don't use com.ibm.misc.BASE64Encoder or com.ibm.misc.BASE64Decoder.

If you've been doing Java development for a while, you've likely run into a situation where you need to Base64-encode or -decode something. It's used commonly in MIME documents, data URIs, and various other areas typically related to data transfer.

Unfortunately, using Base64 in Java was awkward for a long time, with no natural choice in the core JDK until Java 8. The trouble over the years, though, is that there are choices in previous releases, and it became somewhat common in the community to use sun.misc.BASE64Encoder for this need. The problem with that is that sun.* packages are officially not part of the JDK and can't be relied upon to exist in every Java implementation. Unfortunately, a combination of a lack of enforcement of this in the tooling and the fact that those classes are in practice pretty universal has let them creep into Java code.

On the Domino side, we have a second problem: com.ibm.misc.BASE64Encoder.

Sidebar: The Different Flavors of Java

For the most part, setting aside Android, it's safe to think of "Java" as a monolithic entity, where having a JVM of a given version number on one system will be equivalent to the same on another. There are subtle differences, though, and several vendors have long maintained their own variants of Sun/Oracle's JVM (which is named HotSpot).

IBM has maintained one such variant, called J9, and infused all of their Java-using products, Domino included, with it. J9 has since been open-sourced and donated to Eclipse and, renamed to OpenJ9, has carved out a niche for itself by being particularly lean and speedy to spin up.

The Snag We Hit With J9

For the most part, the differences in JVM flavors don't matter to developers, but IBM played a bit of a trick on us by making duplicates of some sun.* classes under the com.ibm.* package. Unlike sun.*, which at least has a little community knowledge of being internal-only (and also just looks odd compared to standard classes), com.ibm.* has no such connotation, and there's nothing in Designer or the classes themselves to indicate that com.ibm.misc.BASE64Encoder (internal and non-portable) is any different from, say, com.ibm.commons.util.io.base64.Base64 (portable via IBM Commons).

So, over the years, use of that class has crept into XPages and Java agent code - it does the job and it's always been safe to assume that Domino-the-IBM-product would use J9-the-IBM-JVM. Domino isn't an IBM product anymore, however, and, to add to potential future trouble, even OpenJ9 builds from AdoptOpenJDK don't come with those com.ibm.misc.* classes.

The upshot is that, if your Java code were to run on a JVM other than a fully-IBM-style one (whether intentionally or otherwise), you'd be liable to run into trouble with code that casually used those JVM-internal classes.

The Options

If you're running a version of Domino using Java 8 (9.0.1FP8+), which you'd darn well better be at this point, you're in luck: the built-in java.util.Base64 class has a nice API and is guaranteed to be present on every JVM. This is your best bet, since it doesn't require any other dependencies beyond the core JDK and it has some nice features like variant encoders for MIME- and URL-safe use.
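
As a quick sketch of that API (the values here are just illustrative):

byte[] data = "hello".getBytes(StandardCharsets.UTF_8);
String encoded = Base64.getEncoder().encodeToString(data); // "aGVsbG8="
byte[] decoded = Base64.getDecoder().decode(encoded);
String urlSafe = Base64.getUrlEncoder().encodeToString(data); // URL- and filename-safe alphabet
String mime = Base64.getMimeEncoder().encodeToString(data);   // wraps lines for MIME use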

If you're targeting a version of Domino that still uses Java 6 (don't), you do have one other safe-enough option that's available in both XPages and Java agents: javax.xml.bind.DatatypeConverter. This class is technically part of the JAX-B specification but is included in JVMs from version 6 through version 10. This is less of a clean choice than the java.util.Base64 class both because it's not as nice of an API and because use on Java 11+ will require bringing in an external dependency. Still, if you're stuck on an old release, it'll at least get you through any potential rough patches in the near future.
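
For what it's worth, the DatatypeConverter equivalent is similarly terse:

String encoded = DatatypeConverter.printBase64Binary("hello".getBytes(StandardCharsets.UTF_8));
byte[] decoded = DatatypeConverter.parseBase64Binary(encoded);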

Finally, in XPages, you have the aforementioned com.ibm.commons.util.io.base64.Base64 class, which will continue to be present due to being included in a base library of the XPages stack, but which is rendered obsolete by java.util.Base64.

So I recommend taking some time to look through any running Java code you have for this potential hiccup and making the switch. It's admittedly a little awkward to do, but it's still a good idea.

Developing Open Liberty Features, Part 2

Sun Aug 18 09:14:24 EDT 2019

  1. Developing an Open/WebSphere Liberty UserRegistry with Tycho
  2. Developing Open Liberty Features, Part 2

In my earlier post, I went over the tack I took when developing a couple extension features for Open Liberty, and specifically the way I came at it with Tycho.

Shortly after I posted that, Alasdair Nottingham, the project lead for Open Liberty, dropped me a line to mention how programmatic service registration isn't preferred, and instead the idiomatic way is to use Declarative Services. I had encountered DS while fumbling my way through to getting these things working, but I had run into some bit of trouble or another, and I ended up settling on what I got working and not revisiting it.

Concepts

This was a perfect opportunity to go back and do things the right way, though, so I set out to do that this morning. In my initial reading up, I ran across a blog post from the always-helpful Vogella Blog that talks about coming at OSGi DS from essentially the same perspective I have: namely, having been used to Equinox and the Eclipse plugin/extension mechanism. As it turns out, when it comes to generic OSGi, Equinox can kind of poison your brain. The whole term "plug-in" instead of "bundle" comes from earlier Eclipse; "features", "update sites", and all of p2 are entirely Equinox-specific; and the "plugin.xml" extension mechanism is of a similar vintage. However, unlike some other vestiges that were tossed aside, "plugin.xml" is still in active use.

At its core, the Declarative Services system is generally similar to that route, in that you write classes to implement a given interface and then declare that your bundle provides that using XML files. The specifics are different - DS is more type-safe and it uses individual XML files in the "OSGI-INF" directory for each service - but the concept is similar. DS also has an annotation-based mechanism for this, which allows you to annotate your service classes directly and not worry about maintaining XML files. It's something of a compiler trick: the XML files still exist, but your tooling of choice (PDE, bnd, etc.) will generate the files based on your annotations. It took a bit for Eclipse to get on board with this, but, as of Neon, you can enable this processing in the preferences.

Implementation

Fortunately for me, my needs are simple enough that making the change was pretty straightforward. The first step was to delete the Activator class outright, as I won't need it for this. The second was to add an optional import for the org.osgi.service.component.annotations package in my Liberty extension bundle. I suspect that this is a bit of a PDE-ism: the annotations aren't even retained at runtime (and the package isn't present in the Liberty server), but this is the only mechanism Eclipse has to add a dependency for a plug-in project.

The annotation for the user registry was as straightforward as can be, needing a single line in this heavily-clipped version of the class:

@Component(service=UserRegistry.class, configurationPid="dominoUserRegistry")
public class DominoUserRegistry implements UserRegistry {
}

With that, Eclipse started generating the associated XML file for me, and the registry showed up at runtime just as it had before.

The TrustAssociationInterceptor was slightly more complicated because it had some extra initialization properties set, in particular the one to mark it as executing before normal SSO. This was a little tricky in two ways: Java annotations don't have any mechanism for specifying a literal Map for properties, and the before-SSO property is a boolean, but I could only write a string. It turned out that the property, uh, property on the annotation has a little mini-DSL where you can mark a property with its type. The result was:

@Component(
	service=TrustAssociationInterceptor.class,
	configurationPid=DominoTAI.CONFIG_PID,
	property={
		"invokeBeforeSSO:Boolean=true",
		"id=org.openntf.openliberty.wlp.userregistry.DominoTAI"
	}
)
public class DominoTAI implements TrustAssociationInterceptor {
}

Further Features

This is proving to be a pretty fun side project within a side project, and I think I'll take a crack at developing some more features when I have a chance. In particular, I'd like to try developing some API-contribution features so that they can be deployed to the server once and then used by web apps without having to package them (similar in concept to XPages Libraries). This is how Liberty implements its Jakarta EE specifications, and I could see making some extra ones. That's also exactly what CrossWorlds does, and so I imagine I'll crib a bunch of that work.

Developing an Open/WebSphere Liberty UserRegistry with Tycho

Fri Aug 16 15:08:10 EDT 2019

  1. Developing an Open/WebSphere Liberty UserRegistry with Tycho
  2. Developing Open Liberty Features, Part 2

In my last post, I put something of a stick in the ground and announced a multi-blog-post project to discuss the process of making an XPages app portable for the future. In true season-cliffhanger fashion, though, I'm not going to start that immediately, but instead have a one-off entry about something almost entirely unrelated.

Specifically, I'm going to talk about developing a custom UserRegistry and TrustAssociationInterceptor for Open Liberty/WebSphere Liberty. IBM provides documentation for this process, and it's alright enough, but I had to learn some specific things coming at it from a Domino perspective.

What These Services Are

Before I get in to the specifics, it's worth discussing what specifically these services are, especially TrustAssociationInterceptor with its ominous-sounding name.

A UserRegistry class is a mechanism to provide a Liberty server with authentication and user info services. Liberty has a couple of these built-in, and the prototypical ones are the basic and LDAP registries. Essentially, these do the job of the Directory and Directory Assistance on Domino.

A TrustAssociationInterceptor class is related. What it does is take an incoming HTTP request and look for any credentials it understands. If present, it tells Liberty that the request can be considered authenticated for a given user name. The classic mechanisms for this are HTTP Basic and form-cookie authentication, but this can also cover mechanisms like OAuth. In Domino, this maps to the built-in authentication mechanisms and, more particularly, to DSAPI filters.

How I Used Them

My desire to implement these developed when I was working on the Domino Open Liberty Runtime. I wanted to allow Liberty to use the containing Domino server as a user registry without having to enable LDAP and, as a stretch goal, I wanted to have some sort of implicit SSO without having to configure LTPA.

So I ended up devising something of an ad-hoc directory API exposed as a servlet on Domino, which Liberty could use to make the needed queries. To pair with that, I wrote a TrustAssociationInterceptor implementation that looks for Domino auth cookies in incoming requests, makes a call to a small servlet with that cookie, and grabs the associated username. That provides only one-way SSO, but that's good enough for now.

The Easy Part

The good part was that my assumption that my comfort with Tycho going in would help was generally correct. Since the final output I wanted was a bundle, I was able to just add it to my project structure like any other, and work with it in Eclipse's PDE normally. Tycho and PDE didn't necessarily help much - I still had to track down the Liberty API plugins and make a local update site out of them, but that was old hat by this point.

What Made Development Weird

I went into the project in high spirits: the interfaces required weren't bad, and Liberty uses OSGi internally. I figured that, with my years of OSGi experience, this would be a piece of cake.

And, admittedly, it kind of was. The core concepts are the same: building with Tycho, bundle activators, MANIFEST.MF, and all that. However, Liberty's use of OSGi is, I believe, much more modern than Domino's, and certainly much less focused on Equinox specifically.

For one, though Liberty is indeed OSGi-based, it doesn't use Maven Tycho for its build process. Instead, it uses Gradle and the often-friendlier bnd tooling to handle its OSGi composition. That's not too huge of a difference, and the build process doesn't really affect the final built feature. The full differences are a whole big topic on their own, but the way they shake out for this purpose is essentially a difference in philosophy, and the different build mechanism was something of a herald of the downstream distinctions.

One big way this shows is in service registration. Coming from an Eclipse heritage, Equinox-based apps tend to use "plugin.xml" to register services, while Liberty (and most others, I assume) favors programmatic registration of services inside the bundle activator. While this does indeed work on Equinox (including on Domino), this was the first time I'd encountered it, and it took some getting used to.

The other oddity was how you encapsulate your bundle as a feature in Liberty parlance. Liberty uses the term "feature" to refer to individual components that make up the server, and which you can configure in the "server.xml" file. These are declared using files similar to MANIFEST.MF with specialized headers to declare the name of the feature, the bundles that make it up, and any APIs it provides to the server and apps. In my case, I wrote a generic mechanism to deploy these features when a server is established, which writes the manifest files to the server's feature directory. Once they're deployed, they become available to the server as a feature with the "usr" prefix, like "usr:dominoUserRegistry-1.0" for my case.
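
As a rough sketch of what one of those generated feature manifests contains - the header names here are from memory and the bundle name is hypothetical, so treat this as approximate rather than canonical:

Subsystem-SymbolicName: dominoUserRegistry-1.0; visibility:=public
Subsystem-Version: 1.0.0
Subsystem-Type: osgi.subsystem.feature
Subsystem-Content: org.openntf.openliberty.wlp.userregistry; version="[1.0.0,2.0.0)"
IBM-Feature-Version: 2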

In The Future

I have some ideas for additional features I'd like to develop - providing implicit APIs for Darwino and Jakarta NoSQL/JNoSQL would be handy, for example. This way went pretty smoothly, but I'll probably develop non-Domino ones using either Gradle or Maven with the maven-bundle-plugin. Either way, it ended up fairly pleasant once I discarded my old assumptions, and it's another good entry in the "pros" column for Liberty.

Java Grab Bag 2

Fri May 03 15:38:00 EDT 2019

Tags: java
  1. Java Hiccups
  2. Bitwise Operators
  3. Java Grab Bag 2
  4. Java Travelogue: The Care and Feeding of Locales
  5. More Notes on Filesystem and Charset Portability

Following in the vein of "Java Hiccups", I've had a couple things floating around my head lately that I think collectively make for a good post for Java developers, particularly those working in the Domino arena.

Without further ado:

Map#computeIfAbsent

This is a method that was added in Java 8 and, while it's not as big a deal as the addition of streams, it's one of my favorite additions and something I use very frequently. To give a point of reference, consider this common idiom from an imagined XPages app:

Map<String, Object> applicationScope = ExtLibUtil.getApplicationScope();
if(!applicationScope.containsKey("someVal")) {
  applicationScope.put("someVal", someExpensiveOperation());
}
String someVal = (String)applicationScope.get("someVal");

Essentially, using a Map as a cache for a complex computed value. Java 8 added the #computeIfAbsent method (alongside several similar ones) to do this in one go:

String someVal = (String)ExtLibUtil.getApplicationScope().computeIfAbsent("someVal", key -> someExpensiveOperation());

The second parameter is (usually) a lambda, like the ones used in streams, that takes the provided key as an argument and is only executed if the value does not already exist. Due to the way this was added, most implementations do pretty much the same thing as the first block of code, but you don't have to care about that. Your code gets a bit smaller, the intent is much clearer, and it's less prone to small bugs like changing the key and forgetting to change it in all three places.

Arrays Are Weird

Java's built-in array type is loosely based on C's, and that's reflected in the syntax:

int[] foo = new int[4];
foo[0] = 1;
foo[1] = 2;
foo[2] = 3;
foo[3] = 4;

Like C, they are zero-based, declared with the capacity and not the max index, and cannot be resized. Unlike C, arrays aren't just syntactical sugar on top of pointers, and this manifests immediately in bounds checking. Take this line:

foo[4] = 10;

In C, this will (famously) just write an integer 10 value into whatever memory happens to be just beyond the bounds of your array. In Java, you'll get an ArrayIndexOutOfBoundsException, saving you from the insidious bug. But, since Java arrays are (probably) implemented internally very similar to in C - they're likely contiguous blocks of memory sized to the type - they're still extremely efficient, and so they show up in a lot of speed-critical code.

As speedy and safe as they are, though, they're still pretty unfriendly. For starters, they can't be resized. When you do new int[4] (or use literal syntax like new int[] { 1, 2, 3, 4 }), you carve out that much memory and can't shrink or expand it in-place. You can change the values inside an array, just not its size. To "resize" efficiently, you have to make a new array and then use System.arraycopy to populate the new array with the contents of the old.
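
For example, a minimal sketch of "growing" a four-element array to eight:

int[] foo = new int[] { 1, 2, 3, 4 };
int[] bigger = new int[8];
System.arraycopy(foo, 0, bigger, 0, foo.length); // copy the old contents into the new array
// or, more conveniently, via java.util.Arrays:
int[] alsoBigger = Arrays.copyOf(foo, 8);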

This is all why the List interface (with its predecessor class Vector) exists: they serve the same function of "ordered collection of stuff", but allow for dynamic resizing. Because these objects are usually "efficient enough" (ArrayList uses "true" arrays under the covers) while having numerous additional benefits, you should use them as your first go-to and only use arrays if you have a reason.

That's in part because the weirdness of arrays doesn't end with their inconvenience. Array types are actually implicitly-created classes, even when they contain primitive types. So:

int foo = 3; // primitive value
foo = null; // syntax error!
int[] bar = new int[] { 3 }; // Object
bar = null; // legal!

List<int> fooList = new ArrayList<>();   // syntax error - primitives can't be in generics
List<int[]> barList = new ArrayList<>(); // legal!

Object.class == Object[].class; // false
int[].class.isArray();          // not only legal, but true

Java provides two main utility classes for working with arrays: java.util.Arrays (extremely useful for Arrays.asList) and java.lang.reflect.Array (usually only useful in edge cases).

Java Has No Library Versioning System

If you've worked with Java in Designer or Eclipse, you've likely run across this preferences pane or its per-project version:

Eclipse Java compiler settings

These settings affect two things:

  • The syntax allowed in your source (e.g. new ArrayList<>() requires 1.7 or higher)
  • The class file format version (you can think of this like an NSF ODS). You've likely seen the latter in play by receiving an UnsupportedClassVersionError trying to run Java 7 or 8 code on a pre-9.0.1FP8 Domino server.

Conspicuously absent from this short list is anything to do with classes or methods added to the runtime in newer versions. For example, the String class gained a static method String.join to conveniently concatenate strings with a given delimiter. If you're targeting an older Java version but using a newer Java library (as Designer 9.0.1FP10 does with a default target of 1.5 and JVM of 1.8), you can write a line of code using that method without issue - the syntax doesn't require anything above 1.5, so all is clear as far as the compiler is concerned. But if you then try to run that code on an older JVM (such as an older Domino server), you'll get an exception at runtime, since the method doesn't exist.

Unfortunately, the only true answer to this is to tell your IDE about a JRE for each specific Java version you're targeting, something that doesn't happen by default, and which will gradually get more difficult as Java 6 becomes harder and harder to come across.

This is one of the things that OSGi aims to fix - you could, for example, have many versions of Guava installed, and you could declare that your plugin works specifically with version 18. Then, when loading, the runtime will either bind to a matching version or give you an error that no version could be resolved. No mystery involved. Unfortunately, OSGi is a niche thing losing ground, and the module system introduced in Java 9 consciously does not address this.

In a pinch, you can use the file tool on most Unix systems to check the version of an individual class file:

$ file Foo.class
Foo.class: compiled Java class data, version 52.0 (Java 1.8)

Licensing

A little while ago, Oracle raised a bit of a stink by declaring that, as of this year, commercial use of their Java runtimes would require paid licensing. Historically, you could get support for Java for money, and certain additional components had their own licensing requirements, but it was pretty normal otherwise to install Java from java.sun.com and not give it a second thought.

This naturally caused a few questions when it comes to Domino and other Java-incorporating IBM products, and IBM released a statement that basically amounted to "you don't have to worry about it". IBM has maintained their own variant of the JVM (called J9 for Smalltalk-related reasons, not to be confused with Java 9) and anyway has always had arrangements with Sun/Oracle such that IBM's customers don't have to worry about dealing with Oracle directly.

But what about using Java outside of a licensed product, such as if you just want to run Tomcat on some server? The short answer there is that you're still fine, but you just have to know a little about the difference between "a JDK" and "Oracle's JDK". Java and the surrounding JDK have been progressively open-sourced in fits and spurts over the years, and are now at a point where the project called OpenJDK is basically the real Java environment, and then Oracle's JDK is just one implementation of it. It's similar to Linux: the core parts are open-source, and many distributions are entirely free, but there also exist commercial variants for pay.

So, if you want to run a Java stack, you can do so without putting forth a single cent by using an OpenJDK build. Oracle, IBM, and others will still be happy to take your money if you want a commercially-supported Java environment, of course.

For a longer explanation, this blog post from @javachampions is pretty much the definitive word.

 

Java With Domino After XPages

Thu Mar 14 15:10:15 EDT 2019

IBM and HCL held a webcast today to detail some plans for Notes/Domino V11. There were some interesting tidbits elaborating on things like the pub/sub support, and it'll be worth tracking down a recording of the event when it's available.

What's important for this series, though, is that this event served as the long-promised "roadmap" announcement for XPages. The roadmap is, in effect, option three: HCL plans to look into ways to reuse some existing XPages code, but in general you should be aiming to write your UIs in something else, either consuming REST services from an XPages container or accessing Domino data via another route (like the domino-db Node.js module and hypothetical Java gRPC client).

So we know the end of the path: not XPages. However, it's not like we're all just going to throw away our existing apps, so there's work to do determining how we're going to get there. The options remain pretty much what they were after CollabSphere last year, albeit now with the doubt removed. The first two options - returning to LotusScript or going to Node - have their advantages and disadvantages, and you could make a reasonable case for either. Personally, I'm not interested in going down those roads, though, and I think it's better for any app of reasonable complexity to dive into Java. Other members of the community and I have developed tools over the years to make it easier, and now's the time to take some of these steps if you haven't already.

Do Not Use Server JavaScript

Server JavaScript was always something of a trap for app architecture. There's nothing inherently wrong with having a scripting language on your UI pages, and it certainly helped bridge some gaps, but the way it and Designer intertwined encouraged developers to create non-portable messes. If you're still writing SSJS, stop immediately.

Learn Proper Java

Java has been around for a long time, and the way to write "good" Java code has changed over time and varies greatly by your environment. Some aspects, though, apply generally, and it's useful to stay up-to-date on current practices. I don't know a better resource for this than Effective Java, which has been updated for Java 7-9 since I last read it.

Speaking of which, you should learn about Java 8 streams and lambdas - they're great. Julian Robichaux did a presentation on this topic back at Connect 2017, and the slide deck is very elucidating.

Adopt Standard Java Technologies

Last year, I created a project to bring some modern JEE technologies to XPages. These are some of the same technologies I've been talking about in my "XPages to Java EE" series and, while that project can't bring the full JEE development experience to XPages, using those tools will help you write code that, in some cases, could be directly dropped into a Java EE app with no modifications at all. There's a big asterisk when it comes to actually accessing Domino data, but that's a solvable problem as well (with some more development).

In particular, you should start writing JAX-RS services. Not only is JAX-RS an excellent and very-capable spec, but REST services are portable to absolutely any front end.
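
As a minimal sketch of what a JAX-RS resource class looks like (the path and class names here are just illustrative, using the javax.ws.rs annotations of that era):

@Path("/greeting")
public class GreetingResource {
  @GET
  @Produces(MediaType.APPLICATION_JSON)
  public String get() {
    return "{\"message\":\"hello\"}";
  }
}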

Adopt Automated Builds

Maven has been something of a bugaboo for XPages developers for a while, but doesn't have to be. Node development (server- or client-side) revolves around npm and various build plugins, and Maven is much the same thing. One of the biggest improvements I've made lately to all of my active XPages apps is to wrap the on-disk project for them inside a Maven artifact, using the NSF ODP Tooling. That project allows you to automatically build your NSFs alongside other parts of the project (such as OSGi plugins) without having Designer involved.

Check the example project in that repo, and stay tuned for a 2.0 release (probably) imminently.

Learn Other Toolkits

If you're just starting the process of figuring out what to do after XPages, it doesn't particularly matter which other toolkit you learn, as long as it's reasonably modern. If you take some time to learn how to make, say, a React app but end up going with something else down the line, the lessons you learn will apply very closely. A particularly-comfortable option could be to learn JSF, which has a common ancestry with XPages but has up-to-date capabilities.

Whatever it is, though, just learn some other toolkit.

Follow Channels and Accounts for Other Tech

Over the last couple months, I've started following a lot of Jakarta-related blogs and Twitter luminaries. This applies elsewhere - even if you're not using other toolkits yet, it's very helpful to start immersing yourself in the news and culture.

Don't Stay Still

The primary thing to take to heart is the importance of doing something. Unless you're planning to change careers or retire in the short term, you'll have to make one decision or another. XPages is not going to get meaningfully better, and even existing apps will get worse with time as browsers and technology change.

Other environments, though, are already leagues ahead and are constantly improving. Dive in; the water's fine!

Java Hiccups

Wed Nov 07 14:01:00 EST 2018

Tags: java
  1. Java Hiccups
  2. Bitwise Operators
  3. Java Grab Bag 2
  4. Java Travelogue: The Care and Feeding of Locales
  5. More Notes on Filesystem and Charset Portability

To take a break from the doom-and-gloom of my last post, I figured it'd be good to dust off a post idea I've had in my drafts for a while: common hiccups that Java developers - particularly those coming from a Domino background - run into. This is sort of a grab bag of non-obvious concepts that are easy to assume incorrectly about, whether because of the way other languages work or the behavior of the lotus.domino API specifically.

So, roughly in order of complexity:

import Is Just For Cleanliness

In many languages, in particular C/C++/Objective-C, the natural equivalent of Java's import statement has a massive effect, physically grafting files into your source. In Java, though, import is really just for developer convenience. At runtime, there's no difference between having written this:

import java.util.ArrayList;
import java.util.List;

/* snip */
List<String> foo = new ArrayList<>();

...or this:

java.util.List<java.lang.String> foo = new java.util.ArrayList<>();

If you import a class but never use it in a file, it won't have any effect on the runtime behavior of the class. It's just used by the compiler to clarify what you mean when you use a bare class name without its package.

Incidentally, as seen here, classes within the java.lang package (but not subpackages) are auto-imported, so it's as if each Java file has an invisible import java.lang.* at the top.

Compilation Doesn't Bake In Libraries

This is related, and is also an area where Java differs from some other environments. With C et al, you have the option to statically link referenced external libraries - which is to say, grab their contents at build time and put them into your compiled result such that they may as well be part of your program. Java doesn't do this: every time you reference a class or method, it's really just storing the equivalent of the string name of that class, which is then resolved at runtime.

This is why it's very easy to run into a ClassNotFoundException: you can compile code with some classes present on the class path, but then run it in a system where they're not present. The Java runtime doesn't pre-check whether all the required classes are available when it starts running, so you only find out when it hits that line of code.

Different Java-based environments deal with this in different ways. Standard (non-Domino) web apps deal with this by including dependency jars inside the WEB-INF/lib folder during the packaging phase of building. OSGi, XPages's framework of choice, has a whole dependency mechanism where you can specify bundles or packages with version ranges, in the hope of bringing some order to the chaos, with mixed success.

Primitives Are A Thing

Though the term "primitive" means a built-in data type generally, here I'm specifically talking about things like int and double. For historical and performance reasons, Java has a conceptual and practical distinction between the objects that you deal with most of the time and the primitive types used mostly for number storage. Namely: byte, char, short, int, long, float, double, and boolean. Unlike object references, there is no concept of a "null" value with these that would cause a NullPointerException. Referring to a variable with one of these types will always contain some value if the code compiles, even if it's just the 0/false default for an object property when not otherwise initialized.

Each of these types has a corresponding "boxed" object version, generally with the un-abbreviated name capitalized, such as Byte and Integer.

The distinction used to be harsher than it is today, thanks to autoboxing. Autoboxing is a compile-time behavior that will automatically convert between the primitive types and their object holders as necessary, allowing this type of code, which would otherwise be illegal:

Object i = 3;
int j = new Integer(4);

Autoboxing is mildly inefficient, so it's good to know that it exists, but you don't normally need to lose sleep over it.

All Object Variables Are Pointers

In a language like C++, there is a distinction between a variable that "is" an object vs. one that is a pointer to an object somewhere in memory. In Java, however, the former doesn't exist: an object variable is only ever a pointer. This has a couple implications. For one, this code deals with only one object:

SomeClass foo = new SomeClass();
SomeClass bar = foo;
bar.setName("hi");
foo.setName("hello");
bar.getName(); // Will be "hello"

This is also why Java is picky about not referencing object variables until they've been initialized to at least something, so this generates a compile-time error:

Object foo;
foo.getClass();

Unfortunately, unlike some languages, Java has no language-level support for enforcing the distinction between a null object reference and a non-null one, which is why NullPointerExceptions are so prevalent.

Another implication of this leads into its own hiccup common to LotusScript programmers:

Strictly Speaking, All Method Arguments Are "By Value", But...

All method parameters in Java are "by value" in the LotusScript sense, but the fact that all object variables are pointers means that the "value" you're passing to the method for an object parameter is always a reference. Java has no mechanism to pass a reference to a primitive type, nor does it have a mechanism to implicitly duplicate an object when passing it to a method.

Not only is this a bit conceptually confusing at first, but it's also a potential trap for bad programming practices. It's very easy to write a method that performs modifications on objects passed in as parameters, and this is often the right thing to do. However, since the language doesn't have any syntax mechanism for broadcasting this behavior, it's up to you as the programmer to either write the method name in such a way that it's obvious what's going to happen or clearly state it in the documentation if it's something that's going to be used outside the current file.
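
To illustrate the distinction (SomeClass is the same hypothetical class as above):

void rename(SomeClass obj) {
  obj.setName("renamed"); // visible to the caller: it's the same object
  obj = new SomeClass();  // invisible to the caller: only the local reference changes
}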

Casting Objects Doesn't Do Anything

By "casting", I'm referring to something like this:

RichTextItem body = (RichTextItem)doc.getFirstItem("SomeItem");

The (RichTextItem) is a cast, and it's yet another area that diverges from some other languages. What casting an object in Java means is that you're going to refer to an object by a different class or interface than the one it's been previously referred to as. It has some runtime implications, but the thing to keep in mind is that it's about choosing the name for something that exists as opposed to changing an object into a different class.

So, for example, the RichTextItem idiom above exists because the Document#getFirstItem method is declared as returning an Item, but, when the item in the document is rich text, it will actually return a RichTextItem object. RichTextItem is a subclass of Item, and so it's legal to refer to such an object either as RichTextItem, Item, Base (the common interface for all Notes objects), or Object (the common superclass of all objects). In a situation like this, you have to cast the object because you're going to refer to it as a more-specific type of object than the one the method says it returns.

If you do this and the object is not actually of the type you're trying to cast it to (in this case, if it's a plain text item or MIME, most commonly), you'll end up with a ClassCastException, because the cast is enforced at runtime. But, success or fail, the cast will not actually affect the object itself in any way - it will continue on being whatever it was already, regardless of name.
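
If you're not sure what you'll get back, the defensive version is a quick instanceof check - a minimal sketch:

Item item = doc.getFirstItem("SomeItem");
if(item instanceof RichTextItem) {
  RichTextItem body = (RichTextItem)item;
  // safe to use rich-text-specific methods here
}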

Casting Primitives Does Do Something

For better or for worse, performing a cast on a primitive type does have the possibility of creating a new value. For example:

int foo = Integer.MAX_VALUE;
short bar = (short)foo;

Because int can hold more data than a short, this case creates a new value based on chopping off the highest-value bits of the internal binary representation of foo. (As a side note, because of the fun way computers deal with numbers, foo is 2147483647, while bar is -1.)

Normally, this behavior doesn't matter too much, since, if you have a method that takes, say, an int and you have a long, you can safely cast it down since it'll likely be a tiny value anyway. It's important to know that it can happen, though, and this behavior is very important when, for example, dealing with native C libraries that use unsigned values, which do not exist in Java as such.

Java Has Only A Limited Concept of Immutability

"Immutability" refers to the inability to change the value of an entity once it's created. It's come to the fore as a concept recently because working with immutable objects sidesteps a lot of issues with asynchronous programming. Java, unfortunately, doesn't really have any language-level support for immutable objects in the sense that, for example, Swift does.

Java throws a bit of a curve ball in this area with the final keyword, which means that a variable can't be reassigned after being first initialized. This means that you can't do things like this:

final int foo = 3;
foo = 4; // compiler error

final SomeClass bar = new SomeClass();
bar = new SomeClass(); // compiler error

This, on the other hand, is entirely legal:

final SomeClass foo = new SomeClass();
foo.setName("hi");
foo.setName("hello");

This is because the only thing blocked from changing here is the value of foo-the-reference, but the object it's referencing can be changed at will.

An object can be made effectively immutable, though, by means of making its outward-facing methods not change any of the internal state. This is used commonly for "value" classes, such as the aforementioned Integer. Though the language doesn't do anything to guarantee that the Integer class doesn't allow mutation, the class is written in such a way that it has no inlet for it.

Because of the value of immutable objects, they're used commonly in the core Java classes and in third-party libraries, particularly newer ones. However, since the language can't tell you if an object is immutable, you have to be on the lookout for whether a given method modifies the existing object in-place or returns a new object reflecting the change. This comes up frequently with Strings, which are immutable in Java. This is something I've seen commonly:

String foo = " hello ";
foo.trim();
System.out.println(foo);

That code will print " hello ", with the leading and trailing spaces (though without the quotes). This is because the String#trim method, like all "changing" methods on String, leaves the original value intact but returns a new String object reflecting the expected value.
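
The fix is simply to capture the returned value:

String foo = " hello ";
foo = foo.trim(); // reassign to pick up the new, trimmed String
System.out.println(foo); // prints "hello"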

This is just something you have to be on the lookout for, especially since this pattern isn't even consistently applied within the core Java classes. The Date class, for example, is infamously bad in a lot of ways, and one of those ways is that it has mutation methods.

Generics In Java Are Weird

A "generic" refers in this case to a class that is declared as being associated with one or more other types that can be defined after the fact. The prototypical example of this is a collection class, like List<String> foo. In this case, the List interface is generic and lets you specify the type of object you expect to find within it, in this case String.

Generics, unfortunately, were added after Java's initial release, and they bear the marks of it. Unlike languages like C++, Java generics are largely syntactic sugar, meant to replace things like:

String someString = (String)aListIKnowHasStrings.get(0);

...with this:

String someString = aListDeclaredWithStrings.get(0);

However, under the covers, a List only ever really knows it contains Objects, and the second form just transparently shims in a (String) cast at runtime. That's why you can do something like this:

((List<Object>)(List<?>)aListDeclaredWithStrings).add(new NotAStringObject());

That line will not only compile, but it will execute without issue at runtime. It's only later, when you try to extract the value to a String variable, that you'll hit a ClassCastException.

Some generic information is retained at runtime, depending on how it's used, but for the most part it's best to think of it as just a syntax nicety. This behavior is endless trouble, but something we have to live with.

Garbage Collection Is Automatic, But Resource Management Isn't Necessarily

This is one of the main things that bites Domino developers as they learn about Java. One of the early things they learn when switching from LotusScript to Java is that now you have to worry about the .recycle() method on your objects, or else you'll have trouble. This leads to two misapprehensions: that Java in general requires "recycling" for every object, and that recycling with Domino objects is about memory in the same way that a Java OutOfMemoryError is.

Unfortunately, the reason that recycle() exists at all requires delving into some nitty-gritty aspects of the Java environment, but I first want to reinforce that Java uses automatic garbage collection at all times to watch for and delete objects that are no longer used. That "no longer used" bit glosses over a bit, but take this as an example:

public String foo() {
  String a = "hello";
  String b = " there";
  return a + b;
}
public void bar() {
  String message = foo();
  System.out.println(message);
}

There are three objects in action here, but, by the time the code reaches the System.out.println line, a and b are no longer used and will be slated for automatic garbage collection. You as a programmer do not need to worry about them.

The lotus.domino objects, though, are trouble. I think it's best to not think of recycle() in terms of "memory" but instead think of the objects as "open resources", in the same way that you might open a network connection or a stream to a file on the filesystem. Unlike object memory, network resources are not necessarily automatically closed by Java - there are some affordances with syntax and the concept of "finalizers", but, in general, the responsibility for closing a resource lies with the programmer.

There are a few grab-bag notes to do with these objects:

  • Different lotus.domino objects refer to different kind of backing resources, which is why problems will sometimes manifest as complaints about memory (when they refer primarily to C-side structures in Domino's native memory, separate from Java) and sometimes as complaints about handles (database and document references, generally)
  • Recycling a "parent" object recycles all of its children, but the relationship is not always clear. Importantly, DateTime objects are children of the ancestor Session, even when you retrieve them from a Document, and so they can linger for a long time
  • Agents and XPages both mitigate and conceal the need for recycling by automatically closing the auto-generated Session(s) at the end of the agent execution or page request. In practice, you only really need to worry about recycling if you're, for example, looping over a large view (as in the sketch after this list)
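
For that view-looping case, the usual pattern is a sketch like this (assuming you already have a View object in hand):

ViewEntryCollection entries = view.getAllEntries();
ViewEntry entry = entries.getFirstEntry();
while(entry != null) {
  // ...do work with the entry...
  ViewEntry next = entries.getNextEntry(entry);
  entry.recycle(); // release the backing handle before moving on
  entry = next;
}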

I may make another one of these posts in the future, and hopefully this goes a little way to clearing up some common misconceptions.

AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10

Fri Oct 19 11:48:33 EDT 2018

  1. AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Domino 11's Java Switch Fallout
  3. fontconfig, Java, and Domino 11
  4. Notes/Domino 12.0.2 Fallout
  5. Notes/Domino 14 Fallout

Since 9.0.1 FP 10, and including V10 because it's largely identical for this purpose, I've encountered and seen others encountering a couple strange problems with compiling XPages projects. This is a perfect opportunity for me to spin a tale about the undergirding frameworks, but I'll start out with the immediate symptoms and their fixes.

The Symptoms

There are three broad categories of problems I've seen:

  • "AbstractCompiledPage cannot be resolved to a type"
  • Missing third-party XPages libraries, such as ODA, resulting in messages like "The import org.openntf cannot be resolved"
  • Complaints about MANIFEST.MF, like "MANIFEST.MF has no main section" and others

The first two are usually directly related and have the same fix (though the second can also have other causes), and the last one is entirely distinct.

Fix #1: The Target Platform

The first two are based on problems in the active Target Platform, namely one or both of the standard platform components going missing. The upshot is that you want your Target Platform preferences to look something like this:

Working Target Platform

There should be a selected platform (the name doesn't matter, but "Running Platform" is the default name) with entries at least for ${eclipse_home} and for a directory inside your Notes data dir, here C:\Notes\Data\workspace\applications\eclipse. If they're missing, modify an existing platform or create a new one and add an "Installation"-type entry for ${eclipse_home} and a "Directory"-type one for the eclipse directory within your data dir.

Fix #2: Broken Plugins, Particularly ODA

Though V10 didn't change much when it comes to XPages, there are a few small differences. One in particular bit ODA: we had a dependency on the com.ibm.domino.commons plugin, which was in the standard Notes environment previously but is not as of V10 (though it's still present on the server). We fixed that one in the V10 release, and so you should update your ODA version if you hit this trouble. I don't think I've seen other plugins with this issue in the V10 transition, but it's a possibility if Fix #1 doesn't do it.

Fix #3: MANIFEST.MF

This one barely qualifies as a "fix", but it worked for me: if you see Designer complaining about MANIFEST.MF, you can usually beat it into submission by cleaning/rebuilding the project in question. The trouble is that Designer is, for some reason, skipping a step of the XPages compilation process, and cleaning usually kicks it into gear.

I've also seen others have success by deleting the error entry in the Errors view (which is actually a thing that you can do) and never seeing it again. I suspect that the real fix here is the same as above: during the next build, Designer creates the file properly and it goes away on its own.

The Causes

So what are the sources of these problems? The root reason is that Designer is a sprawling mountain of code, built on ancient frameworks and maintained by a diminished development team, but the immediate causes have to do with OSGi.

The first type of trouble - the target platform - most likely has to do with a change in the way Eclipse manages target platforms (look at the same prefs screen in 9.0.1 stock and you'll see it's quite different), and I suspect that there's a bug in the code that migrates between the two formats, possibly due to the dramatic age difference in the underlying Eclipse versions.

The second type of trouble - the MANIFEST.MF - is due to a behind-the-scenes switch in how Designer (and maybe the server) handles dependencies in XPages projects.

Target Platforms

The mechanism that OSGi projects - such as XPages applications - use for determining their dependencies at build time is the notion of a "Target Platform". The "target" refers to the notion that this is the platform that is expected to be available at runtime for what you're building - loosely equivalent to a basic Java classpath. An OSGi project is checked against this Target Platform to determine which classes are available based on their bundle names and versions.

This is distinct from the related concept of a "Running Platform". Designer, being based on Eclipse, is itself built on and runs using OSGi. Internally, it uses the same mechanisms that an XPages application does to determine what plugins it knows about and what services those plugins provide.

This distinction has historically been hidden from XPages developers due to the way the default Target Platform is set up, pointing at the same Running Platform it's using. So Designer itself has the core XPages plugins running, and it also exposes them to XPages applications as the Target. Similarly, the way we install XPages Libraries like ODA is to install them outright into the Designer Running Platform. This allows Designer to know about the library service provided, which it uses to populate the list of available plugins in the Xsp Properties editors.

However, as our trouble demonstrates, they're not inherently the same thing. In standalone OSGi development in Eclipse, it's often useful to have a Target Platform distinct from the Running Platform - such as the XPages environment for plugins - to ensure that you only depend on plugins that will be available at runtime. But when the two diverge in Designer, you end up with situations like this, where Designer-the-application knows about the XPages runtime and plugins and constructs an XPages project and translates XSP to Java using them, but then the compilation process with its empty Target Platform has no idea how to actually compile the generated code.

MANIFEST.MF

I've mentioned that an OSGi project "determines its dependencies" out of the Target Platform, but didn't mention the way it does that. The specific mechanism has changed over time (which is the source of our trouble), but the idea is that, in addition to the Java classes and resources, an OSGi bundle (or plugin) has a file that declares the names of the plugins it needs, including potentially a version range. So, for example, a plugin might say "I need org.apache.httpcomponents.httpclient at least version 4.5, but not 5.0 or higher". The compiler uses the Target Platform to find a matching plugin to compile the code, and the runtime environment (Domino in our case) does the same with its Running Platform when loading.
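
In MANIFEST.MF terms (more on that file below), that sort of declaration shows up as a Require-Bundle header with a version range, roughly like this - the plugin name is just the example from above:

Require-Bundle: org.apache.httpcomponents.httpclient;bundle-version="[4.5.0,5.0.0)"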

(Side note: you can also specify Java packages to include from any plugin instead of specific plugin names, but Designer does not do that, so it's not important for this purpose.)

(Other side note: this distinction comes, I believe, from Eclipse's switch from its own mechanism to OSGi in its 3.0 release, but I use "OSGi" to cover the general concept here.)

The old way to do this was in a file called "plugin.xml". If you look inside any XPages application in Package Explorer, you'll see this file and the contents will look something like this:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.0"?>
<plugin class="plugin.Activator"
  id="Galatea2dVCC_2fIKSG_dev5csyncagent_nsf" name="Domino Designer"
  provider="TODO" version="1.0.0">
  <requires>
    <!--AUTOGEN-START-BUILDER: Automatically generated by null. Do not modify.-->
    <import plugin="org.eclipse.ui"/>
    <import plugin="org.eclipse.core.runtime"/>
    <import optional="true" plugin="com.ibm.commons"/>
    <import optional="true" plugin="com.ibm.commons.xml"/>
    <import optional="true" plugin="com.ibm.commons.vfs"/>
    <import optional="true" plugin="com.ibm.jscript"/>
    <import optional="true" plugin="com.ibm.designer.runtime.directory"/>
    <import optional="true" plugin="com.ibm.designer.runtime"/>
    <import optional="true" plugin="com.ibm.xsp.core"/>
    <import optional="true" plugin="com.ibm.xsp.extsn"/>
    <import optional="true" plugin="com.ibm.xsp.designer"/>
    <import optional="true" plugin="com.ibm.xsp.domino"/>
    <import optional="true" plugin="com.ibm.notes.java.api"/>
    <import optional="true" plugin="com.ibm.xsp.rcp"/>
    <import optional="true" plugin="org.openntf.domino.xsp"/>
    <!--AUTOGEN-END-BUILDER: End of automatically generated section-->
  </requires>
</plugin>

You can see it here declaring a name for the pseudo-plugin that "is" the XPages application (oddly, "Domino Designer"), a couple other metadata bits, and, most importantly, the list of required plugins. This is the list that Designer historically (and maybe still; it's not clear) uses to populate the "Plug-in Dependencies" section in the Package Explorer view. It trawls through the Target Platform, finds a matching version of each of the named plugins (the latest version, since these have no specified ranges), adds it to the list, and recursively does the same for any re-exported dependencies of those plugins. "Re-exported" isn't exposed here as a concept, but it is a distinction in normal OSGi plugins.

Designer derives its starting points here from implicit required libraries in XPages applications (all those "org.eclipse" and "com.ibm" ones above) as well as through the special mechanism of XspLibrary extension contributions from plugins installed in the Running Platform. This is why a plugin like ODA has to be installed in Designer itself: in the runtime, it asks its plugins if they have any XspLibrary classes and uses those to determine the third-party plugin to load. Here, ODA declares that its library needs org.openntf.domino.xsp, so Designer adds that and its re-exported dependencies to the Plug-in Dependencies group.

With its switch to OSGi in the 3.x series circa 2005, most of the functionality of plugin.xml moved to a file called "META-INF/MANIFEST.MF". This starkly-named file is a standard part of Java, and OSGi extends it to include bundle/plugin metadata and dependency declarations. As of 9.0.1 FP10, Designer also generates one of these (or is supposed to) when assembling the XPages project. For the same project, it looks like this:

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Domino Designer
Bundle-SymbolicName: Galatea2dVCC_2fIKSG_dev5csyncagent_nsf;singleton:=true
Bundle-Version: 1.0.0
Bundle-Vendor: TODO
Require-Bundle: org.eclipse.ui,
  org.eclipse.core.runtime,
  com.ibm.commons,
  com.ibm.commons.xml,
  com.ibm.commons.vfs,
  com.ibm.jscript,
  com.ibm.designer.runtime.directory,
  com.ibm.designer.runtime,
  com.ibm.xsp.core,
  com.ibm.xsp.extsn,
  com.ibm.xsp.designer,
  com.ibm.xsp.domino,
  com.ibm.notes.java.api,
  com.ibm.xsp.rcp,
  org.openntf.domino.xsp
Eclipse-LazyStart: false

You can see much of the same information (though oddly not the Activator class) here, switched to the new format. This matches what you'll work with in normal OSGi plugins. For Eclipse/Equinox-targeted plugins, like XPages libraries, plugin.xml still exists, but it's reduced to just declaring extension points and contributions, and no longer includes dependency or name information.

Eclipse had moved to full OSGi by the time of Designer's pre-FP10 basis (2008's 3.4 Ganymede), but XPages's history goes back further, so I guess that the old-style Eclipse plugin.xml route is a relic of that. For a good while, Eclipse worked with the older-style plugins without batting an eye. FP10 brought a move to 2016's Eclipse 4.6 Neon, though, and I'm guessing that Eclipse dropped the backwards compatibility somewhere in the intervening eight years, so the XPages build process had to be adapted to generate both the older plugin.xml files for backwards compatibility as well as the newer MANIFEST.MF files.

I can't tell what the cause is, but, sometimes, Designer fails to populate the contents of this file. It might have something to do with the order of the builders in the internal Eclipse project file or some inner exception that manifests as an incomplete build. Regardless, doing a project clean and rebuild usually jogs Designer into doing its job.

Conclusion

The mix of layering a virtual Eclipse project over an NSF, the intricacies of OSGi, and IBM's general desire to insulate XPages developers from the black magic behind the scenes leads to any number of opportunities for bugs like these to crop up. Honestly, it's impressive that the whole thing holds together as well as it does. Even though it doesn't seem like it to look at the user-visible changes, the framework changes in FP10 were massive, and it's not at all surprising that things like this would crop up. It's just a little unfortunate that the fixes are in no way obvious unless you've been stewing in this stuff for years.

App Dev After CollabSphere 2018

Sun Jul 29 10:48:41 EDT 2018

In recent years, MWLUG/CollabSphere has tended to be a good time to get the lay of the land for what IBM - and now HCL - intends for their app dev strategy. Recent Connects weren’t too heavy on announcements of major import for Domino developers, and any news to come out tends to do so in the months leading up to summer.

This year, we’ve had time to digest the implications of the HCL transfer, get a feel for how they intend to handle the product, and generally get a good bead on their app-dev vision. What they’ve said so far this year is clear: LotusScript for old apps on mobile platforms and Node.js for new development (or new developers). As far as XPages, I believe that the most time that it got at the conference was in my session, which was about what to do after XPages.

LotusScript

Though I’ve certainly not hidden how painful the prospect of enhancements to LotusScript is to me, I have to admit that adding a few capabilities for REST data service access makes strategic sense for the platform. Though XPages made a significant mark on Domino app dev, it never pushed aside the classic style, and every move that IBM made for app modernization since then seemed to exist exclusively in the span of the sentence announcing it.

So HCL announced early this year that they planned to port the classic Notes client first to iOS and then later to Android and WebGL+WebAssembly. Adding any kind of Java to this plan - XPages, LS2J, etc. - would present some technical hurdles, and so it makes workload sense to focus on the languages that have runtimes in the C core.

Apps run this way won’t be good, but there’s some logic to the tack of targeting customers for whom “modernization” only really means “we want our same old apps to run offline on new OSes”. Their plan to run on phones also necessitates some more-dramatic changes to the tooling, so it’s possible that they have larger changes in mind - or at least we’ll see a return of the “hide on mobile” checkboxes in Designer.

Node.js

The big HCL push for Node.js seems to me to be a way to get a lot of bang for the buck: by positioning it as the new way to write apps, they’re both (potentially) making Domino more appealing to those not already on the platform and guiding existing developers to a platform for which IBM and HCL are not responsible. Though the domino-db driver is no small technical feat - and it looks like they’ve done a good job making it both fast and native-feeling in Node - it’s a much, much smaller footprint than XPages, which put IBM on the hook for maintaining an entire app-dev stack and UI toolkit with limited outside assistance.

I do think that it’s smart to write a Node.js DB driver - even if it doesn’t bring in an influx of new blood, it provides a legitimate app-dev story and Node is a top-notch platform. The gRPC stack also provides an entryway for future hooks and development without the assumptions of NRPC.

Java

Java development on Domino is in a weird place. Domino 10 doesn’t have anything directly for XPages/OSGi developers, though we’ll get access to DGQF via the Database class. I’ve heard whispers that they’re starting to plan more for Domino 11, but that’s largely conjecture at this point. Certainly, HCL has made it clear that their heart isn’t in it, and honestly I get why. Since XPages has been in essentially maintenance mode since 9.0.1 or earlier, it’s aged itself out of contention for modern app dev. It wouldn’t be impossible to drag it forward to something respectable, but then they’d still have another development environment exclusive to Domino to maintain.

I’m not sure what the best thing to do with the stack is. Though XPages didn’t bring all Domino developers to it, it did bring a significant chunk, and a lot of people have spent upwards of a decade of their life with the toolkit. For my part, I think it makes a lot of sense to move to “normal” Java/Jakarta EE development, which provides the possibility of salvaging Java-side code, though it leaves XSP and SSJS in the lurch. It’s hard to make a good financial case for either significantly upgrading the platform or at least undoing the tight coupling with the Domino server that it accrued over the years, though I’ll admit it’s sort of fun to think about.

Reforming the Blog in Darwino, Part 4

Fri Jul 20 18:59:51 EDT 2018

Tags: java darwino
  1. Reforming the Blog in Darwino, Part 1
  2. Cramming Rails Into A Maven Tree
  3. Reforming the Blog in Darwino, Part 2
  4. Reforming the Blog in Darwino, Part 3
  5. Reforming the Blog in Darwino, Part 4

Last time, I went over my switch in tack for how I'm making the new version of my blog, and my overall focus on picking an interesting stack of JEE technologies. In this post, I'm going to start diving into the implementation of the UI, though I think that it will make sense to dedicate two posts to it.

The biggest decision I made with the UI side of this app is that I didn't want to make a client-side JS app. There's a reason they're so ascendant, and I find development with React or Stencil pretty enjoyable, but I wanted to go a different route here for a few reasons:

  • For a blog, a CSJS app is wildly overkill, and, in fact, would require extra work to fulfill one of the basic requirements of a blog, which is being web-crawler friendly.
  • I want to see how svelte I can make the client payload.
  • Skipping a JS framework (and a CSS one) is a great way to brush up on what plain HTML and CSS are capable of nowadays.
  • Unlike a typical Darwino app, my only target is a full-on Java web server, so I'm not held back on the Java side by the capabilities, say, of Dalvik on Android 4.
  • Part of me misses the simplicity of my early PHP days, albeit not the language.

The Java Side

I decided to go with the MVC 1.0 draft spec because it lets me write extremely focused code. Here is the controller for the home page:

package controller;

import javax.inject.Inject;
import javax.mvc.Models;
import javax.mvc.annotation.Controller;
import javax.ws.rs.GET;
import javax.ws.rs.Path;

import model.PostRepository;

@Path("/")
@Controller
public class HomeController {
	@Inject
	Models models;
	
	@Inject
	PostRepository posts;
	
	@GET
	public String get() {
		models.put("posts", posts.homeList());
		
		return "home.jsp";
	}
}

Naturally, there's a lot of magic going on behind the scenes - there's tons of heavy lifting going on here by JAX-RS, MVC, CDI, JNoSQL, and Darwino - but that's the point. All the other components are off doing their jobs in their areas, while the code that provides the UI doesn't have to care about the specifics.

Things can get more complicated on the pages that actually have some functionality to them, but the code remains pleasantly focused. Take the handler for deleting posts:

@DELETE
@Path("{postId}")
@RolesAllowed("admin")
public String delete(@PathParam("postId") String postId) {
	Post post = posts.findByPostId(postId).orElseThrow(() -> new IllegalArgumentException("Unable to find post matching ID " + postId));
	posts.deleteById(post.getId());
	return "redirect:posts";
}

This adds another level of magic in the form of javax.annotation.security.RolesAllowed, but it's more of the good kind: even with no knowledge of the underlying frameworks, it's pretty clear what every bit of code is doing here. It's a refreshing bit of that Rails simplicity, just more compile-time-safe and much uglier.

Even beyond the minimal code is the cleanliness that this brings to the structure of the application: other than the img, css, and js paths, all of the routing within the application is handled by JAX-RS and MVC. It's not beholden to the folder structure in the project or to a Domino-style implicit app router.

JSP

JSP has been the prototypical Java HTML language for about 20 years, and it's had a rough upbringing. The early versions committed the PHP/XPages sin of encouraging you to put business logic right on the page, and it even still has the typical Java problem that it's tricky to find advice about using it that uses technologies added since 2005.

Still, when used properly, it can be a nice, clean templating language. Again from the main home page:

<%@page contentType="text/html" pageEncoding="UTF-8"%>
<%@taglib prefix="t" tagdir="/WEB-INF/tags" %>
<%@ taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core" %>
<t:layout>
	<c:forEach items="${posts}" var="post">
		<t:post value="${post}"/>
	</c:forEach>
</t:layout>

For an XPages developer, this is extremely comfortable. It's also very refreshingly elemental: there's no server-side persistence of the page, so everything is "load-time bound" and, with just HTML tags and core JSTL tags, nothing ends up on the page that you don't explicitly put there.

Ozark, the MVC implementation, also supports using JSF "Facelets" for the view portion, but JSP suits the task just fine.

HTML + CSS

It'd been far too long since I last really sat down and looked at what baseline HTML and CSS are like - in particular, I'd watched the rise of CSS Flexbox and Grid from afar, and never gave them a shot. Using components that generate their own HTML and pre-existing CSS frameworks to target with class names is all well and good, but it does leave you a bit disconnected from the fundamentals.

So, for this iteration, I tossed aside the very-nice Bootstrap framework I've been using, dusted off one of my old hand-built ones, and got to translating it into CSS Grid. This cut down on the page size enormously: I had already eschewed Dojo by not using XPages, but this now also meant that I could ditch the core bootstrap.css, jQuery, and any jQuery plugins.

Beyond CSS Grid, have you seen how nice HTML forms are nowadays? Just looking at this post reveals how much is built in in the way of validation and different input types, even before you write a line of JavaScript.
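
As a generic illustration (not markup from this site), the browser will enforce all of this without a single line of script:

<form action="/posts" method="post">
	<input type="text" name="title" required maxlength="255">
	<input type="email" name="authorEmail" required>
	<input type="url" name="homePage" placeholder="https://example.com">
	<input type="date" name="posted">
	<button type="submit">Save</button>
</form>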

Turbolinks

Having such a trimmed-down UI means that pages already load extremely quickly, but I figured this was also a perfect chance to try out a bit of clever tech from the team at Basecamp: Turbolinks. Turbolinks is a JS file that you bring into your app which then transparently takes over your in-app links to minimize the amount of rendering you have to do. Since the surrounding boilerplate of the app usually doesn't change between requests, it can figure out the "diff" between old and new and just replace the body. It's essentially like partial refreshes without the server knowing anything about it.
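
Bringing it in is about as simple as it sounds - essentially just a script reference in the shared layout, something like this (assuming the Turbolinks file is copied into the app's js path; the exact location and filename are up to you):

<%-- Turbolinks intercepts same-app link clicks and swaps in the new body via XHR --%>
<script src="${pageContext.request.contextPath}/js/turbolinks.js" defer></script>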

It's still technically inefficient to have the server render and transfer surrounding page elements that are just going to be discarded anyway. But, on the other hand, skipping that means that I don't have to write JavaScript handlers myself, use a full CSJS app framework, or keep state on the server side. The server just keeps doing what it does with a fully context-less request and the browser sorts it out. Basecamp's programmers are masters at the targeted deployment of kludges for maximum benefit.


In the next (final?) post in the series, I'll finish up with my "low-JS" experience and other lessons learned from this project.

Reforming the Blog in Darwino, Part 3

Wed Jul 18 10:30:22 EDT 2018

Tags: java
  1. Reforming the Blog in Darwino, Part 1
  2. Cramming Rails Into A Maven Tree
  3. Reforming the Blog in Darwino, Part 2
  4. Reforming the Blog in Darwino, Part 3
  5. Reforming the Blog in Darwino, Part 4

A good while back, I created a project structure for reforming my blog here in Darwino, but, as happens with low-priority side projects, it withered on the vine, untouched since then. Beyond just the "cobbler's children" aspect to it, I also lost steam due to a couple technology paths I initially headed down.

The first was basing the UI on Angular, which I've never really enjoyed working with. I'm sure I could have ended up with a decent result with it, but Angular always rubbed me the wrong way. And not just Angular: for a dead-simple UI like this, a full JS UI is just weird overkill.

The second was off in the other direction: I initially tried cramming a Rails app in the tree, which could be made to work, but it introduced so many weird edge cases outside of the problem at hand. That alone isn't the end of the world, but not much of what I'd have to solve to make that work would be transferable elsewhere any time soon, so it'd end up a real time sink.

So, taking what I've learned since and the projects that I've been working on, I've decided to take another swing at it. Before I get into the implementation side, it will be useful to go over the technologies I did choose for the new form.

Java/Jakarta EE

I've recently become kind of enamored with the modern form of the Jakarta EE stack, and so I decided to use this as an opportunity to really dive in to what a blue-ocean small by-the-books Java app looks like nowadays.

JEE got a well-deserved bad rap over the years for its configuration complexity and general impenetrable-ness, but I've been very pleased to find that those tides have largely receded. It's all still there if you want it, but a fresh new app primarily consists of decorating a handful of Java classes with declarative annotations.

JEE consists of a series of individual specs, and building an app involves choosing which ones you want to use, plus (depending on which you choose) picking your app server target.

Tomcat

I originally gave a shot to adding enough OSGi metadata and bundles to target Domino, but decided quickly that it was just not worth it. The HTTP/servlet stack in Domino is just so old that, even if I got everything bound together, I'd still be fighting the platform every step of the way.

The better route was to put it aside and just run a modern Java app server. I went down the list of GlassFish, Payara, WebSphere Liberty (the nearest miss), TomEE, and WildFly, but each one ended up having some problem with either the dependencies I wanted or with their Eclipse integration. I ended up settling on good ol' reliable Tomcat. Tomcat itself isn't actually a JEE server, but it's kind of like a Raspberry Pi: it gives you the baseline for a Java servlet engine, and then you can cobble together your own EE stack on top of it by explicitly bringing in implementations. Though the final .war file is far less svelte this way, I found that this build-your-own method results in the lowest chance of being held back by the platform currently.

As an aside, Sven Hasselbach has been writing a very interesting series on running Jetty on top of the Domino JVM to achieve a similar end, albeit with Spring.

Darwino

For all the same reasons as when I set out on this journey originally, I'm using Darwino for the baseline. This lets me replicate in my existing blog data smoothly while getting the advantages of a superior backing database. I'm not making use of mobile clients or most Darwino services with this, but the baseline is nonetheless a step up, and fits in with a JEE app like a glove.

JNoSQL

I brought in the JNoSQL Darwino driver I wrote a little while ago to handle the model layer. JNoSQL is essentially JPA but reformed for NoSQL access - no cruft, no relational/NoSQL impedance mismatch, and designed to fit with current JEE technologies.

CDI

CDI is one such technology, and it's a very interesting one to work with. The whole "dependency injection" realm is a little fraught and, if my Eclipse UI error reporter is any indication, prone to bizarre errors, but the core concept is good and very useful. I've gotten it into the swing of using it both as the "managed bean" provider for the front end as well as the general service provider glue for the app. It still takes some getting used to, and the learning curve falls prey to a similar problem as when I was learning Maven: something about learning how it works makes you forget what it was like to not know, and so a lot of the answers online assume way more knowledge than a neophyte has.
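
As a hypothetical example of both roles (the names here are made up, not from the app), one annotated class can act as an EL-visible bean for the views while having its own dependencies handed to it:

import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.inject.Named;

import model.PostRepository;

@RequestScoped
@Named("postForm") // available to the EL in the JSP views
public class PostFormBean {
	@Inject
	PostRepository posts; // CDI finds and injects the implementation

	public boolean isTitleTaken(String title) {
		// hypothetical repository method, just for illustration
		return posts.findByTitle(title).isPresent();
	}
}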

Bean Validation

I've long been a fan of the Java bean validation API, and it's a clean fit here too: JNoSQL picks up on the presence of Hibernate Validator without configuration beyond the dependency and it just works. No muss, no fuss.
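
In practice, that means the entity properties can just carry their constraints directly (a sketch with hypothetical fields, not the actual Post model):

import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

public class Post {
	@NotNull
	@Size(min = 1, max = 255) // rejected at save time if blank or too long
	private String title;

	@NotNull
	private String body;

	// getters and setters...
}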

JAX-RS + MVC Spec

JAX-RS is at this point familiar territory for a lot of Domino developers, but I decided to use it as the underpinnings of the whole UI, in tandem with a draft framework called MVC 1.0. The latter's generic name doesn't really give much detail, but it's essentially a spec that enhances JAX-RS entities with knowledge of HTML templating frameworks, allowing you to write a very clear app structure. It's not a server-state-based framework like JSF, but rather a bit "closer to the metal", where you deal directly with the HTTP method cycle.

As I'll go more into in the "UI" post, it's been surprisingly refreshing to get back to basics in this way - JSF/XPages is often a bit conceptually easier to work with (at first) and client-side JS frameworks have some REST+JSON purity to them, but just "this server-rendered HTML page with no server state is everything you need" feels really good sometimes.

Admittedly, the MVC spec itself is in a weird place. It was originally a candidate for inclusion in Java EE 8, but was dropped in the final runup. It's possible that this will prove to be a kiss of death, but the spec is so small but functional that I don't feel bad about taking the risk of building an app on it.


That about covers the technology stack. When I get around to writing the next post, I'll go into some of the specifics about how I decided to set up the UI, which has been a fun experiment of its own. In the mean time, the active repository is up at:

https://github.com/jesse-gallagher/frostillic.us-Blog/tree/develop/frostillicus-blog

A (Java-Centric) Domino Wish List

Thu Jul 12 12:04:41 EDT 2018

Tags: domino java

Seeing the information come out of this week's HCL "Golden Ticket" event has got me thinking about some of my wish-list items for Domino development, mostly in the form of enhancements for existing capabilities and entirely around Java (since that's what I do).

Quality of Life

Javadoc

For some reason, the lotus.domino classes ship without Javadoc or even variable-name information, leading to this trainwreck:

Designer has its built-in help, which is also on the web, but that's quite a few steps down. This is table stakes for a Java API and always has been.

Updated p2 Repository

Back in 2014, the XPages team uploaded a clean p2 repository of the XPages artifacts to OpenNTF, corresponding with the 9.0.1 release. This repository saves a ton of hassle when building Tycho-based projects or just setting up an Eclipse workspace. However, it's quite long in the tooth, as there have been several Notes.jar additions not included in there, and, in FP10, a significant upgrade to the undergirding OSGi framework.

I ended up writing a script to generate an updated version, but I don't have the legal ability to publish the results anywhere for easy consumption, meaning it has to be done manually and configured for each build environment. It would be a great convenience if there were an official package (ideally including the Designer plugins as well) and, even better, hosted on OpenNTF so that we could reference it by URL as we do for Eclipse releases (and require users to accept a license first).

Mavenized Repository

The p2 repository is good for Tycho-based projects, but, especially when targeting Domino is only one part of a project, it can be much more convenient to use "normal" Maven projects with maven-bundle-plugin. However, those projects can't use p2 repositories as such. For Darwino's needs, I ended up writing a tool in the (available-for-free) Darwino Studio plugins to Maven-ize a Domino p2 repository, but that hits the same snag as above of requiring manual setup in each instance.

This is another case where my preference would be on the OpenNTF Maven repository (plus Javadoc Jars, naturally).

Extension Library Source

The latest Extension Library release on OpenNTF is from FP9, while the latest on GitHub is from the FP7 era. FP10 shipped with a newer version of indeterminate nature. It'd be good to have this on both of those sites and, like with Javadoc, have source bundles shipped with the product in a way that is picked up automatically by Designer and Eclipse.

Source Bundles for Third-Party Components

The source for the undergirding Equinox stack is available, but it would be best to have, as an adjunct to the updated p2 repositories, the source bundles for the actual versions used so that we don't have to cobble together a platform from Eclipse's repositories.

Open-Source the Rest of the Stack

Having XPages, the Expeditor husk, and the other miscellaneous doodads that make up the proprietary layer as open source with an Apache-compatible license would cover a lot of the above and also be of tremendous use for XPages and non-XPages apps alike that run on or with Domino. I have a hard time imagining that it would lead to a lot of community-driven improvement, but it may do some (I'd have a few words to share with the file-download control, for example), and even just as a static release would be a significant boon.

Domino Connectivity in Eclipse

An idea I've been toying with lately is to make an Eclipse plugin that allows you to add Domino servers to the "Servers" view and control them to some extent. The basics would be to start/stop/restart HTTP, but the stretch goals would be to open a console view, get a list of running modules, integrate with the existing "load bundles from PDE" support, and, ideally, an outright "Run on Server" command for OSGi bundles and NSFs. However, I have so much on my plate that I'm not sure that I'll get to this any time soon unless I get a real itch some weekend.

Longevity

Refreshed JVMs

Feature Pack 8 brought Java 8, a vital step forward. However, since then, Oracle moved to a faster release cycle for Java and the JRE is now at version 10. Domino uses IBM's JVM variant, J9, which they recently moved to the Eclipse foundation as OpenJ9, where it has... sort of been keeping up, I think?

In any event, this increased pace of change has meant that the Java 8 honeymoon is over, and Domino development again requires special consideration when using current tools. I have no idea how complex the integration between Domino's tasks and the underlying JVM is, but my ideal would be to have constant or near-constant parity.

Servlet API 4.x

After the JRE version, the most important foundational element of a Java web app is the servlet API release. The current version is 4.0.1, while Domino supports 2.5 (or 2.4, maybe?). The good news is that the Java/Jakarta EE world seems to be used to lagging versions here, and 2.5 is a minimum version for a lot of current tech in much the same way that Java 6 was until somewhat recently, but there has been quite a bit added in recent years.

Presumably, a reason for the lag is the implied requirements of newer versions, such as WebSockets and HTTP/2 support, that would require heavy modifications to the core Domino HTTP code. Honestly, the more practical route is almost definitely to just use a different JEE server paired with CrossWorlds, some Java wrapper for the GRPC stuff HCL has been talking about, or (best of all) a Darwino app replicating with Domino, but still. WebSphere Liberty is actually really nice, by the way.

Refreshed Equinox

Like with the underlying JVM, the Feature Pack 10 update to a Neon-based OSGi/Equinox framework was a critical shot in the arm for the platform, but it is now also two major versions behind. This is a little less critical, since Equinox brings a bit less to the table for our needs and Neon is "new enough" for now, particularly on the server side, but it'd still be proper to keep pace.

Odds and Ends

Non-OSGi JEE Support

The Equinox framework that Domino uses is quite capable, but there's no pretending that OSGi-targeted development has its share of headaches. Most Java apps just target plain-old .war files and don't impose any particular requirements on the build process. Java development is a much more pleasant experience when you can just toss in any Maven dependency and not have to think about building a target platform for Eclipse or jumping through bundle-resolution hoops. I really like OSGi in theory, but I can't pretend that non-OSGi development isn't much smoother.

Domino technically supports "regular" servlets currently, but, uh, here's a snippet from the current documentation on that:

Hrm.

Full Extension Manager Support for Java

JAVADDIN/DOTS added a lot of EM hooks, but it doesn't cover the full suite of capabilities that a C addin can provide, such as authentication handling. Having this be fully accessible from Java would be useful even when treating Domino just as a data store and not as an app server.

I'm sure I could come up with more, but that's probably good for now. All easy, right?

Another Project: XPages Jakarta EE Support

Sun Jun 03 16:40:11 EDT 2018

In my dealings with JNoSQL recently, I’ve been delving more into the world of modern Jakarta EE/Java EE/J2EE development, particularly the magic land of CDI.

The JEE stack tends to be organized as a collection of specs and implementations, many of which are really independent of each other and the underlying platform, making them pretty portable onto any reasonably-recent JVM. Now that Domino is actually on a reasonably-recent JVM, that makes it a workable target! So I decided to create a side project to bring some of JEE to XPages.

XPages has always been “sort of Java EE” - you don’t really have the full stack, and it’s far behind on the components that it does have, but a lot of the concepts are there. Of particular interest are managed beans and expression language.

CDI and Managed Beans

The XPages stack contains what amounts to a primordial version of CDI. Since the release of XPages, JSF improved on the original faces-config.xml declaration method to add annotation-based declarations, and then CDI is something of a codification and expansion of that into the full Java world.

My project uses the Weld reference implementation of CDI to create a CDI context for each XPages app that opts in, allowing it to use annotations on classes to declare beans and properties:

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Named;

@ApplicationScoped
@Named // or @Named("applicationGuy")
public class ApplicationGuy {
    public String getFoo() {
        return "hello";
    }
}

These can then be used like normal managed beans in an XPage:

<xp:text value="#{applicationGuy.foo}"/>

The project’s README contains some further examples.

I went with the Java SE implementation of Weld instead of the pre-built servlet or OSGi packages since those are a little too smart for this use: they pick up on the fact that they’re in a JSF environment, but expect newer versions of the servlet spec and JSF.
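
For reference, the SE bootstrap itself is tiny - standalone, it's roughly this (the project does extra work to tie a container to each XPages application, which isn't shown here):

import org.jboss.weld.environment.se.Weld;
import org.jboss.weld.environment.se.WeldContainer;

public class WeldExample {
	public static void main(String[] args) {
		Weld weld = new Weld();
		WeldContainer container = weld.initialize();
		try {
			// Resolve a CDI-managed bean by type
			ApplicationGuy guy = container.select(ApplicationGuy.class).get();
			System.out.println(guy.getFoo());
		} finally {
			weld.shutdown();
		}
	}
}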

Expression Language

Since its original release, EL went through a similar standardization process as CDI and is now at version 3.0 and is distinct from JSP and JSF. As anyone who has tried to call a method on a bean in EL has found out, the XPages EL implementation lags pretty far behind, at the JSF 1.0/1.1 level. Since that time, it sprouted parameters and “projection” and is essentially a tiny scripting language now.
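
To give a sense of what that means, expressions like these are legal in stock EL 3.0 (generic spec examples of the stream and lambda support, nothing to do with this project in particular):

[4, 1, 3, 2].stream().filter(x -> x > 1).sorted().toList()

(x -> x * 2)(21)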

My project uses GlassFish’s EL implementation to outright replace the stock EL interpreter for apps making use of it. I added some affordances to IBM’s customized data support, so it’s intended as a drop-in replacement:

<xp:text value="${dataObjectExample.calculateFoo('some arg')}"/>

<xp:text value="#{el:requestGuy.hello()}"/> 

Note the “el:” prefix in the runtime-bound expression: that’s to get around Designer’s validation of runtime EL expressions.

So… Why?

That’s a good question! The first two reasons are “because it’s fun” and “to learn more about JEE”, but there’s also practical value for this sort of thing.

XPages is moribund, and that leaves Domino developers with a few options:

  • Go back to LotusScript. The iPad Notes client makes this a terrifyingly-practical option, but it’s soul death.
  • Go to JavaScript (or another platform). This is another route HCL is pushing, and it’s entirely valid: Node is a great platform with excellent support and momentum.
  • Go to modern Java.

For anyone who has invested a lot of time and brainpower in XPages over the years, that last one is particularly appealing, and projects like this can help you get there. If you have a large XPages code base, as I do with one of my clients, it makes a lot more sense to work on that in such a way that it gradually becomes less XPage-dependent while avoiding the trap of a full rewrite in another language.

Many of us have already done something of this sort: JAX-RS is another JEE standard, and the Wink implementation in the Extension Library, though also aging, accomplishes this same sort of task. Especially if your services don’t reference Wink explicitly and write just to the spec, they are very portable.

That portability - of code and skillset - is critical. Say you have a class like this:

import java.util.stream.Collectors;

import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/issues")
public class IssuesResource {
    @Inject IssueRepository issueRepository;

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response get(@QueryParam("category") String category) {
        return issueRepository.find(category).stream()
            .map(this::doSomething)
            .skip(3)
            .collect(Collectors.collectingAndThen(Collectors.toList(), this::toResponse));
    }

    // ...
}

Which Java platform is that targeting? What’s the data storage mechanism? Who cares? This class certainly doesn’t. That could just as easily be Domino reading from an NSF or (as is actually the case in the example’s source) Tomcat with Darwino.

What’s Next?

Truthfully, maybe not much. Though JEE contains a whole raft of technologies, these two were the ones that scratch my immediate itch. We’ll see, though - the skill portability of erstwhile XPages developers is critically important, and I think that this is another one of the paths that can get us where we need to go.

Compiling and Testing XPages Plugins With Java 9+

Fri Apr 13 15:38:50 EDT 2018

Tags: java xpages

Thanks to 9.0.1 FP8, we've been able to use Java 8 on Domino for a while, and FP10 makes that support a bit more official at the OSGi level. However, Java 8 is no longer the latest Java runtime, and so anyone writing XPages plugins and compiling/testing them via Maven will likely run into a situation where the compiling JRE is 9 or above. There are a couple changes in these runtimes that add some wrinkles to the process, so I took up the task of working around the problems I hit, and I created a Git repository to contain the code and tips I used to do so:

https://github.com/OpenNTF/org.openntf.domino.java9compat

The README in the repo contains a couple bits of Maven configuration that can be used to successfully compile XPages plugins in Java 9 or 10, and the included project contains a patch fragment for the com.ibm.notes.java.api plugin that serves up the Notes.jar to work around the fact that CORBA has been removed from the standard JRE.

For my needs, those changes made me able to compile a reasonably-complex XPages application, but there may be other edge cases I haven't hit yet. As I do, I'll add information and code there.

Next Project: ODP Compiler

Mon Mar 05 17:32:06 EST 2018

Tags: xpages java
  1. Next Project: ODP Compiler
  2. NSF ODP Tooling 1.0
  3. NSF ODP Tooling Example Project
  4. NSF ODP Tooling 1.2
  5. How the ODP Compiler Works, Part 1
  6. How the ODP Compiler Works, Part 2
  7. How the ODP Compiler Works, Part 3
  8. How the ODP Compiler Works, Part 4
  9. How the ODP Compiler Works, Part 5
  10. How the ODP Compiler Works, Part 6
  11. How the ODP Compiler Works, Part 7

One of the larger thorns in my side with my Domino development lately has been trying to automate builds of on-disk projects into NSFs via Jenkins. In theory, the process is pretty straightforward. It even works sometimes! However, particularly once you add in the necessity to deploy OSGi plugins to Designer first and want to run it from Jenkins, things get extraordinarily flaky: Designer may not launch properly from a behind-the-scenes Jenkins runner, the plugin installation may mysteriously fail, and so forth - and the error reporting is difficult at best.

So it's been on my mind for a good while to find a way to get from an ODP to an NSF without involving Designer, and I decided over the last couple days to really take a swing at it. It's not a small task, though; the process involves a number of difficult steps:

  • Install and activate provided OSGi plugins
  • Create an XPages registry that knows about the XPages libraries installed on the server, including those just contributed
  • Translate XPages and Custom Controls into Java source, with intra-file knowledge of the just-added CCs
  • Create a Java classpath that matches the plug-in dependencies that Designer derives from the dependent XPages Libraries
  • Compile the resultant Java source and any Java classes in the NSF into bytecode
  • Recompose the composite data form of the file data for these elements and many file resources into their DXL ".metadata" files for import
  • Create an NSF and import all of this
  • Compile any LotusScript in source-based libraries from the ODP
  • Uninstall any hot-loaded OSGi plugins

Those steps even leave out some fiddly details, like components defined via .xsp-config files in the NSF or XSP-associated .properties files, not to mention any steps I haven't encountered yet. It's a lot of work!

My first hope was to be able to hook into the process that Designer uses, perhaps grabbing a couple pertinent OSGi plugins and going from there. However, from what I can tell, all the involved plugins are intricately tied into many layers of Designer-the-IDE and so are no small matter to use on their own without also including the entire stack. So that left me to cobble together an equivalent process out of parts.

Fortunately, a couple projects have already provided a solid foundation for this. First and foremost is the XPages Bazaar. This is a project that Philippe Riand created a number of years ago, meant to be a workshop for really experimental components in a form less constrained than the ExtLib became. Since he left IBM, it's sat unmaintained, but I figured it'd be a perfect incubator for this project, so I tossed it up on GitHub, recomposed its Maven structure, and cleaned it up a bit for FP10 use. The reason why it makes such a perfect shell is a pair of its features: an XSP interpreter and an on-the-fly Java compiler. The former hooks into the mysterious guts of the XSP runtime to allow for translation of XSP to Java and the latter wraps the official Java compiler API with some OSGi knowledge to compile that along into bytecode.

Even starting with this, the first couple steps still required a lot of digging around. I learned how to install and activate OSGi bundles, how XPages Registries work internally, and made some tweaks to work around problems I encountered. I also encountered the joy of a bizarre javac bug to do with annotations in enum constructors, which my target project used.

Once I had the XPages-side components compiled, the next step was to start composing the NSF. The ODP format for XPages elements and other "file resource"-type entities is to put the code in its "normal" form in a file and then a subset of the DXL in a ".metadata" file next to it. The trouble here is that, even for entities where the file data is stored in the note unprocessed, the storage format isn't a strict binary blob of the file data: it's a composite data stream of file header and segment structures. I thought of two main ways I could go about getting these file resources into the NSF: via IBM's NAPI and by building the structures into the DXL files before import. The NAPI has a convenient FileAccess class for this purpose (presumably used by Designer), but my attempts to use it met primarily with server crashes. I'm sure it's possible to go this route, but I'd already "solved" the DXL problem years ago, for ODA's Design API. So, at least for now, I took the tack of writing out the binary structure manually, pouring it into the DXL as Base64, and importing that. It's a little inefficient, but it works.

Overall, I've made a lot of progress so far, but there's still a lot to be done: not all file types have their data put into the right places, LotusScript isn't properly compiled, Agents don't do anything at all yet, and XPages+CCs aren't actually imported into the NSF. Still, it's in a spot where I'm confident that it can one day work, which is more than I could have said a week ago. If you'd like, browse around the code and pitch in if it's an itch you'd like to scratch as well.

New Small Project: generate-domino-update-site

Wed Jan 31 21:28:20 EST 2018

Tags: java xpages

For a good while now, the Domino Update Site for Build Management has been an essential tool for anyone setting up a local OSGi/Tycho development environment. However, it's really withered on the vine - the latest official release matches stock Domino 9.0.1, while recent fix packs have brought a number of improvements and new classes/methods. Since the binary code is entirely IBM's to distribute or not at their leisure, we can't update the release itself. Today's release of FP10, though, pushed me over the edge to the point where I at least wrote a tool to help the local creation of updated repositories.

The result of my work is generate-domino-update-site, a small CLI and programmatic script that you can point to a Domino installation, a destination directory, and an Eclipse program root to have it create (effectively) an updated version of the update site. The result contains a bit more than the official one did, but that should hopefully not hurt anything. Beyond just copying files into a directory, the script does the dirty work of re-packing unpacked bundles and features, vivifying the Notes.jar wrapper bundles, and creating p2 metadata (hence the Eclipse dependency).

To use the tool, you can clone the repository and run the jar as described in the readme. I may also get around to uploading it as a compiled result to a proper OpenNTF project page.

Lessons From Writing a JNoSQL Driver

Sat Dec 30 11:12:24 EST 2017

The other day, I decided to start up a side project to write an app for my Stars Without Number game in Darwino. Like back when I wrote a forum/raiding app for my WoW guild, I like to use this kind of opportunity to try new technologies and flesh out my skills in existing ones.

One such tech I've had my eye on for a bit is JNoSQL, which is a framework for integrating with NoSQL databases in Java. It's along the lines of Hibernate OGM, but intended to avoid the pitfalls of the relational/NoSQL impedance mismatch that came with trying to adapt JPA directly to NoSQL databases. JNoSQL promised to be much easier to implement for a new database, so I decided to give it a shot.

JNoSQL

JNoSQL is split into two paired components, cleverly named Diana (the driver side) and Artemis (the model/integration side). The task of writing a driver for a new database is pretty well-contained: pick the database type(s) you want to implement (out of key/value, column, document, and graph) and implement about half a dozen interfaces. This is in stark contrast from when I took a swing at writing a Hibernate OGM driver, where the task was significantly more daunting. The final result is only ten Java files, with a chunk of them being utility classes for code organization.

It's a young project - young enough that the best version to run right now is 0.0.4-SNAPSHOT - but it functions well and it's been taken under the wing of the Eclipse foundation, which builds some confidence.

Implementation

Though the task was small, there were still a couple initial hurdles to getting going.

To begin with, I decided to start with the Couchbase driver - this certainly made the overall task easier, since Couchbase's semantics are very similar to Darwino's, but it also meant that I had to be wary of which parts of the codebase were really about implementing a Diana driver and which were Couchbase-isms. Fortunately, this was much easier than the equivalent work when I adapted the CouchDB Hibernate OGM driver, which was a sprawling codebase by comparison.

More significantly, though, it's always tough coming in to modify a codebase written by a single person or small team and learning as you go. The structure of the code is clean, but not quite my normal style (in part because Domino kept me from diving into Java 8 streams for so long), and I also had to ramp up quickly on the internal concepts of Diana. Fortunately, this was mostly easy, since the document-DB driver scaffolding is purpose-built, the hooks are straightforward and the query semantics were extremely easy to adapt. The largest impediment was getting used to the use of the term "Document", which internally refers to a key/value pair, while "DocumentEntity" is closer to the expected meaning.
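
To illustrate that naming (from memory of the 0.0.x-era API, so treat the exact classes and packages as approximate): a "Document" is one named value, while a "DocumentEntity" is the bag of them that corresponds to a stored document:

import org.jnosql.diana.api.document.Document;
import org.jnosql.diana.api.document.DocumentEntity;

// A "Document" is a single key/value pair...
Document name = Document.of("name", "Expert Polymath");

// ...while a "DocumentEntity" is the full record destined for a collection
DocumentEntity entity = DocumentEntity.of("npcs");
entity.add(name);
entity.add(Document.of("level", 3));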

Like the core implementation, the test suite I adapted from Couchbase was also pleasantly svelte, covering the bases without being an onerous nightmare to convert. Indeed, most of the code I added to it was the Darwino app scaffolding just for the test runtime.

Putting It Into Practice

Once the driver was written, I was hit by a bit of a personal curveball when I went to implement some actual data models. The model side, Artemis, is heavily wrapped together with CDI, which is a Java EE thing that, as I gather, handles managed beans, separation of implementation, and variable injection. This is a pretty normal thing for Java EE developers, but XPages's "don't call it Java EE" environment didn't introduce me to this aspect. As such, the fact that the documentation just kind of casually tossed around CDI terms and annotations threw me for a bit of a loop trying to determine what was required and what was just an idiom.

I eventually determined that I could use the reference implementation, Weld, without necessarily going whole-hog on Java-EE-everything. I'm a bit wary of what this bodes for whether I'll be able to use JNoSQL in Darwino on mobile devices, but I'll cross that bridge when I come to it. Once I got a bit of a handle on what Weld is and how to use it in unit tests (hint: make sure you have beans.xml files!), I was able to start writing my model objects and testing them.
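
For anyone else hitting the same wall: the file can be essentially an empty marker, along the lines of this in the test bundle's META-INF (the discovery mode here is a choice, not a requirement):

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
  bean-discovery-mode="all">
</beans>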

Doing It Again

The fact that the bulk of my implementation work ended up being on the app side with CDI goes to show that the Diana driver model really shines. It got me thinking about how difficult it would be in the future, say, to write a driver for Domino. There'd be some hurdles - Domino's lack of nested objects and antiquated querying mechanisms would need replacing - but the core task wouldn't be too bad. I don't know if I'd have a need for it, but it's nice to keep in mind as a potential future small project.

All in all, I'm optimistic about the use of this. I'd love for Darwino to integrate as smoothly as possible into whatever standard environments it can, and this is one more step in that direction. I'll know as my side app takes shape how much this ingrains itself into my actual work.

First Steps to Code Coverage Analysis in Domino Plugins

Thu Nov 09 08:53:04 EST 2017

Tags: maven domino java

I'm always interested in getting the computer to tell me how to tell it what to do more successfully, and, to further that pursuit, I've started taking an interest in code coverage.

If you're not familiar with the term, "code coverage" refers to reporting on which lines of code were actually executed during runtime, most commonly in association with unit tests. Eclipse (and presumably other IDEs) has support for this, and I've decided to give it a shot.

Since I'm starting this out in the context of Domino plugins, there are more wrinkles than in most tutorials. Namely, the test suites I've written run exclusively through Maven instead of the Eclipse UI due to all the Notes environment setup, so I can't just use the normal UI tools to gather the data. Fortunately, Eclipse's EclEmma will work just fine with the output from a Maven project, as long as you configure it properly. I looked around for a while to find the right combination of tools to use, but it ended up being fairly simple to configure basic output that can be consumed in Eclipse's Coverage view.

There are two main additions. First, add the jacoco-maven-plugin to your root project's project.build.plugins block:

<plugin>
	<groupId>org.jacoco</groupId>
	<artifactId>jacoco-maven-plugin</artifactId>
	<version>0.7.8</version>
	<executions>
		<execution>
			<goals>
				<goal>prepare-agent</goal>
			</goals>
		</execution>
	</executions>
</plugin>

In normal cases, that would suffice. However, since the test configuration I have for Notes overrides the argLine property of the Tycho test runner, there's another step - add the tycho.testArgLine property manually into those blocks, such as in the Windows profile:

<profile>
	<activation>
		<os>
			<family>Windows</family>
		</os>
		<property>
			<name>notes-program</name>
		</property>
	</activation>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				<version>${tycho-version}</version>
 
				<configuration>
					<skip>false</skip>
 
					<argLine>${tycho.testArgLine} -Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
					<environmentVariables>
						<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
					</environmentVariables>
				</configuration>
			</plugin>
		</plugins>
	</build>
</profile>

Once that's configured, running the test suite via Maven will create a new file in the target folder of the test plugin: jacoco.exec. This file can then be consumed in Eclipse by opening the "Coverage" view:

Eclipse's Show View window

In that view, right click and choose "Import Session..." and point to the data file. Click "Next" and check the projects+source folders from your workspace you're interested in analyzing. When you click "Finish", it'll do two things. First, it'll fill the Coverage view with statistics from your run:

Code Coverage stats

(We have a lot of work to do fleshing out our test suites for this one)

Secondly, it'll start highlighting your code to show you what code is executed, which branches are only partially covered, and which lines are skipped entirely. For example (ignore the sickly color scheme - I need to work on that):

Code Coverage example

This shows how several of the if branches are only tested in one direction, while the "Faces" block is skipped entirely. That also shows some of the trouble with testing XPages-run code: the Tycho environment can't reproduce the XPages environment fully, so some branches aren't testable in that way. I haven't looked into the possibility of gathering similar data from JUnit for XPages, so perhaps that's possible.

For now, though, this will have to do. And, like with these other "code improvement" techniques I've integrated lately, there's a lot of potential tedium - juggling when to write a test to cover some code that will obviously always work just to improve the highlighting vs. just focusing on the low-hanging fruit - but I expect that it will be a nice addition to my workflow over time.

That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Sun Feb 26 11:23:22 EST 2017

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

It's been a while since I started this series on Java development, but I've been meaning for a bit now to crack it back open to discuss my current development setup for plug-ins, since it's changed a bit.

The biggest change is that, thanks to Serdar's work on the latest XPages SDK release, I now have Domino running plug-ins from my OS X Eclipse workspace. Previously, I switched between either running on the Mac and doing manual builds or slumming it in Eclipse in Windows. Having just the main Eclipse environment on the Mac is a surprising boost in developer happiness.

The other main change I've made is to rationalize my target platform configuration a bit. In the early parts of this series, I talked about adding the Update Site for Build Management to the active Target Platform and going from there. I still basically do this, but I'm a little more deliberate about it now. Instead of adding to the running platform, I now tend to create another platform just to avoid the temptation to use plug-ins that are from the surrounding modern Eclipse environment (this only really applies in my workspaces where I don't also have actual-Eclipse plug-in projects).

The fullest form of this occurs in one of my projects that has a private-only repo, which allows me to stash the artifacts I can't distribute publicly. In that case, I have a number of library dependencies beyond just the core XPages site, and I took the approach of writing a target platform definition file and storing it in the root project, with relative references to the packaged dependencies. With this route, I or another developer can just open the platform file and set it as the target platform - that will tell Eclipse about everything it needs. To do this, I right-clicked on the project, chose "New" → "Other..." and then "Target Definition" under "Plug-in Development":

Target Definition

Within that file, I used Eclipse variable references to point to the packaged dependencies. In this repo, there is a folder named "osgi-deps" next to the root Maven project, so I wanted to tell Eclipse to start at the root project, go up one level, and then delve down into there for each folder. I added "directory" type entries for each one:

Target Definition Entries

The reference syntax is ${workspace_loc:some-project-name}../osgi-deps/Whatever. workspace_loc resolves the absolute filesystem path of the named project within the workspace - since I don't know where the workspace will be, but I DO know the name of the project, this gets me a useful starting point. Each of those entries points to the root of a p2-format update site for the project. This setup will tell Eclipse everything it needs.
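
For reference, the location entries in the resulting .target file end up looking roughly like this - the project name and dependency folders here are stand-ins for whatever your repository actually contains:

<?xml version="1.0" encoding="UTF-8"?>
<target name="Project Target Platform">
	<locations>
		<location path="${workspace_loc:some-project-name}/../osgi-deps/XPages" type="Directory"/>
		<location path="${workspace_loc:some-project-name}/../osgi-deps/ExtLib" type="Directory"/>
	</locations>
</target>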

Unfortunately, this is a spot where Maven (or, more specifically, Tycho) adds a couple caveats: not only does Tycho not allow the use of "directory" type entries in a target platform file like this (meaning it can't be simply re-used), but it also expects repositories it points to to have p2 metadata and not just "plugins" and "features" folders or even a site.xml. So there's a bit of conversion involved. The good news is that Eclipse comes with a tool that will upgrade old-style update sites to p2 in-place; the bad news is that it's completely non-obvious. I have a script that I run to convert each new release of the Extension Library to this format, and I adapt it for each dependency I add:

java -jar \
	/Applications/Eclipse/Eclipse.app/Contents/Eclipse/plugins/org.eclipse.equinox.launcher_1.3.100.v20150511-1540.jar \
	-application org.eclipse.equinox.p2.publisher.UpdateSitePublisher \
	-metadataRepository file:///full/path/to/osgi-deps/ExtLib \
	-artifactRepository file:///full/path/to/osgi-deps/ExtLib \
	-source /full/path/to/osgi-deps/ExtLib/ \
	-compress -publishArtifacts

Running this for each directory will create the artifacts.jar and content.jar files Tycho needs to read the directories as repositories. The next step is to add these repositories to the root project pom so they can be resolved at build time. To start with, I create a <properties> entry in the pom to contain the base path for each folder:

<osgi-deps-path>${project.baseUri}../../../osgi-deps</osgi-deps-path>

There may be a better way to do this, but the extra "../.." in there is because this property is re-resolved for each project, and so "project.baseUri" becomes relative to each plugin, not the root project. Following the sort of best practice approach to Tycho layouts, the sub-modules in this project are in "bundles", "features", "releng", and "tests" folders, so the path needs to hop up an extra layer. With that, I add <repositories> entries for each in the same root pom:

<repositories>
	<repository>
		<id>notes</id>
		<layout>p2</layout>
		<url>${osgi-deps-path}/XPages</url>
	</repository>
	<repository>
		<id>oda</id>
		<layout>p2</layout>
		<url>${osgi-deps-path}/ODA</url>
	</repository>
	<repository>
		<id>extlib</id>
		<layout>p2</layout>
		<url>${osgi-deps-path}/ExtLib</url>
	</repository>
	<repository>
		<id>junit-xsp</id>
		<layout>p2</layout>
		<url>${osgi-deps-path}/org.openntf.junit.xsp.updatesite</url>
	</repository>
	<repository>
		<id>bazaar</id>
		<layout>p2</layout>
		<url>${osgi-deps-path}/XPagesBazaar</url>
	</repository>
	<repository>
		<id>eclipse-platform</id>
		<layout>p2</layout>
		<url>http://download.eclipse.org/releases/neon/</url>
	</repository>
</repositories>

The last entry is only needed if you have extra build-time dependencies to resolve - I use it to resolve JUnit 4.x, which for Eclipse I just tossed unstructured into a "plugins" folder in the "Misc" folder, without p2 metadata.

Though parts of this are annoyingly fiddly, it falls under the category of "worth it in the end" - after some initial trial and error, my target platform is more consistent and easier to share among multiple developers and automated build servers.

Code Safety and Pedantry

Fri Jun 03 10:23:13 EDT 2016

Tags: java

Lately, I've been musing a lot on the topic of code "correctness" - that is, beyond the normal case of wanting code to do what I intended, and further into the realm of sweating even extremely minuscule details. A lot of this is due to my continued watching of the evolution of Apple's Swift language (I highly recommend following Erica Sadun's blog for this). Swift is very much in the camp of "make sure all your 'i's are dotted and 't's crossed" languages, as opposed to more fast-and-loose languages like JavaScript or Ruby.

I've gone back and forth on the overarching concepts from time to time. I've long been a big Ruby fan, and a lot of that is because of a general feeling that, if you let go of a lot of the "strict old aunt of a compiler" restrictions, you gain a tremendous amount of expressiveness and productivity with few real-world problems. On the other hand, being immersed in Java all the time has shifted my brain to appreciating the benefits of stronger compile-time checks (at least on paper). Overall, I'm more on the latter side than the former now, double-edged sword though it is. This is why I've been diving into things like aggressive null analysis in my code. Even when it seems like it's being a pedant, there are certain classes of bugs that it finds that I wouldn't even normally think of on the fly. For example, the null checker flags this as being a potential NPE:

if(this.foo != null) {
	this.foo.doSomething();
}

My first reaction upon seeing that was along the lines of "you're full of crap, Eclipse", but then I noticed the small path a bug could take to creep in: multithreading. If I'm in a situation where the object containing foo is used across threads, there's a possibility where Thread A would evaluate this.foo != null to true and start to step into the block. Then, Thread B would get its turn on the processor and execute this.foo = null in another method. Thread A would then pick up and try to call doSomething() on the newly-minted null. So I've swallowed my pride and started writing safer code like:

SomeObject localFoo = this.foo;
if(localFoo != null) {
	localFoo.doSomething();
}

What I've always admired about Swift is that it takes these sorts of lessons to heart and adapts the syntax to suit. My interest in code-correctness pedantry in Java has led me to write out verbose abominations like this:

private final @NotNull String foo;

Three of the conceptual tokens there are purely to say things that are best practices to start with: I don't want this property accessible outside the class, I want to make sure it's assigned during construction and not change thereafter, and I want to ensure it's not null. The Swift variant is:

let foo: String

Same thing, half the typing. And, as a bonus, since nullability checking is built in to the language and not a by-convention thing like the null annotations in Java, I can be sure that the rules will be applied. That sort of thing is the dream! But, since Java is the best language for the work I'm doing for now, the important thing is that it at least suits, verbose or not. In a lot of languages, it gets much more difficult to have this sort of assurance.

So, hassle as it is, I suggest that other Domino developers, on their paths through Java, consider picking up the same habits. For every annoyance you run into - like Java complaining that it can't convert a List<String> into a List<Object> - diving fully into null checks and immutability will save you a late-night crash report and an angry user. As you develop more in Java, give it a try.

The Cleansing Flame of Null Analysis

Sat May 21 10:18:00 EDT 2016

Tags: java maven
  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse
  3. The Joyful Utility of Optionals in Java

Though most of my work lately has been on sprawling, platform-level stuff or other large existing codebases, part of it has involved a new small app. I decided to take this opportunity to dive more aggressively than previously into automated null analysis and other potential-bugs tools.

What I mean by "null analysis" is letting the IDE or compiler try to help you avoid NullPointerExceptions. Though there are plenty of other programming mistakes you could still make, these are among the most common, and so a little extra work up front to avoid them should pay dividends. Eclipse has some handy options in its Java → Compiler → Errors/Warnings preferences to assist with this:

The first option will pick up on some pretty basic instances, such as:

Object foo = null;
System.out.println(foo.hashCode());

Since this is clearly going to always cause an NPE, Eclipse is able to point this out as an error. The next level gets a little more nebulous: "potential" null pointer access. This crops up when Eclipse can't reliably determine whether a value will be null, either because there is no way to know at compile time (say, database access) or because the compiler's tooling is too limited. Here's a contrived example:

Object foo = Math.random() > 0.5 ? new Object() : null;
System.out.println(foo.hashCode());

This situation is clearly untenable, but there are other situations where you as a programmer can be very confident that the value will not be null (say, if you swap out the > 0.5 for >= 0.0), but the compiler doesn't know that. That's why it often makes sense to leave that as a warning instead of an error.
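
To make that concrete, this variant (just the swap described above) can never actually produce a null at runtime, but Eclipse still only sees a "potential" problem:

// Math.random() always returns a value in the range [0.0, 1.0), so foo is always non-null here
Object foo = Math.random() >= 0.0 ? new Object() : null;
System.out.println(foo.hashCode());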

That's all stuff I've done before, but now I've decided to dive into annotation-based null analysis as well. Unfortunately, in stock Java, this is something of a hot mess (that list even leaves out Eclipse's home-grown version). Since Java didn't grow up with this sort of capability, it's been shoehorned in by various parties over the years. There are other tools to assist you in Java 8, but, unfortunately, I can only target 7 as the highest. For now, I've settled on the "sort-of standard" javax.validation.constraints package. It wasn't really intended for this specific purpose, but it's flexible enough to suit and can be used in Eclipse and FindBugs (though I have my reservations about the choice).

In Eclipse, this type of analysis can be enabled by checking "Enable annotation-based null analysis" below the other options and, unless you're using Eclipse's known annotations, adjusting the "Configure" options next to "Use default annotations for null specifications":

In any event, regardless of the choice of tooling, the "this shouldn't be null" annotations work the same way: you use them to decorate things that you either require to not be null when provided to you (method parameters) or promise will not be null when you provide them to others (method return values). For example:

public @NotNull Object doSomething(@NotNull Object otherObject) {
	return otherObject.toString();
}

This highlights three things, two good and one bad:

  • Good: The @NotNull in the method parameter means that, as long as the calling code is also checked for null use, the method can be confident that there won't be a NullPointerException when calling a method on otherObject.
  • Good: The @NotNull on the return value means that other code calling this method can be confident that they will not get a null value from it, and so can skip extra null checks.
  • Bad: Eclipse flags otherObject.toString() as a potential problem because it doesn't know for sure that Object#toString doesn't return null, because it has no nullability annotations. As programmers (or as a compiled-code analysis tool), we can be fairly confident that it will be non-null because any object returning null for that is essentially broken on its own.

That last one is a common problem when adopting annotation-based null analysis, at least in Eclipse (I hear it may be better in IntelliJ): its logic doesn't go very deep. If everything is gussied up with these annotations, you're clear - but as soon as you step outside of the project you're working on, you have to add in likely-unnecessary checks. Fortunately, these checks don't realistically hurt (a null check at runtime in a normal app is negligible performance-wise), but having to add them can grate.
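
As a sketch of what those extra checks tend to look like, a local guard around a call into non-annotated code is usually enough to keep the analyzer happy:

public @NotNull String describe(@NotNull Object otherObject) {
	String result = otherObject.toString();
	if(result == null) {
		// Realistically unreachable, but it reassures the analyzer about Object#toString
		throw new IllegalStateException("toString returned null");
	}
	return result;
}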

Glutton for punishment that I am, I decided to go a step further and enable FindBugs processing as an integral step of my build. Though FindBugs can be very picky about the types of things it complains about, it is blessedly more thorough in its analysis than Eclipse, so you generally end up conceding that it is correct when it yells at you. Since the project is Maven-based, I added the check in the project's pom file:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>findbugs-maven-plugin</artifactId>
	<version>3.0.3</version>
	<configuration>
		<includeTests>true</includeTests>
	</configuration>
	<executions>
		<execution>
			<phase>compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
		<execution>
			<id>findbugs-test-compile</id>
			<phase>test-compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
	</executions>
</plugin>

For most uses, that's all that's required. Now, when the project is compiled, FindBugs will give it a once-over and halt the build if it finds anything it doesn't like. This can be tweaked a great deal - for example, changing the checks to run or the severity of the problem needed to fail the build - but the defaults will likely suit.
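
As a hypothetical example of that tweaking, the analysis effort, the severity threshold, and an exclusion filter file are the knobs I reach for most often - something like this in the configuration block (the filter file's name and contents are up to you):

<configuration>
	<includeTests>true</includeTests>
	<effort>Max</effort>
	<threshold>Low</threshold>
	<excludeFilterFile>findbugs-exclude.xml</excludeFilterFile>
</configuration>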

Adding these extra checks involves a lot of pluses and minuses. The big minus is that you may end up spending a lot of time "fixing" bugs that don't really exist, time that you could instead spend actually writing your application (and writing new bugs that the tools won't find anyway). There's really nothing to be gained by carefully explaining to Eclipse for the hundredth time that toString always returns non-null.

Still, particularly when tested out in a small, low-surface-area app, this can be a good practice to learn and refine. Eventually, a move to Java 8 will help this more, and it certainly doesn't hurt to add in nullability annotations in the meantime. Overall, I think having the tooling help you avoid a whole suite of common "brain fart" bugs like this is worthwhile.

That Java Thing, Part 16: Maven Fallout

Tue Feb 23 14:33:51 EST 2016

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

So, after the last post's large task of converting to Maven, this step is mostly about picking up the pieces and expanding on some of the concepts. We'll start with M2Eclipse, usually rendered as just "m2e".

m2e

m2e is the set of plugins that acts as Eclipse's interface to Maven. It more-or-less replaces the earlier maven-eclipse-plugin, though you will likely still see references to that around. Eclipse doesn't have any inherent knowledge of how Maven works, so m2e has the complicated task of reading your projects' pom.xml files and adapting them to Eclipse's internal configuration. So, for example, in our projects it saw the presence of Tycho and determined that they should be imported as OSGi projects. In other cases, m2e may pick up the presence of things like Android plugins to trigger the use of the Android development tools.

Though it tries mightily, m2e is the source of a lot of the consternation that can come with a switch to Maven-based development. Because most Maven plugins don't have any inherent allowances for working in an Eclipse environment, adapters have to be written for each one in order for them to work with m2e - this is what yesterday's dialog about installing the Tycho adapters was about. In some cases, these don't exist and you have to tell m2e to ignore the plugin; in other cases, the adapters DO exist, but are flawed in some way. Most of the time, things go alright, but there are enough edge cases that it can be irritating.

For this kind of task, m2e is pretty unobtrusive, but it's important to know it's there.

Updating the .gitignore

One side effect of m2e's behavior is that it's now a good idea to remove Eclipse's project configuration files from the Git repository. This is not required, but it can avoid a number of annoying problems when dealing with multi-person Maven projects. To start with, open the .gitignore file from the root of your local Git repository (you can get to this easily using Eclipse's Git Repositories view, in the "Working Directory" part of the repo). Add some lines at the end to ignore .project and .classpath, so your whole file should now look like:

._*
Thumbs.db
.DS_Store

*.class

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
#*.jar
*.war
*.ear

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

# Eclipse project files
.project
.classpath

Depending on how your (hypothetical) team wants to work, it may also make sense to ignore the .settings/ directory, which stores some additional Eclipse project information. However, some of that information may be useful to share - for example, on-save code-cleaning behavior that isn't readily expressed in Maven.

Due to the way Git works, just adding the files to the .gitignore won't remove them from the repository: instead, they'll just no longer show up in the list for new changes. In order to also remove them from the repository without deleting them from the filesystem, go to the "Navigator" pane in Eclipse (if it doesn't show up currently, you can add it via Window → Show View → Navigator), find each .project and .classpath file in the four projects (some will only have the former), right-click, and choose Team → Advanced → Untrack:
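
If you'd rather do this from a terminal, the plain-Git equivalent of "Untrack" is to remove the files from the index while leaving them on disk - roughly this, run from the repository root and substituting your actual project paths:

git rm --cached some-project/.project some-project/.classpath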

Now, commit the changes - though the files remain on the filesystem, they should show up as deleted in the commit dialog:

The target Folder

This is one we've already prepped a bit for. Whereas most Eclipse projects store their binary output (Java class files, Jars, etc.) in the bin folder or elsewhere, the standard Maven behavior is to use target. For most of the projects, this doesn't matter - we had already configured the plugin to use a subfolder here for its classes, and the temporary files for other aspects don't matter. However, it's still important to know about this; when you're looking for the compiled or packaged output of a Maven project, this is the place to look, and we'll run into this when building the update site.

Building the Update Site

There's an important change involved now with how the update site is built: it does not involve opening the site.xml and clicking "Build All" anymore. Instead, it involves right-clicking the root project ("parent-xsp") and choosing Run As → Maven Install:

There are two logical followup questions when seeing this change: "what?" and "why?". They're both bound up in the nature of a Maven project and, significantly, the way Eclipse interacts with one. Maven is primarily a command-line tool - granted, it's a set of Java classes, but the primary way to interact with it is via the command line. m2e does a lot of work to interpret the projects in the same way as the `mvn` command-line tool, but it's still a secondary interpretation due to the way Maven and Eclipse work.

The way to fully build a Maven-ized project, using all of its configuration, is to run the command-line tool. Fortunately, m2e comes with its own embedded version and doesn't require you to use a terminal, but the abstraction is very leaky - and this is why you use "Run As" instead of any of the normal "Build"-related commands. The "Run As" commands construct a CLI-type environment and execute the embedded Maven, which is what then does the real work.

Since "parent-xsp" is the root of our projects, it's the starting point to execute a Maven build. When you run this, you'll see a lot of chatter in Eclipse's Console view, particularly the first run: Maven will seek out the plugins needed to build the projects and install them to the local Maven repository (stored in ".m2/repository" in your home folder). After that, it will build and package each of your projects. There's a whole phase system going on here (similar in concept to the XPages lifecycle), as well as many configuration options, but the important part here is that the "install" command (called a "goal" in Maven parlance) is the last phase that we will worry about, and it will cover everything we need here.

Upon completion, the Console text should end with something like this (incidentally, "Reactor" is Maven's term for the entire blob of modules being processed):

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] parent-xsp ......................................... SUCCESS [  0.617 s]
[INFO] com.example.xsp.plugin ............................. SUCCESS [  1.647 s]
[INFO] com.example.xsp.feature ............................ SUCCESS [  0.411 s]
[INFO] com.example.xsp.update ............................. SUCCESS [  4.143 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30.310 s
[INFO] Finished at: 2016-02-23T13:55:20-05:00
[INFO] Final Memory: 80M/191M
[INFO] ------------------------------------------------------------------------

Part of this process is the creation of the update site, which, due to how we configured it, will be represented twice in the "target" folder in "com.example.xsp.update": as a tree of files inside the "site" folder and also zipped up into the "site_assembly.zip" file. There's also a file named "site.zip", but that contains just the site.xml, which is not important. It's these files that you should now target with Designer and the NSF Update Site when updating the plugin. In fact, it'd be a good idea to delete the "features" and "plugins" folders from outside the "target" folder now - they won't be used any more.

As for the "Build All" button in site.xml, it's best to pretend it doesn't exist. It will still work, but it will break your Maven build, because it overwrites the "qualifier" in the version numbers. This is, admittedly, a drag: it's convenient having a clear, logical button to build the site, and it's very inconvenient that Eclipse doesn't tell you not to use it any more. However, the Maven process, besides being now required, has a nice advantage: now building the update site will no longer cause Git to want to check in the change. That's something that can get annoying very quickly when working with another developer on a non-Mavenized OSGi project.

Adding Back The Source Plugin

We'll finish the day on an easy one: adding back in the source plugin. Unlike the original setup, which used a separate feature to house the source plugin, we'll now include it in the same feature. You could also continue to have a separate source feature, which would be useful for very large projects where it would actually be a big burden to deploy the source to servers, but, for XPages libraries, it's generally not worth the cognitive hassle.

Since we already configured the Tycho source plugin earlier, this is just a matter of adding a reference to the (implied) source plugin to the feature.xml:

<?xml version="1.0" encoding="UTF-8"?>
<feature
      id="com.example.xsp.feature"
      label="Example XSP Library Feature"
      version="1.0.0.qualifier">

   <description url="http://www.example.com/description">
      [Enter Feature Description here.]
   </description>

   <copyright url="http://www.example.com/copyright">
      [Enter Copyright Description here.]
   </copyright>

   <license url="http://www.example.com/license">
      [Enter License Description here.]
   </license>

   <plugin
         id="com.example.xsp.plugin"
         download-size="0"
         install-size="0"
         version="0.0.0"
         unpack="false"/>

   <plugin
         id="com.example.xsp.plugin.source"
         download-size="0"
         install-size="0"
         version="0.0.0"/>

</feature>

Now, the source will be included in the output and bundled with the main feature when it's installed. Commit this change:


At this point, we're basically back to where we were previously, OSGi-wise, but in a much better position to scale the project further and take advantage of supporting systems. The next couple posts will cover some of those potential systems, as well as remaining large conceptual topics. There's a great deal to know when it comes to Maven, but it's all helpful.

That Java Thing, Part 15: Converting the Projects

Mon Feb 22 10:26:54 EST 2016

Tags: maven java
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Prelude: there was a typo in the previous entry. Originally, the file URL read "file://C:/IBM/UpdateSite", but, on Windows, there should be another slash in there: "file:///C:/IBM/UpdateSite". I've corrected the original post now, but you should make sure to fix your own settings.xml file if needed. Otherwise, Maven will complain down the line about the URI having "an authority component".

The time has come to do the dirty work of converting our existing plugin projects to Maven. There will be some filesystem-side reorganizing and not every project will make it (looking at you, source project), but overall it's mostly a job of pasting a bunch of XML into new files.

For the first leg of this, I recommend removing the projects from your Eclipse workspace by selecting them, right-clicking, and choosing Delete:

On the confirmation dialog, do not select "Delete project contents on disk" - we don't actually want to get rid of the files.

Next, find the projects on your filesystem, create a new folder alongside them named "com.example.xsp", and move the projects inside it. In Maven parlance, we're creating a "multi-module project", and this new folder is the top level in our module hierarchy. This can contain arbitrary levels and can be very helpful in project organization, but this will be a pretty simple parent-and-children case. Then, dive into the folder and delete the "com.example.xsp.source.feature" project - we'll be able to generate it through Maven now, and so we can trim down our project count slightly.

Now, create a file named pom.xml in the "com.example.xsp" folder alongside the subfolders, and fill its contents with this:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.example</groupId>
	<artifactId>parent-xsp</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	
	<packaging>pom</packaging>

	<modules>
		<module>com.example.xsp.plugin</module>
		<module>com.example.xsp.feature</module>
		<module>com.example.xsp.update</module>
	</modules>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<tycho-version>0.24.0</tycho-version>
		<compiler>1.6</compiler>
	</properties>

	<repositories>
		<repository>
			<id>Luna</id>
			<layout>p2</layout>
			<url>http://download.eclipse.org/releases/luna/</url>
		</repository>
		<repository>
			<id>notes</id>
			<layout>p2</layout>
			<url>${notes-platform}</url>
		</repository>
	</repositories>

	<build>
		<plugins>
			<!--
				Maven compiler options
			-->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			
			<!--
				Tycho plugins
			-->
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-maven-plugin</artifactId>
				<version>${tycho-version}</version>
				<extensions>true</extensions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<strictVersions>false</strictVersions>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-compiler-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-source-plugin</artifactId>
				<version>${tycho-version}</version>
				<executions>
					<execution>
						<id>plugin-source</id>
						<goals>
							<goal>plugin-source</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>target-platform-configuration</artifactId>
				<version>${tycho-version}</version>
				<configuration>

					<pomDependencies>consider</pomDependencies>
					<dependency-resolution>
						<extraRequirements>
							<requirement>
								<type>eclipse-plugin</type>
								<id>com.ibm.notes.java.api.win32.linux</id>
								<versionRange>[9.0.1,9.0.2)</versionRange>
							</requirement>
						</extraRequirements>
						<optionalDependencies>ignore</optionalDependencies>
					</dependency-resolution>

					<filters>
						<!-- work around Equinox bug 348045 -->
						<filter>
							<type>p2-installable-unit</type>
							<id>org.eclipse.equinox.servletbridge.extensionbundle</id>
							<removeAll />
						</filter>
					</filters>

					<environments>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>macosx</os>
							<ws>cocoa</ws>
							<arch>x86_64</arch>
						</environment>
					</environments>
					<resolver>p2</resolver>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

So... yeah, there's a lot going on here. This is the biggest of the "XML dumps" we're going to have and contains by far the greatest number of bizarre "you just have to know about it" parts. "POM" stands for "Project Object Model" - it's the language Maven uses to describe the project. Let's tackle the file from near the top (ignoring the XML header):

project and modelVersion

These elements are effectively just boilerplate: the project element is the root of our project descriptor and it contains some definitions to let XML editors parse the file format. In turn, the modelVersion describes to Maven the specific version we're working with, which has been "4.0.0" for as long as I've been doing this.

groupId, artifactId, and version

These elements are obligatory in one form or another in every project, but are less copy-and-paste-able: they define the name and version of your project. These are Maven's equivalents to OSGi's Bundle-SymbolicName and Bundle-Version, though Maven makes an explicit distinction between the overall grouping of the plugin and its specific name. These are essentially arbitrary, but the convention is to use the standard reverse-DNS version of your domain name for the group ID, and then keep this group ID consistent across different projects (...mostly). The artifact ID is less consistent, but it's good to pick a pattern like "projectPrefix-submodule". Here, we actually reverse that a bit to call it "parent-xsp" in order to emphasize that this project's purpose is entirely to be a parent to the submodules and not an interesting artifact to consume itself. We'll break this convention again for the submodules due to our use of Tycho/OSGi.
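
As a purely hypothetical illustration of that convention, a more typical (non-Tycho) project might use coordinates along these lines:

<groupId>com.example</groupId>
<artifactId>myproject-core</artifactId>
<version>1.0.0-SNAPSHOT</version>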

packaging

A project's packaging describes the sort of output. By default, if this is left un-specified, it's jar - a normal, run-of-the-mill Jar file. There are a few other common ones you may run into - such as war for J2EE web apps or bundle for non-Tycho OSGi bundles - and the one we're using here is pom. This is actually kind of the "none of the above" option: the "pom" is just the file we're editing now, and is included with every project type. Having a packaging type of pom generally means that either the project has no real outputs of its own (as is the case here) or it's an "ad hoc" project that doesn't fit an existing type.

modules

This block is the hallmark of a parent project: it lists the relative folder paths that contain the submodules. In this case, the names line up with the names we'll use for the submodules, but this could potentially vary depending on the folder names and locations. Parent-child module relationships don't have to be physically hierarchical on the filesystem, but it's a good convention when you don't have a specific reason to break it.

properties

This block is the project-level equivalent to the user-level property we defined in .m2/settings.xml. There are two types of properties that can go here, with no obvious distinction between them: known configuration properties used by Maven itself and arbitrary named variables used by the person writing the pom.

The first property - project.build.sourceEncoding - is an example of the former. During execution, Maven will reference this property defined in the project (or one of its parents) when determining the text file encoding to use. This could be set to something else if you're working with non-Unicode files, but it's important to set it here so that file interpretation will not be platform-dependent. These property names can be read as a sort of EL for the project XML: project.build.sourceEncoding sets the sourceEncoding property within the build node of the project, just more concisely (more or less).

The other two are variables for use later. The names of these have only very loose conventions, but there seem to be a couple common types: ALL_CAPS, camelCase, and hyphen-delimited. You can also specify variables as dot.delimited and they will work the same way, but that makes them more difficult to distinguish from the system-level properties.

repositories

The repositories block is the start of our OSGi-related weirdness. The block itself isn't OSGi-specific - it has its role in other projects that want to make use of dependencies outside of the core public Maven repositories - but the content is. We're setting two repositories here: one to point to the main Eclipse repository (the Luna version here, but that could just as well be Mars or Kepler) and one to point to the XPages Update Site. This is where the property we set before comes into play, allowing different developers to keep the update site in different locations without changing the project's config.

build

The build section is often the largest part of a POM file - it contains definitions and configuration for various additional Maven plugins used during compilation. We have two tasks to accomplish here: ensure that we use Java 6 for compilation at the root level (so the build doesn't execute as Java 7 or 8 and produce output incompatible with Domino) and enable a whole slew of Tycho plugins.

The maven-compiler-plugin block specifies the version of the Java compiler plugin to use (3.1, which is actually kind of old, but the differences aren't important) and then provides it with some configuration to set the Java version level and to not choke on forbidden references. Like the Java version, the latter is a nod to Domino: depending on your JVM configuration, you may run into forbidden-reference errors relating to the lotus.domino classes.

The next slew of blocks all relate to loading up various Tycho components. Tycho's job is essentially to construct an entire OSGi environment during the Maven build, and it consists of a number of moving parts, many of which are basically the Tycho version of normal Maven facilities. The tycho-maven-plugin is the core, and its extensions rule is what allows it to worm itself into various phases of the build process. The tycho-packaging-plugin controls the process of bundling the projects as their various types: the plugin, the feature, and the update site (in our case). The tycho-compiler-plugin is the Tycho variant of the Maven one we configured earlier. The tycho-source-plugin is what allowed us to kick the standalone source feature to the curb - it's the equivalent of the Eclipse-specific feature we had been hooking into before.

The target-platform-configuration plugin is the scariest of the bunch. This plugin's job is to establish the OSGi Target Platform we're working with, in conjunction with the repositories specified above. Not all of this configuration is necessary for our immediate needs, but it may come in handy later. The pomDependencies rule is useful when using mixed-type dependencies in more-complicated projects, while the extraRequirements block forces the inclusion of the plugin fragment that contains Notes.jar. The optionalDependencies rule comes in handy from time to time with XPages projects: there are sometimes cases where there's a dependency that Eclipse has and which the server will have, but which is awkward to get to in Maven, usually relating to dependencies-of-dependencies not needed for compilation. The filters block is... I don't know; just keep it in there. The environments block is a way to describe the platforms on which your code can execute - I believe this is primarily used when testing. The resolver is like filters in that it's an "I just copy it around" thing; presumably, it refers to a specific code path for resolving plugin dependencies.

Whew!

Okay, so... that's the first one down! For the most part, you can carry around the whole bottom section pretty much as-is for your XPages Maven projects (as I do), and then gradually become comfortable with the specifics over time. Now, there's some good news and some bad news:

  • The bad news is that there are three more POM files to write.
  • The good news is that they are much, much simpler.

In recent versions of Tycho, they've added the ability to reduce the number of POMs involved, but there are limits to that, particularly to do with Jenkins, so we'll stick to the traditional way for now.

com.example.xsp.plugin

Go into the "com.example.xsp.plugin" folder and create a new pom.xml containing this:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.plugin</artifactId>
	<packaging>eclipse-plugin</packaging>
</project>

Like I promised: much simpler. Because the parent POM already brought in all the Tycho plugins and configuration, all we need to do here is the basics. One slightly-unusual aspect here is the packaging type. eclipse-plugin isn't a packaging type known to Maven inherently; instead, it's provided by Tycho, but can be used in the same way.

The artifact ID here is a concession to Tycho: Maven artifact IDs don't usually follow the same full-reverse-DNS convention as OSGi, but Tycho wants the artifact ID to match the OSGi bundle name.

com.example.xsp.feature

Next up is the pom.xml file in the "com.example.xsp.feature" folder:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.feature</artifactId>
	<packaging>eclipse-feature</packaging>
</project>

This is very similar to the last, with the only real differences being the artifact ID and the packaging type.

com.example.xsp.update

Now, the pom.xml in "com.example.xsp.update":

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.update</artifactId>
	<packaging>eclipse-update-site</packaging>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<archiveSite>true</archiveSite>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

This one's slightly longer, but not by too much. Beyond the different artifact ID and packaging, we also provide some additional configuration to the tycho-packaging-plugin. This is among the plugins that were established in the root POM, but it's re-defined here in order to enable the archiveSite configuration option. This will give us a nice ZIP file of the Update Site at the end.

There's one other thing to note here: Tycho considers the eclipse-update-site packaging type to be deprecated, and it may be removed in the future. In most examples you'll see outside of Domino, people use eclipse-repository instead. This gets back to the difference between the old-style ("site.xml") Eclipse update sites and the new-style ("category.xml") p2 repositories. For now, we use the old-style variant because it works better with Notes and Domino.

In addition to this POM file, we also have two changes to make in the site.xml: remove the source-feature reference (we'll add this back elsewhere later) and clean up the versions:

<?xml version="1.0" encoding="UTF-8"?>
<site>
   <feature url="features/com.example.xsp.feature_1.0.0.qualifier.jar" id="com.example.xsp.feature" version="1.0.0.qualifier">
      <category name="Example"/>
   </feature>
   <category-def name="Example" label="Example"/>
</site>

The version change is to replace the Eclipse-generated timestamps at the end of the versions with "qualifier". The reason for this is that, for Tycho, the site.xml acts as a pure configuration file and will no longer be the site index itself. So Tycho wants the qualifier to be generic, and it will then fill it in during compilation. Like the source feature, this will be covered more later.

Last Steps

With our POM files defined, the last step for now is to import the projects back into Eclipse. In Eclipse, go to File → Import, expand the "Maven" category, and choose "Existing Maven Projects":

On the next screen, browse to the "com.example.xsp" directory created earlier. If all goes well, this should find the four projects in their hierarchy:

Everything on this screen can be left as the defaults, though you may want to specify a more-descriptive working set name - that doesn't affect the project behavior.

When you click "Finish", Eclipse with churn for a bit and then, if you're running Mars and haven't done this before, it will present a dialog about "Maven plugin connectors":

The specifics of what is going on here are a large topic of their own, but the short of it is that Eclipse needs specialized plugins to deal with each Maven plugin, and in this case it's looked for (and found) connectors for Tycho. Click "Finish", "Next", and "OK", accept the license terms, and restart Eclipse as it tells you to.

When Eclipse restarts, it should go through some churning while it updates the Maven projects and should finally settle on no remaining errors.

Closing Out

There will be some things to discuss with the fallout from this conversion, but this will do it for today. Commit your changes, stand up and stretch, and grab a cup of relaxing tea:

That Java Thing, Part 14: Maven Environment Setup

Sun Feb 21 17:51:37 EST 2016

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Before diving into the task of converting our plugin projects to Maven, there's a bit of setup we need to do. In a basic case, Maven doesn't require much setup beyond the project file itself - it's a "convention over configuration" type of thing that tries to make doing things the default way smooth. However, since it's also a "Java" thing, that means that anything out of the ordinary requires a bunch of XML.

Our big "out of the ordinary" aspect is OSGi. Maven and OSGi are often at loggerheads, but the conflict won't be too great in our situation. Still, it does mean there will be a few hoops to jump through, and one of those hoops is dealing with our dependency on the XPages runtime plugins. Since these plugins are not packaged as fully-Mavenized artifacts (yet (hopefully)), we'll need to configure Tycho to read the p2 (Eclipse) style.

In part 3, we downloaded the Build Management Update Site from OpenNTF, and we'll reuse that here. What we need to do is create a global Maven settings file that, for now, will just contain a definition of a variable to point to this update site. It would also be possible to specify this inside the project itself, but it's better form to use a consistent variable name (the most common convention is notes-platform) in the project files and then have your local settings point to it on your machine.

The global Maven settings file is called settings.xml and is stored in a folder named .m2 in your user's home directory (e.g. C:\Users\someuser\.m2 or /Users/someuser/.m2). Creating a folder named with a leading dot can be a pain in Explorer and the Finder, so it may be necessary to drop into a command line or other tool to do it. One way or another, create this file and set its contents similar to this:

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
	<profiles>
		<profile>
			<id>main</id>
			<properties>
				<notes-platform>file:///C:/IBM/UpdateSite</notes-platform>
			</properties>
		</profile>
	</profiles>
	<activeProfiles>
		<activeProfile>main</activeProfile>
	</activeProfiles>
</settings>

Adjust the file:// URL as necessary to point to the location on your computer. It has to be a file URL and not a normal path, presumably because repositories are usually expected to be remote HTTP sites.

This is the only configuration we need before getting to the project, but it's a good preview of the sort of "try pasting this big block of XML somewhere" advice you're in for when it comes to Maven use. Over time, the structure of the XML and how it relates to Maven's behavior begins to crystallize, but it's definitely cumbersome to start with, and it will get more opaque before it gets less so.

Depending on your proclivities, this may be a good opportunity to install standalone Maven as well. Eclipse has its own embedded version, so this is not required, but it can be handy sometimes to be able to run Maven from the command line. On your average Linux distribution or OS X with Homebrew, Maven should be installable with a package manager. Otherwise, Maven can be downloaded from maven.apache.org - it doesn't have an installer as such, as it's essentially some scripts around Java classes, but they have tips for adding it to your path.
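
For example, on OS X with Homebrew, installing it and checking that it's on the path is just:

brew install maven
mvn -version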

Next, we'll get to the real meat of this process: actually converting the projects to Maven.

That Java Thing, Part 12: Expanding the Plugin - JAX-RS

Thu Dec 03 15:21:28 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

A couple of months back, Toby Samples wrote a series on using Wink in Domino. I'm not going to cover all of that ground here, but it's still worth dipping into the topic a bit, as writing REST services in an OSGi plugin is a great way to not only add capabilities to your XPages apps, but also to start making your data (and programming skills) more accessible from other platforms.

There are two main terms to know here: "JAX-RS" and "Wink".

JAX-RS stands for "Java API for RESTful Web Services". If you're observant, you may notice that there's no "X" in that phrase. This is because the "JAX" part is sort of a legacy hanger-on from the common "JAX" prefix used in JAX-WS and the other Java XML APIs... even though JAX-RS doesn't necessarily involve any XML. In any event, JAX-RS is a specification for a way to write RESTful web services in Java with (mostly) inline configuration using annotations.

Apache Wink is a specific implementation of this standard. For the most part, with these Java standards, the actual implementation shouldn't matter too much, but there are always edge cases. In any event, Wink has historically been the preferred method of doing this on Domino by virtue of the fact that the ExtLib's DAS is built on it and so it ships with Domino, along with some IBM support code.

Setting up JAX-RS services is pretty similar to the addition of the XSP Library and theme provider: we'll need to create the Java classes and add some configuration to the plugin.xml. In this case, we'll do it in reverse, starting with the configuration first before diving into the servlet class.

There are actually two ways to proceed here. In the past, I've added servlets directly to the plugin.xml file. This works fine, but Toby's series took a different and more-interesting route: instead of having the plugin provide a servlet alone, it provides a web application which in turn contains a servlet. Doing it this way brings the configuration more in line with what you'd see in a non-OSGi Java web application.

To start with, we'll need to add dependencies on four more plugins. Open the META-INF/MANIFEST.MF file, go to the "Dependencies" tab, and add "org.apache.wink", "com.ibm.domino.osgi.core", "org.eclipse.equinox.http.registry", and "com.ibm.wink":

One word of caution: when adding "org.eclipse.equinox.http.registry", make sure to add version "1.0.100". Eclipse will likely have a newer version available as well, but that won't be there on Domino. Setting the version requirement too high will cause the plugin to fail to load.

While on this screen, add "lotus.domino" in the "Imported Packages" section of the page. OSGi has two mechanisms for specifying dependencies - until this point, we've used exclusively the "Required Plug-ins" route, which attaches a plugin by name as a dependency. The "Imported Packages" route is a little different: it says "I want this Java package available, but I don't care what plug-in provides it". Outside of the Domino world, the Packages route is very common, but for our needs we usually use Plug-ins to avoid some common nitty-gritty issues with the XPages stack.

The plugin.xml addition is pretty similar to the services we added before, but with some different tags. We're going to add another extension point to the list, along with a couple lines of configuration. With a minor cleanup to the previous lines, the result should look like this:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
	<extension point="com.ibm.commons.Extension">
		<service class="com.example.xsp.ExampleLibrary" type="com.ibm.xsp.Library"/>
	</extension>
	<extension point="com.ibm.commons.Extension">
		<service class="com.example.xsp.theme.ExampleStyleKitFactory" type="com.ibm.xsp.stylekit.StyleKitFactory"/>
	</extension>
	<extension point="org.eclipse.equinox.http.registry.servlets">
		<servlet alias="/examplerest" class="com.example.xsp.servlet.ExampleServlet">
			<init-param name="applicationConfigLocation" value="/WEB-INF/exampleapplication"/>
			<init-param name="propertiesLocation" value="/WEB-INF/exampleservlet.properties"/>
			<init-param name="DisableHttpMethodCheck" value="true"/>
		</servlet>
	</extension>
</plugin>

The code inside of the extension point is sort of a compressed version of the kind of servlet configuration you see in JEE's web.xml file.
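
For comparison, a rough (hypothetical) web.xml rendition of that same configuration would look along these lines:

<servlet>
	<servlet-name>ExampleServlet</servlet-name>
	<servlet-class>com.example.xsp.servlet.ExampleServlet</servlet-class>
	<init-param>
		<param-name>applicationConfigLocation</param-name>
		<param-value>/WEB-INF/exampleapplication</param-value>
	</init-param>
	<init-param>
		<param-name>propertiesLocation</param-name>
		<param-value>/WEB-INF/exampleservlet.properties</param-value>
	</init-param>
</servlet>
<servlet-mapping>
	<servlet-name>ExampleServlet</servlet-name>
	<url-pattern>/examplerest/*</url-pattern>
</servlet-mapping>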

This is actually an area where Toby's series and these instructions diverge. In his series, he used a different extension point - com.ibm.pvc.webcontainer.application - to provide his servlet as part of a full web application. That is a very interesting route as well, and opens the door to broader Java application development. Either one will get the job done, but the "just the servlet" route saves a bit of cognitive overhead for now.

Now, to create the files mentioned in the configuration. In the src/main/resources folder, create a folder named WEB-INF and add the exampleapplication and exampleservlet.properties files, both plain text:

The exampleservlet.properties file can be left blank for now. It's there to provide Wink-specific initialization properties that are useful as you get further into it, but are not needed now.

The exampleapplication file lists the classes that will provide the individual REST resources, one per line. For now, we'll just have one:

com.example.xsp.servlet.HelloWorldResource

With the configuration set up, it's time to create two Java classes. First, create a new class named "ExampleServlet" in the "com.example.xsp.servlet" package and have it extend "com.ibm.domino.services.AbstractRestServlet":
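In case it's handy to see it spelled out rather than just via the wizard, the resulting class can be as bare as this minimal sketch:

package com.example.xsp.servlet;

import com.ibm.domino.services.AbstractRestServlet;

public class ExampleServlet extends AbstractRestServlet {
	// Intentionally empty - the superclass does the heavy lifting,
	// and this class mostly exists as a named hook point for the registration above
}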

This class doesn't actually need to do anything other than exist for our purposes, but it can be a useful point for hooking in behavior that occurs for each servlet request. The actual code will go in the "HelloWorldResource" class mentioned in the application file. So create that class in the "com.example.xsp.servlet" package. Unlike the previous class, this one doesn't need to extend anything:

This one contains the core of our business logic (such as it is):

package com.example.xsp.servlet;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.UriInfo;

import lotus.domino.NotesException;
import lotus.domino.Session;
import lotus.domino.Database;

import com.ibm.commons.util.StringUtil;
import com.ibm.commons.util.io.json.JsonJavaObject;
import com.ibm.domino.osgi.core.context.ContextInfo;

@Path("/helloworld/{id}")
public class HelloWorldResource {
	
	@GET
	@Produces(MediaType.APPLICATION_JSON)
    public Response get(
    		@Context UriInfo context, @PathParam("id") String id, @QueryParam("echo") String echo
    		) throws NotesException {
    	JsonJavaObject json = new JsonJavaObject();
    	
    	json.put("payload", "Hello world");
    	Session session = ContextInfo.getUserSession();
    	if (session != null) {
    		json.put("userName", session.getEffectiveUserName());
    	}
    	Database database = ContextInfo.getUserDatabase();
    	if(database != null) {
    		json.put("databaseFilePath", database.getFilePath());
    	}
    	if(StringUtil.isNotEmpty(echo)) {
    		json.put("echo", echo);
    	}
    	if(StringUtil.isNotEmpty(id)) {
    		json.put("id", id);
    	}
    	
    	return Response.ok(json.toString()).build();
    }
}

There are quite a few things going on here! Let's start with the JAX-RS annotations. JAX-RS, like other "new-era" Java APIs, uses annotations heavily in order to put configuration inline and cut down on the number and size of the giant blobs of XML that came before. They look a bit unusual if you haven't encountered Java annotations before, but their intent here is hopefully clear.

At the top, the @Path("/helloworld/{id}") annotation indicates that this class is hooked into "/helloworld" within our servlet, and moreover expects a path parameter that it will refer to as "id". Then, the @GET annotation indicates that our get method will respond to that HTTP method (the name of the method is arbitrary - it could be "foobar" and still work). The @Produces(MediaType.APPLICATION_JSON) annotation further indicates that the method will supply JSON. There are other ways to accomplish this if the method may return different content types, but we know it's always JSON here.

Inside the method signature, there's a bit more magic going on. We're never going to call this method in any code ourselves, but instead Wink is going to inspect it and its parameter list in order to provide what it requests. By annotating the first parameter with @Context, we're telling Wink that we want to have that populated with the context information for the servlet - this is similar to ExternalContext from XSP development. This object isn't actually used in the example (and could be removed), but I've found that it's handy to have it consistently there. The other two parameters use @PathParam("id") and @QueryParam("echo") to extract the "id" parameter we specified earlier as well as look for "echo=..." in the query string.

The method itself uses IBM's JSON library to produce a simple JSON object containing the information we passed in, as well as some Domino-specific stuff, using the class ContextInfo. This class is sort of like the equivalent of ExtLibUtil for Equinox servlets, at least when it comes to getting access to the current session and database. ExtLibUtil itself, though available, doesn't function properly in a servlet, and so we use this instead. The current session is straightforward enough and works like you'd expect, but the current database is interesting. When you register a servlet via this method, it becomes available not only via "yourserver.com/examplerest", but also within each NSF, like "yourserver.com/foo.nsf/examplerest". By default, that will just provide the same results, but if your code uses ContextInfo.getUserDatabase(), you can get the database the URL is requesting. This is how DAS does its thing, and it allows you to write a servlet that can operate in a contextually-appropriate way in each database.

So phew! Now that this is all written, you can build your update site and deploy it to the server. After restarting HTTP, you should be able to go to a URL like "http://yourserver.com/foo.nsf/examplerest/helloworld/someid?echo=hey", and see a result like this:

{
	"echo": "Hey",
	"userName": "CN=Jesse Gallagher/O=IKSG",
	"id": "someid",
	"databaseFilePath": "foo.nsf",
	"payload": "Hello world"
}

To finish, it's time to commit the changes.

Next up, it will be time to start taking the plunge into Maven. Don't worry; it'll be fun!

That Java Thing, Part 11: Diagnostics

Tue Dec 01 08:43:45 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Though my surprisingly-packed schedule the last few weeks caused a hiatus, it's time to return to this series with a quick description of some of the diagnostic tools available to you when doing plugin development (outside of connecting the debugger, which I may do eventually).

The primary tool in your "what the heck is going on?" toolbox should be the XPages Log File Reader. This app does a wonderful job providing a web UI for the important diagnostic files you'll likely need to see during plugin development (or normal XPages development as well). Even if you have control over the server and could see the files on the filesystem, having them in one UI is invaluable. By looking through the available tabs and pages, you can usually track down some error message related to your problem. So, if you don't have this installed currently, make a point of adding it.

Your other best friend will be the oddly-named XPages Portable Command Guide. Its purpose isn't as immediately clear as Mastering XPages, but it contains valuable nuggets of wisdom. In particular, for the purposes of developing plugins, it describes how to use the OSGi console commands.

By running tell http osgi followed by an appropriate command on the Domino server console, you can find out a bunch of important diagnostic state information about installed plugins. Things are a little obtuse in there, but eventually you learn enough to glean what you'll need to know. There are three primary commands I use: ss, diag, and bundle.

The ss command allows you to see a status summary of plugins matching a given name prefix. So, for example, if you run it for "com.example", you should get a listing like this:

[1140:00D0-15DC] 12/01/2015 07:44:05 AM  Remote console command issued by Jesse Gallagher/IKSG: tell http osgi ss com.example
[0FA8:0002-0C54] 12/01/2015 07:44:05 AM  Framework is launched.
[0FA8:0002-0C54] 12/01/2015 07:44:05 AM  id State       Bundle
[0FA8:0002-0C54] 12/01/2015 07:44:05 AM  79 RESOLVED    com.example.xsp.plugin.source_1.0.0.201511121147
[0FA8:0002-0C54] 12/01/2015 07:44:05 AM  83 ACTIVE      com.example.xsp.plugin_1.0.0.201511121147

The "State" column's values are states from the OSGi lifecycle, and it's worthwhile to at least read the list on that page. The gist of it is that "INSTALLED" is bad (though one step better than not being listed at all), while "RESOLVED" means "working but not yet activated", "<<LAZY>> means it is waiting to be used (like an XSP library), and "ACTIVE" means all is well (probably). This can be a very useful tool for seeing whether or not a plugin is properly on the server at all, and then whether there is an issue.

The diag command is the next level to drill down into when you want to get information about a misbehaving plugin. If you pass the full name of a plugin (not just the prefix), it will tell you if there are any missing dependencies. Running it on the example plugin, you should get something like this:

[1140:00D0-0D3C] 12/01/2015 08:18:41 AM  Remote console command issued by Jesse Gallagher/IKSG: tell http osgi diag com.example.xsp.plugin
[0FA8:0002-0C54] 12/01/2015 08:18:41 AM  initial@osginsf:osgi-dev.nsf/D3A3F506C30EFAB585257EFB005C40D1/com.example.xsp.plugin_1.0.0.201511121147.jar [83]
[0FA8:0002-0C54] 12/01/2015 08:18:41 AM    No unresolved constraints.

If, however, the plugin has a dependency that is unfulfilled (like this arbitrary one I added for this purpose), you'll get a different message:

[1140:00D0-140C] 12/01/2015 08:26:36 AM  Remote console command issued by Jesse Gallagher/IKSG: tell http osgi diag com.example.xsp.plugin
[0D50:0002-112C] 12/01/2015 08:26:36 AM  initial@osginsf:osgi-dev.nsf/EE0B673EB07C7AC985257F0E0049B171/com.example.xsp.plugin_1.0.0.201512010823.jar [12]
[0D50:0002-112C] 12/01/2015 08:26:36 AM    Direct constraints which are unresolved:
[0D50:0002-112C] 12/01/2015 08:26:36 AM      Missing required bundle org.eclipse.egit.mylyn.ui_4.0.3.

A slight variant of that situation is when there is an optional bundle that is missing, and this is usually fine. For example, it's common to see XPages plugins that have dependencies on "com.ibm.notes.java.api", which is the nicely-packaged version of the lotus.domino classes, but which is not actually on the server (since it uses Notes.jar directly). In that situation, the message is similar, but not a problem:

[1140:00D0-0B70] 12/01/2015 08:29:57 AM  Remote console command issued by Jesse Gallagher/IKSG: tell http osgi diag com.example.xsp.plugin
[0E18:0002-15A4] 12/01/2015 08:29:57 AM  initial@osginsf:osgi-dev.nsf/FB1CC2377927985185257F0E0049F6E8/com.example.xsp.plugin_1.0.0.201512010827.jar [12]
[0E18:0002-15A4] 12/01/2015 08:29:57 AM    Direct constraints which are unresolved:
[0E18:0002-15A4] 12/01/2015 08:29:57 AM      Missing optionally required bundle com.ibm.notes.java.api_9.0.1.

The final command is much more verbose and usually the least useful: bundle. What this does is spit out a feed of pertinent information about the plugin. The first section of it looks like this:

[1140:00D0-0AF0] 12/01/2015 08:32:10 AM  Remote console command issued by Jesse Gallagher/IKSG: tell http osgi bundle com.example.xsp.plugin
[0E18:0002-15A4] 12/01/2015 08:32:11 AM  initial@osginsf:osgi-dev.nsf/FB1CC2377927985185257F0E0049F6E8/com.example.xsp.plugin_1.0.0.201512010827.jar [12]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM    Id=12, Status=<<LAZY>>    Data Root=C:\Program Files\IBM\Domino\data\domino\workspace\.config\org.eclipse.osgi\bundles\12\data
[0E18:0002-15A4] 12/01/2015 08:32:11 AM    No registered services.
[0E18:0002-15A4] 12/01/2015 08:32:11 AM    No services in use.
[0E18:0002-15A4] 12/01/2015 08:32:11 AM    Exported packages
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      com.example.xsp.beans; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.builder; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.concurrent; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.event; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.exception; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.math; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.mutable; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.reflect; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.text; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.text.translate; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.time; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.apache.commons.lang3.tuple; version="0.0.0"[exported]
[0E18:0002-15A4] 12/01/2015 08:32:11 AM    Imported packages
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      org.osgi.framework; version="1.4.0"<System Bundle [0]>
[0E18:0002-15A4] 12/01/2015 08:32:11 AM      com.ibm.designer.runtime; version="0.0.0"<update@../../shared/eclipse/plugins/com.ibm.xsp.core_9.0.1.20150605-1000/ [256]>

The "Exported packages" part is usually the most useful: it lets you ensure that the packages you think are being exported are indeed being exported. This won't tell you for sure that all of the classes in those packages are present and working, but it's a start.

The "Imported packages" section, which can run for hundreds of lines, is very verbose, but is potentially useful if you want to track down why code in your plugin that interacts with the outside is misbehaving. Perhaps it hasn't found the right dependent package at runtime, or perhaps the version of the other plugin it's using is not what you expect.

One word of warning about the bundle command: it can easily output more information than the Administrator remote console will display, so it's useful to look at the server console directly or at the logs to see everything. Fortunately, the lines that get cut off are usually the last few, which relate to permission setup and aren't normally relevant for this need.


These commands so far have applied to the Domino server, but they also apply to Notes/Designer. To see the OSGi console, though, you need to launch Notes specially, with "-RPARAMS -console" in the command line. I keep a second shortcut on my Start menu around for this purpose. When you launch it, there will be quite a bit of noise in there, since Eclipse is a very chatty thing, but it can be invaluable in a pinch. When using this console, since it's the OSGi console directly and not routed through a server task, you can drop the tell http osgi prefix to the commands and just do things like ss com.example directly.
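For reference, the shortcut target is just the normal Notes launcher with those parameters appended - something like the line below, adjusting the path to wherever your own client is installed (the path here is just a common default, not a requirement):

"C:\Program Files (x86)\IBM\Notes\notes.exe" -RPARAMS -console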

The client also has useful diagnostic logs, though there's no handy XPages application to go along with it (unless it happens to run in the local web preview, I guess). Instead, if you go to Help → Support in Designer, you have a couple options to view log and trace information. If you're working on a plugin and don't see it in the Xsp Properties page list, this is the place to check.


In the next couple of posts, we'll add a little more code to the plugin, diving into using JAX-RS to serve web services, before marching towards Maven-ization.

That Java Thing, Interlude: Effective Java

Mon Nov 16 08:15:53 EST 2015

Tags: java
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

While taking a short breather in my continuing Java series, I think that now is a good time to reiterate my advice for all Domino developers to read Effective Java. It's probably not the best way to learn Java from scratch, but it's an invaluable tour through tons of important Java concepts. Even if you don't use most of the knowledge immediately, reading every section will help immerse you in the language and give you a better appreciation for its texture, which is one of the most important aspects of being a better programmer.

There is one caveat, though, when it comes to serialization. The serialization chapter in the book, though characteristically thorough and accurate, paints a much more dire and complicated picture of serialization than we as XPages developers usually have to worry about. It focuses on long-term storage of serialized objects - say, on the filesystem as a data format - whereas most of what goes on in an XPages app is to make sure that your managed beans and data contexts don't throw exceptions when you do a partial refresh. Though we do run into it a bit when storing Java objects in Domino documents with MIMEBean or ODA, a managed bean class can get away with just "implements Serializable" attached and not a second thought.

Now go, make haste to Amazon and purchase the book!

That Java Thing, Part 10: Expanding the Plugin - Serving Resources

Thu Nov 12 12:02:24 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

After sharing code, one of the handiest uses of a plugin is sharing web resources - stylesheets, JavaScript files, and so forth. This process is similar to the last couple steps in that, though it is not very complicated on the whole, it's pretty non-obvious how to get around to doing it.

To start with, we'll create some resources to serve up. Expand the src/main/resources folder in your project (it will be slightly more useful to use the "normal" folder version and not the source folder with the brown package icon, due to Eclipse's UI) and go to New → Other, then pick "Folder" within "General". Name the folder "web":

Then create within it a folder named "example", and within that, folders named "css" and "js":

Then, populate it with some files - I created a basic stylesheet named "style.css" and a JavaScript file named "script.js" and placed them within the css and js folders, respectively. It doesn't matter what they contain, as long as it's something you can test later; you could also drag in any files you have from the filesystem.

As a side note, as I did this, creating folders within the hierarchy, Eclipse got a bit wonky about refreshing the list, presumably because of the combined source/normal folder nature of src/main/resources. This inconvenience is a concession to the way m2e, the Maven integrator for Eclipse, will act later - it makes this folder a source folder automatically, so we may as well get used to it.

Next, create a theme file within src/main/resources directly (which is to say, not in the web folder) and name it "example.theme". When you create it, Eclipse will likely have trouble opening up an editor: it will try to use one from the OS, which will almost definitely be invalid. Instead, right-click the file and choose Open With → Text Editor:

Set its contents to:

<theme extends="webstandard" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="platform:/plugin/com.ibm.designer.domino.stylekits/schema/stylekit.xsd" >
	<resources>
		<styleSheet href="/.ibmxspres/.extlib/example/css/style.css"/>
		<script clientSide="true" src="/.ibmxspres/.extlib/example/js/script.js"/>
	</resources>
</theme>

Those paths look so weird because we'll be piggybacking on the ExtLib's resource-provider system.

Now, back to Java to create the classes that will provide these resources to the platform. Because it's less terrifying, we'll start with the StyleKitFactory, which provides the theme to the server. Right-click on src/main/java and create a new Java class:

  • Set its package to "com.example.xsp.theme"
  • Set its name to "ExampleStyleKitFactory"
  • Add two interfaces: com.ibm.xsp.stylekit.StyleKitFactory and com.ibm.xsp.stylekit.StyleKitListFactory. The former will provide the theme to the server, while the latter will tell Designer about the name it can use

Set the class contents to this:

package com.example.xsp.theme;

import java.io.InputStream;
import java.util.Arrays;
import java.util.List;

import com.ibm.xsp.stylekit.StyleKitFactory;
import com.ibm.xsp.stylekit.StyleKitListFactory;

public class ExampleStyleKitFactory implements StyleKitFactory, StyleKitListFactory {

	private static final String[] THEMES = {
		"example" //$NON-NLS-1$
	};
	private static final List<String> THEMES_LIST = Arrays.asList(THEMES);
	
	@Override
	public String[] getThemeIds() {
		return THEMES;
	}

	@Override
	public InputStream getThemeAsStream(String themeId, int scope) {
		if (scope == StyleKitFactory.STYLEKIT_GLOBAL) {
			if (THEMES_LIST.contains(themeId)) {
				return getThemeFromBundle(themeId + ".theme"); //$NON-NLS-1$
			}
		}
		return null;
	}

	@Override
	public InputStream getThemeFragmentAsStream(String themeId, int scope) {
		return null;
	}
	
	private InputStream getThemeFromBundle(final String fileName) {
		ClassLoader cl = getClass().getClassLoader();
		return cl.getResourceAsStream(fileName);
	}
}

There's quite a bit going on here! Some of it is due to the nature of the task and some of it comes from my own built-up habits that come in handy as the class grows. Towards the top of the class, THEMES contains an array of theme names known by the plugin - in this case, just the one, but it's good to have this standardized. THEMES_LIST contains a List wrapper around that array for programmatic convenience later.

The getThemeIds method is from StyleKitListFactory and is used by Designer to generate its list of available themes in the GUI. Implementing this interface isn't required, but it's a cross-the-Ts sort of thing.

The remaining methods implement StyleKitFactory, which provides the server with the theme data itself. getThemeAsStream is called by the runtime when it's searching for a theme requested by the app, so it contains a couple checks to make sure that the request is indeed intended for a plugin-based (global) theme with a name that this plugin knows about. It then uses a small utility method, getThemeFromBundle, to get the theme as an InputStream from the plugin's internal filesystem. Technically, this could be anything - it could construct the theme on the fly, fetch it from a URL, or get it from anywhere else, as long as it returns an InputStream, but this version is the most common.

getThemeFragmentAsStream is an interesting beast. The Extension Library uses it to hook in extra theme info for its Bootstrap themes on the fly. This is presumably useful for ad-hoc theme hierarchies and avoiding the theme-inheritance cap, but we don't have any need for it here.

Finally, the "$NON-NLS-1$" comment business is because I've developed a habit of enabling Eclipse's translated-strings warnings - the comments denote to the IDE that the hard-coded strings on those lines are not intended to be translated. You can ignore those if you so desire.

Moving on, now it's time to set up our resource provider, which will serve up the web resources. Before that, we'll take a minor detour back to the Activator class to fix up a method signature. Add "public" to the getContext method:

public static BundleContext getContext() {
	return context;
}

Now, we're going to implement the resource-loading class, modeled after the one from Bootstrap4XPages. Create a new class:

  • Set its package to "com.example.xsp.minifier"
  • Set its name to "ExampleLoader"
  • Set its superclass to com.ibm.xsp.extlib.minifier.ExtLibLoaderExtension

This class will have three methods of importance to us: getOSGiBundle (which is obligatory), loadCSSShortcuts, and getResourceURL:

package com.example.xsp.minifier;

import java.net.URL;

import javax.servlet.http.HttpServletRequest;

import org.osgi.framework.Bundle;

import com.example.xsp.Activator;
import com.ibm.commons.util.DoubleMap;
import com.ibm.xsp.extlib.minifier.ExtLibLoaderExtension;
import com.ibm.xsp.extlib.util.ExtLibUtil;

public class ExampleLoader extends ExtLibLoaderExtension {
	
	private static final String[] LIBRARY_RESOURCE_NAMESPACES = {
		"example" //$NON-NLS-1$
	};

	@Override
	public Bundle getOSGiBundle() {
		return Activator.getContext().getBundle();
	}

	@Override
	public void loadCSSShortcuts(DoubleMap<String, String> aliases, DoubleMap<String, String> prefixes) {
		if(prefixes != null) {
			for(int i = 0; i < LIBRARY_RESOURCE_NAMESPACES.length; i++) {
				String namespace = LIBRARY_RESOURCE_NAMESPACES[i];
				prefixes.put("9T0a" + i, "/.ibmxspres/.extlib/" + namespace); //$NON-NLS-1$ //$NON-NLS-2$
			}
		}
	}
	
	@Override
	public URL getResourceURL(HttpServletRequest request, String name) {
		for(String namespace : LIBRARY_RESOURCE_NAMESPACES) {
			if(name.startsWith(namespace)) {
				String path = "/web/" + name; //$NON-NLS-1$
				return ExtLibUtil.getResourceURL(getOSGiBundle(), path);
			}
		}
		return null;
	}
}

Once again, there's a lot going on, but you can see some general similarities to the theme provider. The getOSGiBundle method makes use of the Activator.getContext method we just modified, while the other two methods use the same sort of "internal array of Strings" pattern as before to make it easy to add more entries to the list down the line.

Things are a little strange in the loadCSSShortcuts method. The goal of this method is to provide the shorthand codes used in the minified URLs for resources, and so each plugin should make sure to provide unique ones. However, I don't know of any coordinated enforcer class that does this "making sure" for us, so you just have to make up a string of characters that's unlikely to collide with anyone else's. Here, I picked "9T0a" as the base, just because these prefixes usually look like that.

The getResourceURL method is a little simpler, and is the equivalent of the earlier getThemeAsStream - given the incoming request for a resource name, it checks to see if it falls within its bailiwick and, if so, uses a method to provide a URL to the server environment.

These methods, like the ones in the theme provider, can be generally used as a "just drop them in the plugin" sort of thing without modification. There are other potential tweaks you can make, but it's probably best to keep it simple.

There are two more steps to getting these classes working: the theme provider needs to be registered in the plugin.xml, and the resource loader needs to be added in the Activator. First, the theme provider - add another extension point entry to the plugin.xml:

Your plugin.xml source should look something like:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
   <extension
         point="com.ibm.commons.Extension">
      <service
            class="com.example.xsp.ExampleLibrary"
            type="com.ibm.xsp.Library">
      </service>
   </extension>
   <extension
         point="com.ibm.commons.Extension">
      <service
            class="com.example.xsp.theme.ExampleStyleKitFactory"
            type="com.ibm.xsp.stylekit.StyleKitFactory">
      </service>
   </extension>

</plugin>

For the loader, open the Activator class and add a new line to the start method to add the resource loader to the ExtLib's bank. The class should now look like:

package com.example.xsp;

import java.util.logging.Level;
import java.util.logging.Logger;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

import com.example.xsp.minifier.ExampleLoader;
import com.ibm.xsp.extlib.minifier.ExtLibLoaderExtension;

public class Activator implements BundleActivator {

	private static BundleContext context;
	
	public static final Logger log = Logger.getLogger(Activator.class.getPackage().getName());
	static {
		log.setLevel(Level.FINEST);
	}

	public static BundleContext getContext() {
		return context;
	}

	/*
	 * (non-Javadoc)
	 * @see org.osgi.framework.BundleActivator#start(org.osgi.framework.BundleContext)
	 */
	public void start(BundleContext bundleContext) throws Exception {
		Activator.context = bundleContext;
		
		if(log.isLoggable(Level.INFO)) {
			log.info("Starting Example XPages Library");
		}
		
		ExtLibLoaderExtension.getExtensions().add(new ExampleLoader());
	}

	/*
	 * (non-Javadoc)
	 * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext)
	 */
	public void stop(BundleContext bundleContext) throws Exception {
		Activator.context = null;
	}
}

(Despite appearances, the "start" method there is not actually commented out - its blue appearance is a bug in the JavaScript syntax highlighter used in this blog)

Now to see if all this work has paid off: build the update site and install it in Designer and Domino. When you relaunch Designer and open the "Xsp Properties" page of the NSF you're working with, you should see the "example" theme in the list of options (as long as you have a recent ExtLib release installed in Designer - the capability was added post-9.0.1):

Additionally, when you open the app on the web, you should see your CSS and JS files included. One final note about these resources: though the URLs in the theme start with "/.ibmxspres/.extlib/example", URL references from non-XSP files (say, referencing font files from within CSS) should start instead with "/xsp/.ibmxspres/.extlib/example". The XPages runtime adds that "/xsp" when generating URLs for pages, but it doesn't do any such processing for static references.
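For example, a hypothetical rule added to style.css to load a bundled font (assuming you had added a "fonts" folder alongside "css" and "js" - not part of the example as built so far) would use the "/xsp"-prefixed form:

@font-face {
	font-family: "ExampleFont";
	src: url("/xsp/.ibmxspres/.extlib/example/fonts/ExampleFont.woff");
}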

So... that was more complicated than it seemed at the start! Still, once it's in place, it's all pretty much "set it and forget it" - in the future, you can add files and themes more easily, just adjusting the static arrays in the appropriate classes as necessary.

Finally, commit the changes and take a relaxing breath.

That Java Thing, Part 9: Expanding the Plugin - Jars

Wed Nov 11 07:50:14 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

So it appears that I once again forgot to commit my changes. Well, consider this a cautionary tale, but we can still salvage the situation a bit by committing the previous changes before embarking on an unrelated modification - it's that mixing of different changes that can cause trouble in a source control repository.

For today's post, we'll add a third-party Jar to our plugin in order to use it internally and provide it to external applications (and we'll cover why those are distinct things).

The Jar we'll use is Apache Commons Lang, because it's a nice simple case without complicated dependencies. Embedding it inside a plugin is actually not the ideal deployment strategy for this, since it already contains OSGi metadata, but this setup is often the most expedient.

To start, download the latest binary release from their site and extract the Zip. Now, in Eclipse, right-click the "com.example.xsp.plugin" project, choose New → Folder, and name it "lib". Then, right-click that new folder and choose "Import...":

Then, choose "File System" from within "General":

On the next pane, browse to the extracted folder and, within Eclipse's file-picker list, choose the binary Jar:

Next, we have to make sure that this Jar file gets included in the final build and is available to the classes within the plugin. To do that, open the META-INF/MANIFEST.MF file and go to the "Runtime" tab. In the "Classpath" section, click "Add...", and find the newly-added file:

This does two things: it adds it to the plugin's internal classpath (specified in the MANIFEST.MF file) and, as a convenience in Eclipse, adds the file to build.properties (which controls what is included in the build). If you go to the "build.properties" tab, it should look something like this:

source.. = src/main/java/,\
           src/main/resources/
output.. = target/classes/
bin.includes = META-INF/,\
               .,\
               plugin.xml,\
               lib/commons-lang3-3.4.jar

You should also see the Jar within your project list - being at the top level with this "jar of bits" icon means that Eclipse knows that it is included in the project's classpath:

At this point, the Jar's classes are available for use inside the plugin itself, but not exposed to the outside world. So we could add a method like this to the ExampleBean class and then call it from an XPage:

/**
 * @return the Java home directory path according to Apache Commons Lang
 */
public String getJavaHome() {
	return org.apache.commons.lang3.SystemUtils.getJavaHome().getAbsolutePath();
}

As an aside, sometimes Designer doesn't quite obey the normal rules of OSGi visibility, but Domino does, so you can fall into a trap with packages that aren't marked as visible but are still accessible in Designer.

Sometimes we do want to make these classes visible to downstream users, such as when the point of including the Jar is to distribute a set of foundational libraries used across apps. To do this, go back to the MANIFEST.MF file's "Runtime" tab. Now, click the "Add..." in the "Exported Packages" section. You should see all of the Apache commons packages - select them and click OK:
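Behind the scenes, this just adds the selected packages to the Export-Package header in the MANIFEST.MF source. Abbreviated, it ends up looking something like the sketch below (the real list includes all of the commons-lang3 sub-packages):

Export-Package: com.example.xsp.beans,
 org.apache.commons.lang3,
 org.apache.commons.lang3.builder,
 org.apache.commons.lang3.time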

Now, when you build and install this plugin, the classes will be directly available in your XPages applications. This can be a good way to take a handful of Jars you may have sitting around in your jvm/lib/ext folder and distribute them in a better way. It's not the best way, since it misses some OSGi advantages like source and Javadoc for the embedded classes, but it can be much simpler than doing it "right".

Now, commit the changes for today:

In the next post, we'll add some web resources to the plugin and make them available in a theme and able to participate in resource aggregation.

That Java Thing, Part 8: Source Bundles

Tue Nov 10 07:46:28 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Before anything else today, Eric McCormick reminded me that yesterday's post missed the final step: committing the changes. So, let's take care of that right now. On my side, since my Windows VM is also being accessed from the Mac side, it's worthwhile to add a .gitignore file to the root of the repo to keep out all the .DS_Store nonsense. GitHub's Java gitignore is a good start, though I skip the part about ignoring Jar files. In your text editor of choice, create a new file named ".gitignore" in the root directory of the Git repository, with this content:

._*
Thumbs.db
.DS_Store

*.class

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
#*.jar
*.war
*.ear

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

Then commit the files:

Now, on to today's topic: adding a source bundle. This is a small topic, but will be very useful as you work down the line. As it stands right now, even though you have access to the source code of your plugin, Designer doesn't, and that means that you don't have nice parameter names, inline Javadoc, or viewable source when working with your classes in an actual XPages application. Since XSP libraries are almost always open-source or internal-use-only, including a source bundle is the fastest way to achieve all of these.

The way to do this is pretty non-obvious, but not complex once you know what to do. The work will be done with a new feature project, so go to File → New → Other and then "Feature Project" within "Plug-in Development". This should be set up very similarly to the original feature:

  • Set the project name to "com.example.xsp.source.feature"
  • Override the location with a folder inside the Git repository
  • Pick a variant on your original feature name, so "Example XSP Library Source Feature"

Unlike before, however, we don't want to select any plugins on the next pane - instead, just hit "Finish". This feature is going to use some Eclipse trickery to automatically generate a "source" version of the existing feature. Once the project is created, open its feature.xml file and go to the "feature.xml" tab. On there, add a bit of XML to manually specify the non-existent source plugin (I've left out the "description", "copyright", and "license" blocks, which are required even if just stubs):

<?xml version="1.0" encoding="UTF-8"?>
<feature
      id="com.example.xsp.source.feature"
      label="Example XSP Library Source Feature"
      version="1.0.0.qualifier">

   <!-- *snip* -->

	<includes
         id="com.example.xsp.plugin.source"
         version="0.0.0"/>
</feature>

Then, go to the "build.properties" tab and add a second line to tell Eclipse's builder to generate the source plugin:

bin.includes = feature.xml
generate.feature@com.example.xsp.plugin.source = com.example.xsp.feature

Next, open the site.xml file in the update site project and add the new feature to the "Managing the Site" section:

Now, save and "Build All". Then, go back to Designer and install the latest version from the update site, which should now have a second feature listed:

When you install that and relaunch Notes/Designer, you'll be able to access the source of your plugin. To see this in action, open your application in Designer and then go to the "Package Explorer" view (you may have to add it via Window → Show Eclipse Views → Package Explorer). Find your NSF's project and expand the "Plug-in Dependencies" category. Towards the bottom, you should see the example plugin Jar file - expand that and double-click on one of the classes:

If all went well, you should see the source to the original class, instead of Eclipse's incomprehensible bytecode dump. Additionally, now, when you access these classes from Java code inside your NSF, you'll get improved inline help, with correct parameter names and Javadoc (if you actually write the Javadoc).

The next step will be another small but useful topic: adding some third-party Jars to your plugin, either to use internally or to expose to your XPages applications.

That Java Thing, Part 7: Adding a Managed Bean to the Plugin

Mon Nov 09 06:37:27 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

For today's step, we'll deal more with the meat of the task: putting your own code in the plugin. There are a lot of options for what a plugin can provide, ranging from just storing classes to be accessed from elsewhere to being full-on OSGi web applications. Adding a managed bean certainly falls on the simpler side of this spectrum, but it's also one of the most common uses. These steps should also be very familiar if you've created a managed bean in an XPages NSF before.

Before we get started, it will be useful later to access Extension Library code from this plugin, so there's a quick detour. In order to allow the plugin to "see" ExtLib code, the appropriate ExtLib plugin needs to be added as a dependency. Go to the MANIFEST.MF file's "Dependencies" tab and add com.ibm.xsp.extlib to the list:

Now, create a new class with an appropriate name by right-clicking on the src/main/java folder and selecting New → Class as before. Set its package to "com.example.xsp.beans" and its name to "ExampleBean". You don't have to name your bean classes with "Bean", but sometimes it makes sense. Make sure to add the java.io.Serializable interface via the "Add..." button in that section.

When you create the class, Eclipse will put a squiggly underline beneath the class name, indicating that it has a warning for you. If you hover your cursor over it (or hit Ctrl-1 while the cursor is on the line), you'll get more information and an option to fix it:

What this message is talking about has to do with serialization - the process of storing the Java object outside of memory and retrieving it later - and is something that would come up more in applications that intentionally store data long-term. For a thorough explanation of this and many other things, I can't recommend Effective Java enough. For our purposes as XPages developers, though, it's fine to pick the "default" option - we want more of our classes to be Serializable than other Java apps do, but we care less about their long-term storage.

Next, add a basic "getter" method that returns a String we can see later, as well as a static method to retrieve the bean from the XPages environment from Java:

package com.example.xsp.beans;

import java.io.Serializable;

import javax.faces.context.FacesContext;

import com.ibm.xsp.extlib.util.ExtLibUtil;

public class ExampleBean implements Serializable {
	private static final long serialVersionUID = 1L;
	
	public static ExampleBean get() {
		return (ExampleBean)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "exampleBean");
	}

	public String getFoo() {
		return "hello from " + getClass().getSimpleName();
	}
}

Now, in order to make this a managed bean, we'll have to add a local faces-config file. These files have the same format as the one in the NSF. There's another nod towards Maven here: we'll put the XML file in a separate "resources" folder in the project, which is where Maven expects to find this sort of thing. Right-click the project and choose New → Source Folder:

Name it "src/main/resources":

Right-click on this newly-created source folder and choose New → Other, and within the dialog choose "File" from the "General" section:

On the next screen, name the file "beans.xml" and click Finish:

Most likely, Eclipse will open the file on the "Design" tab, which is pretty useless for this - it's better to switch to the "Source" tab. In this file, fill in the basic faces-config structure and an entry for the managed bean:

<?xml version="1.0" encoding="UTF-8"?>
<faces-config>
	<managed-bean>
		<managed-bean-name>exampleBean</managed-bean-name>
		<managed-bean-class>com.example.xsp.beans.ExampleBean</managed-bean-class>
		<managed-bean-scope>view</managed-bean-scope>
	</managed-bean>
</faces-config>

Now that we have this file, we have to tell the XPages runtime to actually use it as a faces-config file. To do that, go back to the ExampleLibrary class and implement the getFacesConfigFiles method, which lists the XML files from the plugin to contribute to the environment:

@Override
public String[] getFacesConfigFiles() {
	return new String[] {
			"beans.xml"
	};
}

The "@Override" bit above the method is indicating to the compiler that you are intentionally overriding a method declared in the superclass or interface - this is to help you ensure that you keep the method names correct for any code calling them.

Now, there are two minor remaining items to take care of before we deploy: exposing the class to Java code elsewhere and making sure the newly-created source folder is included in the build.

To do the former, open up the MANIFEST.MF file and go to the "Runtime" tab. Here, we can list the Java packages that should be available to code within the application (or other plugins). Since we'll want to have code that accesses the bean directly, add the "com.example.xsp.beans" package to the "Exported Packages" list:

This is important to do for any new package that is going to be accessed externally.

Next, go to the "build.properties" tab of the MANIFEST.MF editor (or open the build.properties file directly) and add the src/main/resources folder to the list of source folders:

source.. = src/main/java/,\
           src/main/resources/
output.. = target/classes
bin.includes = META-INF/,\
               .,\
               plugin.xml

The syntax of this file takes a little getting used to - the backslashes on many lines indicate that the next line should also be part of the same definition, but is split up for legibility.

With that, we're all set on this side. Head over to the site.xml file in the update site project and click "Build All" to build a new version of the plugin. Once that's done, go to the Update Site NSF for the server and import the new build, and then restart task http on the server console. Similarly, use the File → Application → Install routine in Designer to install the new version of the plugin there, and relaunch Notes.

Now that the bean is installed, you can test it out to make sure it's working by creating a new XPage in an app on the server with this content on the "Source" tab:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<xp:text value="#{exampleBean.foo}"/>
</xp:view>

When you launch the page in a browser, it should say "hello from ExampleBean". To try out the static get() method we created, change the XPage source to:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<xp:text><xp:this.value><![CDATA[#{javascript:
		"the bean says: " + com.example.xsp.beans.ExampleBean.get().getFoo()
	}]]></xp:this.value></xp:text>
</xp:view>

When you view that, you should see basically the same thing, prefixed with "the bean says: ". Whenever I make a change to test something that should have the same visual result, I like to toss in a small variation to make sure that my changes did, in fact, take effect.

For a lot of cases, this step may be all you need - many XPages libraries exist primarily as storehouses for Java classes and a few managed beans. The next couple steps are going to deal with some variations on this, plus some nice-to-haves.

That Java Thing, Part 6: Creating the Feature and Update Site

Sun Nov 08 10:45:08 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

The last post covered turning our nascent plugin into a proper XPages library, so now it's time to fill in the remaining pieces to get it installed.

To do that, we'll need a feature and an update site. The feature will point to our plugin - in a more-complicated setup, this could point to several related plugins, but we'll just need the one. The update site will do similarly, referencing the feature in a way that Eclipse-type platforms know about.

Go to File → New and choose "Feature Project" within "Plug-in Development":

Hit "Next" and fill out the feature details:

  • Set the project name to "com.example.xsp.feature"
  • Override the location with a folder inside the Git repository (remembering to include the feature name in the path, to avoid placing it in the top-level folder)
  • Pick a feature name, such as "Example XSP Library Feature"

Click "Next", find and check "com.example.xsp.plugin" in the list of plugins to include in the feature, and click "Finish":

That's about it for the feature. Next up: the update site. As before, go to File → New and choose "Update Site Project" within "Plug-in Development":

There's not much configuration to do for this one: just name it "com.example.xsp.update" and override its location to be within your Git repository:

Once the update site is created, click "Add Feature.." in the site.xml editor and add the new "com.example.xsp.feature" project:

One last optional step is to set up a category in the update site, which will help organization on the server. To do that, click "New Category" and give it a name and ID. Then, drag the feature in the list onto the category to add it to the group:

Now the update site definition is set up, and the next step is to click "Build All" to have Eclipse package your plugin inside the update site. That will create a number of files within your update site project, which are the actual artifacts we need:

Next is to install the library into Designer. Launch Notes, then Designer (the sequence is important to avoid a bug with NSF update sites, so I stick to this habit), and open Preferences. In there, go to the "Domino Designer" section and check "Enable Eclipse plug-in install":

Hit "OK" to close the preferences and then go to File → Application → Install. In the dialog that pops up (sometimes there's a lengthy delay before it does for some reason), select "Search for new features to install" and click "Next". On the next pane, choose "Add Folder Location..." and browse to the update site project:

The next dialog will prompt for a name, and for now the default will do. Click "OK" in this dialog and "Finish" in the main one. If all goes well, that should bring up another dialog after a couple of seconds, which will allow you to select the feature:

Click "Next" and then "Finish". As it installs, you'll get a prompt to install the unsigned plugin, so select "Install this plug-in" and click "OK". Designer will do its thing and then display a small toast window in the bottom right asking if you want to restart now. It at least used to be the case that clicking this didn't do a proper restart (presumably due to the Notes stuff), so I still avoid clicking it. Instead, close Notes and Designer, and then relaunch them (Notes first again).

Now, to see that everything is working, open an existing application in Designer (or make a new one), go to Application Configuration → Xsp Properties → Page Generation tab, and look for "com.example.xsp.library" in the "XPage Libraries" section:

If it's not there, something went wrong. Unfortunately, debugging this sort of thing can get hairy, so it'd probably be best to ask me or someone else knowledgeable about plugins for assistance for now.

Assuming it did work, though: great! Next up is the installation on the server. This is a whole tutorial on its own, and fortunately IBM has done the job for me. The upshot of those instructions is:

  • Create a database on the server using the "Eclipse Update Site" template (updatesite.ntf) and clean up the ACL to something appropriate. To see the template, make sure to click "Show advanced templates".
  • Import com.example.xsp.update by using the "Import Local Update Site" option and pointing to its site.xml file
  • Set the notes.ini property "OSGI_HTTP_DYNAMIC_BUNDLES" on the server to point to the file name of that update site (see the example line just after this list)
  • Restart HTTP or the entire server
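
As a concrete example of that notes.ini line: if the update site database was created at the root of the data directory as "updatesite.nsf" (a hypothetical name - use whatever file name you gave yours), the entry would look like this:

OSGI_HTTP_DYNAMIC_BUNDLES=updatesite.nsf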

Once HTTP or the server is restarted, you can test to see if it's working by creating an application on the server that has the library checked in the Xsp Properties page as above, and then visiting an XPage in it with a browser. If it's working, it will load normally; otherwise, it will complain about the application relying on a missing library.

So that covers the build and installation cycle! There's one last change to make in the projects before we commit them to Git. The update site project produces tons of Jar files - two for every "Build All" click - and there's no need to check them into the repository. This is a job for a .gitignore file, which we'll put in the update site project because we don't want to ignore intentional Jars later - .gitignore files cascade hierarchically. Right-click on the update site project in Eclipse and choose New → File:

Name the file ".gitignore" (with the starting dot) and click "Finish". The file contents should be a single line:

*.jar

Now, go to commit the changes to the repository, and you should be able to see the projects and Git ignore file we created, but not the Jars:

Now that we have the library built and deployed to the server, the next step will be to actually make it useful, by adding some Java code and accessing it from the XPages application.

That Java Thing, Part 5: Expanding the Plugin

Fri Nov 06 08:48:55 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

In the last post, we created an empty plugin project for our XPages library. Today, we'll flesh that plugin out a bit and add the actual XSP hooks.

The first step is to start filling in some details in the plugin's Manifest file, which is the core description of what the plugin does and what it requires in its environment. To begin with, open the MANIFEST.MF file in the META-INF folder of your project and check the "This plug-in is a singleton" checkbox:

During one of the steps later, Eclipse would have yelled at us to do that anyway. This checkbox means that the plugin expects to be the only one of its kind active on the server. This is because it will contribute to the XSP Library extension point, and it should prevent duplicate contributions of the same library from different versions.
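
For reference, checking that box just adds a directive to the Bundle-SymbolicName line in the MANIFEST.MF, which (assuming the plugin name from the previous post) ends up looking something like this:

Bundle-SymbolicName: com.example.xsp.plugin;singleton:=true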

Next, go to the Dependencies tab of the Manifest editor, click "Add..." in the "Required Plug-ins" section on the left, and add "com.ibm.xsp.core":

Do the same with "com.ibm.commons":

Now, save and close the Manifest. The next step is to fix up a minor problem I forgot about in the last post: the base package name. By default, the base package is named after the plugin project, which means it now contains a redundant "plugin" part. That's not a problem per se, but it's not needed, and this will be a good introduction to Eclipse refactoring.

Expand the "src/main/java" (or "src" if you left it as the default) package folder within the project, then right-click on "com.example.xsp.plugin" and choose "Refactor" → "Rename...". Change the name to "com.example.xsp", make sure "Update references" is checked, check "Update fully qualified names in non-Java text files", and click "Preview". This should list a couple changes - Eclipse looks for references to the class in both Java and non-Java files to try to cover all of the bases. In a larger project, there may still be lingering references elsewhere, but in this case it does everything for us. Click "OK".

There may be a small detour at this point. On my machine, I noticed a strange problem in Eclipse after making this change: it set the output folder of the project to a nonsense directory within the source. To make sure this isn't happening, double-click on build.properties and go to the "build.properties" tab. Make sure the "output.." line reads output.. = target/classes (or output.. = bin if you left the defaults). This is the file that controls many of the compilation settings for the plugin, and this particular property specifies the location for the compiled Java classes.
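
As a point of reference, a minimal build.properties for a plugin like this looks something like the sketch below (assuming the Maven-style folders chosen in the last post; the bin.includes list will grow as we add resources later):

source.. = src/main/java/
output.. = target/classes
bin.includes = META-INF/,\
               .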

Now, time to expand the Activator a bit. Expand "com.example.xsp" and open the "Activator.java" file. The Activator class is an optional but there-by-default class that provides a couple hooks during the lifecycle of the plugin. For XSP libraries, there's usually not too much going on here, but it can serve as a useful coordinating and debugging point.

One such potential use is a centralized logger configuration. At least for now, we'll just use the basic JVM logging classes, since "proper" OSGi logging is a whole thing. With the addition of a logging property and initialization message, the Activator looks like this:

package com.example.xsp;

import java.util.logging.Level;
import java.util.logging.Logger;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

	private static BundleContext context;
	
	public static final Logger log = Logger.getLogger(Activator.class.getPackage().getName());
	static {
		log.setLevel(Level.FINEST);
	}

	static BundleContext getContext() {
		return context;
	}

	/*
	 * (non-Javadoc)
	 * @see org.osgi.framework.BundleActivator#start(org.osgi.framework.BundleContext)
	 */
	public void start(BundleContext bundleContext) throws Exception {
		Activator.context = bundleContext;
		
		if(log.isLoggable(Level.INFO)) {
			log.info("Starting Example XPages Library");
		}
	}

	/*
	 * (non-Javadoc)
	 * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext)
	 */
	public void stop(BundleContext bundleContext) throws Exception {
		Activator.context = null;
	}

}

The static block that sets the log level is a good trick to know about: that block executes once, when the class is first loaded. It's distinct from a constructor, which runs each time a new object of the class is created. It may help to think of the static portions of a class (the properties, methods, and this block) as a sort of special singleton version of the class created by the runtime. This sort of thing is non-obvious to new Java developers, but it starts to make sense after you do it for a while.
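
To illustrate the difference with a throwaway example (this class isn't part of the plugin, just a demonstration), the static block runs once when the class is loaded, while the constructor runs for every new object:

public class StaticExample {
	static {
		// Executes once, when the JVM first loads the class
		System.out.println("Class loaded");
	}

	public StaticExample() {
		// Executes every time a new instance is created
		System.out.println("Instance created");
	}

	public static void main(String[] args) {
		new StaticExample(); // prints "Class loaded", then "Instance created"
		new StaticExample(); // prints only "Instance created"
	}
}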

With the Activator in place, the next step is the actual XPages library class, which provides the specific details about the plugin's XSP interactions. Right-click on the "com.example.xsp" package and choose "New" → "Class" (if "Class" doesn't appear in that list, choose "Other" and then find "Class" within the "Java" folder). In the "New Java Class" dialog, click "Browse..." in the "Superclass" line around the middle, and look for the class AbstractXspLibrary:

Set the class name to "ExampleLibrary" and leave everything else as the defaults:

When creating the class, it will fill in a single required method: getLibraryId(). This method's job is to return a string that should be unique across XPages libraries, and which will show up as the library's identifier in Designer. By convention, this is related to the Java reverse-DNS name you're using, plus a suffix like ".library":

@Override
public String getLibraryId() {
	return Activator.class.getPackage().getName() + ".library";
}

There are a number of other methods, though, that are worth overriding in a normal plugin, related to the plugin setup, the other XSP libraries it depends on, and its versioning. This is a reasonable baseline for a modern XPages library that will use the Extension Library:

package com.example.xsp;

import java.util.logging.Level;
import java.util.logging.Logger;

import com.ibm.xsp.library.AbstractXspLibrary;

public class ExampleLibrary extends AbstractXspLibrary {
	
	private static final Logger log = Activator.log;
	
	static {
		if(log.isLoggable(Level.FINE)) {
			log.fine(ExampleLibrary.class.getName() + " loaded");
		}
	}

	@Override
	public String getLibraryId() {
		return Activator.class.getPackage().getName() + ".library";
	}

	@Override
	public String getPluginId() {
		return Activator.getContext().getBundle().getSymbolicName();
	}
	
	@Override
	public String getTagVersion() {
		return "1.0.0";
	}
	
	@Override
	public String[] getDependencies() {
		return new String[] {
				"com.ibm.xsp.core.library",
				"com.ibm.xsp.extsn.library",
				"com.ibm.xsp.domino.library",
				"com.ibm.xsp.designer.library",
				"com.ibm.xsp.extlib.library"
		};
	}
	
	@Override
	public boolean isGlobalScope() {
		return false;
	}
}

With that class in place, there's one final step for making this plugin a proper XSP library: the extension point. Extension points are how the XPages runtime knows which plugins provide libraries like this. Open the MANIFEST.MF file again and click on "Extensions" on the right side of the first page:

Eclipse will ask you if you want to show the hidden Extensions panel, which you do. On that tab, click "Add..." and choose "com.ibm.commons.Extension":

Once added, there will be two fields on the right side. In "type", enter "com.ibm.xsp.Library". In "class", click "Browse" and search for the name of the library created earlier, "com.example.xsp.ExampleLibrary". Once this is added, you should be able to click on the "plugin.xml" tab in the editor and see something like this:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
   <extension
         point="com.ibm.commons.Extension">
      <service
            class="com.example.xsp.ExampleLibrary"
            type="com.ibm.xsp.Library">
      </service>
   </extension>

</plugin>

There will also be a file named "plugin.xml" in your project.

After all that, we have a functional basis for our plugin. There are definitely a lot of things to remember in this process, but they start to make a sort of sense the more you do it.

With that set, commit your changes, where you can see the moved Activator file and the addition of the Library stuff:

In the next post, we'll create the feature and update-site projects needed to actually install this plugin in Designer and Domino.

That Java Thing, Part 4: Creating the Plugin

Thu Nov 05 09:38:56 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

To make a basic XPages library, we'll need to create the trio of OSGi projects: the plugin, the feature, and the update site. For a long time, the XSP Starter Kit has been a great go-to starting point for this sort of thing. It definitely covers almost all of the potential ground, but it can be a bit overkill when you just want to put some classes in a shared place. So, for this exercise, we'll start from scratch.

But before we do that, we should create a local Git repository first. This isn't required, but it's a good idea, and Git repositories are so "cheap", technically, that there's no reason not to. There are a number of ways to do it - you can create and manage them via the command line, via a dedicated tool like SourceTree, or via the embedded Git client in Eclipse. We'll do the last one here.

To work with Git repositories, first add the Git Repositories view (Eclipse's "view" refers to the panes you see in the window) to your Eclipse UI by going to Window → Show View → Other...:

Once you add it, you can click on "Create a new local Git repository" in the pane - I suggest creating a folder beneath the "git" folder in your home directory, for organizational purposes:

Now, on to creating an actual project. To do that, go to File → New → Project... and find "Plug-in Project" inside "Plug-in Development":

In the form that shows up after you hit Next, fill in some project details:

  • Set the project name to "com.example.xsp.plugin"
  • Override the location with a folder inside the Git repository you created earlier (make sure to include the plugin name in the path, rather than using the top level of the Git repo)
  • Change the source folder to "src/main/java" and output folder to "target/classes". These are nods towards Maven that aren't strictly required, but are a "may as well" thing.
  • Set the target platform to "an OSGi framework" → "Equinox"

On the next page, the only change needed is to set the Execution Environment to "JavaSE-1.6" (at least until Domino gets a newer JVM):

Then, click "Finish". It will ask you if you want to switch to the "Plug-in Development" perspective - "perspectives" are an Eclipse term for groupings+layouts of views for different purposes. You can choose either Yes or No, since the other perspective is very similar to the default J2EE perspective. If you choose "Yes", you'll have to re-add the Git Repositories view as above.

Once you've created the project, it's time to check it in to Git. Right-click on the Git repository in the Git Repositories view and choose "Commit...". The first time you do this, it will prompt you for a name and email address, which are pretty arbitrary, but it's good to keep them consistent across your Git presences. On the commit page, provide a useful message and click the "Select All" button (the middle one in the top right of the "Files" section) to include all of the newly-created files, and hit "Commit":

Now that the plugin's skeleton is in place, the next post will cover some details about it and how to turn it into an XSP Library.

That Java Thing, Part 3: Eclipse Prep

Wed Nov 04 04:26:58 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Before you dive into OSGi development, you'll need a proper development environment, and that usually means Eclipse. Domino Designer, being Eclipse-based, can technically do the job and has the advantage of some of the setup being done, but it's so out-of-date at this point that it's not worth it (believe me, I tried originally). Newer Eclipse versions have improved noticeably, so it's best to grab a recent version. Currently, that means Mars, or Eclipse 4.5.1.

Before Eclipse, you'll need to install Java if you haven't already. To do that, go to Oracle's site and download the latest release for your platform. Once you install it and find out how many devices run Java, you can move on to installing Eclipse.

Head to the Eclipse download page and download the "Eclipse IDE for Java EE Developers" for your platform. They've recently added an Installer at the top of the page, and that works as well, but adds some new wrinkles that haven't made it into common knowledge yet. For now, you may want to stick with the "...or download an Eclipse Package" section. The "bitness" (32 or 64) doesn't matter too much - you should generally match the bitness of your OS.

Once you have Eclipse installed and running ("installation" generally involves dragging the extracted ZIP file content somewhere), the next step is to teach Eclipse about the XPages artifacts. If you're running on Windows and have a local Domino server, the XPages SDK may be your friend, though it hasn't been fully tested for Mars. If you are running on a non-Windows OS or otherwise don't want to go the full SDK route, there's a quick route that lacks the automation and debugger integration.

Setting up the environment used to be something of a hassle, but it has been much simpler ever since IBM released the Update Site for Build Management. Primarily made for Maven-development use (sort of), this Update Site also serves as a quick stop to pick up all of the dependencies you need for XPages development in one place.

To make use of the Update Site, download the file from OpenNTF, extract it, and then extract the UpdateSite.zip file contained therein. You should end up with a folder like this:

In Eclipse, go to the preferences ("Window" → "Preferences" on Windows or "Eclipse" → "Preferences" on a Mac), then go to "Plug-in Development" → "Target Platform":

Edit the existing platform and, in the "Locations" tab of the ensuing dialog, click "Add..." → "Directory" → "Next", browse to that extracted UpdateSite folder, and click "Next" to verify that it finds a bunch of plugins, and then "Finish". Now, you'll be able to reference XPages platform plugins in your projects.

At this point, Eclipse is set up for the essentials, but it may still be a good idea to set up an appropriate JVM for Domino development to make sure you have a consistent baseline and to avoid problems where newer Java releases added classes and methods not available in Java 6. Fortunately, on Windows (and Linux?), a Notes installation comes with a JVM that does the job nicely. To add it, go back to preferences, then "Java" → "Installed JREs" → "Add...". Choose "Standard VM" → "Next", browse to the "jvm" folder in your Notes (or Domino) installation directory, name it something appropriate, and then click Finish:

On the Mac, you can get a similar Java release from Apple, which nowadays will end up in /Library/Java/JavaVirtualMachines (older installers put it in /System/Library/Java/JavaVirtualMachines).

Now that Eclipse is set up, the next step will be to break ground on making an actual XPages plugin.

That Java Thing, Part 2: Intro to OSGi

Tue Nov 03 06:44:39 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

OSGi once stood for "Open Services Gateway initiative", but that name slid from "impossibly vague" to "entirely obsolete" rather quickly. For our needs, OSGi is a mechanism for bringing sanity to the "big pile of Jars" that you might otherwise have in a large Java system. It provides a standardized way to describe the name of a library, its version, its dependencies, its capabilities, and its interactions with other libraries. In this way, rather than just "commons-io-2.1.jar", you can conceptually deal with "I require org.apache.commons.io, version 2.0 or above" and the environment can suss out what that means based on what is installed.
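
In manifest terms, that conceptual requirement ends up as a line like the following in the bundle's MANIFEST.MF (the package name here is just an illustration; a bare version in OSGi means "this version or higher"):

Import-Package: org.apache.commons.io;version="2.0.0"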

As Domino developers, we care about a subset of the OSGi concepts, primarily those related to creating and loading plugins. There is a handful of vital terms involved, some of which I've already been tossing around:

  • JAR: a Java Archive file is a ZIP file aimed at Java, generally containing classes and other resources (text files, images, and so forth). There are variants like "War" (Web Application Archive) and "Ear" (Enterprise Archive), but they're all still ZIP files.
  • Bundle: A "bundle" is, in effect, OSGi's word for a JAR. It expands on a contained file called META-INF/MANIFEST.MF (so named because Java likes to be difficult sometimes) to define the description of the bundle: its name, version, and so forth.
  • Plug-in: This is basically the same thing as a bundle. Presumably, it technically refers to a specialized kind of bundle, but we can use the terms interchangeably. It also technically has that hyphen in it, but I often write it "plugin" anyway.
  • Feature: This excessively-vague term is OSGi's way of grouping plugins together into a single conceptual installable unit. For example, the main feature included in the Extension Library references ten distinct plugins.
  • Update Site: Update sites are to features what features are to plugins: a way to organize these features into a collection, either to install them all at once or to provide a pool of available software to install. Eclipse-the-IDE uses them extensively for the latter case. As Domino developers, we primarily use them to distribute libraries and install them into the server's Update Site NSF.
  • Target Platform: A target platform refers to the set of plugins available in a given environment, which is used to determine whether a given plugin has the dependencies it will need to run. In our case, the target platform is usually the XPages runtime environment, containing OSGi base plugins, Java servlet support plugins, and the XPages runtime plugins themselves (among others). Plugin development environments use these heavily.
  • Extension Point: An extension point is a way for a plugin to declare that it can either provide or consume a given service. For example, you might have a plugin that says "I can make use of any plugin that provides a color" and then another that says "I can provide the color blue". By using extension points, the first plugin can find the other, even though it may have been developed entirely independently.
  • Java EE: Java Enterprise Edition (sometimes written J2EE thanks to Java's weird relationship with versions) is basically the standard "web server stuff" for Domino (though it covers other aspects). XPages is sort of like a Java EE environment, but the experience is distinct. Notably, a Java EE server is a general platform on which other web-dev toolkits - JSP, JSF, Vaadin, JAX-RS, and so forth - can run. Java EE isn't inherently related to OSGi, but the two eventually bleed into each other with Domino and WebSphere.

OSGi in general goes beyond that, and some other aspects may be useful down the line (such as interacting with the OSGi console), but for now it's enough to think of it as a set of decorations for your projects. With some of the basic terms in line, next I will move on to setting up an Eclipse environment to start development.

That Java Thing, Part 1: The Java Problem in the Community

Mon Nov 02 13:20:23 EST 2015

Tags: java xpages
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Java has been a bugbear for the Domino community for a long time. Though Notes and Domino have presented a top-to-bottom (mostly) Java environment for years, the monumental inertia of the corporate development community, the platform's tortured history of insufficiently-fleshed-out Java hooks, and IBM's "pay no attention to that man behind the curtain" pitch with regard to the language created an environment where Java is often something for "other developers".

XPages represented a tremendous leap forward for development on the platform, ditching the completely-unique bizarre forms-based web-dev experience in favor of something much closer to modern web development. At the time, the unstructured, SSJS-heavy façade presented by Designer made a sort of sense: the switch to XPages was already a huge change for Domino developers, and making XPage development look sort of like classic Notes/Domino development was a spoonful of sugar with the medicine.

But the critical flaw of XPages development has never been fixed: there's no smooth path from "hello, world" to a well-structured complex application. SSJS, as implemented in XPages, is unsuitable to the task, while Designer's presentation of the Java layer ranges from "non-obvious" to "hostile". Still, IBM did a good job of presenting the next major tier for XPages developers: writing Java first in the NSF and then in OSGi plugins. The problem has always been in figuring out how to get there.

This hurdle started out as an inconvenience just for those who wanted to figure out how the machine worked, but has grown into a significant problem for all Domino developers. While the XPages stack has remained relatively stagnant, the rest of the development world has raced forward, with new technologies giving rise to new frameworks and transforming the old. This is no discredit to the XPages team: they have consistently put in tremendous work across the board, but it's quite an industry to keep up with.

So we're in a tough spot now. The XPages platform isn't going away any time soon, but it's important for us as developers to progress. One option is to abandon ship entirely: pack up and move to some unrelated platform, leaving Domino behind entirely. But most of us, out of personal affection, professional acumen, or (primarily) business requirements, don't want to do that. Unfortunately, the path to improvement with Domino has only gotten more complicated with time. It began with learning about Java, OSGi, and Eclipse proper, but has since expanded to include a whole rogues gallery of other technologies: Maven, Tycho, m2e, Jenkins, Git, Wink/JAX-RS, jQuery, Bootstrap, Angular, and on and on.

There's no time like the present, though. I've walked this path, and I want to help everyone else walk it too. To kick that off, this series is going to cover the process of creating a basic XPages plugin, making it a little more complex, converting it to Maven, and building it with Jenkins.

To kick things off, the next post will provide an introduction to the concept of OSGi and the parts of it that we need to know for Domino development. Things may seem weird along the way, but trust me: it's worthwhile.

XPages Devs: Enable "Refresh entire application when design changes"

Mon Sep 14 11:48:46 EDT 2015

Tags: xpages java

When developing an XPages application of beyond-minimal complexity, you're likely to run into a problem where your app starts saying that a class is incompatible with itself in one way or another. The exception usually traces down to something like "foo.SomeClass is incompatible with foo.SomeClass" or "cannot assign instance of foo.SomeClass to field X..." where the field is that same class. This has cropped up since time immemorial.

It's actually, though, something that IBM sort-of fixed in 8.5.3 by adding an xsp.properties option of xsp.application.forcefullrefresh=true and then, in 9.0, a GUI option in Xsp Properties:

Basically, this checkbox amounts to "don't break my app periodically". From what I gather, the default behavior is in the interest of being clever with classloaders, but can lead to creeping problems in complicated apps, to the point where changing seemingly-innocuous things like the ACL breaks the app until you restart HTTP or "kick" the app by modifying certain design elements (namely, Java classes or faces-config.xml). Since that behavior is never desired, there is no reason to not check this box, and I enable it on every new NSF I create.
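
If you'd rather set it directly in the NSF's xsp.properties file instead of (or in addition to) using the checkbox, the line is just:

xsp.application.forcefullrefresh=true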

Dealing with OSGi Fragments in Tycho and Designer

Fri Sep 11 19:30:20 EDT 2015

Tags: java tycho osgi

This post is partly to spread information publicly and partly a useful note to my future self for the next time I run into this trouble.

In OSGi, the primary type of entity you're dealing with is a "Bundle" or "Plug-in" (the two terms are effectively the same for our needs). However, there's a sort of specialized type that you may run into called a "Fragment". They're similar to a plug-in in that they're a contained unit of Java code and resources, but they have the special property that they're attached to another plug-in and automatically come along for the ride when the main plug-in is used. This is useful in a couple situations, such as code organization, serving platform-specialized native libraries, after-the-fact additions, or providing library dependencies.

In the basic case, the only requirement is to specify in the fragment what the "parent" plug-in is (Eclipse provides a field for this in its editor) and then to include the fragment in the installable feature alongside the plug-in. However, there are a few situations where a bit more work is required if you want to access the classes in the fragment: when used as part of a Tycho build and when used as an XSP Library in Designer (which may also apply to Eclipse dependency use generally).

Tycho

When doing a full Tycho build, even if both the plug-in and its fragment(s) are part of the current build, another project won't automatically include the fragment when doing the compilation. This can lead to a situation where the projects will compile cleanly in Eclipse (which handles the fragment attachment) but fail in Tycho. The trick, though small, is non-obvious: you have to tell the project that is using the fragment code about the fragment in its build.properties.

So say you have three projects: the main plug-in (some.main.plugin), a fragment attached to it (some.main.plugin.fragment), and the project consuming them (some.dependent.plugin). The normal first step is to include the main plug-in in the dependent plug-in's MANIFEST.MF as usual:

Require-Bundle: some.main.plugin

In Eclipse, this will suffice: both the main plug-in and its fragment will show up in the "Plug-in Dependencies" library. For Tycho, though, you have to tip it off using a line like this in build.properties:

extra.. = platform:/fragment/some.main.plugin.fragment

Think of this as saying "hey, dummy, don't forget about the fragment". Once you have that line, the Tycho-enabled Maven build should be able to resolve the fragment's classes and all will be well.

Designer

When using the plug-in and its fragment in an XSP Library in Designer, there's a similar-seeming problem: though Designer will include any direct dependencies of your Library plug-in in the class path, it won't pick up on any fragments by default (though it seems that Domino does). The trick here is that the primary plug-in has to tell Designer that it accepts fragments, which is done by setting Eclipse-ExtensibleAPI in the MANIFEST.MF file for some.main.plugin, like so:

Eclipse-ExtensibleAPI: true

Once that's in place, the fragment should start showing up in your NSF's classpath when the library is enabled.

Adding Components to an XPage Programmatically

Sun Jul 19 09:16:35 EDT 2015

Tags: xpages java

One of my favorite aspects of working with apps using my framework is the component binding capability. This lets me just write the main structure of the page and let the controller do the grunt work of creating fields with validators and converters. There's a lot of magic behind the scenes to make it happen, but the core concept of dynamic component creation is relatively straightforward.

An XPage is a tree of components, and those components are all Java objects on the back end, which can be manipulated and added or removed programmatically. To demonstrate, I'll start with this basic XPage:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core" beforePageLoad="#{controller.beforePageLoad}" afterPageLoad="#{controller.afterPageLoad}">
	<xp:div id="container">
	</xp:div>
</xp:view>

Now, I'll add a basic form table using the afterPageLoad method in the controller class:

package controller;

import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;

import com.ibm.xsp.component.UIViewRootEx2;
import com.ibm.xsp.component.xp.XspInputText;
import com.ibm.xsp.extlib.component.data.UIFormLayoutRow;
import com.ibm.xsp.extlib.component.data.UIFormTable;
import com.ibm.xsp.extlib.util.ExtLibUtil;
import com.ibm.xsp.util.FacesUtil;

import frostillicus.xsp.controller.BasicXPageController;

public class home extends BasicXPageController {
	private static final long serialVersionUID = 1L;

	@SuppressWarnings("unchecked")
	@Override
	public void afterPageLoad() throws Exception {
		super.afterPageLoad();

		UIViewRootEx2 view = (UIViewRootEx2)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "view");
		UIComponent container = FacesUtil.findChildComponent(view, "container");

		UIFormTable formTable = new UIFormTable();
		formTable.setFormTitle("Some Form");
		formTable.setStyle("margin: 2em; width: 20em");
		container.getChildren().add(formTable);
		formTable.setParent(container);

		UIFormLayoutRow formRow = new UIFormLayoutRow();
		formRow.setLabel("Name");
		formTable.getChildren().add(formRow);
		formRow.setParent(formTable);

		XspInputText inputText = new XspInputText();
		formRow.getChildren().add(inputText);
		inputText.setParent(formRow);
	}
}

There are a few concepts to get a handle on here, but fortunately they're not as esoteric as other aspects of back-end XPages development.

To start out with, there's the question of how you're supposed to know what classes the components are. The best way to find this out is to create a basic XPage containing the control with an ID and then go to the Package Explorer view, then the "Local" source folder, and find the class file for your page in the "xsp" package. In there, you can see the code that actually generates the XPage (which is doing basically the same thing as we're doing here). Look for the ID you gave the control on the page and you can find the class behind it.

Next is the job of finding the parent control you're going to attach the new components to. In this case, I used the FacesUtil class to search for the component by its ID, because I know it's the only one on the page with that base ID. This tack will usually do the trick for you, but there are other ways to find it, such as binding or XspQuery.

Finally, there's the need to both add the new child to the parent's children list and to set the parent in the child itself. This is a bit of housekeeping boilerplate, but it has to be done.

Once you have this code running, you get a basic form (using the Bootstrap3.2.0 theme in this case):

Some Form

This can get much more in-depth: anything you can do declaratively in the XPage XML, you can do programmatically in Java. You can also manipulate existing components: change their properties, rearrange them, or remove them from the tree entirely. I've found knowledge of this to be both useful as a part of the toolbox as well as a way to really clear up the mental model of what's going on on the page.

Parsing JSON in XPages Applications

Thu May 21 12:47:53 EDT 2015

Tags: json java

David Leedy pointed out to me that a post I made last year about generating JSON in XPages left out a crucial bit of followup: reading that JSON back in. This topic is a bit simpler to start with, since there's really just one entrypoint: com.ibm.commons.util.io.json.JsonParser.fromJson(...).

There are a few variants of this method to provide either a callback to call during parsing or a List to fill with the result, but most of the time you're going to use the variants that take a String or Reader of JSON and convert it into a set of Java objects. As with generating JSON, the first parameter here is a JsonJavaFactory, and which static instance property you choose matters a bit. Contrary to the first-Google-result documentation, there are three types, and they differ slightly in the types of objects they output:

  • instance: This uses java.util.HashMap for JSON objects/maps and java.util.ArrayList for JSON arrays.
  • instanceEx: This is like instance, but uses JsonJavaObject for JSON objects/maps.
  • instanceEx2: Like instanceEx, this uses JsonJavaObject for objects/maps but also uses JsonArray for JSON arrays.

Since JsonJavaObject and JsonArray implement the normal Map<String, Object> and List<Object> interfaces you'd expect (and, indeed, subclass HashMap and ArrayList), you can treat them interchangeably if you're just using the interfaces like you should, but it may matter if you're doing something where you expect certain traits of one of the concrete classes or want to use the explicit getString, etc. methods on the JSON-specific ones.*

Anyway, with that out of the way, the actual code to use these is pretty straightforward:

String json = "{ \"foo\": 1, \"bar\": 2}";
Map<String, Object> result = (Map<String, Object>)JsonParser.fromJson(JsonJavaFactory.instance, json);

In this case, I'm immediately casting the result from Object to Map because I'm sure of the contents of the JSON. If you're less confident, you should surround it with instanceof tests before doing something like that. In any event, that's pretty much all there is to it. As with generating JSON, SSJS wraps this functionality in a fromJson method (which may or may not produce the same objects; I haven't checked).
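
As a sketch of that more-defensive approach (keeping in mind that fromJson also throws a checked JsonException for malformed input, which needs handling or declaring somewhere):

Object parsed = JsonParser.fromJson(JsonJavaFactory.instance, json);
if(parsed instanceof Map) {
	// The JSON was an object at the top level
	Map<String, Object> result = (Map<String, Object>)parsed;
	// ... work with the map
} else if(parsed instanceof List) {
	// The JSON was an array at the top level
	List<Object> result = (List<Object>)parsed;
	// ... work with the list
}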


* You could also subclass the standard classes, as that code does, if you have specific needs or desires, like using a LinkedHashMap instead of HashMap to preserve the order of the object's keys.

Quick PSA: LS2J Problems in 9.0.1 FP3

Tue May 19 14:25:11 EDT 2015

Tags: java

While I'm at it, I realized that it may be useful to further spread information about the problem that led to me fiddling with the latest IFs and JVM patches in the first place: the LS2J problems in 9.0.1 FP3. Specifically, it's borked. The main way most people have encountered this is via an exception dialog when attempting to add new plugins to an Update Site NSF, since that uses LS2J to accomplish its task - it displays several layers of stack trace, and the upshot is that there's a java.lang.InternalError about calling a constructor.

The fix is to install the JVM patch from here; Interim Fix 3, while also worth an install, doesn't cover this.

There's another caveat about that, though, for those who have both Notes and Domino installed on the same machine. Since the installer for the JVM patch is the same for both, it will pick one of the two and won't give you the choice of the other. In the case of the 64-bit patch (I don't know why it picks the client in that case), Ulrich Krause posted workaround steps. In my case, both were 32-bit, it found Domino first and not Notes, and the same commands didn't work. My ugly workaround was to fire up regedit, browse to (if I recall correctly) HKEY_LOCAL_MACHINE\SOFTWARE\IBM, rename the "Domino" folder/key to something else, run the installer, and then rename the folder/key back.

How I Use JAX-RS in the frostillic.us Framework

Fri May 01 17:59:14 EDT 2015

Tags: java rest

Inspired by Toby Samples's new blog series on JAX-RS in Domino, I'd like to share a description of how I made use of it to write the REST services in the frostillic.us Framework. This is not intended to be a from-scratch introduction - Toby is handling that well so far - but instead assumes a certain amount of knowledge with OSGi development and why you would want to do this in the first place.

The goal of my REST services is to provide an automatic REST/JSON API for any Framework model objects used in a database without having to include any servlet code in the database itself. It's a business-logic-friendly analogue to the Domino Access Services and "borrows" heavily from that code base. It doesn't use the DAS extension point, though, in large part because I didn't know that existed until recently. As far as I can tell, using that extension point saves you some bootstrapping work and makes it possible to enable/disable the service in the server config, but otherwise the work will likely be fairly similar.

Initial Setup

To get started, this will all have to take place in a plugin, unless it turns out there's a way to do it in-NSF. In this case, this made sense anyway, since I wanted the servlets to be available for everything. The first step was to make a stub class to act as the base of the servlet, even though it doesn't really do anything:

package frostillicus.xsp.model.servlet;

import javax.servlet.ServletException;
import com.ibm.domino.services.AbstractRestServlet;

public class ModelServlet extends AbstractRestServlet {
	private static final long serialVersionUID = 1L;

	public static ModelServlet instance;

	public ModelServlet() {
		instance = this;
	}

	@Override
	protected void doInit() throws ServletException {
		super.doInit();
	}
}

Once that class existed, I registered it in the plugin.xml as a servlet extension:

<extension id="frostillicus.xsp.model.Servlet" name="fmodelservlet" point="org.eclipse.equinox.http.registry.servlets">
	<servlet alias="/fmodel" class="frostillicus.xsp.model.servlet.ModelServlet">
		<init-param name="applicationConfigLocation" value="/WEB-INF/fmodelapplication"/>
		<init-param name="propertiesLocation" value="/WEB-INF/fmodelservlet.properties"/>
		<init-param name="DisableHttpMethodCheck" value="true"/>
	</servlet>
</extension>

In addition to that servlet class, it also references two text files to provide configuration for Wink, the JAX-RS implementation packaged with the Extension Library. The first is a list of resource classes to use in the servlet:

frostillicus.xsp.model.servlet.resources.ApiRootResource
frostillicus.xsp.model.servlet.resources.ManagersResource
frostillicus.xsp.model.servlet.resources.ManagerResource
frostillicus.xsp.model.servlet.resources.ModelResource

The second is a properties file with configuration options (I don't remember why this option is important):

wink.defaultUrisRelative=false

Implementing a Resource

The classes listed above are what receive a REST request (funneled through Wink/JAX-RS) and provide a response. As an example, here's the ManagerResource class:

package frostillicus.xsp.model.servlet.resources;

import java.io.IOException;
import java.net.URI;
import java.util.Map;
import java.util.HashMap;

import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.Response.ResponseBuilder;
import javax.ws.rs.core.UriInfo;

import org.openntf.domino.Database;

import com.ibm.commons.util.io.json.JsonException;
import com.ibm.commons.util.io.json.JsonGenerator;
import com.ibm.commons.util.io.json.JsonJavaFactory;
import com.ibm.domino.commons.util.UriHelper;
import com.ibm.domino.das.utils.ErrorHelper;

import frostillicus.xsp.model.ModelManager;
import frostillicus.xsp.model.ModelObject;
import frostillicus.xsp.model.ModelUtils;
import frostillicus.xsp.util.FrameworkUtils;

@SuppressWarnings("unused")
@Path("{managerName}")
public class ManagerResource {

	@GET
	@Produces(MediaType.APPLICATION_JSON)
	public Response getManager(@Context final UriInfo uriInfo, @PathParam("managerName") final String managerName) {
		try {
			Map<String, Object> result = new HashMap<String, Object>();
			Database database = FrameworkUtils.getDatabase();
			if(database == null) {
				result.put("status", "error");
				result.put("message", "Must be run in the context of a database.");
			} else {
				Class<? extends ModelManager<?>> managerClass = ModelUtils.findModelManager(database, managerName);
				if(managerClass == null) {
					result.put("status", "failure");
					result.put("message", "No manager found for name '" + managerName + "'");
				} else {
					result.put("status", "success");
					result.put("managerClass", managerClass.getName());
				}
			}

			return ResourceUtils.createJSONResponse(result, false);
		} catch (Throwable e) {
			return ResourceUtils.createErrorResponse(e);
		}
	}

	@POST
	@Consumes(MediaType.APPLICATION_JSON)
	public Response createModel(final String requestEntity, @Context final UriInfo uriInfo, @PathParam("managerName") final String managerName) {

		Database database = FrameworkUtils.getDatabase();
		Class<? extends ModelManager<?>> managerClass = ModelUtils.findModelManager(database, managerName);
		if(managerClass == null) {
			return ErrorHelper.createErrorResponse("Manager '" + managerName + "' not found.", Response.Status.NOT_FOUND);
		}

		URI location;
		try {
			ModelManager<? extends ModelObject> manager = managerClass.newInstance();
			ModelObject model = manager.create();
			ResourceUtils.updateModelObject(requestEntity, model, false);

			location = UriHelper.appendPathSegment(uriInfo.getAbsolutePath(), model.getId());
		} catch(Throwable t) {
			return ResourceUtils.createErrorResponse(t);
		}

		ResponseBuilder builder = Response.created(location);
		Response response = builder.build();
		return response;
	}
}

There are quite a few concepts at work here, as well as tons of logic wrapped up in the referenced ResourceUtils and ModelUtils classes. The term "manager" in this class has no special meaning for JAX-RS or servlets - it's the term the Framework uses for the objects that provide access to models, like the "Posts" manager that maps requests for "all" to a back-end view named "Posts\All" and returning "Post" objects.

This is where things get hairy and diverge from basic servlet creation and head into Domino/XPages-specific eccentricities.

A Secret Double Life

The job of the model servlet is a bit strange, in that it needs to not only read document data from an NSF but also find and load Java classes. Framework model classes are defined in the NSF, not in plugins, and so the servlet has no real knowledge of what's inside the NSF, other than that it should look for classes that implement the appropriate interfaces.

When code is executing in an OSGi servlet context like this, it sits in a strange grey area. It's a better spot than agents - which have no knowledge of OSGi plugins or the XSP runtime - but it's not quite an XPages context, either, and there's no FacesContext available. Instead, a class called ContextInfo provides access to the session (running as the currently-authenticated Domino user) and the current database, if applicable. That "if applicable" comes in because an OSGi servlet can be accessed either as "http://foo.com/servletname" or as "http://foo.com/bar.nsf/servletname". I modified my utility class to paper over the difference between these two environments. The manager resource calls ModelUtils.findModelManager with this database context to try to find the requested manager. For example, if the request comes in as "/fmodel/Posts", it will search the database for a class or managed bean named "Posts".
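
As a rough sketch of what that utility class is papering over (assuming the usual static accessors on com.ibm.domino.osgi.core.context.ContextInfo; depending on the release, these may declare NotesException, so handle that where appropriate), servlet code can grab its context like so:

import com.ibm.domino.osgi.core.context.ContextInfo;
import lotus.domino.Database;
import lotus.domino.Session;

// ... inside the servlet or resource code:

// Session running as the currently-authenticated Domino user
Session session = ContextInfo.getUserSession();
// The contextual database, or null when the servlet is accessed without an NSF in the URL
Database database = ContextInfo.getUserDatabase();
if(database != null) {
	// NSF-contextual work, such as searching the NSF for model classes
}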

This is where the ability to treat an NSF as a "bag of classes" comes in handy. It may be possible to do this another way, but I'm using the DatabaseClassLoader provided by ODA to perform searches on the Java classes contained in an NSF. For this purpose, the actual names and structure of the classes are irrelevant, only that they implement ModelManager. If such a class is found, then the servlet knows enough about it, thanks to the interface, to fetch collections and individual model objects as necessary.

Bits and Bobs

In addition to the trickery required to pull manager and model classes out of an NSF, there are also a number of other techniques and components, mostly lifted from the ExtLib, used to make this work.

The code makes heavy use of JsonWriter to produce JSON in a lightweight manner, rather than building up and then spitting out large blobs of JSON, which is particularly important with large amounts of data.

The AbstractDominoModel class needed a lot of reworking to exist in the not-quite-XPages environment. In an XSP context, it makes use of an inner DominoDocument object in order to be able to deal with file attachments more easily, but that doesn't work without the full context. Accordingly, it uses a holder class to paper over the difference between DominoDocument and "manually" accessing the ODA Document. Of particular note is the handling of rich text for export into the REST display. It uses the HTML converter class included in DominoUtils, which I assume uses the HTMLConvertItem C function underneath the hood. It then uses the converter object to output an HTML version of the rich-text item as well as URLs for any attachments.

There is a certain amount of number fiddling and off-by-one-error-prone work done to determine the first and last entries in a model collection to show. As with DAS, it supports both the "Range" header and "start" and "count" GET parameters.

I nabbed wholesale IBM's shim implementation of the PATCH method, which isn't included in the version of Wink shipped with the ExtLib (at least not when I wrote this). Ideally, PATCH would mean that the JSON provided to the server would only be used to update fields in-place (leaving any fields in the model not included in the JSON untouched), while PUT would replace the model object entirely (removing any model fields not present in the JSON). In reality, they both act as PATCH.

As with XPages access to model objects, the REST APIs work with the JPA annotations used in model objects for validation. The model objects check their context - in an XPages context, failed validations result in FacesMessages, while otherwise it throws a ConstraintViolationException. When this exception occurs in the servlet, the error-response method picks up on that and generates specialized JSON to provide an explanation of the failed constraints to the user.

So Yeah

If you haven't tried out JAX-RS servlets yet, don't let this list of caveats and complicated code daunt you. This specific case of working with model objects in a very generic way naturally leads to complicated code, and the annotation-based coding system of JAX-RS/Wink reduces the amount of code dramatically. None of my code has to deal with fetching HTTP requests or parsing query parameters into useful objects - the API does that for me. There's no doubt a good deal more I could have it do for me as well. This is a pretty clean way to write servlets and is absolutely the best way to write them when the code makes sense to exist in a plugin.

Quick Tip: Wrapping a lotus.domino.Document in a DominoDocument

Tue Mar 03 15:13:13 EST 2015

Tags: xpages java

In the XPages environment, there are two kinds of documents: the standard Document class you get from normal Domino API operations and DominoDocument (styled NotesXspDocument in SSJS). Many times, these can be treated roughly equivalently: DominoDocument has some (but not all) of the same methods that you're used to from Document, such as getItemValueString and replaceItemValue, and you can always get the inner Document object with getDocument(boolean applyChanges).

What's less common, though, is going the other direction: taking a Document object from some source (say, database.getDocumentByUNID) and wrapping it in a DominoDocument. This can be very useful, though, as the API on the latter is generally nicer, containing improved support for dealing with attachments, rich text and MIME, and even little-used stuff like getting JSON from an item directly. Take a gander at the API: com.ibm.xsp.model.domino.wrapped.DominoDocument.

It's non-obvious, though, how to make use of it from the Java side. For that, there are a couple wrap static methods at the bottom of the list. The methods are overflowing with parameters, but they align with many of the attributes used when specifying an <xp:dominoDocument/> data source and are mostly fine to leave as blanks. So if you have a given Document, you can wrap it like so:

Document notesDoc = database.getDocumentByUNID(someUNID);
DominoDocument domDoc = DominoDocument.wrap(
	database.getFilePath(), // databaseName
	notesDoc,               // Document
	null,                   // computeWithForm
	null,                   // concurrencyMode
	false,                  // allowDeletedDocs
	null,                   // saveLinksAs
	null                    // webQuerySaveAgent
);

Once you do that, you can make use of the improved API. You can also serialize these objects, though I believe you have to call restoreWrappedDocument on it after deserialization to use it again. I do this relatively frequently when I'm in a non-ODA project and want some nicer methods and better data-type support.

Factories in XPages

Wed Nov 12 18:27:14 EST 2014

Tags: java xpages

In my last post, I intimated that I wanted to write a post covering the concept of factories in XPages, or at least the specific kind in question. This is that post.

The term "factory" covers a number of meanings, and objects named this way crop up all over the place (ODA has one, for example). The kind I care about today are those defined in an XspContributor object in an XPages plugin. These factories are generally (exclusively, perhaps) used for the purpose of generating adapters: assistant objects that allow the framework to perform operations on object types that may have no knowledge of the framework at all.

The way this generally takes form is this: when the framework needs to perform a specialized task, it asks the application (that is to say, the Application object that controls the XPages app) for a list of factories it knows about that conform to a given interface:

FacesContext facesContext = FacesContext.getCurrentInstance();
ApplicationEx app = (ApplicationEx)facesContext.getApplication();
List<ComponentMapAdapterFactory> factories = app.findServices(COMPONENT_MAP_SERVICE_NAME);

Once it has this list, it loops through all of them and passes the object it's trying to adapt. If the factory is written to understand that type of object, it will return an adapter object; if not, it will return null:

ComponentMapAdapter adapter = null;
for(ComponentMapAdapterFactory fac : factories) {
	adapter = fac.createAdapter(object_);
	if(adapter != null) {
		break;
	}
}

The framework can then use that adapter to perform the action on the object, without either the framework or the object knowing anything about the other:

for(String propertyName : adapter.getPropertyNames()) {
	// ...
}

XSP internally uses this for a great many things. One use that I've run into (and butted heads with) is to create adapter objects for dealing with file attachments. The DominoDocument class has a bit of information about attachments, particularly in its AttachmentValueHolder inner class, but doesn't on its own handle the full job of dealing with file upload and download controls. During the processes of getting a list of files for a given field from the document and handling the "delete document" method, the XPages framework looks up appropriate factories to handle the data type.

The reason for this indirection is so that this sort of operation can work with arbitrary data: in an ideal world, the XPages framework doesn't care at all that anything it does is related to Domino, and similarly the Domino data model objects don't care at all that they're embedded in the XSP framework. By allowing the model-object author (IBM, in this case) to say "hey, when you want to get a list of file attachments for an object of this type, I can help", it decouples the operation cleanly (it's broken in this case, but the theory applies). When the framework needs to process an object, it loops through the adapter factory classes, asks each one if it can handle the object in question, and takes the adapter from the first one that returns non-null.

At first blush, this setup seems overly contrived. After all, isn't this what interfaces are for? Well, in many cases, yes, interfaces would do the job. However, sometimes it makes sense to add this extra layer of indirection. Say, for example, that you're adapting between two Java classes you don't control: you can't modify the framework to support a class you want, and you can't modify the class to conform to an interface the framework understands. In that case, an adapter factory is a perfect shim.
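
As a generic illustration of that shim idea, the factory simply volunteers an adapter for the one class it understands and stays out of the way otherwise. The ThirdPartyThing names here are hypothetical; the factory and adapter interfaces are whichever ones the framework in question defines (here, the same ComponentMapAdapterFactory shape used above):

public class ThirdPartyThingAdapterFactory implements ComponentMapAdapterFactory {
	public ComponentMapAdapter createAdapter(Object obj) {
		if(obj instanceof ThirdPartyThing) {
			// This is the one class we know how to handle, so provide an adapter for it
			return new ThirdPartyThingAdapter((ThirdPartyThing)obj);
		}
		// Not ours - return null so another factory can claim it
		return null;
	}
}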

But in fact, I've even found it useful to adopt this structure when I control all of the code. When I was designing the model/component adapter in the frostillic.us Framework, I made the conscious decision to not tie the two sides (the controller and the model) together tightly. Instead, I wrote a pair of interfaces in the controller package: ComponentMapAdapterFactory and ComponentMapAdapter. This way, when the controller gets the order to create an input field for an object's "firstName" property, it loops through the list of ComponentMapAdapterFactorys to find one that fits. Over in the model package, I have an appropriate factory and adapter to handle my model framework.

I could have combined these two more tightly, but I enjoy the cleanliness this brings. I may not stick with my same model framework forever, and similarly the expectations of the controller class may change; because of this separation, it's clear where tweaks will need to be made. It also gives me the freedom to use the same component-building code with model objects I don't control, such as the adapter I wrote that pulls its configuration from Notes forms.


These types of factories are not something you're likely to run into during normal XPages development, but it may be useful to bear their existence in mind. The XPages framework is a great morass of moving parts, and so being able to chart the inner workings in your mental map one bit at a time can go a long way to mastering the platform.

Property Resolution in XPages EL

Tue Nov 11 11:24:21 EST 2014

Tags: xpages java

Reading Devin Olson's recent series on EL processing put me in the mood to refresh and fill out my knowledge of how that stuff actually happens.

A while back, I made a small foray of my own into explaining how property resolution in XPages EL works, one which I followed up with a mea culpa explaining that I had left out a few additional supported types. As happens frequently, that still didn't cover the full story. Before getting to what I mean by that, I'll step back to an overview of XPages EL in general.

Components of EL Processing

To my knowledge, there are three main conceptual pieces to the EL-resolution process. I'll use the EL #{foo.bar.baz[1]} as a common example.

  • The EL parser itself. This is what reads the EL above and determines that you want the property at index 1 of the baz property of the bar property of the foo object. I don't know if there's a realistic way to override or extend this stock-EL behavior.
    • This does, though, contain an extensible side path: BindingFactory. This lets you create your own processors for value and method bindings based on the EL prefix, in the same vein as #{javascript: ... }.
  • The VariableResolver. This is a relatively-common bit of XPages extensibility and for good reason: they're quite useful. The variable resolver is what is used by EL (and SSJS, and others) to determine what object is referenced by foo in the example.
  • The PropertyResolver. This is the companion to the VariableResolver and is what handles the rest of the dereferencing in the example. EL asks the app's property resolver to find the bar property of foo, then the baz property of that, and then the property at index 1 of that. This is the main topic of conversation today.

Setting An Application's PropertyResolver

There are two main ways I know of to make use of property resolvers, and the first is analogous to the VariableResolver: you write a single object and specify it in your faces-config file, like so:

<application>
	<property-resolver>config.PropResolver</property-resolver>
</application>

Then the skeletal implementation of such an object looks like this:

package config;

import javax.faces.el.EvaluationException;
import javax.faces.el.PropertyNotFoundException;
import javax.faces.el.PropertyResolver;

public class PropResolver extends PropertyResolver {
	private final PropertyResolver delegate_;

	public PropResolver(PropertyResolver delegate) {
		delegate_ = delegate;
	}

	@Override
	public Class<?> getType(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getType(obj, property);
	}

	@Override
	public Class<?> getType(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getType(obj, index);
	}

	@Override
	public Object getValue(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getValue(obj, property);
	}

	@Override
	public Object getValue(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getValue(obj, index);
	}

	@Override
	public boolean isReadOnly(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.isReadOnly(obj, property);
	}

	@Override
	public boolean isReadOnly(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.isReadOnly(obj, index);
	}

	@Override
	public void setValue(Object obj, Object property, Object value) throws EvaluationException, PropertyNotFoundException {
		delegate_.setValue(obj, property, value);
	}

	@Override
	public void setValue(Object obj, int index, Object value) throws EvaluationException, PropertyNotFoundException {
		delegate_.setValue(obj, index, value);
	}
}

If you've done much with DataObjects, you'll likely immediately recognize those methods: I imagine that DataObject was created as a simplest-possible implementation of a PropertyResolver-friendly interface.

So how might you use this? Well, most of the time, you probably shouldn't - in my experience, the standard property resolver is sufficient and using DataObject in custom objects is easy enough that it's the best path. Still, you could use this to patch the behavior that is driving Devin to madness or to paint over other persistent annoyances. For example, DominoViewEntry contains hard-coded properties for accessing the entry's Note ID, but not its cluster-friendly Universal ID. To fix this, you could override the non-indexed getValue method like so:

public Object getValue(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
	if (obj instanceof DominoViewEntry && "documentId".equals(property)) {
		return ((DominoViewEntry) obj).getUniversalID();
	}
	return delegate_.getValue(obj, property);
}

Now, anywhere where you have a DominoViewEntry, you can use #{viewEntry.documentId} to get the UNID. You could do the same for DominoDocument as well, if you were so inclined. You'll just have to plan to never have a column or field named "documentId" (much like you currently have to avoid "openPageURL", "columnIndentLevel", "childCount", "noteID", "selected", "responseLevel", "responseCount", and "id").

Property Resolver Factories

The other way to use PropertyResolvers is to register and use a PropertyResolverFactory. Unlike the faces-config approach, these do not override (all of) the default behavior, but are instead looked up by IBM's PropertyResolver implementation at a point during its attempts at property resolution. Specifically, that point is after support for ResourceBundle, ViewRowData, and DataObject and before delegation to Sun's stock resolver (which handles Maps, Lists, arrays, and generic POJOs).

If you get a type hierarchy, you can see that IBM uses this route for lotus.domino.Document, com.ibm.commons.util.io.json.JsonObject, and com.ibm.jscript.types.FBSObject (SSJS object) support. So the idea of this route is that you'd have your own custom object type which doesn't implement any of the aforementioned interfaces and for which you want to provide EL support beyond the normal getter/setter support. Normally, this is not something worth doing, but I could see it being useful if you have a third-party class you want to work in, such as a non-Domino/JDBC data source.

The method for actually using one of these is... counterintuitive, but is something you may have run into in plugin development. The first step is simple enough: implement the factory:

package config;

import javax.faces.el.PropertyResolver;
import com.ibm.xsp.el.PropertyResolverFactory;

public class PropResolverFactory implements PropertyResolverFactory {
	public PropertyResolver getPropertyResolver(Object obj) {
		return null;
	}
}

That getPropertyResolver method's job is to check to see if the object is one of the types it supports and, if it is, return a PropertyResolver object (the same kind as above) that will allow the primary resolver to get the property.
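
For example, a factory that only volunteers for one type might look like this - MyModelObject and MyModelPropertyResolver here are hypothetical stand-ins for your own classes, with the latter being a PropertyResolver along the lines of the earlier skeleton:

public class MyModelResolverFactory implements PropertyResolverFactory {
	public PropertyResolver getPropertyResolver(Object obj) {
		if(obj instanceof MyModelObject) {
			// A type we know how to handle, so hand back a resolver for it
			return new MyModelPropertyResolver();
		}
		// Not a type we handle - let the primary resolver continue on its way
		return null;
	}
}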

Actually registering the factory is weirder. It must be done via a plugin (or by code that manually registers it in the application when needed), and the best way to see what I mean is to take a look at an example: the OpenntfDominoXspContributor class used by the OpenNTF Domino API. The contributor is registered in the plugin.xml and returns an array of arrays (because Java doesn't have tuples or map literals) representing a unique name for your factory plus the implementing class.
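
As a rough sketch, the contributor class itself amounts to pairing a unique name with the factory class. Treat the getFactories() method name and the exact array shape here as assumptions to verify against the OpenNTF example rather than a definitive recipe:

public class MyXspContributor extends XspContributor {
	@Override
	public Object[][] getFactories() {
		return new Object[][] {
			// { unique name, implementing factory class }
			{ "config.MyModelResolverFactory", MyModelResolverFactory.class }
		};
	}
}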

This concept of factories actually probably warrants its own blog post down the line. For the time being, the upshot is that this approach is appropriate if you're adding your own data type via a plugin (and which wouldn't be better-suited to implement DataObject), so it's a rare use case indeed.

Generating JSON in XPages Applications

Thu Sep 18 17:55:44 EDT 2014

Tags: json java

This topic is fairly well-trodden ground, but there's no harm in trodding it some more: methods of producing JSON in the XPages environment. Specifically, this will be primarily about the IBM Commons JSON classes, found in com.ibm.commons.util.io.json. The reason for that choice is just that they ship with Domino - other tools (like Gson) are great too, and in some ways better.

Before I go further, I'd like to reiterate a point I made before:

Never, ever, ever generate code without proper escaping.

This goes for any executable, markup, or data language like this. It's tempting to generate XML or JSON in Domino views, but formula language lacks proper escape functions and so, unless you are prepared to study the specs for all edge cases (or escape every character), don't do it.

So anyway, back to the JSON generation. To my knowledge, there are three main ways to generate JSON via the Commons libraries: with a single call to process an existing Java object, by building a JsonJavaObject directly, or by "streaming" the output bit by bit. I've found the first and last methods to be useful, but there's nothing inherently wrong with the middle one.

Processing an existing object

With this route, you use a class called JsonGenerator to process an existing JSON-compatible object (usually a List or Map) into a String after building the object via some other mechanism. In a simple example, it looks like this:

Map<String, Object> foo = new HashMap<String, Object>();
foo.put("bar", "baz");
foo.put("ness", 1);
return JsonGenerator.toJson(JsonJavaFactory.instance, foo);

Overall, it's fairly straightforward: create your Map the normal way (or get it from some library), and then pass it to the JsonGenerator. Because it's Java, it forces you to also pass in the type of generator you want to use, and that's JsonJavaFactory's role. There are several instance objects that seem to vary primarily in how much they rely on the other Commons JSON classes in their implementation, and I have no idea what the performance or other characteristics are like. instance is fine.

Building a JsonJavaObject

An alternate route is to use the JsonJavaObject directly and then toString it at the end. This is very similar in structure to the previous example (because JsonJavaObject inherits from HashMap<String, Object> directly, for some reason):

JsonJavaObject foo = new JsonJavaObject();
foo.put("bar", "baz");
foo.put("ness", 1);
return foo.toString();

The result is effectively the same. So why do I avoid this method? Mostly for flexibility reasons. Conceptually, I don't like assuming that the data is going to end up as a JSON string right from the start unless there's a compelling reason to do so, and so it's strange to use JsonJavaObject right out of the gate. If your data starts coming from another source that returns a Map, you'll need to adjust your code to account for it, whereas the first approach handles that change with little or no modification.

Still, it's no big problem if you use this. Presumably, it will be slightly faster than the first method, and it's often functionally identical anyway (considering it is a Map<String, Object>).

Streaming

This one is the weird one, but it will be familiar if you've ever written a renderer (stay tuned to NotesIn9 for an example). Rather than constructing the entire object, you push out bits of the JSON code as you go. There are a couple reasons you might do this: when you want to keep memory/processor use low when dealing with large objects, when you want to make your algorithm recursive without, again, using up too much memory, or when you're actually streaming the result to the client. It's the ugliest method, but it's potentially the fastest and most efficient. This is what you want to use if you're writing out a very large collection (say, filtered view data) in an XAgent or other servlet.

I'll leave out an example of using it in a streaming context for now, but if you're curious you can find examples in the DAS code in the Extension Library or in the REST API code for the frostillic.us model objects (which is derived wholesale from DAS).

The key object here is JsonWriter, which is found in the com.ibm.commons.util.io.json.util sub-package. Much like with other Writers in Java, you hook this up to a further destination - in this example, I'll use a StringWriter, which is a basic way to write into a String in memory and return that. In other situations, this would likely be the ServletOutputStream. Here's an example of it in action:

StringWriter out = new StringWriter();
JsonWriter writer = new JsonWriter(out, false);

writer.startObject();

writer.startProperty("bar");
writer.outStringLiteral("baz");
writer.endProperty();

writer.startProperty("ness");
writer.outIntLiteral(1);
writer.endProperty();

Map<String, Object> foo = new HashMap<String, Object>();
foo.put("bar", "baz");
foo.put("ness", 1);
writer.startProperty("foo");
writer.outObject(foo);
writer.endProperty();

writer.endObject();

writer.flush();
return out.toString();

As you can tell, the LOC count ballooned fast. But you can also tell that it makes a kind of sense: you're doing the same thing, but "manually" starting and ending each element (there are also constructs for arrays, booleans, etc.). This is very similar to writing out HTML/XML using equivalent libraries. And it's good to know that there's always a fallback to output a full object in the style of the first example when it's appropriate. For example, you might be outputting data from a large view - many entries, but each entry is fairly simple. In that case, you'd use this writer to handle the array structure, but build a Map for each entry and add that to keep the code simpler and more obvious.
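
Here's a sketch of that hybrid pattern. It assumes JsonWriter's array methods (startArray, startArrayItem, and friends) behave like the object-oriented ones above, and entries is a stand-in for whatever produces your per-entry Maps:

StringWriter out = new StringWriter();
JsonWriter writer = new JsonWriter(out, false);

writer.startArray();
for(Map<String, Object> entryData : entries) {
	writer.startArrayItem();
	// Let the writer serialize the whole Map for this entry in one go
	writer.outObject(entryData);
	writer.endArrayItem();
}
writer.endArray();

writer.flush();
return out.toString();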


So none of these are right in all cases (and sometimes you'll just do toJson(...) in SSJS), but they're all good to know. Most of the time, the choice will be between the first and the last: the easy-to-use, just-do-what-I-mean one and the cumbersome, really-crank-out-performance one.

A Centralized Bean for Translation

Mon Sep 01 16:54:59 EDT 2014

Tags: java

The normal method for doing translation in XPages is by using the built-in Designer tooling, which creates properties files for each language for each XPage in your app. This is okay, though it requires using Designer to update the properties (since apparently the translation happens as part of the build process). For the refresh of my blog I'm working on, I'm taking a different tack, inspired by the way a client does it and similar to how I do it in the Framework.

Specifically, I have a centralized bean named "translation" that accepts strings and returns a best-match translation for the current session's locale. This "best match" is the routine that the XSP framework uses for getting resource bundles. So, taking my browser as an example, requesting the bundle named "translation" will cause it to search the classpath for these files until it finds one:

  1. translation_en_US.properties
  2. translation_en.properties
  3. translation.properties
  4. translation_fr.properties

(I'm not sure why it uses translation.properties before translation_fr.properties - maybe it assumes that it's the English strings when there's no locale code on the file).

So I take advantage of this by creating a DataObject bean to do my translation:

package config;

import java.io.IOException;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
import java.util.MissingResourceException;
import java.util.ResourceBundle;

import javax.faces.context.FacesContext;

import com.ibm.xsp.application.ApplicationEx;
import com.ibm.xsp.designer.context.XSPContext;
import com.ibm.xsp.model.DataObject;

import frostillicus.xsp.bean.SessionScoped;
import frostillicus.xsp.bean.ManagedBean;
import frostillicus.xsp.util.FrameworkUtils;

@ManagedBean(name="translation")
@SessionScoped
public class Translation implements Serializable, DataObject {
	private static final long serialVersionUID = 1L;

	public static Translation get() {
		Translation existing = (Translation)FrameworkUtils.resolveVariable(Translation.class.getAnnotation(ManagedBean.class).name());
		return existing == null ? new Translation() : existing;
	}

	private transient ResourceBundle bundle_;
	private Map<Object, String> cache_ = new HashMap<Object, String>();

	public Class<String> getType(final Object key) {
		return String.class;
	}

	public String getValue(final Object key) {
		if(!cache_.containsKey(key)) {
			try {
				ResourceBundle bundle = getTranslationBundle();
				cache_.put(key, bundle.getString(String.valueOf(key)));
			} catch(IOException ioe) {
				throw new RuntimeException(ioe);
			} catch(MissingResourceException mre) {
				cache_.put(key, "[Untranslated " + key + "]");
			}
		}
		return cache_.get(key);
	}

	public boolean isReadOnly(final Object key) {
		return true;
	}

	public void setValue(final Object key, final Object value) {
		throw new UnsupportedOperationException();
	}


	private ResourceBundle getTranslationBundle() throws IOException {
		if(bundle_ == null) {
			FacesContext facesContext = FacesContext.getCurrentInstance();
			ApplicationEx app = (ApplicationEx)facesContext.getApplication();
			bundle_ = app.getResourceBundle("translation", XSPContext.getXSPContext(facesContext).getLocale());
		}
		return bundle_;
	}
}

This uses the Framework's annotation-based managed-bean declaration, but it'd work just fine in faces-config. This allows you to use EL to request a translation, such as #{translation.home}, #{translation['home']}, or #{translation[someVar.prop]}, or by using translation.getValue(...) in Java or JavaScript.

I've found this approach to be much easier to work with. There's only one central file to manage (you could split it up into multiple beans), you don't have to re-generate files for new languages, and the keys can be natural and human-friendly, instead of XPaths.

In addition, you could easily change this bean to get its translation information elsewhere without modifying the XPages that use it - it could look up, for example, against Domino documents to allow non-developers to change the translation values without any special rights (though you'd likely want to tweak the cache in that case).
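
As a hedged sketch of that document-backed variant - the "Translations" view and its one-item-per-language layout are hypothetical, and the caching is left as naive as before - getValue could be adapted along these lines:

public String getValue(final Object key) {
	if(!cache_.containsKey(key)) {
		try {
			Database database = ExtLibUtil.getCurrentDatabase();
			View translations = database.getView("Translations");
			Document transDoc = translations.getDocumentByKey(String.valueOf(key), true);
			if(transDoc == null) {
				cache_.put(key, "[Untranslated " + key + "]");
			} else {
				// Assume one item per language, named by its two-letter code
				String lang = XSPContext.getXSPContext(FacesContext.getCurrentInstance()).getLocale().getLanguage();
				cache_.put(key, transDoc.getItemValueString(lang));
			}
		} catch(NotesException ne) {
			throw new RuntimeException(ne);
		}
	}
	return cache_.get(key);
}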

How I Learned to Stop Worrying and Love C Structs

Mon Aug 04 22:38:04 EDT 2014

Tags: c java

Since a large amount of on-site client time has put my framework work on hold for a bit, I figured I'd continue my dalliance in the world of raw item data.

Specifically, what I've been doing is filling out the collection of structs with Java wrappers, particularly in the "cd" subpackage. For someone like me who has only done bits and pieces of C over the years, this is a fruitful experience, particularly since I'm dealing with just the core concept of the Notes API data structures without the messy business of actually worrying about memory or crashing programs.

If you're not familiar with structs, what they are is effectively just the data portion of a class. They're like a blueprint of related pieces of data - ints, doubles, other structs, etc. - used both for creating new elements and as a contract for the type of data received from an API. The latter is what I'm dealing with: since the raw item data in DXL is presented as just a series of bytes, the struct documentation is vital in figuring out what to expect in which spot. Here is a representative example of a struct declaration from the Notes API, one which includes bonus concepts:

typedef struct{
	LSIG	Header;		/* Signature and Length */
	WORD 	FileExtLen;		/* Length of file extenstion */
	DWORD	FileDataSize;	/* Size (in bytes) of the file data */
	DWORD	SegCount;		/* Number of CDFILESEGMENT records expected to follow */
	DWORD	Flags;		/* Flags (currently unused) */
	DWORD	Reserved;		/* Reserved for future use */
	/*	Variable length string follows (not null terminated).
		This string is the file extension for the file. */
} CDFILEHEADER;

So... that's a thing, isn't it? It represents the start of a File Resource (or CSS file, Java class, XPage, etc.). I'll start from the top:

  1. LSIG Header: An "LSIG" is actually another struct, but a smaller one: its job is to let a processor (like my code) identify the following record, as well as providing the total size of the record. When processing the byte stream, my code looks to the first two bytes of each record to determine what to expect for the next block.
  2. WORD FileExtLen: To start with, "WORD" here is just an alias for an "unsigned short" - a 16-bit integer ranging from 0 to 65535 (64K). The term "word" itself refers to the computer-architecture term. This field specifically denotes the number of bytes to expect after the record for the file extension of the file resource being defined.
    • The "unsigned" above refers to the way the numbers are stored in memory, which is to say a series of binary digits. By default, the highest-value bit is reserved for the "sign" - a 0 for positive and a 1 for negative. Normal "signed" numbers can store positive values only half the size of their unsigned counterparts, because going from "0111 1111" (+127) wraps around to "1000 0000" (-127). This is why the "half" values - 32K, 2G - are seen as commonly as their "full" counterparts. As to why negative numbers are represented with all those zeros, you'll either have to read up independently or just trust me for now.
  3. DWORD FileDataSize: A "DWORD" is double the size of a "WORD" (hence the "D", presumably): it's an unsigned 32-bit number, ranging from 0 to 4,294,967,295 (4G). This number represents the total size of the file attachment, and so also presumably represents the maximum size of a file resource in Domino.
  4. DWORD SegCount: This indicates the number of "segment" records expected to follow. Segment records are similar to this header we're looking at, but contain the actual file data, split across multiple chunks and across multiple items. That "multiple items" bit is due to the overall structure of rich-text items, and is why you'll often see rich text item names (like "Body" or "$FileData") repeated many times within a note.
  5. DWORD Flags: As the comment in the API indicates, this field is currently unused. However, it's representative of something that is used very commonly in other structures: a set of bits that isn't useful as a number, but instead has values set to correspond to traits of the entity in question. This is referred to as a bit field and is an efficient way to store a fixed number of flags. So if you had a four-bit flags field for a text item, the bits may indicate whether it's summary data, a names field, a readers field, and/or an authors field - for example "1100" for a field that's summary and names, but not readers or authors. That is, incidentally, basically how those flags are implemented, albeit with a larger bit field.
    • There is a secondary type of "flag" in Notes: the "$Flags" and "$FlagsExt" fields in design elements. Those flags are less efficient - 8 bits per flag instead of 1 - but are generally more extensible and somewhat conceptually easier.
    • These bit fields are what those oddball bitwise operators are for, by the way.
  6. DWORD Reserved: Many (most?) C API data structures contain at least one block of bits like this that is reserved for future use. These are so that IBM can add features without breaking all existing code: API users are expected to leave anything there intact and to create new structures with all zeros. You can see this in action in a couple places, such as the addition of rudimentary theme support to legacy design elements, which used up a byte out of the previously-reserved block.
  7. Variable data: This is the fun part. Many structures contain "variable" data following the "fixed" portion. Whereas the previous parts are a predictable size - a DWORD will always be four bytes - the parts following the block can be anywhere from 0 bytes to whatever is the max value of their referring entity. Fortunately, the "SIG" at the beginning of the record tells the API user the length ahead of time, so it's not required to read it all in when dealing with an API entity. Still, the code to read this can be complex and bug-prone, particularly when there are multiple variable parts packed together. In this case, though, it's relatively simple: we get the value of "FileExtLen" and read that many bytes into an array and convert that from LMBCS to a respectable Unicode string.

Dealing with a stream of structs is... awkward at first, particularly when you're used to Java amenities like being able to just pour out and read back in serialized objects without even thinking twice about it. After a while, though, it gets easier, and you get an appreciation for a lower-level type of programming. And while you still may not be happy about Domino's 32K summary-data limit, you at least get an understanding for why it's there.
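
To make that concrete, here's a hedged Java sketch of reading the fixed portion of a CDFILEHEADER out of a byte array. recordData is a stand-in, and this assumes little-endian data and that an LSIG is a 16-bit signature followed by a 32-bit length; it also fakes the LMBCS conversion with ASCII for brevity:

ByteBuffer buf = ByteBuffer.wrap(recordData).order(ByteOrder.LITTLE_ENDIAN);

int signature = buf.getShort() & 0xFFFF;        // LSIG.Signature (WORD)
long recordLength = buf.getInt() & 0xFFFFFFFFL; // LSIG.Length (DWORD)
int fileExtLen = buf.getShort() & 0xFFFF;       // FileExtLen (WORD)
long fileDataSize = buf.getInt() & 0xFFFFFFFFL; // FileDataSize (DWORD)
long segCount = buf.getInt() & 0xFFFFFFFFL;     // SegCount (DWORD)
buf.getInt();                                   // Flags (currently unused)
buf.getInt();                                   // Reserved

// The variable-length portion: FileExtLen bytes of file extension
byte[] extBytes = new byte[fileExtLen];
buf.get(extBytes);
String fileExtension = new String(extBytes, Charset.forName("US-ASCII")); // real code needs LMBCS handling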

So if you're in a position to dive into this sort of thing once in a while, I recommend you do so. Though the value of what I'm writing specifically is mixed - I doubt the world needs, say, a Java wrapper for a CD record reflecting a DECS field association - the benefit to my brain is immense. Dealing with web and Java programming can cause you to become very disconnected from the fundamentals of programming, and something like this can bring you back to solid ground. Give it a shot!

Building an App with the frostillic.us Framework, Part 6

Wed Jul 23 14:46:54 EDT 2014

Tags: java
  1. Building an App with the frostillic.us Framework, Part 1
  2. Building an App with the frostillic.us Framework, Part 2
  3. Building an App with the frostillic.us Framework, Part 3
  4. Building an App with the frostillic.us Framework, Part 4
  5. Building an App with the frostillic.us Framework, Part 5
  6. Building an App with the frostillic.us Framework, Part 6
  7. Building an App with the frostillic.us Framework, Part 7
  1. Define the data model
  2. Create the view and add it to an XPage
  3. Create the editing page
  4. Add validation and translation to the model
  5. Add notification to the model
  6. Add sorting to the view
  7. Basic servlet
  8. REST with Angular.js

Add sorting to the view

This step in the framework example is very similar to the second in that there's nothing particularly Framework-y about it.

Technically, the work in the title of this step is already done: I had already enabled click-to-sort and set sortable="true" in the XSP source. That on its own is enough to, as with a normal view, enable both available sort directions in the view panel:

So... that was easy! While we're at it, we'll go back in and add a categorized column. This, too, has no gotchas:

In my case, Designer gave this category a programmatic name of "$4", so we'll use that in the modified XSP for the view panel:

<xp:viewPanel value="#{Notes.all}" pageName="/note.xsp">
	<xp:this.facets>
		<xp:pager xp:key="headerPager" id="pager1" partialRefresh="true" layout="Previous Group Next" />
	</xp:this.facets>
	
	<xp:viewColumn columnName="$4"/>
	<xp:viewColumn columnName="Title" displayAs="link">
		<xp:viewColumnHeader value="Title" sortable="true"/>
	</xp:viewColumn>
	<xp:viewColumn columnName="Body">
		<xp:viewColumnHeader value="Body"/>
	</xp:viewColumn>
</xp:viewPanel>

This gives us the categorized column we expect, including the ability to collapse the categories like normal:

And, well, that's it too. Because of my extensive use of view features on the back end, Framework collections inherit the same features and (most of) the performance of the existing view data sources. That's a lot of the point: taking these model collections and using them in the ways you're used to using views in XPages apps is meant to be easy and not something you have to think too much about.

Generating Toaster, dGrowl, etc. Notifications From Server Code

Tue Jul 22 12:30:32 EDT 2014

In yesterday's Framework-series post, I offhandedly mentioned that the "flashMessage" routine I use could easily be paired with or replaced by "toaster"-style notifications. This turned out to be very coincidental, as just today a NotesIn9 episode went up featuring Brad Balassaitis talking about using dGrowl for this purpose, along with an accompanying blog post.

Whenever I've used something like this, I've run into situations where I want to generate the message from the server, say as part of a partial refresh event to save an object. One option in that situation is to attach JavaScript to the "onComplete" and "onError" properties of the event handler (the latter of which is something I should do more regularly), but that doesn't give you access to all the information on the server that you may want to use in the message.

What I really want to do is send customized JavaScript code from the server back to the client, just for that event. Fortunately, there's an out-of-the-way method in the XPages stack to allow you to do this: UIViewRootEx#postScript. Though the documentation on that page is copious and detailed, it may not quite give you an indication of what that method does. What it does is take a string of (client) JavaScript code, wrap it up, and include it in the currently-being-generated response to the browser, to be executed at the end of the current event (the partial refresh or, probably, the initial page load). You can get to the view root object by resolving the "view" variable and casting it to an appropriate class:

UIViewRootEx2 view = (UIViewRootEx2)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "view");

So I've used this in a couple situations, and here's an example of a method I have in my utils class for a current project. The project uses a WrapBootstrap theme that came with Gritter, which is a jQuery plugin for a similar effect, but it could be adapted for either Dojo module.

public static void toaster(final String summary, final String detail, final boolean sticky, final String styleClass) {
	StringBuilder result = new StringBuilder();
	result.append("jQuery.gritter.add({\n");
	if(StringUtil.isNotEmpty(summary)) {
		result.append("\ttitle: ");
		JSUtil.addString(result, summary);
		result.append(",\n");
	}
	if(StringUtil.isNotEmpty(detail)) {
		result.append("\ttext: ");
		JSUtil.addString(result, detail);
		result.append(",\n");
	}
	result.append("\tsticky: " + sticky + ",\n");
	
	String effectiveStyleClass = (styleClass == null ? "gritter-info" : styleClass);
	result.append("\tclass_name: ");
	JSUtil.addString(result, effectiveStyleClass);
	result.append("\n");
	
	result.append("})");

	FrameworkUtils.getViewRoot().postScript(result.toString());
}

So, first off: yikes. This is an example of why Java is awful at string handling and generating other languages. However, this ugliness is in service of a vital lesson that goes beyond this task:

Never, ever, ever generate code without proper escaping.

This is why it's so problematic to generate JSON or XML via view columns: formula language lacks escaping functions for those languages, and most code I've seen neglects to include its own. If you don't escape strings you're adding into another language, you're asking for parsing bugs at best and code-injection attacks at worst.

But back to the code at hand. I'm using the com.ibm.xsp.util.JSUtil class that comes with the XPages framework to assist in writing out JavaScript code. The result is that you can write something like this in Java:

toaster("Some alert!", "These are \"alert\" details", false, "gritter-error");

...and end up with code like this sent to the browser:

jQuery.gritter.add({
	title: "Some alert!",
	text: "These are \"alert\" details",
	sticky: false,
	class_name: "gritter-error"
})

In my example from yesterday, if I were using a partial refresh instead of a page redirection, I could use this method to alert the user of a successful save.

Because my model framework publishes FacesMessages for event conditions in a Faces environment, I've also written some companion methods to translate them into toaster events and to drain the queue, which is useful during a partial refresh:

public static void toastMessage(final FacesMessage message) {
	String styleClass = null;
	if(FacesMessage.SEVERITY_ERROR.equals(message.getSeverity())) {
		styleClass = "gritter-error";
	} else if(FacesMessage.SEVERITY_FATAL.equals(message.getSeverity())) {
		styleClass = "gritter-error";
	} else if(FacesMessage.SEVERITY_INFO.equals(message.getSeverity())) {
		styleClass = "gritter-info";
	} else if(FacesMessage.SEVERITY_WARN.equals(message.getSeverity())) {
		styleClass = "gritter-warning";
	} else {
		styleClass = "gritter-success";
	}
	toaster(message.getSummary(), message.getDetail(), false, styleClass);
}
@SuppressWarnings("unchecked")
public static void toastMessages() {
	Iterator<FacesMessage> messages = FacesContext.getCurrentInstance().getMessages();
	while(messages.hasNext()) {
		FacesMessage message = messages.next();
		toastMessage(message);
	}
}

This sort of thing can be a useful way to bridge the client and server sides of the app without giving up client UI niceties.

Building an App with the frostillic.us Framework, Part 5

Mon Jul 21 15:46:09 EDT 2014

Tags: java
  1. Building an App with the frostillic.us Framework, Part 1
  2. Building an App with the frostillic.us Framework, Part 2
  3. Building an App with the frostillic.us Framework, Part 3
  4. Building an App with the frostillic.us Framework, Part 4
  5. Building an App with the frostillic.us Framework, Part 5
  6. Building an App with the frostillic.us Framework, Part 6
  7. Building an App with the frostillic.us Framework, Part 7
  1. Define the data model
  2. Create the view and add it to an XPage
  3. Create the editing page
  4. Add validation and translation to the model
  5. Add notification to the model
  6. Add sorting to the view
  7. Basic servlet
  8. REST with Angular.js

Add notification to the model

The next step in building our Framework app is a simple one: adding some basic visual notification when a user saves a note. We'll leave the navigation rule to send the user to the home page in place and use flashScope (included in the Framework) and a "messages" custom control included in the scaffolding DB to add a message.

There are a few places where I could add this code, primarily either the model or the controller. Since it makes sense in this context, I'll add it to the model, via this additional code in the model.Note class:

@Override
protected void postSave() {
	super.postSave();

	FrameworkUtils.flashMessage("confirmation", "Note saved successfully.");
}

The flashMessage method in FrameworkUtils is a convenience method for exactly this sort of thing. It adds the message string to a List stored in the flashScope under a key composed of the provided first parameter plus "Messages". So, in this case, #{flashScope.confirmationMessages} ends up containing ["Note saved successfully."].
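
For illustration, a minimal approximation of such a helper - not the Framework's actual code - might look like this:

@SuppressWarnings("unchecked")
public static void flashMessage(final String category, final String message) {
	Map<String, Object> flashScope = (Map<String, Object>)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "flashScope");
	String key = category + "Messages";
	List<Object> messages = (List<Object>)flashScope.get(key);
	if(messages == null) {
		messages = new ArrayList<Object>();
		flashScope.put(key, messages);
	}
	messages.add(message);
}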

These values are read by the xc:messages control included in the standard xc:layout control in the scaffolding DB. It outputs both the normal xp:messages notifications as well as these flash-scoped ones - currently, the classes for the messages are hard-coded in the control and should be tweaked for use with OneUI, Bootstrap, or another UI framework, but eventually I'll shift those out to themes. Since I'm using OneUIv3 for this demo, I set the classes in the control to the appropriate values. The result is a small message that appears on the first load of a subsequent page and disappears thereafter:

Fancy? No, but it doesn't need to be. This same postSave hook could be used to do much more complex things whenever a note is saved in any context, such as sending a notification email, writing a log entry, or triggering a toaster notification (forgive the missing images on that link).

How I Deal With Collections in the Framework

Sun Jul 20 10:30:13 EDT 2014

Tags: java

A post on Christian Güdemann's blog and a followup on Nathan Freeman's made me figure it could be useful to discuss how I deal with the job of working with document collections in the frostillic.us Framework.

First of all: I am not solving the same original problem Christian had. His post discussed strategies for actually opening and processing every document and secondarily building a collection using inadvisable selection formulas for views (namely, @Today). I don't generally do that stuff, so our paths diverge.

The collection code in the Framework is very focused on dancing on top of existing view indexes: using them for collecting documents, for sorting/categorization, and accessing summary data. In many ways, the main Framework collections can be thought of as a re-implementation of the Domino view data source in XPages, except without the same cheating they do and with some side benefits.

Core

The essence of a Framework collection - a DominoModelList - is that it stores information about how to access the underlying view, which category filter to use, and which model class to use to create objects. This can be seen in its constructor; it grabs important metadata about the view and that's about it. It's not until it's required that the list does the dirty work of actually fetching a ViewNavigator (or re-using one in the current request). As much as possible, I aim to use ViewNavigators - which is to say, whenever there's not an FT search involved. Navigators are (for Domino) highly efficient, particularly compared to the shockingly-bad performance characteristics of ViewEntryCollection.

The primary consumer of the navigator is the get(int) method (among other things, collections implement List). If you'll kindly ignore some workaround code for a bug I haven't reliably been able to pin on either my code or the OpenNTF API yet, you can see the navigator use in action: unless there's a search in place (in which case it falls back to VEC), the method fetches an active navigator, skips to the appropriate requested entry, and uses getCurrent() to retrieve it. Though the skip/retrieve mix is odd when the average case will be iterating over successive entries, the performance is speedy enough that I haven't felt the need to try to cover both random and sequential access differently.
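
Stripped of the caching and workaround code, the heart of that get(int) approach looks roughly like this - view and index are stand-ins, and this assumes ViewNavigator.skip(int) is available in your Domino release:

ViewNavigator nav = view.createViewNav();
nav.gotoFirst();
if(index > 0) {
	// Jump ahead to the requested position rather than walking entry by entry
	nav.skip(index);
}
ViewEntry entry = nav.getCurrent();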

Sorting

Since 8.5.0, we have the ability to use click-to-sort columns in the back-end API, and I make use of this to expose sortable columns through TabularDataModel's sorting methods. Once a sorted column is chosen, I pass that along to the underlying view when fetching it. If there's an FT search query specified, I use the surfaced-in-8.5.3 FTSearchSorted method to retain the sorting.

Collapsible Categories

In 9.0.1, IBM added the ability to collapse categories in navigators via the setAutoExpandGuidance method paired with the view's setEnableNoteIDsForCategories. Using TabularDataModel's expand/collapse methods, I maintain a Set of the faux category note IDs generated when setEnableNoteIDsForCategories is enabled and pass them to the navigator when appropriate. This allows my code to deal with arbitrarily-collapsed categories without containing any other special code - considering how uselessly buggy my implementation of this was before 9.0.1, I'm quite happy those methods are there.

The combination of the sorting and categorization capabilities means that my collections are able to support the same xp:viewPanel UI features that standard xp:dominoView data sources are.

Deferred Data Access

The final key concept in the framework is the deferral of actually accessing a document until it's necessary. Each model object can be constructed in one of three ways: as a new document in a database, as a wrapper around an existing document, and as a wrapper around a view entry. In the view entry case, the model object doesn't touch the underlying document. Instead, it makes a note of the database path and the UNID (if it's a non-category entry) in case it DOES need to access the document later, and then stores the entry's column values in a map. While the model objects don't make a user-side distinction between view entries and documents (you can request any item value whether or not it's in the view), it DOES use these cached column values first. So if your code only requests values that are present in the view, the underlying document is never accessed at all. This leads to exceedingly-efficient (comparatively) data access without making the user of the objects worry about manually accessing the document if the value isn't in the view.
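
A rough sketch of that deferral - with hypothetical field names rather than the Framework's actual internals - is a value getter that prefers the cached column values and only opens the document as a last resort:

public Object getValue(final String itemName) throws NotesException {
	// columnValues_ was populated from the ViewEntry at construction time
	if(columnValues_.containsKey(itemName)) {
		return columnValues_.get(itemName);
	}
	// Not in the view - now (and only now) open the backing document
	if(document_ == null) {
		Database database = session_.getDatabase("", databasePath_);
		document_ = database.getDocumentByUNID(universalId_);
	}
	return document_.getItemValue(itemName);
}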


The result of all this is that the Framework collections share all the advantages and pitfalls of the underlying views. Some things are easy and fast (categorization, sorting, multi-entry documents, summary data) while some are still impractical or slow (Rich Text, MIMEBeans, arbitrary queries). But you go to production with the database indexer you have, and so far this method has been serving me well.

Dabbling in Reflection

Mon Jul 07 15:00:07 EDT 2014

Tags: java

In a similar vein to my post from the other day, I'd like to discuss one of the other bugaboos that crops up from time to time in the edges of XPages development, usually spoken of in hushed tones: reflection.

If you're not familiar with the term, reflection is an aspect of metaprogramming; specifically, it's the ability to inspect and operate on objects and classes at runtime. Reflection in Java isn't as smooth as it is in other languages, but it makes sense as you work with it.

The most basic form is the getClass() that every non-null object has: you can use it to get an object representing the class of the object in question, which is your main starting point. This code will print out "java.lang.String":

Class<?> stringClass = "".getClass();
System.out.println(stringClass.getName());

From here, you can start to get fancier. For example, this code...

Class<?> stringClass = "".getClass();

for(java.lang.reflect.Method method : stringClass.getMethods()) {
	System.out.println(method);
}

... will print out the signatures for all public instance and static methods, with output like:

public boolean java.lang.String.equals(java.lang.Object)
public int java.lang.String.hashCode()
public java.lang.String java.lang.String.toString()
public char java.lang.String.charAt(int)
public int java.lang.String.compareTo(java.lang.String)
...
public int java.lang.String.compareTo(java.lang.Object)
public final native java.lang.Class java.lang.Object.getClass()
public final native void java.lang.Object.notify()
public final native void java.lang.Object.notifyAll()

There is also a variant to show just the methods from that specific class, including private/protected ones. But the java.lang.reflect.Method objects you get there are useful for more than just printing out signatures: they're proxies for actually executing the method against an appropriate object. To demonstrate, say you have this class:

public class ExampleClass {
	private String name;

	public ExampleClass(final String name) { this.name = name; }

	public void printName() { System.out.println(name); }
}

You can then use reflection to call the method (this also demonstrates an alternate way of getting a class object if you know the class you want):

ExampleClass someObject = new ExampleClass("my name is someObject");
Method printNameMethod = ExampleClass.class.getMethod("printName");
printNameMethod.invoke(someObject);

So, okay, I found a way to call a method that involves way more typing for no gain - so what? Well, in a situation where reflection is actually useful, you won't actually know what methods are available. And this happens constantly in XPages, but you don't have to worry about it: reflection is how EL and SSJS do their work. If you've ever looked through one of the monstrous stack traces emitted by an XPage with an exception, you've probably seen this very call:

java.lang.Exception
    controller.home.getOutput(home.java:18)
    sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    java.lang.reflect.Method.invoke(Method.java:611)
    com.sun.faces.el.PropertyResolverImpl.getValue(PropertyResolverImpl.java:96)
	...

That's reflection in action. The EL resolver found the object, used reflection (along with checks like instanceof) to determine how it should get the "output" property, and then, determining that it was talking to a plain bean, constructed the name "getOutput", asked the object's class for that method, and invoked it.
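
In rough terms - and glossing over the real resolver's many other checks - that last step is plain reflection. Here, homeController is a hypothetical bean with a getOutput() method:

String property = "output";
String getterName = "get" + Character.toUpperCase(property.charAt(0)) + property.substring(1);
Method getter = homeController.getClass().getMethod(getterName);
Object value = getter.invoke(homeController);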

Like most things in Java, reflection gets more complicated and powerful from here; there are facilities for dealing with generic types, creating new instances, reading annotations, making private properties/methods accessible, and so forth. But the basics are simple and worth knowing: it's all about asking an object what it is and what it is capable of. Once you're comfortable with that, the rest of the pieces will fall into place as you encounter them.

XPages Data Caching and a Dive Into Java

Wed Jul 02 16:39:54 EDT 2014

Tags: java

One of the most common idioms I run into when writing an app is the desire to cache some expensive-to-compute data in the view or application scope on first fetch, and then use the cached version from then on. It usually takes a form like this:

public Date getCachedTime() {
	Map<String, Object> applicationScope = ExtLibUtil.getApplicationScope();
	if(!applicationScope.containsKey("cachedTime")) {
		// Some actual expensive operation goes here
		applicationScope.put("cachedTime", new Date());
	}
	return (Date)applicationScope.get("cachedTime");
}

That works well, but it bothered me that I had to write the same boilerplate code every time, and I wondered if I could come up with a better way. The short answer is "no, not really". But the longer answer is "sort of!". And though the solution I came up with is kinda pathological and is generally unsuitable for normal humans and not worth the thin advantages, it does provide a good example of a couple of more-advanced Java concepts that you're likely to run across the more you get into the language. So here's the cache function in question, plus a "shorthand" version and an example:

@SuppressWarnings("unchecked")
public static <T> T cache(final String cacheKey, final String scope, final Callable<T> callable) throws Exception {
	Map<String, Object> cacheScope = (Map<String, Object>) ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), scope + "Scope");
	if (!cacheScope.containsKey("cacheMap")) {
		cacheScope.put("cacheMap", new HashMap());
	}
	Map<String, Object> cacheMap = (Map<String, Object>) cacheScope.get("cacheMap");
	if (!cacheMap.containsKey(cacheKey)) {
		cacheMap.put(cacheKey, callable.call());
	}
	return (T) cacheMap.get(cacheKey);
}

public static <T> T cache(final String cacheKey, final Callable<T> callable) throws Exception {
	return cache(cacheKey, "application", callable);
}

public Date getFetchedTime() throws Exception {
	return cache("fetchedTime", new Callable<Date>() {
		public Date call() throws Exception {
			return new Date();
		}
	});
}

This code begs a question: what in the name of all that is good and holy is going on? There are a number of Java concepts at work here, some of which you've likely already seen:

  • Generics. Generics are the class names in angle brackets, like Map<String, Object> and Callable<Date>. They're Java's way of taking a container or producer class and making it usable with any class types without having to cast everything from Object.
  • Overloading methods. This is a small one: you can declare the same method name with different parameter types, which is often useful for "shorthand" versions of methods that allow for default values after a fashion. In this case, the short version of "cache" defaults to using the application scope by name.
  • The Callable interface. This is an interface in the java.util.concurrent package and is meant as a utility for multithreading purposes, but it also serves our needs here. Basically, it's an interface that means "this contains code that can be executed by using the call method with no arguments".
  • Anonymous classes. This is that business with "new Callable...". Java calls those anonymous classes and what they are are classes declared and instantiated inline with the code, without having a "proper" class declaration somewhere else. They're called "anonymous" because they have no name of their own - just the interface or class they implement/extend. In this case, I'm saying "make a new object that implements Callable and is used for this purpose only". I could make a standalone class, but there's no need. If you're familiar with closures from good languages, anonymous classes can be used like a really crappy version of those.
  • Generic return value declarations. The "<T> T" and "Callable<T>" bits build on normal generic use when declaring a method. The first one, in brackets, basically says "okay Java, rather than returning a known object type, I'm going to let the user use anything, and for this purpose we're going to call it T". So from then on, an occurrence of T in that method body refers to whatever the programmer using the method intends. You can see in getFetchedTime that I'm using a Callable<Date>; Java picks up on that "Date" name and conceptually substitutes it for all the Ts in the method. I'm essentially saying "I'm going to use the cache method, and I want it to accept and return Dates". Another call to the method could be to cache a List<String> or a SomeUserClass, while the method itself would remain the same.

So is all this worth it for the task at hand? Probably not. There are SOME advantages, in that it's easy to change the default caching scope, and in practice I also used a map that auto-expires its entries over time, but for normal use the first idiom is fine. But hey, it sure provided a whole torrent of Java concepts, so that's worth something.

Coaxing Enums Into XPage Combo Boxes

Wed May 28 19:07:12 EDT 2014

Tags: java xpages

I've been reworking Forms 'n' Views over the last couple days to use it as a development companion for the OpenNTF Design API, and one of the minor hurdles I ran across was presenting a getter/setter pair that expects a Java Enum to an XPages control (a combo box in this case, but it would be the same with other similar controls). I had assumed at first that it would simply not work - that the runtime would entirely balk at the conversion. However, it turned out that XPages does half the job for you: for displaying existing values, it will properly coax between the string form and the Enum constant. However, it breaks when actually going to set the value.

This whetted my appetite enough that I decided to try to see if I could write a small converter to finish the job in a properly generic way, and it turns out I could. To cut to the chase, I wrote this class:

package converter;

import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.Converter;
import javax.faces.el.ValueBinding;

public class EnumBindingConverter implements Converter {

	@SuppressWarnings("unchecked")
	public Object getAsObject(final FacesContext facesContext, final UIComponent component, final String value) {
		ValueBinding binding = component.getValueBinding("value");
		Class<? extends Enum> enumType = binding.getType(facesContext);

		return Enum.valueOf(enumType, value);
	}

	public String getAsString(final FacesContext facesContext, final UIComponent component, final Object value) {
		return String.valueOf(value);
	}

}

The getAsString method is your standard "I don't care about this" near-passthrough, but the getAsObject portion does the work necessary. Because I'm binding to a Java bean with getters/setters expecting the appropriate Enum type (this would presumably also work with a DataObject with a thoroughly-implemented getType method), the ValueBinding class is able to look up and provide back the expected Enum class. I then use the standard Enum.valueOf method to let Java do the work of actually finding the appropriate Enum constant.

Using this, I'm able to write a control like this:

<xp:comboBox value="#{columnNode.sortOrder}">
	<xp:this.converter><xp:converter converterId="enumBindingConverter"/></xp:this.converter>
	<xp:selectItem itemLabel="None" itemValue="${javascript:org.openntf.domino.design.DesignColumn.SortOrder.NONE}" />
	<xp:selectItem itemLabel="Ascending" itemValue="${javascript:org.openntf.domino.design.DesignColumn.SortOrder.ASCENDING}"/>
	<xp:selectItem itemLabel="Descending" itemValue="${javascript:org.openntf.domino.design.DesignColumn.SortOrder.DESCENDING}"/>
</xp:comboBox>

And it works! That allows you to bind to a getter/setter pair that expects an Enum without having to write in a shim to convert it to a String value beyond the generic converter.

You probably noticed the same thing I did, though: that's ugly as sin. Not only that, but it's a lot of redundant code when what you usually want to do is provide a selection of all available constants for the given Enum. To save myself a bit of future hassle and prettify my code a bit (at the expense of UI, but who cares about that? (I kid)), I wrote this DataObject bean to accept an Enum class name and return a list suitable for use in a control:

package bean;

import java.io.Serializable;
import java.util.*;

import javax.faces.context.FacesContext;
import javax.faces.model.SelectItem;

import org.openntf.domino.utils.Strings;

import com.ibm.xsp.model.DataObject;

import frostillicus.bean.*;

@ManagedBean(name="enumItems")
@ApplicationScoped
public class EnumSelectItems implements Serializable, DataObject {
	private static final long serialVersionUID = 1L;

	public Class<?> getType(final Object key) {
		return List.class;
	}

	@SuppressWarnings("unchecked")
	public List<SelectItem> getValue(final Object key) {
		if(key == null) {
			throw new NullPointerException("key cannot be null.");
		}

		String className = String.valueOf(key);

		try {
			Class<? extends Enum> enumClass = (Class<? extends Enum>)FacesContext.getCurrentInstance().getContextClassLoader().loadClass(className);

			Enum[] vals = enumClass.getEnumConstants();
			List<SelectItem> result = new ArrayList<SelectItem>(vals.length);
			for(Enum val : vals) {
				SelectItem item = new SelectItem();
				item.setLabel(Strings.toProperCase(val.name()));
				item.setValue(val);
				result.add(item);
			}

			return result;
		} catch(Throwable t) {
			throw new RuntimeException(t);
		}
	}

	public boolean isReadOnly(final Object key) {
		return true;
	}

	public void setValue(final Object key, final Object value) { }
}

It uses the @ManagedBean implementation that I cobbled together a while back, but you could just as easily use the normal faces-config route. The way you use this is to pass in the class name of the Enum you want in a selectItems component:

<xp:comboBox value="#{columnNode.sortOrder}" onchange="fnv_mark_dirty('#{id:viewPane}')">
	<xp:this.converter><xp:converter converterId="enumBindingConverter"/></xp:this.converter>
	<xp:selectItems value="${enumItems['org.openntf.domino.design.DesignColumn$SortOrder']}"/>
</xp:comboBox>

Though it only saves a few lines in this case, it'd pay off more with larger Enums. It has the downside of being pretty naive about its conversions: it just does a basic first-character proper-casing of the Enum constant name, but there's no reason you couldn't add better human-friendly-ifying to it.
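For instance, here's a quick sketch of one possible tweak (purely illustrative - this method isn't part of the bean above): split the constant name on underscores and proper-case each word, so that SORT_ASCENDING becomes "Sort Ascending":

// Hypothetical helper: a slightly friendlier label than plain proper-casing.
// Splits on underscores and capitalizes each word, e.g. SORT_ASCENDING -> "Sort Ascending"
private String toFriendlyLabel(final Enum<?> val) {
	StringBuilder result = new StringBuilder();
	for(String word : val.name().split("_")) {
		if(word.isEmpty()) { continue; }
		if(result.length() > 0) { result.append(' '); }
		result.append(word.substring(0, 1).toUpperCase());
		result.append(word.substring(1).toLowerCase());
	}
	return result.toString();
}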

I'm not sure if I'll be able to satisfy all of my Java preferences when it comes to multi-value Enums, though. Java's generic type erasure means that runtime code wouldn't have a way to inspect the getters/setters to figure out the appropriate Enum type. Perhaps I can inspect the select items attached to the control to find the appropriate class.

Expanding your use of EL: Mea Culpa

Fri May 02 21:27:54 EDT 2014

A while back, I wrote a couple posts about using Expression Language (EL) to clean up your XPages code, and in the first of them I gave what I labeled an exhaustive list of the interfaces that XPages EL provides special support for (as opposed to just looking for getters/setters). It turns out I was sorely mistaken: there are at least two other types that EL specially supports.

The first is the class used for JavaScript objects, such as allowing #{foo.bar} to bind like you'd expect to a value set to the JavaScript literal { bar: 'baz' }. I'm not sure where the specific support comes in - presumably based on one of the classes in the hierarchy and not the IValue interface it implements - but it's not terribly important.

The second is an interface that I've used frequently, but didn't realize had this property: ViewRowData. ViewRowData is essentially a specialized variant of DataObject intended for view entries - so much so, in fact, that I just assumed that view entries supported DataObject simultaneously, as my classes that implement it do.

I don't expect I'll have a lot of situations where I'll want to implement ViewRowData but not DataObject, but it's good to know one additional tidbit about the inner workings of XPages. One day, maybe I'll try to find a truly exhaustive list of what gets special treatment.

How I Added File-Download Support To My Model Framework

Sat Apr 12 13:33:24 EDT 2014

Tags: xpages java

Unlike the last few posts, this one isn't as much a tip or tutorial as it is an attempt to share my suffering by detailing the steps I took recently to add file-download and -upload support to my current model framework.

The goal of my framework overall is to retain the benefits of normal Domino document data sources (arbitrary field access, ease of use in simple cases, and compatibility with standard XPage idioms) while adding on useful new abilities (getters/setters, relationships with other objects, abstraction between the XPage and the actual location of the data). Much of this work is easy - by implementing DataObject, my objects worked well with EL bindings - and the bulk of the effort was focused on hooking up Domino Document objects in a similar way to IBM's DominoDocument class.

However, some of the more esoteric controls are not so easy, and the file controls were chief among them. As I tinkered, I found that you can't directly point an xp:fileDownload control to a normal data source and get it to work well. You can get it to list a single file, but I wasn't able to get it to recognize more than one through traditional means.

In my tinkering, I discovered that there are two main paths for the control: the traditional one and a more-complicated one, which makes a lot of assumptions about the nature and availability of the referenced data source. Importantly, unlike normal bindings (which take your expression, say #{doc.body}, have the EL processor resolve it, and deal with the final resolved entity), the control actually treats the EL expression as a String, splits it in two on the "." character, and looks up the variable on the left side plus ".DATASOURCE". This "doc.DATASOURCE" idiom matches the behavior of data-source controls on an XPage, which make themselves or an associated object available by that name. If the control finds an object by that name and if it implements DataSource (a common interface used by these controls), it assumes that the source contains a DataContainer object that also implements DocumentDataContainer specifically.
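To illustrate (this is just a sketch of the behavior described above, not the control's actual code - ExtLibUtil here is the stock com.ibm.xsp.extlib.util.ExtLibUtil), for a binding written as value="#{doc.body}" the lookup amounts to something like:

String expression = "doc.body";	// the raw, unresolved EL string
String variableName = expression.substring(0, expression.indexOf('.'));	// "doc"
// the control looks up "doc.DATASOURCE" - the name a data-source control publishes itself
// under - rather than resolving #{doc.body} the normal way
Object source = ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), variableName + ".DATASOURCE");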

Phew, okay, so far so good. It makes some inconvenient sense that the control would expect its referenced value to conform to the norms of the built-in data sources, so I wrote controls to do just that (previously, I got by using normal EL references from managed beans). However, I still ended up getting ClassCastExceptions at runtime: it turned out that the control also uses a nebulous factory object to attempt to convert the received value into a data type it expects, specifically DominoDocument. Unfortunately, unlike the interfaces I implemented already, subclassing DominoDocument was not something I was willing to do. So I was back to digging.

What I found was that this attempted conversion was happening due to an object implementing com.ibm.xsp.model.DataModelFactory, which has the job of taking an intermediate value used in the control (a com.ibm.xsp.model.FileDownloadValue, which contains a single-entry HashMap containing the object and requested field name) and returning the attachment data as an instance of DataModel. In addition to writing the code to do this, there's another wrinkle: the way the factory is found. The way I found to access this is thoroughly awkward: if I use the deprecated method getFactoryLookup() on the application instance, I can then retrieve the default factory by a key ("com.ibm.xsp.DOMINO_DATAMODEL_FACTORY"), wrap it in a new object that knows about my model objects, and replace it in the lookup.

Once I did all that, I was off to the races! Sort of! There remained another problem: the "delete attachment" icon on the control. I discovered that the way this icon works is that it composes an action that tracks down an adapter that knows how to communicate with the associated model object and sends it a deleteAttachments(...) call. As before, the way it "tracks down" the adapter is to look up another factory ("com.ibm.xsp.DESIGNER_DOMINO_ADAPTER_FACTORY") and ask it to provide an appropriate adapter. So again I wrote my own variant of this to intercept calls dealing with model objects and carefully placed it, Indiana-Jones-style, in place of the default factory.

The net result is that it works! I can point a file-download control at a model object (as long as that object was specifically instantiated in a data source control and is referenced directly in "object.field" format) and have it list all of the attachments in the field, along with a functioning delete action if desired. The file-upload control was, fortunately, much easier: when I realized I could swap out most of the guts of my model object with a back-end DominoDocument (now that the OpenNTF API takes care of data conversions for me), I was able to just proxy the upload calls to it and let it do the heavy lifting.

Now, there's only one nagging problem remaining: inline image uploads in rich text controls. But still, even if I never get that part working, the file manipulation will be enough. And, importantly, I've paved the way for me to be able to do use the same controls with model objects that aren't associated with Domino at all, should I desire to do so down the line.

The Collections Framework as Education

Sun Apr 06 19:15:59 EDT 2014

While I've been formulating the followup to my last post, I've also been thinking about how wonderful the Java Collections framework is. I've been singing its praises for a long time (as have others), but what's stood out to me lately is how well it encapsulates many of the most important foundational concepts of Java.

One of the reasons Java appears so intimidating and foreign to Notes developers is that the way many are introduced to it - as an auxiliary for XPages - involves learning a torrent of unrelated concepts simultaneously: strict variable typing, object equality and comparison, class casting, managed beans, recycling (though this is just as important in SSJS), confusing recycling with memory management, static methods, inheritance, interfaces, exceptions, abstract classes, database access, serialization, concurrency, and so forth. If you focus on the Collections framework specifically, though, you can learn many of the important basics in a clear way. And as a bonus, these are critical concepts shared with other, better OO languages.

Interface vs. Implementation

One of my favorite lecture topics is the value of separating interface and implementation. Unfortunately, most of the time, the value is unclear - if you're only going to make one class, is it valuable to have an interface? - and it seems like pedantry. The Collection classes, however, bring this value into sharp focus. The core interfaces correspond to elemental data structures of Computer Science and tell you everything you need to know to use them, e.g.

  • List: a collection of objects in the order stored by the user, accessed by index.
  • Map: a collection of objects accessed by a key of any type specified by the user.
  • Set: a collection of unique objects in either arbitrary or, for SortedSets, automatically-sorted order.

These interfaces describe everything you need to know about the object and do so so well that there's rarely any need to specify the actual class at all outside of creation. The differences between the actual classes are, quite literally, implementation details: some focus on single-thread performance (like ArrayList), some serve specialized cases (like EnumMap or IdentityHashMap), some implement multiple useful interfaces (like LinkedList), and some are geared towards multi-threaded use (like the java.util.concurrent objects).
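To make that distinction concrete, here's a trivial sketch - code written against the List interface neither knows nor cares which implementation it receives:

// Declare against the interface; the implementation becomes a one-line swap
List<String> names = new ArrayList<String>();
// List<String> names = new LinkedList<String>();	// the same calling code would work unchanged
names.add("Anne");
names.add("Bob");
for(String name : names) {
	System.out.println(name);	// nothing here depends on the concrete class
}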

Job-Based Polymorphism

Not only does the Collection framework provide an example of a cleanly-designed interface/implementation separation, but it also provides examples of why this pays off. One of the handiest tricks is the way most of the objects (other than Map) participate in mutual-translation schemes. By this I mean that you can take one Collection and pass it to another (either in the constructor or via addAll) and build a second collection of the same objects, but obeying the rules of the new collection.

For example, say you have a List of usernames that you retrieved from a call to view.getColumnValues(...) and now you want to sort and unique-ify them (basically like @Sort(@Unique(...))). Well, there's a collection type for that: the SortedSet, of which TreeSet is the common implementation. Rather than manually building a unique, sorted List (even using helper methods), you can accomplish this with TreeSet's constructor. You can then add in names from other sources and maintain the same unique-and-sorted guarantee without any additional code:

SortedSet<String> names = new TreeSet<String>(view.getColumnValues(1));
names.addAll(doc.getItemValue("SomeNamesField"));
names.addAll(someNameMap.values());

Because those methods only care that the incoming values are in a Collection, the fact that some may be in Lists, in Sets, or in some other entirely-unknown type that implements Collection is completely irrelevant. As it should be.*

Reusability

This benefit is so inherent to the framework that I almost forgot to include it. Largely by the nature of the task being performed but also by virtue of proper coding, the collection objects are so reusable that they feel like part of the language, even though they're just objects like any you write. They're built to be generic and used in ways far outside of the original ken of their creators. Both the implementations and interfaces can be spread far and wide, such as with my database model collection class that also acts as a List.

Takeaway

Not all of these benefits are going to be useful in all of your code, and it will be very infrequent that you have a reason to write your own framework of this type. By learning the contours of this framework, though, you can advance your knowledge of Java and object-oriented programming generally by leaps and bounds. It will help you write cleaner code and identify similar concepts used elsewhere.

* For example, did you know that viewScope is not a HashMap but actually a class called "javax.faces.component.UIViewRoot$ViewMap"? That's how much the implementation doesn't matter.

A Quick, Dirty, and Inefficient @ManagedBean Implementation in XPages

Tue Dec 17 09:24:31 EST 2013

Tags: xpages java

By now, most of us are pretty familiar with the process for adding managed beans to an XPages app: go to faces-config.xml and add a <managed-bean>...</managed-bean> block for each bean. However, JSF 2 added another method for declaring managed beans: inline annotations in the Java class. This wasn't one of the features backported to the XPages runtime, but it turns out it's not bad to add something along these lines, and I decided to give it a shot while working on the OpenNTF API.

The first thing that's required is a version of the annotations that ship with JSF 2. Fortunately, annotations in Java, though ugly to define, are very simple, and each one involves only a few lines of code and can be added to your NSF directly:

javax.faces.bean.ManagedBean

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface ManagedBean {
	String name() default "";
}

javax.faces.bean.ApplicationScoped

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface ApplicationScoped { }

javax.faces.bean.SessionScoped

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface SessionScoped { }

javax.faces.bean.ViewScoped

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface ViewScoped { }

javax.faces.bean.RequestScoped

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface RequestScoped { }

javax.faces.bean.NoneScoped

package javax.faces.bean;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(value=RetentionPolicy.RUNTIME)
public @interface NoneScoped { }

The more-complicated code comes into play when it's time to actually make use of these annotations. There are a couple ways this could be handled, and the route I took was via a variable resolver. Note that this code requires my current branch of the OpenNTF API, though a mild variant would work with the latest M release:

resolver.AnnotatedBeanResolver

package resolver;

import java.io.Serializable;
import java.util.*;
import javax.faces.context.FacesContext;
import javax.faces.el.EvaluationException;
import javax.faces.el.VariableResolver;
import javax.faces.bean.*;

import org.openntf.domino.*;
import org.openntf.domino.design.*;

import com.ibm.commons.util.StringUtil;

public class AnnotatedBeanResolver extends VariableResolver {

	private final VariableResolver delegate_;

	public AnnotatedBeanResolver(final VariableResolver resolver) {
		delegate_ = resolver;
	}

	@SuppressWarnings("unchecked")
	@Override
	public Object resolveVariable(final FacesContext facesContext, final String name) throws EvaluationException {
		Object existing = delegate_.resolveVariable(facesContext, name);
		if(existing != null) {
			return existing;
		}

		try {
			// If the main resolver couldn't find it, check our annotated managed beans
			Map<String, Object> applicationScope = (Map<String, Object>)delegate_.resolveVariable(facesContext, "applicationScope");
			if(!applicationScope.containsKey("$$annotatedManagedBeanMap")) {
				Map<String, BeanInfo> beanMap = new HashMap<String, BeanInfo>();

				Database database = (Database)delegate_.resolveVariable(facesContext, "database");
				DatabaseDesign design = database.getDesign();
				for(String className : design.getJavaResourceClassNames()) {
					Class<?> loadedClass = Class.forName(className);
					ManagedBean beanAnnotation = loadedClass.getAnnotation(ManagedBean.class);
					if(beanAnnotation != null) {
						BeanInfo info = new BeanInfo();
						info.className = loadedClass.getCanonicalName();
						if(loadedClass.isAnnotationPresent(ApplicationScoped.class)) {
							info.scope = "application";
						} else if(loadedClass.isAnnotationPresent(SessionScoped.class)) {
							info.scope = "session";
						} else if(loadedClass.isAnnotationPresent(ViewScoped.class)) {
							info.scope = "view";
						} else if(loadedClass.isAnnotationPresent(RequestScoped.class)) {
							info.scope = "request";
						} else {
							info.scope = "none";
						}

						if(!StringUtil.isEmpty(beanAnnotation.name())) {
							beanMap.put(beanAnnotation.name(), info);
						} else {
							beanMap.put(loadedClass.getSimpleName(), info);
						}
					}
				}

				applicationScope.put("$$annotatedManagedBeanMap", beanMap);
			}

			// Now that we know we have a built map, look for the requested name
			Map<String, BeanInfo> beanMap = (Map<String, BeanInfo>)applicationScope.get("$$annotatedManagedBeanMap");
			if(beanMap.containsKey(name)) {
				BeanInfo info = beanMap.get(name);
				Class<?> loadedClass = Class.forName(info.className);
				// Check its scope
				if("none".equals(info.scope)) {
					return loadedClass.newInstance();
				} else {
					Map<String, Object> scope = (Map<String, Object>)delegate_.resolveVariable(facesContext, info.scope + "Scope");
					if(!scope.containsKey(name)) {
						scope.put(name, loadedClass.newInstance());
					}
					return scope.get(name);
				}
			}
		} catch(Exception e) {
			e.printStackTrace();
			throw new RuntimeException(e);
		}

		return null;
	}

	private static class BeanInfo implements Serializable {
		private static final long serialVersionUID = 1L;

		String className;
		String scope;
	}
}

faces-config.xml

<faces-config>
	...
	<application>
		<variable-resolver>resolver.AnnotatedBeanResolver</variable-resolver>
	</application>
</faces-config>

This is not efficient! What happens is that, whenever a variable is requested that the main resolver can't find (or is null), the code goes through each Java class defined in the NSF (not in plugins, the filesystem, or in attached Jars) and checks if it contains the @ManagedBean definition. Fortunately, the worst of this is only incurred the first time (according to the duration of applicationScope), but that one-time process is still something that would get linearly slower as the number of Java resources in your app increases.

Now, there are still other features of the actual JSF 2 implementation that this doesn't cover, particularly @ManagedProperty, but it can still be useful on its own. In practice, it lets you ditch the faces-config declaration and write your beans like this:

package bean;

import javax.faces.bean.*;
import java.io.Serializable;

@ManagedBean(name="someAnnotatedBean")
@ApplicationScoped
public class AnnotatedBean implements Serializable {
	private static final long serialVersionUID = 1L;

	// ...
}

I plan to give this a shot in the next thing I write to see how it works out. And regardless of future use, it's a nice example of the kind of thing that gets easier with the OpenNTF API.

Building My App On Custom Renderers

Sun Nov 24 20:41:15 EST 2013

Tags: xpages java
  1. I Am Terribly Excited About Custom Renderers
  2. Building My App On Custom Renderers

A little while ago, I mentioned how I have become smitten with custom renderers - Java classes that take XPage components like application layouts and widgets and render them in custom HTML. Though I've had a lot of other things to work on in between then and today, my ardor for the subject has not dimmed. So today, I mostly completed the process for my task-tracking app.

At a high level, the app is a fairly typical web app built using the popular-for-good-reason Ace - Responsive Admin Template theme from WrapBootstrap. I'm a huge fan of using these pre-built themes - in addition to handling the visual design, they tend to come packaged with a set of specialized widgets and jQuery plugins that are made to work well as a cohesive whole. However, the downside is that you have to write your code specifically targeting the framework, even more so than stock Bootstrap (and for this reason, I couldn't directly use Bootstrap4XPages). This makes it a perfect candidate for renderer-ification, as most of the components I'm using - the layout, widget boxes, view panels/data tables, form layouts - have direct analogs in either the stock XPage runtime or in the OneUI-based components in the Extension Library.

So what I have done is to take the basic structure of the Bootstrap4XPages project (and more than a smidge of its code) and used it to build custom renderers for the components I need, packaged into an OSGi plugin:

Project Explorer

It's something of a mess of code, and parts of it have been a pain to deal with, but for the most part it's been a process of looking at the original renderer source in the ExtLib and Bootstrap4XPages and then either copying and tweaking, subclassing and overriding, or implementing from scratch. I've run into some bothersome limitations in the stock objects as well, particularly when it comes to areas where I need to apply CSS classes to elements not represented in the components, like the FontAwesome-based icon classes for the nav bar or specific classes for widget title bars and the body area. Since these components also lack convenient "attr" properties, I've resorted to ugly hacks: I've co-opted the "imageAlt" property of the tree nodes to trigger creation of a Bootstrap-style "i" tag with a class for the navigator, and for the widgets I just made the styleClass apply only to the header and added an ugly exception to look for "no-padding" classes and apply them to the body instead. It's not a pretty process sometimes.

But the results are pretty interesting. By using standard components, I have an app that can swap readily between very different themes (with some rough edges for areas where code is still Ace-specific):

Ace

OneUI v2.1

OneUI v3.0.2

All of this does, of course, raise an important question. Namely: why bother? The original way I implemented it - by writing the Bootstrap layout HTML by hand, tweaking individual classes here and there, and forgoing the OneUI-based components - worked just fine, and it took less work. And really, I'm not terribly likely to suddenly decide that I want to switch this app from Ace to OneUI or stock Bootstrap. There are a couple important benefits to this approach that apply both to me personally and to others who may consider doing the same thing:

  • Practice. This seems odd to put first, but it's vitally important. Writing these renderers has improved my knowledge of the platform a great deal. It's given me a reason to delve further into the Extension Library source, to improve my OSGi-plugin-writing knowledge, and to familiarize myself with an easily-overlooked part of the JSF stack. Like many things with the plumbing of XPages, it's more of a PITA than it absolutely needs to be, but, if you do it, you'll come out a much better developer.
  • Focus. Now that I have almost all of the code that's specific to Bootstrap or Ace out of my XPage app, the remaining code has a sharpened focus on the task at hand. Now I use standard widget, pager, and view controls just like in any run-of-the-mill app and hook them up the same way, without having to worry about how it's going to look. Time spent on the problem at hand is time spent well.
  • Maintainability. During my transition from hardcoded HTML in the XSP source to custom renderers, I also jumped from Ace 1.0 to 1.2, which included an upgrade to Bootstrap 3. For this task, that meant I had to learn about a number of the differences between Bootstrap 2 and 3, as well as tweaking lots of little CSS classes and HTML structures across every page of the app. Next time, though, when Ace version 2 running with Bootstrap 4 comes out, I only need to change the theme, entirely outside of the app, and everything else will come along for the ride (mostly).
  • Less Code in the NSF. This is a corollary to the last bullet point, but is important as its own concept. By moving the theme and supporting files (like the CSS and JS assets themselves) out to a plugin, there's less cruft to carry around in the NSF itself and, more importantly, less stuff to keep up-to-date when used in common across multiple apps. This is not unique to this technique (you can move the resource files out without doing renderers, and you can bundle common custom controls that way too), but it's a valuable side effect.
  • Interoperability. This doesn't matter so much to me currently, since I'm the only one working on this app, but it will matter later, and it would matter immediately to any larger company. If each of your apps, regardless of the final appearance, is built using the standard XPage and ExtLib controls, that means you don't need to familiarize every developer working on it with the CSS framework of your choice, whether it be Bootstrap, your company's home-grown HTML, or something else entirely. You can have one person writing the themes and have the rest of the team go about building their apps by the book (or, rather, books).
  • Flexibility. As demonstrated by my screenshots above, this technique brings tremendous flexibility. In this specific case, I knew what theme I wanted to use when I started writing the app, but in the future I won't need to. Whenever I'm struck by inspiration for a new app, I can start writing code immediately (dressing it up in Ace or OneUI) and then decide on and buy another theme later without the hassle of going back into my functional XPages to replace and tweak a bunch of CSS classes.

It's a pity that there's such overhead in the initial work of writing custom renderers, but it will get gradually easier over time, now that Bootstrap4XPages is available to reference. And still, hassle and all, I believe this to be the best way forward for organized XPage development, and represents one of the platform's distinct advantages over others (well, the others that aren't JSF). I strongly encourage everyone to look at least a little into the idea. And I also, as always, encourage you to contact me at will either to pick my brain or commission my company to help renderer-ify or otherwise improve your apps.

I Am Terribly Excited About Custom Renderers

Tue Nov 12 19:56:45 EST 2013

Tags: xpages java
  1. I Am Terribly Excited About Custom Renderers
  2. Building My App On Custom Renderers

It's much to my chagrin that it took me until this past weekend to properly dive into custom renderers. I had seen them before, primarily when working with mypic, but always avoided building my own, mostly because of what a hassle they are to write. However, I'm now convinced that the hassle is worth it, at least for some primary uses.

If you're not familiar with custom renderers, they're sort of an odd beast, but they're the kind of thing where the concept eventually "clicks" in your mind at some point. In XPages, a component (such as a link, a panel, or an application layout) consists of two main aspects: the structural code that defines the properties of the component and the renderer that handles actually outputting the HTML for the browser. What adding your own custom renderers allows is for you to continue to use the same XSP markup to define the control, yet have very different results.
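If a concrete example helps, here's a bare-bones sketch of what a renderer looks like (the names are invented, and it's far simpler than anything in Bootstrap4XPages or the ExtLib) - it's just a class that writes the HTML for a component:

import java.io.IOException;

import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.context.ResponseWriter;
import javax.faces.render.Renderer;

// Sketch only: emit a Bootstrap-style panel wrapper around a component's children.
// Real renderers are considerably more involved.
public class ExamplePanelRenderer extends Renderer {
	@Override
	public void encodeBegin(final FacesContext context, final UIComponent component) throws IOException {
		ResponseWriter writer = context.getResponseWriter();
		writer.startElement("div", component);
		writer.writeAttribute("class", "panel panel-default", null);
	}

	@Override
	public void encodeEnd(final FacesContext context, final UIComponent component) throws IOException {
		context.getResponseWriter().endElement("div");
	}
}

The other half of the job - registering the renderer for a component family via faces-config and a theme - is where most of the real effort lives.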

The best example of this (and the source of most of my education) is the recent Bootstrap4XPages project. It takes a standard OneUI applicationLayout-based app and styles it using Bootstrap markup instead of OneUI, even though the XSP markup is the same. In the normal case, this just means that you can reuse some of the basics when building a Bootstrap-targeted app. And that's good enough on its own: it saves you tons of work.

But the more important aspect is that it lets you focus much more on solving the task at hand and much less on writing to the layout framework.

This is a huge advantage for XPages (and any framework with this separation). As low-key as Bootstrap is (and other modern CSS frameworks are), you still have to structure your markup and sprinkle class names around to match its expectations. With a really fleshed-out set of custom renderers, though, you could potentially not care at all about the resultant UI framework when writing your app and instead focus on using stock or ExtLib controls to solve the problem directly, letting the theme+renderers do the work.

In reality, there'll be a bit of leaking (for example, using Bootstrap-friendly FontAwesome classes instead of images doesn't really fit with standard IBM controls), but I aim to minimize that as much as possible in my future projects. My goal is to use applicationLayout and standard controls for as much as I can and instead write a set of custom renderers for each primary theme I want to apply.

I'll have more to write about this later, but for now, the TL;DR version is: custom renderers have the potential to dramatically improve the way XPages apps are written, as long as you're willing to do the initial Java legwork.

Progress on the org.openntf.domino Design API

Fri Sep 06 18:57:39 EDT 2013

Tags: java openntf

The primary focus of my recent work with the OpenNTF Domino API has been the implementation of a fleshed-out design API. I've done a lot of work in my Domino-programming career manipulating design elements, usually via DXL (such as with my Forms 'n' Views app), and my aim is to codify all of that into the API itself, eventually growing it to cover almost every design element that Designer does (we'll see about Navigators, Composite Apps, and the like).

One of the big hurdles in recent years when manipulating design elements has been dealing with XPages-related elements - XPages themselves, Java design elements, Jars, faces-config.xml, etc. Though XPages by their nature have a very nice XML representation, that doesn't carry over to DXL - all of those elements are dumped out in the "raw" data format that DXL uses when it doesn't have a better idea of how to handle what you're trying to export. And more specifically, though the XSP markup and the compiled Java code are present in the result, attempts to actually manipulate that were stymied by a "checksum" block of bits at the start of the data - you couldn't just drop the Java bytecode into the doc and expect it to work.

Fortunately, it turns out that those bits aren't a weird "checksum" at all: the data is just stored as Composite Data, the internal format of rich text in Domino. Specifically, all of those elements share the same internal representation as File Resources and Style Sheets, being composed of a CDFILEHEADER and list of CDFILESEGMENTs. When DXL uses the "raw" format, it exports that data as an array of Base64-encoded bytes directly. After discovering this, I wrote the start of a CD implementation in the API and got it to work: the API is now able to read/write the attached data of any one of those file types. Now, there are a few steps that would have to be implemented between this and "write a new XPage with a few lines of code", but this is a step in that direction (and you could theoretically modify your faces-config.xml, xsp.properties, etc. on the fly now, though I haven't tested much).

More immediately, it's also a huge step in the direction of interesting read-focused uses. One of these uses in particular has caught my fancy: loading and using XSP-related Java classes in other contexts. To go along with my work, I wrote a DatabaseClassLoader that gives you access to all of those class types from anywhere using the API (including Jar design elements). The possibilities for this have made me giddy: re-using data-model objects in external scheduled tasks, sharing classes with Java agents (...with the caveat that you have to use reflection or a scripting language), "loading" other XPage apps from inside another, treating NSFs as replicating, access-controlled Jars, and so on. I have my own notions for some practical uses I may want to explore, and I'm very interested in seeing what kind of things this lets others do as well.

Expanding your use of EL (Part 2)

Wed Jul 24 12:37:12 EDT 2013

Tags: el java
  1. Expanding your use of EL (Part 1)
  2. Expanding your use of EL (Part 2)

Last time, I said that my next post would delve into operations and comparisons in EL, but I figured first it would be good to have a real-use example of what I talked about last time.

In several of my apps lately, I've taken to using properties files to store app configuration, primarily the server and file paths for the databases that store the actual data. The standard way of doing this is to use the platform's built-in bundle-handling abilities. However, that'd leave me stuck with either putting the bundle reference in the XSP markup somewhere or in a theme - the former would muddy things up and the latter is not available at page-load time. So instead, I wrote a quick class to use as an application-scoped bean, making it available everywhere via EL and Java:

package config;

import java.io.IOException;
import java.io.Serializable;
import java.util.Locale;
import java.util.ResourceBundle;
import javax.faces.context.FacesContext;
import com.ibm.xsp.application.ApplicationEx;
import com.ibm.xsp.model.DataObject;

public class AppConfig implements Serializable, DataObject {
	private static final long serialVersionUID = -5125445506949601097L;

	private transient ResourceBundle config;

	public Class<?> getType(final Object keyObject) {
		return String.class;
	}

	public Object getValue(final Object keyObject) {
		if(!(keyObject instanceof String)) { throw new IllegalArgumentException(); }
		return getConfig().getString((String)keyObject);
	}

	public boolean isReadOnly(final Object keyObject) { return true; }

	public void setValue(final Object keyObject, final Object value) { }

	private ResourceBundle getConfig() {
		if (config == null) {
			try {
				ApplicationEx app = (ApplicationEx) FacesContext.getCurrentInstance().getApplication();
				config = app.getResourceBundle("config", Locale.getDefault());
			} catch (IOException ioe) {
				ioe.printStackTrace();
			}
		}
		return config;
	}
}

This is an example of a simple read-only DataObject implementation. There are only four methods to implement: getType, which returns the class of the value for the key; getValue, which returns the actual value; isReadOnly, which determines whether a given key is read-only (always true here); and setValue, which would set the value for the key if I wasn't making it read-only.

It's pretty straightforward to take this idea and apply it to any of your Map-like objects to ease their use in EL without having to implement or stub-out the dozen or so methods demanded by the actual Map interface.
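For completeness, here's the Java side of using it (the bean name "appConfig" and the key "dataDbPath" are just stand-ins for this sketch):

// Sketch: resolve the application-scoped bean and read a key from the bundle.
// "appConfig" and "dataDbPath" are hypothetical names for this example.
AppConfig config = (AppConfig)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "appConfig");
String dataDbPath = (String)config.getValue("dataDbPath");

In EL, the same lookup is just #{appConfig.dataDbPath} or #{appConfig['dataDbPath']}.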

Expanding your use of EL (Part 1)

Tue Jul 23 13:07:43 EDT 2013

Tags: el java
  1. Expanding your use of EL (Part 1)
  2. Expanding your use of EL (Part 2)

One of the main aspects to cleaning up the code of an XPage is the heavy use of Expression Language (EL), which is XPages/JSF's shorthand language for binding properties to values and methods. The most common kind you'll run across is the kind you get for "free" when working with document data sources:

<xp:inputText value="#{doc.FirstName}"/>

This is the essence of EL in a nutshell: the actual code behind the scenes can be pretty complex, but EL handles that for you. It goes far beyond this, though, in two important ways: accessing custom and non-Domino data sources and writing complex operations.

Accessing Custom Data Sources

In addition to accessing "built-in" data types like Domino documents and the scope variables, EL is generic enough to deal with several kinds of data access (technically, it ONLY does these, and the standard bindings are examples of it):

  • Lists (and probably arrays). EL can use bracketed-subscript notation to reference individual elements of a Java list/array, e.g. #{names[3]}. I've found this to be the rarest use, but it could be useful if repeating over two parallel lists.
  • Maps. This is how EL accesses the scope variables (requestScope, viewScope, etc.). It translates code like #{sessionScope.cart} to sessionScope.get("cart") and sessionScope.put("cart", ...).
  • DataObjects. This is basically a "Map-lite" interface from IBM that is meant for making generic key/value objects easily-accessible via EL - this is what Domino document objects implement. It translates code like #{doc.FirstName} to doc.getValue("FirstName") and doc.setValue("FirstName", ...).
  • Generic bean-style methods. If (and only if) your object doesn't implement any of the above interfaces, EL falls back to seeking out bean-style getters and setters. For example, if you have a class that implements no special interfaces, EL will attempt to match #{myObject.foo} to myObject.getFoo() (or myObject.isFoo()) and myObject.setFoo(...).

The first important thing to remember is that EL applies these rules to your own and third-party objects just as much as it does to IBM's stock objects. If your code is just #{doc.foo}, you can switch "doc" from a Domino document to a HashMap, a custom DataObject class, a bean with getFoo, or an entirely-different third-party data source that implements one of those and it will continue to work smoothly. If you instead wrote it as #{javascript:doc.getItemValueString('foo')}, your experience would be... a bit rougher.
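As a quick illustration of that last type (the class here is made up for the example), a plain bean needs no special interface at all for #{myObject.foo} to work:

// A plain bean: EL's fallback maps #{myObject.foo} to getFoo()/setFoo(...)
// with no special interface required (class and property names are invented)
public class MyObject {
	private String foo;

	public String getFoo() { return this.foo; }
	public void setFoo(final String foo) { this.foo = foo; }
}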

The second important thing to remember is EL's equivalence between dot and bracket notations. What I mean by this is that these three are all equivalent (assuming "barHolder" is a variable on the page containing the string "bar"):

  • #{foo.bar}
  • #{foo['bar']} or #{foo["bar"]}
  • #{foo[barHolder]}

Note that those are equivalent for every supported object type, including generic beans (okay, but not Lists, because #{foo.1} is illegal). This equivalence allows you to use EL to reference keys that would not be legal in dot notation (e.g. #{someProperties['us.iksg.setting']}) and to use other EL expressions to determine the field name. Say you're keeping some user-specific info in the application scope; you could end up with a chain like this to access it via EL:

<xp:text value="#{applicationScope.userCache[context.user.name].ticketCount}"/>

The more you understand the different ways of accessing your objects via EL, the more you can remove SSJS from your XPage markup and make your apps cleaner, (probably) faster, and more portable.

In my next post, I will cover the second major topic: complex operations and comparisons in EL.

The "final" Keyword in Java

Sun Jun 16 14:45:24 EDT 2013

Yesterday, I suggested that everyone enable finalizing parameters in their code cleanups and, better, to do the same in their Java save actions; later, I set Eclipse loose on the org.openntf.domino API to implement it there. Adding final keywords to method parameters (and elsewhere where possible) is one of the habits I picked up from reading Effective Java a bit ago, but it's kind of a subtle thing and the benefits aren't immediately obvious.

If you're not at all familiar with the final keyword, it has a couple meanings in Java - in the context of variable and parameter declarations, it means that you can't reassign the value of the variable after it's been assigned the first time. So, for example, this is illegal and will generate a compile error:

final int i = 1;
i = 2;

In that example, it seems like a "why the heck would I do that?" sort of thing, but it's not too much of a stretch to picture a block of code involving something like this:

void doSomething(Document doc) {
	System.out.println("doc is " + doc.getUniversalID());
	
	// a bunch of code
	doc = database.getDocumentByID("FFFF0002");
	// a bunch more code
	
	System.out.println("doc is " + doc.getUniversalID());
}

That will WORK and it won't damage the original doc that was passed in, but it'll get confusing, particularly the more lines of code or eyeballs involved. Functioning or not, it's poor code style, and changing the method signature to doSomething(final Document doc) means that the compiler will enforce good style.

In addition to being good style, final has some technical benefits. First off, it might be faster. Secondly, if you ever use anonymous classes, final variables are a necessity for accessing information in the surrounding block. If you get into the habit of finalizing your variables, you'll already be ready by the time you have a need for anonymous classes.
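For example (a trivial sketch), a local has to be final before an anonymous class can read it:

// A final local can be referenced from inside an anonymous class; a non-final
// one would be a compile error in this version of Java
final String userName = session.getEffectiveUserName();
Runnable greeter = new Runnable() {
	public void run() {
		System.out.println("Hello, " + userName);
	}
};
greeter.run();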

final is a component of a larger and more important topic: immutability. I should probably write a lengthy post on immutability in the future, but for now there's an important thing to remember: final does not make your objects immutable. It only prevents assignment - it doesn't prevent calling methods on an object that change the object's value (in fact, Java lacks any language construct or cultural convention for immutability). So even if your Document variable is declared final, you (or any code you pass it to) can still call doc.replaceItemValue("Form", "Ha ha"); with impunity.
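To make that distinction concrete:

final List<String> names = new ArrayList<String>();
names.add("still mutable");	// legal: final doesn't stop you from changing the object's contents
// names = new ArrayList<String>();	// illegal: final only prevents reassigning the variable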

My Latest Programming-Technique Improvements

Tue Apr 23 11:34:58 EDT 2013

Tags: java

Over the past couple weeks, I've been trying out some new things to make my code a bit cleaner and I've been having a good time of it, so I figured I'd write up a quick summary.

Composed Method

First and foremost, I've been trying out the Composed Method of programming. This is not a new idea - indeed, it's a pretty direct application of general refactoring - but the slight perspective change from "factor out reused code" to starting out with clean, small blocks has been a pleasant switch. The gist of the idea is that, rather than writing a giant, LotusScript-agent-style block of code first and then breaking out individual components after the fact, you think in terms of small, discrete operations that you can name. The result is that your primary "control" methods end up very descriptive and easy to follow, consisting primarily of ordered phrases specifically saying what's going on. I encourage you to search around and familiarize yourself with the technique.
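As a tiny sketch of the style (the method names here are invented for the example), the top-level method ends up reading like an outline of the process:

// Composed Method: the "control" method is just a sequence of small, named steps
public void submitRequest() {
	validateRequest();
	saveRequestDocument();
	notifyApprovers();
}

private void validateRequest() { /* field-level checks */ }
private void saveRequestDocument() { /* create and populate the document */ }
private void notifyApprovers() { /* send the notification emails */ }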

import static

Java's import static statement is a method-level analog to the normal import statement: it lets you make static methods of classes available for easy calling without having to specify the surrounding class name. Most of the examples you'll see (like those on the linked docs) focus on the Math class, which makes sense, but it applies extremely well to XPages programming. Pretty much every app should have a JSFUtil class floating around, and most should also use the ExtLibUtil class for its convenience methods like getCurrentSession:

import static com.ibm.xsp.extlib.util.ExtLibUtil.getSessionScope;
import static com.ibm.xsp.extlib.util.ExtLibUtil.getCurrentSession;

// ...

getSessionScope().put("currentUser", getCurrentSession().getEffectiveUserName());

Night and day? Not really, but it's a little more convenient and easier to read.

Now, as the linked docs above indicate, import static comes with a warning to use it cautiously. Part of this is Java's cultural aversion to concision, but it's also a good idea to not go too crazy. When the methods you're importing are clear in intent and unlikely to overlap with local ones (like those in ExtLibUtil), it makes sense, particularly because Designer/Eclipse allows you to hover over the method or hit F3 and see where it's defined.

createDocument

Since creating a document in the current database with a known set of fields is such a common action, I've taken a page from one of our extended methods in org.openntf.domino and have started using a createDocument method that takes arbitrary pairs of keys and values and runs them through replaceItemValue, along with a helper method to allow for storing additional data types (since I'm still using the legacy API for these projects):

protected static Document createDocument(final Object... params) throws Exception {
	Document doc = ((Database)resolveVariable("database")).createDocument();

	for(int i = 0; i < params.length; i += 2) {
		if(params[i] != null) {
			String key = params[i].toString();
			if(!key.isEmpty()) {
				doc.replaceItemValue(key, toDominoFriendly(params[i+1]));
			}
		}
	}

	return doc;
}
protected static Object toDominoFriendly(final Object value) throws Exception {
	if(value == null) {
		return "";
	} else if(value instanceof List) {
		Vector<Object> result = new Vector<Object>(((List<?>)value).size());
		for(Object listObj : (List<?>)value) {
			result.add(toDominoFriendly(listObj));
		}
		return result;
	} else if(value instanceof Date) {
		return ((Session)resolveVariable("session")).createDateTime((Date)value);
	} else if(value instanceof Calendar) {
		return ((Session)resolveVariable("session")).createDateTime((Calendar)value);
	}
	return value;
}

In use, it looks like this:

Document request = createDocument(
		"Form", "Request",
		"Principal", getCurrentSession().getEffectiveUserName(),
		"StartDate", getNewRequestStartDate(),
		"EndDate", getNewRequestEndDate(),
		"Type", getNewRequestType()
);

 

The result of these changes is that I'm writing less code and the intent of every line is much more clear. It also happens to be more fun, which never hurts.

Fun With Old XML Features

Wed Apr 03 01:35:55 EDT 2013

Tags: java xml

One of the side effects of working on the OpenNTF Domino API is that I saw every method in the interfaces, including ones that were either new to me or that I had forgotten about a long time ago. One of these is the "parseXML" method found on Items, RichTextItems, and EmbeddedObjects. This was added back in 5.0.3, I assume for some reason related to the mail template, like everything else added back then. Basically, it takes either the contents of a text item, the text of a rich text item, or the contents of an attached XML document and converts it to an org.w3c.dom.Document object (the usual Java standard for dealing with XML docs).

That's actually kind of cool on its own in some edge cases (along with the accompanying transformXML method), but you can also combine it with ANOTHER little-utilized feature: XPath support in XPages. So say you have an XML document like this attached to a Notes doc (or stored in an Item):

<stuff>
	<thing>
		<foo>bar</foo>
	</thing>
	<thing>
		<foo>baz</foo>
	</thing>
</stuff>

Once you have that, you can write code like this*:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<xp:this.data>
		<xp:dominoDocument var="doc" formName="Doc" action="openDocument">
			<xp:this.postOpenDocument><![CDATA[#{javascript:
				requestScope.put("xmlDoc", doc.getDocument().getFirstItem("Attachment").getEmbeddedObjects()[0].parseXML(false))
			}]]></xp:this.postOpenDocument>
		</xp:dominoDocument>
	</xp:this.data>
	
	<xp:repeat value="#{xpath:xmlDoc:stuff/thing}" var="thing">
		<p><xp:text value="#{xpath:thing:foo}"/></p>
	</xp:repeat>

</xp:view>

I don't really have any immediate use to do this kind of thing, but sometimes it's just fun to explore what the platform lets you do. It's one more thing I'll keep in the back of my mind as a potential technique if the right situation arises.

* Never write code like this.

We Have The Technology: The org.openntf.domino API

Thu Mar 21 21:16:14 EDT 2013

As Nathan and Tim posted earlier today, a number of us have been working recently on a pretty exciting new project: the org.openntf.domino API. This is a drop-in replacement/extension for the existing lotus.domino Java API that improves its stability and feature set in numerous ways. The way I see it, there are a couple main areas of improvement:

Plain Old Bug Fixes

doc.hasItem(null) crashes a Domino server. That's no good! We've fixed that, and we're going through and fixing other (usually less severe) bugs in the existing API.

Modern Java

Java has progressed a long way since Domino 4.x, but the Java API hasn't progressed much. Our API is chock full of Iterators, proper documentation, generic type definitions, preference for interfaces over concrete classes, and other trappings of a clean, modern API. Additionally, Nathan put together some voodoo programming to take care of the recycle thing. You read that right: recycle = handled.
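As a small sketch of what that means in practice (based on the API as described here), iterating a collection of documents turns into a plain for-each loop with no recycling ceremony:

import org.openntf.domino.Database;
import org.openntf.domino.Document;

public class IterationExample {
	// The API's collections are Iterable, so there's no getNextDocument() loop
	// and no manual recycle() bookkeeping
	public static void printSubjects(final Database database) {
		for(Document doc : database.getAllDocuments()) {
			System.out.println(doc.getItemValueString("Subject"));
		}
	}
}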

New Features

In addition to cleaning up the existing functionality, we're adding features that will be useful every day. My favorite of those (since I'm implementing it myself) is baked-in MIMEBean-and-more support in get/replaceItemValue:

Set<String> stringSet = new HashSet<String>();
stringSet.add("foo");
stringSet.add("bar");
doc.replaceItemValue("Set", stringSet);

doc.replaceItemValue("Docs", database.getAllDocuments());

NoteCollection notes = database.createNoteCollection(true);
notes.buildCollection();
doc.replaceItemValue("Notes", notes);

doc.replaceItemValue("ExternalizableObject", new MimeType("text", "plain"));

List<String> notAVector = new ArrayList<String>();
notAVector.add("hey");
doc.replaceItemValue("TextList", notAVector);

doc.replaceItemValue("LongArray", new long[] { 1L, 2L, Integer.MAX_VALUE + 1L });

That's all legal now! We have more on the docket for either the initial release or future versions, too: TinkerPop Blueprints support, easy conversion of applicable entities to XML and JSON, fetching remote documents from DXL or JSON files, proper Logging support, and more.

 

As we've been developing the new API, we've already found it painful to go back to the old one - it doesn't take long to get spoiled. And fortunately, the transition process will be smooth: you can start by just replacing one entry point for your code (say, changing "JavaAgent" to "org.openntf.domino.JavaAgent" in an agent), then advance to swapping out your "import lotus.domino.*" line for an "import org.openntf.domino.*" line, and then start using the new methods. All the while, your existing code will continue to work as well as or better than before.

We hope to have a solid release available around the beginning of next month, and in the mean time I highly encourage you to check out the code. Download it, give it a shot, let us know how it works for you and if you have anything you'd like us to add.

The Bean-Backed Table Design Pattern

Tue Jan 22 17:54:14 EST 2013

First off, I don't like the name of this that I came up with, but it'll have to do.

One of the design problems that comes up all the time in Notes/Domino development is the "arbitrary table" idea. In classic Notes, you could solve this with an embedded view, generated HTML (if you didn't want it to be good), or a fixed-size table with a bunch of hide-whens. With XPages, everything is much more flexible, but there's still the question of the actual implementation.

The route I've been taking lately involves xp:dataTables, Lists of Maps, MIMEBean, and controller classes. In a recent game-related database, I wanted to store a list of components that go into a recipe. There can be an arbitrary number of them (well, technically, 4 in the game, but "arbitrary" to the code) and each has just two fields: an item ID and a count. The simplified data table code looks like this:

<xp:dataTable value="#{pageController.componentInfo}" var="component" indexVar="componentIndex">
	<xp:column styleClass="addRemove">
		<xp:this.facets>
			<xp:div xp:key="footer">
				<xp:button value="+" id="addComponent">
					<xp:eventHandler event="onclick" submit="true" refreshMode="partial" refreshId="componentInfo"
						action="#{pageController.addComponent}"/>
				</xp:button>
			</xp:div>
		</xp:this.facets>
						
		<xp:button value="-" id="removeComponent">
			<xp:eventHandler event="onclick" submit="true" refreshMode="partial" refreshId="componentInfo"
				action="#{pageController.removeComponent}"/>
		</xp:button>
	</xp:column>
					
	<xp:column styleClass="itemName">
		<xp:this.facets><xp:text xp:key="header" value="Item"/></xp:this.facets>
						
		<xp:inputText value="#{component.ItemID}"/>
	</xp:column>
					
	<xp:column styleClass="count">
		<xp:this.facets><xp:text xp:key="header" value="Count"/></xp:this.facets>
		<xp:inputText id="inputText1" value="#{component.Count}" defaultValue="1">
			<xp:this.converter><xp:convertNumber type="number" integerOnly="true"/></xp:this.converter>
		</xp:inputText>
	</xp:column>
</xp:dataTable>

The two data columns are pretty normal - they could easily point to an array/List in a data context or a view/collection of documents. The specific implementation is that it's an ArrayList stored in the view scope - either created new or deserialized from the document as appropriate:

public void postNewDocument() {
	Map<String, Object> viewScope = ExtLibUtil.getViewScope();
	viewScope.put("ComponentInfo", new ArrayList<Map<String, String>>());
}

public void postOpenDocument() throws Exception {
	Map<String, Object> viewScope = ExtLibUtil.getViewScope();
	Document doc = this.getDoc().getDocument();
	viewScope.put("ComponentInfo", JSFUtil.restoreState(doc, "ComponentInfo"));
}

public List<Map<String, Serializable>> getComponentInfo() {
	return (List<Map<String, Serializable>>)ExtLibUtil.getViewScope().get("ComponentInfo");
}

Those two "addComponent" and "removeComponent" actions are very simple as well: they just fetch the list from the view scope and manipulate it:

public void addComponent() {
	this.getComponentInfo().add(new HashMap<String, Serializable>());
}
public void removeComponent() {
	int index = (Integer)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "componentIndex");
	this.getComponentInfo().remove(index);
}

One key part to notice is that the "componentIndex" variable is indeed the value you'd want. In the context of the clicked button (say, in the second row of the table), the componentIndex variable is set to 1, and the resolver and FacesContext make that work.

On save, I just grab the modified Document object, serialize the object back into it, and save the data source (that's what super.save() from my superclass does):

public String save() throws Exception {
	List<Map<String, Serializable>> componentInfo = this.getComponentInfo();

	Document doc = this.getDoc().getDocument(true);
	JSFUtil.saveState((Serializable)componentInfo, doc, "ComponentInfo");

	return super.save();
}

The end result is that I have a user-expandable/collapsible table that can hold arbitrary fields (since it's a Map - it could just as easily be a list of any other serializable objects).

If I wanted to access the data from non-Java contexts, I could replace the MIMEBean bits with another method, like a text-list-based format or response documents, but the final visual result (and the XSP code) would be unchanged.

More On "Controller" Classes

Mon Jan 07 19:39:09 EST 2013

Tags: xpages mvc java
  1. "Controller" Classes Have Been Helping Me Greatly
  2. More On "Controller" Classes

Since my last post on the matter, I've been using this "controller" class organization method in a couple other projects (including a refresh of the back-end of this blog), and it's proven to be a pretty great way to go about XPages development.

As I mentioned before, the "controller" term comes from the rough equivalent in Rails. Rails is thoroughly MVC based, so a lot of your programming involves creating the UI of a page in HTML with a small sprinkling of Ruby (the "view"), backed by a class that is conceptually tied to it and stores all of the real business logic associated with that specific page (the "controller"). XPages don't have quite this same assumption built in, but the notion of pairing an XPage with a single backing Java class is a solid one. Alternatively, you can think of the "controller" class as being like a stylesheet or client JavaScript file: the HTML page handles the design, and only contains references to functionality defined elsewhere.

What this means in practice is that I've been pushing to eliminate all traces of non-EL bindings in my XSP markup in favor of writing the code in the associated Java class - this includes not only standard page events like beforeRenderResponse, but also anywhere else that Server JavaScript (or Ruby) would normally appear, like value and method bindings. Here's a simple example, from an XPage named Test and its backing class:

Test.xsp
<p><xp:button id="clickMe" value="Click Me">
	<xp:eventHandler event="onclick" submit="true" refreshMode="partial" refreshId="output"
		action="#{pageController.refreshOutput}"/>
</xp:button></p>
	
<p><xp:text id="output" value="#{pageController.output}"/></p>
controller/Test.java
package controller;

import frostillicus.controller.BasicXPageController;
import java.util.Date;

public class Test extends BasicXPageController {
	private static final long serialVersionUID = 1L;

	private String output = "default";
	public String getOutput() { return this.output; }

	public void refreshOutput() {
		this.output = new Date().toString();
	}
}

In a simple case like this, it doesn't buy you much, but imagine a more complicated case, say code to handle post-save document processing and notifications, or complicated code to generate the value for a container control. The separation between the two pays significant dividends: you stop having to scroll through reams of code in the XSP markup when you're working on the page layout, and having a known location for the Java code associated with a page makes it easier to track down functionality. Plus, Server JavaScript is only mildly more expressive than Java, so even the business logic itself stays just about as clean (moving from Ruby to Java, on the other hand, imposes a grotesque LOC penalty).

It sounds strange, but the most important benefit I've derived from this hasn't been from performance or cleanliness (though those are great), but instead the discipline it provides. Now, there's no question where business logic associated with a page should go: in a class that implements XPageController, with the same name as the page, placed in the "controller" package. If I want pages to share functionality, that's handled either via a common method stored in another class or, as appropriate, via class inheritance. No inline code, no Server JavaScript script libraries. If something is going to be calculated, it's done in Java.

There is one area that's given me a bit of trouble: custom controls. For example, the linksbar on this blog requires computation to generate, but it's not associated with any specific page. For now, I put that in the top-level controller class, but that feels a bit wrong. The same solution also doesn't apply to the code that handles rendering each individual post in the list, which may be on the Home page, the Month page, or individually on the Post page, in which case it also has edit/save/delete actions associated with it. For now, I created a "helper" class that I instantiate (yes, with JavaScript, which is depressing) for each instance. I think the "right" way to do it with the architecture is to create my controls entirely in Java, with renderers and all. That would allow me to handle the properties passed in cleanly, but the code involved with creating those things is ugly as sin. Still, I'll give that a shot later to see if it's worth the tradeoff.

"Controller" Classes Have Been Helping Me Greatly

Wed Dec 26 16:54:16 EST 2012

Tags: xpages mvc java
  1. "Controller" Classes Have Been Helping Me Greatly
  2. More On "Controller" Classes

I mentioned a while ago that I've been using "controller"-type classes paired with specific XPages to make my code cleaner. They're not really controllers since they don't actually handle any server direction or page loading, but they do still hook into page events in a way somewhat similar to Rails controller classes. The basic idea is that each XPage gets an object to "back" it - I tie page events like beforePageLoad and afterRenderResponse to methods on the class that implements a standard interface:

package frostillicus.controller;

import java.io.Serializable;
import javax.faces.event.PhaseEvent;

public interface XPageController extends Serializable {
	public void beforePageLoad() throws Exception;
	public void afterPageLoad() throws Exception;

	public void afterRestoreView(PhaseEvent event) throws Exception;

	public void beforeRenderResponse(PhaseEvent event) throws Exception;
	public void afterRenderResponse(PhaseEvent event) throws Exception;
}

I have a basic stub class to implement that as well as an abstract class for "document-based" pages:

package frostillicus.controller;

import iksg.JSFUtil;

import javax.faces.context.FacesContext;

import com.ibm.xsp.extlib.util.ExtLibUtil;
import com.ibm.xsp.model.domino.wrapped.DominoDocument;

public class BasicDocumentController extends BasicXPageController implements DocumentController {
	private static final long serialVersionUID = 1L;

	public void queryNewDocument() throws Exception { }
	public void postNewDocument() throws Exception { }
	public void queryOpenDocument() throws Exception { }
	public void postOpenDocument() throws Exception { }
	public void querySaveDocument() throws Exception { }
	public void postSaveDocument() throws Exception { }

	public String save() throws Exception {
		DominoDocument doc = this.getDoc();
		boolean isNewNote = doc.isNewNote();
		if(doc.save()) {
			JSFUtil.addMessage("confirmation", doc.getValue("Form") + " " + (isNewNote ? "created" : "updated") + " successfully.");
			return "xsp-success";
		} else {
			JSFUtil.addMessage("error", "Save failed");
			return "xsp-failure";
		}
	}
	public String cancel() throws Exception {
		return "xsp-cancel";
	}
	public String delete() throws Exception {
		DominoDocument doc = this.getDoc();
		String formName = (String)doc.getValue("Form");
		doc.getDocument(true).remove(true);
		JSFUtil.addMessage("confirmation", formName + " deleted.");
		return "xsp-success";
	}

	public String getDocumentId() {
		try {
			return this.getDoc().getDocument().getUniversalID();
		} catch(Exception e) { return ""; }
	}

	public boolean isEditable() { return this.getDoc().isEditable(); }

	protected DominoDocument getDoc() {
		return (DominoDocument)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "doc");
	}
}
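
For reference, the basic stub class mentioned above isn't shown in this post; it's presumably nothing more exciting than no-op implementations of the interface - something along these lines:

package frostillicus.controller;

import javax.faces.event.PhaseEvent;

public class BasicXPageController implements XPageController {
	private static final long serialVersionUID = 1L;

	public void beforePageLoad() throws Exception { }
	public void afterPageLoad() throws Exception { }
	public void afterRestoreView(PhaseEvent event) throws Exception { }
	public void beforeRenderResponse(PhaseEvent event) throws Exception { }
	public void afterRenderResponse(PhaseEvent event) throws Exception { }
}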

I took it all one step further in the direction of "convention over configuration" as well: I created a ViewHandler that looks for a class in the "controller" package with the same name as the current page's Java class (e.g. "/Some_Page.xsp" → "controller.Some_Page") - if it finds one, it instantiates it; otherwise, it uses the basic stub implementation. Once it has the object created, it plunks it into the viewScope and creates some MethodBindings to tie beforeRenderResponse, afterRenderResponse, and afterRestoreView to the object without having to have that code in the XPage (it proved necessary to still include code in the XPage for the before/after page-load and document-related events).
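
The ViewHandler itself involves more plumbing than is worth pasting here, but the heart of the convention - mapping a page name to a controller class - is roughly this sketch, assuming the page name arrives as a string like "/Some_Page.xsp"; everything else about the real ViewHandler is elided:

// "/Some_Page.xsp" -> controller.Some_Page, falling back to the stub when there's no match
public static XPageController resolveController(String pageName) {
	String className = "controller." + pageName.substring(1).replace(".xsp", "");
	try {
		return (XPageController)Class.forName(className).newInstance();
	} catch(ClassNotFoundException cnfe) {
		// No page-specific controller exists - use the no-op stub
		return new BasicXPageController();
	} catch(Exception e) {
		throw new RuntimeException(e);
	}
}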

So on its own, the setup I have above doesn't necessarily buy you much. You save a bit of repetitive code for standard CRUD pages when using the document-controller class, but that's about it. The real value for me so far has been a clarification of what goes where. Previously, if I wanted to run some code on page load, attach an action to a button, set a viewScope value, and so forth, it could go anywhere: in a page event, in a button action, in a dataContext, in a this.value property for an xp:repeat, or any number of other places. Now, if I want to evaluate something, it's going to happen in one place: the controller class. So if I have a bit of information that needs recalculating (say, the total cost of a shopping cart), I make a getTotalCost() method on the controller and set the value in the XPage to pageController.totalCost. Similarly, if I need to set some special values on a document on load or save, I have a clear, standard way to do it:

package controller;

import com.ibm.xsp.model.domino.wrapped.DominoDocument;
import frostillicus.controller.BasicDocumentController;

public class Projects_Contact extends BasicDocumentController {
	private static final long serialVersionUID = 1L;

	@Override
	public String save() throws Exception {
		DominoDocument doc = this.getDoc();
		doc.setValue("FullName", ("CN=" + doc.getValue("FirstName") + " " + doc.getValue("LastName")).trim() + "/O=IKSGClients");

		return super.save();
	}

	@Override
	public void postNewDocument() throws Exception {
		super.postNewDocument();

		DominoDocument doc = this.getDoc();
		doc.setValue("Type", "Person");
		doc.setValue("MailSystem", "5");
	}
}

It sounds like a small thing - after all, who cares if you have some SSJS in the postNewDocument event on an XPage? And, besides, isn't that where it's supposed to go? Well, sure, when you only have a small amount of code, doing it inline works fine. However, as I've been running into for a while, the flexibility of XPages makes them particularly vulnerable to becoming tangled blobs of repeated and messy code. By creating a clean, strict system from the start, I've made it so that the separation is never muddied: the XPage is about how things appear and the controller class is about how those things get to the XPage in the first place.

We'll see how it holds up as I use it more, but so far this "controller"-class method strikes a good balance: it keeps the code clean without getting too crazy with the backing framework (as opposed to some of the "model" systems I've tried making).

A Couple Updates to My Domino-One-Offs Repository

Tue Sep 04 18:25:00 EDT 2012

Tags: xpages java

I realized earlier that I let a couple of the classes in Domino-One-Offs stagnate relative to the versions I currently use. Since originally posting my convenience XML library and, more usefully, the DynamicViewCustomizer, I've tweaked both in my various projects to add more features I ended up needing.

The DynamicViewCustomizer has gone further down the path of Notes-client fidelity at the expense of a bit more overhead, doing things like setting column widths (except where the column is set to extend to the window width, either specifically or by virtue of being the last column, as appropriate). It's still not perfect, as I ran into a bit of trouble with, I think, HTML in category columns (and you can see commented-out remnants of various other changes), but it's potentially useful.

For my "man, I hate dealing with all these stupid orthogonal Java XML libraries" XML library, I've added a number of functions to support my Forms 'n' Views app (coming Soon™), namely better attribute handling, a simple NodeList wrapper (that actually acts like a List), and a basic getXml() method for when you just want a string version.

Finally, I tossed in the current version of my SortableMapView class, meant to gobble up Views or ViewEntryCollections and make them sortable by all columns. That one uses Lombok, but I think you could just remove the bits about making it List-compatible and still be fine.

An Extended Document Data Source To Support MIMEBean

Wed Aug 22 20:02:00 EDT 2012

Tags: mime xpages java

Ever since I personally came up with the idea of storing serialized Java objects in Notes documents via MIME, I've been trying to use the technique all over the place, since it's basically the best thing since sliced bread.

My latest notion was that the pattern would be all the cooler if the serialized Java objects could be referenced directly in EL, so you could do something like #{doc.LineItems[3].cost}, pulling directly from a more-or-less normal Domino document. I decided to give implementing this a shot, since it would also have the side benefit of introducing me to writing JSF components/complex types.

The path I took sure feels like the long route (lots of manual doubling up of property definitions and proxying methods), but it seems to work: I have a "mimeDominoDocument" data source that acts just like a normal document source except it transparently serializes and deserializes Java objects. It's not exactly a perfect implementation (the setValue() method has a rather naive way of guessing which objects should be serialized and which shouldn't, and I didn't handle the case of setting a value to an object and then using, say, getItemValueString()), but it should be fine for demo and basic purposes.
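
To give a sense of what that guessing looks like, setValue() is conceptually doing something like this sketch - the holding map and the exact type checks here are illustrative, not the real source:

public void setValue(Object key, Object value) {
	// Types the stock document source already knows how to store go through normally;
	// anything else Serializable gets the MIMEBean treatment at save time
	boolean isPlainValue = value == null || value instanceof String || value instanceof Number
		|| value instanceof java.util.Date || value instanceof java.util.Vector;
	if(!isPlainValue && value instanceof java.io.Serializable) {
		// Hypothetical holding map; these are written out as serialized MIME entities on save
		this.pendingSerializedValues.put(key.toString(), (java.io.Serializable)value);
	} else {
		super.setValue(key, value);
	}
}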

I set up an example database with pages for the appropriate code here: /tests/mime.nsf.

Starting With Java Classes First

Mon Aug 20 21:06:00 EDT 2012

Tags: xpages java

Every time I make a new XPages app, I start by using the normal xp:dominoView and xp:dominoDocument data sources, on the theory that building with the standard set of components will keep things clean while I build on top of that. However, with each successive app, the time before I get annoyed at how messy the code has to be and start writing Java wrapper and management classes gets shorter and shorter.

For example, in my Forms 'n' Views mini-Designer app, I have the sidebar list of databases, which is just pointing to a view and showing those entries. That works pretty well, since the values I need are in the view itself. However, now I want to show the database icons, but I also want to check that the DBs are on the current server, that the current user can access them, and that they're accessible via HTTP (I'm just pointing the image to /whatever.nsf/$Icon). Doing that with just the view entries will get messy, requiring a bunch of Server JavaScript to be crammed into my xp:image object. It'd work, but it'll only be a matter of time before I want to add more smarts, such as making it so that you can grant enhanced access to certain design elements for a certain user. The best way to handle this will be to make a "DatabaseEntry" class and give it methods to determine the icon and design-element visibility.
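
A sketch of what that "DatabaseEntry" wrapper might look like - the property names and checks here are placeholders for the real logic, not code from the app:

import java.io.Serializable;
import lotus.domino.Database;
import lotus.domino.NotesException;
import lotus.domino.Session;
import com.ibm.xsp.extlib.util.ExtLibUtil;

public class DatabaseEntry implements Serializable {
	private static final long serialVersionUID = 1L;

	private final String server;
	private final String filePath;
	private final String title;

	public DatabaseEntry(String server, String filePath, String title) {
		this.server = server;
		this.filePath = filePath;
		this.title = title;
	}

	public String getTitle() { return this.title; }

	// Only hand back an icon URL when the database passes the visibility checks
	public String getIconUrl() {
		return this.isVisible() ? "/" + this.filePath + "/$Icon" : "";
	}

	public boolean isVisible() {
		return isOnCurrentServer() && currentUserHasAccess();
	}

	private boolean isOnCurrentServer() {
		try {
			Database current = ExtLibUtil.getCurrentDatabase();
			return current.getServer().equalsIgnoreCase(this.server);
		} catch(NotesException ne) {
			return false;
		}
	}

	private boolean currentUserHasAccess() {
		try {
			Session session = ExtLibUtil.getCurrentSession();
			Database db = session.getDatabase(this.server, this.filePath);
			return db != null && db.isOpen();
		} catch(NotesException ne) {
			return false;
		}
	}
}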

I don't think I really saved myself any time by delaying this - it's pretty rare that I DON'T have this sort of calculation eventually, so I'm just adding a bit of time doing things the "dirty" way first before cleaning them up. I'm not really familiar with "vanilla" JSF, but I get the impression that it doesn't really let you put as much business logic into the view portion as XPages do (I think JSF tends to stick to EL normally). Certainly, when you write a Rails app, you start with the models and controllers cleanly before putting the data on the page. The XPage architecture doesn't make that as easy, with its lack of reasonably-modifiable controllers, but adding Java classes and using collection classes in xp:dataContexts or as managed beans works great.

I've toyed with the idea of expanding the classes I use in my forums app to work smoothly with any view/document, picking up the columns by name and storing them in a Map representing the entry/document. The problem with that is that the data sources and needs tend to be just different enough per app that a generic version doesn't quite fit (for example, in this one, I'm getting databases for the "add DB" picker via DbDirectory, not a view). In the meantime, I think I'll just start with the access classes first and see what I do most commonly.

This View Indexer Thing Might Be Worth Some Time

Sun Aug 05 16:51:00 EDT 2012

Tags: java views crazy
  1. I Went Crazy and Made a Small View Indexer
  2. This View Indexer Thing Might Be Worth Some Time

I've put a little more time into my small view indexer from the other day, and it seems even more promising now. The editor I made for it might give you some idea of the potential:

Fancy View Builder

Since the view building is all being done in Java, I decided to toss in the list of JSR-223-compliant scripting languages currently available. The scripting context is given a variable "doc" that's an instance of a DocumentWrapper class I wrote, which implements Map to make its use a bit easier. For added fun, I baked in knowledge of serialized Java objects stored in the document's items:

Fancy View Objects

Normally, the wrapper just returns the value of getItemValue(...), but when it encounters a MIME entity of type "application/x-java-serialized-object", it deserializes it and returns that instead, allowing for real structured data access.
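
The interesting part of the wrapper is its get() method, which is conceptually along these lines - a sketch, where deserializeEntity() stands in for the ObjectInputStream plumbing and this.doc is the wrapped lotus.domino.Document:

public Object get(Object key) {
	String itemName = String.valueOf(key);
	try {
		MIMEEntity entity = this.doc.getMIMEEntity(itemName);
		if(entity != null && "application".equals(entity.getContentType())
				&& "x-java-serialized-object".equals(entity.getContentSubType())) {
			// Structured data: hand back the deserialized Java object itself
			return deserializeEntity(entity);
		}
		// Everything else behaves like a plain getItemValue(...) call
		return this.doc.getItemValue(itemName);
	} catch(Exception e) {
		throw new RuntimeException(e);
	}
}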

I also realized that I don't have any reason to stick to only objects available in the stock JDK for index storage. Since using this will require extra Java code anyway, I may as well make my life easier and write my own "view entry" class. Once I get that sorted out, I can work on keeping the indexes updated with a server task, dealing with reader fields, and physical storage.

Hmm, maybe I should make myself an editor for normal views that looks like this one. It'd probably be a lot less hassle to deal with than the legacy one.

I Went Crazy and Made a Small View Indexer

Sat Aug 04 11:45:00 EDT 2012

Tags: java views crazy
  1. I Went Crazy and Made a Small View Indexer
  2. This View Indexer Thing Might Be Worth Some Time

Warning: this post is almost entirely pie-in-the-sky nonsense, untethered from reality. Probably.

So yesterday, for some reason, I got to thinking about Domino's view indexing and whether or not it can be readily done better. The current view indexer does its intended task with aplomb (most of the time), but it has its problems. For one, iterating over view indexes in Java (read: everything you should do in Domino today) is dog slow. More theoretically, since it deals only with summary data for performance reasons, its ability to handle structured data is limited - you can't work with rich text or, for example, store any sort of Map in a Domino document and use it in a view (realistically).

I had the notion that it may be faster, at least in some cases, to do your own indexing: make a Java collection of your "view" data and serialize the result into a note. I decided to try a little test: take a view that consists only of a sorted column of 100k documents' UNIDs and compare it to a TreeSet of the same. My initial results were promising: iterating over the collection of documents (via getNextDocument() on the view and a for-each loop with getDocumentByUNID() for the TreeSet) was about twice as fast with the Java version, and the serialized set was stored in about 17% of the space, presumably thanks to document compression.
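
The Java side of that first test was only a handful of lines - something in the vein of this sketch, reusing the serialization helper from my MIMEBean posts; the storage document and item name are arbitrary, and the real test also timed each half:

import java.io.Serializable;
import java.util.SortedSet;
import java.util.TreeSet;

import iksg.JSFUtil;
import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.DocumentCollection;

public class IndexExperiment {
	public void run(Database database) throws Exception {
		// Build the "index": a sorted set of UNIDs, the Java stand-in for the one-column view
		SortedSet<String> index = new TreeSet<String>();
		DocumentCollection allDocs = database.getAllDocuments();
		for(Document doc = allDocs.getFirstDocument(); doc != null; doc = allDocs.getNextDocument(doc)) {
			index.add(doc.getUniversalID());
		}

		// Serialize the whole set into an item on a storage document - the MIMEBean trick again
		Document indexDoc = database.createDocument();
		JSFUtil.saveState((Serializable)index, indexDoc, "Index");
		indexDoc.save();

		// The "read" half of the comparison: walk the set and fetch each document by UNID
		for(String unid : index) {
			Document doc = database.getDocumentByUNID(unid);
			// ... read whatever values the test needs ...
			doc.recycle();
		}
	}
}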

To be fair, that was really stacking the deck in favor of Java: I didn't care about reader fields or any entry metadata like position, and fetching every document in a large view is possibly the most expensive way to use it. Presumably, fetching just the first entry would make the view version much faster. Still, it shows that it might be useful in some cases, so I decided to try something a bit more complicated.

I made another view with 15k documents (task requests) and two columns: the date of the request, sorted, and the summary line. I matched that up with a TreeSet containing Lists of Maps and iterated over both to fetch and print the entry data. Again, the Java version ended up way faster, taking about 25% of the time.

Now it has me wondering whether it'd be worth investigating further. It's fair to assume that Java can be treated as the new "native" language of Domino, so serialization is potentially a legitimate mechanism. Furthermore, a real implementation of an alternative view index wouldn't have to be as hopelessly naive as the one I put together: the basic java.util.* collections are almost definitely not the best storage mechanisms for a database index, and, better still, this is well-trodden territory. Since normal views would still exist in this magic future world, these fancy views could eschew a lot of the UI and build-performance requirements of the normal ones in favor of fancy tricks like native knowledge of serialized data structures. Imagine storing structured data like a table of line items in a MIME entity but then being able to query that in a "view". Reader fields could be handled by storing the applicable names in the "entry" and then comparing that to the user's names list at runtime, presumably like standard views do. A server task could handle monitoring document changes and making incremental changes (like happens now, really).

So it's theoretically possible. Would it be worth it? I'm not sure - maybe. It's fun to think about, in any event.

A Prototype In-App Messaging System for XPages

Mon Jul 09 14:35:00 EDT 2012

Tags: xpages java

Sven Hasselbach's post about ApplicationListeners the other day and the notion of post-MVC methods of app architecture got me thinking about the notion of generic messaging/events inside an XPages app. The first example that comes to mind is a project-tracking database where there may be events like "new task created" or "delivery ready for review" - business logic stuff where the fact that it's a document being modified is an implementation detail. In my actual project-tracking database at work, I implemented this by creating "Notification Stub" documents with information about the event and having a scheduled agent process them. That's a fine Notes-y way to do it (and has its own advantages, like having another cluster member send the emails), but I want a way for the running XPages app to notice and react to these things without the immediate code knowing about every single concern.

I did a quick search and didn't find any such capability in basic JSF or XPages, but I realized it wouldn't be terribly difficult to implement it myself: have an application-scoped bean (maybe one per scope, if it's useful) with methods to dispatch events and register listeners. Then, other beans could attach themselves as listeners and the normal code would only be responsible for sending out its messages, to be received by zero or more unknown objects:

public interface XPagesEventDispatcher {
	public void dispatchEvent(XPagesEvent event);
	public void addListener(XPagesEventListener listener);
}

The actual event objects would be similarly simple: just a name and an arbitrary array of objects:

public interface XPagesEvent {
	public String getEventName();
	public Object[] getEventPayload();
}

Finally, listeners are the simplest of all, being responsible only for providing a method to handle an XPagesEvent (EventListener is the standard "tagging" interface for this kind of thing):

public interface XPagesEventListener extends EventListener {
	public void receiveEvent(XPagesEvent event);
}

However, after creating my gloriously clean set of interfaces, I ran into the ugly marsh of reality: application-scoped beans only exist after first use. That works fine for the dispatcher itself, but not for listener beans, the whole point of which is to not be referenced in code elsewhere. After a bit of searching, I came up with a solution that may work. When defining a managed bean, you can set properties, and those properties can be Lists. So, rather than creating two beans, I define just the dispatcher bean, but provide it a list of class names of listener beans to create at startup:

<managed-bean>
	<managed-bean-name>applicationDispatcher</managed-bean-name>
	<managed-bean-class>frostillicus.event.SimpleEventDispatcher</managed-bean-class>
	<managed-bean-scope>application</managed-bean-scope>
	<managed-property>
		<property-name>listenerClasses</property-name>
		<list-entries>
			<value-class>java.lang.String</value-class>
			<value>frostillicus.Messenger</value>
		</list-entries>
	</managed-property>
</managed-bean>

I gave my dispatcher implementation class a convenience method that takes a string and zero or more objects and dispatches the event, so it can be used in code like:

applicationDispatcher.dispatch("home page loaded")
applicationDispatcher.dispatch("delivery submitted", [delivery.getUniversalID()])

So far, this setup is working pretty well. I plan to toy with it a bit and then try it out in proper use. Provided it works well (and it doesn't turn out that this capability already exists and I just missed it), I'll turn it into a project on GitHub/OpenNTF. As long as it keeps working, I'm excited about the prospects, especially when combined with XPages Threads and Jobs for non-blocking event handling.
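
For the curious, the dispatcher behind that config doesn't need to be fancy. A minimal sketch - the class name and the listenerClasses property match the faces-config above, but the rest of the details are illustrative rather than the final code:

package frostillicus.event;

import java.util.ArrayList;
import java.util.List;

public class SimpleEventDispatcher implements XPagesEventDispatcher, java.io.Serializable {
	private static final long serialVersionUID = 1L;

	private final List<XPagesEventListener> listeners = new ArrayList<XPagesEventListener>();

	// Populated from the managed-property in faces-config; instantiating the classes here
	// is what lets listener beans exist without ever being referenced in code elsewhere
	public void setListenerClasses(List<String> listenerClasses) {
		for(String className : listenerClasses) {
			try {
				addListener((XPagesEventListener)Class.forName(className).newInstance());
			} catch(Exception e) {
				throw new RuntimeException(e);
			}
		}
	}

	public void addListener(XPagesEventListener listener) {
		this.listeners.add(listener);
	}

	public void dispatchEvent(XPagesEvent event) {
		for(XPagesEventListener listener : this.listeners) {
			listener.receiveEvent(event);
		}
	}

	// Convenience method used from SSJS: dispatch("delivery submitted", someUnid)
	public void dispatch(final String eventName, final Object... payload) {
		dispatchEvent(new XPagesEvent() {
			public String getEventName() { return eventName; }
			public Object[] getEventPayload() { return payload; }
		});
	}
}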

Using Lombok to Write Cleaner Java in Designer

Fri Jul 06 18:48:00 EDT 2012

Tags: java

Java, as a language, has a number of admirable qualities, but succinctness is not among them. Even a simple "Person" class with just "firstName" and "lastName" properties can boil over with at least a dozen lines of boilerplate to add constructors, getters, setters, and equivalency testing. Fortunately, Project Lombok can help. Their main page contains a good video of the basics, while the Feature Overview page covers the rest. If you've written a lot of Java classes, I suspect that the video will have you hooked.

And there's good news: Lombok can work in Designer! It's not as simple as it is for vanilla Eclipse and it took me a good amount of trial-and-error to get it functional, but I figured out what you have to do:

  1. Copy lombok.jar into your Notes program directory (e.g. C:\Program Files\IBM\Lotus\Notes). It has to go there so that the initial program launch sees it.
  2. Additionally, copy the same lombok.jar to your jvm\lib\ext directory inside your Notes program directory. It has to go there so that it's on your project build path. Yes, normally you "shouldn't" put your Java libraries there, but I figure it's alright since this takes local, non-update-site fiddling anyway.
  3. Now for the weird part. Normally, the Lombok installer modifies eclipse.ini to launch some support stuff. That won't work for us, however. What we need is the file "jvm.properties" in the framework\rcp\deploy directory. The syntax isn't the same as for eclipse.ini and I'm not 100% sure my method doesn't have any horrible side effects, but adding these two lines (the two under "Lombok stuff", with "vmarg.javaagent" and "vmarg.Xbootclasspath/a") worked for me:
    vmarg.javaagent=-javaagent:lombok.jar
    vmarg.Xbootclasspath/a=-Xbootclasspath/a:lombok.jar

Now, I've only had this installed for a couple hours and have only done a couple basic tests, so I can't rule out the notion that this will interfere with some other Designer thing (the "Xbootclasspath/a" line may override some Designer default), so use with care. On the plus side, Lombok's magic works during compilation, not runtime, so you shouldn't need lombok.jar on the destination server. You'd still need it on the Designer client of any other programmers working on the code, though.

Update: I think you can avoid doubling up your copies of lombok.jar (and avoid a problem where launching the normal Notes client doesn't work) by just putting lombok.jar in jvm/lib/ext and using these lines in jvm.properties instead:

vmarg.javaagent=-javaagent:${rcp.home}/../jvm/lib/ext/lombok.jar
vmarg.Xbootclasspath/a=-Xbootclasspath/a:${rcp.home}/../jvm/lib/ext/lombok.jar

Use Interfaces All The Time

Wed Jun 27 09:10:00 EDT 2012

Tags: java

In addition to being useful concepts generally, Java interfaces can be used literally in code as a way of keeping your code as clean and generic as possible. While you can't ever create a new object based on an interface, you CAN use interface names as object types for variables and parameters. For example, this is a legal way to create a Vector of Objects:

List<Object> someList = new Vector<Object>();

"That's great and all," you say, "but why bother doing that?" And indeed, in a simple case like a mostly-procedural Java agent, you won't get much benefit from doing that. But imagine a method like this:

public List<String> retrieveValuesFromDatabase();

Since all that method guarantees is that it's going to return some sort of List, it has a lot of discretion as to the form that list will take. Maybe the programmer starts out with Vector but then decides that ArrayList is better for the purpose - they're free to change it without troubling the user of the method one bit. Maybe they want to get a little crazier and make their own custom class that implements List and provides a live view of a database - again, that can happen with no change to the user's code.

While the primary benefit comes when large libraries use interfaces in their APIs, even moderately-sized structured programs written by one programmer can benefit. Take a (contrived and unsafe) class like this:

import java.util.ArrayList;
import java.util.List;

public class ExampleClass {
	private List<String> cache;

	public ExampleClass() {
		this.cache = new ArrayList<String>();
	}
	public List<String> getCache() {
		return this.cache;
	}
	public void setCache(List<String> cache) {
		this.cache = cache;
	}
}

The same benefits mentioned above apply here: because you're referring to your instance variable just as a List, you're free to change the actual implementing class by changing only one line. The benefits are only compounded when you add more and more-complicated methods to your class.

Additionally, using interfaces when they're not strictly necessary can help avoid bad habits and traps. Our go-to example, the Vector class, pre-dates the Java Collections Framework and was only retrofitted to the List interface when the framework came out in Java 1.2. It shows its age in a couple ways, not the least of which is the inclusion of a couple methods that are not part of List, such as elementAt(...) and addElement(...). These are best replaced entirely with get(...) and add(...), which are common to all Lists... and are easier on the eyes, to boot.

It's easy to run into problems with APIs that don't use interfaces extensively, like, say, the Domino API*. Presumably due to the fact that the "new" lotus.domino classes were implemented in a pre-1.2 JRE, they use Vector extensively. Most of the time, this is fine - anything expecting a List will happily accept a Vector, but the reverse doesn't hold. You don't have to go far to create a problem. Multi-value controls in XPages, when bound to a Java object, prefer the use of ArrayList (but work great when pointed to an existing List of any stripe). Consequently, this will throw an exception:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<xp:this.dataContexts>
		<xp:dataContext var="newRegData" value="${javascript: new java.util.HashMap()}"/>
	</xp:this.dataContexts>
	
	<xp:checkBoxGroup value="#{newRegData.multi}">
		<xp:selectItems value="#{javascript:['one', 'two']}"/>
	</xp:checkBoxGroup>
	
	<xp:button id="createDoc" value="Create Doc">
		<xp:eventHandler event="onclick" submit="true"><xp:this.action><![CDATA[#{javascript:
			var doc = database.createDocument()
			doc.replaceItemValue("MultiValue", newRegData["multi"])
			doc.save()
		}]]></xp:this.action></xp:eventHandler>
	</xp:button>
</xp:view>

The problem is that, while ArrayList is still a List, replaceItemValue(...) doesn't give two hoots about interfaces, forcing you to do something like this:

var vec = new java.util.Vector()
vec.addAll(newRegData["multi"])
doc.replaceItemValue("MultiValue", vec)

Not pretty, and that's one that could be fixed without changing the outward API.

The upshot of all this is that you can derive a lot of benefit from using interfaces extensively, even when they're not strictly necessary. I even have a habit of writing lines like "List<Object> values = (List<Object>)doc.getItemValue("SomeItem");", because I am a madman. You may not go as far as I do, but it's still usually a good idea to follow the policy of using interfaces whenever the methods you want are available that way.

* Technically, the lotus.domino.* "classes" are all interfaces and the implementing classes vary based on the type of connection you're using (e.g. local vs. CORBA). As usual, Domino provides both good and bad examples simultaneously.

Putting java.util to Use

Fri Jun 22 11:46:00 EDT 2012

Tags: java

I'm in the process of figuring out a good way to combine data from several sources into a single activity stream, which means that they should be categorized by date and then sorted by time. While that's a piece of cake with a single view, it gets hairy when you have several views, or perhaps several different types of sources entirely. Fortunately, abstract data types are here to help.

You're already using Lists and Maps, right? For this, I decided to use Maps and one of my personal favorites, the Set. If you're not familiar with them, Sets are like Lists, but only contain one of each element and don't (normally) guarantee any specific order - DocumentCollections are a type of Set (albeit not actually implementing the interface).

I created the categories by using a Map with the Date as the key and Sets of entries as the value. That would work well enough using HashMaps and HashSets, but they would require manual sorting in the XPage to display them in the right order. Fortunately, Java includes some more-specific types for this purpose: SortedMap and SortedSet. These are used the same way as normal Maps and Sets, but automatically maintain an order (using either your own Comparator or the "natural" ordering from the objects' compareTo(...) methods). Better still, the specific TreeMap and TreeSet implementation classes have methods to get at the keys and values, respectively, in descending order.

Once I had my collection objects picked out, all I had to do was start filling them in. I used stock Date objects for the Map's keys and wrote a compareTo(...) method for the entries I'm keeping in the Set. Then, on the containing activity stream class, I just had to write a "serializer" method to write out the current state of the objects into a List for access.
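
Sketched out, the structure is along these lines - the entry class here is a stand-in for my actual activity-stream entries, not the real code:

import java.util.Date;
import java.util.NavigableMap;
import java.util.TreeMap;
import java.util.TreeSet;

public class ActivityStream implements java.io.Serializable {
	private static final long serialVersionUID = 1L;

	// One bucket of entries per day; TreeMap keeps the days sorted for free
	private final TreeMap<Date, TreeSet<StreamEntry>> categories = new TreeMap<Date, TreeSet<StreamEntry>>();

	public void add(Date day, StreamEntry entry) {
		TreeSet<StreamEntry> bucket = this.categories.get(day);
		if(bucket == null) {
			bucket = new TreeSet<StreamEntry>(); // ordering comes from StreamEntry.compareTo
			this.categories.put(day, bucket);
		}
		bucket.add(entry);
	}

	// Newest day first, courtesy of TreeMap's built-in descending view
	public NavigableMap<Date, TreeSet<StreamEntry>> getCategorized() {
		return this.categories.descendingMap();
	}

	// A stand-in entry type: sorted by time within its day
	public static class StreamEntry implements Comparable<StreamEntry>, java.io.Serializable {
		private static final long serialVersionUID = 1L;
		private final Date time;
		private final String summary;

		public StreamEntry(Date time, String summary) {
			this.time = time;
			this.summary = summary;
		}
		public Date getTime() { return this.time; }
		public String getSummary() { return this.summary; }

		public int compareTo(StreamEntry o) {
			int byTime = this.time.compareTo(o.time);
			return byTime != 0 ? byTime : this.summary.compareTo(o.summary);
		}
	}
}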

While I may change around the way I do this (I may end up putting the category headers inline so I just use one SortedSet for performance), it provides a pretty good example of when you can use some of Java's built-in classes to do some of the grunt work for you.

Reverse-Engineering File Formats for Fun

Mon Jun 11 09:00:00 EDT 2012

Tags: java

Well, okay, it wasn't for fun; it was for work. About a year and a half ago, I had occasion to parse the contents of shared object files from a Flash Media Server ("*.fso" files), and I figured it might be interesting to go back over the kind of things I had to do to accomplish that.

The first thing I did was to search around to see if anyone else had solved the same problem. However, while there are plenty of parsers for "Flash shared objects", not the least of which is in Flash itself, it seemed that, unless I was doing it wrong, that format is different from the one used by servers, so I couldn't use any of the ones I found. So I was left with a folder full of binary files and a task to accomplish. Sounds like a job for programming!

Fortunately, I knew the shape of the data inside the files: they were essentially arrays of maps. I could see the keys and some of the values in there, so the files were binary but not compressed, which gave me hope. However, since the data was variable-length, I couldn't just find the record delimiter and use offsets - I had to go through and find the various byte-value delimiters for each aspect and write a proper parser.

I opened up a couple of the files in vi sessions to compare them and, using the handy ga command, checked the byte values of the non-ASCII characters. I found that the file always starts with 0 then 3, and there appeared to be a "general" delimiter of 0, 0, 0 to break up header sections of the file and each record. Then I saw a number that was different between each file - ah-hah, the record count!

After another delimiter, I found a number followed by the name of the file (for some reason). It turned out that the number corresponded to the length of the file name, indicating that the format uses Pascal-style length-prefixed Strings. Good to know! After the name came a second instance of the record count for some reason, leaving the rest of the file to the actual records.
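
To illustrate the Pascal-style strings (and why the file reads more pleasantly as an int array), the core read routine amounts to something like this simplified sketch - not the actual FSOParser code, and the one-byte length prefix is an assumption about the format:

// Reads the file into an int[] of unsigned byte values (0-255), so length prefixes
// and delimiters aren't mangled by Java's signed byte type
static int[] readUnsignedBytes(java.io.File file) throws java.io.IOException {
	byte[] raw = new byte[(int)file.length()];
	java.io.DataInputStream in = new java.io.DataInputStream(new java.io.FileInputStream(file));
	in.readFully(raw);
	in.close();
	int[] data = new int[raw.length];
	for(int i = 0; i < raw.length; i++) {
		data[i] = raw[i] & 0xFF;
	}
	return data;
}

// A "Pascal-style" string: one byte of length, then that many characters
static String readString(int[] data, int pos) {
	int length = data[pos];
	StringBuilder result = new StringBuilder();
	for(int i = 0; i < length; i++) {
		result.append((char)data[pos + 1 + i]);
	}
	return result.toString();
}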

Conceptually, the records are easy: there should be about $record_count instances of some sort of record delimiter with the data packed in there in, essentially, key-value pairs. That's ALMOST how it is, but it got a bit weird: all records were ended with the same "general" delimiter used in the header, but they ALSO had their own two-byte record delimiter. That is, except when they used a three-byte variant for some reason. And there are a couple unknown values in there - they were different in each record and I'm sure they serve some use, but I couldn't figure out what it was. Once I figured out the hairy bits, though, the data was pretty much as I expected: there was a record index (for some reason), then the expected key-value pairs. Most of them were strings, but not all - there was an "urgent" key in some entries that had no corresponding value (though now that I look back at the code, I suspect that it was a boolean value of \1 to indicate "true").

The end result of all this was by far the most C-like Java I've ever written:

FSOParser.java

I felt pretty good after finishing that. High-level object trees are fun and all, but it's good to, from time to time, get your hands dirty with lower-level stuff like reading files as integer arrays.

PS: I don't know why my FSORecord class didn't just extend HashMap. I blame the brain haze from staring at binary files for too long. Conversely, I think I did have a reason to use arrays of int instead of byte - it had to do with Java's byte type always being signed while the data in the file is unsigned.

Turns out Partial Execute is handy

Sun Aug 22 22:53:00 EDT 2010

My main "spare time" project involves working on a lineup editor for raids in WoW. Basically, each player can have one or more characters, each of which can perform one or two roles (out of "Tank", "Healer", or "DPS"), and the UI should allow the person setting it up to pick which character that player will bring and which role they will perform. The setup is a table, with each row looking like:

The checkbox indicates whether or not the player is going on the run, then there's the player name, the currently-selected character (the icon denotes the character's current specialization), then how they originally signed up and their desired role, then three checkboxes for the role, and then their skill/gear rank.

Originally, I had the lineup set up as a Server-Side JavaScript array with ad-hoc objects with fields for each column. This worked out alright at first, but it not only stepped all over MVC separation - the controls had to "know" about the nature of the data they were working with - but also made the necessary event-handler code really tangled.

What clinched the deal was the side effects of switching a character or role - not every character can fill every role, so switching a character or role could easily switch the other. Case in point - if I switch my lineup line there to my hunter Elegabal, the role has to change as well:

With a JavaScript array and event handlers, every change meant code attached to the control itself that looked up the current character in the database, the roles they can fill, if they're eligible for the selected role, then either what roles they CAN fill (if a character was picked) or what character is available to fill the selected role. Not impossible, but VERY hairy.

So I switched over to Java objects on the back-end. While the Java code itself is less expressive than JavaScript, the organization is significantly cleaner and more logical. The Lineup is a class that descends from Vector to provide the overall table, and each element is a LineupLine object with getters and setters for each important element. That way, rather than each role radio button having to have code to sanity-check the character, they're just bound to "#{line.role}" and the LineupLine.setRole(String) method handles all the particulars.
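
As an illustration of "handles all the particulars", setRole() can own the character/role sanity check in one place. This is an approximation of the shape of the class, not the actual code - the lookup methods are hypothetical stand-ins for the real queries against the signup database:

public class LineupLine implements java.io.Serializable {
	private static final long serialVersionUID = 1L;

	private String character;
	private String role;

	public String getRole() { return this.role; }

	public void setRole(String role) {
		this.role = role;
		// If the currently-selected character can't fill the new role,
		// switch to one of the player's characters that can
		java.util.List<String> eligible = charactersForRole(role);
		if(!eligible.contains(this.character) && !eligible.isEmpty()) {
			this.character = eligible.get(0);
		}
	}

	public String getCharacter() { return this.character; }

	public void setCharacter(String character) {
		this.character = character;
		// Mirror image: if this character can't perform the selected role, change the role
		java.util.List<String> roles = rolesForCharacter(character);
		if(!roles.contains(this.role) && !roles.isEmpty()) {
			this.role = roles.get(0);
		}
	}

	// Hypothetical lookups against the signup/character data
	private java.util.List<String> charactersForRole(String role) { /* ... */ return new java.util.ArrayList<String>(); }
	private java.util.List<String> rolesForCharacter(String character) { /* ... */ return new java.util.ArrayList<String>(); }
}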

Early on, though, I started running into some oddities - I'd pick a different role, the table would refresh, and then my role selection would go back to its previous state. I added in some System.out.print() lines in the Java objects to make sure setRole() was being called and, sure enough, it was - I could see that it was called with the right new value, yet it kept switching back to the old one. Upon further investigation, I found the problem: while the new role was being set, the server was also getting a setCharacter() call with the existing character value. So the role would be switched, then the character would be set, and, since that character couldn't do the new role, it'd naturally switch the character back.

The solution I found was to enable Partial Execution on the input controls. I can only assume that this means that, rather than sending the server the entire contents of the row as the new data to process, it sends only the control's (i.e. the drop-down's or the radio button's) data, which limits the method calls to only the one that actually changed. Presumably, this has the side effect of being slightly faster, since less data is sent to the server to be processed.