
Recent Open-Source Project Updates

Fri Sep 06 14:25:34 EDT 2024

I've released a spate of open-source project updates recently, and I figured it'd be good to round up what's new. Most of them are utilitarian in nature - fixes for things that crop up with Domino 14 and Java > 8 - but the first one is larger.

XPages Jakarta EE

Today, I released version 3.1.0 of the XPages JEE project. This is mostly about fixing up some edge-case and sporadic bugs that cropped up in 3.0, but also includes some performance updates and contributions from new contributors. Additionally, it should work on the newly-launched Domino 14.5 EAP1. The use of Java 21 in that version of Domino won't directly affect XPages JEE for a while, since Jakarta EE 11 targets Java 17, but there's some neat stuff in there for general use.

p2-layout-resolver

The p2-layout-resolver is a plugin that allows the use of p2 (Eclipse-style) repositories as Maven dependencies in non-Tycho projects. I use this in a lot of cases where I move a project from Tycho to maven-bundle-plugin for simplicity in dependency management.

Version 1.9.0 includes a very-useful contribution that fixes dependencies in cases where a bundle has a Bundle-ClassPath entry that references an embedded JAR that doesn't exist. In the Domino world, this cropped up in Domino 14, so it's useful if you're building anything that targets that version of the runtime or above.
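
For reference, wiring it up looks roughly like this - a minimal sketch, where the repository ID and the specific artifact are placeholders for your own setup (as best I recall, the ID of the p2-layout repository becomes the groupId of the artifacts it serves):

<build>
    <extensions>
        <extension>
            <groupId>org.openntf.maven</groupId>
            <artifactId>p2-layout-resolver</artifactId>
            <version>1.9.0</version>
        </extension>
    </extensions>
</build>

<repositories>
    <repository>
        <id>com.hcl.xsp.p2</id>
        <url>${notes-platform}</url>
        <layout>p2</layout>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>com.hcl.xsp.p2</groupId>
        <artifactId>com.ibm.xsp.core</artifactId>
        <version>[9.0.1,)</version>
        <scope>provided</scope>
    </dependency>
</dependencies>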

p2-maven-plugin

For various Domino-related needs, I maintain a fork of the p2-maven-plugin, which is useful for its additions of things like generating site.xml files (still important for importing into an NSF update site, after all these years) and the <transform>jakarta</transform> option to run JARs through Eclipse Transformer when bundling them, allowing use of pre-Jakarta JEE artifacts in a smooth way.

The 3.1.x versions focused on fixing problems when running on Java > 8 (namely no longer using IBM Commons XML) and improving handling of some other hiccups.
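
As a sketch of how I tend to use it - the artifact here is illustrative, and the exact option set may vary by version, so treat this as a shape rather than gospel:

<plugin>
    <groupId>org.openntf.maven</groupId>
    <artifactId>p2-maven-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <id>generate-site</id>
            <phase>package</phase>
            <goals>
                <goal>site</goal>
            </goals>
            <configuration>
                <artifacts>
                    <!-- run a pre-Jakarta artifact through Eclipse Transformer -->
                    <artifact>
                        <id>com.sun.mail:javax.mail:1.6.2</id>
                        <transform>jakarta</transform>
                    </artifact>
                </artifacts>
            </configuration>
        </execution>
    </executions>
</plugin>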

Simplifying the Maven Build of the NSF File Server Project

Wed Apr 10 17:02:09 EDT 2024

When working on the NSF File Server project that I talked about the other day, I took a slightly different tack with the build than I had in the past, and I think it's worth going over some of that in case it's useful for others.

Initial Version

The first version of this project was a non-OSGi WAR file meant to be deployed to an app server like Liberty, not to Domino's OSGi stack, and so it's never involved Tycho. This made it mostly simpler, since its various dependencies are normal Maven dependencies and so I didn't have to worry about any of the annoying hoops.

However, it did have some native Domino dependencies: Notes.jar and the NAPI. These would need to be included as Maven dependencies and brought into the final WAR. The way I handled this was using the generate-domino-update-site project, which lets you first generate a p2 site in the style of the painfully outdated IBM-provided update site and then, if desired, turn that p2 site into more-normal Maven artifacts.

When I eventually switched from targeting a WAR file to having it run on Domino, I used the same dependency structure. The Domino version runs as an HttpService implementation, and so I pointed at the Mavenized version of the com.ibm.xsp.bootstrap and com.ibm.domino.xsp.adapter bundles.

Then, I used the maven-bundle-plugin, which fits the job of taking an otherwise-normal Maven project and making it work in OSGi environments (mostly). The way that plugin works is that you specify a lot of your MANIFEST.MF rules in the pom.xml:

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<extensions>true</extensions>
	<configuration>
		<instructions>
			<Bundle-SymbolicName>org.openntf.nsffile.httpservice;singleton:=true</Bundle-SymbolicName>
			<Bundle-RequiredExecutionEnvironment>JavaSE-1.8</Bundle-RequiredExecutionEnvironment>
			<Export-Package/>
			<Require-Bundle>
				com.ibm.domino.xsp.adapter,
				com.ibm.commons,
				com.ibm.commons.xml
			</Require-Bundle>
			<Import-Package>
				javax.servlet,
				javax.servlet.http
			</Import-Package>
			<Embed-Dependency>*;scope=compile</Embed-Dependency>
			<Embed-Transitive>true</Embed-Transitive>
			<Embed-Directory>lib</Embed-Directory>
			
			<_removeheaders>Require-Capability</_removeheaders>
			
			<_snapshot>${osgi.qualifier}</_snapshot>
		</instructions>
	</configuration>
</plugin>

The first couple are one-for-one matches to what you'd have in the MANIFEST.MF, but things get weird once you get to the "Embed-*" ones.

The Embed-Dependency instruction is a potent one: you give it a description of which dependencies you want embedded in your OSGi bundle (in this case, all my non-provided dependencies), and then it does the job of copying them into your final bundle JAR. You could do this in other ways - copying them manually, using the Maven Dependency Plugin, and so forth - but this handles all your transitive dependencies nicely for you, thanks to Embed-Transitive. I use Embed-Directory here just for cleanliness - the result is functionally the same without it.

The final bits are just for cleanliness: I remove Require-Capability to avoid some trouble I had with older Domino versions, and then I set what the snapshot value will be, which ends up being the current build time.

With this, I end up with a single OSGi bundle with everything in it. That works well for this sort of project: for something to be used in Designer, I prefer to ship a pool of distinct OSGi bundles so that source lookup works properly, but a server-only project like this doesn't need that.

2.0 Version

In this new version, the switch to JNX meant that I was tantalizingly close to not having to do any weird dependency stuff: JNX is distributed in Maven Central, so I didn't need to have the weird locally-built stuff like I did for Notes.jar and the NAPI.

However, that wasn't everything: there are still the "bootstrap" bundles containing the HttpService superclass and related classes. While I don't need to distribute those anywhere, they're still required to compile the classes - no amount of non-verified text files or the like will get around that.

I came up with another way, though. Java only needs classes that look like those to compile, and then the compiled class of mine will be the same regardless of anything else. This is critical: I don't actually need any implementation code, and that's the part I can't redistribute. So I made two little Maven modules: com.ibm.xsp.bootstrap.shim and com.ibm.domino.xsp.adapter.shim. These modules contain just a handful of classes, and of those classes only the methods actually referenced by my code. Since these modules will then be marked as "provided", they won't be bundled into the final JAR.
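
In the consuming module, the shims are then ordinary dependencies marked as provided - a minimal sketch, with a hypothetical groupId standing in for the real one:

<dependency>
    <!-- the groupId here is hypothetical -->
    <groupId>org.openntf.nsffile</groupId>
    <artifactId>com.ibm.xsp.bootstrap.shim</artifactId>
    <version>${project.version}</version>
    <scope>provided</scope>
</dependency>

Because of that scope, the Embed-Dependency rule from earlier (which matches only compile-scoped dependencies) leaves them out of the final JAR, and the real classes come from Domino's OSGi runtime at execution time.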

This squared the circle nicely: now I can compile the Java side without any weird prerequisites. Admittedly, the two NSFs in the module set still require an NSF ODP Tooling environment, which is a whole other ball of wax, but it's a step in the right direction.

Other Uses

This technique can be used in other similar projects when you only need a few classes from the XPages stack. For example, if your goal is to just wrap a third-party library and provide it to XPages in Designer, you could probably do this by making a stub implementation of XspLibrary and related classes, and skip the whole generate-domino-update-site step. The more you use from the stack, the less practical this is - for example, the XPages Jakarta EE project reaches into all sorts of crap, and so I can't really do this there. For this, though, it works nicely.

The Difficulties of Domino Project Dependencies

Thu Dec 31 09:38:22 EST 2020

Tags: java maven

This post is a drum I've been banging for a long time, from nagging the dev team in the IBM days through to formally requesting it in HCL's Ideas Portal. That idea there has been "Likely to implement" for a little while now, which is heartening, and either way I figured it'd be useful to have a proper blog post explaining the trouble and what a useful better version would be.

The Core Trouble

The main thing I'm talking about here is the act of having a third-party or (particularly) open-source project that depends on Domino artifacts - namely, Notes.jar, the NAPI, and the XPages UI components. I have more than a few such projects, so it's something I deal with pretty much daily.

When you're dealing with an XPages app in an NSF, this isn't really an issue: all the parts you need are there and are part of the classpath. You just reference lotus.domino.Database or com.ibm.xsp.extlib.util.ExtLibUtil and don't even give it a second thought. When you have a project outside of an NSF or Designer, though, you start to have to worry about this.

OSGi Projects

For OSGi-based projects, this means that you need to have a Target Platform that points to the XPages artifacts, and then either have a variant of that that includes a packaged Notes.jar or include Notes.jar in your classpath another way. In Eclipse, this might be accomplished by adding Notes.jar to your active JVM and referencing a Notes or Domino installation's OSGi directories - this is something the XPages SDK helps with.

The immediate trouble this involves is if you want to build this project outside of Eclipse - most commonly now with Maven. This is where the IBM Domino Update Site for Build Management came in, which is a cleanly-packaged p2 update site of the XPages artifacts and Notes.jar, suitable for use with Maven+Tycho and any other tool (like Eclipse) that gets its dependencies out of a p2 repository. Unfortunately, it hasn't been updated since its initial release, and contains just the original 9.0.1 versions.

To aid with creating updated versions of that, I created the generate-domino-update-site tool a while back. Since no one outside HCL can legally share update sites themselves, the tool is the next-best thing: point it at Notes or Domino and it'll make one for you in a consistent way.

With either of those routes, though, there's still a gotcha: each developer needs to set up the update site for themselves, and it's only consistent across projects because the community settled on the notes-platform Maven property as a URI pointing to the update site. This is as opposed to something like Eclipse-the-IDE's repositories, which (by virtue of being open-source) are publicly available and can be referenced freely.

Overall, it's a drag having to bring-your-own-site, but at least the use of notes-platform as a pseudo-standard smooths it out.
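
To make that idiom concrete: each developer defines the property in their settings.xml, and each project references it as a p2 repository, which Tycho treats as target-platform content. A sketch, with the URL standing in for wherever your generated site lives:

<!-- in each developer's settings.xml -->
<notes-platform>file:///home/someuser/java/domino-update-site</notes-platform>

<!-- in the project pom.xml -->
<repositories>
    <repository>
        <id>notes-platform</id>
        <url>${notes-platform}</url>
        <layout>p2</layout>
    </repository>
</repositories>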

Non-OSGi Projects

Things get stickier with non-OSGi projects, though. With OSGi projects, the dependency mechanism lines up with the way the artifacts are delivered from the vendor: they all have OSGi metadata (or have a ready-made hook for it, like Notes.jar) and so just making a p2 site out of them makes them ready to go. They don't, though, have Maven metadata, and so referencing them that way takes extra processing.

I've gone about this two ways to date:

  • The aforementioned update site project also has a mechanism for "Mavenizing" update sites. You point the tool at an existing p2 site (like one created by the first step), pick a groupId for it, and it'll install the files into your local repository.
  • The P2 Maven Resolver plugin, which cuts out that middle step and uses a p2 repository as a source of Maven dependencies directly. This route is more "clever", but some tools get a little shaky with it.

Either way, the experience is okay but not perfect. There are some oddities to do with the different dependency mechanisms between OSGi and Maven, but overall it gets the job done.

The core trouble with it is that it's even less consistent across developers/projects than the Tycho notes-platform idiom. I've personally gone through a couple iterations of the Mavenized layout, with different inter-dependency schemes and groupIds. That leads to drift and incompatibility among projects. For example, I use the xpages-runtime project for client work to do my lingering XPages development, and there's some friction in keeping the dependency schemes between that and the client project in line, even though I'm the only developer.

What I'd Like

What I'd really like would be an official HCL-provided or -sanctioned repository for p2 and Maven use for these artifacts. I've pitched the idea of OpenNTF hosting this, since I already have the tools and servers on hand, though we'd have to come up with a way to agree about who is legally allowed to access it. All the better would be consistently-updated HCL-hosted repositories, where they could link access to one of the various HCL accounts we tend to have.

The best route would be to publish it on a repository that doesn't require authentication. While I'm making wishes, attaching Javadoc would be a classy touch too.

Anyway, that's the gist of it. It's one of the two main thorns in my side when doing Domino-targeted development (the other being initializing the runtime itself in the process), and it'd save me a whole lot of heartache if it had a proper solution.

Quick Tip: JDK Null Annotations for Eclipse

Thu Dec 10 15:17:18 EST 2020

  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse
  3. The Joyful Utility of Optionals in Java

A few years back, I more-or-less found the religion of null analysis, and I've stuck with it with at least my larger projects.

One of the sticking points all along, though, has been Eclipse's lack of knowledge about what code not annotated with nullness annotations does, with the biggest blind spot being the JDK itself. For example, take this bit of code:

BigDecimal foo = BigDecimal.valueOf(10).add(BigDecimal.ONE);

That will never throw a NullPointerException, but, since BigDecimal#valueOf isn't annotated at all, Eclipse doesn't know that for sure, and so it may flag it as a potential problem. To deal with this, Eclipse has the concept of external annotations, where you can associate a specially-formatted file with a set of classes and Eclipse will act as if those classes had nullness annotations already.
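
For reference, an external annotation file (".eea") is a plain-text listing of annotated signatures. A sketch of the format from memory, covering the valueOf case above - the "1" after "L" in the second signature marks the return value as non-null:

class java/math/BigDecimal
valueOf
 (J)Ljava/math/BigDecimal;
 (J)L1java/math/BigDecimal;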

Core JDK Annotations

Unfortunately - and as opposed to things like IntelliJ - Eclipse for some reason doesn't ship with this knowledge out of the box. For a while, I just dealt with it, throwing in technically-unnecessary checks around things like Optional#get that are guaranteed to return non-null. The other day, though, I decided to look into it more and found lastNPE.org, which is a community-driven project to provide such external annotations.

Better still, they also provide an Eclipse plugin (404 expected on that link - Eclipse knows what to do with it) to apply rules from your project's Maven configuration to the IDE. This not only covers applying external annotations, but also synchronizing compiler configurations.

Sidebar: The Eclipse Compiler

By default, a Java project is compiled with javac, the stock Java compiler. Eclipse maintains its own compiler, varyingly called ECJ or (as shorthand) JDT. Eclipse's compiler is, unsurprisingly, well-geared towards IDE use, and part of that is that it can flag and process a great many semantic and stylistic issues that the stock compiler doesn't care about. This includes null annotations.

Maven Configuration

With this information in hand, I went to configure my project's Maven build. The first step was to change it to use Eclipse's compiler, since I had recently switched the project away from being Tycho-based (which uses ECJ by default). This can be done by configuring maven-compiler-plugin:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <failOnWarning>false</failOnWarning>
        <compilerId>jdt</compilerId>
        <compilerArguments>
          <properties>${project.basedir}/../../config/org.eclipse.jdt.core.prefs</properties>
        </compilerArguments>
      </configuration>
      <dependencies>
        <dependency>
          <groupId>org.eclipse.tycho</groupId>
          <artifactId>tycho-compiler-jdt</artifactId>
          <version>1.7.0</version>
        </dependency>
      </dependencies>
    </plugin>
  </plugins>
</build>

I use an inline dependency on Tycho's tycho-compiler-jdt to provide the compiler. I stuck with version 1.7.0 for now because Tycho 2.0+ uses a newer core runtime that requires Java 11, which this project can't yet adopt for platform-lag reasons. I also find it useful to set <failOnWarning>false</failOnWarning> here because ECJ throws many more (legitimate) warnings than javac; long-term, it'd be cleanest to fix those warnings and re-enable it.

I also configured Eclipse's compiler settings like I wanted for one of the project's modules, then copied the settings file to a common location. That's where the compilerArguments bit comes from.

Then, I went through the available libraries from lastNPE.org, found the ones that match the libraries we use, and added them as dependencies in my root project:

<properties>
  <lastnpe-version>2.2.1</lastnpe-version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.lastnpe.eea</groupId>
    <artifactId>jdk-eea</artifactId>
    <version>${lastnpe-version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- and so on -->
</dependencies>

Once I updated the project configurations, Eclipse churned for a while, and then I got to work cleaning up the giant pile of new errors and warnings it brought up. As usual with null checks, this was a mix of "oh, nice catch" and "okay, sure, technically, but come on". For example, it flags System.out.println as a potential NPE because System.out is assignable - this is true, but realistically my app's code is going to be the least of my concerns when System.out is set to null.

In any event, I was pleased as punch to find this. Now, I have a way to not only properly check nullness with core classes and common libraries, but it's a way that's shared among the whole project team automatically and enforced at compile time.

OpenNTF Fork of p2-maven-plugin

Sat Nov 14 13:56:27 EST 2020

Tags: maven tycho
  1. Converting Tycho Projects to maven-bundle-plugin, Initial Phase
  2. Winter Project #2: Maven P2 Repository Resolver
  3. OpenNTF Fork of p2-maven-plugin
  4. The Intricate Work of OSGi Dependencies on Domino

It's been one of my long-running goals to reduce my use of Tycho for my work. While Tycho does what it says on the tin, the way PDE works in Eclipse means it's an ongoing nightmare to deal with when I want to do simple things like add a new dependency. This isn't really Tycho's fault as such, and the project itself is making major steps to alleviate some issues, but it's the nature of the surrounding tooling. Even beyond that, the shaky support in IntelliJ and total lack of support in Visual Studio Code and similar editors makes it a real thorn in my side.

Still, though, it brings a lot to the table, particularly when dealing with Domino-targeted projects. Because Domino's OSGi layout is... fiddly, it's often safest to use the "Manifest-first" approach for dependencies, and it's definitely important to still be able to do feature projects and p2 repositories for importing into Designer and Domino.

But I've still been trying to whittle away at the constraints over time, and I got fed up enough yesterday to make some major strides.

The Original Project

One of the major tools in my toolbelt for years has been the p2-maven-plugin, which does a lot of heavy lifting when it comes to taking non-Tycho or non-OSGi-focused projects and making them palatable for an OSGi environment. Even when I don't use it as the backbone of a project, I tend to use it to gather third-party dependencies and process them to make them Domino-friendly.

The Fork

It has its limitations, though, that have kept me from using it to replace the final steps of a Tycho build, and those are the ones that I set out to improve. Yesterday, I forked the project and got to work. Most of my work centered around letting it pull more information out of existing p2 repositories. While it already has some knowledge of such repos, it was still geared heavily towards only using them to pick up a bundle here or there. The big annoyance for me there was that I wanted to bring in entire existing p2-housed features into the final update site.

For example, one of my big projects consumes and redistributes a bunch of upstream projects, such as ODA and the XPages Jakarta EE support. While the p2-maven-plugin made it possible to reference those projects as Maven artifacts or individual bundles, I couldn't do what I wanted and just say "bring X and Y features in, including all their bundles".

I also went in and added a few other niceties needed for Domino: generation of the antiquated "site.xml" file for the NSF Update Site, archiving of the final site for distribution, and so forth.

The Implications

With my changes, I was able to delete all of the feature projects in the tree, which lowers the mental complexity a bit. That also means that the only parts "controlled" by Tycho now are the actual bundle projects, and those have a clear path to de-Tycho-ization. Though doing that will make it a little more difficult to know when dependencies are Domino-suitable ahead of time, the conversion should save a ton of hassle overall.

So now, I have a toolchain that should be able to work together to replace Tycho while still working with the Equinox-heavy target:

  • maven-bundle-plugin to generate the OSGi metadata in META-INF/MANIFEST.MF. I could also use bnd-maven-plugin directly for this and bndtools in Eclipse, but I'm not sure that it'd gain me much in practice
  • generate-domino-update-site to create p2 repositories from post-9.0.1 Domino releases' XPages framework, which remains damnably non-Mavenized
  • p2-layout-resolver to resolve p2-housed artifacts like those from above and OpenNTF projects and make them available as normal-enough Maven dependencies on the fly
  • The forked p2-maven-plugin to generate features and update sites, as well as to repackage existing bundles to be more Domino-friendly

What's missing now is an ability to run compile-time test suites in a true Equinox environment. I'm hemming and hawing on how important that really is, though. The tests I write only rarely expect the presence of OSGi - the main way it comes into play is for extensions, which are papered over by IBM Commons anyway. I've had a delightful time lately running tests of JAX-RS resources with Liberty's dev mode, and I'm pretty sure I saw some examples somewhere of building up and tearing down a scaffolding to run them during compilation, so maybe I'll switch to that anyway.

In any event, just having a tool to do this stuff is a huge weight off my back, and now the goal of a fully-normal-enough Maven project tree is tantalizingly in sight.

Getting Started with the NSF ODP Tooling

Wed Aug 26 10:57:53 EDT 2020

Tags: maven nsfodp
  1. Getting Started with the NSF ODP Tooling
  2. NSF ODP Tooling: Setting Up Jenkins Builds

I've mentioned the NSF ODP Tooling project quite a bit here, and a lot of that is just a reflection of how much use I've gotten out of it and how much time it's been saving me in my regular work.

Part of it is also, though, that I think it should see wider use. I realized that the project can seem off-putting, or reserved only for the lost-in-the-weeds sort of work I do. Generally, when I mention it, it's in the context of a massive project with a bunch of OSGi plugins, or describing the intricate work that went into implementing it.

So I figured this was as good a time as any to describe the simplest-case scenario to get use out of the project: wrapping a normal ODP, without plugins, and then building it into an NSF outside of Designer.

Environment Setup

Domino Installation

To get started, you'll first need either a local Notes/Domino installation or a remote Domino server. Since it involves slightly less local configuration, we'll go with the remote Domino path for now. Download the latest distribution ZIP from the project on OpenNTF (https://openntf.org/main.nsf/project.xsp?r=project/NSF ODP Tooling/releases) and install the update site from the "Domino" directory on your server in the same way you would the OpenNTF Domino API or other XPages library, then restart HTTP.

Maven and Java

The second thing you'll need is a Maven installation locally. If you're running on macOS or Linux, the easiest way to install this is with a package manager, such as Homebrew or apt. On any platform, you can also follow the download and installation instructions from the official Maven site. You'll also need Java installed - nowadays, I use AdoptOpenJDK.

You'll also need a Maven "settings.xml" file to point to your server. If you don't have such a file already, create an ".m2" directory (with the leading dot) in your home directory. This is the same process as in my original Maven setup guide, but with different contents. Configure the contents to look like this:

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <profiles>
        <profile>
            <id>nsfodp</id>
            <properties>
                <!-- the server name can be anything as long as it matches below -->
                <nsfodp.compiler.server>some-server-name</nsfodp.compiler.server>
                <!-- specify the HTTP/HTTPS URL for your Domino server -->
                <nsfodp.compiler.serverUrl>https://some.server/</nsfodp.compiler.serverUrl>
                
                <!-- set to true if you use a self-signed SSL certificate -->
                <nsfodp.compiler.serverTrustSelfSignedSsl>true</nsfodp.compiler.serverTrustSelfSignedSsl>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>nsfodp</activeProfile>
    </activeProfiles>
    
    <servers>
        <server>
            <id>some-server-name</id>
            <!-- Use a Domino HTTP username and password -->
            <username>builduser</username>
            <password>buildpassword</password>
        </server>
    </servers>
</settings>

NSF Project Setup

The core On-Disk Project you create for your NSF is done using the normal Designer source-control. This process hasn't changed over the years; if you're unfamiliar with creating ODPs and working with source control, resources like the NotesIn9 episode remain very useful (though using Mercurial is an odd choice nowadays).

For this example, I just created a new NSF, but you can start with any simple-to-moderate NSF. For now, avoid anything that uses external XPages libraries or platform-specific things like ODBC in LotusScript. Right-click the NSF and go to "Team Development" → "Set Up Source Control for this Application":

Set up source control in Designer

In the following wizard, give it a name (your choice) and uncheck "Use default location". Pick a destination for your created project, but make sure to put it within an "odp" subfolder of your main project folder - that'll be important later.

Source control wizard

I also uncheck "Go to Navigator view after project is created" because I use Package Explorer for this. It wouldn't hurt to use the Navigator view, though - it's basically the same idea.

At this point, you can close out of Designer if you want - it won't be needed for the rest of this.

Maven Project Setup

Create a new text file called "pom.xml" and put it in the project folder, next to the "odp" directory.

pom.xml placement
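
The resulting layout is simply the project folder - named "nsfodp-example" here to match the pom below - with the ODP next to the build descriptor:

nsfodp-example/
    odp/
    pom.xml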

Set its contents to this:

<?xml version="1.0"?>
<project
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
    xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>nsfodp-example</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>domino-nsf</packaging>

    <pluginRepositories>
        <pluginRepository>
            <id>artifactory.openntf.org</id>
            <name>artifactory.openntf.org</name>
            <url>https://artifactory.openntf.org/openntf</url>
        </pluginRepository>
    </pluginRepositories>

    <build>
        <plugins>
            <plugin>
                <groupId>org.openntf.maven</groupId>
                <artifactId>nsfodp-maven-plugin</artifactId>
                <version>3.1.0</version>
                <extensions>true</extensions>
            </plugin>
        </plugins>
    </build>
</project>

In a terminal window, go to the project directory (the one containing this "pom.xml") and run mvn install. After a bit of churning, you should see some output ending like this:

[INFO] --- nsfodp-maven-plugin:3.1.0:compile (default-compile) @ nsfodp-example ---
[INFO] Compiling ODP
[INFO] Installing bundles
[INFO] - Installed no bundles
[INFO] Creating destination NSF
[INFO] Importing DB properties
[INFO] Importing basic design elements
[INFO] Importing file resources
[INFO] Importing LotusScript libraries
[INFO] Uninstalling bundles
[INFO] org.openntf.nsfodp.compiler.equinox.CompilerApplication#end
[INFO] Generated NSF: /Users/jesse/Projects/nsfodp-example/target/nsfodp-example-1.0.0-SNAPSHOT.nsf
[INFO]
[INFO] --- maven-install-plugin:3.0.0-M1:install (default-install) @ nsfodp-example ---
[INFO] Installing /Users/jesse/Projects/nsfodp-example/target/nsfodp-example-1.0.0-SNAPSHOT.nsf to /Users/jesse/.m2/repository/com/example/nsfodp-example/1.0.0-SNAPSHOT/nsfodp-example-1.0.0-SNAPSHOT.nsf
[INFO] Installing /Users/jesse/Projects/nsfodp-example/pom.xml to /Users/jesse/.m2/repository/com/example/nsfodp-example/1.0.0-SNAPSHOT/nsfodp-example-1.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  9.346 s
[INFO] Finished at: 2020-08-26T10:29:10-04:00
[INFO] ------------------------------------------------------------------------

The specifics will change a bit based on your system, but the main things are to see those "Compiling" and "Importing" lines followed by the "BUILD SUCCESS" banner at the end. If you look in your project directory, you'll see some generated support files and, within the "target" directory, the built NSF:

Build results

Conclusion

And that's it! Probably, at least. You can use this with most classic Notes apps and with XPages apps that just use the built-in components and JARs inside the NSF. Things can get more complex from there, and the repository contains an example of an XPages application that uses an OSGi-based library.

I plan to go into some of those details in future posts. In addition, I will demonstrate how to do this compilation in Jenkins, which allows you to have the NSF built automatically whenever you or someone else on your team commits a change to source control.

Executing a Complicated OSGi-NSF-Surefire-NPM Build With Docker

Thu Aug 13 14:42:58 EDT 2020

  1. Weekend Domino-Apps-in-Docker Experimentation
  2. Executing a Complicated OSGi-NSF-Surefire-NPM Build With Docker
  3. Getting to Appreciate the Idioms of Docker

The other month, I got my feet wet with Docker after only conceptually following it for a long time. With that, I focused on getting a basic Jakarta EE app up and running with an active Notes runtime by way of the official Domino-on-Docker image provided by HCL.

Since that time, I'd been mulling over another use for it: having it handle the build process of my client's sprawling app. This started to become a more pressing desire thanks to a couple of factors:

  1. Though I have the build working pretty well on Jenkins, it periodically blocks indefinitely when it tries to launch the NSF ODP Compiler, presumably due to some sort of contention. I can go in and kill the build, but that's only when I notice it.
  2. The project is focusing more on an Angular-based UI, with a distinct set of programmers working on it, and the process of keeping a consistent Domino-side development environment up and running for them is a real hassle.
  3. Setting up a new environment with a Notes runtime is a hassle even for in-the-weeds developers like me.

The Goal

So I set out to use Docker to solve this problem. My idea was to write a script that would compose a Docker image containing all the necessary base tools - Java, Maven, Make for some reason, and so forth - bring in the Domino runtime from HCL's image, and add in a standard Notes ID file, names.nsf, and notes.ini that would be safe to keep in the private repo. Then, I'd execute a script within that environment that would run the Maven build inside the container using my current project tree.

The Dockerfile

Since I'm still not fully adept at Docker, it's been a rocky process, but I've managed to concoct something that works. I have a Dockerfile that looks like this (kindly ignore all cargo-culting for now):

FROM maven:3.6.3-adoptopenjdk-8-openj9
USER root

# Install toolchain files for the NPM native components
RUN apt update
RUN apt install -y python make gcc g++ openssh-client git

# Configure the Maven environment and permissive root home directory
COPY settings.xml /root/.m2/
COPY build-app.sh /
RUN mkdir -p /root/.m2/repository
RUN chmod -R 777 /root

# Bring in the Domino runtime
COPY --from=domino-docker:V1101_03212020prod /opt/hcl/domino/notes/11000100/linux /opt/hcl/domino/notes/latest/linux
COPY --from=domino-docker:V1101_03212020prod /local/notesdata /local/notesdata

# Some LotusScript libraries use an all-caps name for lsconst.lss
RUN ln -s lsconst.lss /opt/hcl/domino/notes/latest/linux/LSCONST.LSS

# Copy in our stock Notes ID and configuration files
COPY notesdata/* /local/notesdata/

# Prepare a permissive data environment
RUN chmod -R 777 /local/notesdata

The gist here is similar to my previous example, where it starts from the baseline Maven package. One notable difference is that I switched away from the -alpine variant I had inherited from my original Codewind example: I found that I would encounter "npm: not found" errors during the frontend build process, and discovered that this had to do with the starting Linux distribution.

The rest of it brings in the core Domino runtime and data directory from the official image, plus my pre-prepared Maven configuration. It also does the fun job of symlinking "lsconst.lss" to "LSCONST.LSS" to account for the fact that some of the LotusScript in the NSFs was written to assume Windows and refers to the include file by that name, which doesn't fly on a case-sensitive filesystem. That was a fun one to track down.

The build-app.sh script is just a shell script that runs several Maven commands specific to this project.

The Executor Script

The other main component is a Bash script, ./build.sh:

#!/usr/bin/env bash

set -e

mkdir -p ~/.m2/repository
mkdir -p ~/.ssh

# Clean any existing NPM builds
rm -rf ../app-ui/*/node_modules
rm -rf ../app-ui/*/dist

# Set up the Docker workspace
rm -rf scratch
mkdir -p scratch/builder
cp maven/* scratch/builder/
cp -r notesdata-server scratch/builder/notesdata

# Build the image and execute a Maven install
docker build scratch/builder -f build.Dockerfile -t app-build
docker run \
    --mount type=bind,source="$(pwd)/..",target=/build \
    --mount type=bind,source="$HOME/.m2/repository",target=/root/.m2/repository \
    --mount type=bind,source="$HOME/.ssh",target=/root/.ssh \
    --rm \
    --user $(id -u):$(id -g) \
    app-build \
    sh /build-app.sh

This script ensures that some common directories exist for the user, clears out any built Node results (useful for a local dev environment), copies configuration files into an image-building directory, and builds the image using the aforementioned Dockerfile. Then, it executes a command to spawn a temporary container using that image, run the build, and delete the container when done. Some of the operative bits and notes are:

  • I'm using --mount here maybe as opposed to --volume because I don't know that much about Docker. Or maybe it's the right one for my needs? It works, anyway, even if performance on macOS is godawful currently
  • I bring in the current user's Maven repository so that it doesn't have to regenerate the entire world on each build. I'm going to investigate a way to pre-package the dependencies in a cacheable Maven RUN command as my previous example did, but the sheer size of the project and OSGi dependencies tree makes that prohibitive at the moment
  • I bring in the current user's ~/.ssh directory because one of the NPM dependencies references its dependency via a GitHub SSH URL, which is insane and bad but I have to account for it. Looking at it now, I should really mark that one read-only
  • The --rm is the part that discards the container after completing, which is convenient
  • I use --user to specify a non-root user ID to run the build, since otherwise Docker on Linux ends up making the target results root-owned and un-deletable by Jenkins. This is also the cause of all those chmod -R 777 ... calls in the Dockerfile. There are gotchas to keep in mind when doing this

Miscellaneous Other Configuration

To get ODP → NSF compilation working, I had to make sure that Maven knew about the Domino runtime. Fortunately, since it'll now be consistent, I'm able to make a stock settings.xml file and copy that in:

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
	<profiles>
		<profile>
			<id>notes-program</id>
			<properties>
				<notes-program>/opt/hcl/domino/notes/latest/linux</notes-program>
				<notes-data>/local/notesdata</notes-data>
				<notes-ini>/local/notesdata/notes.ini</notes-ini>
			</properties>
		</profile>
	</profiles>
	<activeProfiles>
		<activeProfile>notes-program</activeProfile>
	</activeProfiles>
</settings>

Those three are the by-convention properties I use in the NSF ODP Tooling and my Tycho-run test suites to pass information along to initialize the Notes process.

Future Improvements

The main thing I want to improve in the future is getting the dependencies loaded into the image ahead of time. Currently, in addition to sharing the local Maven repository, the command brings in not only the full project structure but also the app-dependencies submodule we use to store giant blobs of p2 sites needed by the build. The "Docker way" would be to compose these in as layers of the image, so that I could skip the --mount bit for them but have Docker's cache avoid the need to regenerate a large dependencies image each time.
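
A sketch of that approach, assuming a typical multi-module layout (the paths here are hypothetical): copy in just the POMs first so that Docker only rebuilds the dependency layer when a POM changes, then bring in the rest of the tree afterward:

# Hypothetical layer-caching step: only re-run when a POM changes
COPY pom.xml /build/pom.xml
COPY core/pom.xml /build/core/pom.xml
RUN mvn -f /build/pom.xml dependency:go-offline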

I'd also like to pair this with app-runner Dockerfiles to launch the webapp variants of the XPages and JAX-RS projects in Liberty-based containers. Once I get that clean enough, I'll be able to hand that off to the frontend developers so that they can build the full app and have a local development environment with the latest changes from the repo, and no longer have to wonder whether one of the server-side developers has updated the Domino server with some change. Especially when that server-side developer is me, and it's Friday afternoon, and I just want to go play Baba Is You in peace.

In the mean time, though, it works, and works in a repeatable way. Once I figure out how to get Jenkins to read the test results of a freestyle project after the build, I hope to replace the Jenkins build process with this script, which should both make the process more reliable and allow me to run multiple simultaneous builds per node without worrying about deadlocking contention.

NSF ODP Tooling 3.1.0: Dynamically Including Web Resources

Fri Jul 17 14:10:24 EDT 2020

  1. XPages: The UI Toolkit and the App Framework
  2. The RuntimeEnvironment Idiom
  3. NSF ODP Tooling 3.1.0: Dynamically Including Web Resources

I just released version 3.1.0 of the NSF ODP Tooling project and, while I entirely forgot to make a blog post about 3.0 the other week, I think one of the additions in this one deserves some special mention.

In one of my client projects, we're replacing an old XPages-based UI with an Angular UI backed by our set of JAX-RS resources. This is part of the same sprawling client app I've mentioned a few times so far, but this is a new module within it and doesn't face the same "convert from XPages mid-flight" remit. Since the UI itself is just going to be a bunch of static resource files, that freed up our options for presenting it to the user. In order to keep the benefits of using Domino ACLs, I figured that wrapping it up in an NSF would be the way to go.

The way to do this is to bring your (potentially-transpiled) HTML/JS/CSS files into the WebContent folder in the NSF's Package Explorer representation, either manually or by coaxing Designer to sync it in for you.

My purpose in life is to eliminate Designer from existence, though, so I certainly couldn't be content with that. Instead, I adapted a Maven-based technique for building WAR-packaged JS apps to emit an NSF.

The Project Structure

From that "Targeting Domino for Webapps Incidentally" post, the pertinent part is the use of maven-frontend-plugin to kick off an NPM build of the web app. In that post, I put the JavaScript project files inside a Maven project of their own, but that's optional. In my client's case, the JS team is separate from the Java team, so I didn't want to force them to have to dig through the Maven project tree to get to their files, and the JS apps are in a separate top-level folder in the repository. The simplified structure looks like this:

  • Repository Root
    • ui-projects
      • someuiproject
    • nsfodp-project

My goal is to be able to kick off a Maven build, have it run the NPM build of the JS project in its separate directory, and then pull in the results for the final NSF, all automatically.

The Maven Configuration

By combining frontend-maven-plugin and the NSF ODP Tooling, that's exactly what I get. Here's the <build> section of the ODP project's pom:

<build>
  <plugins>
    <plugin>
      <groupId>com.github.eirslett</groupId>
      <artifactId>frontend-maven-plugin</artifactId>
      <version>1.10.0</version>
  
      <configuration>
        <nodeVersion>v14.3.0</nodeVersion>
        <npmVersion>6.14.4</npmVersion>
        <installDirectory>target</installDirectory>
      </configuration>
        
      <executions>
        <execution>
          <?m2e ignore?>
          <id>install node and npm</id>
          <goals>
            <goal>install-node-and-npm</goal>
          </goals>
          <phase>generate-resources</phase>
        </execution>
        
        <execution>
          <?m2e ignore?>
          <id>jsapp install</id>
          <goals>
            <goal>npm</goal>
          </goals>
          <phase>generate-resources</phase>
          <configuration>
            <workingDirectory>${project.basedir}/../ui-projects/someuiproject</workingDirectory>
          </configuration>
        </execution>
        <execution>
          <?m2e ignore?>
          <id>jsapp build</id>
          <goals>
            <goal>npm</goal>
          </goals>
          <phase>generate-resources</phase>
          <configuration>
            <workingDirectory>${project.basedir}/../ui-projects/someuiproject</workingDirectory>
            <arguments>run build</arguments>
          </configuration>
        </execution>
      </executions>
    </plugin>
  
    <plugin>
      <groupId>org.openntf.maven</groupId>
      <artifactId>nsfodp-maven-plugin</artifactId>
      <version>3.1.0</version>
      
      <configuration>
        <webContentResources>
          <webContentResource>
            <directory>${project.basedir}/../ui-projects/someuiproject/dist/app</directory>
          </webContentResource>
        </webContentResources>
      </configuration>
    </plugin>
  </plugins>
</build>

Now, the final result will be an NSF with whatever other design elements are needed, ready to be deployed with a design replace/refresh. In my client's case, that ends up also getting bundled up into the distribution ZIP, but in a basic case the NSF would be enough.

Writing the XSP Transpiler Maven Plugin

Thu Jul 09 10:33:44 EDT 2020

Tags: maven xpages

When I was first getting my XPages webapp support project into workable shape, I was faced with the immediate problem of translating XSP source into a usable form. Though the XPages core contains both the code for translating XSP source to Java and the loader that executes the compiled Java classes, they're best thought of as two disjoint components in a larger toolchain. Designer uses the translator to create Java source, which it then compiles into .class files like any other Java source. At runtime, the framework uses the CompiledPageDriver implementation of the FacesPageDriver interface to look for these compiled classes, translating page names like Foo.xsp to class names like xsp.Foo, loading them with the active classloader, and calling their methods to emit the UIComponent tree.

The fact that XSP is transformed to Java and then bytecode is incidental, though: the FacesPageDriver interface only requires outputting some object that can build page trees. I've tinkered a bit with building on the Bazaar's existing dynamic-interpretation code to go directly from XSP to the tree of UIComponents, but there are a lot of fiddly details. Onerous as it may be, the translation+compilation process covers all of the edge cases that may show up.

The translation process requires a classpath populated with both the XPages core code and any libraries you have, since libraries are defined as dynamic Java classes and not, for example, statically-readable XML configuration files (there are XML files in there, but they're only identified by the Java class). Designer deals with this by making you install XPages libraries into your runtime: the classes have to be present in the Eclipse environment for Designer to be able to identify and load them. That works, but it's onerous and not practical for my uses.

Runtime Compilation

The tack I took initially with the webapp support was to write a FacesPageDriver implementation that translates XSP to Java and then compiles those classes on the fly. This has the distinct advantage of having the entire running app going, so all libraries and control definitions are available. There's overhead on first load for each page, especially for complicated ones, but subsequent loads are as speedy as the precompiled route.

Incidentally, this is basically how JSPs work in normal app servers: the JSP source is included in the .war file, and then it's translated into a Servlet implementation Java class and compiled on the fly.

Maven Compilation

Still, I really wanted to avoid having the app have to translate and compile on the fly. While it works, it's wasteful and adds noticeably to the initial load time of a freshly-deployed instance.

My goal was to do this compilation process during Maven compilation - independent of any particular IDE. The trouble there is that there's still a hard requirement on having the actual app class environment available so that library classes can be resolved. It's not enough to just solve the problem of including XPages artifacts as Maven dependencies, since that wouldn't account for using e.g. ODA in an app.

My original tack for this was to do what I do in the NSF ODP Tooling: create an Equinox environment containing the app and its dependencies, and then execute the transpilation in there. I even went so far as to implement it, though it's essentially an undocumented feature of 3.0 and above. This didn't sit quite right with me, though. For one, it's kind of outside of the Tooling's bailiwick: while it certainly does XSP compilation as part of the overall NSF assembly, it's really a distinct activity. Moreover, though, loading a whole Equinox environment is fiddly and unnecessarily requires a Notes runtime to be configured along with it.

So I took a second pass at it in the xpages-runtime project, and this has been working out well. I realized that I didn't need to have all of the app's classes available to the Maven plugin, nor did I need to spawn a whole second process. I could instead construct something of a jail ClassLoader to house the process. I build a ClassLoader based on the project's dependency tree (which inherently includes the required XSP core classes), copy in a transpiler implementation, and execute the process reflectively. This means that the whole thing can happen in-process and without a special Notes runtime, just like a normal Maven plugin.
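
A minimal sketch of that idea, assuming it runs inside a Mojo's execute() method with the usual injected MavenProject (exception handling is elided, and the transpiler class and its method are hypothetical stand-ins for the real implementation):

import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

import org.apache.maven.artifact.Artifact;

// ...inside execute(), where xspSourceDirectory and javaOutputDirectory
// are plugin parameters:
List<URL> urls = new ArrayList<>();
for (Artifact dep : project.getArtifacts()) {
    // every resolved dependency JAR joins the "jail" classpath
    urls.add(dep.getFile().toURI().toURL());
}
try (URLClassLoader jail = new URLClassLoader(urls.toArray(new URL[0]),
        ClassLoader.getSystemClassLoader())) {
    // load and drive the transpiler reflectively inside the jail
    Class<?> transpilerClass = jail.loadClass("com.example.XspTranspiler"); // hypothetical
    Object transpiler = transpilerClass.getConstructor().newInstance();
    transpilerClass.getMethod("transpile", Path.class, Path.class)
        .invoke(transpiler, xspSourceDirectory, javaOutputDirectory);
}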

Better still, I use the BuildContext apparatus to identify changes, and Eclipse's m2e hooks into this. In this way, I get essentially incremental compilation that fires off whenever you modify a .xsp file inside Eclipse, giving the same kind of experience you get in Designer (with less crashing). In both Maven and Eclipse, the actual Java → bytecode compilation happens with the normal compiler: I just drop the generated source files in the right place, tell the project about the generated source folder, and let it do its thing.

All in all, I'm pretty pleased with how this turned out. It's still primarily useful for the type of development workflow I've set up personally, but it's definitely had a noticeable impact on the modify-deploy-run cycle. For a moribund stack that I'm actively working away from, I've built myself a pretty-respectable toolshed.

Targeting Domino for Webapps Incidentally

Tue Feb 11 17:26:38 EST 2020

Tags: java maven

I recently had occasion to break ground on a new web project that uses a Notes runtime and has a web front end, and I figured it would be a perfect occasion to structure it in a way that is clean, portable, and, while it will run on Domino, doesn't have to use Tycho.

I ended up coming up with a setup that I'm pretty happy with, and so I put up an example on GitHub for anyone else to use as a reference for similar cases.

What Is This, Specifically?

This is an application that consists of a couple main concepts:

  • Maven for project structure and dependencies
  • Core "plain Java" module that contains code that's intended to be portable and doesn't even know it's in a web app
  • JAX-RS-based REST API
  • Client JS web UI written in Stencil and transpiled with Node
  • Standard webapp project for JEE containers such as Liberty
  • Domino project to wrap the app up as an OSGi bundle

What this is specifically not is an XPages project. And, while it can use a Notes runtime and access NSFs, it's also not something that will be stashed inside an NSF, and the "Notes" part is optional and really only included here to show it's possible. The idea is that this is a standard web app first and a Domino thing second.

Project Structure

The project is organized as a Maven module tree like so:

  • domino-webapp: The parent container project just for configuration
    • core
      • webapp-core: This is the main place for UI-independent business logic
    • web
      • webapp-api-jaxrs: This contains the JAX-RS-based REST API, which exposes the core business logic to the web
      • webapp-webui: This contains a Stencil-based JavaScript app. It doesn't need to be Stencil specifically, or even NPM-based at all, but I find Stencil to be a pretty good choice for this
      • webapp-jee: This is the JEE-container web app, containing very little code of its own and just intended to output a WAR
    • domino
      • webapp-domino: This is the Domino equivalent to the previous project, but contains a chunk of adapter code to get things working, plus some Maven configuration to generate an appropriate OSGi bundle
      • webapp-dist-domino: This is a distribution project that pulls in the Domino OSGi bundle and creates a p2 repository, and then a "site.xml" file for the benefit of importing into an NSF Update Site

How the OSGi Part Works

In going deeper into what's going on, I'm going to start at the end: how to go from a normal web app to a Domino-friendly OSGi bundle. If you're not familiar with what I mean by "web app" in general and in a Domino plugin in particular, it's the sort of thing that Sven Hasselbach wrote a series about a few years back: a Java/Jakarta EE Servlet application using the "WebContainer" extension point in the Domino HTTP runtime.

Traditionally, these projects are built as plain-old Eclipse projects, where you drop a bunch of JARs for your framework of choice into a plug-in project and write your code in there, using Eclipse's Plug-in Development Environment. This works well enough as far as it goes, but puts constraints on how you do development, in particular pretty much requiring Tycho if transitioned to a Maven structure, which would then have massive penalties for the rest of your project.

Fortunately, the thing about an OSGi bundle is that it's really just a JAR file with special metadata, and so it doesn't actually have to be created with a toolchain that has full knowledge of OSGi. As long as the required files end up in the right places inside the JAR (which is in turn just a ZIP file), you're good to go.

In this case, I used the maven-bundle-plugin to decorate the "MANIFEST.MF" file with appropriate OSGi metadata and, importantly, to embed all the compile-scoped project dependencies for me. That second part means that Maven will handle the job of steps 7-10 in Sven's example: it'll bring in the dependencies from Maven, copy them into the right place in the final JAR, and set up the Bundle-ClassPath header to point to them.

It's important to note the "compile-scoped" qualifier there. The Maven projects themselves also depend on a couple things that I know will be present on Domino already, namely IBM Commons, Apache Wink, the Web Container adapter, and Notes.jar. Though it'd probably work if I copied those into the JAR, that would be asking for trouble unnecessarily, so I mark them as "provided" in Maven, and then the bundling process knows to skip over them.

The other OSGi-specific element is the "plugin.xml" file, used by Domino's Equinox framework to identify that the bundle provides a web app. In this case, I put that file in "src/main/resources", where it ends up being copied to the root of the JAR. One down side here is that you have to know ahead of time what the syntax for this file is: since Eclipse won't know this is a plug-in project, you won't get the GUI shown in Sven's example.
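
From memory, that file looks something like the below - treat the context root and content location as placeholders for your own app:

<?xml version="1.0" encoding="UTF-8"?>
<plugin>
    <extension point="com.ibm.pvc.webcontainer.application">
        <contextRoot>/exampleapp</contextRoot>
        <contentLocation>WebContent</contentLocation>
    </extension>
</plugin>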

There are some other Domino-specific considerations, but I'll return to them later. For now, those parts will cover the OSGi "bridge".

Core: Using the Notes API

The core project doesn't have a lot going on, and that's intentional. It does, though, demonstrate how you can use the JSON-B API for JSON serialization and the Notes API for accessing NSFs and other Notes stuff.

The important parts happen in the project dependencies. The first one is simple: I want to use the JSON-B API, but I want to declare that it will be provided one way or another by the environment. The second one includes Notes.jar by way of my P2 Repository Provider, since it's still not available as a normal Maven dependency.

This project contains a single class, which just gathers a bit of information about the runtime environment to be shown as a JSON object. The important part here is my use of NotesThread when calling the Notes API. Since this project can run on non-Domino containers, I can't assume that all threads will already be Notes-friendly, so I use that route. You can also call NotesThread.sinitThread() or go other ways, but I like containing the calls into a separate thread outright in simple cases.
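
A minimal sketch of that pattern (the class and method here are my own illustration, with exception handling simplified; NotesThread, NotesFactory, and Session come from Notes.jar):

import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;

public class EnvironmentInfo {
    public static String fetchServerName() throws InterruptedException {
        String[] result = new String[1];
        // NotesThread handles native thread init/term for the Notes runtime
        NotesThread thread = new NotesThread(() -> {
            try {
                Session session = NotesFactory.createSession();
                try {
                    result[0] = session.getServerName();
                } finally {
                    session.recycle();
                }
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
        thread.start();
        thread.join();
        return result[0];
    }
}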

JAX-RS

The JAX-RS project is intended to contain JAX-RS configuration and resource classes, and the immediate part to note is once again the dependency set. Here, I targeted specifically JAX-RS 1.1, which is quite old, but is provided by Apache Wink on all Domino installations. I could theoretically bring in RESTEasy for a newer spec version, but 1.1 is capable enough for now and it keeps things simpler.

In the Application implementation class, I enumerate all of the resource classes used in the app. This is equivalent to the text-file-based method common in Wink apps, but it's portable across JAX-RS implementations and has the side benefit of being compiler-checked. Though it's a step up from the old Wink way, it's a big step down from the modern JAX-RS way: in newer containers, you can just let the container find your resources automatically by looking for annotated classes. That doesn't fly on Domino, though, and, while you can hack in something roughly equivalent, it's simpler for now to enumerate the classes explicitly and remember to add them to this list.

There are only two resources here: a Hello World resource and one to ferry the ServerInfo object out using the JAX-RS environment's JSON serializer (more on that in a bit).
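For illustration, a minimal sketch of such an Application class might look like this - the resource class names match the two just mentioned, but are otherwise invented:

import java.util.HashSet;
import java.util.Set;

import javax.ws.rs.core.Application;

public class ExampleApplication extends Application {
	@Override
	public Set<Class<?>> getClasses() {
		Set<Class<?>> classes = new HashSet<>();
		// Each resource class must be listed manually - Domino's JAX-RS 1.1
		// runtime won't discover annotated classes on its own
		classes.add(HelloWorldResource.class);
		classes.add(ServerInfoResource.class);
		return classes;
	}
}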

The Web UI

The web UI project is complicated, but mostly because NPM-based JavaScript development is complicated. This example uses Stencil, which I quite like, but you can use whatever you'd like: React, Angular, just plain ol' HTML, or whatever.

The important parts here are the use of frontend-maven-plugin to create a Node+NPM environment and build the app and the specific configuration to put the output into "src/main/resources/META-INF/resources". Doing this means that, when this project is wrapped up into a Java-less JAR file, the web resources will be in the "META-INF/resources" directory, which is special on Servlet 3 and above. Any files in there in dependency JARs like this will be visible as if they were in the main web content of your web app.
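The plugin configuration amounts to something like this sketch - versions are examples, and the part that actually points Stencil's output at "src/main/resources/META-INF/resources" lives in the Stencil project config rather than the POM:

<plugin>
	<groupId>com.github.eirslett</groupId>
	<artifactId>frontend-maven-plugin</artifactId>
	<version>1.9.1</version>
	<executions>
		<execution>
			<id>install-node-and-npm</id>
			<goals>
				<goal>install-node-and-npm</goal>
			</goals>
			<configuration>
				<nodeVersion>v12.16.1</nodeVersion>
			</configuration>
		</execution>
		<execution>
			<id>npm-install</id>
			<goals>
				<goal>npm</goal>
			</goals>
		</execution>
		<execution>
			<id>npm-build</id>
			<goals>
				<goal>npm</goal>
			</goals>
			<configuration>
				<!-- Runs the project's build script; the Stencil config
				     directs its output into META-INF/resources -->
				<arguments>run build</arguments>
			</configuration>
		</execution>
	</executions>
</plugin>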

JEE App

The Jakarta EE app is the simplest of the bunch, and the only actual class in there only exists for example purposes.

The work, such as it is, all happens in the Maven configuration. I declare it to be war-packaged, to not complain if there's no "web.xml" file, to bring in the project dependencies, and to specifically include IBM Commons. It also brings in Notes.jar as a compile-time dependency.
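In POM terms, the notable parts are just a couple of elements - this is a trimmed-down sketch, not the full file:

<packaging>war</packaging>
...
<build>
	<plugins>
		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-war-plugin</artifactId>
			<configuration>
				<!-- Servlet 3+ apps don't require a web.xml -->
				<failOnMissingWebXml>false</failOnMissingWebXml>
			</configuration>
		</plugin>
	</plugins>
</build>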

The Domino Shims

Back in the Domino module, it's time to talk about the non-OSGi parts. I've mentioned a few things above that require no configuration in a modern web container, but which will require a bit of legwork in Domino. These are generally related to the fact that Domino's servlet container is version 2.4 and it has no idea about newer standards.

  • I bring in a Eclipse Yasson dependency to provide JSON-B support.
    • To bind that to JAX-RS, I wrote a Provider class that knows how to turn any Java object into JSON when a resource says it wants to output JSON (sketched after this list).
    • To register that provider (since it can't be picked up automatically), I subclass the Application class to include it specifically.
  • The ResourcesServlet servlet mimics the Servlet 3 behavior of serving resources out of "META-INF/resources". This specific implementation isn't the best, since it doesn't provide any caching, but it gets the job done and means that the web UI JAR will work the same way on both targets.
  • The RootServlet servlet extends the Wink default REST servlet to shim the ClassLoader around, which avoids a lot of trouble with threads used for web app requests that had previously been used for XPages requests (it's annoying, trust me).
  • I have to include an explicit reference to Wink's JAX-RS provider for some reason to do with bundle class loading.
  • Unlike in the normal web app project, I have to include a "web.xml" file, and this one registers the two servlets above.
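To give a feel for the first point above, here's a condensed sketch of what such a JSON-B provider might look like - the class name is invented, and a real implementation would want more care around content types and exception handling:

import java.io.IOException;
import java.io.OutputStream;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import java.nio.charset.StandardCharsets;

import javax.json.bind.Jsonb;
import javax.json.bind.JsonbBuilder;
import javax.ws.rs.Produces;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;
import javax.ws.rs.ext.MessageBodyWriter;
import javax.ws.rs.ext.Provider;

@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonBindingProvider implements MessageBodyWriter<Object> {
	private static final Jsonb JSONB = JsonbBuilder.create();

	@Override
	public boolean isWriteable(Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType) {
		return MediaType.APPLICATION_JSON_TYPE.isCompatible(mediaType);
	}

	@Override
	public long getSize(Object t, Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType) {
		// Per the JAX-RS contract, -1 means "length unknown"
		return -1;
	}

	@Override
	public void writeTo(Object t, Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType,
			MultivaluedMap<String, Object> httpHeaders, OutputStream entityStream)
			throws IOException, WebApplicationException {
		entityStream.write(JSONB.toJson(t).getBytes(StandardCharsets.UTF_8));
	}
}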

Domino Update Site

The second part of the Domino target is the distribution project, which uses the p2-maven-plugin to create a P2 repository. That plugin is a splendid tool for your toolbox and has a lot of capabilities for auto-OSGi-ifying otherwise-non-OSGi projects. In this case, I just want to include the Domino project from the previous step, but I also want to generate an Eclipse feature for it so that it can be imported into an NSF Update Site with some proper metadata.

I also use the p2sitexml-maven-plugin, which takes the newer-style P2 site generated by the previous step and adds a "site.xml" file, which is needed by the NSF Update Site import process if you want to include categories, which I think are nice.
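Condensed heavily (and omitting the feature-generation configuration), the two plugins chain together something like this - the IDs and versions are illustrative:

<plugin>
	<groupId>org.openntf.maven</groupId>
	<artifactId>p2-maven-plugin</artifactId>
	<executions>
		<execution>
			<id>generate-p2-site</id>
			<phase>package</phase>
			<goals>
				<goal>site</goal>
			</goals>
			<configuration>
				<artifacts>
					<artifact>
						<id>com.example:example-webapp-domino:${project.version}</id>
					</artifact>
				</artifacts>
			</configuration>
		</execution>
	</executions>
</plugin>
<plugin>
	<groupId>org.darwino</groupId>
	<artifactId>p2sitexml-maven-plugin</artifactId>
	<executions>
		<execution>
			<goals>
				<goal>generate-site-xml</goal>
			</goals>
			<configuration>
				<category>Example App</category>
			</configuration>
		</execution>
	</executions>
</plugin>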

Seeing It In Action

To run the app on Domino, you can do a Maven install on the root, install the update site from the distribution project onto Domino, and then visit "/exampleapp/". You'll be greeted by a vision of beauty like this:

Example Webapp Screenshot

Placeholder garishness aside, it shows the Stencil app loading, using the custom favicon, and making a call to the System Info service. That, in turn, shows using the Notes runtime to get the server's distinguished name. It's left as an exercise for the reader to then put in the thousands of hours of work to make a world-class application.

Caveats!

Since this is a Domino thing, there are important caveats.

The first is one I mentioned earlier: because we're restricted to Servlet 2.4/2.5ish, a lot of things just won't work. Indeed, not even all of the 2.4 spec works, as Filters aren't implemented for some reason. Additionally, outside of Servlet and JAX-RS 1.1, you're pretty much in "BYOB" territory when it comes to other JEE specs. In this example, I brought in Yasson for JSON-P and JSON-B and that was pretty simple, but others (say, CDI) would require a lot more fiddly work.

There's also an extra-special caveat when it comes to JSP. Domino's web container knows about JSP, but requires what it calls a "JSP compiler bridge": a special extension that allows for interpreting JSPs inside the special environment it creates. However, it doesn't actually ship with such a bridge. Notes does (and MyFaces too) for what I assume are "social" reasons, but Domino doesn't. You could probably nab the JSP stuff from Notes and drop it onto Domino, but you'd be getting into weird territory. I tried dropping Jasper into the app, but it ran into ClassLoader-casting trouble... hence the bridge, I guess.

Usefulness

Phew! Admittedly, it's a long walk to get to the point where you can just run a web app, and there are quicker ways to get there. However, I do think this is worth it. With this setup, I have a set of Maven projects that work swimmingly in Eclipse and any other Java IDE, an NPM project that acts like any other, and a JEE container front-end for rapid development. No Designer, no NSF syncing, no Plug-in Development Environment, no Tycho. And, though I don't have the full breadth of JEE available to me, JAX-RS is the main one you need for a client-JS app anyway. It's not an appropriate setup for every app, but it's really nice when it fits.

Winter Project #2: Maven P2 Repository Resolver

Sat Dec 28 14:12:26 EST 2019

  1. Converting Tycho Projects to maven-bundle-plugin, Initial Phase
  2. Winter Project #2: Maven P2 Repository Resolver
  3. OpenNTF Fork of p2-maven-plugin
  4. The Intricate Work of OSGi Dependencies on Domino

The second project I took on this past week was related to the first, and also relates to my ongoing struggles with Tycho.

While working on the NSF ODP Tooling, I figured that it could be a good candidate to move away from Tycho and to maven-bundle-plugin or bnd directly. Since I've been Mavenizing the Domino OSGi bundles for a good while now, and the tooling doesn't have any OSGi-dependent tests in it, it seemed like it could go smoothly. Unlike historical precedent, though, my enemy in this endeavor wasn't Domino, but rather Eclipse.

Repository Layouts

One of the big things that makes working with OSGi bundles - at least specifically ones in the Eclipse style - difficult with Maven is that they're generally provided using a repository layout called "p2". This is the evolution of the "site.xml" Update Site style and shares a lot of characteristics. In fact, a p2 repository will often have a "site.xml" file alongside its "artifacts.xml" and "contents.xml" (often Jar'd up) to provide backwards compatibility. It's how we package up XPages plugins and how, once upon a time, IBM provided the XPages artifacts for Tycho use (https://openntf.org/main.nsf/project.xsp?r=project/IBM Domino Update Site for Build Management). As a live example, Eclipse 2019-12 is distributed via such a repository.

Maven has its own repository layout, variously called "Maven", "Maven2", "m2", or just "default". This serves a similar purpose, but is structured differently - whereas p2 just has the repo and its "features" and "plugins" directory (and, potentially, composite repositories) - Maven's repository system is organized like a conceptual folder tree based on translating a Group ID (like, say, "org.openntf.maven") into a successive series of subdirectories (like "org/openntf/maven"), followed by a directory for an Artifact ID, which in turn contains directories for each version, and finally within there are any of the actual files that make up a given named "artifact". As a live example, Maven Central is browsable in this manner.

Translating Between Them

Though the two layouts share a common core job - hosting Jar files (mainly) - they diverge enough in how the tools expect the metadata to be laid out that they're difficult to mesh. Tools like bnd can often work with whatever, and even Tycho can try to find OSGi bundles via Maven dependencies, but it's not smooth.

Over time, a pseudo-standard of adding p2 repositories to Maven has emerged, but it's only actually used as a marker to pass along to true Eclipse tools. The most common in our sphere is this construct, seen in projects like ODA:

<repository>
	<id>notes</id>
	<layout>p2</layout>
	<url>${notes-platform}</url>
</repository>

There, you reference the XPages update site somewhere, and then Tycho can use that to resolve dependencies for things like Require-Bundle: com.ibm.xsp.core. However, it's used only for Tycho's specific OSGi needs. You can't bring in the Tycho plugins and then have a non-Tycho module in your tree declare a dependency like that. Tycho has an implementation class for this, but it's intentionally stubbed out.

My Needs

The reason why tools that can work with both are so heavily slanted to the specific task of generating a true OSGi environment is that that's usually what you want. If you're designing, say, an Eclipse plugin or Eclipse-derived product, you want all of your tooling to know about OSGi from top to bottom, and that's where Tycho excels. It makes sure that all of your dependencies are correct and everything is OSGi-friendly.

This is as opposed to something like maven-bundle-plugin, which is most typically used to put an OSGi coat of paint on a project that isn't primarily geared towards OSGi.

I kind of want a middle ground, though. A project like the NSF ODP Tooling has grown into a sprawling hydra, with heads for Maven, Eclipse, Domino, and now Visual Studio Code. While that works, putting Tycho at the front of it ends up feeling needlessly prescriptive, and I'd love to toss it aside. However, I was blocked in my desire by a small thing: though Eclipse publishes their core bundles on Maven nowadays, Wild Web Developer is currently p2-only.

The Project

So I set out to solve this problem for myself and learn something in the process. As indicated by the <layout>p2</layout> option on the repository above, Maven's repository system is intended to be extensible. Unfortunately, it seems like it hasn't been extended particularly often in practice, and most of what I could find on the subject amounted to secondhand confirmation that it's possible, without working examples.

Fortunately, it turns out that it's actually not too difficult to implement after all, and I did just that.
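Using it is a matter of registering the plugin as a build extension in the consuming project - something like this, with the version as an example:

<build>
	<plugins>
		<plugin>
			<groupId>org.openntf.maven</groupId>
			<artifactId>p2-layout-resolver</artifactId>
			<version>1.0.0</version>
			<!-- "extensions" is what lets it hook into repository resolution -->
			<extensions>true</extensions>
		</plugin>
	</plugins>
</build>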

What this Maven plugin does is allow you to specify p2 repositories in any old Maven project, using the ID of the repository you add as the Group ID of dependencies and then the Symbolic Name as the artifact ID. Now, with the plugin added to the project, I'm able to reference the p2-only Wild Web Developer artifact I need:

<repositories>
	<repository>
		<id>org.eclipse.wildwebdeveloper</id>
		<url>https://download.eclipse.org/wildwebdeveloper/releases/0.8.0/</url>
		<layout>p2</layout>
	</repository>
</repositories>

<dependencies>
	<dependency>
		<groupId>org.eclipse.wildwebdeveloper</groupId>
		<artifactId>org.eclipse.wildwebdeveloper.xml</artifactId>
		<version>[0.5.0,)</version>
	</dependency>
</dependencies>

And, just like that, the bundle and its explicit dependencies show up in my Maven Dependencies group in Eclipse:

p2 Maven Dependencies in Eclipse

As a side bonus, this obviates the need for the mavenizeBundles half of the generate-domino-update-site project, since now I can just reference the generated site directly and get the dependencies, including with better behavior for embedded Jars than I had there:

com.ibm.xsp.core Dependency in Eclipse

The Tiny Details

I think that this plugin is about where it needs to be to suit my needs, but there is still an array of tiny details I've yet to contend with (https://github.com/OpenNTF/p2-layout-provider/issues?q=is:open is:issue). It'll never be quite a perfect match for an arbitrary OSGi bundle (though some also contain useful Maven metadata), and so there will always be rough edges with something like this, but I think that it will solve a lot of headaches I'd otherwise have to deal with down the line.

If you think it'd be useful for your projects, take a look and let me know if you run into any trouble.

Converting Tycho Projects to maven-bundle-plugin, Initial Phase

Thu Aug 22 15:27:10 EDT 2019

Tags: maven osgi tycho
  1. Converting Tycho Projects to maven-bundle-plugin, Initial Phase
  2. Winter Project #2: Maven P2 Repository Resolver
  3. OpenNTF Fork of p2-maven-plugin
  4. The Intricate Work of OSGi Dependencies on Domino

To date, Tycho has been my tool of choice for developing Domino-targeted Maven projects. However, it's not without protest. Unlike most Maven plugins, Tycho inserts itself at the very start of the build process and takes over dependency management. Purely in Maven, you can use normal Maven dependencies, but only so long as you're pointing to a dependency that already has OSGi metadata (which, fortunately, most do), and only then to satisfy a Require-Bundle or Import-Package that also has to be present. This gets more annoying, though, when dealing with Eclipse, which removes the notion of Maven dependencies entirely when using Tycho and forces you to jump through hoops to do what you want. And, as a final kicker, Tycho's p2 repository support is completely broken in the latest release version of Maven.

So why do I keep using it, anyway?

Well, it brings a couple major benefits that are of particular importance for Domino:

  • It can use p2 repositories for dependencies. This matters because the XPages runtime plugins are not (yet?) available as normal Maven dependencies. Years back, IBM provided a "Build Management" update site (https://openntf.org/main.nsf/project.xsp?r=project/IBM Domino Update Site for Build Management), which is helpful, but it's still an Eclipse-style p2 repository, not a Maven repository. Tycho can use p2 repositories natively, though, just as Eclipse does.
  • It constructs a true Equinox environment. This matters both when compiling your project and when running automated tests. The environment created by Tycho is the same Equinox OSGi runtime that Domino uses, and so it supports the same styles of bundle resolution and extensions that you get in Domino. Without this happening during the build, you lose some assurance that things at runtime will match your expectations.
  • It spawns tests in a separate process. This is a little esoteric, but it matters because launching a Notes environment on a non-Windows platform more-or-less requires setting up environment variables for the Notes/Domino directory and others, and these variables are not successfully set when using the normal maven-surefire-plugin runtime. This means that reliably running tests requires setting up the environment ahead of time, which is fiddlier and less automated.
  • It can generate new- and old-style Eclipse Update Sites. To be used in Designer and NSF-based Update Sites, an OSGi project has to be packaged up into a p2 repository along with an old-style "site.xml" file. Tycho can generate these (and can be assisted with "site.xml" when using the newer style), and it can also auto-generate source bundles, features, and repositories.

Alternatives and Workarounds

Some of the "hard" requirements for Tycho can be at least worked around.

Years ago, I wrote a Ruby script that would take a p2 site like IBM's or one generated from a newer version and "Mavenize" it by creating artifact information based on each bundle's OSGi manifest. I since converted it to Java and included it in Darwino's Studio plugins, and yesterday added it to the generate-domino-update-site Maven plugin. Using that lets you declare dependencies on any of the bundles or embedded JARs in a normal Maven project:

        <dependency>
            <groupId>com.ibm.xsp</groupId>
            <artifactId>com.ibm.notes.java.api.win32.linux</artifactId>
            <version>[10.0.0,)</version>
            <classifier>Notes</classifier>
            <scope>provided</scope>
        </dependency>

This isn't perfect, since it's neither standardized nor generally available (go vote for the aha idea!), but at least it's reproducible and can be something of a de-facto standard if used enough.

Then there's the matter of generating appropriate OSGi metadata. Outside of the Tycho-using world, the main way to generate this is with a tool called bnd and its related tools. bnd is kind of a parallel world and there's even an alternate tooling set for Eclipse instead of the default PDE. There are a couple ways to use bnd in a Maven build, but the one I'm familiar with to date is the maven-bundle-plugin. I've used this with Darwino to incidentally create OSGi metadata for the otherwise non-OSGi core modules, and I suspect that it gets used heavily this way. It's more powerful than that, though, and is a nice wrapper for bnd under the hood, supporting Declarative Services annotations and all the other OSGi goodies. In my case, I used it to generate the MANIFEST.MF with most of the defaults, but then added in some specifics to play nice in my Domino Equinox target.
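As a hedged sketch of what I mean by "added in some specifics" - the exact instructions will vary by project, and these values are illustrative:

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<extensions>true</extensions>
	<configuration>
		<instructions>
			<Bundle-SymbolicName>com.example.openliberty.runtime;singleton:=true</Bundle-SymbolicName>
			<!-- Keep imports of Domino-provided packages from blocking resolution -->
			<Import-Package>com.ibm.commons.util;resolution:=optional,*</Import-Package>
		</instructions>
	</configuration>
</plugin>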

I suspect that these bnd-based tools can also be a route to solving my automated-testing woes. For the Open Liberty Runtime project, I don't have to worry about that, since it's so dependent on running in actual Domino that the return-on-investment for setting up JUnit tests wouldn't be worth it. However, I recall seeing some Maven testing plugin that let you spawn an OSGi environment of your choice, and I think that something like that may be able to replace Tycho for me there.

Since p2 repositories/update sites are entirely an Eclipse-ism, most OSGi tooling doesn't care about them. That's where p2-maven-plugin comes in. Not only will it allow you to create p2 repositories, but it lets you define features in the configuration, meaning they don't have to be separate modules like in Tycho. And not only that, but it will also auto-OSGi-ify any Maven dependencies you bring in if they don't already have OSGi bundle information. It also lets you override existing bundle data on the fly if needed, such as if the dependencies and imports conflict with something on Domino.

Eclipse Friendliness

Since I still use Eclipse to develop these projects, I want to be able to make use of the ability of the XPages SDK (https://openntf.org/main.nsf/project.xsp?r=project/XPages SDK for Eclipse RCP) to run Domino's HTTP stack pointed at my active workspace. For that to work, I need to be able to get Eclipse to recognize my projects as functional PDE-compatible bundles even if I'm not using PDE for them. Fortunately, that process isn't difficult: once I set the location for MANIFEST.MF to be in "META-INF" in the project root, maven-bundle-plugin started generating the files there instead of within "target", and Eclipse started working with the projects as OSGi bundles. The only thing left to do then was to gitignore the generated files, since they don't need to be checked into source control anymore.
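That setting is the standard manifestLocation option in the plugin configuration - roughly:

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<extensions>true</extensions>
	<configuration>
		<!-- Generate MANIFEST.MF where Eclipse PDE expects to find it -->
		<manifestLocation>META-INF</manifestLocation>
	</configuration>
</plugin>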

Future Improvements

The big thing that is still an open problem is dealing with testing. I have some ideas for taking a swing at it, but for now it's the main thing preventing me from doing this for all of my Tycho projects.

Beyond that, I want to look a bit into bnd-maven-plugin. This diverges from maven-bundle-plugin in that it's geared towards using bnd configuration files directly. During the build process, I think the results would be the same, since maven-bundle-plugin can already pass through whatever configuration I want, but it would be a better match for the Eclipse bndtools tooling. Additionally, externalizing the bnd config files would mean they'd be the same if I decided to switch to Gradle, as Open Liberty uses.

Finally, and specific to this Open Liberty project, I may want to consider using bnd to generate Liberty Feature manifests, as it itself does. These features are implemented as OSGi "subsystems", packaged as .esa files. Currently, I'm using esa-maven-plugin to generate their specialized manifests, but I've already hit some limitations in the area of cross-feature dependencies. Apparently, bnd takes some wrangling to suit this, but is worth it. I'll consider that one a "stretch goal", though.

For now, I'm pretty pleased with the new setup. The projects still work on Domino, I can run them on there from the workspace, I was able to eliminate the p2 feature projects outright, and now I don't have to worry about packaging up a dependencies site just to have something to point at in Eclipse. Heck, I can even use Visual Studio Code now! It's pretty nice.

First Steps to Code Coverage Analysis in Domino Plugins

Thu Nov 09 08:53:04 EST 2017

Tags: maven domino java

I'm always interested in getting the computer to tell me how to tell it what to do more successfully, and, to further that pursuit, I've started taking an interest in code coverage.

If you're not familiar with the term, "code coverage" refers to reporting on which lines of code were actually executed during runtime, most commonly in association with unit tests. Eclipse (and presumably other IDEs) has support for this, and I've decided to give it a shot.

Since I'm starting this out in the context of Domino plugins, there are more wrinkles than in most tutorials. Namely, the test suites I've written run exclusively through Maven instead of the Eclipse UI due to all the Notes environment setup, so I can't just use the normal UI tools to gather the data. Fortunately, Eclipse's EclEmma will work just fine with the output from a Maven project, as long as you configure it properly. I looked around for a while to find the right combination of tools to use, but it ended up being fairly simple to configure basic output that can be consumed in Eclipse's Coverage view.

There are two main additions. First, add the jacoco-maven-plugin to your root project's project.build.plugins block:

<plugin>
	<groupId>org.jacoco</groupId>
	<artifactId>jacoco-maven-plugin</artifactId>
	<version>0.7.8</version>
	<executions>
		<execution>
			<goals>
				<goal>prepare-agent</goal>
			</goals>
		</execution>
	</executions>
</plugin>

In normal cases, that would suffice. However, since the test configuration I have for Notes overrides the argLine property of the Tycho test runner, there's another step - add the tycho.testArgLine property manually into those blocks, such as in the Windows profile:

<profile>
	<activation>
		<os>
			<family>Windows</family>
		</os>
		<property>
			<name>notes-program</name>
		</property>
	</activation>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				<version>${tycho-version}</version>
 
				<configuration>
					<skip>false</skip>
 
					<argLine>${tycho.testArgLine} -Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
					<environmentVariables>
						<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
					</environmentVariables>
				</configuration>
			</plugin>
		</plugins>
	</build>
</profile>

Once that's configured, running the test suite via Maven will create a new file in the target folder of the test plugin: jacoco.exec. This file can then be consumed in Eclipse by opening the "Coverage" view:

Eclipse's Show View window

In that view, right click and choose "Import Session..." and point to the data file. Click "Next" and check the projects+source folders from your workspace you're interested in analyzing. When you click "Finish", it'll do two things. First, it'll fill the Coverage view with statistics from your run:

Code Coverage stats

(We have a lot of work to do fleshing out our test suites for this one)

Secondly, it'll start highlighting your code to show you what code is executed, which branches are only partially covered, and which lines are skipped entirely. For example (ignore the sickly color scheme - I need to work on that):

Code Coverage example

This shows how several of the if branches are only tested in one direction, while the "Faces" block is skipped entirely. That also shows some of the trouble with testing XPages-run code: the Tycho environment can't reproduce the XPages environment fully, so some branches aren't testable in that way. I haven't looked into the possibility of gathering similar data from JUnit for XPages, so perhaps that's possible.

For now, though, this will have to do. And, like with these other "code improvement" techniques I've integrated lately, there's a lot of potential tedium - juggling when to write a test to cover some code that will obviously always work just to improve the highlighting vs. just focusing on the low-hanging fruit - but I expect that it will be a nice addition to my workflow over time.

New Small Project: p2sitexml-maven-plugin

Thu Oct 26 14:17:16 EDT 2017

Tags: maven

It's no secret that I have a love/hate relationship with developing for OSGi platforms with Maven. The giant divide between "all-in" Tycho projects (which limit your options with normal Maven features) and trying to bolt on OSGi support in an otherwise-normal project creates an array of problems big and small.

Some of those hurdles would be difficult to bridge, such as any automated tests that want to test the proper functioning of OSGi services. However, not all projects need that - in the case of Darwino, for example, deployment to Domino is a secondary consideration in the Maven project, and so a Darwino app doesn't use Tycho for its packaging or testing. By jumping through a few hoops, we've gotten those projects to the point where they can emit a p2-formatted update site for use in OSGi, and that can be imported into a Domino NSF-based update site.

There's a minor caveat, though: because the NSF-based update site doesn't know about p2 formatting, you can't use the "Import Update Site" action, instead having to use "Import Features", which leaves the imported features in the "(Not Categorized)" group. This isn't a huge problem, but it's one that's easily fixed, so I wrote a small tool to do just that.

I've created a small open-source project called p2sitexml-maven-plugin, the purpose of which is to generate the site.xml file expected by Notes from a p2 repository generated by other means, such as the p2-maven-plugin. This can be included in a Maven build like so:

...
	<build>
		<plugins>
			...
			<plugin>
				<groupId>org.darwino</groupId>
				<artifactId>p2sitexml-maven-plugin</artifactId>
				<version>1.0.0</version>
				<executions>
					<execution>
						<goals>
							<goal>generate-site-xml</goal>
						</goals>
						<configuration>
							<category>Some Category</category>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
...

Right now, the plugin isn't in Maven Central, but is in OpenNTF's Maven server. You can add that to an active profile in your settings.xml file like so:

...
	<pluginRepositories>
		<pluginRepository>
			<id>artifactory.openntf.org</id>
			<name>artifactory.openntf.org</name>
			<url>https://artifactory.openntf.org/openntf</url>
		</pluginRepository>
	</pluginRepositories>
...

It isn't a world-changing thing, but this should at least make the task of targeting Domino with non-Tycho Maven projects a little easier.

Including a Headless DDE Build in a Maven Tree

Tue Mar 14 12:45:22 EDT 2017

Most of my Domino projects nowadays have two components: a suite of OSGi plugins/features and at least one NSF. Historically, I've kept the NSF part separate from the OSGi plugin projects - I'll keep the ODP in the repo, but then usually also keep a recent "build" made by copying the database from my dev server, and then include that built version in the result using the Maven Assembly plugin. This works, but it's not quite ideal: part of the benefit of having a Maven project being automatically built is that I can have a consistent, neutral environment doing the compilation, without reliance on my local Designer. Fortunately, Designer has a "headless" mode to build NSFs in a scripted way, and Christian Güdemann has done the legwork of building that into a Maven plugin.

It should come as no surprise, however, that this is a fiddly process, and I ran into a couple subtle problems when configuring my build.

Setting Up Designer

The first step is to tell Designer that you want to allow this use, which is done by setting DESIGNER_AUTO_ENABLED=true in your notes.ini. The second step is to configure Notes to use an ID file with no password: because Designer is going to be launched and quit automatically several times, you can't just leave it running and have it use an open session. This is a perfect opportunity to spin up a "template" ID file, distinct from your developer ID, if you haven't done so already. Also, uh... make sure that this user has at least Designer rights to the NSF it's constructing. I ran into a bit of logical trouble with that at first.

The last step was something I didn't realize until late: keep your Designer installation clean of the plugins you're going to be auto-installing. Ideally, Designer will be essentially a fresh install, with no plugins added, and then the Maven definition will list and install all dependencies. If it's not clean, you may run into trouble where Designer emits errors about the plugin conflicting with the installed version.

Setting Up The Maven Environment

Before getting to the actual Maven project files, there's some machine-specific information to set, which is best done with properties in your ~/.m2/settings.xml, much like the notes-platform and notes-program properties. In keeping with that convention, I named them as such:

<properties>
	<notes-platform>file:///C:/Users/jesse/Java/XPages</notes-platform>
	<notes-program>C:\Program Files (x86)\IBM\Notes</notes-program>
	<notes-designer>C:\Program Files (x86)\IBM\Notes\designer.exe</notes-designer>
	<notes-data>C:\Program Files (x86)\IBM\Notes\Data</notes-data>
</properties>

Deploying Features And Initial Root Project Config

The first came in setting up the automatic deployment of the feature. The Maven plugin lets you specify features that you want added to and then removed from your Designer installation. In this case, the feature and update site are within the same Maven tree being built, which adds a wrinkle or two.

The first is that, since the specific version number of the feature changes every build due to the qualifier, I had to set up the root project to export the qualifier value that Tycho plans to use. This is done using the tycho-packaging-plugin, which a standard Maven project will have loaded in the root project pom. The main change is to explicitly tell it to run the build-qualifier goal early on, which has the side effect of contributing a couple properties to the rest of the build:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-packaging-plugin</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<strictVersions>false</strictVersions>
	</configuration>

	<!-- Contribute the "buildQualifier" property to the environment -->
	<executions>
		<execution>
			<goals>
				<goal>build-qualifier</goal>
			</goals>
			<phase>validate</phase>
		</execution>
	</executions>
</plugin>

Once that's running, we'll have the ${qualifiedVersion} property to use down the line to house the actual version made during the build.

The second hurdle is figuring out the URL to use to point to the update site. I did this with a property in the root project pom, alongside setting two properties used by the Headless Designer plugin:

<properties>
	<!-- snip -->
	
	<!-- Headless Designer properties -->
	<designer.feature.url>${project.baseUri}../../releng/com.example.some.updatesite/target/site</designer.feature.url>
	<ddehd.designerexec>${notes-designer}</ddehd.designerexec>
	<ddehd.notesdata>${notes-data}</ddehd.notesdata>
</properties>

Much like with OSGi dependency repositories, this path is recomputed per-project. The NSF projects are housed within an nsf folder in my tree, so I include the ../.. to move up to the root project, before descending back down into the update site. Note that this requires that the update site project be built earlier in the build than the NSF.

Finally, bringing these together, I added a block for the common settings for the plugin to the pluginManagement section of the root project pom:

<plugin>
	<groupId>org.openntf.maven</groupId>
	<artifactId>headlessdesigner-maven-plugin</artifactId>
	<version>1.3.0</version>
	<extensions>true</extensions>
	<configuration>
		<features>
			<feature>
				<featureId>com.example.some.feature</featureId>
				<url>${designer.feature.url}</url>
				<version>${qualifiedVersion}</version>
			</feature>
		</features>
	</configuration>
</plugin>

Configuring The NSF Project

With most aspects configured higher up in the project tree, the actual NSF project pom is fairly slim:

<?xml version="1.0"?>
<project
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
	xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<modelVersion>4.0.0</modelVersion>
	<parent>
        <groupId>com.example</groupId>
        <artifactId>some-plugin</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <relativePath>../..</relativePath>
	</parent>
	<artifactId>nsf-somensf</artifactId>
	
	<packaging>domino-nsf</packaging>
	
	<properties>
		<ddehd.odpdirectory>${basedir}\..\..\..\nsf\nsf-somensf</ddehd.odpdirectory>
		<ddehd.targetdbname>somensf.ntf</ddehd.targetdbname>
	</properties>
	
	<build>
		<plugins>
			<plugin>
				<groupId>org.openntf.maven</groupId>
				<artifactId>headlessdesigner-maven-plugin</artifactId>
				<extensions>true</extensions>
			</plugin>
		</plugins>
	</build>
</project>

The properties block sets two more properties automatically read by the Headless Designer Maven plugin. In this case, the path is an artifact of the history of the Git repository: since the ODP was added to the repo outside of the Maven tree, the path backs up and out of the whole thing, and then to another folder with a confusingly-similar name. In this case, it avoids a lot of developer hassle, but a properly-configured project would have the ODP in a subfolder within the Maven project (maybe src/main/odp if you want to be all idiomatic about it).

Note that the ddehd.targetdbname property is the NSF name used both for the intermediate build NSF (which is in the Notes data directory) and for the destination file in the project's target directory, so make sure it doesn't conflict with any existing DBs.

Bringing It All Together

Once you have the NSF built, you can include it in an Assembly down the line, leading to a nicely-packaged update site + NSF pair. This section is something of an "IOU" at the moment, though - I have an idea for how I want to do this, but I haven't actually implemented it yet. Once I do, I'll write a followup post.

In the mean time, having a build server build the NSF can be a useful check on making sure everything is working correctly, and is a perfect stepping-stone towards a complete solution. Ideally, in addition to packaging up the result, a full system would also deploy the NSF and plugins to a Domino server and run some UI/service tests against it. However, that's a whole ball of wax that I haven't touched on myself (and is also likely prohibitive for licensing reasons in most cases anyway). For now, it's a step in the right direction.

Quick Post: Maven-izing the XSP Repo

Sat Sep 17 06:58:10 EDT 2016

Tags: maven xpages

This post follows in my tradition of extremely-narrow-use-case guides, but perhaps this will come in handy in some situations nonetheless.

Specifically, a while back, I wrote a script that "Maven-izes" the XPages artifacts, as provided by IBM's Update Site for Build Management. This may seem a bit counter-intuitive at first, since the entire point of that download is to be able to compile using Maven, but there's a catch to it: the repository is still in Eclipse ("P2") format, which requires that you use Tycho in your project. That's fine enough in most cases - since Domino-targetted projects are generally purely OSGi, it makes sense to have the full OSGi stack that Tycho provides. However, in a case where Domino is only one of many supported platforms, the restrictions that Tycho imposes on your project can be burdensome.

So, for those uses, I wrote a JRuby script that reads through the P2 site as downloaded and extracted from OpenNTF and generates best-it-can Maven artifacts out of each plugin. It tries to maintain the plugin names, some metadata (vendor, version, etc.), and dependency hierarchy, and the results seem pretty reliable, at least for the purpose of getting a non-Tycho bundle with XSP references to compile. This isn't necessarily a route you'd want to take in all cases (since you don't get the benefits of normal OSGi resolution and services in your compilation), but may make sense sometimes. In any event, if it's helpful, here you go:

https://github.com/jesse-gallagher/Miscellany/blob/master/UpdateSiteConversion/convert.rb

The Cleansing Flame of Null Analysis

Sat May 21 10:18:00 EDT 2016

Tags: java maven
  1. The Cleansing Flame of Null Analysis
  2. Quick Tip: JDK Null Annotations for Eclipse
  3. The Joyful Utility of Optionals in Java

Though most of my work lately has been on sprawling, platform-level stuff or other large existing codebases, part of it has involved a new small app. I decided to take this opportunity to dive more aggressively than previously into automated null analysis and other potential-bugs tools.

What I mean by "null analysis" is letting the IDE or compiler try to help you avoid NullPointerExceptions. Though there are plenty of other programming mistakes you could still make, these are among the most common, and so a little extra work up front to avoid them should pay dividends. Eclipse has some handy options in its Java → Compiler → Errors/Warnings preferences to assist with this:

The first option will pick up on some pretty basic instances, such as:

Object foo = null;
System.out.println(foo.hashCode());

Since this is clearly going to always cause an NPE, Eclipse is able to point this out as an error. The next level gets a little more nebulous: "potential" null pointer access. This crops up when Eclipse can't reliably determine whether a value will be null, either because there is no way to know at compile time (say, database access) or because the compiler's tooling is too limited. Here's a contrived example:

Object foo = Math.random() > 0.5 ? new Object() : null;
System.out.println(foo.hashCode());

This situation is clearly untenable, but there are other situations where you as a programmer can be very confident that the value will not be null (say, if you swap out the > 0.5 for >= 0.0), but the compiler doesn't know that. That's why it often makes sense to leave that as a warning instead of an error.

That's all stuff I've done before, but now I've decided to dive into annotation-based null analysis as well. Unfortunately, in stock Java, this is something of a hot mess (that list even leaves out Eclipse's home-grown version). Since Java didn't grow up with this sort of capability, it's been shoehorned in by various parties over the years. There are other tools to assist you in Java 8, but, unfortunately, I can only target 7 as the highest. For now, I've settled on the "sort-of standard" javax.validation.constraints package. It wasn't really intended for this specific purpose, but it's flexible enough to suit and can be used in Eclipse and FindBugs (though I have my reservations about the choice).

In Eclipse, this type of analysis can be enabled by checking "Enable annotation-based null analysis" below the other options and, unless you're using Eclipse's known annotations, adjusting the "Configure" options next to "Use default annotations for null specifications":

In any event, regardless of the choice of tooling, the "this shouldn't be null" annotations work the same way: you use them to decorate things that you either require not be null when provided to you (method parameters) or you promise to not be null when providing to others (method return values). For example:

public @NotNull Object doSomething(@NotNull Object otherObject) {
	return otherObject.toString();
}

This highlights three things, two good and one bad:

  • Good: The @NotNull in the method parameter means that, as long as the calling code is also checked for null use, the method can be confident that there won't be a NullPointerException when calling a method on otherObject.
  • Good: The @NotNull on the return value means that other code calling this method can be confident that they will not get a null value from it, and so can skip extra null checks.
  • Bad: Eclipse flags otherObject.toString() as a potential problem because it doesn't know for sure that Object#toString doesn't return null, because it has no nullability annotations. As programmers (or as a compiled-code analysis tool), we can be fairly confident that it will be non-null because any object returning null for that is essentially broken on its own.

That last one is a common problem when adopting annotation-based null analysis, at least in Eclipse (I hear it may be better in IntelliJ): its logic doesn't go very deep. If everything is gussied up with these annotations, you're clear - but as soon as you step outside of the project you're working on, you have to add in likely-unnecessary checks. Fortunately, these checks don't realistically hurt (a null check at runtime in a normal app is negligible performance-wise), but they can grate to have to add in.

Glutton for punishment that I am, I decided to go a step further and enable FindBugs processing as an integral step of my build. Though FindBugs can be very picky about the types of things it complains about, it is blessedly more thorough in its analysis than Eclipse, so you generally end up conceding that it is correct when it yells at you. Since the project is Maven-based, I added the check in the project's pom file:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>findbugs-maven-plugin</artifactId>
	<version>3.0.3</version>
	<configuration>
		<includeTests>true</includeTests>
	</configuration>
	<executions>
		<execution>
			<phase>compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
		<execution>
			<id>findbugs-test-compile</id>
			<phase>test-compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
	</executions>
</plugin>

For most uses, that's all that's required. Now, when the project is compiled, FindBugs will give it a once-over and halt the build if it finds anything it doesn't like. This can be tweaked a great deal - for example, changing the checks to run or the severity of the problem needed to fail the build - but the defaults will likely suit.

Adding these extra checks involves a lot of plusses and minuses. The big minus is that you may end up spending a lot of time "fixing" bugs that don't really exist, time that you could instead spend actually writing your application (and writing new bugs that the tools won't find anyway). There's really nothing to be gained by carefully explaining to Eclipse for the hundredth time that toString always returns non-null.

Still, particularly when tested out in a small, low-surface-area app, this can be a good practice to learn and refine. Eventually, a move to Java 8 will help this more, and it certainly doesn't hurt to add in nullability annotations in the mean time. Overall, I think having the tooling help you avoid a whole suite of common "brain fart" bugs like this is worthwhile.

Maven Native Chronicles: Running Automated Notes-based Tests

Sat Feb 27 17:02:11 EST 2016

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

This post isn't really in my ongoing Java thread, though it's related in that this is the sort of thing that may come up in fairly-advanced cases. This post will assume a functional knowledge of Maven, Tycho, and JUnit.

For Darwino, I ran into the need to run unit tests on Domino-adapter code during the Maven build process. Since the Domino project tree uses Tycho, this ended up differing slightly from standard Maven testing. Rather than using the src/test/java directory in the same project to house the associated tests, Tycho prefers the very-OSGi-native method of having a separate project, but declaring it a "fragment" plugin attached to the primary one. In OSGi terms, a fragment is a special type of plugin that, when loaded by the runtime, gets glommed on to a specified host plugin and runs in its same classpath. In other cases, this may be used to provide platform-specific additions, add locale resources, or other uses.

So I created a new fragment project, which is structurally much like a normal plugin, but with an extra line in the MANIFEST.MF:

Fragment-Host: com.example.some.parent.plugin

This line tips off the OSGi environment to its nature. In the pom.xml, there are a number of important differences related both to how Tycho handles test fragments and the necessity of loading the Notes native libraries:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>some-parent</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.some.parent.plugin.test</artifactId>
	<packaging>eclipse-test-plugin</packaging>

	<build>
		<plugins>
			<!--
				By default, Tycho doesn't include the other fragment plugins when running the test.
				So here, we manually include the appropriate features. 
			 -->
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>target-platform-configuration</artifactId>
				<version>${tycho-version}</version>
				
				<configuration>
					<dependency-resolution>
						<extraRequirements>
						
							<requirement>
								<type>eclipse-plugin</type>
								<id>com.ibm.notes.java.api.win32.linux</id>
								<versionRange>[9.0.1,9.0.2)</versionRange>
							</requirement>
							
							<requirement>
								<type>eclipse-feature</type>
								<id>com.example.some.native.feature</id>
								<versionRange>0.0.0</versionRange>
							</requirement>
							
						</extraRequirements>
					</dependency-resolution>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				
				<configuration>
					<testSuite>${project.artifactId}</testSuite>
					<testClass>com.example.some.parent.plugin.test.AllTests</testClass>
				</configuration>
			</plugin>
		</plugins>
	</build>
	
</project>

The preamble is the same as usual for Maven, but the packaging is slightly different. Instead of eclipse-plugin, this should be packaged as eclipse-test-plugin. Tycho's packaging doesn't particularly care about whether or not it's a fragment, but it does care about its test nature.

Things get a little interesting in the target-platform-configuration block. These two entries have similar purposes: to cause Tycho to load up other, native-artifact fragments required to run the tests. The first one showed up in the Java series: it contains Notes.jar, but, because it is itself a fragment (and can't be directly depended upon by the test project), Tycho won't automatically load it unless directed to. The second one serves a similar purpose, but loads a feature instead. This feature contains references to a number of distinct platform-dependent native-artifact fragments, and specifying this dependency causes Tycho to consider each one without having to specifically enumerate them in the POM.

The final block is a little simpler, and it just tells Tycho where to start when it goes to run the fragment as a test suite. The AllTests class is a test suite in the JUnit 4 convention, with @RunWith and @Suite.SuiteClasses annotations.
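Such a suite class is only a few lines - the member test classes here are placeholders:

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

@RunWith(Suite.class)
@Suite.SuiteClasses({
	SomeAdapterTest.class,
	SomeOtherAdapterTest.class
})
public class AllTests {
}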


There's another catch to this, though: Notes has some specific demands on its environment, and in particular must be run with knowledge of a Notes program directory, a data directory, a notes.ini, and an ID file (unless you're doing DIIOP (which you probably shouldn't)). The specifics of what the libraries expect in their runtime environment and how they should be loaded in their API calls vary a little from platform to platform, and I ended up with a pile of "just keep trying stuff until it works" code. The result, though, is that I have automated tests running on Windows, Linux, and OS X. First, there's the large platform-specific section of my root POM, which defines platform-activated profiles that set up environment variables:

<!-- These profiles add support for specific platforms for tests -->
<profiles>
	<profile>
		<activation>
			<os>
				<family>Windows</family>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					<version>${tycho-version}</version>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
	<profile>
		<id>mac</id>
		<activation>
			<os>
				<family>mac</family>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
							<LD_LIBRARY_PATH>${notes-program}${path.separator}${env.LD_LIBRARY_PATH}</LD_LIBRARY_PATH>
							<DYLD_LIBRARY_PATH>${notes-program}${path.separator}${env.DYLD_LIBRARY_PATH}</DYLD_LIBRARY_PATH>
							<Notes_ExecDirectory>${notes-program}</Notes_ExecDirectory>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
	<profile>
		<id>linux</id>
		<activation>
			<os>
				<family>unix</family>
				<name>linux</name>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					<version>${tycho-version}</version>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<!-- The res/C path entry is important for loading formula language properly -->
							<PATH>${notes-program}${path.separator}${notes-program}/res/C${path.separator}${notes-data}${path.separator}${env.PATH}</PATH>
							<LD_LIBRARY_PATH>${notes-program}${path.separator}${env.LD_LIBRARY_PATH}</LD_LIBRARY_PATH>
							
							<!-- Notes-standard environment variable to specify the program directory -->
							<Notes_ExecDirectory>${notes-program}</Notes_ExecDirectory>
							<Directory>${notes-data}</Directory>
							
							<!-- Linux generally requires that the notes.ini path be specified manually, since it's difficult to determine automatically -->
							<!-- This variable is a convention used in the test classes, not Notes-standard -->
							<NotesINI>${notes-ini}</NotesINI>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
</profiles>

Each block is kicked off both by a specific OS combination, using Maven's OS names (you can also target specific architectures within them), as well as the presence of a notes-program property. This is a convention I've adopted to go alongside the notes-platform property that points to the XSP plugins; this one instead points to the root Notes or Domino install to use for execution.

Windows is the easiest since Notes still feels most at home on there. There, it's just a matter of adding the Notes program root to the Java library path and the environment's PATH. From there, the Notes libraries automatically picked up the data directory and notes.ini, presumably from the registry.

The Mac is mildly more complex: in addition to the two settings from Windows, I also ended up adding the program path to LD_LIBRARY_PATH and DYLD_LIBRARY_PATH. I'm not entirely sure both are needed, but hey, it works this way. In addition, I had to specify Notes_ExecDirectory. After that, the tests found the location of the data dir and Notes Preferences, presumably due to Mac OS conventions.

Linux needed the most hand-holding, which shouldn't be too surprising for those who have installed Domino on Linux - it doesn't seem to respect any platform conventions there. In addition to specifying the notes-program property and using it in the same places as on the Mac, I also added two more properties to my Maven config: notes-data, to point to the data directory, and notes-ini, to point to notes.ini. I used the notes-data property to specify the Directory environment variable that the Notes libraries look for, and then I also specified NotesINI. That's not something that the Notes libs look for, but instead it's a way to shuttle the configuration to the Java code that actually executes the tests.

That leads to the final hurdle: initializing the Notes environment in the JUnit test classes. To do that, I specified a @BeforeClass method that checks for the presence of the Notes_ExecDirectory and NotesINI environment variables. If they're present (i.e. it's Linux), it calls NotesInitExtended with the value of Notes_ExecDirectory as the first argument and = plus the value of NotesINI as the second. Afterwards, whether or not that was called, it calls NotesThread.sinitThread(), and from then on NotesFactory.createSession() will generate proper native sessions.

There's also an @AfterClass method that is the mirror of that: it calls NotesThread.stermThread() and then, on Linux, NotesTerm.
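Condensed, that pair of methods looks something like the sketch below. "NativeApi" stands in for whatever JNI binding your project uses to reach the C API's NotesInitExtended and NotesTerm - those aren't part of the standard lotus.domino classes:

import org.junit.AfterClass;
import org.junit.BeforeClass;

import lotus.domino.NotesThread;

public abstract class AbstractNotesTest {
	@BeforeClass
	public static void initNotesRuntime() throws Exception {
		String execDir = System.getenv("Notes_ExecDirectory");
		String notesIni = System.getenv("NotesINI");
		if (execDir != null && notesIni != null) {
			// Linux: point the C API at the program directory and notes.ini explicitly
			NativeApi.NotesInitExtended(new String[] { execDir, "=" + notesIni });
		}
		// Make the current thread Notes-friendly for the tests
		NotesThread.sinitThread();
	}

	@AfterClass
	public static void termNotesRuntime() throws Exception {
		NotesThread.stermThread();
		if (System.getenv("Notes_ExecDirectory") != null && System.getenv("NotesINI") != null) {
			NativeApi.NotesTerm();
		}
	}
}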


So yeah, there are a lot of hoops to hop through! Hopefully, this post will be helpful for someone attempting to do the same thing I did, and it'll cut down on a lot of searching around and trying to piece together a working environment.

That Java Thing, Part 16: Maven Fallout

Tue Feb 23 14:33:51 EST 2016

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

So, after the last post's large task of converting to Maven, this step is mostly about picking up the pieces and expanding on some of the concepts. We'll start with M2Eclipse, usually rendered as just "m2e".

m2e

m2e is the set of plugins that acts as Eclipse's interface to Maven. It more-or-less replaces the earlier maven-eclipse-plugin, though you will likely still see references to that around. Eclipse doesn't have any inherent knowledge of how Maven works, so m2e has the complicated task of reading your projects' pom.xml files and adapting them to Eclipse's internal configuration. So, for example, in our projects it saw the presence of Tycho and determined that they should be imported as OSGi projects. In other cases, m2e may pick up the presence of things like Android plugins to trigger the use of the Android development tools.

Though it tries mightily, m2e is the source of a lot of the consternation that can come with a switch to Maven-based development. Because most Maven plugins don't have any inherent allowances for working in an Eclipse environment, adapters have to be written for each one in order for them to work with m2e - this is what yesterday's dialog about installing the Tycho adapters was about. In some cases, these adapters don't exist and you have to tell m2e to ignore the plugin, as in the example below; in other cases, the adapters DO exist, but are flawed in some way. Most of the time, things go alright, but there are enough edge cases that it can be irritating.
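For the "ignore" case, the standard workaround is an m2e-only pseudo-plugin block in the POM's pluginManagement section. The coordinates and goal below are placeholders for whichever plugin m2e complains about; command-line Maven disregards this block entirely:

<pluginManagement>
	<plugins>
		<plugin>
			<groupId>org.eclipse.m2e</groupId>
			<artifactId>lifecycle-mapping</artifactId>
			<version>1.0.0</version>
			<configuration>
				<lifecycleMappingMetadata>
					<pluginExecutions>
						<pluginExecution>
							<pluginExecutionFilter>
								<!-- Placeholder coordinates for the offending plugin -->
								<groupId>com.example</groupId>
								<artifactId>some-maven-plugin</artifactId>
								<versionRange>[0,)</versionRange>
								<goals>
									<goal>some-goal</goal>
								</goals>
							</pluginExecutionFilter>
							<action>
								<ignore />
							</action>
						</pluginExecution>
					</pluginExecutions>
				</lifecycleMappingMetadata>
			</configuration>
		</plugin>
	</plugins>
</pluginManagement>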

For this kind of task, m2e is pretty unobtrusive, but it's important to know it's there.

Updating the .gitignore

One side effect of m2e's behavior is that it's a good idea to remove Eclipse's project configuration files from the Git repository: m2e derives them from the POM, so keeping them checked in leads to constant spurious changes. This is not required, but it can avoid a number of annoying problems when dealing with multi-person Maven projects. To start with, open the .gitignore file from the root of your local Git repository (you can get to this easily using Eclipse's Git Repositories view, in the "Working Directory" part of the repo). Add some lines at the end to ignore .project and .classpath, so your whole file should now look like:

._*
Thumbs.db
.DS_Store

*.class

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
#*.jar
*.war
*.ear

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

# Eclipse project files
.project
.classpath

Depending on how your (hypothetical) team wants to work, it may also make sense to ignore the .settings/ directory, which stores some additional Eclipse project information. However, some of that information may be useful to share - for example, on-save code-cleaning behavior that isn't readily expressed in Maven.

Due to the way Git works, just adding the files to the .gitignore won't remove them from the repository: the ignore list only applies to untracked files, and Git will keep watching changes in files it already tracks. In order to also remove them from the repository without deleting them from the filesystem, go to the "Navigator" pane in Eclipse (if it doesn't show up currently, you can add it via Window → Show View → Navigator), find each .project and .classpath file in the four projects (some will only have the former), right-click, and choose Team → Advanced → Untrack.
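If you'd rather do this from a terminal, the equivalent is git rm --cached, which removes files from tracking without deleting them - for example, for one of the projects:

git rm --cached com.example.xsp.plugin/.project com.example.xsp.plugin/.classpath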

Now, commit the changes - though the files remain on the filesystem, they should show up as deleted in the commit dialog.

The target Folder

This is one we've already prepped for a bit. Whereas most Eclipse projects store their binary output (Java class files, Jars, etc.) in the bin folder or elsewhere, the standard Maven behavior is to use target. For most of the projects, this doesn't matter - we had already configured the plugin to use a subfolder here for its classes, and the temporary files for other aspects don't matter. However, it's still important to know about this; when you're looking for the compiled or packaged output of a Maven project, this is the place to look, and we'll run into this when building the update site.

Building the Update Site

There's an important change involved now with how the update site is built: it does not involve opening the site.xml and clicking "Build All" anymore. Instead, it involves right-clicking the root project ("parent-xsp") and choosing Run As → Maven Install.

There are two logical followup questions when seeing this change: "what?" and "why?". They're both bound up with the nature of a Maven project and, significantly, the way Eclipse interacts with one. Maven is primarily a command-line tool - granted, it's a set of Java classes, but the primary way to interact with it is via the command line. m2e does a lot of work to interpret the projects in the same way as the `mvn` command-line tool, but it's just a secondary interpretation due to the way Maven and Eclipse work.

The way to build a Maven-ized project in a way that fully uses its configuration is to run the command-line tool. Fortunately, m2e comes with its own embedded version and doesn't require you to use a terminal, but the abstraction is very leaky - and this is why you use "Run As" instead of any of the normal "Build"-related commands. The "Run As" commands construct a CLI-type environment and execute the embedded Maven, which is what then does the real work - "Maven Install" here is the equivalent of running mvn install from the project's directory in a terminal.

Since "parent-xsp" is the root of our projects, it's the starting point to execute a Maven build. When you run this, you'll see a lot of chatter in Eclipse's Console view, particularly the first run: Maven will seek out the plugins needed to build the projects and install them to the local Maven repository (stored in ".m2/repository" in your home folder). After that, it will build and package each of your projects. There's a whole phase system going on here (similar in concept to the XPages lifecycle), as well as many configuration options, but the important part here is that the "install" command (called a "goal" in Maven parlance) is the last phase that we will worry about, and it will cover everything we need here.

Upon completion, the Console text should end with something like this (incidentally, "Reactor" is Maven's term for the entire blob of modules being processed):

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] parent-xsp ......................................... SUCCESS [  0.617 s]
[INFO] com.example.xsp.plugin ............................. SUCCESS [  1.647 s]
[INFO] com.example.xsp.feature ............................ SUCCESS [  0.411 s]
[INFO] com.example.xsp.update ............................. SUCCESS [  4.143 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30.310 s
[INFO] Finished at: 2016-02-23T13:55:20-05:00
[INFO] Final Memory: 80M/191M
[INFO] ------------------------------------------------------------------------

Part of this process is the creation of the update site, which, due to how we configured it, will be represented twice in the "target" folder in "com.example.xsp.update": as a tree of files inside the "site" folder and also zipped up into the "site_assembly.zip" file. There's also a file named "site.zip", but that contains just the site.xml, which is not important. It's these files that you should now target with Designer and the NSF Update Site when updating the plugin. In fact, it'd be a good idea to delete the "features" and "plugins" folders from outside the "target" folder now - they won't be used any more.

As for the "Build All" button in site.xml, it's best to pretend it doesn't exist. It will still work, but it will break your Maven build, because it overwrites the "qualifier" in the version numbers. This is, admittedly, a drag: it's convenient having a clear, logical button to build the site, and it's very inconvenient that Eclipse doesn't tell you not to use it any more. However, the Maven process, besides being now required, has a nice advantage: now building the update site will no longer cause Git to want to check in the change. That's something that can get annoying very quickly when working with another developer on a non-Mavenized OSGi project.

Adding Back The Source Plugin

We'll finish the day on an easy one: adding back in the source plugin. Unlike the original setup, which used a separate feature to house the source plugin, we'll now include it in the same feature. You could also continue to have a separate source feature, which would be useful for very large projects where it would actually be a big burden to deploy the source to servers, but, for XPages libraries, it's generally not worth the cognitive hassle.

Since we already configured the Tycho source plugin earlier, this is just a matter of adding a reference to the (implied) source plugin to the feature.xml:

<?xml version="1.0" encoding="UTF-8"?>
<feature
      id="com.example.xsp.feature"
      label="Example XSP Library Feature"
      version="1.0.0.qualifier">

   <description url="http://www.example.com/description">
      [Enter Feature Description here.]
   </description>

   <copyright url="http://www.example.com/copyright">
      [Enter Copyright Description here.]
   </copyright>

   <license url="http://www.example.com/license">
      [Enter License Description here.]
   </license>

   <plugin
         id="com.example.xsp.plugin"
         download-size="0"
         install-size="0"
         version="0.0.0"
         unpack="false"/>

   <plugin
         id="com.example.xsp.plugin.source"
         download-size="0"
         install-size="0"
         version="0.0.0"/>

</feature>

Now, the source will be included in the output and bundled with the main feature when it's installed. Commit this change.


At this point, we're basically back to where we were previously, OSGi-wise, but in a much better position to scale the project further and take advantage of supporting systems. The next couple posts will cover some of those potential systems, as well as remaining large conceptual topics. There's a great deal to know when it comes to Maven, but it's all helpful.

That Java Thing, Part 15: Converting the Projects

Mon Feb 22 10:26:54 EST 2016

Tags: maven java
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Prelude: there was a typo in the previous entry. Originally, the file URL read "file://C:/IBM/UpdateSite", but, on Windows, there should be another slash in there: "file:///C:/IBM/UpdateSite". I've corrected the original post now, but you should make sure to fix your own settings.xml file if needed. Otherwise, Maven will complain down the line about the URI having "an authority component".

The time has come to do the dirty work of converting our existing plugin projects to Maven. There will be some filesystem-side reorganizing and not every project will make it (looking at you, source project), but overall it's mostly a job of pasting a bunch of XML into new files.

For the first leg of this, I recommend removing the projects from your Eclipse workspace by selecting them, right-clicking, and choosing Delete.

On the confirmation dialog, do not select "Delete project contents on disk" - we don't actually want to get rid of the files.

Next, find the projects on your filesystem, create a new folder alongside them named "com.example.xsp", and move the projects inside it. In Maven parlance, we're creating a "multi-module project", and this new folder is the top level in our module hierarchy. This can contain arbitrary levels and can be very helpful in project organization, but this will be a pretty simple parent-and-children case. Then, dive into the folder and delete the "com.example.xsp.source.feature" project - we'll be able to generate this through Maven now, and so we can trim down our project count slightly.

Now, create a file named pom.xml in the "com.example.xsp" folder alongside the subfolders, and fill its contents with this:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.example</groupId>
	<artifactId>parent-xsp</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	
	<packaging>pom</packaging>

	<modules>
		<module>com.example.xsp.plugin</module>
		<module>com.example.xsp.feature</module>
		<module>com.example.xsp.update</module>
	</modules>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<tycho-version>0.24.0</tycho-version>
		<compiler>1.6</compiler>
	</properties>

	<repositories>
		<repository>
			<id>Luna</id>
			<layout>p2</layout>
			<url>http://download.eclipse.org/releases/luna/</url>
		</repository>
		<repository>
			<id>notes</id>
			<layout>p2</layout>
			<url>${notes-platform}</url>
		</repository>
	</repositories>

	<build>
		<plugins>
			<!--
				Maven compiler options
			-->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			
			<!--
				Tycho plugins
			-->
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-maven-plugin</artifactId>
				<version>${tycho-version}</version>
				<extensions>true</extensions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<strictVersions>false</strictVersions>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-compiler-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-source-plugin</artifactId>
				<version>${tycho-version}</version>
				<executions>
					<execution>
						<id>plugin-source</id>
						<goals>
							<goal>plugin-source</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>target-platform-configuration</artifactId>
				<version>${tycho-version}</version>
				<configuration>

					<pomDependencies>consider</pomDependencies>
					<dependency-resolution>
						<extraRequirements>
							<requirement>
								<type>eclipse-plugin</type>
								<id>com.ibm.notes.java.api.win32.linux</id>
								<versionRange>[9.0.1,9.0.2)</versionRange>
							</requirement>
						</extraRequirements>
						<optionalDependencies>ignore</optionalDependencies>
					</dependency-resolution>

					<filters>
						<!-- work around Equinox bug 348045 -->
						<filter>
							<type>p2-installable-unit</type>
							<id>org.eclipse.equinox.servletbridge.extensionbundle</id>
							<removeAll />
						</filter>
					</filters>

					<environments>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>macosx</os>
							<ws>cocoa</ws>
							<arch>x86_64</arch>
						</environment>
					</environments>
					<resolver>p2</resolver>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

So... yeah, there's a lot going on here. This is the biggest of the "XML dumps" we're going to have and contains by far the greatest number of bizarre "you just have to know about it" parts. "POM" stands for "Project Object Model" - it's the language Maven uses to describe the project. Let's tackle the file from near the top (ignoring the XML header):

project and modelVersion

These elements are effectively just boilerplate: the project element is the root of our project descriptor and it contains some definitions to let XML editors parse the file format. In turn, the modelVersion describes to Maven the specific version we're working with, which has been "4.0.0" for as long as I've been doing this.

groupId, artifactId, and version

These elements are obligatory in one form or another in every project, but are less copy-and-paste-able: they define the name and version of your project. These are Maven's equivalents to OSGi's Bundle-SymbolicName and Bundle-Version, though Maven makes an explicit distinction between the overall grouping of the plugin and its specific name. These are essentially arbitrary, but the convention is to use the standard reverse-DNS version of your domain name for the group ID, and then keep this group ID consistent across different projects (...mostly). The artifact ID is less consistent, but it's good to pick a pattern like "projectPrefix-submodule". Here, we actually reverse that a bit to call it "parent-xsp" in order to emphasize that this project's purpose is entirely to be a parent to the submodules and not an interesting artifact to consume itself. We'll break this convention again for the submodules due to our use of Tycho/OSGi.

packaging

A project's packaging describes the sort of output. By default, if this is left un-specified, it's jar - a normal, run-of-the-mill Jar file. There are a few other common ones you may run into - such as war for J2EE web apps or bundle for non-Tycho OSGi bundles - and the one we're using here is pom. This is actually kind of the "none of the above" option: the "pom" is just the file we're editing now, and is included with every project type. Having a packaging type of pom generally means that either the project has no real outputs of its own (as is the case here) or it's an "ad hoc" project that doesn't fit an existing type.

modules

This block is the hallmark of a parent project: it lists the relative folder paths that contain the submodules. In this case, the names line up with the names we'll use for the submodules, but this could potentially vary depending on the folder names and locations. Parent-child module relationships don't have to be physically hierarchical on the filesystem, but it's a good convention when you don't have a specific reason to break it.

properties

This block is the project-level equivalent to the user-level property we defined in .m2/settings.xml. There are two types of properties that can go here, with no obvious distinction between them: known configuration properties used by Maven itself and arbitrary named variables used by the person writing the pom.

The first property - project.build.sourceEncoding - is an example of the former. During execution, Maven will reference this property defined in the project (or one of its parents) when determining the text file encoding to use. This could be set to something else if you're working with non-Unicode files, but it's important to set it here so that file interpretation will not be platform-dependent. These property names can be read like an equivalent of EL for the project XML: project.build.sourceEncoding notionally sets sourceEncoding within the build node in project, but more concisely (more or less).

The other two are variables for use later. The names of these have only very loose conventions, but there seem to be a couple common types: ALL_CAPS, camelCase, and hyphen-delimited. You can also specify variables as dot.delimited and they will work the same way, but that makes them more difficult to distinguish from the system-level properties.

repositories

The repositories block is the start of our OSGi-related weirdness. The block itself isn't OSGi-specific - it has its role in other projects that want to make use of dependencies outside of the core public Maven repositories - but the contents are. We're setting two repositories here: one to point to the main Eclipse repository (the Luna version here, but that could just as well be Mars or Kepler) and one to point to the XPages Update Site. This is where the property we set before comes into play, allowing different developers to keep the update site in different locations without changing the project's config.

build

The build section is often the largest part of a POM file - it contains definitions and configuration for various additional Maven plugins used during compilation. We have two tasks to accomplish here: ensure that we use Java 6 for compilation at the root level (to ensure the build doesn't execute as Java 7 or 8 and be incompatible with Domino) and enable a whole slew of Tycho plugins.

The maven-compiler-plugin block specifies the version of the Java compiler plugin to use (3.1, which is actually kind of old, but the differences aren't important) and then provides it with some configuration to set the Java version level and to not choke on forbidden references. Like the Java version, the latter is a nod to Domino: depending on your JVM configuration, you may run into forbidden-reference errors relating to the lotus.domino classes.

The next slew of blocks all relate to loading up various Tycho components. Tycho's job is essentially to construct an entire OSGi environment during the Maven build, and it consists of a number of moving parts, many of which are basically the Tycho version of normal Maven facilities. The tycho-maven-plugin is the core, and its extensions rule is what allows it to worm itself into various phases of the build process. The tycho-packaging-plugin controls the process of bundling the projects as their various types: the plugin, the feature, and the update site (in our case). The tycho-compiler-plugin is the Tycho variant of the Maven one we configured earlier. The tycho-source-plugin is what allowed us to kick the standalone source feature to the curb - it's the equivalent of the Eclipse-specific feature we had been hooking into before.

The target-platform-configuration plugin is the scariest of the bunch. This plugin's job is to establish the OSGi Target Platform we're working with, in conjunction with the repository specified above. Not all of this configuration is necessary for our immediate needs, but may come in handy later. The pomDependencies rule is useful when using mixed-type dependencies in more-complicated projects, while the extraRequirements block forces the inclusion of the plugin fragment that contains Notes.jar. The optionalDependencies rule comes in handy from time to time with XPages projects: there are sometimes cases where there's a dependency that Eclipse has and which the server will have, but which will be awkward to get to in Maven, usually relating to dependencies-of-dependencies not related to compilation. The filters block is... I don't know; just keep it in there. The environments block is a way to describe the platforms on which your code can execute - I believe this is primarily used when testing. The resolver is like filters in that it's an "I just copy it around" thing; presumably, it refers to a specific code path for resolving plugin dependencies.

Whew!

Okay, so... that's the first one down! For the most part, you can carry around the whole bottom section pretty much as-is for your XPages Maven projects (as I do), and then gradually become comfortable with the specifics over time. Now, there's some good news and some bad news:

  • The bad news is that there are three more POM files to write.
  • The good news is that they are much, much simpler.

In recent versions of Tycho, they've added the ability to reduce the number of POMs involved, but there are limits to that, particularly to do with Jenkins, so we'll stick to the traditional way for now.

com.example.xsp.plugin

Go into the "com.example.xsp.plugin" folder and create a new pom.xml containing this:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.plugin</artifactId>
	<packaging>eclipse-plugin</packaging>
</project>

Like I promised: much simpler. Because the parent POM already brought in all the Tycho plugins and configuration, all we need to do here is the basics. One slightly-unusual aspect here is the packaging type. eclipse-plugin isn't a packaging type known to Maven inherently; instead, it's provided by Tycho, but can be used in the same way.

The artifact ID here is a concession to Tycho: Maven artifact IDs don't usually follow the same full-reverse-DNS conversion as OSGi, but Tycho wants the artifact ID to match the OSGi bundle name.

com.example.xsp.feature

Next up is the pom.xml file in the "com.example.xsp.feature" folder:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.feature</artifactId>
	<packaging>eclipse-feature</packaging>
</project>

This is very similar to the last, with the only real differences being the artifact ID and the packaging type.

com.example.xsp.update

Now, the pom.xml in "com.example.xsp.update":

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.update</artifactId>
	<packaging>eclipse-update-site</packaging>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<archiveSite>true</archiveSite>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

This one's slightly longer, but not by too much. Beyond the different artifact ID and packaging, we also provide some additional configuration to the tycho-packaging-plugin. This is among the plugins that were established in the root POM, but it's re-defined here in order to enable the archiveSite configuration option. This will give us a nice ZIP file of the Update Site at the end.

There's one other thing to note here: Tycho considers the eclipse-update-site packaging type to be deprecated, and it may be removed in the future. In most examples you'll see outside of Domino, people use eclipse-repository instead. This gets back to the difference between the old-style ("site.xml") Eclipse Update Sites and the new-style p2 repositories (defined by a "category.xml" file). For now, we use the old-style variant because it works better with Notes and Domino.
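For reference, the eclipse-repository equivalent would swap site.xml for a category.xml with a nearly identical shape - a sketch, since we're not using it here:

<?xml version="1.0" encoding="UTF-8"?>
<site>
   <feature id="com.example.xsp.feature" version="0.0.0">
      <category name="Example"/>
   </feature>
   <category-def name="Example" label="Example"/>
</site>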

In addition to this POM file, we also have two changes to make in the site.xml: remove the source-feature reference (we'll add this back elsewhere later) and clean up the versions:

<?xml version="1.0" encoding="UTF-8"?>
<site>
   <feature url="features/com.example.xsp.feature_1.0.0.qualifier.jar" id="com.example.xsp.feature" version="1.0.0.qualifier">
      <category name="Example"/>
   </feature>
   <category-def name="Example" label="Example"/>
</site>

The version distinction is to change the Eclipse-generated timestamps at the end of the versions to "qualifier". The reason for this is that, for Tycho, the site.xml acts as a pure configuration file, and will no longer be the site index itself. So Tycho wants the qualifier to be generic, and then will fill it in during compilation. Like the source feature, this will be covered more later.

Last Steps

With our POM files defined, the last step for now is to import the projects back into Eclipse. In Eclipse, go to File → Import, expand the "Maven" category, and choose "Existing Maven Projects".

On the next screen, browse to the "com.example.xsp" directory created earlier. If all goes well, this should find the four projects in their hierarchy.

Everything on this can be left as the defaults, though you may want to specify a more-descriptive working set name - that doesn't affect the project behavior.

When you click "Finish", Eclipse with churn for a bit and then, if you're running Mars and haven't done this before, it will present a dialog about "Maven plugin connectors":

The specifics of what is going on here are a large topic of their own, but the short of it is that Eclipse needs specialized plugins to deal with each Maven plugin, and in this case it's looked for (and found) connectors for Tycho. Click "Finish", "Next", and "OK", accept the license terms, and restart Eclipse as it tells you to.

When Eclipse restarts, it should go through some churning while it updates the Maven projects and should finally settle on no remaining errors.

Closing Out

There will be some things to discuss with the fallout from this conversion, but this will do it for today. Commit your changes, stand up and stretch, and grab a cup of relaxing tea.

That Java Thing, Part 14: Maven Environment Setup

Sun Feb 21 17:51:37 EST 2016

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

Before diving into the task of converting our plugin projects to Maven, there's a bit of setup we need to do. In a basic case, Maven doesn't require much setup beyond the project file itself - it's a "convention over configuration" type of thing that tries to make doing things the default way smooth. However, since it's also a "Java" thing, that means that anything out of the ordinary requires a bunch of XML.

Our big "out of the ordinary" aspect is OSGi. Maven and OSGi are often at loggerheads, but the conflict won't be too great in our situation. Still, it does mean there will be a few hoops to jump through, and one of those hoops is dealing with our dependency on the XPages runtime plugins. Since these plugins are not packaged as fully-Mavenized artifacts (yet (hopefully)), we'll need to configure Tycho to read the p2 (Eclipse) style.

In part 3, we downloaded the Build Management Update Site from OpenNTF, and we'll reuse that here. What we need to do is create a global Maven settings file that, for now, will just contain a definition of a variable to point to this update site. It would also be possible to specify this inside the project itself, but it's better form to use a consistent variable name (the most common convention is notes-platform) in the project files and then have your local settings point to it on your machine.

The global Maven settings file is called settings.xml and is stored in a folder named .m2 in your user's home directory (e.g. C:\Users\someuser\.m2 or /Users/someuser/.m2). Creating a folder named with a leading dot can be a pain in Explorer and the Finder, so it may be necessary to drop into a command line or other tool to do it. One way or another, create this file and set its contents similar to this:

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
	<profiles>
		<profile>
			<id>main</id>
			<properties>
				<notes-platform>file:///C:/IBM/UpdateSite</notes-platform>
			</properties>
		</profile>
	</profiles>
	<activeProfiles>
		<activeProfile>main</activeProfile>
	</activeProfiles>
</settings>

Adjust the file:// URL as necessary to point to the location on your computer. It has to be a file URL and not a normal path, presumably because repositories are usually expected to be remote HTTP sites.

This is the only configuration we need before getting to the project, but it's a good preview of the sort of "try pasting this big block of XML somewhere" advice you're in for when it comes to Maven use. Over time, the structure of the XML and how it relates to Maven's behavior begins to crystallize, but it's definitely cumbersome to start with, and it will get more opaque before it gets less so.

Depending on your proclivities, this may be a good opportunity to install standalone Maven as well. Eclipse has its own embedded version, so this is not required, but it can be handy sometimes to be able to run Maven from the command line. On your average Linux distribution or OS X with Homebrew, Maven should be installable with a package manager. Otherwise, Maven can be downloaded from maven.apache.org - it doesn't have an installer as such, as it's essentially some scripts around Java classes, but they have tips for adding it to your path.

Next, we'll get to the real meat of this process: actually converting the projects to Maven.

That Java Thing, Part 13: Introduction to Maven

Fri Feb 19 18:27:12 EST 2016

Tags: maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment

I've been laying warnings that this would be coming and you've seen me grouse about it for over a year, but now the time has come to really dive into Maven for Domino developers.

To lead into it, there are two main topics to cover: what Maven is and why you should bother.

What Maven Is

Maven is a build automation tool, primarily for Java applications but able to work with a number of other languages and environments.

The concept of a "build automation tool" is a strange one when you're coming from a Notes/Domino perspective, and it's the source of a lot of consternation when moving to it. In classic Notes, there conceptually was no build phase for an application: certain things would be compiled on save, but there was rarely any need to think about this. Designer was the way to write applications and it took care of it. With XPages came a bit of Eclipse-ism with the notion of "Build" being a separate, not-necessarily-automatic stage, but there still wasn't much user-facing configuration going on: other than maybe adding a source folder, the IDE just kind of took care of it.

Even for OSGi plugin developers, the need seems a little arcane. Eclipse does have project and build configurations, and it runs through build scripts internally when you export Jars or build an update site. Again, though, this is all largely hidden and the user doesn't normally have to think much about it.

Where this comes in, though, is when you want to start expanding your projects in ways beyond the "single bag of code" stage: automatically including pre-packaged dependencies, making the project cleanly available to others downstream, sharing configuration across projects, and, particularly, automating building, testing, and deployment with an environment like Jenkins. Maven (and alternatives like Gradle) provides important structure and meta-information to do these things and scale them to ever-larger tasks.

Why You Should Bother

I've found the pro-Maven pitch to be kind of a weird one, since it's sort of like unit/integration testing in that, before you do it, it doesn't seem worth the hassle, but then, when you've switched over, it seems crazy to not do it. Like a cult, I guess, but a somewhat better idea.

I'll start with an important reason: it's good for your career. Unless you want to remain on legacy-maintenance duty forever (which, granted, can be a stable gig), it's important to keep improving your skills, and build automation is a big concept that pays dividends in knowledge in Domino programming and beyond. Once you're familiar with a system like Maven, you start recognizing the same patterns elsewhere: in some of OSGi's capabilities, in older systems like Make, and, crucially, in whatever modern JavaScript toolchains are doing lately. Learning something like this opens up doors.

It also makes testing that much more natural. Building and running automated tests is certainly possible in Eclipse, but Maven's structure strongly encourages it (newly-created projects start with JUnit and a tests directory, to nudge you in that direction), and having test running be a phase in building makes it much more foolproof. The virtue of writing tests creeps up on you: even if you don't go whole-hog TDD, getting into the habit of starting each bug fix with a failing-then-successful test case means you now have a bug you'll never have to see again. Granted, as with a lot of other aspects, the nature of Domino development creates some hurdles, but it's still worth it.

And finally: features, features, features. Once you get comfortable with Maven, it becomes much easier to spin up accompanying source and Javadoc packages for your projects, add in other languages, filter content during builds, create alternate build profiles for different situations, bring in remote resources easily, deploy to targets automatically, manage versioning cleanly, and so forth. These are all things that are possible in the absence of a system like Maven, but Maven brings them all together in a way that is understandable both by your IDE of choice and by faceless servers.

What's Next

The next step will move from high-flown concepts to some brass tacks: preparing a Maven environment for Domino development and getting a look at what Maven's configuration files look like.

Wrangling Tycho and Target Platforms

Sun Aug 30 17:16:13 EDT 2015

Tags: maven tycho

One of the persistent problems when dealing with OSGi projects with Maven is the interaction between Maven, Tycho, and Eclipse. The core trouble comes in with the differing ways that Maven and OSGi handle dependencies.

Dependency Mechanisms

The Maven way of establishing dependencies is to list them in your Maven project's POM file. A standard one will look something like this:

<dependencies>
	<dependency>
		<groupId>com.google.guava</groupId>
		<artifactId>guava</artifactId>
		<version>18.0</version>
	</dependency>
</dependencies>

This tells Maven that your project depends on Guava version 18.0. The "groupId" and "artifactId" bits are essentially arbitrary strings that identify the piece of code, and, following Java standards, convention dictates that they are generally reverse-DNS-style. There are variations on this setup, such as specifying version ranges or sub-artifacts, but that's what you'll usually see. The term "artifact" is a Maven-ism referring to a specific entity, usually a single Jar file, and I've taken to using it casually.

One of the key things Maven brings to the table here is Maven Central: a warehouse of common Maven-ized projects. Without specifying any additional configuration, the dependency declaration above will cause Maven to check with Maven Central to find the Jar, download it, and store it in your local repository (usually ~/.m2/repository). Then, during the build process, Java can reference the local copy of the Jar in the consistently-organized local folder structure. It will also, if needed, download "transitive" dependencies: the dependencies listed by the project you're depending on.
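For example, the Guava artifact above lands at a predictable path built from its coordinates:

~/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar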

OSGi's dependency system is conceptually similar. Instead of the POM file, it piggybacks on the Jar's MANIFEST.MF file with something like this:

Require-Bundle: com.google.guava;bundle-version="18.0"

This is essentially the same idea as the Maven dependency: you reference an OSGi-enabled Jar (called a "Bundle" in OSGi parlance... which can also be a "Plug-in") by its usually-reverse-DNS name and provide restrictions on versions, plus other potential options.

There is no equivalent here of Maven Central: OSGi artifacts are found in Update Sites for each project and are added to the OSGi environment. When you install a plug-in in Eclipse/Designer or Domino, you are contributing to your installation's pool of OSGi artifacts. There are some conveniences to make this experience easier in some cases, such as the Eclipse Marketplace and the primary Eclipse Update Site, but it's not as coordinated as Maven.

The Overlap

Though often redundant, these two dependency mechanisms are not inherently incompatible. A given Jar file can be represented as both a Maven artifact and an OSGi bundle - and, indeed, a great many of the artifacts in Maven Central come pre-packaged with OSGi metadata, and there are Maven plugins to make generating this invisible to the developer.

Tycho - the Maven plugin that creates an OSGi environment for your Maven development - has the capability to more-or-less bridge this gap. By adding the Tycho plugins to your Maven build, you can point Maven at OSGi Update Sites (called "p2" sites) and Tycho will be able to find the artifacts referenced by your project's MANIFEST.MF Require-Bundle line. Even better, by using <pomDependencies>consider</pomDependencies> in your Tycho config, it will be able to look at the Maven dependencies of your project, check them for OSGi metadata, and then use that to satisfy the MANIFEST.MF requirements.
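Concretely, the Maven side of that bridge looks like the sketch below, with the Require-Bundle line staying as-is in the MANIFEST.MF (this assumes a tycho-version property, per the usual setup):

<dependencies>
	<dependency>
		<groupId>com.google.guava</groupId>
		<artifactId>guava</artifactId>
		<version>18.0</version>
	</dependency>
</dependencies>

<build>
	<plugins>
		<!-- Allow Tycho to satisfy OSGi requirements from POM dependencies -->
		<plugin>
			<groupId>org.eclipse.tycho</groupId>
			<artifactId>target-platform-configuration</artifactId>
			<version>${tycho-version}</version>
			<configuration>
				<pomDependencies>consider</pomDependencies>
			</configuration>
		</plugin>
	</plugins>
</build>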

Though convoluted to say, the upshot is that, when you have that pomDependencies option, things work out pretty well... from the command line. The trouble comes in when you want to develop these projects in Eclipse.

Target Platforms

The aggregate set of OSGi bundles known by your OSGi environment (either Tycho or Eclipse in this case) and used for compilation is the "Target Platform". If you've used the XPages SDK or otherwise set up a non-Designer Eclipse installation for XPages plug-in development, you've seen Target Platforms in action: the installation process locates your Notes and Domino installations and adds their OSGi bundles to Eclipse's Target Platform, allowing them to be references by your own OSGi projects.

The trouble is that Eclipse is a bit... inflexible when it comes to specifying a project's Target Platform. Though Eclipse has the capacity to have many Target Platform definitions, only one is active at a time for your entire workspace. Moreover, this Target Platform (plus any projects in your workspace) makes up the entirety of what Eclipse is willing to acknowledge for OSGi development.

This causes serious trouble for Maven dependencies.

If you have a Tycho-enabled project, Eclipse's adapter will not use its Maven dependencies for OSGi requirement resolution. So if your project lists Guava in both OSGi and Maven, even though Maven can see it, and Tycho can see it, and the Guava Jar sitting in your local Maven repository is brimming with OSGi metadata, Eclipse will not acknowledge it and you will have an error that com.google.guava can't be found.

Workarounds

There are a couple potential workarounds for this, none of which are particularly great.

Just Do It Manually

One option is to just have any developers working on the project also track down and manually add all applicable OSGi bundles to their Eclipse installation. It's not ideal, but it could work in a pinch, especially if you only have a single dependency or two.

Include the Project Wholesale

This is the approach the OpenNTF Domino API has taken to date: several of its external dependencies are included wholesale in source form in the project tree. This accomplishes the goal because, with the projects in your workspace, Eclipse will happily acknowledge them as part of the Target Platform, while Tycho will also be able to recognize them. However, it carries with it the significant downside of importing a whole heap of foreign code into your project and then having to ensure that it builds in your environment.

Maven-Generated Target Platform

Another option is to have Maven create a Target Platform file (*.target) dynamically, and then have Eclipse use that as its Target Platform definition. You can do that by including a Maven project like this in your tree:

<?xml version="1.0"?>
<project
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
	xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>project-parent</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>example-osgi-target</artifactId>
	
	<packaging>eclipse-target-definition</packaging>
	
	<build>
		<plugins>
			<plugin>
				<groupId>lt.velykis.maven</groupId>
				<artifactId>pde-target-maven-plugin</artifactId>
				<version>1.0.0</version>
				<executions>
					<execution>
						<id>pde-target</id>
						<goals>
							<goal>add-pom-dependencies</goal>
						</goals>
						<configuration>
							<baseDefinition>${project.basedir}/osgi-base.target</baseDefinition>
							<outputFile>${project.basedir}/${project.artifactId}.target</outputFile>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

By creating a shell Target file in Eclipse named osgi-base.target, this project will locate its known dependencies (namely, any dependencies listed in it or in parent projects) and glom the paths of any of those OSGi plugins found in your local Maven repository onto it. In Eclipse, you can then open the generated Target file and set it as your active Target Platform.

This... basically works, but it's ugly. Moreover, it limits your Target Platform customization options. If you want to include other Update Sites in your platform (say, the XPages targets generated by the SDK), you would have to modify the base Target file manually, making it fragile for multi-developer use.

Maven-Generated p2 Site

This is the option I'm tinkering with now, and it's similar to the Target-file approach. However, instead of creating an exclusive Target Platform, you can have Maven create a p2 Update Site and then add that directory to your Target Platform manually. That manual step is still unfortunate, but it's not too bad, and it should adapt automatically as more dependencies are added. A Maven plugin named p2-maven-plugin can do a tremendous amount of heavy lifting here: it can track down Maven dependencies, add OSGi metadata if they don't have them already, do the same for their dependencies, and then put them all into a nicely-organized Update Site:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.example</groupId>
	<artifactId>example-osgi-site</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	<packaging>pom</packaging>

	<pluginRepositories>
		<pluginRepository>
			<id>reficio</id>
			<url>http://repo.reficio.org/maven/</url>
		</pluginRepository>
	</pluginRepositories>

	<build>
		<plugins>
			<plugin>
				<groupId>org.reficio</groupId>
				<artifactId>p2-maven-plugin</artifactId>
				<version>1.2.0-SNAPSHOT</version>
				<executions>
					<execution>
						<id>default-cli</id>
						<phase>validate</phase>
						<goals>
							<goal>site</goal>
						</goals>
						<configuration>
							<artifacts>
								<artifact><id>com.google.guava:guava:18.0</id></artifact>
							</artifacts>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

Once this project is executed, you can then add the generated folder to Eclipse's active Target Platform and be set. Though I haven't put this into practice yet, it may be the best out of a bad bunch of options.

Don't Use Eclipse

Well, I guess this final option may be the best if you're not an Eclipse fan - other IDEs may handle this whole thing much more smoothly. So, if you use IntelliJ and it doesn't have this problem, that's good.


These problems cause a lot more heartburn than you'd think they should, considering that this is basic project setup and not even part of the task of actually developing your project, but such is life. As long as you have a dependency on non-Mavenized OSGi artifacts (such as the XPages runtime) or want to use Tycho's full abilities (such as OSGi-environment unit tests or building full Eclipse-based applications) while also developing in Eclipse, you're stuck with this sort of workaround.

MWLUG 2015 - Maven: An Exhortation and Apology

Sun Aug 16 11:55:17 EDT 2015

Tags: mwlug maven

At MWLUG this coming week, I'll be giving a presentation on Maven. Specifically, I plan to cover:

  • What Maven is
  • Why Domino developers should know about it
  • Why it's so painful and awkward for Domino developers
  • Why it's still worth using in spite of all the suffering
  • How this will help when working on projects outside of traditional Domino

The session is slated for 3:30 PM on Thursday. I expect it to be cathartic for me and useful for the attendees, so I hope you can make it.

Maven Native Chronicles, Part 3: Improving Native Artifact Handling

Sun Jul 26 21:38:37 EDT 2015

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

This post isn't so much a part of the current series as it is a followup to a post from the other week, but I can conceptually retcon that one in as a prologue. This will also be a good quick tip for dealing with Maven projects.

In my previous post, I described how I copied the built native shared library from the C++ project into the OSGi fragments for distribution, and I left it with the really hacky approach of copying the file using a project-relative path that reached up into the other project. It technically functioned, but it relied on the specific project structure, which wouldn't survive any reorganization or breaking up of the module tree.
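For illustration, the hacky original looked something like this - a sketch using maven-resources-plugin, where the reach up and over into the sibling project is the fragile part:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-resources-plugin</artifactId>
	<version>2.7</version>
	<executions>
		<execution>
			<id>copy-native-lib</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>copy-resources</goal>
			</goals>
			<configuration>
				<outputDirectory>${project.basedir}/lib</outputDirectory>
				<resources>
					<resource>
						<!-- Relies on the sibling project's location and layout - brittle -->
						<directory>${project.basedir}/../native-project-name/x64/Debug</directory>
						<includes>
							<include>nativelib-win32-x64.dll</include>
						</includes>
					</resource>
				</resources>
			</configuration>
		</execution>
	</executions>
</plugin>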

To improve it, I reworked it to be a bit more Maven-y, which involves two steps: attaching the built artifacts to the output of the native project and then using the dependency plugin to copy the native artifacts in as needed. For the first step, I used the build-helper-maven-plugin, though there may be other ways to do it. This is relatively straightforward, though:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>build-helper-maven-plugin</artifactId>
	<version>1.3</version>
	<executions>
		<execution>
			<id>attach-artifacts</id>
			<phase>package</phase>
			<goals>
				<goal>attach-artifact</goal>
			</goals>
			<configuration>
				<artifacts>
					<artifact>
						<file>${project.basedir}/x64/Debug/nativelib-win32-x64.dll</file>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifact>
					<artifact>
						<file>${project.basedir}/Win32/Debug/nativelib-win32-x86.dll</file>
						<type>dll</type>
						<classifier>win32-x86</classifier>
					</artifact>
				</artifacts>
			</configuration>
		</execution>
	</executions>
</plugin>

This causes the native libraries - so far, the two Windows ones - to be included in the Maven repository during installation, and to then be accessible from other projects. The files are named using the module base name plus the classifier appended and the type as the file extension, like native-project-name-win32-x64.dll.

To copy that artifact into the OSGi bundle project, I then use maven-dependency-plugin to copy it in. Here I reference it via the module name and the classifier/type pair used above (with some shorthands because they're in the same multi-module project):

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<version>2.10</version>
	
	<executions>
		<execution>
			<id>copy-native-lib</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>copy</goal>
			</goals>
			<configuration>
				<artifactItems>
					<artifactItem>
						<groupId>${project.groupId}</groupId>
						<artifactId>native-project-name</artifactId>
						<version>${project.version}</version>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifactItem>
				</artifactItems>
				<outputDirectory>lib</outputDirectory>
				<stripVersion>true</stripVersion>
			</configuration>
		</execution>
	</executions>
</plugin>

The net result here is the same as previously, but should be more maintainable.

Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node

Sun Jul 26 11:16:50 EDT 2015

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

Before I get to the meat of this post, I want to point out that Ulrich Krause wrote a post on a similar topic today and you should read it.

The build process I've been working with involves a Jenkins server running on OS X (in order to build iOS binaries), and so it will be useful to have a Windows instance set up as well to run native builds and, importantly, tests. Jenkins comes with support for distributed builds and makes it relatively straightforward.

To start with, I installed VirtualBox and went through the usual Windows setup process - it shouldn't matter too much which major version of Windows you use, as long as it's 64-bit, in order to be able to generate and test both types of binaries. Once that was running, I installed the latest 64-bit JDK followed by Visual Studio Community, which is a pretty smooth process (for all their faults, Microsoft knows how to treat developers). To provide access to the VM from the Mac host, I added a second network adapter to the VM and set it to host-only networking.

During this process, I found Jump Desktop to be a very useful tool. Since the Mac host runs SSH, I was able to set up an RDP connection to the Windows VM using an SSH tunnel, which Jump does transparently for you. This made for a much better experience than VNCing into the Mac and controlling Windows in the VirtualBox window there.

Next, I decided that the route I wanted to take to control the Windows slave was SSH, since SSH is the bee's knees. I installed Cygwin, which creates a fairly Unix-like environment on top of Windows, and included OpenSSH in the process. After going through the afore-linked setup process, I had SSH access to the Windows machine (including, thanks to SSH proxying, remote access via the primary build server). On the Jenkins side on the Mac, I installed the "Cygpath plugin" (which is in the built-in plugin manager) to avoid any of the issues mentioned on the wiki page. The configuration in Jenkins is relatively straightforward (I will probably end up changing the base directory to be a clean Jenkins home, since I hadn't initially been sure if I needed Jenkins installed on the slave).

With that, I was able to set the build to run on servers with the "windows" label, kick it off, and start going through its complaints until I had it working.

First off, I had some more Java setup to do, specifically creating a system environment variable named JAVA_HOME and setting it to the root of the JDK ("C:\Program Files\Java\jdk1.8.0_51" in this case). Then, I set up Maven, which is something of an awkward process on Windows, but not TOO bad. I downloaded the latest binaries, unzipped them to "C:\Program Files\maven", and added an environment variable of M2_HOME pointing to that directory.

I also added %M2_HOME%\bin;C:\Program Files (x86)\MSBuild\12.0\Bin to the end of the PATH variable, to cover both the Maven tools and the msbuild executable for later.

I ran into a bit of weirdness when it came to setting up configuration for SSH and Maven, specifically because it seems that Cygwin has two home folders for the logged-in user: the Unix-style /home/jesse and the normal Windows C:\Users\jesse (which is available in Cygwin as /cygdrive/c/Users/jesse). Since this Jenkins build checks out the code from GitHub via SSH, I needed to copy over the id_rsa file for the Jenkins user: this went into /home/jesse/.ssh/id_rsa. In order to configure Maven, though, the settings file went to C:\Users\jesse\.m2\settings.xml.
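As a reference point, the settings.xml itself is nothing special - here's a minimal sketch of what it might contain on this node, assuming the notes-platform property convention from my earlier Maven-ization posts below; the profile name and local path are hypothetical:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
	<profiles>
		<profile>
			<id>main</id>
			<properties>
				<!-- Hypothetical path to the extracted IBM Domino Update Site on this VM -->
				<notes-platform>file:///C:/updateSite</notes-platform>
			</properties>
		</profile>
	</profiles>
	<activeProfiles>
		<activeProfile>main</activeProfile>
	</activeProfiles>
</settings>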

Eventually, it slogged its way through the build to completion, including a successful run of the integration tests. I still need to figure out the best way to get the resultant artifacts back out (or maybe it will be best to just deploy from both to the same Artifactory server), but this seems to do the main task for me.

Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin

Fri Jul 24 15:48:59 EDT 2015

  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

As I mentioned the other day, my work lately involves a native shared library that is then included in an OSGi plugin. To get it working during a Maven compile, I just farmed out the actual build process to Visual Studio's command-line project builder. That works as far as it goes, but it's not particularly Maven-y and, more importantly, it's Windows-only.

In looking around, it seems like the most popular method of doing native compilation in Maven, especially with JNI components, is the nar-maven-plugin - "nar" means "Native ARchive", and it's meant to be a consistent way to package native artifacts (executables and libraries) across platforms. It does an admirable job wrangling the normally-loose nature of a C/C++ program to work with Maven-ish standards and attempts to paper over the differences between platforms and toolchains. I'm not entirely convinced that this will be the way I go long-term (in particular, its attitude towards multi-platform/arch builds seems to be "eh, sort of?"), but it's a good place to get started with non-Windows compilation.

The first step was to move the files around to mostly match a Maven-style layout. Starting out, the .cpp and .h files were in the src folder directly, while dependency headers were in a dependencies folder next to it. I left the Notes includes in there for now, but it seems that nar-maven-plugin will cover the JNI stuff for me, so I could simplify that somewhat. The new project structure looks like:

  • (project root)
    • src
      • main
        • c++
        • include
    • dependencies
      • inc
        • notes

Next was to set up the project configuration. For now, I want to still use Visual Studio's CLI app to build the Windows version, and I'm going to have to specifically define supported platforms, so I define the project as a nar, but then disable actual execution of the plugin by default:

<project>
	...
	<packaging>nar</packaging>
	
	<build>
		<plugins>
			<plugin>
				<groupId>com.github.maven-nar</groupId>
				<artifactId>nar-maven-plugin</artifactId>
				<version>3.2.3</version>
				<extensions>true</extensions>
				
				<configuration>
					<skip>true</skip>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

Then, much as I did for the Windows-specific builds, I added a profile to try to build on my Mac. Note that these build settings produce a library that fails all unit tests, so they're surely not correct, but hey, it compiles and links, so that's a start. To ensure that it only builds when it has an appropriate context, it is triggered by a combination of OS family and the presence of the notes-program Maven property, which should point to the Notes executable directory.

<project>
	...
    
	<profiles>
		...
		<profile>
			<id>mac</id>
		
			<activation>
				<os>
					<family>mac</family>
				</os>
				<property>
					<name>notes-program</name>
				</property>
			</activation>
	
			<build>
				<plugins>
					<plugin>
						<groupId>com.github.maven-nar</groupId>
						<artifactId>nar-maven-plugin</artifactId>
						<extensions>true</extensions>
			
						<configuration>
							<skip>false</skip>
				
							<cpp>
								<debug>true</debug>
								<includePaths>
									<includePath>${project.basedir}/src/main/include</includePath>
									<includePath>${project.basedir}/dependencies/inc/notes</includePath>
								</includePaths>
					
								<options>
									<option>-DMAC -DMAC_OSX -DMAC_CARBON -D__CF_USE_FRAMEWORK_INCLUDES__ -DLARGE64_FILES -DHANDLE_IS_32BITS -DTARGET_API_MAC_CARBON -DTARGET_API_MAC_OS8=0 -DPRODUCTION_VERSION -DOVERRIDEDEBUG</option>
								</options>
							</cpp>
							<linker>
								<options>
									<option>-L${notes-program}</option>
								</options>
								<libSet>notes</libSet>
							</linker>
				
							<libraries>
								<library>
									<type>shared</type>
								</library>
							</libraries>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

Unstable though the result may be, the nar plugin does its job: it produces an archive containing the dylib, suitable for distribution as a Maven artifact and extraction into the downstream project, which I'll go into later.
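For illustration, consuming that archive from a downstream module is just a normal dependency declaration with the nar type - a sketch, using the placeholder artifact name from elsewhere in this series:

<dependencies>
	<dependency>
		<groupId>${project.groupId}</groupId>
		<artifactId>native-project-name</artifactId>
		<version>${project.version}</version>
		<type>nar</type>
	</dependency>
</dependencies>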

So this is a good step towards my final goal. As I mentioned, I may end up getting rid of nar-maven-plugin specifically, but this is a good way to shape the code into something more portable (I also got rid of a few Windows-isms in the C++ while I was at it). My ultimate goal is to get a single build run that produces artifacts for all of the important platforms (Windows 32/64 and Linux 32/64 for production, Mac 32/64(?) for JUnit tests during development). I may be able to accomplish that using the nar plugin with a distributed Jenkins build, or with Makefiles and GCC cross-compilers on an OS X build host. If that works, it's the sort of thing that makes all this Maven stuff worthwhile.

Quick-and-Dirty Inclusion of a Visual C++ Project in a Maven Build

Sat Jul 11 19:26:34 EDT 2015

Tags: maven jni

One of my projects lately makes use of a JNI library distributed via an OSGi plugin. The OSGi side of the project uses the typical Maven+Tycho combination for its building, but the native library was developed using Visual C++. This is workable enough, but ideally I'd like to have the whole thing part of one smooth build: compile the native library, then subsequently copy its resultant shared 32- and 64-bit libraries into the OSGi plugins.

From what I've gathered, the "proper" way to do this sort of setup is to use the nar-maven-plugin, which is intended to wrap around the normal compilers for each platform and handle packaging and access to the libraries and related components. I tinkered with this a bit but ran into a lot of trouble trying to get it to work properly, no doubt due to my extremely-limited knowledge of C++ toolchains combined with the natural weirdness of Windows's development environment.

For now, I decided to do it the "ugly" way that nonetheless gets the job done: just run the Visual C++ toolchain from Maven. Fortunately, Microsoft includes a tool called msbuild for this purpose: if you run it in the directory of a Visual C++ project, it will act like the full IDE. I added its executables to my PATH (C:\Program Files (x86)\MSBuild\12.0\bin) and then used a Maven plugin called exec-maven-plugin to launch it (the Ant plugin would also work, but this is more explicit). Since this will only run on Windows, I wrapped it in a triggered profile and added two executions to cover both 32-bit and 64-bit versions:

<project>
	...
	<packaging>pom</packaging>
	...
	
	<profiles>
		<profile>
			<id>windows-x64</id>
		
			<activation>
				<os>
					<family>windows</family>
					<arch>amd64</arch>
				</os>
			</activation>
			
			<build>
				<plugins>
					<plugin>
						<groupId>org.codehaus.mojo</groupId>
						<artifactId>exec-maven-plugin</artifactId>
						<version>1.4.0</version>
						<executions>
							<execution>
								<id>build-x86</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>Win32</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
							<execution>
								<id>build-x64</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>X64</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

The project itself remains configured in Visual Studio. While the source files are certainly modifiable in Eclipse, it won't have the full C/C++ toolchain environment until I figure out a proper way to do that. But this does indeed do the trick: it creates the two DLLs in the same way as when I had been building them in the IDE.

The next step is to automatically include these in the appropriate OSGi fragment projects. For this, at least for now, I'm using the maven-resources-plugin. This configuration depends on the structure of the Maven projects, which is sort of fragile, but it's not too bad when they're in the same overall project. This is the config for the x64 plugin, and there is a separate x86 project with an almost-identical configuration:

<project>
	...
	<build>
		<plugins>
			...
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-resources-plugin</artifactId>
				<version>2.7</version>
				
				<executions>
					<execution>
						<id>copy-native-lib</id>
						<phase>generate-resources</phase>
						<goals>
							<goal>copy-resources</goal>
						</goals>
						<configuration>
							<resources>
								<resource>
									<directory>${project.basedir}/../../native-project-name/x64/Debug/</directory>
									<includes>
										<include>nativelib-win32-x64.dll</include>
									</includes>
								</resource>
							</resources>
							<outputDirectory>${project.basedir}/lib</outputDirectory>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

The result is that, at least when I build on Windows, everything is properly compiled and put in its right place. When running in my normal Mac dev environment, it uses the built libraries that have previously been copied into the plugin, so it still works well enough.

This is still a far cry from an optimal configuration. The requirement of using Visual Studio is cumbersome, which means that any multi-platform build will mean a redundant config (whether it be in the pom or in a separate Makefile), and this current setup isn't properly "Mavenized": the output doesn't go into the "target" folder and the DLLs aren't tagged for inclusion in the installed Maven repo. It suits the purpose, though, of being an intermediate step in a larger build.

My long-term desire is to get this fully cross-platform and automated on a build server. That will involve a lot of learning about the nar-maven-plugin (or Makefiles) as well as either setting up a cross-compilation infrastructure or a series of Jenkins slaves. In theory, an OS X system can have everything it would need to build for the other platforms itself, but I've gathered that the safest way to do it is with the "multiple Jenkins nodes" route. When I develop an improved build system for this, I'll write followup posts.

Building on ODA's Maven-ization

Tue Mar 31 20:30:49 EDT 2015

Tags: maven oda

Over the weekend, I took a bit of time to apply some of my hard-won recent Maven knowledge to a project I wish I had more time to work with lately: the ODA. The development branches have been Maven-ized for half a year or so, but primarily just to the point of getting the compile to work. Now that I know more about it, I was able to go in and make great strides towards several important goals.

As a preliminary note: don't take my current implementations as gospel. There are parts that will no doubt change; for example, there are some intermittent timing issues currently with the final assembly. But the changes I did make have borne some early fruit.

Source Bundles

Over the releases, it's proven surprisingly fiddly to get parameter names, inline Javadoc, and attached source to work in Designer, leaving some builds no better off than the legacy API in those regards. The apparently-consistent fix for this is the use of "source" plugins: OSGi plugins that go alongside the normal one that just contain the source of each class. Those aren't too bad to generate manually from Eclipse, but the point of Maven is getting away from that sort of manual stuff.

Fortunately, Tycho (the OSGi toolkit for Maven) includes a plugin that allows you to generate these source bundles alongside the normal ones, by including this in the list of plugins executed during the build:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-source-plugin</artifactId>
	<version>${tycho-version}</version>
	<executions>
		<execution>
			<id>plugin-source</id>
			<goals>
				<goal>plugin-source</goal>
			</goals>
		</execution>
	</executions>
</plugin>

Once you have that (which I added to the top-level project, so it cascades down), you can then add the plugins to the OSGi feature with the same name as the base plugin plus ".source". Eclipse will give a warning that the plugins don't exist (since they exist only during a Maven build), but you can ignore that.
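As a hypothetical example, if the base plugin is org.openntf.domino, the corresponding entry in the feature.xml would look something like:

<plugin
		id="org.openntf.domino.source"
		download-size="0"
		install-size="0"
		version="0.0.0"
		unpack="false"/>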

Javadoc

Javadoc generation is an area where I suspect I'll make the most changes down the line, but I managed to wrangle it into a spot that mostly works for now.

Not every project in the tree needs Javadoc (for example, we don't need to include docs for third-party modules necessarily), but it's still useful to specify configuration. So I took the already-existing basic config in the parent pom and moved it to pluginManagement for the children:

<pluginManagement>
	<plugins>
		<plugin>
			<!-- javadoc configuration -->
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-javadoc-plugin</artifactId>
			<version>2.9</version>
			<configuration>
				<failOnError>false</failOnError>
				<excludePackageNames>com.sun.*:com.ibm.commons.*:com.ibm.sbt.core.*:com.ibm.sbt.plugin.*:com.ibm.sbt.jslibrary.*:com.ibm.sbt.proxy.*:com.ibm.sbt.security.*:*.util.*:com.ibm.sbt.portlet.*:com.ibm.sbt.playground.*:demo.*:acme.*</excludePackageNames>
			</configuration>
		</plugin>
	</plugins>
</pluginManagement>

Then, I added specific plugin references in the applicable child projects:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-javadoc-plugin</artifactId>
	<executions>
		<execution>
			<id>generate-javadoc</id>
			<phase>package</phase>
			<goals>
				<goal>jar</goal>
			</goals>
		</execution>
	</executions>
</plugin>

With those, the build can generate Javadoc appropriate for consumption in the final assembly down the line.

Assembly

The final coordinating piece is referred to as the "assembly". The job of the Maven Assembly Plugin is to take your project components and output - built Jars, source files, documentation, etc. - and assemble them into an appropriate final format, usually a ZIP file.

The route I took is to add a distribution project to the tree whose sole job it is to wait until the other components are done and then assemble the results. The pom for this project primarily consists of telling Maven to run the assembly plugin to create an appropriately-named ZIP file using what's called an "assembly descriptor": an XML file that actually provides the instructions. There are a couple stock descriptors, but for something like this it's useful to write your own. It's quite a file (and also liable to change as I figure out the best practices), but is broken down into a couple logical segments.
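The assembly-plugin portion of that distribution pom is brief - approximately this sketch, where the descriptor path and execution details are assumptions rather than the exact config:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-assembly-plugin</artifactId>
	<configuration>
		<descriptors>
			<descriptor>src/assembly/assembly.xml</descriptor>
		</descriptors>
	</configuration>
	<executions>
		<execution>
			<id>make-assembly</id>
			<phase>package</phase>
			<goals>
				<goal>single</goal>
			</goals>
		</execution>
	</executions>
</plugin>

With that pointing at the descriptor, the descriptor itself does the real work.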

First off, the descriptor has a rule telling the plugin to include all files from the "src/main/resources" folder in the current (assembly) project:

<fileSets>
	<fileSet>
		<directory>src/main/resources</directory>
		<includes>
			<include>**/*</include>
		</includes>
		<outputDirectory>/</outputDirectory>
	</fileSet>
</fileSets>

This folder contains a README description of the result as well as the miscellaneous presentations and demo files the ODA has collected over time.

Next, in addition to the source bundles mentioned earlier, I want to include ZIP files of the important project sources in the distribution, for easy access (technically wasteful, but not by too much):

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino</include>
		<include>org.openntf.domino:org.openntf.domino.xsp</include>
		<include>org.openntf.domino:org.openntf.formula</include>
		<include>org.openntf.domino:org.openntf.junit4xpages</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>src</attachmentClassifier>
		<outputDirectory>/source/</outputDirectory>
		<unpack>false</unpack>
		<outputFileNameMapping>${module.artifactId}.${module.extension}</outputFileNameMapping>
	</binaries>
</moduleSet>

I use the "binaries" tag here instead of "sources" because I want to include the ZIP forms (hence unpack=false) - this is one part that may change, but it works for now.

Next, I gather the Javadocs generated earlier, but these I do want to unpack:

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino</include>
		<include>org.openntf.domino:org.openntf.domino.xsp</include>
		<include>org.openntf.domino:org.openntf.formula</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>javadoc</attachmentClassifier>
		<outputDirectory>/apidocs/${module.artifactId}</outputDirectory>
		<unpack>true</unpack>
	</binaries>
</moduleSet>

This results in an "apidocs" folder containing the Javadoc HTML for each of those three projects in subfolders.

Finally, I want to include the built and ZIP'd Update Site for use in Designer and Domino:

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino.updatesite</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>assembly</attachmentClassifier>
		<outputDirectory>/</outputDirectory>
		<unpack>false</unpack>
		<includeDependencies>false</includeDependencies>
		<outputFileNameMapping>UpdateSite.zip</outputFileNameMapping>
	</binaries>
	
	<sources>
		<outputDirectory>/</outputDirectory>
		<includeModuleDirectory>false</includeModuleDirectory>
		<includes>
			<include>LICENSE</include>
			<include>NOTICE</include>
		</includes>
	</sources>
</moduleSet>

While grabbing the Update Site, I also copy the all-important LICENSE and NOTICE files from this current project - these may be best moved to the resources folder above.

The result of all this is a nicely-packed ZIP containing everything a user should need to get started with the API.

Next Steps

So, as I mentioned, this work isn't complete, in large part because I'm still learning the ropes. I suspect that the way I'm gathering the sources in the assembly and generating and gathering the Javadoc are not quite right - and this shows in the way that slightly-different host configurations (like on a Bamboo build server or when doing a multi-threaded build) fail during packaging.

Additionally, it's somewhat wasteful to include the source plugins even for server distributions; I won't really lose sleep over it, but it'd still be ideal to continue the recent policy of providing ExtLib-style distinct Update Sites. I'm not sure if this will require creating multiple feature and update-site projects or if it can be accomplished with build profiles.

Finally, I would love to be able to get rid of the source-form third-party dependencies like Guava and Javolution. One of the main benefits of Maven is that you can very-easily consume dependencies by listing them in the config, but Tycho and Eclipse throw a wrench into that: when you configure a project to use Tycho, then Eclipse stops referencing the Maven dependencies. Moreover, even though I believe all of the dependencies we use contain OSGi metadata, which would satisfy a Tycho command-line build, both Eclipse and the requirement that we build an old-style (non-p2) Update Site prevent us from doing that simply. It's possible that the best route will be to have Maven download and copy in the Jar files of the dependencies, but even that has its own suite of issues.

But, in any event, it's satisfying seeing this come together - and nice for me personally to build on the work Nathan, Paul, and Roland-and-co. have been doing lately. Maven is a monster and still suffers from severe "how the F does this stuff work?" problems, but it does feel good to put it to work.

Auto-OSGi-ifying Maven Projects

Sat Mar 28 16:15:59 EDT 2015

Tags: maven

In my last post, I discussed some of the problems that you run into when dealing with Maven projects that should also be OSGi plugins, particularly when you're mixing OSGi and non-OSGi projects in the same Maven build (in short: don't do that). Since then, things have smoothed out, particularly when I split the OSGi portion out into another Maven build, allowing it to consume the "core" artifacts cleanly, without the timing issue described previously.

But I ran into another layer of the task: consuming the Maven artifacts as plain Jars is all well and good, but the ideal would be to also have them available as a suite of OSGi plugins, so they can be managed and debugged more easily in an OSGi environment like Eclipse or Domino. Fortunately, this process, while still fairly opaque, is smoother than the earlier task.

A note on terminology: the term "plugin" can refer to both the OSGi component as well as the tools added into a Maven build. The term "bundle" aptly describes the OSGi plugins as well, but I'm used to "plugin", so that's what I use here. It's probably the case that an OSGi plugin is a specialized type of bundle, but whatever.

Preparing the Plugins

The route I'm taking, at least currently, is to tell the root Maven project that all of its Jar-producing children should also have a META-INF/MANIFEST.MF file packaged along to allow for OSGi use, and moreover to automatically generate that manifest using the maven-bundle-plugin. The applicable code in the parent pom.xml looks like this:

<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <version>2.1.0</version>
                <configuration>
                    <manifestLocation>META-INF</manifestLocation>
                    <instructions>
                        <Bundle-RequiredExecutionEnvironment>JavaSE-1.6</Bundle-RequiredExecutionEnvironment>
                        <Import-Package></Import-Package>
                    </instructions>
                </configuration>

                <executions>
                    <execution>
                        <id>bundle-manifest</id>
                        <phase>process-classes</phase>
                        <goals>
                            <goal>manifest</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.3.1</version>
                <configuration>
                    <archive>
                        <manifestFile>META-INF/MANIFEST.MF</manifestFile>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
</build>

In order to actually generate the manifest files, I included a block like this in each child project that produces a Jar:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.felix</groupId>
            <artifactId>maven-bundle-plugin</artifactId>

            <configuration>
                <instructions>
                    <Bundle-SymbolicName>com.somecompany.someplugin</Bundle-SymbolicName>
                </instructions>
            </configuration>
        </plugin>
    </plugins>
</build>

The Bundle-SymbolicName bit is there to translate the project's Maven artifact ID (which would be like "foo-someplugin") into a nicer OSGi version. There are other ways to do this, including just letting it use the default, but it made sense to write them manually here.

Once you do that and then run a Maven package, each Jar project in the tree should get an auto-generated MANIFEST.MF file that exports all of the project's Java classes and specifies a Java 6 runtime and no imported packages. There are many tweaks you can make here - any of the normal MANIFEST entries can be specified in the <instructions/> block, so you could add imported packages, required bundles, or other metadata at will.
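For example, a more-elaborate (and entirely hypothetical) instruction block might look like:

<configuration>
	<instructions>
		<Bundle-SymbolicName>com.somecompany.someplugin</Bundle-SymbolicName>
		<Export-Package>com.somecompany.someplugin.api.*</Export-Package>
		<Require-Bundle>com.ibm.commons</Require-Bundle>
		<Import-Package>javax.activation;resolution:=optional</Import-Package>
	</instructions>
</configuration>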

If you install these projects into your local repository, then downstream OSGi projects using Tycho can find the dependencies when you include them in the pom.xml by Maven artifact ID and in the downstream MANIFEST.MF by OSGi bundle name. There's one remaining hitch (at least): though Maven will be fine with that resolution, Eclipse doesn't pick up on them. To do that, it seems that the best route is to create a p2 repository housing the plugins, which would also be useful for other needs.
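For the record, the Maven half of that resolution hinges on telling Tycho to consult pom-declared dependencies when computing the target platform - a sketch of the relevant block, assuming the usual tycho-version property is defined:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>target-platform-configuration</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<pomDependencies>consider</pomDependencies>
	</configuration>
</plugin>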

Creating an Update Site

Fortunately, there is actually an excellent example of this on GitHub. By following those directions, you can create a project where you list the plugins you want to include as dependencies in the pom.xml, and it will properly package them into a p2 site containing all the plugins with their OSGi-friendly names and nice site metadata.

As a Domino-specific aside, a "p2 Update Site" is somewhat distinct from the Update Sites we've gotten used to dealing with - namely, it's a newer format that is presumably unsupported by Notes and Domino's outdated infrastructure. You can tell the difference because the "old" ones contain a site.xml file while the p2 format contains content.jar and artifacts.jar (those may be .xml instead). It's just another one of those things for us to deal with.

In any event, the instructions on GitHub do what they say on the tin, but I wanted a bit more automation: I wanted to automatically include all of the plugins built in the project without specifying them each as a dependency. To do this, I replaced Step 2 in the example (the use of maven-dependency-plugin) with the maven-assembly-plugin, which is a generic tool for culling together the results of a build in some useful format. The replaced plugin block looks like this:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-assembly-plugin</artifactId>
	<version>2.5.3</version>
	<configuration>
		<descriptors>
			<descriptor>src/assembly/plugins.xml</descriptor>
		</descriptors>
		<outputDirectory>${project.basedir}/target/source</outputDirectory>
		<finalName>plugins</finalName>
		<appendAssemblyId>false</appendAssemblyId>
	</configuration>
	<executions>
		<execution>
			<id>make-assembly</id>
			<!-- Bump this up to earlier than package so that the plugins below see the results -->
			<phase>process-resources</phase>
			<goals>
				<goal>single</goal>
			</goals>
		</execution>
	</executions>
</plugin>

This block tells the assembly plugin to look for an assembly descriptor file (which is yet another specialized XML file format, naturally) named "plugins.xml" and execute its instructions during the phase where it's processing resources, coming in just before the later plugins.

In turn, the assembly descriptor looks like this:

<assembly
	xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2 http://maven.apache.org/xsd/assembly-1.1.2.xsd">
	<id>plugins</id>
	<formats>
		<format>dir</format>
	</formats>
	<includeBaseDirectory>false</includeBaseDirectory>
	<moduleSets>
		<moduleSet>
			<useAllReactorProjects>true</useAllReactorProjects>
			<includes>
				<include>*:*:jar:*</include>
			</includes>
			<binaries>
				<outputDirectory>/</outputDirectory>
				<unpack>false</unpack>
				<includeDependencies>true</includeDependencies>
			</binaries>
		</moduleSet>
	</moduleSets>
</assembly>

What this says is to include all of the modules (Maven artifacts) being processed in the current build that are packaged as Jars and copy them into the designated directory, where they will be picked up by the Tycho plugins down the line.

The result of this Rube Goldberg machine is that all of the applicable plugins in the current build (and their dependencies) are automatically gathered for inclusion in the update site, without having to maintain a specific list.

Missing Pieces

This process accomplishes a great deal automatically, alleviating the need to maintain MANIFEST.MF files or a repository configuration, but it doesn't cover quite everything that might be needed. For one, there's no feature project; the update site is just a bunch of plugins without features to go along with them. Honestly, I don't know if those are even required for most uses - Eclipse seems capable of consuming the site as-is. Secondly, though, the result isn't suitable for use in an old-style environment, so this isn't something you would go plugging into Designer. For that, you'd want a secondary project that wraps the plugins into a feature in an old-style update site, which would have to be done in a second Maven build. Regardless, this seems to get you most of the way, and saves a ton of hassle.

Tycho and Tribulations

Sat Mar 14 15:02:48 EDT 2015

Tags: maven

For the last few weeks, a large part of my work has involved dealing with Maven, and much of that also involves Eclipse. It's been quite the learning experience - very frustrating much of the time, but valuable overall. In particular, any time OSGi comes into play, things get very complicated and arcane, in non-obvious ways. Fair warning: this blog post will likely start out as an even-keeled description of the task at hand and descend into just ranting about Maven.

The Actors

To start out, it's important to know that this sort of development involves three warring factions, each overlapping and having distinct views of the world. In theory, there are plugins that sort out all the differences, but this doesn't play out very smoothly in reality. Our players are:

  • Maven. This is the source of our trouble, but overall worth it. Maven is a build system for Java (and other) projects that brings with it great powers to do with dependency management, project organization, packaging, distribution, and any number of other things. I've been increasingly dealing with it, initially as an observer while the ODA team descended into madness to convert that project, and then Maven-izing my own framework. Its view of the world is of "artifacts" - conceptual units like junit, poi, or other dependencies, plus your own project components - organized in a tree of modules and available via repositories. Its build process is a multi-stage lifecycle with hooks for plugins at each step of the way.
  • Eclipse. Eclipse-the-IDE has its own view of how a Java project should be organized and built, and it doesn't involve Maven. There is a plugin for Eclipse and Maven, m2eclipse, that is meant to patch over these differences, but it can only go so far - while it helps Eclipse know a bit about Maven dependencies and its plugins, it's very dodgy and often involves trying to sync the Eclipse build configuration to be an imperfect representation of the Maven config.
  • OSGi. OSGi is a packaging, dependency, and runtime model, and is spoken natively by Eclipse (and Domino). However, it butts heads with Maven: they both cover the "packaging and dependencies" ground, and this creates a mess. Again, there are plugins to help bridge the gap, but these bring another layer of complexity and brittleness to the process.

Maven, Eclipse, and OSGi go together like oil, water, and a substance that dissolves in water but reacts explosively with oil.

OSGi's Plugins

The interaction with OSGi deserves a bit of further explanation. Unlike the Maven/Eclipse bridge, where there's basically one tool to work with, imperfect as it is, dealing with Maven+OSGi has two distinct plugins, which may or may not be required for your needs:

  • Tycho. This is the big one, intended to give Maven a thorough understanding of OSGi's view of the world, parsing the MANIFEST.MF files and hunting down dependencies using both an OSGi environment and Maven's normal scheme (if you tell it to correctly). If you're writing a full-on Eclipse/OSGi plugin/feature set (like ODA or my Framework), Tycho will be involved.
  • The Maven Bundle Plugin. This confusingly-named plugin is specifically referring to "bundle" in the OSGi sense, which is the elemental form of an OSGi plugin (the terminology begins to really overlap here). Its role is to take a non-OSGi project - say, a "normal" Maven project you or a third party wrote - and generate a MANIFEST.MF for you, allowing you to create an OSGi-friendly project usable as a dependency elsewhere.

These two projects, though often both required, are not related, and are crucially incompatible in one major way: Tycho's dependency resolution runs before the Bundle Plugin can do its job. So you can't, for example, have a Bundle project that generates an OSGi-friendly plugin and then depend on it in a "real" OSGi context inside the same Maven build. As far as I can tell, the "fix" is to separate these out into separate Maven projects. So, if you want to consume Maven projects and convert them into OSGi plugins without also manually managing plugin stuff and dependency copying, you have to make it a two-step process. The reason for this is that computers suck.

Eclipse and Maven

Throughout this sort of development, there's a constant gremlin on your back: the distinct worlds of Eclipse and Maven. Many changes to the pom.xml (Maven's project descriptor file) will prompt Eclipse to tell you that its project config is out of date and that you must click a menu item to sync it, which it refuses to do itself for some reason. Additionally, you will frequently run into a case where you'll paste in a block of Maven XML from somewhere and it will be legal for Maven, but Eclipse will complain about not having lifecycle support for it. If you're lucky, you can click the "quick fix" to download an adapter automatically, or failing that tell it to ignore that part. Other times, it'll give you some cryptic error about packaging or the like and offer no solution. The "fix" at that point is often to stop trying to do what you want to do.

Because of these and other conditions, it's fairly easy to get into a situation where the project will compile in Eclipse but not in Maven or vice-versa. Sometimes, this isn't too bad to fix, such as when you just need to add a dependency to a given project in Maven. Other times, things will get more arcane, requiring seeking out more blocks of Maven XML (this is a common task) to either let Maven or Eclipse know about the other, or to at least tell Eclipse to not bother trying to process part of the Maven project. This process is most similar to an adventure game, trying different combinations of plugins and pasted XML until it works or you quit and try a different career path.

Documentation

Capping these problems off is the peculiar nature of documentation for all this. From my experience, it comes in a couple forms:

  • Official documentation that is either a very basic getting-started tutorial or assumes you have a complete understanding of Maven's conceits and idioms to read what they're talking about.
  • Individual plugin pages with varying levels of thoroughness, and usually no mention of interaction with other components.
  • Blog posts and Stack Overflow questions from 3-5 years ago, half of which amount to "X doesn't work", and most of the rest of which contain blocks of XML to try pasting into your pom without much explanation.

After working with Maven long enough, you start developing a vague, disjointed understanding of how it works - how the "plugins" inside "build" differ from those inside "pluginManagement", for example - but it's slow going. It seems to be the sort of thing where you just have to pay your dues.

Conclusion For Now

Things are very gradually coming together, and the benefits of Maven are paying off as I start avoiding the pitfalls and implementing things like Jenkins. Once I properly sort out the projects I'm working on, I'll post more about what I learn to be the right ways to accomplish these goals, but for now my assessment remains "Maven is a huge PITA, but overall probably worth it".

Figuring Out Maven: Group/Artifact Names and Repositories

Mon Dec 08 16:34:11 EST 2014

Tags: maven

As I fiddle with Maven, I figure it may be useful to share my growing understanding of it - or at least preliminary assumptions. Any of these posts should not be taken as a true guide to learning Maven, since I'm just muddling through myself, but I suspect that my path will be similar to a lot of other Domino developers'.

The first thing I feel I grokked about Maven is its concept of repositories, mostly because it's the easiest concept I've run across. Repositories in Maven seem to match up nicely to their analogues in other environments, such as Eclipse Update Sites or Debian/Ubuntu apt repositories. There's the default "Maven Central" repository, which is similar to the main apt repositories: it contains a very large collection of software projects, available by group+artifact name. This is what you see on the pages for popular software projects: they mention the group/artifact pair and that's enough to use it.

For projects that aren't in Central, it's similar to adding a repo to Debian or an Update Site to Eclipse. You add some repository information to your project or your user environment's settings.xml and then refer to the artifact similar to how you would with Central ones; Hibernate OGM is one such project.
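In the pom, that repository information is a small block - a sketch with placeholder values:

<repositories>
	<repository>
		<id>some-repo</id>
		<url>https://repository.example.com/maven/</url>
	</repository>
</repositories>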

In addition to remote repositories, there is also your local repository, stored in ~/.m2/repository. This contains any Maven projects where you built and ran install locally, and are then available to other Maven projects. This is how I handled my dependencies on the ExtLib and ODA: I ran Maven installs for each to add them to my local repository.

You can also download and store repositories of pre-built plugins locally, and the IBM Domino Update Site for Build Management is an example of this. The way to use this is to extract the ZIP file and then point to the updateSite directory in the same way that you would a remote repository, albeit with a file:// URL (in this case, ideally stored in a Maven environment variable).
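In a Tycho-based build, that might look like the following sketch, assuming a notes-platform variable defined in settings.xml:

<repositories>
	<repository>
		<id>notes</id>
		<layout>p2</layout>
		<url>${notes-platform}</url>
	</repository>
</repositories>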

The final aspect of this is the way bits of software are designated within a repository: by "group ID" and "artifact ID". The group ID seems like it should be globally unique, and tends to follow the reverse-DNS convention of Java package names. So a group ID might be something like "com.google.guava" or "com.ibm.xsp.extlib". These don't have a specific analogue with OSGi development, but are effectively similar to the naming scheme for update site projects (even though Maven groups may contain OSGi update sites). Within a repository, individual projects, called "artifacts", are identified in a way that just needs to be unique in the repository, and it looks like conventions differ here. Sometimes, the artifacts have simple base names, like "guava" or "el", while other times they have OSGi-style full reverse-DNS names. I gather that the convention falls along OSGi lines: for generic projects, short names rule the day, while for OSGi-plugin projects, the name matches the plugin ID.

So... that's the easiest part! I'm slowly getting more of a grasp of other aspects of Maven, but at least repositories seem to make sense so far.

How I Maven-ized My Framework

Mon Dec 08 10:31:14 EST 2014

Tags: maven miasma

This past weekend, I decided to take a crack at Maven-izing the frostillic.us Framework (I should really update the README on there). If you're not familiar with it, Maven is a build system for Java projects, basically an alternative to the standard Eclipse way of doing things that we've all gotten pretty used to. Though I'm not in a position to be a strong advocate of it, I know that it has advantages in dependency-resolution and popularity (most Java projects seem to include a "you can get this from Maven Central" bit in their installation instructions), helps with the sort of continuous-integration stuff I think we're looking to do at OpenNTF, and has something of a "wave of the future" vibe to it, at least for our community: IBM's open-source releases have all been Maven-ized.

A month or so ago, Nathan went through something of a trial by fire Maven-izing the OpenNTF Domino API (present in the dev branches). Converting an existing project can be something of a bear, scaling exponentially with the complexity of the original project. Fortunately, thanks to his (and others', like Roland's) work, the ODA is nicely converted and was useful as a template for me.

In my case, the Framework is a much-simpler project: a single plugin, a feature, and an update site. It was almost a textbook example of how to Maven-ize an OSGi plugin, except for three dependencies: on the ODA, on the Extension Library, and, as with both of those, on the underlying Domino/XPages plugins. Fortunately, my laziness on the matter paid off, since not only is the ODA Maven-ized, but IBM has put their Maven-ized ExtLib right on GitHub and, better still, released a packaged Maven repository of the required XSP framework components. So everything was in place to make my journey smooth. It was, however, not smooth, and I have a set of hastily-scrawled notes that I will translate into a recounting of the hurdles I faced.

Preparing for the Journey

First off, if you're going to Maven-ize a project, you'll need a few things. If it's an XPages project, you'll likely need the above-linked IBM Domino Update Site. This should go, basically, "somewhere on your drive". IBM seems to have adopted the convention internally of putting it in C:\updateSite. However, since I use a good computer, I have no C drive and this didn't apply to me - instead, I adopted a strategy seen in projects like this one, where the path is defined in a variable. This is a good introduction to a core concept with Maven: it's basically a parallel universe to Eclipse. This nature takes many forms, ranging from its off-and-on interaction with the workspace to its naming scheme for everything; Eclipse's built-in Maven tools are a particularly-thin wrapper around the command-line environment. But for now the thing to know is that this environment variable is not an Eclipse variable; it comes from Maven's settings.xml, which is housed at ~/.m2/settings.xml. It doesn't exist by default, so I made a new one:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          http://maven.apache.org/xsd/settings-1.0.0.xsd">

    <profiles>
        <profile>
            <id>main</id>
            <properties>
                <notes-platform>file:///Users/jesse/Documents/Java/IBM/UpdateSite</notes-platform>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>main</activeProfile>
    </activeProfiles>
</settings>

I'm not sure that that's the best way to do it, but it works. The gist of it is that you can fill in the properties block with arbitrarily-named environment variables.

Secondly, you'll need a decent tutorial. I found this one and its followups to do well. Not everything fit (I didn't treat the update site the same way), but it was a good starting point. In particular, you'll need Tycho, which is explained there. Tycho is a plugin to Maven that gives it some knowledge of Eclipse/OSGi plugin development.

Third, you'll need some examples. Now that my Framework is converted, you can use that, and the projects linked above are even better (albeit more complex). There were plenty of times where my troubleshooting just involved looking at my stuff and figuring out where it was different from the others.

Finally, if your experience ends up anything like mine, you'll want something like this.

Prepping Dependencies

Since my project depended on the ExtLib and ODA, I had to get those in the local repository first. As I found, it's not enough to merely have the projects built in your workspace, as it is when doing non-Maven OSGi development - they have to be "installed" in your local repository (~/.m2/repository). Though the Extension Library is larger, it's slightly easier to do. I cloned the ExtLib repository (technically, I cloned my fork of it) and imported the projects into the Eclipse workspace using Import → Maven → Existing Maven Projects. By pointing that to the repository root, I got a nice Maven tree of the projects and imported them all into a new working set. Maven, like many things, likes to use a tree structure for its projects; this allows it to know about module dependencies and provides inheritance of configuration (there's a LOT of configuration, so this helps). Unfortunately, Eclipse doesn't represent this hierarchy in the Project Explorer; though you can see the other projects inside the container projects, they also appear on their own, so you get this weird sort of doubled-up effect and you just have to know what the top-level project you want is. In this case, it's named well: com.ibm.xsp.extlib.parent.

So once you've found that in the sea of other projects (incidentally, this is why I like to click on the little triangle on top of the Project Explorer view and set Top Level Elements to Working Sets), there's one change to make, unless you happened to put the Update Site from earlier at C:\updateSite. If you didn't, open up the pom.xml file (that's the main Maven config file for each project) and change the url on line 28 to <url>${notes-platform}</url>. After that, you can right-click the project and go to Run As → Maven Install. If it prompts you with some stuff, do what the tutorial above does ("install verify" or something). This is an aspect of the thin wrapper: though you're really building, the Maven tasks take the form of Run Configurations. You just have to get used to it.

Once you do that, maybe it'll work! You'll get a console window that pops up and runs through a slew of fetching and building tasks. If all goes well, there'll be a cheery "BUILD SUCCESS" near the bottom. If not, it'll be time for troubleshooting. The first step for any Maven troubleshooting is to right-click the project and go to Maven → Update Project, check all applicable projects, and let it do its thing. You'll be doing that a lot - it's your main go-to "this is weird" troubleshooting step, like Project → Clean for a misbehaving XPage app. If the build still fails, it's likely a problem with your Update Site location. Um, good luck.

Next up comes the ODA, if you're using that. As before, it's best to clone the repository from GitHub (using one of the dev branches, like Nathan's or mine) and import the Maven projects. There's good news and bad news compared to the ExtLib. The good news is that it already uses ${notes-platform} for the repository location, so you're set there. The bad news is that trying to install from the main domino parent project doesn't work - it fails on the update site for some reason. So instead, I had to install each part in turn. In particular, you'll need "externals" (covers a lot of dependencies), "org.openntf.junit4xpages", "org.openntf.formula", and "org.openntf.domino".

Converting the Projects

Okay! So, now we can actually start! For the plugin project, the first page of the tutorial works word-for-word. One thing to note is that the "eclipse-plugin" option isn't actually in the Packaging drop-down; you just have to type it in. Again: thin wrapper. It may not work immediately after following the directions, but the divergences are generally due to the non-standard Domino-related dependencies. In particular, I ran into trouble with forbidden-access rules in Notes.jar - Maven, being a separate world, ignores your Eclipse preferences on the matter. To get around that, I added the parts in the plugin block of this pom.xml - among other things, they tell the compiler to ignore such problems. I still ran into trouble with lotus.domino.local.NotesBase specifically after the other classes started working, and I "solved" that by deleting the code (it was related to recycle checking, which I no longer need).
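The gist of that compiler configuration is a block along these lines - a sketch, not the exact contents of the linked pom:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-compiler-plugin</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<!-- Downgrade forbidden-access problems from errors so Notes.jar references compile -->
		<compilerArgument>-err:-forbidden</compilerArgument>
	</configuration>
</plugin>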

It may also be useful to change build.properties so that the output.. = bin/ line reads output.. = target/classes. I don't know if this is actually used, but it was a troubleshooting step I took elsewhere and it makes conceptual sense: Maven puts its output classes in target/classes, not bin.

During this process, I quickly realized the value of having a parent project. I had a hitch in mine in that I wanted to call the parent frostillicus.framework, which meant renaming the plugin to frostillicus.framework.plugin and dealing with the associated updating of Eclipse and git, but that was an unforced error. The normal layout of parent projects seems to be that they're parents both conceptually by pom.xml and also physically by folder structure. I haven't done the latter yet, and the process works just as well if you don't. Still, I should move it eventually. So, following the third part of the tutorial, I created a near-empty project (no Java code) just to house the pom.xml with common settings and told it to adopt the plugin as a child. Converting the feature project was the easiest step and went exactly as described in the tutorial.

Where I diverged from both the tutorial and ODA is in the Update Site. The tutorial suggests renaming site.xml to category.xml and using the Maven type eclipse-repository, but none of the examples I used did that. Instead, I followed those projects and left site.xml as-is (other than making sure that the versions in the XML source use ".qualifier" instead of any timestamp from building) and used the Maven type eclipse-update-site in the pom.xml.

I then spent about two hours pulling my hair out over bizarre problems I had wherein the update site would build but not actually include the compiled classes in the plugin jar if I clicked on "Build" in the site.xml editor and would fail with bizarre error messages if I did Run As → Maven Install. I'll spare you the tribulations and cut to the chase: my problem was that I had the modules in the parent project's pom.xml out of order, with the update site coming before the feature project. When I fixed that, I was able to start building the site the "Maven way". Which is to say: not using the site.xml's Build button (which still had the same problem for me), but using Run As → Maven Install. This ends up putting the built update site inside the target/site directory rather than directly in plugins and features folders. This is a case of "okay, sure" again.
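The corrected module list in the parent pom looks something like this - the plugin name is real, while the feature and update-site module names here are assumptions:

<modules>
	<module>frostillicus.framework.plugin</module>
	<module>frostillicus.framework.feature</module>
	<module>frostillicus.framework.update</module>
</modules>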

Conclusion

So, after a tremendous amount of suffering and bafflement, I have a converted project! So what does it buy me? Not much, currently, but it feels good, and I had to learn this stuff eventually one way or another. Over the process, some aspects of Maven started to crystallize in my mind - the repositories, the dependencies, the module trees - and that helps me understand why other Maven-ized projects look the way they do. Other aspects are still beyond my ken (like most of the terminology), but it's a step in the process. This should also mean I'm closer to ready for future build processes and am more in line with the larger Java world.

If you have a similar project, I'd say it's not required that you make the switch, but if you're planning on working on larger projects that use Maven, it'd be a good idea. Maven takes a lot of getting used to, since everything feels like it's a from-scratch rethinking of the way to structure Java projects with no regard to the structure or terminology of "normal" Eclipse/OSGi development, and something like this conversion is as good a start down the path as any.