Showing posts for tag "maven"

Converting Tycho Projects to maven-bundle-plugin, Initial Phase

Aug 22, 2019 3:27 PM

Tags: maven osgi tycho
  1. Developing an Open/WebSphere Liberty UserRegistry with Tycho
  2. Developing Open Liberty Features, Part 2
  3. Converting Tycho Projects to maven-bundle-plugin, Initial Phase

To date, Tycho has been my tool of choice for developing Domino-targeted Maven projects. However, it's not without protest. Unlike most Maven plugins, Tycho inserts itself at the very start of the build process and takes over dependency management. Purely in Maven, you can use normal Maven dependencies, but only so long as you're pointing to a dependency that already has OSGi metadata (which, fortunately, most do), and only then to satisfy a Require-Bundle or Import-Package that also has to be present. This gets more annoying, though, when dealing with Eclipse, which removes the notion of Maven dependencies entirely when using Tycho and forces you to jump through hoops to do what you want. And, as a final kicker, Tycho's p2 repository support is completely broken in the latest release version of Maven.

So why do I keep using it, anyway?

Well, it brings a couple major benefits that are of particular importance for Domino:

  • It can use p2 repositories for dependencies. This matters because the XPages runtime plugins are not (yet?) available as normal Maven dependencies. Years back, IBM [provided a "Build Management" update site](https://openntf.org/main.nsf/project.xsp?r=project/IBM Domino Update Site for Build Management), which is helpful, but it's still an Eclipse-style p2 repository, not a Maven repository. Tycho can use p2 repositories natively, though, just as Eclipse does.
  • It constructs a true Equinox environment. This matters both when compiling your project and when running automated tests. The environment created by Tycho is the same Equinox OSGi runtime that Domino uses, and so it supports the same styles of bundle resolution and extensions that you get in Domino. Without this happening during the build, you lose some assurance that things at runtime will match your expectations.
  • It spawns tests in a separate process. This is a little esoteric, but it matters because launching a Notes environment on a non-Windows platform more-or-less requires setting up environment variables for the Notes/Domino directory and others, and these variables are not successfully set when using the normal maven-surefire-plugin runtime. This means that reliably running tests requires setting up the environment ahead of time, which is fiddlier and less automated.
  • It can generate new- and old-style Eclipse Update Sites. To be used in Designer and NSF-based Update Sites, an OSGi project has to be packaged up into a p2 repository along with an old-style "site.xml" file. Tycho can generate these (and can be assisted with "site.xml" when using the newer style), and it can also auto-generate source bundles, features, and repositories.

Alternatives and Workarounds

Some of the "hard" requirements for Tycho can be at least worked around.

Years ago, I wrote a Ruby script that would take a p2 site like IBM's or one generated from a newer version and "Mavenize" it by creating artifact information based on each bundle's OSGi manifest. I've since converted it to Java and included it in Darwino's Studio plugins, and yesterday I added it to the generate-domino-update-site Maven plugin. Using that lets you declare dependencies on any of the bundles or embedded JARs in a normal Maven project:

        <dependency>
            <groupId>com.ibm.xsp</groupId>
            <artifactId>com.ibm.notes.java.api.win32.linux</artifactId>
            <version>[10.0.0,)</version>
            <classifier>Notes</classifier>
            <scope>provided</scope>
        </dependency>

This isn't perfect, since it's neither standardized nor generally available (go vote for the aha idea!), but at least it's reproducible and can be something of a de-facto standard if used enough.

Then there's the matter of generating appropriate OSGi metadata. Outside of the Tycho-using world, the main way to generate this is via a tool called bnd and its related tools. bnd is kind of a parallel world, and there's even an alternate tooling set for Eclipse instead of the default PDE. There are a couple ways to use bnd in a Maven build, but the one I'm familiar with to date is the maven-bundle-plugin. I've used this with Darwino to incidentally create OSGi metadata for the otherwise non-OSGi core modules, and I suspect that it gets used heavily this way. It's more powerful than that, though, and is a nice wrapper for bnd under the hood, supporting Declarative Services annotations and all the other OSGi goodies. In my case, I used it to generate the MANIFEST.MF with most of the defaults, but then added in some specifics to play nice in my Domino Equinox target.
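
As an illustration, that kind of configuration looks something like the following - the symbolic name and instruction values here are hypothetical stand-ins rather than my actual setup:

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<version>4.2.1</version>
	<extensions>true</extensions>
	<configuration>
		<instructions>
			<!-- Anything not specified here falls through to bnd's defaults -->
			<Bundle-SymbolicName>com.example.bundle;singleton:=true</Bundle-SymbolicName>
			<Export-Package>com.example.bundle.*</Export-Package>
			<!-- Target-specific tweaks go here, e.g. loosening imports for the Domino Equinox target -->
			<Import-Package>*;resolution:=optional</Import-Package>
		</instructions>
	</configuration>
</plugin>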

I suspect that these bnd-based tools can also be a route to solving my automated-testing woes. For the Open Liberty Runtime project, I don't have to worry about that, since it's so dependent on running in actual Domino that the return-on-investment for setting up JUnit tests wouldn't be worth it. However, I recall seeing some Maven testing plugin that let you spawn an OSGi environment of your choice, and I think that something like that may be able to replace Tycho for me there.

Since p2 repositories/update sites are entirely an Eclipse-ism, most OSGi tooling doesn't care about them. That's where p2-maven-plugin comes in. Not only will it allow you to create p2 repositories, but it lets you define features in the configuration, meaning they don't have to be separate modules like in Tycho. And not only that, but it will also auto-OSGi-ify any Maven dependencies you bring in if they don't already have OSGi bundle information. It also lets you override existing bundle data on the fly if needed, such as if the dependencies and imports conflict with something on Domino.
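
A sketch of what that configuration can look like - the IDs are hypothetical, and I'm going from memory on the exact shape of the feature-definition elements, so treat this as illustrative rather than copy-paste-ready:

<plugin>
	<groupId>org.reficio</groupId>
	<artifactId>p2-maven-plugin</artifactId>
	<version>1.6.0</version>
	<executions>
		<execution>
			<id>generate-p2-site</id>
			<phase>package</phase>
			<goals>
				<goal>site</goal>
			</goals>
			<configuration>
				<featureDefinitions>
					<feature>
						<id>com.example.feature</id>
						<version>${project.version}</version>
						<label>Example Feature</label>
						<providerName>Example</providerName>
						<artifacts>
							<!-- Maven dependencies get OSGi-ified automatically if they lack metadata -->
							<artifact>
								<id>com.example:some-library:1.0.0</id>
							</artifact>
						</artifacts>
					</feature>
				</featureDefinitions>
			</configuration>
		</execution>
	</executions>
</plugin>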

Eclipse Friendliness

Since I still use Eclipse to develop these projects, I want to be able to make use of the [XPages SDK](https://openntf.org/main.nsf/project.xsp?r=project/XPages SDK for Eclipse RCP)'s ability to run Domino's HTTP stack pointed at my active workspace. For that to work, I need to be able to get Eclipse to recognize my projects as functional PDE-compatible bundles even if I'm not using PDE for them. Fortunately, that process isn't difficult: once I set the location for MANIFEST.MF to be in "META-INF" in the project root, maven-bundle-plugin started generating the files there instead of within "target", and Eclipse started working with the projects as OSGi bundles. The only thing left to do then was to gitignore the generated files, since they don't need to be checked into source control anymore.
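
That amounts to a small tweak to the plugin configuration - a minimal sketch, assuming an otherwise-standard maven-bundle-plugin setup:

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<configuration>
		<!-- Generate MANIFEST.MF in the project root for Eclipse/PDE's benefit -->
		<manifestLocation>META-INF</manifestLocation>
	</configuration>
	<executions>
		<execution>
			<id>generate-manifest</id>
			<phase>process-classes</phase>
			<goals>
				<goal>manifest</goal>
			</goals>
		</execution>
	</executions>
</plugin>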

Future Improvements

The big thing that is still an open problem is dealing with testing. I have some ideas for taking a swing at it, but for now it's the main thing preventing me from doing this for all of my Tycho projects.

Beyond that, I want to look a bit into bnd-maven-plugin. This diverges from maven-bundle-plugin in that it's geared towards using bnd configuration files directly. During the build process, I think the results would be the same, since maven-bundle-plugin can already pass through whatever configuration I want, but it would be a better match for the Eclipse bndtools tooling. Additionally, externalizing the bnd config files would mean they'd be the same if I decided to switch to Gradle, as Open Liberty uses.
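
For a sense of the shape of that, the instructions would move out of the pom and into a bnd.bnd file alongside it - a hypothetical sketch mirroring the earlier example:

Bundle-SymbolicName: com.example.bundle;singleton:=true
Export-Package: com.example.bundle.*
Import-Package: *;resolution:=optional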

Finally, and specific to this Open Liberty project, I may want to consider using bnd to generate Liberty Feature manifests, as Liberty itself does. These features are implemented as OSGi "subsystems" packaged as .esa files. Currently, I'm using esa-maven-plugin to generate their specialized manifests, but I've already hit some limitations in the area of cross-feature dependencies. Apparently, bnd takes some wrangling to suit this, but it's worth it. I'll consider that one a "stretch goal", though.

For now, I'm pretty pleased with the new setup. The projects still work on Domino, I can run them on there from the workspace, I was able to eliminate the p2 feature projects outright, and now I don't have to worry about packaging up a dependencies site just to have something to point at in Eclipse. Heck, I can even use Visual Studio Code now! It's pretty nice.

First Steps to Code Coverage Analysis in Domino Plugins

Nov 9, 2017 8:53 AM

Tags: maven domino java

I'm always interested in getting the computer to tell me how to tell it what to do more successfully, and, to further that pursuit, I've started taking an interest in code coverage.

If you're not familiar with the term, "code coverage" refers to reporting on which lines of code were actually executed during runtime, most commonly in association with unit tests. Eclipse (and presumably other IDEs) has support for this, and I've decided to give it a shot.

Since I'm starting this out in the context of Domino plugins, there are more wrinkles than in most tutorials. Namely, the test suites I've written run exclusively through Maven instead of the Eclipse UI due to all the Notes environment setup, so I can't just use the normal UI tools to gather the data. Fortunately, Eclipse's EclEmma will work just fine with the output from a Maven project, as long as you configure it properly. I looked around for a while to find the right combination of tools to use, but it ended up being fairly simple to configure basic output that can be consumed in Eclipse's Coverage view.

There are two main additions. First, add the jacoco-maven-plugin to your root project's project.build.plugins block:

<plugin>
	<groupId>org.jacoco</groupId>
	<artifactId>jacoco-maven-plugin</artifactId>
	<version>0.7.8</version>
	<executions>
		<execution>
			<goals>
				<goal>prepare-agent</goal>
			</goals>
		</execution>
	</executions>
</plugin>

In normal cases, that would suffice. However, since the test configuration I have for Notes overrides the argLine property of the Tycho test runner, there's another step - add the tycho.testArgLine property manually into those blocks, such as in the Windows profile:

<profile>
	<activation>
		<os>
			<family>Windows</family>
		</os>
		<property>
			<name>notes-program</name>
		</property>
	</activation>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				<version>${tycho-version}</version>
 
				<configuration>
					<skip>false</skip>
 
					<argLine>${tycho.testArgLine} -Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
					<environmentVariables>
						<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
					</environmentVariables>
				</configuration>
			</plugin>
		</plugins>
	</build>
</profile>

Once that's configured, running the test suite via Maven will create a new file in the target folder of the test plugin: jacoco.exec. This file can then be consumed in Eclipse by opening the "Coverage" view:

Eclipse's Show View window

In that view, right-click and choose "Import Session..." and point to the data file. Click "Next" and check the projects and source folders from your workspace that you're interested in analyzing. When you click "Finish", it'll do two things. First, it'll fill the Coverage view with statistics from your run:

Code Coverage stats

(We have a lot of work to do fleshing out our test suites for this one)

Secondly, it'll start highlighting your code to show you what code is executed, which branches are only partially covered, and which lines are skipped entirely. For example (ignore the sickly color scheme - I need to work on that):

Code Coverage example

This shows how several of the if branches are only tested in one direction, while the "Faces" block is skipped entirely. That also shows some of the trouble with testing XPages-run code: the Tycho environment can't reproduce the XPages environment fully, so some branches aren't testable in that way. I haven't looked into the possibility of gathering similar data from JUnit for XPages, so perhaps that's possible.

For now, though, this will have to do. And, like with these other "code improvement" techniques I've integrated lately, there's a lot of potential tedium - juggling when to write a test to cover some code that will obviously always work just to improve the highlighting vs. just focusing on the low-hanging fruit - but I expect that it will be a nice addition to my workflow over time.

New Small Project: p2sitexml-maven-plugin

Oct 26, 2017 2:17 PM

Tags: maven

It's no secret that I have a love/hate relationship with developing for OSGi platforms with Maven. The giant divide between "all-in" Tycho projects (which limit your options with normal Maven features) and trying to bolt on OSGi support in an otherwise-normal project creates an array of problems big and small.

Some of those hurdles would be difficult to bridge, such as any automated tests that want to test the proper functioning of OSGi services. However, not all projects need that - in the case of Darwino, for example, deployment to Domino is a secondary consideration in the Maven project, and so a Darwino app doesn't use Tycho for its packaging or testing. By jumping through a few hoops, we've gotten those projects to the point where they can emit a p2-formatted update site for use in OSGi, and that can be imported into a Domino NSF-based update site.

There's a minor caveat, though: because those update sites don't know about p2 formatting, you can't use the "Import Update Site" action, instead having to use "Import Features", which leaves the imported features in the "(Not Categorized)" group. This isn't a huge problem, but it's one that's easily fixed, so I wrote a small tool to do just that.

I've created a small open-source project called p2sitexml-maven-plugin, the purpose of which is to generate the site.xml file expected by Notes from a p2 repository generated by other means, such as the p2-maven-plugin. This can be included in a Maven build like so:

...
	<build>
		<plugins>
			...
			<plugin>
				<groupId>org.darwino</groupId>
				<artifactId>p2sitexml-maven-plugin</artifactId>
				<version>1.0.0</version>
				<executions>
					<execution>
						<goals>
							<goal>generate-site-xml</goal>
						</goals>
						<configuration>
							<category>Some Category</category>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
...

Right now, the plugin isn't in Maven Central, but is in OpenNTF's Maven server. You can add that to an active profile in your settings.xml file like so:

...
	<pluginRepositories>
		<pluginRepository>
			<id>artifactory.openntf.org</id>
			<name>artifactory.openntf.org</name>
			<url>https://artifactory.openntf.org/openntf</url>
		</pluginRepository>
	</pluginRepositories>
...

It isn't a world-changing thing, but this should at least make the task of targeting Domino with non-Tycho Maven projects a little easier.

Including a Headless DDE Build in a Maven Tree

Mar 14, 2017 12:45 PM

Most of my Domino projects nowadays have two components: a suite of OSGi plugins/features and at least one NSF. Historically, I've kept the NSF part separate from the OSGi plugin projects - I'll keep the ODP in the repo, but then usually also keep a recent "build" made by copying the database from my dev server, and then include that built version in the result using the Maven Assembly plugin. This works, but it's not quite ideal: part of the benefit of having a Maven project being automatically built is that I can have a consistent, neutral environment doing the compilation, without reliance on my local Designer. Fortunately, Designer has a "headless" mode to build NSFs in a scripted way, and Christian Güdemann has done the legwork of building that into a Maven plugin.

It should come as no surprise, however, that this is a fiddly process, and I ran into a couple subtle problems when configuring my build.

Setting Up Designer

The first step is to tell Designer that you want to allow this use, which is done by setting DESIGNER_AUTO_ENABLED=true in your notes.ini. The second step is to configure Notes to use an ID file with no password: because Designer is going to be launched and quit automatically several times, you can't just leave it running and have it use an open session. This is a perfect opportunity to spin up a "template" ID file, distinct from your developer ID, if you haven't done so already. Also, uh... make sure that this user has at least Designer rights to the NSF it's constructing. I ran into a bit of logical trouble with that at first.

The last step was something I didn't realize until late: keep your Designer installation clean of the plugins you're going to be auto-installing. Ideally, Designer will be essentially a fresh install, with no plugins added, and then the Maven definition will list and install all dependencies. If it's not clean, you may run into trouble where Designer emits errors about the plugin conflicting with the installed version.

Setting Up The Maven Environment

Before getting to the actual Maven project files, there's some machine-specific information to set, which is best done with properties in your ~/.m2/settings.xml, much like the notes-platform and notes-program properties. In keeping with that convention, I named them as such:

<properties>
	<notes-platform>file:///C:/Users/jesse/Java/XPages</notes-platform>
	<notes-program>C:\Program Files (x86)\IBM\Notes</notes-program>
	<notes-designer>C:\Program Files (x86)\IBM\Notes\designer.exe</notes-designer>
	<notes-data>C:\Program Files (x86)\IBM\Notes\Data</notes-data>
</properties>

Deploying Features And Initial Root Project Config

The first problem came in setting up the automatic deployment of the feature. The Maven plugin lets you specify features that you want added to and then removed from your Designer installation. In this case, the feature and update site are within the same Maven tree being built, which adds a wrinkle or two.

The first is that, since the specific version number of the feature changes every build due to the qualifier, I had to set up the root project to export the qualifier value that Tycho plans to use. This is done using the tycho-packaging-plugin, which a standard Maven project will have loaded in the root project pom. The main change is to explicitly tell it to run the build-qualifier goal early on, which has the side effect of contributing a couple properties to the rest of the build:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-packaging-plugin</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<strictVersions>false</strictVersions>
	</configuration>

	<!-- Contribute the "buildQualifier" property to the environment -->
	<executions>
		<execution>
			<goals>
				<goal>build-qualifier</goal>
			</goals>
			<phase>validate</phase>
		</execution>
	</executions>
</plugin>

Once that's running, we'll have the ${qualifiedVersion} property to use down the line to house the actual version made during the build.

The second hurdle is figuring out the URL to use to point to the update site. I did this with a property in the root project pom, alongside setting two properties used by the Headless Designer plugin:

<properties>
	<!-- snip -->
	
	<!-- Headless Designer properties -->
	<designer.feature.url>${project.baseUri}../../releng/com.example.some.updatesite/target/site</designer.feature.url>
	<ddehd.designerexec>${notes-designer}</ddehd.designerexec>
	<ddehd.notesdata>${notes-data}</ddehd.notesdata>
</properties>

Much like with OSGi dependency repositories, this path is recomputed per-project. The NSF projects are housed within an nsf folder in my tree, so I include the ../.. to move up to the root project, before descending back down into the update site. Note that this requires that the update site project be built earlier in the build than the NSF.

Finally, bringing these together, I added a block for the common settings for the plugin to the pluginManagement section of the root project pom:

<plugin>
	<groupId>org.openntf.maven</groupId>
	<artifactId>headlessdesigner-maven-plugin</artifactId>
	<version>1.3.0</version>
	<extensions>true</extensions>
	<configuration>
		<features>
			<feature>
				<featureId>com.example.some.feature</featureId>
				<url>${designer.feature.url}</url>
				<version>${qualifiedVersion}</version>
			</feature>
		</features>
	</configuration>
</plugin>

Configuring The NSF Project

With most aspects configured higher up in the project tree, the actual NSF project pom is fairly slim:

<?xml version="1.0"?>
<project
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
	xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<modelVersion>4.0.0</modelVersion>
	<parent>
        <groupId>com.example</groupId>
        <artifactId>some-plugin</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <relativePath>../..</relativePath>
	</parent>
	<artifactId>nsf-somensf</artifactId>
	
	<packaging>domino-nsf</packaging>
	
	<properties>
		<ddehd.odpdirectory>${basedir}\..\..\..\nsf\nsf-somensf</ddehd.odpdirectory>
		<ddehd.targetdbname>somensf.ntf</ddehd.targetdbname>
	</properties>
	
	<build>
		<plugins>
			<plugin>
				<groupId>org.openntf.maven</groupId>
				<artifactId>headlessdesigner-maven-plugin</artifactId>
				<extensions>true</extensions>
			</plugin>
		</plugins>
	</build>
</project>

The properties block sets two more properties automatically read by the Headless Designer Maven plugin. In this case, the path is an artifact of the history of the Git repository: since the ODP was added to the repo outside of the Maven tree, the path backs up and out of the whole thing, and then to another folder with a confusingly-similar name. Here, it avoids a lot of developer hassle, but a properly-configured project would have the ODP in a subfolder within the Maven project (maybe src/main/odp if you want to be all idiomatic about it).

Note that the ddehd.targetdbname property is the NSF name used both for the intermediate build NSF (which is in the Notes data directory) and for the destination file in the project's target directory, so make sure it doesn't conflict with any existing DBs.

Bringing It All Together

Once you have the NSF built, you can include it in an Assembly down the line, leading to a nicely-packaged update site + NSF pair. This section is something of an "IOU" at the moment, though - I have an idea for how I want to do this, but I haven't actually implemented it yet. Once I do, I'll write a followup post.

In the mean time, having a build server build the NSF can be a useful check on making sure everything is working correctly, and is a perfect stepping-stone towards a complete solution. Ideally, in addition to packaging up the result, a full system would also deploy the NSF and plugins to a Domino server and run some UI/service tests against it. However, that's a whole ball of wax that I haven't touched on myself (and is also likely prohibitive for licensing reasons in most cases anyway). For now, it's a step in the right direction.

Quick Post: Maven-izing the XSP Repo

Sep 17, 2016 6:58 AM

Tags: maven xpages

This post follows in my tradition of extremely-narrow-use-case guides, but perhaps this will come in handy in some situations nonetheless.

Specifically, a while back, I wrote a script that "Maven-izes" the XPages artifacts, as provided by IBM's Update Site for Build Management. This may seem a bit counter-intuitive at first, since the entire point of that download is to be able to compile using Maven, but there's a catch to it: the repository is still in Eclipse ("P2") format, which requires that you use Tycho in your project. That's fine enough in most cases - since Domino-targeted projects are generally purely OSGi, it makes sense to have the full OSGi stack that Tycho provides. However, in a case where Domino is only one of many supported platforms, the restrictions that Tycho imposes on your project can be burdensome.

So, for those uses, I wrote a JRuby script that reads through the P2 site as downloaded and extracted from OpenNTF and generates best-it-can Maven artifacts out of each plugin. It tries to maintain the plugin names, some metadata (vendor, version, etc.), and dependency hierarchy, and the results seem pretty reliable, at least for the purpose of getting a non-Tycho bundle with XSP references to compile. This isn't necessarily a route you'd want to take in all cases (since you don't get the benefits of normal OSGi resolution and services in your compilation), but may make sense sometimes. In any event, if it's helpful, here you go:

https://github.com/jesse-gallagher/Miscellany/blob/master/UpdateSiteConversion/convert.rb

The Cleansing Flame of Null Analysis

May 21, 2016 10:18 AM

Tags: java maven

Though most of my work lately has been on sprawling, platform-level stuff or other large existing codebases, part of it has involved a new small app. I decided to take this opportunity to dive more aggressively than previously into automated null analysis and other potential-bugs tools.

What I mean by "null analysis" is letting the IDE or compiler try to help you avoid NullPointerExceptions. Though there are plenty of other programming mistakes you could still make, these are among the most common, and so a little extra work up front to avoid them should pay dividends. Eclipse has some handy options in its Java → Compiler → Errors/Warnings preferences to assist with this:

The first option will pick up on some pretty basic instances, such as:

Object foo = null;
System.out.println(foo.hashCode());

Since this is clearly going to always cause an NPE, Eclipse is able to point this out as an error. The next level gets a little more nebulous: "potential" null pointer access. This crops up when Eclipse can't reliably determine whether a value will be null, either because there is no way to know at compile time (say, database access) or because the compiler's tooling is too limited. Here's a contrived example:

Object foo = Math.random() > 0.5 ? new Object() : null;
System.out.println(foo.hashCode());

This situation is clearly untenable, but there are other situations where you as a programmer can be very confident that the value will not be null (say, if you swap out the > 0.5 for >= 0.0), but the compiler doesn't know that. That's why it often makes sense to leave that as a warning instead of an error.

That's all stuff I've done before, but now I've decided to dive into annotation-based null analysis as well. Unfortunately, in stock Java, this is something of a hot mess (that list even leaves out Eclipse's home-grown version). Since Java didn't grow up with this sort of capability, it's been shoehorned in by various parties over the years. There are other tools to assist you in Java 8, but, unfortunately, I can only target 7 as the highest. For now, I've settled on the "sort-of standard" javax.validation.constraints package. It wasn't really intended for this specific purpose, but it's flexible enough to suit and can be used in Eclipse and FindBugs (though I have my reservations about the choice).

In Eclipse, this type of analysis can be enabled by checking "Enable annotation-based null analysis" below the other options and, unless you're using Eclipse's known annotations, adjusting the "Configure" options next to "Use default annotations for null specifications":

In any event, regardless of the choice of tooling, the "this shouldn't be null" annotations work the same way: you use them to decorate things that you either require not be null when provided to you (method parameters) or you promise to not be null when providing to others (method return values). For example:

public @NotNull Object doSomething(@NotNull Object otherObject) {
	return otherObject.toString();
}

This highlights three things, two good and one bad:

  • Good: The @NotNull in the method parameter means that, as long as the calling code is also checked for null use, the method can be confident that there won't be a NullPointerException when calling a method on otherObject.
  • Good: The @NotNull on the return value means that other code calling this method can be confident that they will not get a null value from it, and so can skip extra null checks.
  • Bad: Eclipse flags otherObject.toString() as a potential problem because it doesn't know for sure that Object#toString doesn't return null, because it has no nullability annotations. As programmers (or as a compiled-code analysis tool), we can be fairly confident that it will be non-null because any object returning null for that is essentially broken on its own.

That last one is a common problem when adopting annotation-based null analysis, at least in Eclipse (I hear it may be better in IntelliJ): its logic doesn't go very deep. If everything is gussied up with these annotations, you're clear - but as soon as you step outside of the project you're working on, you have to add in likely-unnecessary checks. Fortunately, these checks don't realistically hurt (a null check at runtime in a normal app is negligible performance-wise), but they can grate to have to add in.
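
One way to keep those checks terse, at least since Java 7, is java.util.Objects#requireNonNull - though whether a given analyzer recognizes it without annotations varies:

// Throws NullPointerException immediately if toString() somehow returned null,
// and documents the non-null assumption in one line
String name = Objects.requireNonNull(otherObject.toString());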

Glutton for punishment that I am, I decided to go a step further and enable FindBugs processing as an integral step of my build. Though FindBugs can be very picky about the types of things it complains about, it is blessedly more thorough in its analysis than Eclipse, so you generally end up conceding that it is correct when it yells at you. Since the project is Maven-based, I added the check in the project's pom file:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>findbugs-maven-plugin</artifactId>
	<version>3.0.3</version>
	<configuration>
		<includeTests>true</includeTests>
	</configuration>
	<executions>
		<execution>
			<phase>compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
		<execution>
			<id>findbugs-test-compile</id>
			<phase>test-compile</phase>
			<goals>
				<goal>check</goal>
			</goals>
		</execution>
	</executions>
</plugin>

For most uses, that's all that's required. Now, when the project is compiled, FindBugs will give it a once-over and halt the build if it finds anything it doesn't like. This can be tweaked a great deal - for example, changing the checks to run or the severity of the problem needed to fail the build - but the defaults will likely suit.
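
For example, the effort and threshold knobs can be turned up or down in the configuration block - the values here are just an illustration:

<configuration>
	<includeTests>true</includeTests>
	<!-- Run the deepest (and slowest) analysis -->
	<effort>Max</effort>
	<!-- Only fail the build for high-priority problems -->
	<threshold>High</threshold>
</configuration>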

Adding these extra checks involves a lot of plusses and minuses. The big minus is that you may end up spending a lot of time "fixing" bugs that don't really exist, time that you could instead spend actually writing your application (and writing new bugs that the tools won't find anyway). There's really nothing to be gained by carefully explaining to Eclipse for the hundredth time that toString always returns non-null.

Still, particularly when tested out in a small, low-surface-area app, this can be a good practice to learn and refine. Eventually, a move to Java 8 will help this more, and it certainly doesn't hurt to add in nullability annotations in the mean time. Overall, I think having the tooling help you avoid a whole suite of common "brain fart" bugs like this is worthwhile.

Maven Native Chronicles: Running Automated Notes-based Tests

Feb 27, 2016 5:02 PM

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

This post isn't really in my ongoing Java thread, though it's related in that this is the sort of thing that may come up in fairly-advanced cases. This post will assume a functional knowledge of Maven, Tycho, and JUnit.

For Darwino, I ran into the need to run unit tests on Domino-adapter code during the Maven build process. Since the Domino project tree uses Tycho, this ended up differing slightly from standard Maven testing. Rather than using the src/test/java directory in the same project to house the associated tests, Tycho prefers the very-OSGi-native method of having a separate project, but declaring it a "fragment" plugin attached to the primary one. In OSGi terms, a fragment is a special type of plugin that, when loaded by the runtime, gets glommed on to a specified host plugin and runs in its same classpath. In other cases, this may be used to provide platform-specific additions, add locale resources, or other uses.

So I created a new fragment project, which is structurally much like a normal plugin, but with an extra line in the MANIFEST.MF:

Fragment-Host: com.example.some.parent.plugin

This line tips off the OSGi environment to its nature. In the pom.xml, there are a number of important differences related both to how Tycho handles test fragments and the necessity of loading the Notes native libraries:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>some-parent</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.some.parent.plugin.test</artifactId>
	<packaging>eclipse-test-plugin</packaging>

	<build>
		<plugins>
			<!--
				By default, Tycho doesn't include the other fragment plugins when running the test.
				So here, we manually include the appropriate features. 
			 -->
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>target-platform-configuration</artifactId>
				<version>${tycho-version}</version>
				
				<configuration>
					<dependency-resolution>
						<extraRequirements>
						
							<requirement>
								<type>eclipse-plugin</type>
								<id>com.ibm.notes.java.api.win32.linux</id>
								<versionRange>[9.0.1,9.0.2)</versionRange>
							</requirement>
							
							<requirement>
								<type>eclipse-feature</type>
								<id>com.example.some.native.feature</id>
								<versionRange>0.0.0</versionRange>
							</requirement>
							
						</extraRequirements>
					</dependency-resolution>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				
				<configuration>
					<testSuite>${project.artifactId}</testSuite>
					<testClass>com.example.some.parent.plugin.test.AllTests</testClass>
				</configuration>
			</plugin>
		</plugins>
	</build>
	
</project>

The preamble is the same as usual for Maven, but the packaging is slightly different. Instead of eclipse-plugin, this should be packaged as eclipse-test-plugin. Tycho's packaging doesn't particularly care about whether or not it's a fragment, but it does care about its test nature.

Things get a little interesting in the target-platform-configuration block. These two entries have similar purposes: to cause Tycho to load up other, native-artifact fragments required to run the tests. The first one showed up in the Java series: it contains Notes.jar, but, because it is itself a fragment (and can't be directly depended upon by the test project), Tycho won't automatically load it unless directed to. The second one serves a similar purpose, but loads a feature instead. This feature contains references to a number of distinct platform-dependent native-artifact fragments, and specifying this dependency causes Tycho to consider each one without having to specifically enumerate them in the POM.

The final block is a little simpler, and it just tells Tycho where to start when it goes to run the fragment as a test suite. The AllTests class is a test suite in the JUnit 4 convention, with @RunWith and @Suite.SuiteClasses annotations.
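
Such a suite class is just an annotated shell pointing at the real test classes - a minimal sketch, with hypothetical member classes:

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// The class named in Tycho's testClass configuration; the member
// classes here are placeholders for the actual tests
@RunWith(Suite.class)
@Suite.SuiteClasses({
	TestSessionCreation.class,
	TestDatabaseAccess.class
})
public class AllTests {
}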


There's another catch to this, though: Notes has some specific demands on its environment, and in particular must be run with knowledge of a Notes program directory, a data directory, a notes.ini, and an ID file (unless you're doing DIIOP (which you probably shouldn't)). The specifics of what the libraries expect in their runtime environment and how they should be loaded in their API calls vary a little from platform to platform, and I ended up with a pile of "just keep trying stuff until it works" code. The result, though, is that I have automated tests running on Windows, Linux, and OS X. First, there's the large platform-specific section of my root POM, which defines platform-activated profiles that set up environment variables:

<!-- These profiles add support for specific platforms for tests -->
<profiles>
	<profile>
		<activation>
			<os>
				<family>Windows</family>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					<version>${tycho-version}</version>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
	<profile>
		<id>mac</id>
		<activation>
			<os>
				<family>mac</family>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
							<LD_LIBRARY_PATH>${notes-program}${path.separator}${env.LD_LIBRARY_PATH}</LD_LIBRARY_PATH>
							<DYLD_LIBRARY_PATH>${notes-program}${path.separator}${env.DYLD_LIBRARY_PATH}</DYLD_LIBRARY_PATH>
							<Notes_ExecDirectory>${notes-program}</Notes_ExecDirectory>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
	<profile>
		<id>linux</id>
		<activation>
			<os>
				<family>unix</family>
				<name>linux</name>
			</os>
			<property>
				<name>notes-program</name>
			</property>
		</activation>
	
		<build>
			<plugins>
				<plugin>
					<groupId>org.eclipse.tycho</groupId>
					<artifactId>tycho-surefire-plugin</artifactId>
					<version>${tycho-version}</version>
					
					<configuration>
						<skip>false</skip>
						
						<argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
						<environmentVariables>
							<!-- The res/C path entry is important for loading formula language properly -->
							<PATH>${notes-program}${path.separator}${notes-program}/res/C${path.separator}${notes-data}${path.separator}${env.PATH}</PATH>
							<LD_LIBRARY_PATH>${notes-program}${path.separator}${env.LD_LIBRARY_PATH}</LD_LIBRARY_PATH>
							
							<!-- Notes-standard environment variable to specify the program directory -->
							<Notes_ExecDirectory>${notes-program}</Notes_ExecDirectory>
							<Directory>${notes-data}</Directory>
							
							<!-- Linux generally requires that the notes.ini path be specified manually, since it's difficult to determine automatically -->
							<!-- This variable is a convention used in the test classes, not Notes-standard -->
							<NotesINI>${notes-ini}</NotesINI>
						</environmentVariables>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>
</profiles>

Each block is kicked off both by a specific OS combination, using Maven's OS names (you can also target specific architectures within them), as well as the presence of a notes-program property. This is a convention I've adopted to go alongside the notes-platform property that points to the XSP plugins; this one instead points to the root Notes or Domino install to use for execution.

Windows is the easiest since Notes still feels most at home on there. There, it's just a matter of adding the Notes program root to the Java library path and the environment's PATH. From there, the Notes libraries automatically picked up the data directory and notes.ini, presumably from the registry.

The Mac is mildly more complex: in addition to the two settings from Windows, I also ended up adding the program path to LD_LIBRARY_PATH and DYLD_LIBRARY_PATH. I'm not entirely sure both are needed, but hey, it works this way. In addition, I had to specify Notes_ExecDirectory. After that, the tests found the location of the data dir and Notes Preferences, presumably due to Mac OS conventions.

Linux needed the most hand-holding, which shouldn't be too surprising for those who have installed Domino on Linux - it doesn't seem to respect any platform conventions there. In addition to specifying the notes-program property and using it in the same places as on the Mac, I also added two more properties to my Maven config: notes-data, to point to the data directory, and notes-ini, to point to notes.ini. I used the notes-data property to specify the Directory environment variable that the Notes libraries look for, and then I also specified NotesINI. That's not something that the Notes libs look for, but instead it's a way to shuttle the configuration to the Java code that actually executes the tests.
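
The upshot is that an active profile in ~/.m2/settings.xml on a Linux builder ends up with a trio of properties along these lines (the paths being typical examples, not universal values):

<properties>
	<notes-program>/opt/ibm/domino/notes/latest/linux</notes-program>
	<notes-data>/local/notesdata</notes-data>
	<notes-ini>/local/notesdata/notes.ini</notes-ini>
</properties>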

That leads to the final hurdle: initializing the Notes environment in the JUnit test classes. To do that, I specified a @BeforeClass method that checks for the presence of the Notes_ExecDirectory and NotesINI environment variables. If they're present (i.e. it's Linux), it calls NotesInitExtended with the value of Notes_ExecDirectory as the first argument and = plus the value of NotesINI as the second. Afterwards, whether or not that was called, it calls NotesThread.sinitThread(), and from then on NotesFactory.createSession() will generate proper native sessions.

There's also an @AfterClass method that is the mirror of that: it calls NotesThread.stermThread() and then, on Linux, NotesTerm.
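
Put together, the lifecycle methods look roughly like this. NativeNotes here is a hypothetical stand-in for whatever binding exposes the C-level NotesInitExtended and NotesTerm calls - that part isn't a standard lotus.domino API - and the suite annotations from the earlier sketch are omitted for brevity:

import org.junit.AfterClass;
import org.junit.BeforeClass;

import lotus.domino.NotesThread;

public class AllTests {
	@BeforeClass
	public static void initNotes() throws Exception {
		String execDir = System.getenv("Notes_ExecDirectory");
		String notesIni = System.getenv("NotesINI");
		if (execDir != null && notesIni != null) {
			// Linux: pass the program directory and "=" plus the notes.ini path;
			// NativeNotes is a hypothetical wrapper around the C NotesInitExtended call
			NativeNotes.NotesInitExtended(new String[] { execDir, "=" + notesIni });
		}
		// After this, NotesFactory.createSession() produces proper native sessions
		NotesThread.sinitThread();
	}

	@AfterClass
	public static void termNotes() {
		NotesThread.stermThread();
		if (System.getenv("Notes_ExecDirectory") != null && System.getenv("NotesINI") != null) {
			// Mirror the explicit init on Linux - also via the hypothetical wrapper
			NativeNotes.NotesTerm();
		}
	}
}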


So yeah, there are a lot of hoops to hop through! Hopefully, this post will be helpful for someone attempting to do the same thing I did, and it'll cut down on a lot of searching around and trying to piece together a working environment.

That Java Thing, Part 16: Maven Fallout

Feb 23, 2016 2:33 PM

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment
  19. Java Hiccups
  20. Bitwise Operators
  21. Java Grab Bag 2

So, after the last post's large task of converting to Maven, this step is mostly about picking up the pieces and expanding on some of the concepts. We'll start with M2Eclipse, usually rendered as just "m2e".

m2e

m2e is the set of plugins that acts as Eclipse's interface to Maven. It more-or-less replaces the earlier maven-eclipse-plugin, though you will likely still see references to that around. Since Eclipse doesn't have any inherent knowledge of how Maven works, m2e has the complicated task of reading your projects' pom.xml files and adapting them to Eclipse's internal configuration. So, for example, in our projects it saw the presence of Tycho and determined that they should be imported as OSGi projects. In other cases, m2e may pick up the presence of things like Android plugins to trigger the use of the Android development tools.

Though it tries mightily, m2e is the source of a lot of the consternation that can come with a switch to Maven-based development. Because most Maven plugins don't have any inherent allowances for working in an Eclipse environment, adapters have to be written for each one in order for them to work with m2e - this is what yesterday's dialog about installing the Tycho adapters was about. In some cases, these don't exist and you have to tell m2e to ignore the plugin; in other cases, the adapters DO exist, but are flawed in some way. Most of the time, things go alright, but there are enough edge cases that it can be irritating.

For this kind of task, m2e is pretty unobtrusive, but it's important to know it's there.

Updating the .gitignore

One side effect of m2e's behavior is that it's no longer a good idea to keep Eclipse's project configuration files in the Git repository. Removing them is not strictly required, but it can avoid a number of annoying problems when dealing with multi-person Maven projects. To start with, open the .gitignore file from the root of your local Git repository (you can get to this easily using Eclipse's Git Repositories view, in the "Working Directory" part of the repo). Add some lines at the end to ignore .project and .classpath, so your whole file should now look like:

._*
Thumbs.db
.DS_Store

*.class

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
#*.jar
*.war
*.ear

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

# Eclipse project files
.project
.classpath

Depending on how your (hypothetical) team wants to work, it may also make sense to ignore the .settings/ directory, which stores some additional Eclipse project information. However, some of that information may be useful to share - for example, on-save code-cleaning behavior that isn't readily expressed in Maven.

Due to the way Git works, just adding the files to the .gitignore won't remove them from the repository: instead, they'll just no longer show up in the list for new changes. In order to also remove them from the repository without deleting them from the filesystem, go to the "Navigator" pane in Eclipse (if it doesn't show up currently, you can add it via Window → Show View → Navigator), find each .project and .classpath file in the four projects (some will only have the former), right-click, and choose Team → Advanced → Untrack:

Now, commit the changes - though the files remain on the filesystem, they should show up as deleted in the commit dialog:

The target Folder

This one is one we've already prepped a bit for. Whereas most Eclipse projects store their binary output (Java class files, Jars, etc.) in the bin folder or elsewhere, the standard Maven behavior is to use target. For most of the projects, this doesn't matter - we had already configured the plugin to use a subfolder here for its classes, and the temporary files for other aspects don't matter. However, it's still important to know about this; when you're looking for the compiled or packaged output of a Maven project, this is the place to look, and we'll run into this when building the update site.

Building the Update Site

There's an important change involved now with how the update site is built: it does not involve opening the site.xml and clicking "Build All" anymore. Instead, it involves right-clicking the root project ("parent-xsp") and choosing Run As → Maven Install:

There are two logical followup questions when seeing this change: "what?" and "why?". They're both bound together to what the nature of a Maven project is, and, significantly, the way Eclipse interacts with them. Maven is primarily a command-line tool - granted, it's a set of Java classes, but the primary way to interact with it is via the command line. m2e does a lot of work to interpret the projects in the same way as the `mvn` command-line tool, but it's just a secondary interpretation due to the way Maven and Eclipse work.

The way to build a Maven-ized project in a way that fully uses the configuration is to run the command-line tool. Fortunately, m2e comes with its own embedded version and doesn't require you to use a terminal, but the abstraction is very leaky - and this is why you use "Run As" instead of any of the normal "Build"-related commands. The "Run As" commands construct a CLI-type environment and execute the embedded Maven, which is what then does the real work.

Since "parent-xsp" is the root of our projects, it's the starting point to execute a Maven build. When you run this, you'll see a lot of chatter in Eclipse's Console view, particularly the first run: Maven will seek out the plugins needed to build the projects and install them to the local Maven repository (stored in ".m2/repository" in your home folder). After that, it will build and package each of your projects. There's a whole phase system going on here (similar in concept to the XPages lifecycle), as well as many configuration options, but the important part here is that the "install" command (called a "goal" in Maven parlance) is the last phase that we will worry about, and it will cover everything we need here.

Upon completion, the Console text should end with something like this (incidentally, "Reactor" is Maven's term for the entire blob of modules being processed):

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] parent-xsp ......................................... SUCCESS [  0.617 s]
[INFO] com.example.xsp.plugin ............................. SUCCESS [  1.647 s]
[INFO] com.example.xsp.feature ............................ SUCCESS [  0.411 s]
[INFO] com.example.xsp.update ............................. SUCCESS [  4.143 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30.310 s
[INFO] Finished at: 2016-02-23T13:55:20-05:00
[INFO] Final Memory: 80M/191M
[INFO] ------------------------------------------------------------------------

Part of this process is the creation of the update site, which, due to how we configured it, will be represented twice in the "target" folder in "com.example.xsp.update": as a tree of files inside the "site" folder and also zipped up into the "site_assembly.zip" file. There's also a file named "site.zip", but that contains just the site.xml, which is not important. It's these files that you should now target with Designer and the NSF Update Site when updating the plugin. In fact, it'd be a good idea to delete the "features" and "plugins" folders from outside the "target" folder now - they won't be used any more.

As for the "Build All" button in site.xml, it's best to pretend it doesn't exist. It will still work, but it will break your Maven build, because it overwrites the "qualifier" in the version numbers. This is, admittedly, a drag: it's convenient having a clear, logical button to build the site, and it's very inconvenient that Eclipse doesn't tell you not to use it any more. However, the Maven process, besides being now required, has a nice advantage: now building the update site will no longer cause Git to want to check in the change. That's something that can get annoying very quickly when working with another developer on a non-Mavenized OSGi project.

Adding Back The Source Plugin

We'll finish the day on an easy one: adding back in the source plugin. Unlike the original setup, which used a separate feature to house the source plugin, we'll now include it in the same feature. You could also continue to have a separate source feature, which would be useful for very large projects where it would actually be a big burden to deploy the source to servers, but, for XPages libraries, it's generally not worth the cognitive hassle.

Since we already configured the Tycho source plugin earlier, this is just a matter of adding a reference to the (implied) source plugin to the feature.xml:

<?xml version="1.0" encoding="UTF-8"?>
<feature
      id="com.example.xsp.feature"
      label="Example XSP Library Feature"
      version="1.0.0.qualifier">

   <description url="http://www.example.com/description">
      [Enter Feature Description here.]
   </description>

   <copyright url="http://www.example.com/copyright">
      [Enter Copyright Description here.]
   </copyright>

   <license url="http://www.example.com/license">
      [Enter License Description here.]
   </license>

   <plugin
         id="com.example.xsp.plugin"
         download-size="0"
         install-size="0"
         version="0.0.0"
         unpack="false"/>

   <plugin
         id="com.example.xsp.plugin.source"
         download-size="0"
         install-size="0"
         version="0.0.0"/>

</feature>

Now, the source will be included in the output and bundled with the main feature when it's installed. Commit this change:


At this point, we're basically back to where we were previously, OSGi-wise, but in a much better position to scale the project further and take advantage of supporting systems. The next couple posts will cover some of those potential systems, as well as remaining large conceptual topics. There's a great deal to know when it comes to Maven, but it's all helpful.

That Java Thing, Part 15: Converting the Projects

Feb 22, 2016 10:26 AM

Tags: maven java
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment
  19. Java Hiccups
  20. Bitwise Operators
  21. Java Grab Bag 2

Prelude: there was a typo in the previous entry. Originally, the file URL read "file://C:/IBM/UpdateSite", but, on Windows, there should be another slash in there: "file:///C:/IBM/UpdateSite". I've corrected the original post now, but you should make sure to fix your own settings.xml file if needed. Otherwise, Maven will complain down the line about the URI having "an authority component".

The time has come to do the dirty work of converting our existing plugin projects to Maven. There will be some filesystem-side reorganizing and not every project will make it (looking at you, source project), but overall it's mostly a job of pasting a bunch of XML into new files.

For the first leg of this, I recommend removing the projects from your Eclipse workspace by selecting them, right-clicking, and choosing Delete:

On the confirmation dialog, do not select "Delete project contents on disk" - we don't actually want to get rid of the files.

Next, find the projects on your filesystem, create a new folder alongside them named "com.example.xsp", and move the projects inside it. In Maven parlance, we're creating a "multi-module project", and this new folder is the top level in our module hierarchy. This can contain arbitrary levels and can be very helpful in project organization, but this will be a pretty simple parent-and-children case. Next, dive into the folder and delete the "com.example.xsp.source.feature" project - we'll be able to generate this through Maven now, and so we can trim down our project count slightly.

Now, create a file named pom.xml in the "com.example.xsp" folder alongside the subfolders, and fill its contents with this:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.example</groupId>
	<artifactId>parent-xsp</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	
	<packaging>pom</packaging>

	<modules>
		<module>com.example.xsp.plugin</module>
		<module>com.example.xsp.feature</module>
		<module>com.example.xsp.update</module>
	</modules>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<tycho-version>0.24.0</tycho-version>
		<compiler>1.6</compiler>
	</properties>

	<repositories>
		<repository>
			<id>Luna</id>
			<layout>p2</layout>
			<url>http://download.eclipse.org/releases/luna/</url>
		</repository>
		<repository>
			<id>notes</id>
			<layout>p2</layout>
			<url>${notes-platform}</url>
		</repository>
	</repositories>

	<build>
		<plugins>
			<!--
				Maven compiler options
			-->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			
			<!--
				Tycho plugins
			-->
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-maven-plugin</artifactId>
				<version>${tycho-version}</version>
				<extensions>true</extensions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<strictVersions>false</strictVersions>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-compiler-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<source>${compiler}</source>
					<target>${compiler}</target>
					<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-source-plugin</artifactId>
				<version>${tycho-version}</version>
				<executions>
					<execution>
						<id>plugin-source</id>
						<goals>
							<goal>plugin-source</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>target-platform-configuration</artifactId>
				<version>${tycho-version}</version>
				<configuration>

					<pomDependencies>consider</pomDependencies>
					<dependency-resolution>
						<extraRequirements>
							<requirement>
								<type>eclipse-plugin</type>
								<id>com.ibm.notes.java.api.win32.linux</id>
								<versionRange>[9.0.1,9.0.2)</versionRange>
							</requirement>
						</extraRequirements>
						<optionalDependencies>ignore</optionalDependencies>
					</dependency-resolution>

					<filters>
						<!-- work around Equinox bug 348045 -->
						<filter>
							<type>p2-installable-unit</type>
							<id>org.eclipse.equinox.servletbridge.extensionbundle</id>
							<removeAll />
						</filter>
					</filters>

					<environments>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>linux</os>
							<ws>gtk</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86</arch>
						</environment>
						<environment>
							<os>win32</os>
							<ws>win32</ws>
							<arch>x86_64</arch>
						</environment>
						<environment>
							<os>macosx</os>
							<ws>cocoa</ws>
							<arch>x86_64</arch>
						</environment>
					</environments>
					<resolver>p2</resolver>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

So... yeah, there's a lot going on here. This is the biggest of the "XML dumps" we're going to have and contains by far the greatest number of bizarre "you just have to know about it" parts. "POM" stands for "Project Object Model" - it's the language Maven uses to describe the project. Let's tackle the file from near the top (ignoring the XML header):

project and modelVersion

These elements are effectively just boilerplate: the project element is the root of our project descriptor and it contains some schema declarations that let XML editors validate the file. In turn, the modelVersion describes to Maven the specific model version we're working with, which has been "4.0.0" for as long as I've been doing this.

groupId, artifactId, and version

These elements are obligatory in one form or another in every project, but are less copy-and-paste-able: they define the name and version of your project. These are Maven's equivalents to OSGi's Bundle-SymbolicName and Bundle-Version, though Maven makes an explicit distinction between the overall grouping of the plugin and its specific name. These are essentially arbitrary, but the convention is to use the standard reverse-DNS version of your domain name for the group ID, and then keep this group ID consistent across different projects (...mostly). The artifact ID is less consistent, but it's good to pick a pattern like "projectPrefix-submodule". Here, we actually reverse that a bit to call it "parent-xsp" in order to emphasize that this project's purpose is entirely to be a parent to the submodules and not an interesting artifact to consume itself. We'll break this convention again for the submodules due to our use of Tycho/OSGi.

packaging

A project's packaging describes the sort of output it produces. By default, if this is left unspecified, it's jar - a normal, run-of-the-mill Jar file. There are a few other common ones you may run into - such as war for J2EE web apps or bundle for non-Tycho OSGi bundles - and the one we're using here is pom. This is actually kind of the "none of the above" option: the "pom" is just the file we're editing now, and is included with every project type. Having a packaging type of pom generally means that either the project has no real outputs of its own (as is the case here) or it's an "ad hoc" project that doesn't fit an existing type.

modules

This block is the hallmark of a parent project: it lists the relative folder paths that contain the submodules. In this case, the names line up with the names we'll use for the submodules, but this could potentially vary depending on the folder names and locations. Parent-child module relationships don't have to be physically hierarchical on the filesystem, but it's a good convention when you don't have a specific reason to break it.

properties

This block is the project-level equivalent to the user-level property we defined in .m2/settings.xml. There are two types of properties that can go here, with no obvious distinction between them: known configuration properties used by Maven itself and arbitrary named variables used by the person writing the pom.

The first property - project.build.sourceEncoding - is an example of the former. During execution, Maven will reference this property defined in the project (or one of its parents) when determining the text file encoding to use. This could be set to something else if you're working with non-Unicode files, but it's important to set it here so that file interpretation will not be platform-dependent. These property names can be read as a sort of EL for the project XML: project.build.sourceEncoding effectively addresses a sourceEncoding property within the build node of the project, just more concisely (more or less).

The other two are variables for use later. The names of these have only very loose conventions, but a few common styles show up: ALL_CAPS, camelCase, and hyphen-delimited. You can also specify variables as dot.delimited and they will work the same way, but that makes them more difficult to distinguish from the system-level properties.

repositories

The repositories block is the start of our OSGi-related weirdness. The block itself isn't OSGi-specific - it has its role in any project that needs dependencies from outside the core public Maven repositories - but the contents are. We're setting two repositories here: one pointing to the main Eclipse repository (the Luna version here, but it could just as well be Mars or Kepler) and one pointing to the XPages Update Site. This is where the property we set before comes into play, allowing different developers to keep the update site in different locations without changing the project's config.
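As an aside, properties like notes-platform can also be supplied or overridden for a single build from the command line - the path here is just an example:

mvn clean install -Dnotes-platform=file:///C:/IBM/UpdateSite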

build

The build section is often the largest part of a POM file - it contains definitions and configuration for various additional Maven plugins used during compilation. We have two tasks to accomplish here: ensure that we use Java 6 for compilation at the root level (to ensure the build doesn't execute as Java 7 or 8 and be incompatible with Domino) and enable a whole slew of Tycho plugins.

The maven-compiler-plugin block specifies the version of the Java compiler plugin to use (3.1, which is actually kind of old, but the differences aren't important) and then provides it with some configuration to set the Java version level and to not choke on forbidden references. Like the Java version, the latter is a nod to Domino: depending on your JVM configuration, you may run into forbidden-reference errors relating to the lotus.domino classes.

The next slew of blocks all relate to loading up various Tycho components. Tycho's job is essentially to construct an entire OSGi environment during the Maven build, and it consists of a number of moving parts, many of which are basically the Tycho version of normal Maven facilities. The tycho-maven-plugin is the core, and its extensions rule is what allows it to worm itself into various phases of the build process. The tycho-packaging-plugin controls the process of bundling the projects as their various types: the plugin, the feature, and the update site (in our case). The tycho-compiler-plugin is the Tycho variant of the Maven one we configured earlier. The tycho-source-plugin is what allowed us to kick the standalone source feature to the curb - it's the equivalent of the Eclipse-specific feature we had been hooking into before.

The target-platform-configuration plugin is the scariest of the bunch. This plugin's job is to establish the OSGi Target Platform we're working with, in conjunction with the repositories specified above. Not all of this configuration is necessary for our immediate needs, but it may come in handy later. The pomDependencies rule is useful when using mixed-type dependencies in more-complicated projects, while the extraRequirements block forces the inclusion of the plugin fragment that contains Notes.jar. The optionalDependencies rule comes in handy from time to time with XPages projects: there are sometimes cases where there's a dependency that Eclipse has and which the server will have, but which would be awkward to get at in Maven, usually relating to dependencies-of-dependencies not needed for compilation. The filters block is... I don't know; just keep it in there. The environments block is a way to describe the platforms on which your code can execute - I believe this is primarily used when testing. The resolver is like filters in that it's a "I just copy it around" thing; presumably, it refers to a specific code path for resolving plugin dependencies.

Whew!

Okay, so... that's the first one down! For the most part, you can carry around the whole bottom section pretty much as-is for your XPages Maven projects (as I do), and then gradually become comfortable with the specifics over time. Now, there's some good news and some bad news:

  • The bad news is that there are three more POM files to write.
  • The good news is that they are much, much simpler.

In recent versions of Tycho, they've added the ability to reduce the number of POMs involved, but there are limits to that, particularly to do with Jenkins, so we'll stick to the traditional way for now.

com.example.xsp.plugin

Go into the "com.example.xsp.plugin" folder and create a new pom.xml containing this:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.plugin</artifactId>
	<packaging>eclipse-plugin</packaging>
</project>

Like I promised: much simpler. Because the parent POM already brought in all the Tycho plugins and configuration, all we need to do here is the basics. One slightly-unusual aspect here is the packaging type. eclipse-plugin isn't a packaging type known to Maven inherently; instead, it's provided by Tycho, but can be used in the same way.

The artifact ID here is a concession to Tycho: Maven artifact IDs don't usually follow the same full-reverse-DNS convention as OSGi, but Tycho wants the artifact ID to match the OSGi bundle name.

com.example.xsp.feature

Next up is the pom.xml file in the "com.example.xsp.feature" folder:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.feature</artifactId>
	<packaging>eclipse-feature</packaging>
</project>

This is very similar to the last, with the only real differences being the artifact ID and the packaging type.

com.example.xsp.update

Now, the pom.xml in "com.example.xsp.update":

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>com.example.xsp.update</artifactId>
	<packaging>eclipse-update-site</packaging>

	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-packaging-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<archiveSite>true</archiveSite>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

This one's slightly longer, but not by too much. Beyond the different artifact ID and packaging, we also provide some additional configuration to the tycho-packaging-plugin. This is among the plugins that were established in the root POM, but it's re-defined here in order to enable the archiveSite configuration option. This will give us a nice ZIP file of the Update Site at the end.

There's one other thing to note here: Tycho considers the eclipse-update-site packaging type to be deprecated, and it may be removed in the future. In most examples you'll see outside of Domino, people use eclipse-repository instead. This gets back to the difference between the old-style Eclipse Update Sites (defined by "site.xml") and the new-style p2 repositories (defined by "category.xml"). For now, we use the old-style variant because it works better with Notes and Domino.
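For reference, a hypothetical new-style module would differ only in its artifact ID and packaging (and would use a category.xml in place of site.xml) - a sketch, not something we'll use here:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>parent-xsp</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<!-- Hypothetical artifact ID; the packaging type is the real difference -->
	<artifactId>com.example.xsp.repository</artifactId>
	<packaging>eclipse-repository</packaging>
</project>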

In addition to this POM file, we also have two changes to make in the site.xml: remove the source-feature reference (we'll add this back elsewhere later) and clean up the versions:

<?xml version="1.0" encoding="UTF-8"?>
<site>
   <feature url="features/com.example.xsp.feature_1.0.0.qualifier.jar" id="com.example.xsp.feature" version="1.0.0.qualifier">
      <category name="Example"/>
   </feature>
   <category-def name="Example" label="Example"/>
</site>

The version change swaps the Eclipse-generated timestamps at the end of the versions for the literal token "qualifier". The reason for this is that, for Tycho, the site.xml acts as a pure configuration file and will no longer be the site index itself. So Tycho wants the qualifier to be generic, and it will fill in a generated timestamp during compilation. Like the source feature, this will be covered more later.

Last Steps

With our POM files defined, the last step for now is to import the projects back into Eclipse. In Eclipse, go to File → Import, expand the "Maven" category, and choose "Existing Maven Projects":

On the next screen, browse to the "com.example.xsp" directory created earlier. If all goes well, this should find the four projects in their hierarchy:

Everything on this can be left as the defaults, though you may want to specify a more-descriptive working set name - that doesn't affect the project behavior.

When you click "Finish", Eclipse with churn for a bit and then, if you're running Mars and haven't done this before, it will present a dialog about "Maven plugin connectors":

The specifics of what is going on here are a large topic of their own, but the short of it is that Eclipse needs specialized plugins to deal with each Maven plugin, and in this case it's looked for (and found) connectors for Tycho. Click "Finish", "Next", and "OK", accept the license terms, and restart Eclipse as it tells you to.

When Eclipse restarts, it should go through some churning while it updates the Maven projects and should finally settle on no remaining errors.

Closing Out

There will be some things to discuss with the fallout from this conversion, but this will do it for today. Commit your changes, stand up and stretch, and grab a cup of relaxing tea:

That Java Thing, Part 14: Maven Environment Setup

Feb 21, 2016 5:51 PM

Tags: java maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment
  19. Java Hiccups
  20. Bitwise Operators
  21. Java Grab Bag 2

Before diving into the task of converting our plugin projects to Maven, there's a bit of setup we need to do. In a basic case, Maven doesn't require much setup beyond the project file itself - it's a "convention over configuration" type of thing that tries to make doing things the default way smooth. However, since it's also a "Java" thing, that means that anything out of the ordinary requires a bunch of XML.

Our big "out of the ordinary" aspect is OSGi. Maven and OSGi are often at loggerheads, but the conflict won't be too great in our situation. Still, it does mean there will be a few hoops to jump through, and one of those hoops is dealing with our dependency on the XPages runtime plugins. Since these plugins are not packaged as fully-Mavenized artifacts (yet (hopefully)), we'll need to configure Tycho to read the p2 (Eclipse) style.

In part 3, we downloaded the Build Management Update Site from OpenNTF, and we'll reuse that here. What we need to do is create a global Maven settings file that, for now, will just contain a definition of a variable to point to this update site. It would also be possible to specify this inside the project itself, but it's better form to use a consistent variable name (the most common convention is notes-platform) in the project files and then have your local settings point to it on your machine.

The global Maven settings file is called settings.xml and is stored in a folder named .m2 in your user's home directory (e.g. C:\Users\someuser\.m2 or /Users/someuser/.m2). Creating a folder named with a leading dot can be a pain in Explorer and the Finder, so it may be necessary to drop into a command line or other tool to do it. One way or another, create this file and set its contents similar to this:

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
	<profiles>
		<profile>
			<id>main</id>
			<properties>
				<notes-platform>file:///C:/IBM/UpdateSite</notes-platform>
			</properties>
		</profile>
	</profiles>
	<activeProfiles>
		<activeProfile>main</activeProfile>
	</activeProfiles>
</settings>

Adjust the file:// URL as necessary to point to the location on your computer. It has to be a file URL and not a normal path, presumably because repositories are usually expected to be remote HTTP sites.

This is the only configuration we need before getting to the project, but it's a good preview of the sort of "try pasting this big block of XML somewhere" advice you're in for when it comes to Maven use. Over time, the structure of the XML and how it relates to Maven's behavior begins to crystallize, but it's definitely cumbersome to start with, and it will get more opaque before it gets less so.

Depending on your proclivities, this may be a good opportunity to install standalone Maven as well. Eclipse has its own embedded version, so this is not required, but it can be handy sometimes to be able to run Maven from the command line. On your average Linux distribution or OS X with Homebrew, Maven should be installable with a package manager. Otherwise, Maven can be downloaded from maven.apache.org - it doesn't have an installer as such, as it's essentially some scripts around Java classes, but they have tips for adding it to your path.
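For example, on OS X with Homebrew installed, it should be as simple as:

brew install maven
mvn -version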

Next, we'll get to the real meat of this process: actually converting the projects to Maven.

That Java Thing, Part 13: Introduction to Maven

Feb 19, 2016 6:27 PM

Tags: maven
  1. That Java Thing, Part 1: The Java Problem in the Community
  2. That Java Thing, Part 2: Intro to OSGi
  3. That Java Thing, Part 3: Eclipse Prep
  4. That Java Thing, Part 4: Creating the Plugin
  5. That Java Thing, Part 5: Expanding the Plugin
  6. That Java Thing, Part 6: Creating the Feature and Update Site
  7. That Java Thing, Part 7: Adding a Managed Bean to the Plugin
  8. That Java Thing, Part 8: Source Bundles
  9. That Java Thing, Part 9: Expanding the Plugin - Jars
  10. That Java Thing, Part 10: Expanding the Plugin - Serving Resources
  11. That Java Thing, Interlude: Effective Java
  12. That Java Thing, Part 11: Diagnostics
  13. That Java Thing, Part 12: Expanding the Plugin - JAX-RS
  14. That Java Thing, Part 13: Introduction to Maven
  15. That Java Thing, Part 14: Maven Environment Setup
  16. That Java Thing, Part 15: Converting the Projects
  17. That Java Thing, Part 16: Maven Fallout
  18. That Java Thing, Part 17: My Current XPages Plug-in Dev Environment
  19. Java Hiccups
  20. Bitwise Operators
  21. Java Grab Bag 2

I've been laying warnings that this would be coming and you've seen me grouse about it for over a year, but now the time has come to really dive into Maven for Domino developers.

To lead into it, there are two main topics to cover: what Maven is and why you should bother.

What Maven Is

Maven is a build automation tool, primarily for Java applications but able to work with a number of other languages and environments.

The concept of a "build automation tool" is a strange one when you're coming from a Notes/Domino perspective, and it's the source of a lot of consternation when moving to it. In classic Notes, there conceptually was no build phase for an application: certain things would be compiled on save, but there was rarely any need to think about this. Designer was the way to write applications and it took care of it. With XPages came a bit of Eclipse-ism with the notion of "Build" being a separate, not-necessarily-automatic stage, but there still wasn't much user-facing configuration going on: other than maybe adding a source folder, the IDE just kind of took care of it.

Even for OSGi plugin developers, the need seems a little arcane. Eclipse does have project and build configurations, and it runs through build scripts internally when you export Jars or build an update site. Again, though, this is all largely hidden and the user doesn't normally have to think much about it.

Where this comes in, though, is when you want to start expanding your projects in ways beyond the "single bag of code" stage: automatically including pre-packaged dependencies, making the project cleanly available to others downstream, sharing configuration across projects, and, particularly, automating building, testing, and deployment with an environment like Jenkins. Maven (and alternatives like Gradle) provides important structure and meta-information to do these things and scale them to ever-larger tasks.

Why You Should Bother

I've found the pro-Maven pitch to be kind of a weird one, since it's sort of like unit/integration testing in that, before you do it, it doesn't seem worth the hassle, but then, when you've switched over, it seems crazy to not do it. Like a cult, I guess, but a somewhat better idea.

I'll start with an important reason: it's good for your career. Unless you want to remain on legacy-maintenance duty forever (which, granted, can be a stable gig), it's important to keep improving your skills, and build automation is a big concept that pays dividends in knowledge in Domino programming and beyond. Once you're familiar with a system like Maven, you start recognizing the same patterns elsewhere: in some of OSGi's capabilities, in older systems like Make, and, crucially, in whatever modern JavaScript toolchains are doing lately. Learning something like this opens up doors.

It also makes testing that much more natural. Building and running automated tests is certainly possible in Eclipse, but Maven's structure strongly encourages it (newly-created projects start with JUnit and a tests directory, to nudge you in that direction), and having test running be a phase in building makes it much more foolproof. The virtue of writing tests creeps up on you: even if you don't go whole-hog TDD, getting into the habit of starting each bug fix with a failing-then-successful test case means you now have a bug you'll never have to see again. Granted, as with a lot of other aspects, the nature of Domino development creates some hurdles, but it's still worth it.
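As a sketch of that habit - with an entirely hypothetical bug and helper method standing in for real project code - a regression test might look like this:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class QualifierTest {
	// Hypothetical helper standing in for the code under repair
	static String toQualifier(int year, int month, int day) {
		return String.format("%04d%02d%02d", year, month, day);
	}

	@Test
	public void qualifierPadsSingleDigitMonths() {
		// Written to fail before the (imagined) zero-padding fix, and pass forever after
		assertEquals("20160222", toQualifier(2016, 2, 22));
	}
}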

And finally: features, features, features. Once you get comfortable with Maven, it becomes much easier to spin up accompanying source and Javadoc packages for your projects, add in other languages, filter content during builds, create alternate build profiles for different situations, bring in remote resources easily, deploy to targets automatically, manage versioning cleanly, and so forth. These are all things that are possible in the absence of a system like Maven, but Maven brings them all together in a way that is understandable both by your IDE of choice and by faceless servers.

What's Next

The next step will move from high-flown concepts to some brass tacks: preparing a Maven environment for Domino development and getting a look at what Maven's configuration files look like.

Wrangling Tycho and Target Platforms

Aug 30, 2015 5:16 PM

Tags: maven tycho

One of the persistent problems when dealing with OSGi projects with Maven is the interaction between Maven, Tycho, and Eclipse. The core trouble comes in with the differing ways that Maven and OSGi handle dependencies.

Dependency Mechanisms

The Maven way of establishing dependencies is to list them in your Maven project's POM file. A standard one will look something like this:

<dependencies>
	<dependency>
		<groupId>com.google.guava</groupId>
		<artifactId>guava</artifactId>
		<version>18.0</version>
	</dependency>
</dependencies>

This tells Maven that your project depends on Guava version 18.0. The "groupId" and "artifactId" bits are essentially arbitrary strings that identify the piece of code, and, following Java standards, convention dictates that they are generally reverse-DNS-style. There are variations on this setup, such as specifying version ranges or sub-artifacts, but that's what you'll usually see. The term "artifact" is a Maven-ism referring to a specific entity, usually a single Jar file, and I've taken to using it casually.
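For example, a version range can stand in for the exact version - this hypothetical declaration accepts any 18.x release, but not 19.0:

<dependencies>
	<dependency>
		<groupId>com.google.guava</groupId>
		<artifactId>guava</artifactId>
		<version>[18.0,19.0)</version>
	</dependency>
</dependencies>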

One of the key things Maven brings to the table here is Maven Central: a warehouse of common Maven-ized projects. Without specifying any additional configuration, the dependency declaration above will cause Maven to check with Maven Central to find the Jar, download it, and store it in your local repository (usually ~/.m2/repository). Then, during the build process, Java can reference the local copy of the Jar in the consistently-organized local folder structure. It will also, if needed, download "transitive" dependencies: the dependencies listed by the project you're depending on.

OSGi's dependency system is conceptually similar. Instead of the POM file, it piggybacks on the Jar's MANIFEST.MF file with something like this:

Require-Bundle: com.google.guava;bundle-version="18.0"

This is essentially the same idea as the Maven dependency: you reference an OSGi-enabled Jar (called a "Bundle" in OSGi parlance... which can also be a "Plug-in") by its usually-reverse-DNS name and provide restrictions on versions, plus other potential options.
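The package-level equivalent, Import-Package, works the same way but references Java packages rather than whole bundles - for example (hypothetically, for Guava's collections):

Import-Package: com.google.common.collect;version="18.0"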

There is no equivalent here of Maven Central: OSGi artifacts are found in Update Sites for each project and are added to the OSGi environment. When you install a plug-in in Eclipse/Designer or Domino, you are contributing to your installation's pool of OSGi artifacts. There are some conveniences to make this experience easier in some cases, such as the Eclipse Marketplace and the primary Eclipse Update Site, but it's not as coordinated as Maven.

The Overlap

Though often redundant, these two dependency mechanisms are not inherently incompatible. A given Jar file can be represented as both a Maven artifact and an OSGi bundle - and, indeed, a great many of the artifacts in Maven Central come pre-packaged with OSGi metadata, and there are Maven plugins to make generating this invisible to the developer.

Tycho - the Maven plugin that creates an OSGi environment for your Maven development - has the capability to more-or-less bridge this gap. By adding the Tycho plugins to your Maven build, you can point Maven at OSGi Update Sites (called "p2" sites) and Tycho will be able to find the artifacts referenced by your project's MANIFEST.MF Require-Bundle line. Even better, by using <pomDependencies>consider</pomDependencies> in your Tycho config, it will be able to look at the Maven dependencies of your project, check them for OSGi metadata, and then use that to satisfy the MANIFEST.MF requirements.
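That option lives in the target-platform-configuration plugin's configuration block - as a minimal sketch:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>target-platform-configuration</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<pomDependencies>consider</pomDependencies>
	</configuration>
</plugin>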

Though convoluted to say, the upshot is that, when you have that pomDependencies option, things work out pretty well... from the command line. The trouble comes in when you want to develop these projects in Eclipse.

Target Platforms

The aggregate set of OSGi bundles known by your OSGi environment (either Tycho or Eclipse in this case) and used for compilation is the "Target Platform". If you've used the XPages SDK or otherwise set up a non-Designer Eclipse installation for XPages plug-in development, you've seen Target Platforms in action: the installation process locates your Notes and Domino installations and adds their OSGi bundles to Eclipse's Target Platform, allowing them to be referenced by your own OSGi projects.

The trouble is that Eclipse is a bit... inflexible when it comes to specifying a project's Target Platform. Though Eclipse has the capacity to have many Target Platform definitions, only one is active at a time for your entire workspace. Moreover, this Target Platform (plus any projects in your workspace) makes up the entirety of what Eclipse is willing to acknowledge for OSGi development.

This causes serious trouble for Maven dependencies.

If you have a Tycho-enabled project, Eclipse's adapter will not use its Maven dependencies for OSGi requirement resolution. So if your project lists Guava in both OSGi and Maven, even though Maven can see it, and Tycho can see it, and the Guava Jar sitting in your local Maven repository is brimming with OSGi metadata, Eclipse will not acknowledge it and you will have an error that com.google.guava can't be found.

Workarounds

There are a couple potential workarounds for this, none of which are particularly great.

Just Do It Manually

One option is to just have any developers working on the project also track down and manually add all applicable OSGi bundles to their Eclipse installation. It's not ideal, but it could work in a pinch, especially if you only have a single dependency or two.

Include the Project Wholesale

This is the approach the OpenNTF Domino API has taken to date: several of its external dependencies are included wholesale in source form in the project tree. This accomplishes the goal because, with the projects in your workspace, Eclipse will happily acknowledge them as part of the Target Platform, while Tycho will also be able to recognize them. However, it carries with it the significant down side of importing a whole heap of foreign code into your project and then having to ensure that it builds in your environment.

Maven-Generated Target Platform

Another option is to have Maven create a Target Platform file (*.target) dynamically, and then have Eclipse use that as its Target Platform definition. You can do that by including a Maven project like this in your tree:

<?xml version="1.0"?>
<project
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
	xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>com.example</groupId>
		<artifactId>project-parent</artifactId>
		<version>1.0.0-SNAPSHOT</version>
	</parent>
	<artifactId>example-osgi-target</artifactId>
	
	<packaging>eclipse-target-definition</packaging>
	
	<build>
		<plugins>
			<plugin>
				<groupId>lt.velykis.maven</groupId>
				<artifactId>pde-target-maven-plugin</artifactId>
				<version>1.0.0</version>
				<executions>
					<execution>
						<id>pde-target</id>
						<goals>
							<goal>add-pom-dependencies</goal>
						</goals>
						<configuration>
							<baseDefinition>${project.basedir}/osgi-base.target</baseDefinition>
							<outputFile>${project.basedir}/${project.artifactId}.target</outputFile>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

By creating a shell Target file in Eclipse named osgi-base.target, this project will locate its known dependencies (namely, any dependencies listed in it or in parent projects) and glom the paths of any of those OSGi plugins found in your local Maven repository onto it. In Eclipse, you can then open the generated Target file and set it as your active.
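The shell Target file itself can start out nearly empty - something like the following, though I'm going from memory on the exact header attributes, which vary by Eclipse version:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?pde version="3.8"?>
<target name="osgi-base" sequenceNumber="1">
	<locations>
		<!-- pde-target-maven-plugin gloms the resolved Maven dependencies on here -->
	</locations>
</target>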

This... basically works, but it's ugly. Moreover, it limits your Target Platform customization options. If you want to include other Update Sites in your platform (say, the XPages targets generated by the SDK), you would have to modify the base Target file manually, making it fragile for multi-developer use.

Maven-Generated p2 Site

This is the option I'm tinkering with now, and it's similar to the Target-file approach. However, instead of creating an exclusive Target Platform, you can have Maven create a p2 Update Site and then add that directory to your Target Platform manually. That manual step is still unfortunate, but it's not too bad, and it should adapt automatically as more dependencies are added. A Maven plugin named p2-maven-plugin can do a tremendous amount of heavy lifting here: it can track down Maven dependencies, add OSGi metadata if they don't have them already, do the same for their dependencies, and then put them all into a nicely-organized Update Site:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.example</groupId>
	<artifactId>example-osgi-site</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	<packaging>pom</packaging>

	<pluginRepositories>
		<pluginRepository>
			<id>reficio</id>
			<url>http://repo.reficio.org/maven/</url>
		</pluginRepository>
	</pluginRepositories>

	<build>
		<plugins>
			<plugin>
				<groupId>org.reficio</groupId>
				<artifactId>p2-maven-plugin</artifactId>
				<version>1.2.0-SNAPSHOT</version>
				<executions>
					<execution>
						<id>default-cli</id>
						<phase>validate</phase>
						<goals>
							<goal>site</goal>
						</goals>
						<configuration>
							<artifacts>
								<artifact><id>com.google.guava:guava:18.0</id></artifact>
							</artifacts>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>
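To execute it, you should be able to invoke the plugin's site goal directly - if I recall the plugin's defaults correctly, the generated site lands in target/repository:

mvn p2:site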

Once this project is executed, you can then add the generated folder to Eclipse's active Target Platform and be set. Though I haven't put this into practice yet, it may be the best out of a bad bunch of options.

Don't Use Eclipse

Well, I guess this final option may be the best if you're not an Eclipse fan - other IDEs may handle this whole thing much more smoothly. So, if you use IntelliJ and it doesn't have this problem, that's good.


These problems cause a lot more heartburn than you'd think they should, considering that this is basic project setup and not even part of the task of actually developing your project, but such is life. As long as you have a dependency on non-Mavenized OSGi artifacts (such as the XPages runtime) or want to use Tycho's full abilities (such as OSGi-environment unit tests or building full Eclipse-based applications) while also developing in Eclipse, you're stuck with this sort of workaround.

MWLUG 2015 - Maven: An Exhortation and Apology

Aug 16, 2015 11:55 AM

Tags: mwlug maven

At MWLUG this coming week, I'll be giving a presentation on Maven. Specifically, I plan to cover:

  • What Maven is
  • Why Domino developers should know about it
  • Why it's so painful and awkward for Domino developers
  • Why it's still worth using in spite of all the suffering
  • How this will help when working on projects outside of traditional Domino

The session is slated for 3:30 PM on Thursday. I expect it to be cathartic for me and useful for the attendees, so I hope you can make it.

Maven Native Chronicles, Part 3: Improving Native Artifact Handling

Jul 26, 2015 9:38 PM

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

This post isn't so much a part of the current series as it is a followup to a post from the other week, but I can conceptually retcon that one in as a prologue. This will also be a good quick tip for dealing with Maven projects.

In my previous post, I described how I copied the built native shared library from the C++ project into the OSGi fragments for distribution, and I left it with the really hacky approach of copying the file using a project-relative path that reached up into the other project. It technically functioned, but it relied on the specific project structure, which wouldn't survive any reorganization or breaking up of the module tree.

To improve it, I reworked it to be a bit more Maven-y, which involves two steps: attaching the built artifacts to the output of the native project and then using the dependency plugin to copy the native artifacts in as needed. For the first step, I used the build-helper-maven-plugin, though there may be other ways to do it. This is relatively straightforward, though:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>build-helper-maven-plugin</artifactId>
	<version>1.3</version>
	<executions>
		<execution>
			<id>attach-artifacts</id>
			<phase>package</phase>
			<goals>
				<goal>attach-artifact</goal>
			</goals>
			<configuration>
				<artifacts>
					<artifact>
						<file>${project.basedir}/x64/Debug/nativelib-win32-x64.dll</file>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifact>
					<artifact>
						<file>${project.basedir}/Win32/Debug/nativelib-win32-x86.dll</file>
						<type>dll</type>
						<classifier>win32-x86</classifier>
					</artifact>
				</artifacts>
			</configuration>
		</execution>
	</executions>
</plugin>

This causes the native libraries - so far, the two Windows ones - to be included in the Maven repository during installation, and to then be accessible from other projects. The copied files are named with the module's base name plus the classifier appended, with the type as the file extension - like native-project-name-win32-x64.dll.
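Within the local repository itself, the version number is part of each file name as well, following Maven's artifactId-version-classifier.type convention - with a hypothetical group ID of com.example and version 1.0.0-SNAPSHOT, something like:

~/.m2/repository/com/example/native-project-name/1.0.0-SNAPSHOT/
	native-project-name-1.0.0-SNAPSHOT.pom
	native-project-name-1.0.0-SNAPSHOT-win32-x64.dll
	native-project-name-1.0.0-SNAPSHOT-win32-x86.dll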

To copy that artifact into the OSGi bundle project, I then use maven-dependency-plugin to copy it in. Here I reference it via the module name and the classifier/type pair used above (with some shorthands because they're in the same multi-module project):

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<version>2.10</version>
	
	<executions>
		<execution>
			<id>copy-native-lib</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>copy</goal>
			</goals>
			<configuration>
				<artifactItems>
					<artifactItem>
						<groupId>${project.groupId}</groupId>
						<artifactId>native-project-name</artifactId>
						<version>${project.version}</version>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifactItem>
				</artifactItems>
				<outputDirectory>lib</outputDirectory>
				<stripVersion>true</stripVersion>
			</configuration>
		</execution>
	</executions>
</plugin>

The net result here is the same as previously, but should be more maintainable.

Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node

Jul 26, 2015 11:16 AM

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

Before I get to the meat of this post, I want to point out that Ulrich Krause wrote a post on a similar topic today and you should read it.

The build process I've been working with involves a Jenkins server running on OS X (in order to build iOS binaries), and so it will be useful to have a Windows instance set up as well to run native builds and, importantly, tests. Jenkins comes with support for distributed builds and makes it relatively straightforward.

To start with, I installed VirtualBox and went through the usual Windows setup process - it shouldn't matter too much which major version of Windows you use, as long as it's 64-bit, so that it can generate and test both types of binaries. Once that was running, I installed the latest 64-bit JDK followed by Visual Studio Community, which is a pretty smooth process (for all their faults, Microsoft knows how to treat developers). To provide access to the VM from the Mac host, I added a second network adapter to the VM and set it to host-only networking:

During this process, I found Jump Desktop to be a very useful tool. Since the Mac host runs SSH, I was able to set up an RDP connection to the Windows VM using an SSH tunnel, which Jump handles transparently for you. This made for a much better experience than VNCing into the Mac and controlling Windows in the VirtualBox window in there.

Next, I decided that the route I wanted to take to control the Windows slave was SSH, since SSH is the bee's knees. I installed Cygwin, which creates a fairly Unix-like environment on top of Windows, and included OpenSSH in the process. After going through the afore-linked setup process, I had SSH access to the Windows machine (including, thanks to SSH proxying, remote access via the primary build server). On the Jenkins side on the Mac, I installed the "Cygpath plugin" (which is in the built-in plugin manager) to avoid any of the issues mentioned on the wiki page. The configuration in Jenkins is relatively straightforward (I will probably end up changing the base directory to be a clean Jenkins home, since I hadn't initially been sure if I needed Jenkins installed on the slave):

With that, I was able to set the build to run on servers with the "windows" label, kick it off, and start going through its complaints until I had it working.

First off, I had some more Java setup to do, specifically creating a system environment variable named JAVA_HOME and setting it to the root of the JDK ("C:\Program Files\Java\jdk1.8.0_51" in this case). Then, I set up Maven, which is something of an awkward process on Windows, but not TOO bad. I downloaded the latest binaries, unzipped them to "C:\Program Files\maven", added an environment variable of M2_HOME to point to that:

I also added %M2_HOME%\bin;C:\Program Files (x86)\MSBuild\12.0\Bin to the end of the PATH variable, to cover both the Maven tools and the msbuild executable for later.
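For what it's worth, the two home variables can also be set from an elevated command prompt instead of the System dialog - something like this, using the paths from this setup (PATH itself is safer to edit in the dialog, since setx rewrites the whole value):

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_51" /M
setx M2_HOME "C:\Program Files\maven" /M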

I ran into a bit of weirdness when it came to setting up configuration for SSH and Maven, specifically because it seems that Cygwin has two home folders for the logged-in user: the Unix-style /home/jesse and the normal Windows C:\Users\jesse (which is available in Cygwin as /cygdrive/c/Users/jesse). Since this Jenkins build checks out the code from GitHub via SSH, I needed to copy over the id_rsa file for the Jenkins user: this went into /home/jesse/.ssh/id_rsa. In order to configure Maven, though, the settings file went to C:\Users\jesse\.m2\settings.xml.

Eventually, it slogged its way through the build to completion, including a successful run of the integration tests. I still need to figure out the best way to get the resultant artifacts back out (or maybe it will be best to just deploy from both to the same Artifactory server), but this seems to do the main task for me.

Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin

Jul 24, 2015 3:48 PM

  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

As I mentioned the other day, my work lately involves a native shared library that is then included in an OSGi plugin. To get it working during a Maven compile, I just farmed out the actual build process to Visual Studio's command-line project builder. That works as far as it goes, but it's not particularly Maven-y and, more importantly, it's Windows-only.

In looking around, it seems like the most popular method of doing native compilation in Maven, especially with JNI components, is maven-nar-plugin - nar means "Native ARchive", and it's meant to be a consistent way to package native artifacts (executables and libraries) across platforms. It does an admirable job wrangling the normally-loose nature of a C/C++ program to work with Maven-ish standards and attempts to paper over the differences between platforms and toolchains. I'm not entirely convinced that this will be the way I go long-term (in particular, its attitude towards multi-platform/arch builds seems to be "eh, sort of?"), but it's a good place to get started with non-Windows compilation.

The first step was to move the files around to mostly match a Maven-style layout. Starting out, the .cpp and .h files were in the src folder directly, while dependency headers were in a dependencies folder next to it. I left the Notes includes in there for now, but it seems that nar-maven-plugin will cover the JNI stuff for me, so I could simplify that somewhat. The new project structure looks like:

  • (project root)
    • src
      • main
        • c++
        • include
    • dependencies
      • inc
        • notes

Next was to set up the project configuration. For now, I want to still use Visual Studio's CLI app to build the Windows version, and I'm going to have to specifically define supported platforms, so I define the project as a nar, but then disable actual execution of the plugin by default:

<project>
	...
	<packaging>nar</packaging>
	
	<build>
		<plugins>
			<plugin>
				<groupId>com.github.maven-nar</groupId>
				<artifactId>nar-maven-plugin</artifactId>
				<version>3.2.3</version>
				<extensions>true</extensions>
				
				<configuration>
					<skip>true</skip>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

Then, much as I did for the Windows-specific builds, I added a profile to try to build on my Mac. Note that these build settings produce a library that fails all unit tests, so they're surely not correct, but hey, it compiles and links, so that's a start. To ensure that it only builds when it has an appropriate context, it is triggered by a combination of OS family and the presence of the notes-program Maven property, which should point to the Notes executable directory.

<project>
	...
    
	<profiles>
		...
		<profile>
			<id>mac</id>
		
			<activation>
				<os>
					<family>mac</family>
				</os>
				<property>
					<name>notes-program</name>
				</property>
			</activation>
	
			<build>
				<plugins>
					<plugin>
						<groupId>com.github.maven-nar</groupId>
						<artifactId>nar-maven-plugin</artifactId>
						<extensions>true</extensions>
			
						<configuration>
							<skip>false</skip>
				
							<cpp>
								<debug>true</debug>
								<includePaths>
									<includePath>${project.basedir}/src/main/include</includePath>
									<includePath>${project.basedir}/dependencies/inc/notes</includePath>
								</includePaths>
					
								<options>
									<option>-DMAC -DMAC_OSX -DMAC_CARBON -D__CF_USE_FRAMEWORK_INCLUDES__ -DLARGE64_FILES -DHANDLE_IS_32BITS -DTARGET_API_MAC_CARBON -DTARGET_API_MAC_OS8=0 -DPRODUCTION_VERSION -DOVERRIDEDEBUG</option>
								</options>
							</cpp>
							<linker>
								<options>
									<option>-L${notes-program}</option>
								</options>
								<libSet>notes</libSet>
							</linker>
				
							<libraries>
								<library>
									<type>shared</type>
								</library>
							</libraries>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

Unstable though the result may be, the nar plugin does its job: it produces an archive containing the dylib, suitable for distribution as a Maven artifact and extraction into the downstream project, which I'll go into later.

So this is a good step towards my final goal. As I mentioned, I may end up getting rid of nar-maven-plugin specifically, but this is a good way to shape the code into something more portable (I also got rid of a few Windows-isms in the C++ while I was at it). My ultimate goal is to get a single build run that produces artifacts for all of the important platforms (Windows 32/64 and Linux 32/64 for production, Mac 32/64(?) for JUnit tests during development). I may be able to accomplish that using the nar plugin with a distributed Jenkins build, or I may be able to do it with Makefiles and GCC cross-compilers on an OS X build host. If that works, it's the sort of thing that makes all this Maven stuff worthwhile.

Quick-and-Dirty Inclusion of a Visual C++ Project in a Maven Build

Jul 11, 2015 7:26 PM

Tags: maven jni

One of my projects lately makes use of a JNI library distributed via an OSGi plugin. The OSGi side of the project uses the typical Maven+Tycho combination for its building, but the native library was developed using Visual C++. This is workable enough, but ideally I'd like to have the whole thing part of one smooth build: compile the native library, then subsequently copy its resultant shared 32- and 64-bit libraries into the OSGi plugins.

From what I've gathered, the "proper" way to do this sort of setup is to use the nar-maven-plugin, which is intended to wrap around the normal compilers for each platform and handle packaging and access to the libraries and related components. I tinkered with this a bit but ran into a lot of trouble trying to get it to work properly, no doubt due to my extremely-limited knowledge of C++ toolchains combined with the natural weirdness of Windows's development environment.

For now, I decided to do it the "ugly" way that nonetheless gets the job done: just run the Visual C++ toolchain from Maven. Fortunately, Microsoft includes a tool called msbuild for this purpose: if you run it in the directory of a Visual C++ project, it will act like the full IDE. I added its executables to my PATH (C:\Program Files (x86)\MSBuild\12.0\bin) and then used a Maven plugin called exec-maven-plugin to launch it (the Ant plugin would also work, but this is more explicit). Since this will only run on Windows, I wrapped it in a triggered profile and added two executions to cover both 32-bit and 64-bit versions:

<project>
	...
	<packaging>pom</packaging>
	...
	
	<profiles>
		<profile>
			<id>windows-x64</id>
		
			<activation>
				<os>
					<family>windows</family>
					<arch>amd64</arch>
				</os>
			</activation>
			
			<build>
				<plugins>
					<plugin>
						<groupId>org.codehaus.mojo</groupId>
						<artifactId>exec-maven-plugin</artifactId>
						<version>1.4.0</version>
						<executions>
							<execution>
								<id>build-x86</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>Win32</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
							<execution>
								<id>build-x64</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>X64</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

The project itself remains configured in Visual Studio. While the source files are certainly modifiable in Eclipse, it won't have the full C/C++ toolchain environment until I figure out a proper way to do that. But this does indeed do the trick: it creates the two DLLs in the same way as when I had been building them in the IDE.

The next step is to automatically include these in the appropriate OSGi fragment projects. For this, at least for now, I'm using the maven-resources-plugin. This configuration depends on the structure of the Maven projects, which is sort of fragile, but it's not too bad when they're in the same overall project. This is the config for the x64 plugin, and there is a separate x86 project with an almost-identical configuration:

<project>
	...
	<build>
		<plugins>
			...
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-resources-plugin</artifactId>
				<version>2.7</version>
				
				<executions>
					<execution>
						<id>copy-native-lib</id>
						<phase>generate-resources</phase>
						<goals>
							<goal>copy-resources</goal>
						</goals>
						<configuration>
							<resources>
								<resource>
									<directory>${project.basedir}/../../native-project-name/x64/Debug/</directory>
									<includes>
										<include>nativelib-win32-x64.dll</include>
									</includes>
								</resource>
							</resources>
							<outputDirectory>${project.basedir}/lib</outputDirectory>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

The result is that, at least when I build on Windows, everything is properly compiled and put in its right place. When running in my normal Mac dev environment, it uses the built libraries that have previously been copied into the plugin, so it still works well enough.

This is still a far cry from an optimal configuration. The requirement of using Visual Studio is cumbersome, which means that any multi-platform build will mean a redundant config (whether it be in the pom or in a separate Makefile), and this current setup isn't properly "Mavenized": the output doesn't go into the "target" folder and the DLLs aren't tagged for inclusion in the installed Maven repo. It suits the purpose, though, of being an intermediate step in a larger build.
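
If I wanted to fix that last part, the build-helper-maven-plugin's attach-artifact goal would presumably do the trick. This is a minimal sketch, not what I'm actually running, and it assumes the DLL has already been copied into the lib folder as above:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>build-helper-maven-plugin</artifactId>
	<version>1.9.1</version>
	<executions>
		<execution>
			<id>attach-native-libs</id>
			<phase>package</phase>
			<goals>
				<goal>attach-artifact</goal>
			</goals>
			<configuration>
				<artifacts>
					<!-- Attach the copied DLL to the main artifact with a classifier -->
					<artifact>
						<file>${project.basedir}/lib/nativelib-win32-x64.dll</file>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifact>
				</artifacts>
			</configuration>
		</execution>
	</executions>
</plugin>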

My long-term desire is to get this fully cross-platform and automated on a build server. That will involve a lot of learning about the nar-maven-plugin (or Makefiles) as well as either setting up a cross-compilation infrastructure or a series of Jenkins slaves. In theory, an OS X system can have everything it would need to build for the other platforms itself, but I've gathered that the safest way to do it is with the "multiple Jenkins nodes" route. When I develop an improved build system for this, I'll write followup posts.

Building on ODA's Maven-ization

Mar 31, 2015 8:30 PM

Tags: maven oda

Over the weekend, I took a bit of time to apply some of my hard-won recent Maven knowledge to a project I wish I had more time to work with lately: the ODA. The development branches have been Maven-ized for half a year or so, but primarily just to the point of getting the compile to work. Now that I know more about it, I was able to go in and make great strides towards several important goals.

As a preliminary note: don't take my current implementations as gospel. There are parts that will no doubt change; for example, there are some intermittent timing issues currently with the final assembly. But the changes I did make have borne some early fruit.

Source Bundles

Over the releases, it's proven surprisingly fiddly to get parameter names, inline Javadoc, and attached source to work in Designer, leaving some builds no better off than the legacy API in those regards. The apparently-consistent fix for this is the use of "source" plugins: OSGi plugins that go alongside the normal one that just contain the source of each class. Those aren't too bad to generate manually from Eclipse, but the point of Maven is getting away from that sort of manual stuff.

Fortunately, Tycho (the OSGi toolkit for Maven) includes a plugin that allows you to generate these source bundles alongside the normal ones, by including this in the list of plugins executed during the build:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-source-plugin</artifactId>
	<version>${tycho-version}</version>
	<executions>
		<execution>
			<id>plugin-source</id>
			<goals>
				<goal>plugin-source</goal>
			</goals>
		</execution>
	</executions>
</plugin>

Once you have that (which I added to the top-level project, so it cascades down), you can then add the plugins to the OSGi feature with the same name as the base plugin plus ".source". Eclipse will give a warning that the plugins don't exist (since they exist only during a Maven build), but you can ignore that.
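
For example, if the base plugin is org.openntf.domino, the feature.xml entry for the generated source bundle would look something like this sketch (the version is left at "0.0.0" so the build fills in the real one):

<plugin
		id="org.openntf.domino.source"
		download-size="0"
		install-size="0"
		version="0.0.0"
		unpack="false"/>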

Javadoc

Javadoc generation is an area where I suspect I'll make the most changes down the line, but I managed to wrangle it into a spot that mostly works for now.

Not every project in the tree needs Javadoc (for example, we don't necessarily need to include docs for third-party modules), but it's still useful to specify the configuration in a common place. So I took the already-existing basic config in the parent pom and moved it to pluginManagement for the children:

<pluginManagement>
	<plugins>
		<plugin>
			<!-- javadoc configuration -->
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-javadoc-plugin</artifactId>
			<version>2.9</version>
			<configuration>
				<failOnError>false</failOnError>
				<excludePackageNames>com.sun.*:com.ibm.commons.*:com.ibm.sbt.core.*:com.ibm.sbt.plugin.*:com.ibm.sbt.jslibrary.*:com.ibm.sbt.proxy.*:com.ibm.sbt.security.*:*.util.*:com.ibm.sbt.portlet.*:com.ibm.sbt.playground.*:demo.*:acme.*</excludePackageNames>
			</configuration>
		</plugin>
	</plugins>
</pluginManagement>

Then, I added specific plugin references in the applicable child projects:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-javadoc-plugin</artifactId>
	<executions>
		<execution>
			<id>generate-javadoc</id>
			<phase>package</phase>
			<goals>
				<goal>jar</goal>
			</goals>
		</execution>
	</executions>
</plugin>

With those, the build can generate Javadoc appropriate for consumption in the final assembly down the line.

Assembly

The final coordinating piece is referred to as the "assembly". The job of the Maven Assembly Plugin is to take your project components and output - built Jars, source files, documentation, etc. - and assemble them into an appropriate final format, usually a ZIP file.

The route I took is to add a distribution project to the tree whose sole job it is to wait until the other components are done and then assemble the results. The pom for this project primarily consists of telling Maven to run the assembly plugin to create an appropriately-named ZIP file using what's called an "assembly descriptor": an XML file that actually provides the instructions. There are a couple stock descriptors, but for something like this it's useful to write your own. It's quite a file (and also liable to change as I figure out the best practices), but is broken down into a couple logical segments.
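
I won't reproduce the distribution project's pom wholesale here, but its heart is a block along these lines - a sketch, with the descriptor path and execution ID being arbitrary choices for illustration:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-assembly-plugin</artifactId>
	<version>2.5.3</version>
	<configuration>
		<!-- Point to the custom assembly descriptor described below -->
		<descriptors>
			<descriptor>src/assembly/assembly.xml</descriptor>
		</descriptors>
	</configuration>
	<executions>
		<execution>
			<id>make-distribution</id>
			<phase>package</phase>
			<goals>
				<goal>single</goal>
			</goals>
		</execution>
	</executions>
</plugin>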

First off, we have a rule telling it to include all files from the "src/main/resources" folder in the current (assembly) project:

<fileSets>
	<fileSet>
		<directory>src/main/resources</directory>
		<includes>
			<include>**/*</include>
		</includes>
		<outputDirectory>/</outputDirectory>
	</fileSet>
</fileSets>

This folder contains a README description of the result as well as the miscellaneous presentations and demo files the ODA has collected over time.

Next, in addition to the source bundles mentioned earlier, I want to include ZIP files of the important project sources in the distribution, for easy access (technically wasteful, but not by too much):

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino</include>
		<include>org.openntf.domino:org.openntf.domino.xsp</include>
		<include>org.openntf.domino:org.openntf.formula</include>
		<include>org.openntf.domino:org.openntf.junit4xpages</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>src</attachmentClassifier>
		<outputDirectory>/source/</outputDirectory>
		<unpack>false</unpack>
		<outputFileNameMapping>${module.artifactId}.${module.extension}</outputFileNameMapping>
	</binaries>
</moduleSet>

I use the "binaries" tag here instead of "sources" because I want to include the ZIP forms (hence unpack=false) - this is one part that may change, but it works for now.

Next, I gather the Javadocs generated earlier, but these I do want to unpack:

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino</include>
		<include>org.openntf.domino:org.openntf.domino.xsp</include>
		<include>org.openntf.domino:org.openntf.formula</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>javadoc</attachmentClassifier>
		<outputDirectory>/apidocs/${module.artifactId}</outputDirectory>
		<unpack>true</unpack>
	</binaries>
</moduleSet>

This results in an "apidocs" folder containing the Javadoc HTML for each of those three projects in subfolders.

Finally, I want to include the built and ZIP'd Update Site for use in Designer and Domino:

<moduleSet>
	<useAllReactorProjects>true</useAllReactorProjects>
	<includes>
		<include>org.openntf.domino:org.openntf.domino.updatesite</include>
	</includes>
	
	<binaries>
		<attachmentClassifier>assembly</attachmentClassifier>
		<outputDirectory>/</outputDirectory>
		<unpack>false</unpack>
		<includeDependencies>false</includeDependencies>
		<outputFileNameMapping>UpdateSite.zip</outputFileNameMapping>
	</binaries>
	
	<sources>
		<outputDirectory>/</outputDirectory>
		<includeModuleDirectory>false</includeModuleDirectory>
		<includes>
			<include>LICENSE</include>
			<include>NOTICE</include>
		</includes>
	</sources>
</moduleSet>

While grabbing the Update Site, I also copy the all-important LICENSE and NOTICE files from this current project - these may be best moved to the resources folder above.

The result of all this is a nicely-packed ZIP containing everything a user should need to get started with the API.

Next Steps

So, as I mentioned, this work isn't complete, in large part because I'm still learning the ropes. I suspect that the way I'm gathering the sources in the assembly and the way I'm generating and gathering the Javadoc are not quite right - and this shows in the way that slightly-different host configurations (like a Bamboo build server or a multi-threaded build) fail during packaging.

Additionally, it's somewhat wasteful to include the source plugins even for server distributions; I won't really lose sleep over it, but it'd still be ideal to continue the recent policy of providing ExtLib-style distinct Update Sites. I'm not sure if this will require creating multiple feature and update-site projects or if it can be accomplished with build profiles.

Finally, I would love to be able to get rid of the source-form third-party dependencies like Guava and Javolution. One of the main benefits of Maven is that you can very easily consume dependencies by listing them in the config, but Tycho and Eclipse throw a wrench into that: when you configure a project to use Tycho, Eclipse stops referencing the Maven dependencies. Moreover, even though I believe all of the dependencies we use contain OSGi metadata, which would satisfy a Tycho command-line build, both Eclipse and the requirement that we build an old-style (non-p2) Update Site prevent us from doing that simply. It's possible that the best route will be to have Maven download and copy in the Jar files of the dependencies, but even that has its own suite of issues.
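
If I do go that route, it would presumably be a job for the maven-dependency-plugin - a rough sketch, with the artifact IDs and output path here purely illustrative:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<version>2.8</version>
	<executions>
		<execution>
			<id>copy-third-party</id>
			<phase>generate-resources</phase>
			<goals>
				<goal>copy-dependencies</goal>
			</goals>
			<configuration>
				<!-- Copy just the listed third-party Jars into the plugin's lib folder -->
				<includeArtifactIds>guava,javolution</includeArtifactIds>
				<outputDirectory>${project.basedir}/lib</outputDirectory>
			</configuration>
		</execution>
	</executions>
</plugin>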

But, in any event, it's satisfying seeing this come together - and nice for me personally to build on the work Nathan, Paul, and Roland-and-co. have been doing lately. Maven is a monster and still suffers from severe "how the F does this stuff work?" problems, but it does feel good to put it to work.

Auto-OSGi-ifying Maven Projects

Mar 28, 2015 4:15 PM

Tags: maven

In my last post, I discussed some of the problems that you run into when dealing with Maven projects that should also be OSGi plugins, particularly when you're mixing OSGi and non-OSGi projects in the same Maven build (in short: don't do that). Since then, things have smoothed out, particularly when I split the OSGi portion out into another Maven build, allowing it to consume the "core" artifacts cleanly, without the timing issue described previously.

But I ran into another layer of the task: consuming the Maven artifacts as plain Jars is all well and good, but the ideal would be to also have them available as a suite of OSGi plugins, so they can be managed and debugged more easily in an OSGi environment like Eclipse or Domino. Fortunately, this process, while still fairly opaque, is smoother than the earlier task.

A note on terminology: the term "plugin" can refer to both the OSGi component as well as the tools added into a Maven build. The term "bundle" aptly describes the OSGi plugins as well, but I'm used to "plugin", so that's what I use here. It's probably the case that an OSGi plugin is a specialized type of bundle, but whatever.

Preparing the Plugins

The route I'm taking, at least currently, is to tell the root Maven project that all of its Jar-producing children should also have a META-INF/MANIFEST.MF file packaged along to allow for OSGi use, and moreover to automatically generate that manifest using the maven-bundle-plugin. The applicable code in the parent pom.xml looks like this:

<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <version>2.1.0</version>
                <configuration>
                    <manifestLocation>META-INF</manifestLocation>
                    <instructions>
                        <Bundle-RequiredExecutionEnvironment>JavaSE-1.6</Bundle-RequiredExecutionEnvironment>
                        <Import-Package></Import-Package>
                    </instructions>
                </configuration>

                <executions>
                    <execution>
                        <id>bundle-manifest</id>
                        <phase>process-classes</phase>
                        <goals>
                            <goal>manifest</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.3.1</version>
                <configuration>
                    <archive>
                        <manifestFile>META-INF/MANIFEST.MF</manifestFile>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
</build>

In order to actually generate the manifest files, I included a block like this in each child project that produces a Jar:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.felix</groupId>
            <artifactId>maven-bundle-plugin</artifactId>

            <configuration>
                <instructions>
                    <Bundle-SymbolicName>com.somecompany.someplugin</Bundle-SymbolicName>
                </instructions>
            </configuration>
        </plugin>
    </plugins>
</build>

The Bundle-SymbolicName bit is there to translate the project's Maven artifact ID (which would be like "foo-someplugin") into a nicer OSGi version. There are other ways to do this, including just letting it use the default, but it made sense to write them manually here.

Once you do that and then run a Maven package, each Jar project in the tree should get an auto-generated MANIFEST.MF file that exports all of the project's Java classes and specifies a Java 6 runtime and no imported packages. There are many tweaks you can make here - any of the normal MANIFEST entries can be specified in the <instructions/> block, so you could add imported packages, required bundles, or other metadata at will.
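
For instance, here's a hedged sketch of a fuller <instructions/> block - the package and bundle names are hypothetical:

<instructions>
    <Bundle-SymbolicName>com.somecompany.someplugin</Bundle-SymbolicName>
    <!-- Export only the API packages rather than everything -->
    <Export-Package>com.somecompany.someplugin.api.*</Export-Package>
    <!-- Headers the plugin doesn't process specially are passed through to the manifest -->
    <Require-Bundle>com.ibm.commons</Require-Bundle>
</instructions>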

If you install these projects into your local repository, then downstream OSGi projects using Tycho can find the dependencies when you include them in the pom.xml by Maven artifact ID and in the downstream MANIFEST.MF by OSGi bundle name. There's one remaining hitch (at least): though Maven will be fine with that resolution, Eclipse doesn't pick up on them. To do that, it seems that the best route is to create a p2 repository housing the plugins, which would also be useful for other needs.
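
To make that concrete with the hypothetical plugin from above (the group ID and version are made up for the example), the downstream project would reference it twice - once by Maven coordinates in its pom.xml:

<dependency>
    <groupId>com.somecompany</groupId>
    <artifactId>foo-someplugin</artifactId>
    <version>1.0.0</version>
</dependency>

...and once by bundle name in its MANIFEST.MF:

Require-Bundle: com.somecompany.someplugin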

Creating an Update Site

Fortunately, there is actually an excellent example of this on GitHub. By following those directions, you can create a project where you list the plugins you want to include as dependencies in the pom.xml, and it will properly package them into a p2 site containing all the plugins with their OSGi-friendly names and nice site metadata.

As a Domino-specific aside, a "p2 Update Site" is somewhat distinct from the Update Sites we've gotten used to dealing with - namely, it's a newer format that is presumably unsupported by Notes and Domino's outdated infrastructure. You can tell the difference because the "old" ones contain a site.xml file while the p2 format contains content.jar and artifacts.jar (those may be .xml instead). It's just another one of those things for us to deal with.

In any event, the instructions on GitHub do what they say on the tin, but I wanted a bit more automation: I wanted to automatically include all of the plugins built in the project without specifying each one as a dependency. To do this, I replaced Step 2 in the example (the use of maven-dependency-plugin) with the maven-assembly-plugin, which is a generic tool for pulling together the results of a build in some useful format. The replaced plugin block looks like this:

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-assembly-plugin</artifactId>
	<version>2.5.3</version>
	<configuration>
		<descriptors>
			<descriptor>src/assembly/plugins.xml</descriptor>
		</descriptors>
		<outputDirectory>${project.basedir}/target/source</outputDirectory>
		<finalName>plugins</finalName>
		<appendAssemblyId>false</appendAssemblyId>
	</configuration>
	<executions>
		<execution>
			<id>make-assembly</id>
			<!-- Bump this up to earlier than package so that the plugins below see the results -->
			<phase>process-resources</phase>
			<goals>
				<goal>single</goal>
			</goals>
		</execution>
	</executions>
</plugin>

This block tells the assembly plugin to look for an assembly descriptor file (which is yet another specialized XML file format, naturally) named "plugins.xml" and execute its instructions during the phase where it's processing resources, coming in just before the later plugins.

In turn, the assembly descriptor looks like this:

<assembly
	xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2 http://maven.apache.org/xsd/assembly-1.1.2.xsd">
	<id>plugins</id>
	<formats>
		<format>dir</format>
	</formats>
	<includeBaseDirectory>false</includeBaseDirectory>
	<moduleSets>
		<moduleSet>
			<useAllReactorProjects>true</useAllReactorProjects>
			<includes>
				<include>*:*:jar:*</include>
			</includes>
			<binaries>
				<outputDirectory>/</outputDirectory>
				<unpack>false</unpack>
				<includeDependencies>true</includeDependencies>
			</binaries>
		</moduleSet>
	</moduleSets>
</assembly>

What this says is to include all of the modules (Maven artifacts) being processed in the current build that are packaged as Jars and copy them into the designated directory, where they will be picked up by the Tycho plugins down the line.

The result of this Rube Goldberg machine is that all of the applicable plugins in the current build (and their dependencies) are automatically gathered for inclusion in the update site, without having to maintain a specific list.

Missing Pieces

This process accomplishes a great deal automatically, alleviating the need to maintain MANIFEST.MF files or a repository configuration, but it doesn't cover quite everything that might be needed. For one, there's no feature project; the update site is just a bunch of plugins without features to go along with them. Honestly, I don't know if those are even required for most uses - Eclipse seems capable of consuming the site as-is. Secondly, though, the result isn't suitable for use in an old-style environment, so this isn't something you would go plugging into Designer. For that, you'd want a secondary project that wraps the plugins into a feature in an old-style update site, which would have to be done in a second Maven build. Regardless, this seems to get you most of the way, and saves a ton of hassle.

Tycho and Tribulations

Mar 14, 2015 3:02 PM

Tags: maven

For the last few weeks, a large part of my work has involved dealing with Maven, and much of that also involves Eclipse. It's been quite the learning experience - very frustrating much of the time, but valuable overall. In particular, any time OSGi comes into play, things get very complicated and arcane, in non-obvious ways. Fair warning: this blog post will likely start out as an even-keeled description of the task at hand and descend into just ranting about Maven.

The Actors

To start out, it's important to know that this sort of development involves three warring factions, each overlapping and having distinct views of the world. In theory, there are plugins that sort out all the differences, but this doesn't play out very smoothly in reality. Our players are:

  • Maven. This is the source of our trouble, but overall worth it. Maven is a build system for Java (and other) projects that brings with it great powers to do with dependency management, project organization, packaging, distribution, and any number of other things. I've been increasingly dealing with it, initially as an observer while the ODA team descended into madness to convert that project, and then Maven-izing my own framework. Its view of the world is of "artifacts" - conceptual units like junit, poi, or other dependencies, plus your own project components - organized in a tree of modules and available via repositories. Its build process is a multi-stage lifecycle with hooks for plugins at each step of the way.
  • Eclipse. Eclipse-the-IDE has its own view of how a Java project should be organized and built, and it doesn't involve Maven. There is a plugin for Eclipse and Maven, m2eclipse, that is meant to patch over these differences, but it can only go so far - while it helps Eclipse know a bit about Maven dependencies and its plugins, it's very dodgy and often involves trying to sync the Eclipse build configuration to be an imperfect representation of the Maven config.
  • OSGi. OSGi is a packaging, dependency, and runtime model, and is spoken natively by Eclipse (and Domino). However, it butts heads with Maven: they both cover the "packaging and dependencies" ground, and this creates a mess. Again, there are plugins to help bridge the gap, but these bring another layer of complexity and brittleness to the process.

Maven, Eclipse, and OSGi go together like oil, water, and a substance that dissolves in water but reacts explosively with oil.

OSGi's Plugins

The interaction with OSGi deserves a bit of further explanation. Unlike the Maven/Eclipse bridge, where there's basically one tool to work with, imperfect as it is, dealing with Maven+OSGi has two distinct plugins, which may or may not be required for your needs:

  • Tycho. This is the big one, intended to give Maven a thorough understanding of OSGi's view of the world, parsing the MANIFEST.MF files and hunting down dependencies using both an OSGi environment and Maven's normal scheme (if you tell it to correctly). If you're writing a full-on Eclipse/OSGi plugin/feature set (like ODA or my Framework), Tycho will be involved.
  • The Maven Bundle Plugin. This confusingly-named plugin is specifically referring to "bundle" in the OSGi sense, which is the elemental form of an OSGi plugin (the terminology begins to really overlap here). Its role is to take a non-OSGi project - say, a "normal" Maven project you or a third party wrote - and generate a MANIFEST.MF for you, allowing you to create an OSGi-friendly project usable as a dependency elsewhere.

These two projects, though often both required, are not related, and are crucially incompatible in one major way: Tycho's dependency resolution runs before the Bundle Plugin can do its job. So you can't, for example, have a Bundle project that generates an OSGi-friendly plugin and then depend on it in a "real" OSGi context inside the same Maven build. As far as I can tell, the "fix" is to separate these out into separate Maven projects. So, if you want to consume Maven projects and convert them into OSGi plugins without also manually managing plugin stuff and dependency copying, you have to make it a two-step process. The reason for this is that computers suck.

Eclipse and Maven

Throughout this sort of development, there's a constant gremlin on your back: the distinct worlds of Eclipse and Maven. Many changes to the pom.xml (Maven's project descriptor file) will prompt Eclipse to tell you that its project config is out of date and that you must click a menu item to sync it, which it refuses to do itself for some reason. Additionally, you will frequently run into a case where you'll paste in a block of Maven XML from somewhere and it will be legal for Maven, but Eclipse will complain about not having lifecycle support for it. If you're lucky, you can click the "quick fix" to download an adapter automatically, or failing that tell it to ignore that part. Other times, it'll give you some cryptic error about packaging or the like and offer no solution. The "fix" at that point is often to stop trying to do what you want to do.

Because of these and other conditions, it's fairly easy to get into a situation where the project will compile in Eclipse but not in Maven or vice-versa. Sometimes, this isn't too bad to fix, such as when you just need to add a dependency to a given project in Maven. Other times, things will get more arcane, requiring seeking out more blocks of Maven XML (this is a common task) to either let Maven or Eclipse know about the other, or to at least tell Eclipse to not bother trying to process part of the Maven project. This process is most similar to an adventure game, trying different combinations of plugins and pasted XML until it works or you quit and try a different career path.

Documentation

Capping these problems off is the peculiar nature of documentation for all this. From my experience, it comes in a couple forms:

  • Official documentation that is either a very basic getting-started tutorial or assumes you have a complete understanding of Maven's conceits and idioms to read what they're talking about.
  • Individual plugin pages with varying levels of thoroughness, and usually no mention of interaction with other components.
  • Blog posts and Stack Overflow questions from 3-5 years ago, half of which amount to "X doesn't work", and most of the rest of which contain blocks of XML to try pasting into your pom without much explanation.

After working with Maven long enough, you start developing a vague, disjointed understanding of how it works - how the "plugins" inside "build" differ from those inside "pluginManagement", for example - but it's slow going. It seems to be the sort of thing where you just have to pay your dues.

Conclusion For Now

Things are very gradually coming together, and the benefits of Maven are paying off as I start avoiding the pitfalls and implementing things like Jenkins. Once I properly sort out the projects I'm working on, I'll post more about what I learn to be the right ways to accomplish these goals, but for now my assessment remains "Maven is a huge PITA, but overall probably worth it".

Figuring Out Maven: Group/Artifact Names and Repositories

Dec 8, 2014 4:34 PM

Tags: maven

As I fiddle with Maven, I figure it may be useful to share my growing understanding of it - or at least preliminary assumptions. Any of these posts should not be taken as a true guide to learning Maven, since I'm just muddling through myself, but I suspect that my path will be similar to a lot of other Domino developers'.

The first thing I feel I grokked about Maven is its concept of repositories, mostly because it's the easiest concept I've run across. Repositories in Maven seem to match up nicely to their analogues in other environments, such as Eclipse Update Sites or Debian/Ubuntu apt repositories. There's the default "Maven Central" repository, which is similar to the main apt repositories: it contains a very large collection of software projects, available by group+artifact name. This is what you see on the pages for popular software projects: they mention the group/artifact pair and that's enough to use it.
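
For example, pulling in Guava is just a matter of listing that pair in your pom's dependencies (the version here is simply the current one as I write):

<dependency>
	<groupId>com.google.guava</groupId>
	<artifactId>guava</artifactId>
	<version>18.0</version>
</dependency>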

For projects that aren't in Central, it's similar to adding a repo to Debian or an Update Site to Eclipse. You add some repository information to your project's pom or your user environment's settings.xml and then refer to the artifact much as you would a Central one; Hibernate OGM is one such example.
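
The pom side of that looks something like this - the ID and URL here are placeholders:

<repositories>
	<repository>
		<id>some-repo</id>
		<url>https://repository.example.com/maven2/</url>
	</repository>
</repositories>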

In addition to remote repositories, there is also your local repository, stored in ~/.m2/repository. This contains any Maven projects that you've built and installed locally, which are then available to other Maven projects. This is how I handled my dependencies on the ExtLib and ODA: I ran Maven installs for each to add them to my local repository.

You can also download and store repositories of pre-built plugins locally, and the IBM Domino Update Site for Build Management is an example of this. The way to use this is to extract the ZIP file and then point to the updateSite directory in the same way that you would a remote repository, albeit with a file:// URL (in this case, ideally stored in a Maven environment variable).
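
Assuming a notes-platform variable pointing to that extracted directory, the reference would run along these lines (the p2 layout applies when building OSGi projects with Tycho):

<repositories>
	<repository>
		<id>notes</id>
		<layout>p2</layout>
		<url>${notes-platform}</url>
	</repository>
</repositories>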

The final aspect of this is the way bits of software are designated within a repository: by "group ID" and "artifact ID". The group ID seems like it should be globally unique, and tends to follow the reverse-DNS convention of Java package names. So a group ID might be something like "com.google.guava" or "com.ibm.xsp.extlib". These don't have a specific analogue with OSGi development, but are effectively similar to the naming scheme for update site projects (even though Maven groups may contain OSGi update sites). Within a repository, individual projects, called "artifacts", are identified in a way that just needs to be unique in the repository, and it looks like conventions differ here. Sometimes, the artifacts have simple base names, like "guava" or "el", while other times they have OSGi-style full reverse-DNS names. I gather that the convention falls along OSGi lines: for generic projects, short names rule the day, while for OSGi-plugin projects, the name matches the plugin ID.

So... that's the easiest part! I'm slowly getting more of a grasp of other aspects of Maven, but at least repositories seem to make sense so far.

How I Maven-ized My Framework

Dec 8, 2014 10:31 AM

Tags: maven miasma

This past weekend, I decided to take a crack at Maven-izing the frostillic.us Framework (I should really update the README on there). If you're not familiar with it, Maven is a build system for Java projects, basically an alternative to the standard Eclipse way of doing things that we've all gotten pretty used to. Though I'm not in a position to be a strong advocate of it, I know that it has advantages in dependency-resolution and popularity (most Java projects seem to include a "you can get this from Maven Central" bit in their installation instructions), helps with the sort of continuous-integration stuff I think we're looking to do at OpenNTF, and has something of a "wave of the future" vibe to it, at least for our community: IBM's open-source releases have all been Maven-ized.

A month or so ago, Nathan went through something of a trial by fire Maven-izing the OpenNTF Domino API (present in the dev branches). Converting an existing project can be something of a bear, scaling exponentially with the complexity of the original project. Fortunately, thanks to his (and others', like Roland's) work, the ODA is nicely converted and was useful as a template for me.

In my case, the Framework is a much-simpler project: a single plugin, a feature, and an update site. It was almost a textbook example of how to Maven-ize an OSGi plugin, except for three dependencies: on the ODA, on the Extension Library, and, as with both of those, on the underlying Domino/XPages plugins. Fortunately, my laziness on the matter paid off, since not only is the ODA Maven-ized, but IBM has put their Maven-ized ExtLib right on GitHub and, better still, released a packaged Maven repository of the required XSP framework components. So everything was in place to make my journey smooth. It was, however, not smooth, and I have a set of hastily-scrawled notes that I will translate into a recounting of the hurdles I faced.

Preparing for the Journey

First off, if you're going to Maven-ize a project, you'll need a few things. If it's an XPages project, you'll likely need the above-linked IBM Domino Update Site. This should go, basically, "somewhere on your drive". IBM seems to have adopted the convention internally of putting it in C:\updateSite. However, since I use a good computer, I have no C drive and this didn't apply to me - instead, I adopted a strategy seen in projects like this one, where the path is defined in a variable. This is a good introduction to a core concept with Maven: it's basically a parallel universe to Eclipse. This nature takes many forms, ranging from its off-and-on interaction with the workspace to its naming scheme for everything; Eclipse's built-in Maven tools are a particularly-thin wrapper around the command-line environment. But for now the thing to know is that this environment variable is not an Eclipse variable; it comes from Maven's settings.xml, which is housed at ~/.m2/settings.xml. It doesn't exist by default, so I made a new one:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          http://maven.apache.org/xsd/settings-1.0.0.xsd">

    <profiles>
        <profile>
            <id>main</id>
            <properties>
                <notes-platform>file:///Users/jesse/Documents/Java/IBM/UpdateSite</notes-platform>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>main</activeProfile>
    </activeProfiles>
</settings>

I'm not sure that that's the best way to do it, but it works. The gist of it is that you can fill in the properties block with arbitrarily-named environment variables.

Secondly, you'll need a decent tutorial. I found this one and its followups to do well. Not everything fit (I didn't treat the update site the same way), but it was a good starting point. In particular, you'll need Tycho, which is explained there. Tycho is a plugin to Maven that gives it some knowledge of Eclipse/OSGi plugin development.

Third, you'll need some examples. Now that my Framework is converted, you can use that, and the projects linked above are even better (albeit more complex). There were plenty of times where my troubleshooting just involved looking at my stuff and figuring out where it was different from the others.

Finally, if your experience ends up anything like mine, you'll want something like this.

Prepping Dependencies

Since my project depended on the ExtLib and ODA, I had to get those in the local repository first. As I found, it's not enough to merely have the projects built in your workspace, as it is when doing non-Maven OSGi development - they have to be "installed" in your local repository (~/.m2/repository). Though the Extension Library is larger, it's slightly easier to do. I cloned the ExtLib repository (technically, I cloned my fork of it) and imported the projects into the Eclipse workspace using Import → Maven → Existing Maven Projects. By pointing that to the repository root, I got a nice Maven tree of the projects and imported them all into a new working set. Maven, like many things, likes to use a tree structure for its projects; this allows it to know about module dependencies and provides inheritance of configuration (there's a LOT of configuration, so this helps). Unfortunately, Eclipse doesn't represent this hierarchy in the Project Explorer; though you can see the other projects inside the container projects, they also appear on their own, so you get this weird sort of doubled-up effect and you just have to know what the top-level project you want is. In this case, it's named well: com.ibm.xsp.extlib.parent.

So once you've found that in the sea of other projects (incidentally, this is why I like to click on the little triangle on top of the Project Explorer view and set Top Level Elements to Working Sets), there's one change to make, unless you happened to put the Update Site from earlier at C:\updateSite. If you didn't, open up the pom.xml file (that's the main Maven config file for each project) and change the url on line 28 to <url>${notes-platform}</url>. After that, you can right-click the project and go to Run As → Maven Install. If it prompts you with some stuff, do what the tutorial above does ("install verify" or something). This is an aspect of the thin wrapper: though you're really building, the Maven tasks take the form of Run Configurations. You just have to get used to it.

Once you do that, maybe it'll work! You'll get a console window that pops up and runs through a slew of fetching and building tasks. If all goes well, there'll be a cheery "BUILD SUCCESS" near the bottom. If not, it'll be time for troubleshooting. The first step for any Maven troubleshooting is to right-click the project and go to Maven → Update Project, check all applicable projects, and let it do its thing. You'll be doing that a lot - it's your main go-to "this is weird" troubleshooting step, like Project → Clean for a misbehaving XPage app. If the build still fails, it's likely a problem with your Update Site location. Um, good luck.

Next up comes the ODA, if you're using that. As before, it's best to clone the repository from GitHub (using one of the dev branches, like Nathan's or mine) and import the Maven projects. There's good news and bad news compared to the ExtLib. The good news is that it already uses ${notes-platform} for the repository location, so you're set there. The bad news is that trying to install from the main domino parent project doesn't work - it fails on the update site for some reason. So instead, I had to install each part in turn. In particular, you'll need "externals" (covers a lot of dependencies), "org.openntf.junit4xpages", "org.openntf.formula", and "org.openntf.domino".

Converting the Projects

Okay! So, now we can actually start! For the plugin project, the first page of the tutorial works word-for-word. One thing to note is that the "eclipse-plugin" option isn't actually in the Packaging drop-down; you just have to type it in. Again: thin wrapper. It may not work immediately after following the directions, but the divergences are generally due to the non-standard Domino-related dependencies. In particular, I ran into trouble with forbidden-access rules in Notes.jar - Maven, being a separate world, ignores your Eclipse preferences on the matter. To get around that, I added the parts in the plugin block of this pom.xml - among other things, they tell the compiler to ignore such problems. I still ran into trouble with lotus.domino.local.NotesBase specifically after the other classes started working, and I "solved" that by deleting the code (it was related to recycle checking, which I no longer need).
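
For reference, the forbidden-access part amounts to a Tycho compiler configuration along these lines (a sketch - the linked pom has the real thing):

<plugin>
    <groupId>org.eclipse.tycho</groupId>
    <artifactId>tycho-compiler-plugin</artifactId>
    <version>${tycho-version}</version>
    <configuration>
        <!-- Downgrade JDT's "forbidden reference" errors, e.g. for Notes.jar internals -->
        <compilerArgument>-err:-forbidden</compilerArgument>
    </configuration>
</plugin>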

It may also be useful to change build.properties so that the output.. = bin/ line reads output.. = target/classes. I don't know if this is actually used, but it was a troubleshooting step I took elsewhere and it makes conceptual sense: Maven puts its output classes in target/classes, not bin.

During this process, I quickly realized the value of having a parent project. I hit a hitch in that I wanted to call the parent frostillicus.framework, which meant renaming the plugin to frostillicus.framework.plugin and dealing with the associated updating of Eclipse and git, but that was an unforced error. The normal layout of parent projects seems to be that they're parents both conceptually by pom.xml and also physically by folder structure. I haven't done the latter yet, and the process works just as well if you don't. Still, I should move it eventually. So, following the third part of the tutorial, I created a near-empty project (no Java code) just to house the pom.xml with common settings and told it to adopt the plugin as a child. Converting the feature project was the easiest step and went exactly as described in the tutorial.

Where I diverged from both the tutorial and ODA is in the Update Site. The tutorial suggests renaming site.xml to category.xml and using the Maven type eclipse-repository, but none of the examples I used did that. Instead, I followed those projects and left site.xml as-is (other than making sure that the versions in the XML source use ".qualifier" instead of any timestamp from building) and used the Maven type eclipse-update-site in the pom.xml.

I then spent about two hours pulling my hair out over bizarre problems I had wherein the update site would build but not actually include the compiled classes in the plugin jar if I clicked on "Build" in the site.xml editor and would fail with bizarre error messages if I did Run As → Maven Install. I'll spare you the tribulations and cut to the chase: my problem was that I had the modules in the parent project's pom.xml out of order, with the update site coming before the feature project. When I fixed that, I was able to start building the site the "Maven way". Which is to say: not using the site.xml's Build button (which still had the same problem for me), but using Run As → Maven Install. This ends up putting the built update site inside the target/site directory rather than directly in plugins and features folders. This is a case of "okay, sure" again.
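
In other words, the parent's module list has to put the feature before the update site - something like this, with the feature and update-site project names being approximations:

<modules>
    <module>frostillicus.framework.plugin</module>
    <module>frostillicus.framework.feature</module>
    <module>frostillicus.framework.updatesite</module>
</modules>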

Conclusion

So, after a tremendous amount of suffering and bafflement, I have a converted project! So what does it buy me? Not much, currently, but it feels good, and I had to learn this stuff eventually one way or another. Over the process, some aspects of Maven started to crystallize in my mind - the repositories, the dependencies, the module trees - and that helps me understand why other Maven-ized projects look the way they do. Other aspects are still beyond my ken (like most of the terminology), but it's a step in the process. This should also mean I'm closer to ready for future build processes and am more in line with the larger Java world.

If you have a similar project, I'd say it's not required that you make the switch, but if you're planning on working on larger projects that use Maven, it'd be a good idea. Maven takes a lot of getting used to, since everything feels like it's a from-scratch rethinking of the way to structure Java projects with no regard to the structure or terminology of "normal" Eclipse/OSGi development, and something like this conversion is as good a start down the path as any.