I Hadn't Thought I'd End The Year Getting Into WebSphere

Wed Dec 31 15:47:01 EST 2014

Tags: oda websphere huh

...yet here we are. It's been a very interesting week in ODA circles. It started with Daniele Vistalli being curious if it was possible to run the OpenNTF Domino API on WebSphere, specifically the surprisingly-friendly Liberty profile. And not just curious: it turned out he had the chops to make it happen. Before too long, with the help of the ODA team, he wrote an extension to the WebSphere server that spins up native (lotus.domino.local) Notes RPC connections using Notes.jar and thus provides a functioning entrypoint to using ODA.

The conversation and possibilities spun out from there. It's too early to talk about too many specifics, but it's very exciting. After the initial bulk of the work, I was able to pitch in a bit yesterday and help figure out how to make this happen:

CrossWorlds on OS X

That's a JSF app being built and deployed to a WebSphere Liberty server managed in Eclipse on the local machine, iterating over the contents of a DbDirectory pointed at a remote server. And it all happens to be running natively on the Mac - no VM involved.

So, like I said: it's been an interesting week. There are more possibilities than just this case, but even this alone holds tremendous promise: connecting to NSF data with the true, full API (brought into the modern day by ODA) but with the full array of J2EE support available.

Happy New Year!

Musing About Web App Structure

Tue Dec 16 18:25:35 EST 2014

Tags: angular rest

I've been giving a lot of thought lately to how I feel about different structures for web apps. Specifically, in this case, the "structure" I'm thinking about is the "client-server" balance in the app itself and the associated method of data access.

The impetus has been my minor fiddling with Angular over the past half a year or so. If you're not familiar with it, Angular is a client-side JavaScript framework focused on building full-fledged apps that run in the browser. The way it and frameworks like it distinguish themselves from more server-heavy methods of development is that the primary logic of running the UI happens in client-side code - resources and data are fetched from the server (including HTML snippets), but little or no presentation logic takes place there. You should go read Marky Roden's posts on the topic and then attend his session at ConnectED.

For a while, I thought of this as an interesting idea, but mostly just another method of writing apps, like PHP vs. Ruby vs. XPages. However, the idea kept churning away in the back of my brain, offering itself up as the right answer as I started writing the REST APIs in my Framework (I swear I'll get to that final post in the series eventually) and thought more about mobile development. The thing that's really latched onto my psyche isn't that Angular or JavaScript are particularly compelling, but the idea of writing a decoupled app is. Writing an app in Angular isn't really like writing an app in XPages or PHP - instead, it's (at a very high level) like writing a native mobile or desktop application. You split your app into two main tiers: the model/data/connection tier on the server and the app tier on the client.

That is something that is starting to seem really right to me. Even if it did nothing else, having an architecture like that forces you to think in terms of "services" instead of just data access - once you've written a model layer that provides REST services with access and validation rules enforced, then you have a single interface that can be used by browser-based Angular apps, mobile and desktop apps, and remote applications and services you didn't write. There's also no reason you couldn't write a server-side app that consumes those services, continuing to build XPages apps that don't use the normal data sources. This is sort of what my framework has evolved towards, just with a first-among-equals twist where the XPage app gets direct Java access to the same objects REST serves up.

Once you have this sort of setup, the answers to what your data-access and client UI frameworks should be both become "whatever". Want to write your REST API in Node and consume it in XPages? Sure, go ahead. Decide later that you'd rather have the data served up by Rails? Can do - the XPages side wouldn't need a change if the API is the same. Similarly, if you want to swap out XPages for Angular served up as static files or, god forbid, a PHP app, the way is smoothed.

Even though it starts as a small thing - switching from accessing data "directly" on the server to always thinking about that REST-API abstraction - this really seems compelling to me. Not so compelling that I've actually written any non-test apps this way yet, but I've opened the door for myself with my framework.

Figuring Out Maven: Group/Artifact Names and Repositories

Mon Dec 08 16:34:11 EST 2014

Tags: maven

As I fiddle with Maven, I figure it may be useful to share my growing understanding of it - or at least my preliminary assumptions. None of these posts should be taken as a true guide to learning Maven, since I'm just muddling through myself, but I suspect that my path will be similar to a lot of other Domino developers'.

The first thing I feel I grokked about Maven is its concept of repositories, mostly because it's the easiest concept I've run across. Repositories in Maven seem to match up nicely to their analogues in other environments, such as Eclipse Update Sites or Debian/Ubuntu apt repositories. There's the default "Maven Central" repository, which is similar to the main apt repositories: it contains a very large collection of software projects, available by group+artifact name. This is what you see on the pages for popular software projects: they mention the group/artifact pair and that's enough to use it.
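For example (the version number here is just a placeholder for whatever is current), pulling a library like Guava out of Central amounts to adding a block along these lines to your project's pom.xml:

<dependencies>
    <!-- Resolved from Maven Central by group ID + artifact ID + version -->
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
</dependencies>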

For projects that aren't in Central, it's similar to adding a repo to Debian or an Update Site to Eclipse. You add some repository information to your project or your user environment's settings.xml and then refer to the artifact much as you would with Central ones; Hibernate OGM is one such project.
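As a sketch (the ID and URL here are made up - you'd use whatever the project's documentation provides), the extra repository entry in a pom.xml looks something like this:

<repositories>
    <!-- Placeholder values: substitute the repository the project actually publishes to -->
    <repository>
        <id>some-other-repo</id>
        <url>https://repo.example.com/maven2/</url>
    </repository>
</repositories>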

In addition to remote repositories, there is also your local repository, stored in ~/.m2/repository. This contains any Maven projects you've built and installed locally, which are then available to other Maven projects as dependencies. This is how I handled my dependencies on the ExtLib and ODA: I ran Maven installs for each to add them to my local repository.

You can also download and store repositories of pre-built plugins locally, and the IBM Domino Update Site for Build Management is an example of this. The way to use this is to extract the ZIP file and then point to the updateSite directory in the same way that you would a remote repository, albeit with a file:// URL (in this case, ideally stored in a Maven environment variable).
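As a rough sketch, assuming you've defined a notes-platform property pointing at the extracted updateSite directory via a file:// URL (as in the settings.xml from my Maven-izing post), the repository entry looks something like this:

<repositories>
    <!-- For Tycho-based OSGi builds, the layout for update sites is "p2" -->
    <repository>
        <id>notes</id>
        <layout>p2</layout>
        <url>${notes-platform}</url>
    </repository>
</repositories>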

The final aspect of this is the way bits of software are designated within a repository: by "group ID" and "artifact ID". The group ID seems like it should be globally unique, and tends to follow the reverse-DNS convention of Java package names. So a group ID might be something like "com.google.guava" or "com.ibm.xsp.extlib". These don't have a specific analogue with OSGi development, but are effectively similar to the naming scheme for update site projects (even though Maven groups may contain OSGi update sites). Within a group, individual projects, called "artifacts", have IDs that only need to be unique within that group, and it looks like conventions differ here. Sometimes, the artifacts have simple base names, like "guava" or "el", while other times they have OSGi-style full reverse-DNS names. I gather that the convention falls along OSGi lines: for generic projects, short names rule the day, while for OSGi-plugin projects, the name matches the plugin ID.

So... that's the easiest part! I'm slowly getting more of a grasp of other aspects of Maven, but at least repositories seem to make sense so far.

How I Maven-ized My Framework

Mon Dec 08 10:31:14 EST 2014

Tags: maven miasma

This past weekend, I decided to take a crack at Maven-izing the frostillic.us Framework (I should really update the README on there). If you're not familiar with it, Maven is a build system for Java projects, basically an alternative to the standard Eclipse way of doing things that we've all gotten pretty used to. Though I'm not in a position to be a strong advocate of it, I know that it has advantages in dependency-resolution and popularity (most Java projects seem to include a "you can get this from Maven Central" bit in their installation instructions), helps with the sort of continuous-integration stuff I think we're looking to do at OpenNTF, and has something of a "wave of the future" vibe to it, at least for our community: IBM's open-source releases have all been Maven-ized.

A month or so ago, Nathan went through something of a trial by fire Maven-izing the OpenNTF Domino API (present in the dev branches). Converting an existing project can be something of a bear, scaling exponentially with the complexity of the original project. Fortunately, thanks to his (and others', like Roland's) work, the ODA is nicely converted and was useful as a template for me.

In my case, the Framework is a much-simpler project: a single plugin, a feature, and an update site. It was almost a textbook example of how to Maven-ize an OSGi plugin, except for three dependencies: on the ODA, on the Extension Library, and, as with both of those, on the underlying Domino/XPages plugins. Fortunately, my laziness on the matter paid off, since not only is the ODA Maven-ized, but IBM has put their Maven-ized ExtLib right on GitHub and, better still, released a packaged Maven repository of the required XSP framework components. So everything was in place to make my journey smooth. It was, however, not smooth, and I have a set of hastily-scrawled notes that I will translate into a recounting of the hurdles I faced.

Preparing for the Journey

First off, if you're going to Maven-ize a project, you'll need a few things. If it's an XPages project, you'll likely need the above-linked IBM Domino Update Site. This should go, basically, "somewhere on your drive". IBM seems to have adopted the convention internally of putting it in C:\updateSite. However, since I use a good computer, I have no C drive and this didn't apply to me - instead, I adopted a strategy seen in projects like this one, where the path is defined in a variable. This is a good introduction to a core concept with Maven: it's basically a parallel universe to Eclipse. This nature takes many forms, ranging from its off-and-on interaction with the workspace to its naming scheme for everything; Eclipse's built-in Maven tools are a particularly-thin wrapper around the command-line environment. But for now the thing to know is that this environment variable is not an Eclipse variable; it comes from Maven's settings.xml, which is housed at ~/.m2/settings.xml. It doesn't exist by default, so I made a new one:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          http://maven.apache.org/xsd/settings-1.0.0.xsd">

    <profiles>
        <profile>
            <id>main</id>
            <properties>
                <notes-platform>file:///Users/jesse/Documents/Java/IBM/UpdateSite</notes-platform>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>main</activeProfile>
    </activeProfiles>
</settings>

I'm not sure that that's the best way to do it, but it works. The gist of it is that you can fill in the properties block with arbitrarily-named environment variables.

Secondly, you'll need a decent tutorial. I found this one and its follow-ups to do the job well. Not everything fit (I didn't treat the update site the same way), but it was a good starting point. In particular, you'll need Tycho, which is explained there. Tycho is a plugin to Maven that gives it some knowledge of Eclipse/OSGi plugin development.
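If I've got this right, the way Tycho hooks in is as a build extension in the pom.xml, along these lines (the version number is just an example of a roughly-current release):

<build>
    <plugins>
        <!-- Enables Tycho's OSGi-aware packaging types, like eclipse-plugin -->
        <plugin>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-maven-plugin</artifactId>
            <version>0.21.0</version>
            <extensions>true</extensions>
        </plugin>
    </plugins>
</build>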

Third, you'll need some examples. Now that my Framework is converted, you can use that, and the projects linked above are even better (albeit more complex). There were plenty of times where my troubleshooting just involved looking at my stuff and figuring out where it was different from the others.

Finally, if your experience ends up anything like mine, you'll want something like this.

Prepping Dependencies

Since my project depended on the ExtLib and ODA, I had to get those in the local repository first. As I found, it's not enough to merely have the projects built in your workspace, as it is when doing non-Maven OSGi development - they have to be "installed" in your local repository (~/.m2/repository). Though the Extension Library is larger, it's slightly easier to do. I cloned the ExtLib repository (technically, I cloned my fork of it) and imported the projects into the Eclipse workspace using Import → Maven → Existing Maven Projects. By pointing that to the repository root, I got a nice Maven tree of the projects and imported them all into a new working set. Maven, like many things, likes to use a tree structure for its projects; this allows it to know about module dependencies and provides inheritance of configuration (there's a LOT of configuration, so this helps). Unfortunately, Eclipse doesn't represent this hierarchy in the Project Explorer; though you can see the other projects inside the container projects, they also appear on their own, so you get this weird sort of doubled-up effect and you just have to know which project is the top-level one you want. In this case, it's named well: com.ibm.xsp.extlib.parent.

So once you've found that in the sea of other projects (incidentally, this is why I like to click on the little triangle on top of the Project Explorer view and set Top Level Elements to Working Sets), there's one change to make, unless you happened to put the Update Site from earlier at C:\updateSite. If you didn't, open up the pom.xml file (that's the main Maven config file for each project) and change the url on line 28 to <url>${notes-platform}</url>. After that, you can right-click the project and go to Run As → Maven Install. If it prompts you with some stuff, do what the tutorial above does ("install verify" or something). This is an aspect of the thin wrapper: though you're really building, the Maven tasks take the form of Run Configurations. You just have to get used to it.

Once you do that, maybe it'll work! You'll get a console window that pops up and runs through a slew of fetching and building tasks. If all goes well, there'll be a cheery "BUILD SUCCESS" near the bottom. If not, it'll be time for troubleshooting. The first step for any Maven troubleshooting is to right-click the project and go to Maven → Update Project, check all applicable projects, and let it do its thing. You'll be doing that a lot - it's your main go-to "this is weird" troubleshooting step, like Project → Clean for a misbehaving XPage app. If the build still fails, it's likely a problem with your Update Site location. Um, good luck.

Next up comes the ODA, if you're using that. As before, it's best to clone the repository from GitHub (using one of the dev branches, like Nathan's or mine) and import the Maven projects. There's good news and bad news compared to the ExtLib. The good news is that it already uses ${notes-platform} for the repository location, so you're set there. The bad news is that trying to install from the main domino parent project doesn't work - it fails on the update site for some reason. So instead, I had to install each part in turn. In particular, you'll need "externals" (covers a lot of dependencies), "org.openntf.junit4xpages", "org.openntf.formula", and "org.openntf.domino".

Converting the Projects

Okay! So, now we can actually start! For the plugin project, the first page of the tutorial works word-for-word. One thing to note is that the "eclipse-plugin" option isn't actually in the Packaging drop-down; you just have to type it in. Again: thin wrapper. It may not work immediately after following the directions, but the divergences are generally due to the non-standard Domino-related dependencies. In particular, I ran into trouble with forbidden-access rules in Notes.jar - Maven, being a separate world, ignores your Eclipse preferences on the matter. To get around that, I added the parts in the plugin block of this pom.xml - among other things, they tell the compiler to ignore such problems. I still ran into trouble with lotus.domino.local.NotesBase specifically after the other classes started working, and I "solved" that by deleting the code (it was related to recycle checking, which I no longer need).
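For reference, the relevant bits ended up roughly along these lines - a sketch rather than a copy of my actual pom.xml, and the exact compiler argument may vary:

<!-- In the plugin project's pom.xml -->
<packaging>eclipse-plugin</packaging>

<build>
    <plugins>
        <!-- Stop Tycho's compiler from treating forbidden references
             (such as the Notes.jar internals) as build-breaking errors -->
        <plugin>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-compiler-plugin</artifactId>
            <version>0.21.0</version>
            <configuration>
                <compilerArgument>-err:-forbidden</compilerArgument>
            </configuration>
        </plugin>
    </plugins>
</build>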

It may also be useful to change build.properties so that the output.. = bin/ line reads output.. = target/classes. I don't know if this is actually used, but it was a troubleshooting step I took elsewhere and it makes conceptual sense: Maven puts its output classes in target/classes, not bin.

During this process, I quickly realized the value of having a parent project. I hit a hitch with mine: I wanted to call the parent frostillicus.framework, which meant renaming the plugin to frostillicus.framework.plugin and dealing with the associated updates to Eclipse and git, but that was an unforced error. The normal layout of parent projects seems to be that they're parents both conceptually, via the pom.xml, and physically, by folder structure. I haven't done the latter yet, and the process works just as well if you don't. Still, I should move it eventually. So, following the third part of the tutorial, I created a near-empty project (no Java code) just to house the pom.xml with common settings and told it to adopt the plugin as a child. Converting the feature project was the easiest step and went exactly as described in the tutorial.

Where I diverged from both the tutorial and ODA is in the Update Site. The tutorial suggests renaming site.xml to category.xml and using the Maven type eclipse-repository, but none of the examples I used did that. Instead, I followed those projects and left site.xml as-is (other than making sure that the versions in the XML source use ".qualifier" instead of any timestamp from building) and used the Maven type eclipse-update-site in the pom.xml.
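The result is that the update site project's pom.xml stays pretty small - something like this sketch, where the group ID, version, and paths are placeholders rather than my real values:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>example.group</groupId>
        <artifactId>frostillicus.framework</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <!-- If the parent isn't physically the containing folder, point at it explicitly -->
        <relativePath>../frostillicus.framework</relativePath>
    </parent>
    <artifactId>frostillicus.framework.update</artifactId>
    <!-- The Tycho packaging type that consumes site.xml as-is -->
    <packaging>eclipse-update-site</packaging>
</project>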

I then spent about two hours pulling my hair out over a bizarre problem: the update site would build but not actually include the compiled classes in the plugin JAR if I clicked "Build" in the site.xml editor, and it would fail with cryptic error messages if I did Run As → Maven Install. I'll spare you the tribulations and cut to the chase: my problem was that I had the modules in the parent project's pom.xml out of order, with the update site coming before the feature project. When I fixed that, I was able to start building the site the "Maven way" - which is to say, not using the site.xml's Build button (which still had the same problem for me), but using Run As → Maven Install. This ends up putting the built update site inside the target/site directory rather than directly in the plugins and features folders. This is a case of "okay, sure" again.
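To make that concrete, the fix amounted to listing the modules in build order in the parent pom.xml, with the update site coming last. The feature and update site names here are placeholders for however your projects are actually named and laid out:

<!-- In the parent project's pom.xml -->
<modules>
    <module>../frostillicus.framework.plugin</module>
    <module>../frostillicus.framework.feature</module>
    <!-- The update site has to come after the feature it packages -->
    <module>../frostillicus.framework.update</module>
</modules>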

Conclusion

So, after a tremendous amount of suffering and bafflement, I have a converted project! So what does it buy me? Not much, currently, but it feels good, and I had to learn this stuff eventually one way or another. Over the process, some aspects of Maven started to crystallize in my mind - the repositories, the dependencies, the module trees - and that helps me understand why other Maven-ized projects look the way they do. Other aspects are still beyond my ken (like most of the terminology), but it's a step in the process. This should also mean I'm closer to ready for future build processes and am more in line with the larger Java world.

If you have a similar project, I'd say it's not required that you make the switch, but if you're planning on working on larger projects that use Maven, it'd be a good idea. Maven takes a lot of getting used to, since everything feels like it's a from-scratch rethinking of the way to structure Java projects with no regard to the structure or terminology of "normal" Eclipse/OSGi development, and something like this conversion is as good a start down the path as any.