I Hadn't Thought I'd End The Year Getting Into WebSphere

Wed Dec 31 15:47:01 EST 2014

Tags: oda websphere huh

...yet here we are. It's been a very interesting week in ODA circles. It started with Daniele Vistalli being curious if it was possible to run the OpenNTF Domino API on WebSphere, specifically the surprisingly-friendly Liberty profile. And not just curious: it turned out he had the chops to make it happen. Before too long, with the help of the ODA team, he wrote an extension to the WebSphere server that spins up native (lotus.domino.local) Notes RPC connections using Notes.jar and thus provides a functioning entrypoint to using ODA.

The conversation and possibilities spun out from there. It's too early to talk about too many specifics, but it's very exciting. After the initial bulk of the work, I was able to pitch in a bit yesterday and help figure out how to make this happen (click to view the full size):

CrossWorlds on OS X

That's a JSF app being built and deployed to a WebSphere Liberty server managed in Eclipse on the local machine, iterating over the contents of a DbDirectory pointed at a remote server. And it all happens to be running natively on the Mac - no VM involved.

So, like I said: it's been an interesting week. There are more possibilities than just this case, but even this alone holds tremendous promise: connecting to NSF data with the true, full API (brought into the modern day by ODA) but with the full array of J2EE support available.

Happy New Year!

Musing About Web App Structure

Tue Dec 16 18:25:35 EST 2014

Tags: angular rest

I've been giving a lot of thought lately to how I feel about different structures for web apps. Specifically, in this case, the "structure" I'm thinking about is the "client-server" balance in the app itself and the associated method of data access.

The impetus has been my minor fiddling with Angular over the past half a year or so. If you're not familiar with it, Angular is a client-side JavaScript framework focused on building full-fledged apps that run in the browser. The way it (and frameworks like it) distinguishes itself from more server-heavy methods of development is that the primary logic of running the UI happens in client-side code - resources and data are fetched from the server (including HTML snippets), but little or no presentation logic takes place there. You should go read Marky Roden's posts on the topic and then attend his session at ConnectED.

For a while, I thought of this as an interesting idea, but mostly just another method of writing apps, like PHP vs. Ruby vs. XPages. However, the idea kept churning away in the back of my brain, offering itself up as the right answer as I started writing the REST APIs in my Framework (I swear I'll get to that final post in the series eventually) and thought more about mobile development. The thing that's really latched onto my psyche isn't that Angular or JavaScript are particularly compelling, but the idea of writing a decoupled app is. Writing an app in Angular isn't really like writing an app in XPages or PHP - instead, it's (at a very high level) like writing a native mobile or desktop application. You split your app into two main tiers: the model/data/connection tier on the server and the app tier on the client.

That is something that is starting to seem really right to me. Even if it did nothing else, having an architecture like that forces you to think in terms of "services" instead of just data access - once you've written a model layer that provides REST services with access and validation rules enforced, then you have a single interface that can be used by browser-based Angular apps, mobile and desktop apps, and remote applications and services you didn't write. There's also no reason you couldn't write a server-side app that consumes those services, continuing to build XPages apps that don't use the normal data sources. This is sort of what my framework has evolved towards, just with a first-among-equals twist where the XPage app gets direct Java access to the same objects REST serves up.
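
As a sketch of what I mean - purely hypothetical names here, nothing from my actual framework - the "services" idea boils down to coding against a contract like this and leaving it as an implementation detail whether the backing code calls a REST API over HTTP or has direct access to the data:

import java.util.List;
import java.util.Map;

// Hypothetical example: server-side consumers code against this contract.
// One implementation might call the REST API over HTTP; another might have
// direct Java access to the same model objects that the REST layer serves up.
interface TaskService {
	List<Map<String, Object>> getOpenTasks(String userName);
	void saveTask(Map<String, Object> task);
}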

Once you have this sort of setup, the answers to what your data-access and client UI frameworks should be both become "whatever". Want to write your REST API in Node and consume it in XPages? Sure, go ahead. Decide later that you'd rather have the data served up by Rails? Can do - the XPages side wouldn't need a change if the API is the same. Similarly, if you want to swap out XPages for Angular served up as static files or, god forbid, a PHP app, the way is smoothed.

Even though it starts as a small thing - switching from accessing data "directly" on the server to always thinking about that REST-API abstraction - this really seems compelling to me. Not so compelling that I've actually written any non-test apps this way yet, but I've opened the door for myself with my framework.

Figuring Out Maven: Group/Artifact Names and Repositories

Mon Dec 08 16:34:11 EST 2014

Tags: maven

As I fiddle with Maven, I figure it may be useful to share my growing understanding of it - or at least my preliminary assumptions. None of these posts should be taken as a true guide to learning Maven, since I'm just muddling through myself, but I suspect that my path will be similar to a lot of other Domino developers'.

The first thing I feel I grokked about Maven is its concept of repositories, mostly because it's the easiest concept I've run across. Repositories in Maven seem to match up nicely to their analogues in other environments, such as Eclipse Update Sites or Debian/Ubuntu apt repositories. There's the default "Maven Central" repository, which is similar to the main apt repositories: it contains a very large collection of software projects, available by group+artifact name. This is what you see on the pages for popular software projects: they mention the group/artifact pair and that's enough to use it.

For projects that aren't in Central, it's similar to adding a repo to Debian or an Update Site to Eclipse. You add some repository information to your project or to your user environment's settings.xml and then refer to the artifact much as you would one from Central; Hibernate OGM is one such project.
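
To illustrate, the combination in a pom.xml looks roughly like this - the IDs, URL, and coordinates below are made-up placeholders rather than the real Hibernate OGM ones:

<!-- Placeholder values for illustration only -->
<repositories>
	<repository>
		<id>example-vendor-releases</id>
		<url>https://repo.example.com/releases</url>
	</repository>
</repositories>
<dependencies>
	<dependency>
		<groupId>com.example.somegroup</groupId>
		<artifactId>some-artifact</artifactId>
		<version>1.0.0</version>
	</dependency>
</dependencies>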

In addition to remote repositories, there is also your local repository, stored in ~/.m2/repository. This contains any Maven projects you've built and installed locally, which are then available to other Maven projects. This is how I handled my dependencies on the ExtLib and ODA: I ran Maven installs for each to add them to my local repository.

You can also download and store repositories of pre-built plugins locally, and the IBM Domino Update Site for Build Management is an example of this. The way to use this is to extract the ZIP file and then point to the updateSite directory in the same way that you would a remote repository, albeit with a file:// URL (in this case, ideally stored in a Maven environment variable).
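
In pom.xml terms, that ends up as a repository entry much like a remote one, just with a p2 layout and a file URL - here via a property (notes-platform is the name I use in my own settings.xml, so substitute your own):

<repositories>
	<repository>
		<id>notes</id>
		<layout>p2</layout>
		<url>${notes-platform}</url>
	</repository>
</repositories>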

The final aspect of this is the way bits of software are designated within a repository: by "group ID" and "artifact ID". The group ID seems like it should be globally unique, and tends to follow the reverse-DNS convention of Java package names. So a group ID might be something like "com.google.guava" or "com.ibm.xsp.extlib". These don't have a specific analogue in OSGi development, but are effectively similar to the naming scheme for update site projects (even though Maven groups may contain OSGi update sites). Within a repository, individual projects, called "artifacts", are identified in a way that just needs to be unique within the repository, and conventions seem to differ here. Sometimes the artifacts have simple base names, like "guava" or "el", while other times they have OSGi-style full reverse-DNS names. I gather that the convention falls along OSGi lines: for generic projects, short names rule the day, while for OSGi-plugin projects, the name matches the plugin ID.

So... that's the easiest part! I'm slowly getting more of a grasp of other aspects of Maven, but at least repositories seem to make sense so far.

How I Maven-ized My Framework

Mon Dec 08 10:31:14 EST 2014

Tags: maven miasma

This past weekend, I decided to take a crack at Maven-izing the frostillic.us Framework (I should really update the README on there). If you're not familiar with it, Maven is a build system for Java projects, basically an alternative to the standard Eclipse way of doing things that we've all gotten pretty used to. Though I'm not in a position to be a strong advocate of it, I know that it has advantages in dependency-resolution and popularity (most Java projects seem to include a "you can get this from Maven Central" bit in their installation instructions), helps with the sort of continuous-integration stuff I think we're looking to do at OpenNTF, and has something of a "wave of the future" vibe to it, at least for our community: IBM's open-source releases have all been Maven-ized.

A month or so ago, Nathan went through something of a trial by fire Maven-izing the OpenNTF Domino API (present in the dev branches). Converting an existing project can be something of a bear, scaling exponentially with the complexity of the original project. Fortunately, thanks to his (and others', like Roland's) work, the ODA is nicely converted and was useful as a template for me.

In my case, the Framework is a much-simpler project: a single plugin, a feature, and an update site. It was almost a textbook example of how to Maven-ize an OSGi plugin, except for three dependencies: on the ODA, on the Extension Library, and, as with both of those, on the underlying Domino/XPages plugins. Fortunately, my laziness on the matter paid off, since not only is the ODA Maven-ized, but IBM has put their Maven-ized ExtLib right on GitHub and, better still, released a packaged Maven repository of the required XSP framework components. So everything was in place to make my journey smooth. It was, however, not smooth, and I have a set of hastily-scrawled notes that I will translate into a recounting of the hurdles I faced.

Preparing for the Journey

First off, if you're going to Maven-ize a project, you'll need a few things. If it's an XPages project, you'll likely need the above-linked IBM Domino Update Site. This should go, basically, "somewhere on your drive". IBM seems to have adopted the convention internally of putting it in C:\updateSite. However, since I use a good computer, I have no C drive and this didn't apply to me - instead, I adopted a strategy seen in projects like this one, where the path is defined in a variable. This is a good introduction to a core concept with Maven: it's basically a parallel universe to Eclipse. This nature takes many forms, ranging from its off-and-on interaction with the workspace to its naming scheme for everything; Eclipse's built-in Maven tools are a particularly-thin wrapper around the command-line environment. But for now the thing to know is that this environment variable is not an Eclipse variable; it comes from Maven's settings.xml, which is housed at ~/.m2/settings.xml. It doesn't exist by default, so I made a new one:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          http://maven.apache.org/xsd/settings-1.0.0.xsd">

    <profiles>
        <profile>
            <id>main</id>
            <properties>
                <notes-platform>file:///Users/jesse/Documents/Java/IBM/UpdateSite</notes-platform>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>main</activeProfile>
    </activeProfiles>
</settings>

I'm not sure that that's the best way to do it, but it works. The gist of it is that you can fill in the properties block with arbitrarily-named environment variables.

Secondly, you'll need a decent tutorial. I found this one and its follow-ups to serve me well. Not everything fit (I didn't treat the update site the same way), but it was a good starting point. In particular, you'll need Tycho, which is explained there. Tycho is a plugin for Maven that gives it some knowledge of Eclipse/OSGi plugin development.
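
For reference, wiring Tycho in is just a matter of declaring it as a build extension in the pom - something like this, with whatever the current version happens to be (0.22.0 was current-ish as I wrote this, but don't hold me to it):

<properties>
	<tycho-version>0.22.0</tycho-version>
</properties>

<build>
	<plugins>
		<plugin>
			<groupId>org.eclipse.tycho</groupId>
			<artifactId>tycho-maven-plugin</artifactId>
			<version>${tycho-version}</version>
			<extensions>true</extensions>
		</plugin>
	</plugins>
</build>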

Third, you'll need some examples. Now that my Framework is converted, you can use that, and the projects linked above are even better (albeit more complex). There were plenty of times where my troubleshooting just involved looking at my stuff and figuring out where it was different from the others.

Finally, if your experience ends up anything like mine, you'll want something like this.

Prepping Dependencies

Since my project depended on the ExtLib and ODA, I had to get those in the local repository first. As I found, it's not enough to merely have the projects built in your workspace, as it is when doing non-Maven OSGi development - they have to be "installed" in your local repository (~/.m2/repository). Though the Extension Library is larger, it's slightly easier to do. I cloned the ExtLib repository (technically, I cloned my fork of it) and imported the projects into the Eclipse workspace using Import → Maven → Existing Maven Projects. By pointing that to the repository root, I got a nice Maven tree of the projects and imported them all into a new working set. Maven, like many things, likes to use a tree structure for its projects; this allows it to know about module dependencies and provides inheritance of configuration (there's a LOT of configuration, so this helps). Unfortunately, Eclipse doesn't represent this hierarchy in the Project Explorer; though you can see the other projects inside the container projects, they also appear on their own, so you get this weird sort of doubled-up effect and you just have to know what the top-level project you want is. In this case, it's named well: com.ibm.xsp.extlib.parent.

So once you've found that in the sea of other projects (incidentally, this is why I like to click on the little triangle on top of the Project Explorer view and set Top Level Elements to Working Sets), there's one change to make, unless you happened to put the Update Site from earlier at C:\updateSite. If you didn't, open up the pom.xml file (that's the main Maven config file for each project) and change the url on line 28 to <url>${notes-platform}</url>. After that, you can right-click the project and go to Run As → Maven Install. If it prompts you with some stuff, do what the tutorial above does ("install verify" or something). This is an aspect of the thin wrapper: though you're really building, the Maven tasks take the form of Run Configurations. You just have to get used to it.

Once you do that, maybe it'll work! You'll get a console window that pops up and runs through a slew of fetching and building tasks. If all goes well, there'll be a cheery "BUILD SUCCESS" near the bottom. If not, it'll be time for troubleshooting. The first step for any Maven troubleshooting is to right-click the project and go to Maven → Update Project, check all applicable projects, and let it do its thing. You'll be doing that a lot - it's your main go-to "this is weird" troubleshooting step, like Project → Clean for a misbehaving XPage app. If the build still fails, it's likely a problem with your Update Site location. Um, good luck.

Next up comes the ODA, if you're using that. As before, it's best to clone the repository from GitHub (using one of the dev branches, like Nathan's or mine) and import the Maven projects. There's good news and bad news compared to the ExtLib. The good news is that it already uses ${notes-platform} for the repository location, so you're set there. The bad news is that trying to install from the main domino parent project doesn't work - it fails on the update site for some reason. So instead, I had to install each part in turn. In particular, you'll need "externals" (covers a lot of dependencies), "org.openntf.junit4xpages", "org.openntf.formula", and "org.openntf.domino".

Converting the Projects

Okay! So, now we can actually start! For the plugin project, the first page of the tutorial works word-for-word. One thing to note is that the "eclipse-plugin" option isn't actually in the Packaging drop-down; you just have to type it in. Again: thin wrapper. It may not work immediately after following the directions, but the divergences are generally due to the non-standard Domino-related dependencies. In particular, I ran into trouble with forbidden-access rules in Notes.jar - Maven, being a separate world, ignores your Eclipse preferences on the matter. To get around that, I added the parts in the plugin block of this pom.xml - among other things, they tell the compiler to ignore such problems. I still ran into trouble with lotus.domino.local.NotesBase specifically after the other classes started working, and I "solved" that by deleting the code (it was related to recycle checking, which I no longer need).
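
The gist of those parts is a tycho-compiler-plugin configuration along these lines - this is the common shape of it, not necessarily line-for-line what's in that pom:

<plugin>
	<groupId>org.eclipse.tycho</groupId>
	<artifactId>tycho-compiler-plugin</artifactId>
	<version>${tycho-version}</version>
	<configuration>
		<!-- Tell the JDT compiler not to treat forbidden/discouraged references as errors -->
		<compilerArgument>-err:-forbidden,discouraged,deprecation</compilerArgument>
	</configuration>
</plugin>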

It may also be useful to change build.properties so that the output.. = bin/ line reads output.. = target/classes. I don't know if this is actually used, but it was a troubleshooting step I took elsewhere and it makes conceptual sense: Maven puts its output classes in target/classes, not bin.

During this process, I quickly realized the value of having a parent project. I hit a hitch with mine in that I wanted to call the parent frostillicus.framework, which meant renaming the plugin to frostillicus.framework.plugin and dealing with the associated updating of Eclipse and git, but that was an unforced error. The normal layout of parent projects seems to be that they're parents both conceptually in the pom.xml and physically in the folder structure. I haven't done the latter yet, and the process works just as well if you don't. Still, I should move it eventually. So, following the third part of the tutorial, I created a near-empty project (no Java code) just to house the pom.xml with common settings and told it to adopt the plugin as a child. Converting the feature project was the easiest step and went exactly as described in the tutorial.

Where I diverged from both the tutorial and ODA is in the Update Site. The tutorial suggests renaming site.xml to category.xml and using the Maven type eclipse-repository, but none of the examples I used did that. Instead, I followed those projects and left site.xml as-is (other than making sure that the versions in the XML source use ".qualifier" instead of any timestamp from building) and used the Maven type eclipse-update-site in the pom.xml.

I then spent about two hours pulling my hair out over a bizarre problem: the update site would build but not actually include the compiled classes in the plugin JAR if I clicked "Build" in the site.xml editor, and it would fail with baffling error messages if I did Run As → Maven Install. I'll spare you the tribulations and cut to the chase: my problem was that I had the modules in the parent project's pom.xml out of order, with the update site coming before the feature project. When I fixed that, I was able to start building the site the "Maven way". Which is to say: not using the site.xml's Build button (which still had the same problem for me), but using Run As → Maven Install. This ends up putting the built update site inside the target/site directory rather than directly in the plugins and features folders. This is a case of "okay, sure" again.
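
For the record, the fix amounted to listing the modules in dependency order in the parent pom, with the update site last - the module names and relative paths here are approximations of mine, so adjust to your own project layout:

<modules>
	<module>../frostillicus.framework.plugin</module>
	<module>../frostillicus.framework.feature</module>
	<module>../frostillicus.framework.updatesite</module>
</modules>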

Conclusion

So, after a tremendous amount of suffering and bafflement, I have a converted project! So what does it buy me? Not much, currently, but it feels good, and I had to learn this stuff eventually one way or another. Over the process, some aspects of Maven started to crystallize in my mind - the repositories, the dependencies, the module trees - and that helps me understand why other Maven-ized projects look the way they do. Other aspects are still beyond my ken (like most of the terminology), but it's a step in the process. This should also mean I'm closer to ready for future build processes and am more in line with the larger Java world.

If you have a similar project, I'd say it's not required that you make the switch, but if you're planning on working on larger projects that use Maven, it'd be a good idea. Maven takes a lot of getting used to, since everything feels like it's a from-scratch rethinking of the way to structure Java projects with no regard to the structure or terminology of "normal" Eclipse/OSGi development, and something like this conversion is as good a start down the path as any.

Using the ODA Design API for File-Resource Manipulation

Wed Nov 19 18:25:15 EST 2014

Tags: oda design-api

As is characteristic of his blog, Sven Hasselbach recently posted two interesting posts on using the NAPI classes in the XPages runtime to manipulate files in the WebContent folder. If you haven't read the posts, I suggest you do so now, because it's knowledge that is very good to have. The NAPI classes are chock full of cheating sorcery.

But the point of this post here is a bit of me-too-ism. Which is to say: the OpenNTF Domino API has you covered on this front. The API's Design package (org.openntf.domino.design.*) contains classes that, among other things, let you retrieve, create, and modify both "normal" and these extra file resources.

The starting point is the getDesign() method on an ODA database object. Using this DatabaseDesign object, you can get access to files. For example:

// Get the current database (FrameworkUtils is a frostillic.us Framework utility)
Database database = FrameworkUtils.getDatabase();
DatabaseDesign design = database.getDesign();

// Iterate over the existing file resources in the design
for(FileResource res : design.getFileResources()) {
	// do stuff
}

// Create a new file resource with a name and some content
FileResource newResource = design.createFileResource();
newResource.setName("some file.txt");
newResource.setFileData("some content".getBytes());
newResource.save();

// Fetch an existing file, append to its content, and save it back
FileResource existing = design.getAnyFileResource("existing-file.txt");
String existingContent = new String(existing.getFileData());
existing.setFileData((existingContent + " new stuff").getBytes());
existing.save();

Created file resources currently show up in the File Resources section. You can call setHideFromDesignList(true), which removes them from there but puts them at the root of the VFS - that works, though it'd be better for them to show up in WebContent (putting "WebContent/" in the name LOOKS like it works, but doesn't). The getAnyFileResource method will search for any "file"-type element with the given name - normal file resources, WebContent files, XPages, and Java source/class files. There are better classes for manipulating the latter two, but they show up because that's how they're implemented.

One note: the Design API uses DXL, not the NAPI classes, which means it gains freedom from an XSP dependency at the cost of memory efficiency. To do its thing, it must export the file as DXL, which will be larger in memory than the file size, and then convert that BASE64 representation to bytes, meaning the memory cost is more than double the size of the file. For most files, that's fine, but be wary of dealing with, say, 200MB video files. That sort of thing totally works (I tested), but only if you have the memory to spare and your JVM configured to use it. Maybe one day the API will have direct access to the notes.

Okay, a second note: it's important to make sure that the max Internet access level is Designer or above for this to work, at least with the normal session object.

If you haven't taken a look at the Design API before, it may be worth at least glancing over the DatabaseDesign interface to get a feel for what it currently allows. Maybe attention on it will coax me into finding free time to finish all the other aspects I plan to get to.

Factories in XPages

Wed Nov 12 18:27:14 EST 2014

Tags: java xpages

In my last post, I intimated that I wanted to write a post covering the concept of factories in XPages, or at least the specific kind in question. This is that post.

The term "factory" covers a number of meanings, and objects named this way crop up all over the place (ODA has one, for example). The kind I care about today are those defined in an XspContributor object in an XPages plugin. These factories are generally (exclusively, perhaps) used for the purpose of generating adapters: assistant objects that allow the framework to perform operations on object types that may have no knowledge of the framework at all.

The way this generally takes form is this: when the framework needs to perform a specialized task, it asks the application (that is to say, the Application object that controls the XPages app) for a list of factories it knows about that conform to a given interface:

FacesContext facesContext = FacesContext.getCurrentInstance();
ApplicationEx app = (ApplicationEx)facesContext.getApplication();
List<ComponentMapAdapterFactory> factories = app.findServices(COMPONENT_MAP_SERVICE_NAME);

Once it has this list, it loops through all of them and passes the object it's trying to adapt. If the factory is written to understand that type of object, it will return an adapter object; if not, it will return null:

ComponentMapAdapter adapter = null;
for(ComponentMapAdapterFactory fac : factories) {
	adapter = fac.createAdapter(object_);
	if(adapter != null) {
		break;
	}
}

The framework can then use that adapter to perform the action on the object, without either the framework or the object knowing anything about the other:

for(String propertyName : adapter.getPropertyNames()) {
	// ...
}

XSP internally uses this for a great many things. One use that I've run into (and butted heads with) is to create adapter objects for dealing with file attachments. The DominoDocument class has a bit of information about attachments, particularly in its AttachmentValueHolder inner class, but doesn't on its own handle the full job of dealing with file upload and download controls. During the processes of getting a list of files for a given field from the document and handling the "delete document" method, the XPages framework looks up appropriate factories to handle the data type.

The reason for this indirection is so that this sort of operation can work with arbitrary data: in an ideal world, the XPages framework doesn't care at all that anything it does is related to Domino, and similarly the Domino data model objects don't care at all that they're embedded in the XSP framework. By allowing the model-object author (IBM, in this case) to say "hey, when you want to get a list of file attachments for an object of this type, I can help", it decouples the operation cleanly (it's broken in this case, but the theory applies). When the framework needs to process an object, it loops through the adapter factory classes, asks each one if it can handle the object in question, and takes the adapter from the first one that returns non-null.

At first blush, this setup seems overly contrived. After all, isn't this what interfaces are for? Well, in many cases, yes, interfaces would do the job. However, sometimes it makes sense to add this extra layer of indirection. Say, for example, that you're adapting between two Java classes you don't control: you can't modify the framework to support a class you want, and you can't modify the class to conform to an interface the framework understands. In that case, an adapter factory is a perfect shim.

But in fact, I've even found it useful to adopt this structure when I control all of the code. When I was designing the model/component adapter in the frostillic.us Framework, I made the conscious decision to not tie the two sides (the controller and the model) together tightly. Instead, I wrote a pair of interfaces in the controller package: ComponentMapAdapterFactory and ComponentMapAdapter. This way, when the controller gets the order to create an input field for an object's "firstName" property, it loops through the list of ComponentMapAdapterFactorys to find one that fits. Over in the model package, I have an appropriate factory and adapter to handle my model framework.
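
Stripped down to its essentials, that pair of interfaces amounts to something like this (a simplified sketch - the real ones in the Framework carry more metadata than shown here):

// Simplified sketch of the two controller-package interfaces
interface ComponentMapAdapterFactory {
	// Return an adapter if this factory understands the object; null to let another factory try
	ComponentMapAdapter createAdapter(Object obj);
}

interface ComponentMapAdapter {
	// The property names the controller should build components for
	Iterable<String> getPropertyNames();

	// Illustrative extras: whatever metadata the controller needs to pick and configure a control
	Class<?> getType(String propertyName);
	boolean isRequired(String propertyName);
}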

I could have combined these two more tightly, but I enjoy the cleanliness this brings. I may not stick with my same model framework forever, and similarly the expectations of the controller class may change; because of this separation, it's clear where tweaks will need to be made. It also gives me the freedom to use the same component-building code with model objects I don't control, such as the adapter I wrote that pulls its configuration from Notes forms.


These types of factories are not something you're likely to run into during normal XPages development, but it may be useful to bear their existence in mind. The XPages framework is a great morass of moving parts, and so being able to chart the inner workings in your mental map one bit at a time can go a long way to mastering the platform.

Property Resolution in XPages EL

Tue Nov 11 11:24:21 EST 2014

Tags: xpages java

Reading Devin Olson's recent series on EL processing put me in the mood to refresh and fill out my knowledge of how that stuff actually happens.

A while back, I made a small foray of my own into explaining how property resolution in XPages EL works, one which I followed up with a mea culpa explaining that I had left out a few additional supported types. As happens frequently, that still didn't cover the full story. Before getting to what I mean by that, I'll step back to an overview of XPages EL in general.

Components of EL Processing

To my knowledge, there are three main conceptual pieces to the EL-resolution process. I'll use the EL #{foo.bar.baz[1]} as a common example.

  • The EL parser itself. This is what reads the EL above and determines that you want the item at index 1 of the baz property of the bar property of the foo object. I don't know if there's a realistic way to override or extend this stock-EL behavior.
    • This does, though contain an extensible side path: BindingFactory. This lets you create your own processors for value and method bindings based on the EL prefix, in the same vein as #{javascript: ... }.
  • The VariableResolver. This is a relatively-common bit of XPages extensibility and for good reason: they're quite useful. The variable resolver is what is used by EL (and SSJS, and others) to determine what object is referenced by foo in the example.
  • The PropertyResolver. This is the companion to the VariableResolver and is what handles the rest of the dereferencing in the example. EL asks the app's property resolver to find the bar property of foo, then the baz property of that, and then the 1 indexed property of that. This is the main topic of conversation today.

Setting An Application's PropertyResolver

There are two main ways I know of to make use of property resolvers, and the first is analogous to the VariableResolver: you write a single object and specify it in your faces-config file, like so:

<application>
	<property-resolver>config.PropResolver</property-resolver>
</application>

Then the skeletal implementation of such an object looks like this:

package config;

import javax.faces.el.EvaluationException;
import javax.faces.el.PropertyNotFoundException;
import javax.faces.el.PropertyResolver;

public class PropResolver extends PropertyResolver {
	private final PropertyResolver delegate_;

	public PropResolver(PropertyResolver delegate) {
		delegate_ = delegate;
	}

	@Override
	public Class<?> getType(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getType(obj, property);
	}

	@Override
	public Class<?> getType(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getType(obj, index);
	}

	@Override
	public Object getValue(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getValue(obj, property);
	}

	@Override
	public Object getValue(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.getValue(obj, index);
	}

	@Override
	public boolean isReadOnly(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
		return delegate_.isReadOnly(obj, property);
	}

	@Override
	public boolean isReadOnly(Object obj, int index) throws EvaluationException, PropertyNotFoundException {
		return delegate_.isReadOnly(obj, index);
	}

	@Override
	public void setValue(Object obj, Object property, Object value) throws EvaluationException, PropertyNotFoundException {
		delegate_.setValue(obj, property, value);
	}

	@Override
	public void setValue(Object obj, int index, Object value) throws EvaluationException, PropertyNotFoundException {
		delegate_.setValue(obj, index, value);
	}
}

If you've done much with DataObjects, you'll likely immediately recognize those methods: I imagine that DataObject was created as a simplest-possible implementation of a PropertyResolver-friendly interface.

So how might you use this? Well, most of the time, you probably shouldn't - in my experience, the standard property resolver is sufficient and using DataObject in custom objects is easy enough that it's the best path. Still, you could use this to patch the behavior that is driving Devin to madness or to paint over other persistent annoyances. For example, DominoViewEntry contains hard-coded properties for accessing the entry's Note ID, but not its cluster-friendly Universal ID. To fix this, you could override the non-indexed getValue method like so:

public Object getValue(Object obj, Object property) throws EvaluationException, PropertyNotFoundException {
	if (obj instanceof DominoViewEntry && "documentId".equals(property)) {
		return ((DominoViewEntry) obj).getUniversalID();
	}
	return delegate_.getValue(obj, property);
}

Now, anywhere where you have a DominoViewEntry, you can use #{viewEntry.documentId} to get the UNID. You could do the same for DominoDocument as well, if you were so inclined. You'll just have to plan to never have a column or field named "documentId" (much like you currently have to avoid "openPageURL", "columnIndentLevel", "childCount", "noteID", "selected", "responseLevel", "responseCount", and "id").

Property Resolver Factories

The other way to use PropertyResolvers is to register and use a PropertyResolverFactory. Unlike the faces-config approach, these do not override (all of) the default behavior, but are instead looked up by IBM's PropertyResolver implementation at a point during its attempts at property resolution. Specifically, that point is after support for ResourceBundle, ViewRowData, and DataObject and before delegation to Sun's stock resolver (which handles Maps, Lists, arrays, and generic POJOs).

If you pull up a type hierarchy, you can see that IBM uses this route for lotus.domino.Document, com.ibm.commons.util.io.json.JsonObject, and com.ibm.jscript.types.FBSObject (SSJS object) support. So the idea of this route is that you'd have your own custom object type which doesn't implement any of the aforementioned interfaces and for which you want to provide EL support beyond the normal getter/setter support. Normally, this is not something worth doing, but I could see it being useful if you have a third-party class you want to work in, such as a non-Domino/JDBC data source.

The method for actually using one of these is... counterintuitive, but is something you may have run into in plugin development. The first step is simple enough: implement the factory:

package config;

import javax.faces.el.PropertyResolver;
import com.ibm.xsp.el.PropertyResolverFactory;

public class PropResolverFactory implements PropertyResolverFactory {
	public PropertyResolver getPropertyResolver(Object obj) {
		return null;
	}
}

That getPropertyResolver method's job is to check to see if the object is one of the types it supports and, if it is, return a PropertyResolver object (the same kind as above) that will allow the primary resolver to get the property.
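
A filled-in version for some hypothetical data type of your own would look along these lines (MyModelObject and MyModelResolver being stand-ins for your own classes):

public PropertyResolver getPropertyResolver(Object obj) {
	// Only volunteer for objects this factory actually understands
	if(obj instanceof MyModelObject) {
		return new MyModelResolver();
	}
	// Returning null tells the primary resolver to keep looking elsewhere
	return null;
}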

Actually registering the factory is weirder. It must be done via a plugin (or by code that manually registers it in the application when needed), and the best way to see what I mean is to take a look at an example: the OpenntfDominoXspContributor class used by the OpenNTF Domino API. The contributor is registered in the plugin.xml and returns an array of arrays (because Java doesn't have tuples or map literals) representing a unique name for your factory plus the implementing class.
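
Going from memory of how the ODA's contributor is put together, the shape is roughly this - the factory-name string is arbitrary-but-unique, and you should check OpenntfDominoXspContributor for the genuine article:

package config;

import com.ibm.xsp.library.XspContributor;

public class PropResolverContributor extends XspContributor {
	@Override
	public Object[][] getFactories() {
		// Each entry is { unique factory name, implementing class }
		return new Object[][] {
			{ "config.PropResolverFactory", PropResolverFactory.class }
		};
	}
}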

This concept of factories actually probably warrants its own blog post down the line. For the time being, the upshot is that this approach is appropriate if you're adding your own data type via a plugin (and which wouldn't be better-suited to implement DataObject), so it's a rare use case indeed.

Using "Verboten" Property Names in Custom Controls

Sun Nov 02 09:46:28 EST 2014

Tags: xpages

In an attempt to save you from yourself, Designer prevents you from naming your custom control properties after SSJS keywords such as "do" and "for". This is presumably because a construct like compositeData.for would throw both a syntax error in SSJS and the developer into a tizzy. However, sometimes you want to use one of those names - they're not illegal in EL, after all, and even SSJS could still use compositeData['for'] or compositeData.get("for") to access the value.

Fortunately, this is possible: if you go to the Package Explorer view in Designer and open up the "CustomControls" folder of your NSF, you'll see each custom control as a pair of files: an ".xsp" file representing the control markup and an ".xsp-config" file representing the metadata specified in the properties pane, including the custom properties. Assuming you attempted to type "for" for the property name and were stuck with "fo", you'll see a block like this:

<property>
	<property-name>fo</property-name>
	<property-class>string</property-class>
</property>

Change that "fo" to "for" and save and all is well. You'll be able to use the property just like you'd expect with a normal property, with the caveat above about how to access it if you use SSJS. I wouldn't make a habit of using certain keywords, such as "class", but "for" is perfectly fine and allows your controls to match stock controls such as xp:pager.

This came up for me in one of the controls I like to keep around when dealing with custom renderers: a rendererInfo control to display some relevant information. Since I keep forgetting where I last used such a control, I figure I should post it here partially for my own future reference.

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<table>
		<tr>
			<th>Client ID</th>
			<td><xp:text><xp:this.value><![CDATA[#{javascript:
				var comp = getComponent(compositeData['for']);
				return comp == null ? 'null' : comp.getClientId(facesContext);
			}]]></xp:this.value></xp:text></td>
		</tr>
		<tr>
			<th>Theme Family</th>
			<td><xp:text><xp:this.value><![CDATA[#{javascript:
				var comp = getComponent(compositeData['for']);
				return comp == null ? 'null' : comp.getStyleKitFamily();
			}]]></xp:this.value></xp:text></td>
		</tr>
		<tr>
			<th>Component Family</th>
			<td><xp:text><xp:this.value><![CDATA[#{javascript:
				var comp = getComponent(compositeData['for']);
				return comp == null ? 'null' : comp.getFamily();
			}]]></xp:this.value></xp:text></td>
		</tr>
		<tr>
			<th>Renderer Type</th>
			<td><xp:text><xp:this.value><![CDATA[#{javascript:
				var comp = getComponent(compositeData['for']);
				return comp == null ? 'null' : comp.getRendererType();
			}]]></xp:this.value></xp:text></td>
		</tr>
		<tr>
			<th>Renderer Class</th>
			<td><xp:text><xp:this.value><![CDATA[#{javascript:
				var comp = getComponent(compositeData['for']);
				var renderer = comp == null ? null : comp.getRenderer(facesContext);
				return renderer != null ? renderer.getWrapped().getClass().getName() : 'N/A'
			}]]></xp:this.value></xp:text></td>
		</tr>
	</table>
</xp:view>

CocoaLove Reflection

Sun Oct 26 20:13:10 EDT 2014

Tags: cocoa

This weekend, I attended CocoaLove, a new Mac/iOS-development-related conference held in Philadelphia. Though my Cocoa resume consists of doing various tutorials every few years for the last decade or so, the location, concept, and speaker lineup were impossible to resist.

The upshot: this was a great conference. As the tagline – "A conference about people, not tech." – indicates, the sessions weren't technical or even generally about programming as such. Instead, it was a bit more in the ATLUG Day of Champions vein. They covered a range of useful "surrounding" topics, from self-image, to lessons from other industries, to diversity (in a far more interesting sense than that semi-buzzword makes it sound). The secondary push of the conference was social-in-the-sense-of-socializing - the keynote encouraged everyone to introduce themselves and the tables were stocked with levels-of-introversion pins, something that could be a silly conceit but worked well.

In fact, the socializing push worked remarkably well, thanks in large part to the nature of the talks. Since it was a single-track conference and the topics weren't technical reference material, laptops were almost entirely sheathed the whole time and even phone-checking was shockingly limited. Since the event was in a single room, there was no walking around needed between sessions - the breaks were spent talking about the just-presented topic or getting to know the people sitting with you.

This was also personally a very interesting experience for me. When it comes to Cocoa development, I am but an egg. It was weird being back in the position of not being known by anyone and only knowing a few people by their works and reputation – it was like my first MWLUG a couple years ago. I had a bit of "I got to meet Marco Arment and Brent Simmons!" fanboy-ism, but mostly it was great meeting a whole slew of people in a community I've only ever observed from the outside. It also made me realize that I need to get over the hump of the train ride and watch for more events in the city.

For reference, as you'd probably expect, nobody had any idea what "IBM Domino" is other than one long-former IBMer. The reactions I got when I explained that I do Java development all day ranged from "ah, I've used that for some Android development" to the sort of sympathetic reaction you'd get if you told someone you were just evicted from your house.

On a final note, the conference badges were amazing. They were all hand-drawn renditions of attendees' Twitter-or-otherwise avatars and it was an unexpected cool touch. The Fracture (one of the sponsors) prints they threw in were a nice bonus.

A Welcome SSL Stay of Execution

Tue Oct 21 17:52:58 EDT 2014

Tags: ssl

As you likely know from the torrent of posts on Planet Lotus on the topic, IBM announced a hopefully-imminent pair of updates to cover the two main SSL issues that have come to the fore recently: lack of SHA-2 support and the POODLE vulnerability in SSLv3. This is welcome indeed!

Personally, I'm going to stick with the nginx approach for HTTP, even in simple setups, because I've found the extra features you can get (and the promising new ones I haven't tried) to be a dramatic improvement in my server's capabilities. But in the meantime, I'm pleased that the pressure to investigate proxies for other protocols is lessened. It's not a full SSL revamp (the technote only mentions TLS 1.0 for Domino), but it's something to calm the nerves.

Nonetheless, it's been a good experience to branch out into better ways of running the server. I expect I'll eventually look into mail and LDAP proxying, both to get the highest level of SSL security and to see how useful the other features are (mail load balancing and failover, in particular, would be welcome in my setup).