Rewriting The OpenNTF Site With Jakarta EE: Beans

Jun 24, 2022, 5:03 PM

Tags: jakartaee java
  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: REST
  3. Rewriting The OpenNTF Site With Jakarta EE: Data Access
  4. Rewriting The OpenNTF Site With Jakarta EE: Beans
  5. Rewriting The OpenNTF Site With Jakarta EE: UI

Now that I've covered the basics of REST services and data access in the new OpenNTF web site, I'll dive a bit into the use of CDI for beans. The two previous topics implied some of the deeper work of CDI, with the @Inject annotation being used by CDI to supply bean and proxy values, but in those cases it was fine to just assume what it was doing.

CDI itself - Contexts and Dependency Injection - contains more capabilities than I'll cover here. Some of them, like its event/observer system, are things that I'll probably end up using in this app, but haven't made their way in yet. For now, I'll talk about the basic "managed beans" level and then build to the way Jakarta NoSQL uses its proxy-bean capabilities.

Managed Beans

In the OpenNTF site, I use a couple beans, some to provide scoped state and some to provide "services" for the app. I'll start with one of the simpler ones, a bean used to convert Markdown to HTML using CommonMark. I use a more-complicated version of this bean in my blog, but for now the OpenNTF one is small:

package bean;

import org.commonmark.node.Node;
import org.commonmark.parser.Parser;
import org.commonmark.renderer.html.HtmlRenderer;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Named;

@ApplicationScoped
@Named("markdown")
public class MarkdownBean {
    private Parser markdown = Parser.builder().build();
    private HtmlRenderer markdownHtml = HtmlRenderer.builder()
            .build();

    public String toHtml(final String text) {
        Node parsed = markdown.parse(text);
        return markdownHtml.render(parsed);
    }
}

The core concepts here are exactly the same as you have with XPages Managed Beans. The "bean" itself is just a Java object and doesn't need to have any particular special characteristics other than, if it's stored in a serialized context, being Serializable or otherwise storable. The only difference here for that purpose is that, rather than being configured in faces-config.xml, the bean attributes are defined inline (there's a "beans.xml" for explicit definitions, but it's not needed in common cases). Here, the @ApplicationScoped annotation will cover its scope and the @Named annotation will allow it to be addressable by name in contexts like JSP or XPages. A CDI bean doesn't have to be named, but it's common in cases where the bean will be used in the UI.
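
For example, with the bean above registered, a JSP page in the app could render a hypothetical entry's Markdown content through EL's method-invocation syntax (the "entry" and "bodyMarkdown" names here are made up for illustration):

<div>${markdown.toHtml(entry.bodyMarkdown)}</div>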

Once a bean is defined, the most common way to use it is to use the @Inject annotation on another CDI-capable class, such as another bean or a JAX-RS resource. For example, it could be injected into a controller class like:

@Path("/blog")
@Controller
public class BlogController {
    @Inject
    private MarkdownBean markdown;

    // (snip)
}

CDI will handle the dirty business of making sure the field is populated, and that all scopes are respected. You can also retrieve a bean programmatically, with just a bit of gangliness:

MarkdownBean markdown = CDI.current().select(MarkdownBean.class).get();

You can think of that one as roughly equivalent to ExtLibUtil.resolveVariable(...).

By default, CDI comes with a few main scopes for our normal use: @ApplicationScoped, @SessionScoped, @RequestScoped, and @ConversationScoped. The last one is a bit weird: it kind of covers whatever your framework considers a "conversation". It's kind of like the view scope in XPages, and in the XPages JEE support project I mapped it to that, but it could also potentially be a conversation between distinct pages in an app. JSF, for its part, has its own @ViewScoped annotation, and I'm considering stealing or reproducing that.

That touches on the last bit I'll mention for this "basic" section of CDI: scope definitions. Though CDI comes with a handful of standard scopes, the scope mechanism itself is open for applications to extend. You could, for example, make an @InvoicingScope to cover beans that exist for the duration of a billing process, and then you'd manage initiating and terminating the scope yourself. Usually, this isn't necessary or particularly useful, but it's good to know it's there.
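
As a sketch of what declaring such a scope would look like: the annotation below is hypothetical, and a real implementation would also need a jakarta.enterprise.context.spi.Context implementation registered via a CDI extension to actually manage the bean instances.

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import jakarta.enterprise.context.NormalScope;

// Hypothetical: beans in this scope live for the duration of a billing process
@NormalScope
@Inherited
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE, ElementType.METHOD, ElementType.FIELD })
public @interface InvoicingScope {
}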

Producer Methods

The next level of this is the ability of a bean to programmatically produce beans for downstream use. By this I mean that a bean's method can be annotated with @Produces, and then it can provide a type to be matched elsewhere. In the OpenNTF app, I use this as a way to delay loading of a resource bundle until it's actually used:

package bean;

import java.util.ResourceBundle;

import jakarta.enterprise.context.RequestScoped;
import jakarta.enterprise.inject.Produces;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import jakarta.servlet.http.HttpServletRequest;

@RequestScoped
public class TranslationBean {
    @Inject
    HttpServletRequest request;

    @Produces @Named("translation")
    public ResourceBundle getTranslation() {
        return ResourceBundle.getBundle("translation", request.getLocale()); //$NON-NLS-1$
    }
}

Here, TranslationBean itself exists as a request-scoped bean and can be used programmatically, but it's really a shell for delayed retrieval of a ResourceBundle named "translation" for use in the UI. This allows me to use the built-in mapping behavior of ResourceBundle in Expression Language when writing bits of JSP like <p>${translation.copyright}</p>.
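
The produced bundle isn't limited to EL, either: since the producer method declares a ResourceBundle return type, the bundle can also be injected into other CDI-capable classes. A hypothetical consumer:

public class SomePageController {
    @Inject
    ResourceBundle translation;

    public String getCopyrightText() {
        return translation.getString("copyright"); //$NON-NLS-1$
    }
}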

You can get more complicated than this, for sure. For example, if I switch the UI of this app to XPages, I may do a replacement of my classic controller framework that uses such a producer bean instead of the ViewHandler I used in the original implementation.

Proxy Beans

Finally, I'll talk a bit about dynamically-created proxy beans.

CDI's implementations make heavy use of object proxies to do their work. Technically, injected objects are proxies themselves, which allows CDI to let you do stuff like inject a @RequestScoped bean into an @ApplicationScoped one. But the weird part of CDI I plan to talk about here is the use of proxies to provide an object for an interface that doesn't have any implementation class.

I've mentioned this sort of injection a few times:

@Path("/pages")
public class PagesController {
    @Inject
    Page.Repository pageRepository;

    // snip

And then the interface is just:

@RepositoryProvider("homeRepository")
public interface Repository extends DominoRepository<Page, String> {
    Optional<Page> findBySubject(String subject);
}

There's no class that implements Page.Repository, so how come you can call methods on it? That's where the proxying comes in. While the CDI container (in this case, our NSF-based app) is being initialized, the Domino JNoSQL driver looks for classes implementing DominoRepository:

<T extends DominoRepository> void onProcessAnnotatedType(@Observes final ProcessAnnotatedType<T> repo) {
    Class<T> javaClass = repo.getAnnotatedType().getJavaClass();
    if (DominoRepository.class.equals(javaClass)) {
        return;
    }
    if (DominoRepository.class.isAssignableFrom(javaClass) && Modifier.isInterface(javaClass.getModifiers())) {
        crudTypes.add(javaClass);
    }
}

Then, once they're all found, it registers a special kind of bean for them:

void onAfterBeanDiscovery(@Observes final AfterBeanDiscovery afterBeanDiscovery, final BeanManager beanManager) {
    crudTypes.forEach(type -> afterBeanDiscovery.addBean(new DominoRepositoryBean(type, beanManager)));
}

I mentioned above that beans are generally just normal Java classes, but you can also make beans by implementing jakarta.enterprise.inject.spi.Bean, which gives you programmatic control over many aspects of the bean, including providing the actual implementation objects. In the Domino driver's case, as in most (if not all) of the JNoSQL drivers, this is done by providing a proxy object:

public DominoRepository<?, ?> create(CreationalContext<DominoRepository<?, ?>> creationalContext) {
    DominoTemplate template = /* Instance of a DominoTemplate, which handles CRUD operations */;
    Repository<Object, Object> repository = /* JNoSQL's default Repository */;

    DominoDocumentRepositoryProxy<DominoRepository<?, ?>> handler = new DominoDocumentRepositoryProxy<>(template, this.type, repository);
    return (DominoRepository<?, ?>) Proxy.newProxyInstance(type.getClassLoader(), new Class[] { type }, handler);
}

Finally, that proxy class implements java.lang.reflect.InvocationHandler, which lets it provide custom handling of incoming methods.
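
As a heavily-simplified sketch of that mechanism (the real DominoDocumentRepositoryProxy does considerably more), an InvocationHandler along these lines is what makes the otherwise-unimplemented interface methods callable:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;

public class ExampleRepositoryProxy implements InvocationHandler {
    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        String name = method.getName();
        if(name.startsWith("findBy")) { //$NON-NLS-1$
            // Hypothetical: derive a query from the method name, so that
            // "findBySubject" becomes a query on the "subject" property
            return executeDerivedQuery(name.substring("findBy".length()), args); //$NON-NLS-1$
        }
        // ...dispatch save, deleteById, and the other base methods similarly
        throw new UnsupportedOperationException(name);
    }

    private Object executeDerivedQuery(String property, Object[] args) {
        // Placeholder for handing the parsed query off to the underlying driver
        return null;
    }
}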

This well goes deep, including the way JNoSQL will parse out method names and parameters to handle queries, but I think that will suffice for now. The important thing to know is that this is possible to do, common in underlying frameworks, and fairly rare in application code.

Next Up

I'm winding down on major topics, but at least one critical one remains: the actual UI. Currently (and likely when shipping), the app uses MVC and JSP to cover this need. I've discussed these before, but I think it'll be useful to do so again, both as a refresher and to show how they bring these other parts of the app together.

Rewriting The OpenNTF Site With Jakarta EE: Data Access

Jun 21, 2022, 10:12 AM

Tags: jakartaee java
  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: REST
  3. Rewriting The OpenNTF Site With Jakarta EE: Data Access
  4. Rewriting The OpenNTF Site With Jakarta EE: Beans
  5. Rewriting The OpenNTF Site With Jakarta EE: UI

In my last post, I talked about how I make use of Jakarta REST to handle the REST services in the new OpenNTF site I'm working on. There'll be more to talk about on that front when I get to the UI and my use of MVC. For now, though, I'll dive a bit into how I'm accessing NSF data.

I've been talking a lot lately about how I've been fleshing out the Jakarta NoSQL driver for Domino that comes as part of the XPages JEE project, and specifically how writing this app has proven to be an ideal impetus for adding specific capabilities that are needed for working with Domino. This demonstrates some of the fruit of that labor.

Model Objects

There are a few ways to interact with Jakarta NoSQL, and they vary a bit by database type (key/value, column, document, graph), but I focus on using the Repository interface capability, which is a high-level abstraction over the pool of documents.

Before I get to that, though, I'll start with an entity object. Part of the heavy lifting that a framework like Jakarta NoSQL does is to map between a Java class and the actual data representation. In the SQL world, one would likely come across the term object-relational mapping for this, and the concept is generally the same. The project currently has a handful of such classes, and so the data layer looks like this:

Screenshot of Designer showing the data-related classes in the NSF

The mechanism for mapping a class in JNoSQL is very similar to JPA:

@Entity("Release")
public class ProjectRelease {
    
    public enum ReleaseStatus {
        Yes, No
    }
    
    @Id
    private String documentId;
    @Column("ProjectName")
    private String projectName;
    @Column("ReleaseNumber")
    private String version;
    @Column("ReleaseDate")
    private Temporal releaseDate;
    @Column("WhatsNewAbstract")
    private String description;
    @Column("DownloadsRelease")
    private int downloadCount;
    @Column("MainID")
    private String mainId;
    @Column("ReleaseInCatalog")
    private ReleaseStatus releaseStatus;
    @Column("DocAuthors")
    private List<String> docAuthors;
    @Column(DominoConstants.FIELD_ATTACHMENTS)
    private List<EntityAttachment> attachments;

    /* getters/setters and utility methods here */
}

@Entity("Release") at the top there declares that this class is a JNoSQL entity, and then the Domino driver uses "Release" as the form name when creating documents and performing queries.

The @Id and @Column("...") annotations map Java object properties to fields and attributes on the document. @Id populates the field with the document's UNID, while @Column does a named field. There's a special one there - @Column(DominoConstants.FIELD_ATTACHMENTS) - that will populate the field with references to the document's attachments when present. In each of these cases, all of the heavy lifting is done by the driver: there's no code in the app that manually accesses documents or views.

Repositories

The way I get access to documents mapped by these classes is to use the JNoSQL Repository mechanism, by way of the extended DominoRepository interface. They look like this (used here as an inner class for stylistic reasons, not technical ones):

@Entity("Release")
public class ProjectRelease {

    @RepositoryProvider("projectsRepository")
    public interface Repository extends DominoRepository<ProjectRelease, String> {
        Stream<ProjectRelease> findByProjectName(String projectName, Sorts sorts);

        @ViewEntries("ReleasesByDate")
        Stream<ProjectRelease> findRecent(Pagination pagination);
        
        @ViewDocuments("IP Management\\Pending Releases")
        Stream<ProjectRelease> findPendingReleases();
    }

    /* snip: entity class from above */
}

Merely by creating this interface, I'm able to get access to the associated documents: I don't actually have to implement it myself. As seen in the last post, these interfaces can be injected into a bean or REST resource using CDI:

public class IPProjectsResource {
    
    @Inject
    private ProjectRelease.Repository projectReleases;

    /* snip */
}

Naturally, there is implementation code for this repository, but it's all done with what amounts to "Java magic": proxy objects and CDI. That's a huge topic on its own, and it's pretty weird to realize that that's even possible, but it will have to suffice for now to say that it is possible and it works great.

When you create one of these repositories, you get basic CRUD capabilities "for free": you can create new documents, look up existing documents by ID, and modify or delete existing documents.
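
In code, those built-in operations look like this (the setters here are hypothetical accessors on the entity above):

ProjectRelease release = new ProjectRelease();
release.setProjectName("Some Project"); // hypothetical setter
release.setVersion("1.0.0");            // hypothetical setter
projectReleases.save(release);          // creates a new "Release" document

Optional<ProjectRelease> existing = projectReleases.findById(someUnid);
projectReleases.deleteById(someUnid);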

Basic Queries

Beyond that, JNoSQL will do some lifting for you to give sensical implementations for methods based on their method signature in the absence of any driver-specific code. I'm making use of that here with findByProjectName(String projectName, Sorts sorts). The proxy object that provides this implementation is able to glean that String projectName refers to the projectName field of the ProjectRelease class, which is then mapped by annotation to the ProjectName item on the back end. The Sorts object is a JNoSQL type that allows you to specify one or more sort columns and their orders. When executed, this is translated to a DQL query like:

Form = 'Release' and ProjectName = 'Some Project'

When Sorts are specified, this is also run through QueryResultsProcessor to create a QRP view with the given sort columns in a local temp database. Thanks to that, running the same query multiple times when the data hasn't changed will be very speedy.

You can customize these queries further by adding more parameters, or by using the @Query annotation to provide a SQL-like query with parameters.
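
As a sketch of the latter (with the hedge that the exact query-dialect support varies by driver and version), a @Query method pairs the query string with @Param-annotated parameters:

@Query("select * from Release where projectName = @name") //$NON-NLS-1$
Stream<ProjectRelease> findByName(@Param("name") String projectName);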

Domino-Specific Queries

Since Domino is so view-heavy and DQL+QRP isn't quite at the level where you can just throw any old query+extraction at it and expect it to perform well, it made sense for me to add extensions to JNoSQL to explicitly target views as sources. I use them both here, in one case to efficiently retrieve view data without opening documents and in another in order to piggyback on an existing view used by the IP Tools services already deployed.

The @ViewEntries("ReleasesByDate") annotation causes the findRecent method to skip JNoSQL's normal interpretation and instead be handled by the Domino driver directly. It will open that view and read entries based on the Pagination rules sent to it (another JNoSQL object). Since the columns in this view line up with the item names in the documents, I'm able to get useful entity objects out of it without having to actually crack open the docs. In practice, I'll need to be careful when using this so as not to save entities like this back into the database, since not ALL columns are present in the view, but that's a reasonable caveat to have.

The @ViewDocuments("IP Management\\Pending Releases") annotation causes findPendingReleases to read full documents out of the named view, ignoring view columns. Eventually, I'll likely replace this with an equivalent query in JNoSQL's dialect, but for now it's more practical to just use the existing view like a stored query and not have to translate the selection formula to another mechanism.

Repository Provider

The last thing to touch on with this repository is the @RepositoryProvider annotation. The OpenNTF web site is stored in its own NSF, and then references several other NSFs, such as the projects DB, the blog DB (which is still based on BlogSphere), and the patron directory. The @RepositoryProvider annotation allows me to tell JNoSQL to use a different database than the current one, and it does so by finding a matching CDI producer method that gives it a lotus.domino.Database housing the documents and a high-privilege lotus.domino.Session to create QRP views. In this app's case, that's this in another bean:

@Produces
@jakarta.nosql.mapping.Database(value = DatabaseType.DOCUMENT, provider = "projectsRepository")
public DominoDocumentCollectionManager getProjectsManager() {
    return new DefaultDominoDocumentCollectionManager(
        () -> getProjectsDatabase(),
        () -> getSessionAsSigner()
    );
}

I'll touch on what the heck a @Produces method is in CDI later, but for now you can take it for granted that this works. The getProjectsDatabase() method that it calls is a utility method that opens the project DB based on some configuration documents.

I'll note with no small amount of pleasure that this bean that provides databases is one of the only two places in the app that actually reference Domino API classes at all, and the other instance is just to convert Notes names. I'm considering ways to remove this need as well, perhaps making it so that this producer only needs to provide a path to the target database and the name of a high-privilege user to act as, and then the driver would do the session creation and DB opening itself.

Next Up

In the next post, I'll most likely talk about my use of CDI to handle the "managed beans" layer. In a lot of ways, that will just be demonstrating the way CDI makes the tasks you'd otherwise accomplish with XPages Managed Beans simpler and more code-focused, but (as the @Produces annotation above implies) there's a lot more to it.

Rewriting The OpenNTF Site With Jakarta EE: REST

Jun 20, 2022, 1:09 PM

Tags: jakartaee java
  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: REST
  3. Rewriting The OpenNTF Site With Jakarta EE: Data Access
  4. Rewriting The OpenNTF Site With Jakarta EE: Beans
  5. Rewriting The OpenNTF Site With Jakarta EE: UI

In deciding how to kick off implementation specifics of my new OpenNTF site project, I had a few options, and none of them perfect. I considered starting with the managed beans via CDI, but most of those are actually either UI support beans or interact primarily with other components. I ended up deciding to talk a bit about the REST services in the app, since those are both an extremely-common task to perform in XPages and one where the JEE project runs laps around what you get by default from Domino.

The REST layer is handled by Jakarta REST, which is still primarily called by its old name JAX-RS. JAX-RS has existed in Domino for a good while via the Wink implementation included with the Extension Library, but that's a much-older version. Additionally, that implementation didn't include a lot of convenience features like automatic JSON conversion out of the box. The implementation in the XPages JEE Support project uses RESTEasy, which is one of the primary active implementations and covers the latest versions of the spec.

Example

Though the primary way JAX-RS is actually used in this app is as the backbone for the UI with MVC, that'll be a topic for later. Since I also plan to use this as a way to modernize the IP Management tools I wrote, I'm making some JSON-based services for that.

I have a service that lets me get a list of project releases that haven't yet been approved, as well as an endpoint to mark one as approved. That class looks like this:

package webapp.resources.iptools;

import java.text.MessageFormat;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import jakarta.annotation.security.RolesAllowed;
import jakarta.inject.Inject;
import jakarta.validation.constraints.NotEmpty;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;
import model.projects.ProjectRelease;

@Path("iptools/projects")
@RolesAllowed("[IPManager]")
public class IPProjectsResource {
    
    @Inject
    private ProjectRelease.Repository projectReleases;
    
    @GET
    @Path("pendingReleases")
    @Produces(MediaType.APPLICATION_JSON)
    public Map<String, Object> getPendingReleases() {
        return Collections.singletonMap("payload", projectReleases.findPendingReleases().collect(Collectors.toList()));
    }
    
    @POST
    @Path("releases/{documentId}/approve")
    @Produces(MediaType.APPLICATION_JSON)
    public boolean approveRelease(@PathParam("documentId") @NotEmpty String documentId) {
        ProjectRelease release = projectReleases.findById(documentId)
            .orElseThrow(() -> new NotFoundException(MessageFormat.format("Could not find project for UNID {0}", documentId)));
        release.markApprovedForCatalog(true);
        projectReleases.save(release);
        
        return true;
    }
}

We can ignore the ProjectRelease.Repository business, since that's the model layer making use of Jakarta NoSQL - that'll come later. For now, we can just assume that methods like findPendingReleases and findById do what you'd expect based on their names.

The resource as a whole is marked as available at the path iptools/projects. In an NSF, that will resolve to a path on the server like /foo.nsf/xsp/app/iptools/projects. The "app" part there is customizable, though the "xsp" part is unchangeable, at least for now: it's the way the XPages stack notices that it's supposed to handle this URL instead of passing it to the classic Domino web server side.

The @RolesAllowed annotation allows me to restrict use of all the methods in this resource to specific roles or names/globs from the ACL. Though the underlying documents will still be protected by the ACL and reader/author fields, it's still good practice to not make services publicly available unless there's a reason to do so.

Often, a resource class like this will have a method marked with @GET but no @Path annotation, which would match the base URL from the class level. That isn't the case here, though: I may eventually merge these methods into an overall projects API, but for now I'm mirroring the old one I made, which doesn't have that.
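
For illustration, such a method would look like this hypothetical addition to the class above, matching GET requests to iptools/projects itself:

@GET
@Produces(MediaType.APPLICATION_JSON)
public Map<String, Object> get() {
    // Hypothetical: some overall summary payload for the base path
    return Collections.singletonMap("payload", projectReleases.findPendingReleases().collect(Collectors.toList()));
}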

JSON Conversion

The getPendingReleases method shows off a nice advantage over the older way I was doing this. In the original app, I had a utility class that used Gson to process arbitrary objects and convert them to JSON. Here, since I'm working on top of the whole JEE framework, I don't have to care about that in the app. I can just return my payload object and know that the scaffolding beneath me will handle the fiddly details of translating it to JSON for the browser, based on the @Produces(MediaType.APPLICATION_JSON) annotation there. It happens to use Jakarta JSON Binding (JSON-B), but I don't have to know that. I can just be confident that it will emit JSON representing the documents in a predictable way.
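
When the default shape does need adjusting, JSON-B provides annotations (in jakarta.json.bind.annotation) for that on the model side - for example, hypothetically renaming one of the entity's properties and hiding another:

@JsonbProperty("releaseVersion")
private String version;          // emitted as "releaseVersion" instead of "version"

@JsonbTransient
private List<String> docAuthors; // omitted from the JSON output entirely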

Entity Manipulation

The approveRelease method is available with a URL like /foo.nsf/xsp/app/iptools/projects/releases/12345678901234567890123456789012/approve. With the UNID from the path, I call projectReleases.findById to find the release document with that ID. That method returns an Optional<ProjectRelease> to cover the case that it doesn't exist - the orElseThrow method of Optional allows me to "unwrap" it when present or otherwise throw a NotFoundException. In turn, that exception (part of JAX-RS) will be translated to an HTTP 404 response with the provided message.

I used a @NotEmpty annotation on the @PathParam parameter here since this would currently also match a URL like /foo.nsf/xsp/app/iptools/projects/releases//approve. While I could check for an empty ID, this is a little cleaner and can provide a better error message to the calling user. That's just another nice way to make use of the underlying stack to get better behavior with less code.

The markApprovedForCatalog method on the model object just handles setting a couple fields:

public void markApprovedForCatalog(boolean approved) {
    if(approved) {
        this.releaseStatus = ReleaseStatus.Yes;
        this.docAuthors = Arrays.asList(ROLE_ADMIN);
    } else {
        this.releaseStatus = ReleaseStatus.No;
    }
}

Then projectReleases.save(release) will store the document in the NSF, throwing an exception in the case of any validation failures. Like with the @NotEmpty parameter annotation above, I don't have to worry about handling that explicitly: Jakarta NoSQL will handle that implicitly for me, since it works with the Bean Validation spec the same way JAX-RS does.
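
As a hypothetical illustration of what those validation rules look like, the constraints are just jakarta.validation annotations on the entity properties, checked automatically at save time:

@Column("ReleaseNumber")
@NotEmpty(message = "version must be set")
private String version;

@Column("ReleaseDate")
@NotNull
private Temporal releaseDate;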

Next Components

Next time I write about this, I figure I'll go over the specific NoSQL entities I've set up and discuss how they handle data access for the app. That will be similar to a number of my recent posts, but I think it'll be helpful to have an example of using that in practice rather than just talking about it hypothetically.

Rewriting The OpenNTF Site With Jakarta EE, Part 1

Jun 19, 2022, 10:13 AM

Tags: jakartaee java
  1. Rewriting The OpenNTF Site With Jakarta EE, Part 1
  2. Rewriting The OpenNTF Site With Jakarta EE: REST
  3. Rewriting The OpenNTF Site With Jakarta EE: Data Access
  4. Rewriting The OpenNTF Site With Jakarta EE: Beans
  5. Rewriting The OpenNTF Site With Jakarta EE: UI

The design for the OpenNTF home page has been with us for a little while now and has served us pretty well. It looks good and covers the bases it needs to. However, it's getting a little long in the tooth and, more importantly, doesn't cover some capabilities that we're thinking of adding.

While we could potentially expand the current one, this provides a good opportunity for a clean start. I had actually started taking a swing at this a year and a half ago, taking the tack that I'd make a webapp and deploy it using the Domino Open Liberty Runtime. While that approach would put all technologies on the table, it'd certainly be weirder to future maintainers than an app inside an NSF (at least for now).

So I decided in the past few weeks to pick the project back up and move it into an NSF via the XPages Jakarta EE Support project. I can't say for sure whether I'll actually complete the project, but it'll regardless be a good exercise and has proven to be an excellent way to find needed features to implement.

I figure it'll also be useful to keep something of a travelogue here as I go, making posts periodically about what I've implemented recently.

The UI Toolkit

The original form of this project used MVC and JSP for the UI layer. Now that I was working in an NSF, I could readily use XPages, but for now I've decided to stick with the MVC approach. While it will make me have to solve some problems I wouldn't necessarily have to solve otherwise (like file uploads), it remains an extremely-pleasant way to write applications. I am also not constrained to this: since the vast majority of the logic is in Java beans and controller classes, switching the UI front-end would not be onerous. Also, I could theoretically mix JSP, JSF, XPages, and static HTML together in the app if I end up so inclined.

In the original app (as in this blog), I made use of WebJars to bring in JavaScript dependencies, namely Hotwire Turbo to speed up in-site navigation and use Turbo Frames. Since the NSF app in Designer doesn't have the Maven dependency mechanism the original app did, I just ended up copying the contents of the JAR into WebContent. That gave me a new itch to scratch, though: I'd love to be able to have META-INF/resources files in classpath JARs picked up by the runtime and made available, lowering the number of design elements present in the NSF.

The Data Backend

The primary benefit of this project so far has been forcing me to flesh out the Jakarta NoSQL driver in the JEE support project. I had kind of known hypothetically what features would be useful, but the best way to do this kind of thing is often to work with the tool until you hit a specific problem, and then solve that. So far, it's forced me to:

  • Implement the view support in my previous post
  • Add attachment support for documents, since we'll need to upload and download project releases
  • Improve handling of rich text and MIME, though this also has more room to grow
  • Switch the returned Streams from the driver to be lazily loaded, meaning that not all documents/entries have to be read if the calling code stops reading the results partway through
  • Add the ability to use custom property types with readers/writers defined in the NSF

Together, these improvements have let me have almost no lotus.domino code in the app. The only parts left are a bean for formatting Notes-style names (which I may want to make a framework service anyway) and a bean for providing access to the various associated databases used by the app. Not too shabby! The app is still tied to Domino by way of using the Domino-specific extensions to JNoSQL, but the programming model is significantly better and the amount of app code was reduced dramatically.

Next Steps

There's a bunch of work to be done. The bulk of it is just implementing things that the current XPages app does: actually uploading projects, all the stuff like discussion lists, and so forth. I'll also want to move the server-side component of the small "IP Tools" suite I use for IP management stuff in here. Currently, that's implemented as Wink-based JAX-RS resources inside an OSGi bundle, but it'll make sense to move it here to keep things consolidated and to make use of the much-better platform capabilities.

As I mentioned above, I can't guarantee that I'll actually finish this project - it's all side work, after all - but it's been useful so far, and it's a further demonstration of how thoroughly pleasant the programming model of the JEE support project is.

Working Domino Views Into Jakarta NoSQL

Jun 12, 2022, 3:33 PM

A few versions ago, I added Jakarta NoSQL support to the XPages Jakarta EE Support project. For that, I used DQL and QueryResultsProcessor exclusively, since it's a near-exact match for the way JNoSQL normally does things and QRP brought the setup into the realm of "good enough for the normal case".

However, as I've been working on a project that puts this to use, the limitations have started to hold me back.

The Limitations

The first trouble I ran into was the need to list, for example, the most recent 20 of an entity. This is something that QRP took some steps to handle, but it still has to build the pseudo-view anew the first time and then any time documents change. This gets prohibitively expensive quickly. In theory, QRP has enough flexibility to use existing views for sorting, but it doesn't appear to do so yet. Additionally, its "max entries" and "max documents" values are purely execution limits and not something to use to give a subset report: they throw an exception when that many entries have been processed, not just stop execution. For some of this, one can deal with it when manually writing the DQL query, but the driver doesn't have a path to do so.

The second trouble I ran into was the need to get a list composed of multiple kinds of documents. This one is a limitation of the default idiom that JNoSQL uses, where you do queries on named types of documents - and, in the Domino driver, that "type" corresponds to Form field values.

The Uncomfortable Solution

Thus, hat in hand, I returned to the design element I had hoped to skim past: views. Views are an important tool, but they are way, way overused in Domino, and I've been trying over time to intentionally limit my use of them to break the habit. Still, they're obviously the correct tool for both of these jobs.

So I made myself an issue to track this and set about tinkering with some ways to make use of them in a way that would do what I need, be flexible for future needs, and yet not break the core conceit of JNoSQL too much. My goal is to make almost no calls to an explicit Domino API, and so doing this will be a major step in that direction.

Jakarta NoSQL's Extensibility

Fortunately for me, Jakarta NoSQL is explicitly intended to be extensible per driver, since NoSQL databases diverge more wildly in the basics than SQL databases tend to. I made use of this in the Darwino driver to provide support for stored cursors, full-text search, and JSQL, though all of those had the bent of still returning full documents and not "view entries" in the Domino sense.

Still, the idea is very similar. Jakarta NoSQL encourages a driver author to write custom annotations for repository methods to provide hints to the driver to customize behavior. This generally happens at the "mapping" layer of the framework, which is largely CDI-based and gives you a lot of room to intercept and customize requests from the app-developer level.

Implementation

To start out with, I added two annotations you can add to your repository methods: @ViewEntries and @ViewDocuments. For example:

@RepositoryProvider("blogRepository")
public interface BlogEntryRepository extends DominoRepository<BlogEntry, String> {
    public static final String VIEW_BLOGS = "vw_Content_Blogs"; //$NON-NLS-1$
    
    @ViewDocuments(value=VIEW_BLOGS, maxLevel=0)
    Stream<BlogEntry> findRecent(Pagination pagination);
    
    @ViewEntries(value=VIEW_BLOGS, maxLevel=0)
    Stream<BlogEntry> findAll();
}

The distinction here is one of the ways I slightly break the main JNoSQL idioms. JNoSQL was born from the types of databases where it's just as easy to retrieve the entire document as it is to retrieve part - this is absolutely the case in JSON-based systems like Couchbase (setting aside attachments). However, Domino doesn't quite work that way: it can be significantly faster to fetch only a portion of a document than the data from all items, namely when some of those items are rich text or MIME.

The @ViewEntries annotation causes the driver to consider only the item values found in the entries of the view it's referencing. In a lot of cases, this is all you'll need. When you set a column in Designer to be just directly an item value from the documents, the column is by default named with the same name, and so a mapped entity pulled from this column can have the same fields filled in as from a document. This does have the weird characteristic where objects pulled from one method may have different instance values from the "same" objects from another method, but the tradeoff is worth it.

@ViewDocuments, fortunately, doesn't have this oddity. With that annotation, documents are processed in the same way as with a normal query; they just are retrieved according to the selection and order from the backing view.

Using these capabilities allowed me to slightly break the JNoSQL idiom in the other way I needed: reading unrelated document types in one go. For this, I cheated a bit and made a "document" type with a form name that doesn't correspond to anything, and then made the mapped items based on the view name. So I created this entity class:

@Entity("ProjectActivity")
public class ProjectActivity {
    @Column("$10")
    private String projectName;
    @Column("Entry_Date")
    private OffsetDateTime date;
    @Column("$12")
    private String createdBy;
    @Column("Form")
    private String form;
    @Column("subject")
    private String subject;

    /* snip */
}

As you might expect, no form has a field named $10, but that is the name of the view column, and so the mapping layer happily populates these objects from the view when configured like so:

@RepositoryProvider("projectsRepository")
public interface ProjectActivityRepository extends DominoRepository<ProjectActivity, String> {
    @ViewEntries("AllbyDate")
    Stream<ProjectActivity> findByProjectName(@ViewCategory String projectName);
}

These are a little weird in that you wouldn't want to save such entities lest you break your data, but, as long as you keep that in mind, it's not a bad way to solve the problem.

Future Changes

Since this implementation was based on fulfilling just my immediate needs and isn't the result of careful consideration, it's likely to be something that I'll revisit as I go. For example, that last example shows the third custom annotation I introduced: @ViewCategory. I wanted to restrict entries to a category that is specified programmatically as part of the query, and so annotating the method parameter was a great way to do that. However, there are all sorts of things one might want to do dynamically when querying a view: setting the max level programmatically, specifying expand/collapse behavior, and so forth. I don't know yet whether I'll want to handle those by having a growing number of parameter annotations like that or if it would make more sense to consolidate them into a single ViewQueryOptions parameter or something.

I also haven't done anything special with category or total rows. While they should just show up in the list like any other entry, there's currently nothing special signifying them, and I don't have a way to get to the note ID either (just the UNID). I'll probably want to create special pseudo-items like @total or @category to indicate their status.

There'll also no doubt be a massive wave of work to do when I turn this on something that's not just a little side project. While I've made great strides in my oft-mentioned large client project to get it to be more platform-independent, it's unsurprisingly still riven with Domino API references top to bottom. While I don't plan on moving it anywhere else, writing so much code using explicit database-specific API calls is just bad practice in general, and getting this driver to a point where it can serve that project's needs would be a major sign of its maturity.

Per-NSF-Scoped JWT Authorization With JavaSapi

Jun 4, 2022, 10:35 AM

Tags: domino dsapi java
  1. Poking Around With JavaSapi
  2. Per-NSF-Scoped JWT Authorization With JavaSapi
  3. WebAuthn/Passkey Login With JavaSapi

In the spirit of not leaving well enough alone, I decided the other day to tinker a bit more with JavaSapi, the DSAPI peer tucked away undocumented in Domino. While I still maintain that this is too far from supported for even me to put into production, I think it's valuable to demonstrate the sort of thing that this capability - if made official - would make easy to implement.

JWT

I've talked about JWT a bit before, and it was in a similar context: I wanted to be able to access a third-party API that used JWT to handle authorization, so I wrote a basic library that could work with LS2J. While JWT isn't inherently tied to authorization like this, it's certainly where it's found a tremendous amount of purchase.

JWT has a couple neat characteristics, and the ones that come in handy most frequently are a) that you can enumerate specific "claims" in the token to restrict what the token allows the user to do and b) if you use a symmetric signature key, you can generate legal tokens on the client side without the server having to generate them. "b" there is optional, but makes JWT a handy way to do a quick shared secret between servers to allow for trusted authentication.

It's a larger topic than that, for sure, but that's the quick and dirty of it.
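
For reference, generating such a token with the auth0 java-jwt library used later in this post is compact - the issuer and claim name just have to match whatever the verifying side expects:

Algorithm algorithm = Algorithm.HMAC256(secret);
String token = JWT.create()
    .withIssuer(ISSUER)
    .withClaim(CLAIM_USER, "CN=Some User/O=SomeOrg") //$NON-NLS-1$
    .sign(algorithm);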

Mixing It With An NSF

Normally on Domino, you're either authenticated for the whole server or you're not. That's usually fine - if you want to have a restricted account, you can specifically grant it access to only a few NSFs. However, it's good to be able to go more fine-grained, restricting even powerful accounts to only do certain things in some contexts.

So I had the notion to take the JWT capability and mix it with JavaSapi to allow you to do just that. The idea is this:

  1. You make a file resource (hidden from the web) named "jwt.txt" that contains your per-NSF secret.
  2. A remote client makes a request with an Authorization header in the form of Bearer Some.JWT.Here
  3. The JavaSapi interceptor sees this, checks the target NSF, loads the secret, verifies it against the token, and authorizes the user if it's legal

As it happens, this turned out to be not that difficult in practice at all.

The main core of the code is:

public int authenticate(IJavaSapiHttpContextAdapter context) {
    IJavaSapiHttpRequestAdapter req = context.getRequest();

    // In the form of "/foo.nsf/bar"
    String uri = req.getRequestURI();
    String secret = getJwtSecret(uri);
    if(StringUtil.isNotEmpty(secret)) {
        try {
            String auth = req.getHeader("Authorization"); //$NON-NLS-1$
            if(StringUtil.isNotEmpty(auth) && auth.startsWith("Bearer ")) { //$NON-NLS-1$
                String token = auth.substring("Bearer ".length()); //$NON-NLS-1$
                Optional<String> user = decodeAuthenticationToken(token, secret);
                if(user.isPresent()) {
                    req.setAuthenticatedUserName(user.get(), "JWT"); //$NON-NLS-1$
                    return HTEXTENSION_REQUEST_AUTHENTICATED;
                }
            }
        } catch(Throwable t) {
            t.printStackTrace();
        }
    }

    return HTEXTENSION_EVENT_DECLINED;
}

To read the JWT secret, I used IBM's NAPI:

private String getJwtSecret(String uri) {
    int nsfIndex = uri.toLowerCase().indexOf(".nsf"); //$NON-NLS-1$
    if(nsfIndex > -1) {
        String nsfPath = uri.substring(1, nsfIndex+4);
        
        try {
            NotesSession session = new NotesSession();
            try {
                if(session.databaseExists(nsfPath)) {
                    // TODO cache lookups and check mod time
                    NotesDatabase database = session.getDatabase(nsfPath);
                    database.open();
                    NotesNote note = FileAccess.getFileByPath(database, SECRET_NAME);
                    if(note != null) {
                        return FileAccess.readFileContentAsString(note);
                    }
                }
            } finally {
                session.recycle();
            }
        } catch(Exception e) {
            e.printStackTrace();
        }
    }
    return null;
}

And then, for the actual JWT handling, I use the auth0 java-jwt library:

public static Optional<String> decodeAuthenticationToken(final String token, final String secret) {
	if(token == null || token.isEmpty()) {
		return Optional.empty();
	}
	
	try {
		Algorithm algorithm = Algorithm.HMAC256(secret);
		JWTVerifier verifier = JWT.require(algorithm)
		        .withIssuer(ISSUER)
		        .build();
		DecodedJWT jwt = verifier.verify(token);
		Claim claim = jwt.getClaim(CLAIM_USER);
		if(claim != null) {
			return Optional.of(claim.asString());
		} else {
			return Optional.empty();
		}
	} catch (IllegalArgumentException | UnsupportedEncodingException e) {
		throw new RuntimeException(e);
	}
}

And, with that in place, it works:

JWT authentication in action

That text is coming from a LotusScript agent - as I mentioned in my original JavaSapi post, this authentication is trusted the same way DSAPI authentication is, and so all elements, classic or XPages, will treat the name as canon.

Because the token is based on the secret specifically from the NSF, using the same token against a different NSF (with no JWT secret or a different one) won't authenticate the user:

JWT ignored by a different endpoint

If we want to be fancy, we can call this scoped access.

This is the sort of thing that makes me want JavaSapi to be officially supported. Custom authentication and request filtering are much, much harder on Domino than on many other app servers, and JavaSapi dramatically reduces the friction.

XPages Jakarta EE 2.5.0 And The Looming Java-Version Wall

May 25, 2022, 2:41 PM

Earlier today, I published version 2.5.0 of the XPages Jakarta EE Support project. It's mostly a consolidation and bug-fix release, but there are few interesting features and notes about the implementation. Plus, as teased in the post title up there, there's a looming problem for the project.

New Features

There are two main new features in this version.

First, I added some configurable CORS support for REST services. Fortunately for me, RESTEasy comes with a CORS filter by default, and it just needs to be enabled. I wired it up using MicroProfile Config to read some values out of xsp.properties:

rest.cors.enable=true                   # required for CORS
rest.cors.allowCredentials=true         # defaults to true
rest.cors.allowedMethods=GET,HEAD       # defaults to all
rest.cors.allowedHeaders=Some-Header    # defaults to all
rest.cors.exposedHeaders=Some-Header    # optional
rest.cors.maxAge=600                    # optional
# allowedOrigins is required, and can be "*"
rest.cors.allowedOrigins=http://foo.com,http://bar.com

I also added support for using the long-standing @WebServlet annotation. Though REST services will generally do what you want, sometimes it's handy to use the lower-level Servlet capability, and now you can do so inline:

@WebServlet(urlPatterns = { "/someservlet", "/someservlet/*", "*.hello" })
public class ExampleServlet extends HttpServlet {
	private static final long serialVersionUID = 1L;
	
	@Inject
	ApplicationGuy applicationGuy;

	@Override
	protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
		resp.setContentType("text/plain");
		resp.getWriter().println("Hello from ExampleServlet. context=" + req.getContextPath() + ", path=" + req.getServletPath() + ", pathInfo=" + req.getPathInfo());
		resp.getWriter().println("ApplicationGuy: " + applicationGuy.getMessage());
		resp.getWriter().flush();
	}
}

Consolidation

There were a couple specs where I had previously either copied the source into the repository (CDI, Mail) or had maintained a local branch fork (NoSQL). Those were always uncomfortable concessions to reality, but I decided to look further into ways to handle that.

For NoSQL, part of it was what I talked about in my last post: using Eclipse Transformer to make use of javax.* compiled binaries and source converted to jakarta.* automatically. But beyond that, it had the same problem that I had forked Mail for. Namely, it hits the same trouble that lots of non-OSGi code does in an OSGi context, where it uses ServiceLoader in a non-extensible way. Though I have an open PR to make use of the pseudo-standard "HK2" ServiceLoader provider, waiting for that would mean continuing the local-build trouble.

Instead, for all of these cases I made use of OSGi's Weaving capability to re-write those parts of the class files on the fly. While this is a bit unfortunate, it works well in practice. The only real down side for now is having to be a bit more careful when bumping the versions in the future, but this type of code changes very rarely.
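
For the curious, such a hook is an OSGi service implementing org.osgi.framework.hooks.weaving.WeavingHook. A minimal sketch - the class name and the actual byte manipulation here are placeholders - looks like:

import org.osgi.framework.hooks.weaving.WeavingHook;
import org.osgi.framework.hooks.weaving.WovenClass;

public class ServiceLoaderFixupHook implements WeavingHook {
    @Override
    public void weave(WovenClass wovenClass) {
        if("com.example.SomeServiceLoaderUser".equals(wovenClass.getClassName())) { //$NON-NLS-1$
            // Placeholder: rewrite the bytecode (e.g. with ASM) to swap the
            // plain ServiceLoader usage for an OSGi-aware lookup
            wovenClass.setBytes(rewrite(wovenClass.getBytes()));
        }
    }

    private byte[] rewrite(byte[] original) {
        // The real transformation would go here
        return original;
    }
}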

The Looming Wall

While this has been going swimmingly, I've started to hit some real impediments with Domino's Java version. The next release of Jakarta EE, version 10, requires Java 11 as a minimum. This is similar to the move Equinox (Domino's OSGi framework of choice) made just under two years ago - a move that has itself bitten me by blocking an upgrade of Tycho to version 2.0 and above. Java 11 is about four years old now, and is no longer even the latest LTS release, so this all makes sense.

I've known this was coming for a while, but incompatible versions of JEE specs and implementations started to trickle in over the past year, leading to me leaving notes for myself about maximum versions. JEE 10 itself is fairly imminent now, so I'll be capped at the ones released with JEE 9 a while ago.

So I've been pondering my options here.

In one sense, I solved this problem years ago. The Domino Open Liberty Runtime project has had the ability to download any version of open-source Java that you want, and I expanded it last year to let you pick from several common flavors. Liberty maintains a breathless pace of advancement, adding official support for Java 18 the month after it came out. If one wants to run JEE apps on Domino, that's the most complete way. However, though it does its job technologically well, it's not exactly a natural fit for Domino developers in its current state.

But I've been considering anew a notion I had years ago, which is to write an extension for Liberty so that it reads class files and resources out of an NSF directly. In some early investigation a bit ago, this started to appear quite doable. In theory, I could write an adapter that would take an incoming request for "foo.nsf" and then read files out of the NSF in the same way XPages does, but instead feeding them to Liberty's runtime. Doing this would essentially implement all remaining JEE and MicroProfile specs in one fell swoop on top of the "any Java version" support, but would add the fault-prone attribute of running a separate process and proxying requests to it. In practice, that setup has proven itself good, but it's certainly more complicated than the "single process on port 80" deal that Domino's HTTP is now.

That route also wouldn't inherently support XPages, which would be something of an impediment to the XPages JEE project's original remit. That's something I've also pondered, and in theory I could make an auto-vivifying version of the XPages Runtime project that grabs all the pertinent XPages bundles from the current server and patches them into the Liberty server as an extension feature, similar to how all the built-in Liberty features work. This could be done, but I'll admit that I balk a bit at the prospect. Though I run XPages outside Domino constantly, it's with full knowledge of the tradeoffs and special considerations. Getting a normal NSF-based XPages app to run in this way would take some additional work.

Anyway, those options could work, but none of them are great. The true fix would naturally be for HCL to move to a newer Java version in Domino's HTTP stack, but I don't control that, so I'll content myself with considering what to do in the mean time. Admittedly, pondering this sort of thing is enjoyable in its own right. Also fortunately, even without tackling this, there's still plenty of stuff in the pile for me to tackle as the fancy strikes me.

Putting Eclipse Transformer To Use In Dependency Wrangling

May 24, 2022, 3:46 PM

Tags: jakartaee java

Setting code aside, the backbone of the XPages Jakarta EE Support project is its dependency pool. In it, I use my fork of the p2-maven-plugin to wrangle all the spec and implementation dependencies. Aside from just collecting them, this file does a ton of work to create and reconfigure their OSGi bundle rules to get everything working on Domino.

There have been limitations, though, and some of them have to do with the Jakarta NoSQL project. Though there are side branches of that project using the jakarta.* namespace, the main master branch is still on javax.* for a couple Jakarta dependencies. Historically, I've dealt with this by running a build locally and deploying it to OpenNTF's Maven server. However, this adds a bit of randomness to the mix: if a snapshot build of NoSQL goes out to the main repository that happens to be newer, then building the dependency repository locally might pick up on that instead, since it's named the same thing.

Transformer

Fortunately, IBM wrote the solution for me: Eclipse Transformer. This Transformer is a rules engine to translate files (Java and related resources, namely) based on configuration - and, while it's generic, it's really designed for the transition from javax.* to jakarta.* namespaces.

It allows you to do these transformations at runtime or (as I'll be doing here) ahead of time, even if you don't have access to the original source. Though I do have access to the source, it's more useful at the moment to act like I don't.

I'd known about the tool and have seen how it's used heavily by both app servers and implementation vendors to be able to support both old- and new-style uses, and so I've kept it in mind for in case the need ever came up. It's a perfect fit for this.

p2-maven-plugin

I considered a couple ways to handle this, but realized the cleanest for now would be to integrate it into the dependency pool generator that I already have, since it fits right in with the OSGi transformations I'm doing.

So I went on over to the p2-maven-plugin fork and got to work. When defining Maven artifacts to bring in, the format looks like this:

<artifact>
    <id>jakarta.servlet:jakarta.servlet-api:4.0.4</id>
    <source>true</source>
</artifact>

Now, Servlet already has a jakarta.* version, but it'll be useful here as an example that avoids the other transformations I'm doing.

My addition is to add a transform configuration option here, with jakarta as the only value for now:

<artifact>
    <id>jakarta.servlet:jakarta.servlet-api:4.0.4</id>
    <source>true</source>
    <transform>jakarta</transform>
</artifact>

...and that'll be it! When that is specified, the code will now run the artifact and its source JAR transparently through Transformer and the version you get in your p2 repository will reflect the transition. And, well, it works perfectly in my case. The resultant NoSQL spec and dependencies are functionally equivalent to the ones in the jakarta.* source branch, but without having to actually change the source files yet. Neat.

Implementation

Though it took a bit to track down the best way to do it, it turned out that Transformer is quite easy to embed into a Java app like the Maven plugin. The majority of the code ends up being effectively Java boilerplate to provide the default values for Jakarta transformation. Truncated, it looks like this:

// "t" is the File for the Maven artifact in the local repository (~/.m2/repository)
String inputFileName = t.getAbsolutePath();
File dest = File.createTempFile(t.getName(), ".jar"); //$NON-NLS-1$
String outputFileName = dest.getAbsolutePath();

// Fetch the stock javax.* -> jakarta.* rules that ship with Transformer
Map<String, String> optionDefaults = JakartaTransform.getOptionDefaults();
Function<String, URL> ruleLoader = JakartaTransform.getRuleLoader();
TransformOptions options = /* build TransformOptions object that reads the above variables */;

Transformer transformer = new Transformer(logger, options);
ResultCode result = transformer.run();
switch(result) {
case ARGS_ERROR_RC:
case FILE_TYPE_ERROR_RC:
case RULES_ERROR_RC:
case TRANSFORM_ERROR_RC:
    throw new IllegalStateException("Received unexpected result from transformer: " + result);
case SUCCESS_RC:
default:
    return dest;
}

There are plenty of options to specify, but that's really about it. Once given the Jakarta defaults, it will do the right thing in the normal case, both for the compiled class files and for the source JAR.

I'm not sure if I'll need it in other cases (NoSQL will move over in its main branch eventually), but it's sure handy here. From time to time, I've run across dependencies that would be useful to include but that target the old JEE specs, and this could do the trick in those cases too.

Poking Around With JavaSapi

May 19, 2022, 4:49 PM

Tags: dsapi java
  1. Poking Around With JavaSapi
  2. Per-NSF-Scoped JWT Authorization With JavaSapi
  3. WebAuthn/Passkey Login With JavaSapi

Earlier this morning, Serdar Basegmez and Karsten Lehmann had a chat on Twitter regarding the desire for OAuth on Domino and their recollections of a not-quite-shipped technology from a decade ago going by the name "JSAPI".

Seeing this chat go by reminded me of some stuff I saw when I was researching the Domino HTTP Java entrypoint last year. Specifically, these guys, which have been sitting there since at least 9.0.1:

JavaSapi class files in com.ibm.domino.xsp.bridge.http

I'd made note of them at the time, since there's a lot of tantalizing stuff in there, but had put them back on the shelf when I found that they seemed to be essentially inert at runtime. For all of IBM's engineering virtues (and there are many), they were never very good at cleaning up their half-implemented experiments when it came time to ship, and I figured this was more of the same.

What This Is

Well, first and foremost, it is essentially a non-published experiment: I see no reference to these classes or how to enable them anywhere, and so everything within these packages should be considered essentially radioactive. While they're proving to be quite functional in practice, it's entirely possible - even likely - that the bridge to this side of things is full of memory leaks and potential severe bugs. Journey at your own risk and absolutely don't put this in production. I mean that even more in this case than my usual wink-and-nod "not for production" coyness.

Anyway, this is the stuff Serdar and Karsten were talking about, named "JavaSapi" in practice. It's a Java equivalent to DSAPI, the API you can hook into with native libraries to perform low-level alterations to requests. DSAPI is neat, but it's onerous to use: you have to compile down to a native library, target each architecture you plan to run on, deploy that to each server, and enable it in the web site config. There's a reason not a lot of people use it.

Our new friend JavaSapi here provides the same sorts of capabilities (rewriting URLs, intercepting requests, allowing for arbitrary user authentication (more on this later), and so forth) but in a friendlier environment. It's not just that it's Java, either: JavaSapi runs in the full OSGi environment provided by HTTP, which means it swims in the same pool as XPages and all of your custom libraries. That has implications.

How To Use It

By default, it's just a bunch of classes sitting there, but the hook down to the core level (in libhttpstack.so) remains, and it can be enabled like so:

set config HTTP_ENABLE_JAVASAPI=1

(the strings command-line utility is useful for turning up flags like this)

Once that's enabled, you should start seeing a line like this on HTTP start:

[01C0:0002-1ADC] 05/19/2022 03:37:17 PM  HTTP Server: JavaSapi Initialized

Now, there's a notable limitation here: the JavaSapi environment isn't intended to be arbitrarily extensible, and it's hard-coded to only know about one service by default. That service is interesting - it's an OAuth 2 provider of undetermined capability - but it's not the subject of this post. The good news is that Java is quite malleable, so it's not too difficult to shim in your own handlers by writing to the services instance variable of the shared JavaSapiEnvironment instance (which you might have to construct if it's not present).
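
As a very rough sketch of that shimming - and keeping in mind that these classes are unpublished, so everything beyond the "services" variable named above is an assumption on my part - a reflection-based registration could look something like this:

import java.lang.reflect.Field;
import java.util.Collection;

public class JavaSapiShim {
    /**
     * Adds a service to the shared environment's "services" variable by
     * reflection. Both parameters are typed as Object because the real
     * classes are internal: "environment" is the JavaSapiEnvironment
     * instance and "service" is your JavaSapiService implementation.
     */
    @SuppressWarnings("unchecked")
    public static void registerService(Object environment, Object service) throws ReflectiveOperationException {
        Field servicesField = environment.getClass().getDeclaredField("services"); //$NON-NLS-1$
        servicesField.setAccessible(true);
        // Assumption: the field holds some mutable Collection of service objects
        ((Collection<Object>) servicesField.get(environment)).add(service);
    }
}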

Once you have that hook, it's just a matter of writing a JavaSapiService instance. This abstract class provides fairly pleasant hooks for the triggers that DSAPI has, and nicely wraps requests and responses in Servlet-alike objects.

Unlike Servlet objects, though, you can set a bunch of stuff on these objects, subject to the same timing and pre-filtering rules you'd have in DSAPI. For example, in the #rawRequest method, you can add or overwrite headers from the incoming request before they get to any other code:

public int rawRequest(IJavaSapiHttpContextAdapter context) {
    context.getRequest().setRequestHeader("Host", "foo-bar.galaxia");
        
    return HTEXTENSION_EVENT_HANDLED;
}

If you want to, you can also handle the entire request outright:

public int rawRequest(IJavaSapiHttpContextAdapter context) {
    if(context.getRequest().getRequestURI().contains("foobar")) {
        context.getResponse().setStatus(299);
        context.getResponse().setHeader("Content-Type", "text/foobar");
        try {
            context.getResponse().getOutputStream().print("hi.");
        } catch (IOException e) {
            e.printStackTrace();
        }
        return HTEXTENSION_REQUEST_PROCESSED;
    }
    
    return HTEXTENSION_EVENT_HANDLED;
}

You probably won't want to, since we're not lacking for options when it comes to responding to web requests in Java, but it's nice to know you can.

You can even respond to custom console commands - here, tell http foo:

public int processConsoleCommand(String[] argv, int argc) {
    if(argc > 0) {
        if("foo".equals(argv[0])) { //$NON-NLS-1$
            System.out.println(getClass().getSimpleName() + " was told " + Arrays.toString(argv));
            return HTEXTENSION_SUCCESS;
        }
    }
    return HTEXTENSION_EVENT_DECLINED;
}
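
With that in place, issuing the command on the server console should produce output along these lines (assuming the service class is named, say, ExampleService, and that argv contains the words following "http"):

> tell http foo bar
ExampleService was told [foo, bar]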

So that's neat.

The fun one, as it usually is, is the #authenticate method. One of the main reasons one might use DSAPI in the first place is to provide your own authentication mechanism. I did it years and years ago, Oracle did it for their middleware, and HCL themselves did it recently for the AppDev Pack's OAuth implementation.

So you can do the same here, like this super-secure implementation:

public int authenticate(IJavaSapiHttpContextAdapter context) {
    context.getRequest().setAuthenticatedUserName("CN=Hello From " + getClass().getName(), getClass().getSimpleName());
    return HTEXTENSION_REQUEST_AUTHENTICATED;
}

The cool thing is that this has the same characteristics as DSAPI: if you declare the request authenticated here, it will be fully trusted by the rest of HTTP. That means not just Java - all the classic stuff will trust it too:

Screenshot showing JavaSapi authentication in action

Conclusion

Again: this stuff is even further from supported than the usual components I muck around in, and you shouldn't trust any of it to work more than you can actively observe. The point here isn't that you should actually use this, but more that it's interesting what things you can find floating around the Domino stack.

Were this to be supported, though, it'd be phenomenally useful. One of Domino's stickiest limitations as an app server is the difficulty of extending its authentication schemes. It's always been possible to do so, but DSAPI is usually prohibitively difficult unless you either have a bunch of time on your hands or a strong financial incentive to use it. With something like this, you could toss Apache Shiro in there as a canonical source of user authentication, or maybe add in Soteria - the Jakarta Security implementation - to get per-app authentication.
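
To give a flavor of what that could look like, here's a purely illustrative sketch of delegating authentication to some external verifier. TokenVerifier is a hypothetical stand-in for whatever library would do the real validation (Shiro, a JWT implementation, and so forth), and I'm assuming the Servlet-alike request wrapper exposes a getHeader method:

public int authenticate(IJavaSapiHttpContextAdapter context) {
    // Assumption: the Servlet-alike request object has a getHeader method
    String authHeader = context.getRequest().getHeader("Authorization"); //$NON-NLS-1$
    if(authHeader != null && authHeader.startsWith("Bearer ")) { //$NON-NLS-1$
        String token = authHeader.substring("Bearer ".length()); //$NON-NLS-1$
        // TokenVerifier is hypothetical: it would return a distinguished name
        // for a valid token, or null for an invalid one
        String userName = TokenVerifier.verify(token);
        if(userName != null) {
            context.getRequest().setAuthenticatedUserName(userName, "BearerAuth"); //$NON-NLS-1$
            return HTEXTENSION_REQUEST_AUTHENTICATED;
        }
    }
    return HTEXTENSION_EVENT_DECLINED;
}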

There's also that OAuth 2 thing floating around in there, which does have a usable extension point, but I think it's fair to assume that it's unfinished.

This is all fun to tinker with, though, and sometimes that's good enough.

Upcoming Sessions With OpenNTF and at Engage

May 11, 2022, 8:48 AM

Tags: engage openntf

This month contains a few presentations and sessions of note for me, and I realized I should compile a list.

To start out with, I'll be the second presenter at next week's OpenNTF webinar, which will be about the Domino One-Touch Setup capability from V12 onward. The first leg of the presentation will feature Roberto Boccadoro covering its use in the standard case of easing the lives of administrators, while my section will cover more developer-centric needs, particularly my use of it as a tool for repeatable integration-test suites of Domino code. This webinar will take place next week, on May 19th, and you can register for it at https://attendee.gotowebinar.com/register/2937214894267353356.

While I won't personally be attending Engage this year, it's shaping up to be a good conference and OpenNTF and Jakarta EE will be well represented (in chronological order):

  • Ro05. Happy Birthday, OpenNTF! Now What? - on Tuesday, Serdar Basegmez (and possibly other OpenNTF directors) will run our traditional community roundtable, where you can hear about what OpenNTF has been up to and provide ideas about where you'd like us to go.
  • De20. Domino apps CI with Docker - also on Tuesday, Martin Pradny will discuss the use of the NSF ODP Tooling project within a Docker container to automate NSF building within a CI infrastructure in a smooth and reliable way.
  • De17. Domino + Jakarta EE = AppDev In Heaven - on Wednesday morning, Daniele Vistalli will cover the XPages Jakarta EE Support project, going over the myriad Jakarta and MicroProfile specifications that it brings to Domino development and showing how you can bring them to bear to fix a lot of long-standing Domino dev limitations.

If you're attending Engage, I definitely suggest you check out all of those.