Maven Native Chronicles, Part 3: Improving Native Artifact Handling

Sun Jul 26 21:38:37 EDT 2015

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

This post isn't so much a part of the current series as it is a followup to a post from the other week, but I can conceptually retcon that one in as a prologue. This will also be a good quick tip for dealing with Maven projects.

In my previous post, I described how I copied the built native shared library from the C++ project into the OSGi fragments for distribution, and I left it with the really hacky approach of copying the file using a project-relative path that reached up into the other project. It technically functioned, but it relied on the specific project structure, which wouldn't survive any reorganization or breaking up of the module tree.

To improve it, I reworked it to be a bit more Maven-y, which involves two steps: attaching the built artifacts to the output of the native project and then using the dependency plugin to copy the native artifacts in as needed. For the first step, I used the build-helper-maven-plugin, though there may be other ways to do it. This is relatively straightforward, though:

<plugin>
	<groupId>org.codehaus.mojo</groupId>
	<artifactId>build-helper-maven-plugin</artifactId>
	<version>1.3</version>
	<executions>
		<execution>
			<id>attach-artifacts</id>
			<phase>package</phase>
			<goals>
				<goal>attach-artifact</goal>
			</goals>
			<configuration>
				<artifacts>
					<artifact>
						<file>${project.basedir}/x64/Debug/nativelib-win32-x64.dll</file>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifact>
					<artifact>
						<file>${project.basedir}/Win32/Debug/nativelib-win32-x86.dll</file>
						<type>dll</type>
						<classifier>win32-x86</classifier>
					</artifact>
				</artifacts>
			</configuration>
		</execution>
	</executions>
</plugin>

This causes the native libraries - so far, the two Windows ones - to be included in the Maven repository during installation, and then to be accessible from other projects. The files are named using the module's base name plus the classifier, with the type as the file extension, like native-project-name-win32-x64.dll (the installed copy in the repository also includes the version).
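
As a side benefit, other modules can also consume an attached artifact as an ordinary dependency by pairing the classifier and type, along these lines (the coordinates here are hypothetical):

<dependency>
	<groupId>com.example</groupId>
	<artifactId>native-project-name</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	<type>dll</type>
	<classifier>win32-x64</classifier>
</dependency>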

To get that artifact into the OSGi bundle project, I then use maven-dependency-plugin to copy it in. Here I reference it via the module name and the classifier/type pair used above (with some shorthands, since they're in the same multi-module project):

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<version>2.10</version>
	
	<executions>
		<execution>
			<id>copy-native-lib</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>copy</goal>
			</goals>
			<configuration>
				<artifactItems>
					<artifactItem>
						<groupId>${project.groupId}</groupId>
						<artifactId>native-project-name</artifactId>
						<version>${project.version}</version>
						<type>dll</type>
						<classifier>win32-x64</classifier>
					</artifactItem>
				</artifactItems>
				<outputDirectory>lib</outputDirectory>
				<stripVersion>true</stripVersion>
			</configuration>
		</execution>
	</executions>
</plugin>

The net result here is the same as previously, but should be more maintainable.

Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node

Sun Jul 26 11:16:50 EDT 2015

Tags: maven
  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

Before I get to the meat of this post, I want to point out that Ulrich Krause wrote a post on a similar topic today and you should read it.

The build process I've been working with involves a Jenkins server running on OS X (in order to build iOS binaries), and so it will be useful to have a Windows instance set up as well to run native builds and, importantly, tests. Jenkins comes with support for distributed builds and makes it relatively straightforward.

To start with, I installed VirtualBox and went through the usual Windows setup process - it shouldn't matter too much which major version of Windows you use, as long as it's 64-bit, so that it can generate and test both types of binaries. Once that was running, I installed the latest 64-bit JDK followed by Visual Studio Community, which is a pretty smooth process (for all their faults, Microsoft knows how to treat developers). To provide access to the VM from the Mac host, I added a second network adapter to the VM and set it to host-only networking.

During this process, I found Jump Desktop to be a very useful tool. Since the Mac host runs SSH, I was able to set up an RDP connection to the Windows VM using an SSH tunnel, which Jump does transparently for you. This made for a much better experience than VNCing into the Mac and controlling Windows in the VirtualBox window there.

Next, I decided that the route I wanted to take to control the Windows slave was SSH, since SSH is the bee's knees. I installed Cygwin, which creates a fairly Unix-like environment on top of Windows, and included OpenSSH in the process. After going through the afore-linked setup process, I had SSH access to the Windows machine (including, thanks to SSH proxying, remote access via the primary build server). On the Jenkins side on the Mac, I installed the "Cygpath plugin" (which is in the built-in plugin manager) to avoid any of the issues mentioned on the wiki page. The configuration in Jenkins is relatively straightforward (I will probably end up changing the base directory to be a clean Jenkins home, since I hadn't initially been sure if I needed Jenkins installed on the slave).

With that, I was able to set the build to run on servers with the "windows" label, kick it off, and start going through its complaints until I had it working.

First off, I had some more Java setup to do, specifically creating a system environment variable named JAVA_HOME and setting it to the root of the JDK ("C:\Program Files\Java\jdk1.8.0_51" in this case). Then I set up Maven, which is something of an awkward process on Windows, but not TOO bad. I downloaded the latest binaries, unzipped them to "C:\Program Files\maven", and added an M2_HOME environment variable pointing to that folder.

I also added %M2_HOME%\bin;C:\Program Files (x86)\MSBuild\12.0\Bin to the end of the PATH variable, to cover both the Maven tools and the msbuild executable for later.

I ran into a bit of weirdness when it came to setting up configuration for SSH and Maven, specifically because it seems that Cygwin has two home folders for the logged-in user: the Unix-style /home/jesse and the normal Windows C:\Users\jesse (which is available in Cygwin as /cygdrive/c/Users/jesse). Since this Jenkins build checks out the code from GitHub via SSH, I needed to copy over the id_rsa file for the Jenkins user: this went into /home/jesse/.ssh/id_rsa. In order to configure Maven, though, the settings file went to C:\Users\jesse\.m2\settings.xml.

Eventually, it slogged its way through the build to completion, including a successful run of the integration tests. I still need to figure out the best way to get the resultant artifacts back out (or maybe it will be best to just deploy from both to the same Artifactory server), but this seems to do the main task for me.

Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin

Fri Jul 24 15:48:59 EDT 2015

  1. Maven Native Chronicles, Part 1: Figuring Out nar-maven-plugin
  2. Maven Native Chronicles, Part 2: Setting Up a Windows Jenkins Node
  3. Maven Native Chronicles, Part 3: Improving Native Artifact Handling
  4. Maven Native Chronicles: Running Automated Notes-based Tests

As I mentioned the other day, my work lately involves a native shared library that is then included in an OSGi plugin. To get it working during a Maven compile, I just farmed out the actual build process to Visual Studio's command-line project builder. That works as far as it goes, but it's not particularly Maven-y and, more importantly, it's Windows-only.

In looking around, it seems like the most popular method of doing native compilation in Maven, especially with JNI components, is nar-maven-plugin - "nar" means "Native ARchive", and it's meant to be a consistent way to package native artifacts (executables and libraries) across platforms. It does an admirable job wrangling the normally-loose nature of a C/C++ program to work with Maven-ish standards and attempts to paper over the differences between platforms and toolchains. I'm not entirely convinced that this will be the way I go long-term (in particular, its attitude towards multi-platform/arch builds seems to be "eh, sort of?"), but it's a good place to get started with non-Windows compilation.

The first step was to move the files around to mostly match a Maven-style layout. Starting out, the .cpp and .h files were in the src folder directly, while dependency headers were in a dependencies folder next to it. I left the Notes includes in there for now, but it seems that nar-maven-plugin will cover the JNI stuff for me, so I could simplify that somewhat. The new project structure looks like:

  • (project root)
    • src
      • main
        • c++
        • include
    • dependencies
      • inc
        • notes

Next was to set up the project configuration. For now, I still want to use Visual Studio's CLI app to build the Windows version, and I'm going to have to specifically define supported platforms anyway, so I define the project as a nar but then disable actual execution of the plugin by default:

<project>
	...
	<packaging>nar</packaging>
	
	<build>
		<plugins>
			<plugin>
				<groupId>com.github.maven-nar</groupId>
				<artifactId>nar-maven-plugin</artifactId>
				<version>3.2.3</version>
				<extensions>true</extensions>
				
				<configuration>
					<skip>true</skip>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

Then, much as I did for the Windows-specific builds, I added a profile to try to build on my Mac. Note that these build settings produce a library that fails all unit tests, so they're surely not correct, but hey, it compiles and links, so that's a start. To ensure that it only builds when it has an appropriate context, it is triggered by a combination of OS family and the presence of the notes-program Maven property, which should point to the Notes executable directory.

<project>
	...
    
	<profiles>
		...
		<profile>
			<id>mac</id>
		
			<activation>
				<os>
					<family>mac</family>
				</os>
				<property>
					<name>notes-program</name>
				</property>
			</activation>
	
			<build>
				<plugins>
					<plugin>
						<groupId>com.github.maven-nar</groupId>
						<artifactId>nar-maven-plugin</artifactId>
						<extensions>true</extensions>
			
						<configuration>
							<skip>false</skip>
				
							<cpp>
								<debug>true</debug>
								<includePaths>
									<includePath>${project.basedir}/src/main/include</includePath>
									<includePath>${project.basedir}/dependencies/inc/notes</includePath>
								</includePaths>
					
								<options>
									<option>-DMAC -DMAC_OSX -DMAC_CARBON -D__CF_USE_FRAMEWORK_INCLUDES__ -DLARGE64_FILES -DHANDLE_IS_32BITS -DTARGET_API_MAC_CARBON -DTARGET_API_MAC_OS8=0 -DPRODUCTION_VERSION -DOVERRIDEDEBUG</option>
								</options>
							</cpp>
							<linker>
								<options>
									<option>-L${notes-program}</option>
								</options>
								<libSet>notes</libSet>
							</linker>
				
							<libraries>
								<library>
									<type>shared</type>
								</library>
							</libraries>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

Unstable though the result may be, the nar plugin does its job: it produces an archive containing the dylib, suitable for distribution as a Maven artifact and extraction into the downstream project, which I'll go into later.
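
For reference, the gist of the nar convention on the consuming side is to declare the dependency with the nar type and let the plugin handle downloading and unpacking - something along these lines (coordinates hypothetical):

<dependency>
	<groupId>com.example</groupId>
	<artifactId>native-project-name</artifactId>
	<version>1.0.0-SNAPSHOT</version>
	<type>nar</type>
</dependency>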

So this is a good step towards my final goal. As I mentioned, I may end up getting rid of nar-maven-plugin specifically, but this is a good way to shape the code into something more portable (I also got rid of a few Windows-isms in the C++ while I was at it). My ultimate goal is to get a single build run that produces artifacts for all of the important platforms (Windows 32/64 and Linux 32/64 for production, Mac 32/64(?) for JUnit tests during development). I may be able to accomplish that using the nar plugin with a distributed Jenkins build, or I may be able to do it with Makefiles and GCC cross-compilers on an OS X build host. If that works, it's the sort of thing that makes all this Maven stuff worthwhile.

Adding Components to an XPage Programmatically

Sun Jul 19 09:16:35 EDT 2015

Tags: xpages java

One of my favorite aspects of working with apps using my framework is the component binding capability. This lets me just write the main structure of the page and let the controller do the grunt work of creating fields with validators and converters. There's a lot of magic behind the scenes to make it happen, but the core concept of dynamic component creation is relatively straightforward.

An XPage is a tree of components, and those components are all Java objects on the back end, which can be manipulated and added or removed programmatically. To demonstrate, I'll start with this basic XPage:

<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core" beforePageLoad="#{controller.beforePageLoad}" afterPageLoad="#{controller.afterPageLoad}">
	<xp:div id="container">
	</xp:div>
</xp:view>

Now, I'll add a basic form table using the afterPageLoad method in the controller class:

package controller;

import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;

import com.ibm.xsp.component.UIViewRootEx2;
import com.ibm.xsp.component.xp.XspInputText;
import com.ibm.xsp.extlib.component.data.UIFormLayoutRow;
import com.ibm.xsp.extlib.component.data.UIFormTable;
import com.ibm.xsp.extlib.util.ExtLibUtil;
import com.ibm.xsp.util.FacesUtil;

import frostillicus.xsp.controller.BasicXPageController;

public class home extends BasicXPageController {
	private static final long serialVersionUID = 1L;

	@SuppressWarnings("unchecked")
	@Override
	public void afterPageLoad() throws Exception {
		super.afterPageLoad();

		UIViewRootEx2 view = (UIViewRootEx2)ExtLibUtil.resolveVariable(FacesContext.getCurrentInstance(), "view");
		UIComponent container = FacesUtil.findChildComponent(view, "container");

		UIFormTable formTable = new UIFormTable();
		formTable.setFormTitle("Some Form");
		formTable.setStyle("margin: 2em; width: 20em");
		container.getChildren().add(formTable);
		formTable.setParent(container);

		UIFormLayoutRow formRow = new UIFormLayoutRow();
		formRow.setLabel("Name");
		formTable.getChildren().add(formRow);
		formRow.setParent(formTable);

		XspInputText inputText = new XspInputText();
		formRow.getChildren().add(inputText);
		inputText.setParent(formRow);
	}
}

There are a few concepts to get a handle on here, but fortunately they're not as esoteric as other aspects of back-end XPages development.

To start out with, there's the question of how you're supposed to know which classes the components are. The best way to find out is to create a basic XPage containing the control with an ID, then go to the Package Explorer view, open the "Local" source folder, and find the class file for your page in the "xsp" package. In there, you can see the code that actually generates the XPage (which is doing basically the same thing as we're doing here). Look for the ID you gave the control on the page and you can find the class behind it.

Next is the job of finding the parent control you're going to attach the new components to. In this case, I used the FacesUtil class to search for the component by its ID, because I know it's the only one on the page with that base ID. This tack will usually do the trick for you, but there are other ways to find it, such as binding or XspQuery.
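
If you don't want to rely on FacesUtil, a basic recursive search by base ID is also easy to write yourself. This is a minimal sketch of the concept, not a full replacement for FacesUtil's behavior (it assumes the same javax.faces.component.UIComponent import as the controller above):

@SuppressWarnings("rawtypes")
public static UIComponent findByBaseId(UIComponent root, String id) {
	// Check the current component, then recurse into facets and children
	if (id.equals(root.getId())) {
		return root;
	}
	for (java.util.Iterator children = root.getFacetsAndChildren(); children.hasNext();) {
		UIComponent found = findByBaseId((UIComponent) children.next(), id);
		if (found != null) {
			return found;
		}
	}
	return null;
}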

Finally, there's the need to both add the new child to the parent's children list and to set the parent in the child itself. This is a bit of housekeeping boilerplate, but it has to be done.
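
Since that two-step dance applies to every dynamically-added control, it can be worth wrapping it in a small utility method - a trivial sketch:

// Attaches a child control to a parent, handling both sides of the relationship
@SuppressWarnings("unchecked")
public static void addChild(UIComponent parent, UIComponent child) {
	parent.getChildren().add(child);
	child.setParent(parent);
}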

Once you have this code running, you get a basic form titled "Some Form" (using the Bootstrap 3.2.0 theme in this case).

This can get much more in-depth: anything you can do declaratively in the XPage XML, you can do programmatically in Java. You can also manipulate existing components: change their properties, rearrange them, or remove them from the tree entirely. I've found this knowledge to be useful both as another tool in the toolbox and as a way to really clear up the mental model of what's going on on the page.

Quick-and-Dirty Inclusion of a Visual C++ Project in a Maven Build

Sat Jul 11 19:26:34 EDT 2015

Tags: maven jni

One of my projects lately makes use of a JNI library distributed via an OSGi plugin. The OSGi side of the project uses the typical Maven+Tycho combination for its building, but the native library was developed using Visual C++. This is workable enough, but ideally I'd like to have the whole thing be part of one smooth build: compile the native library, and then copy the resultant 32- and 64-bit shared libraries into the OSGi plugins.

From what I've gathered, the "proper" way to do this sort of setup is to use the nar-maven-plugin, which is intended to wrap around the normal compilers for each platform and handle packaging and access to the libraries and related components. I tinkered with this a bit but ran into a lot of trouble trying to get it to work properly, no doubt due to my extremely limited knowledge of C++ toolchains combined with the natural weirdness of Windows's development environment.

For now, I decided to do it the "ugly" way that nonetheless gets the job done: just run the Visual C++ toolchain from Maven. Fortunately, Microsoft includes a tool called msbuild for this purpose: if you run it in the directory of a Visual C++ project, it will act like the full IDE. I added its executables to my PATH (C:\Program Files (x86)\MSBuild\12.0\bin) and then used a Maven plugin called exec-maven-plugin to launch it (the Ant plugin would also work, but this is more explicit). Since this will only run on Windows, I wrapped it in a triggered profile and added two executions to cover both 32-bit and 64-bit versions:

<project>
	...
	<packaging>pom</packaging>
	...
	
	<profiles>
		<profile>
			<id>windows-x64</id>
		
			<activation>
				<os>
					<family>windows</family>
					<arch>amd64</arch>
				</os>
			</activation>
			
			<build>
				<plugins>
					<plugin>
						<groupId>org.codehaus.mojo</groupId>
						<artifactId>exec-maven-plugin</artifactId>
						<version>1.4.0</version>
						<executions>
							<execution>
								<id>build-x86</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>Win32</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
							<execution>
								<id>build-x64</id>
								<phase>generate-sources</phase>
								<goals>
									<goal>exec</goal>
								</goals>
								<configuration>
									<environmentVariables>
										<Platform>X64</Platform>
									</environmentVariables>
									<executable>msbuild</executable>
								</configuration>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>
	</profiles>
</project>

The project itself remains configured in Visual Studio. While the source files are certainly modifiable in Eclipse, it won't have the full C/C++ toolchain environment until I figure out a proper way to set that up. But this does indeed do the trick: it creates the two DLLs in the same way as when I had been building them in the IDE.

The next step is to automatically include these in the appropriate OSGi fragment projects. For this, at least for now, I'm using the maven-resources-plugin. This configuration depends on the structure of the Maven projects, which is sort of fragile, but it's not too bad when they're in the same overall project. This is the config for the x64 plugin, and there is a separate x86 project with an almost-identical configuration:

<project>
	...
	<build>
		<plugins>
			...
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-resources-plugin</artifactId>
				<version>2.7</version>
				
				<executions>
					<execution>
						<id>copy-native-lib</id>
						<phase>generate-resources</phase>
						<goals>
							<goal>copy-resources</goal>
						</goals>
						<configuration>
							<resources>
								<resource>
									<directory>${project.basedir}/../../native-project-name/x64/Debug/</directory>
									<includes>
										<include>nativelib-win32-x64.dll</include>
									</includes>
								</resource>
							</resources>
							<outputDirectory>${project.basedir}/lib</outputDirectory>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

The result is that, at least when I build on Windows, everything is properly compiled and put in its right place. When running in my normal Mac dev environment, it uses the built libraries that have previously been copied into the plugin, so it still works well enough.

This is still a far cry from an optimal configuration. The requirement of using Visual Studio is cumbersome, which means that any multi-platform build will require a redundant config (whether in the pom or in a separate Makefile), and this current setup isn't properly "Mavenized": the output doesn't go into the "target" folder and the DLLs aren't tagged for inclusion in the installed Maven repo. It suits the purpose, though, of being an intermediate step in a larger build.

My long-term desire is to get this fully cross-platform and automated on a build server. That will involve a lot of learning about the nar-maven-plugin (or Makefiles) as well as either setting up a cross-compilation infrastructure or a series of Jenkins slaves. In theory, an OS X system can have everything it would need to build for the other platforms itself, but I've gathered that the safest way to do it is with the "multiple Jenkins nodes" route. When I develop an improved build system for this, I'll write followup posts.

Working with Rich Text's MIME Structure

Wed Jul 08 20:28:19 EDT 2015

Tags: mime

My work lately has involved, among other things, processing and creating MIME entities in the format used by Notes for storage as rich text. This structure isn't particularly complicated, but there are some interesting aspects to it that are worth explaining for posterity. Which is to say, myself when I need to do this again.

As a quick primer, MIME is a format originally designed for email which has proven generally useful, including for HTTP and, for our needs, internal storage in NSF. Like many things in programming, it is organized as a tree, with each node consisting of a set of headers (generally, things like "Content-Type: text/html"), content, and children.

Domino stores the text part of rich text in MIME as HTML. In the simplest case, this ends up a one-element "tree", which you can see in the document's properties dialog:

Content-Type: text/html; charset="US-ASCII"

<font size=2 face="sans-serif">Hello <b>there</b></font>

There's slightly more to its full storage implementation (like the MIME_Version item), but the MIME Part items are the important bits. This simple structure can be abstracted to this tree:

  • text/html

Things get a little more complicated when you add embedded images and/or attachments. When you do either of those, the MIME grows to multiple items and becomes a multi-node tree.

Embedded Images

When you add an embedded image in the rich text field, the storage grows to four same-named MIME Part items. Concatenated (and clipped for brevity), the items then look like:

Content-Type: multipart/related; boundary="=_related 006CEB9D85257E7C_="

This is a multipart message in MIME format.

--=_related 006CEB9D85257E7C_=
Content-Type: text/html; charset="US-ASCII"

<font size=3>Here's a picture:</font>
<br>
<br><img src=cid:_2_0C1832A80C182E18006CEB9885257E7C style="border:0px solid;">
<br>
<br><font size=3>Done.</font>

--=_related 006CEB9D85257E7C_=
Content-Type: image/jpeg
Content-ID: <_2_0C1832A80C182E18006CEB9885257E7C>
Content-Transfer-Encoding: base64

*snip*

--=_related 006CEB9D85257E7C_=--

You can see the same sort of HTML block as before contained in there, but it sprouted a lot of other stuff. To begin with, the starting part turned into "multipart/related". The "multipart" denotes that the top MIME entity has children, and the "related" is used when the children consist of an HTML body and inline images. There are delimiters used to separate each part, using the auto-generated convention of "related" plus an effectively-random number. The image itself is represented as a MIME Part of its own, in this case stored inline and Base64-encoded (it can be shifted off to an attachment by Notes/Domino after a certain size). This structure can be abstracted to:

  • multipart/related
    • text/html
    • image/jpeg

The HTML is designed so that there is an image tag that references the attached image using a "cid" URL, an email convention that basically means "find the entity in this related MIME structure with the following content ID" - you can then see the content ID reflected in the JPEG MIME Part. This sort of URL doesn't fly on the web, so anything displaying this field on a web page (or otherwise converting it to a non-MIME storage format) needs to translate that reference to something appropriate for its needs.*
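
To make that structure concrete, here's a sketch that builds the same shape with the standard javax.mail API. This isn't how Notes itself generates the items, and the content ID here is made up, but writing the message out shows the same multipart/related layout, boundaries and all:

import java.util.Properties;

import javax.activation.DataHandler;
import javax.mail.Session;
import javax.mail.internet.MimeBodyPart;
import javax.mail.internet.MimeMessage;
import javax.mail.internet.MimeMultipart;
import javax.mail.util.ByteArrayDataSource;

public class RelatedExample {
	public static void main(String[] args) throws Exception {
		// The multipart/related root, housing the HTML body and its inline image
		MimeMultipart related = new MimeMultipart("related");

		// The text/html part, referencing the image via a cid URL
		MimeBodyPart html = new MimeBodyPart();
		html.setContent("<img src=\"cid:exampleimage\">", "text/html; charset=\"US-ASCII\"");
		related.addBodyPart(html);

		// The image/jpeg part, carrying the matching Content-ID header
		// (empty bytes here, standing in for real image data)
		MimeBodyPart image = new MimeBodyPart();
		image.setDataHandler(new DataHandler(new ByteArrayDataSource(new byte[0], "image/jpeg")));
		image.setContentID("<exampleimage>");
		related.addBodyPart(image);

		// Wrap it in a message and print it to see the boundaries and headers
		MimeMessage message = new MimeMessage(Session.getInstance(new Properties()));
		message.setContent(related);
		message.saveChanges();
		message.writeTo(System.out);
	}
}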

Attachments

When you have a rich text field with an attachment (in this case without the embedded image), you get a very similar structure:

Content-Type: multipart/mixed; boundary="=_mixed 006EBF7C85257E7C_="

This is a multipart message in MIME format.

--=_mixed 006EBF7C85257E7C_=
Content-Type: text/html; charset="US-ASCII"

<font size=3>Here's an attachment: <br>
</font>
<br>
<br><font size=3><br>
Done. </font>

--=_mixed 006EBF7C85257E7C_=
Content-Type: application/octet-stream; name="cert.cer"
Content-Disposition: attachment; filename="cert.cer"
Content-Transfer-Encoding: binary

cert.cer

--=_mixed 006EBF7C85257E7C_=--

The structure is the same sort of tree as previously, but the "related" content sub-type has changed to "mixed". This indicates that there are multiple types of content, but they're conceptually distinct. In any event, the tree looks like:

  • multipart/mixed
    • text/html
    • application/octet-stream

"application/octet-stream" is a generic MIME type for, basically, "bag of bytes" - MIME-based tools use it when they either don't know the content type or, as in this case, don't care. In this case, Notes/Domino splits out the content to be an NSF-style attachment and then references that in the MIME - this is an implementation detail, though, as the API returns the value regardless.

This also highlights a minor limitation in rich text storage: attachments do not have an inline representation in the HTML, and so they are always moved to the end of the field in Notes. At first, I was peeved by this limitation, but it makes a sort of sense: cid references are really about images, and I guess Lotus didn't want to override that for use in normal link elements.

That brings us to the final potential structure you're likely to run across:

Embedded Images And Attachments

When you include both embedded images and attachments, things get slightly more complicated. I'll skip the raw MIME and go straight to the tree:

  • multipart/mixed
    • multipart/related
      • text/html
      • image/jpeg
    • application/octet-stream

So this becomes a combination of the two formats, and a bit of logic emerges. In Notes's structure, "multipart/mixed" always contains two or more children, and the first one is the textual body, whatever form that may take. One of those forms is just a single-part "text/html", and the other is a "multipart/related" subtree containing the "text/html" and one or more images.
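
That rule also makes programmatic extraction straightforward. As a sketch against the Notes Java API (assuming the usual lotus.domino.MIMEEntity tree-walking methods and eliding session setup), locating the text/html part is a short recursive walk:

import lotus.domino.MIMEEntity;
import lotus.domino.NotesException;

public class MimeBodyLocator {
	// Finds the first text/html entity in the tree, which per the structure
	// above is the body whether or not images and attachments are present
	public static MIMEEntity findHtmlBody(MIMEEntity entity) throws NotesException {
		if ("text".equalsIgnoreCase(entity.getContentType())
				&& "html".equalsIgnoreCase(entity.getContentSubType())) {
			return entity;
		}
		for (MIMEEntity child = entity.getFirstChildEntity(); child != null; child = child.getNextSibling()) {
			MIMEEntity found = findHtmlBody(child);
			if (found != null) {
				return found;
			}
		}
		return null;
	}
}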


Once you get a feel for these structures, it makes the task of reading and creating Notes-alike MIME items much less daunting. There are a number of other concerns I've been dealing with as well (such as the conversion of composite-data rich text to HTML and how there are two ways to do it), and maybe I'll make a followup post at some point about those.


* As a minor note on this point, it's an area where the Notes client and XPages diverge slightly. The Notes client (which generated the example above) leaves inline images "nameless" - they contain no "Content-Disposition" header and no name in the "Content-Type", instead sticking with just the "Content-ID" for identification. With XPages, however, presumably because it has filename information during the upload process, the result still contains (and is referenced by) the "Content-ID" value, but it also contains a line like:

Content-Disposition: inline; filename="foo.jpg"

This functions the same way for most purposes, but it may be significant. For example, if you happen to write processing code that uses the presence or absence of the "Content-Disposition" header as an indicator of whether an entity is an attachment, knowing this ahead of time could save you a certain amount of headache. The right way to do it is to see if the header is either missing or has a basic value of "inline" instead of "attachment".
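
As a sketch of that rule using the same hedged Notes API assumptions as above, the check might look like:

import lotus.domino.MIMEEntity;
import lotus.domino.MIMEHeader;
import lotus.domino.NotesException;

public class DispositionUtil {
	// Treats an entity as an attachment only when Content-Disposition is
	// present and starts with "attachment"; missing or "inline" means not one
	public static boolean isAttachment(MIMEEntity entity) throws NotesException {
		MIMEHeader disposition = entity.getNthHeader("Content-Disposition");
		if (disposition == null) {
			return false;
		}
		String value = disposition.getHeaderVal();
		return value != null && value.trim().toLowerCase().startsWith("attachment");
	}
}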