First Steps to Code Coverage Analysis in Domino Plugins
Thu Nov 09 08:53:04 EST 2017
I'm always interested in getting the computer to tell me how to tell it what to do more successfully, and, to further that pursuit, I've started taking an interest in code coverage.
If you're not familiar with the term, "code coverage" refers to reporting on which lines of code were actually executed during runtime, most commonly in association with unit tests. Eclipse (and presumably other IDEs) has support for this, and I've decided to give it a shot.
Since I'm starting this out in the context of Domino plugins, there are more wrinkles than in most tutorials. Namely, the test suites I've written run exclusively through Maven instead of the Eclipse UI due to all the Notes environment setup, so I can't just use the normal UI tools to gather the data. Fortunately, Eclipse's EclEmma will work just fine with the output from a Maven project, as long as you configure it properly. I looked around for a while to find the right combination of tools to use, but it ended up being fairly simple to configure basic output that can be consumed in Eclipse's Coverage view.
There are two main additions. First, add the jacoco-maven-plugin to your root project's project.build.plugins block:
<plugin>
	<groupId>org.jacoco</groupId>
	<artifactId>jacoco-maven-plugin</artifactId>
	<version>0.7.8</version>
	<executions>
		<execution>
			<goals>
				<goal>prepare-agent</goal>
			</goals>
		</execution>
	</executions>
</plugin>
In normal cases, that would suffice. However, since the test configuration I have for Notes overrides the argLine property of the Tycho test runner, there's another step: for Tycho test plugins, the prepare-agent goal stores its agent settings in the tycho.testArgLine property rather than argLine, so that property has to be included manually in any block that overrides argLine - such as in the Windows profile:
<profile>
	<activation>
		<os>
			<family>Windows</family>
		</os>
		<property>
			<name>notes-program</name>
		</property>
	</activation>
	<build>
		<plugins>
			<plugin>
				<groupId>org.eclipse.tycho</groupId>
				<artifactId>tycho-surefire-plugin</artifactId>
				<version>${tycho-version}</version>
				<configuration>
					<skip>false</skip>
					<argLine>${tycho.testArgLine} -Dfile.encoding=UTF-8 -Djava.library.path="${notes-program}"</argLine>
					<environmentVariables>
						<PATH>${notes-program}${path.separator}${env.PATH}</PATH>
					</environmentVariables>
				</configuration>
			</plugin>
		</plugins>
	</build>
</profile>
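For reference, a coverage-gathering run is then just a normal Maven build with the notes-program property pointed at a local Notes installation - the path here is only an example:

mvn clean install -Dnotes-program="C:\Program Files (x86)\IBM\Notes"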
Once that's configured, running the test suite via Maven will create a new file in the target folder of the test plugin: jacoco.exec. This file can then be consumed in Eclipse by opening the "Coverage" view:
In that view, right-click and choose "Import Session...", then point to the data file. Click "Next" and check the projects and source folders from your workspace that you're interested in analyzing. When you click "Finish", it'll do two things. First, it'll fill the Coverage view with statistics from your run:
(We have a lot of work to do fleshing out our test suites for this one)
Second, it'll start highlighting your code to show which lines are executed, which branches are only partially covered, and which lines are skipped entirely. For example (ignore the sickly color scheme - I need to work on that):
This shows how several of the if branches are only tested in one direction, while the "Faces" block is skipped entirely. That also shows some of the trouble with testing XPages-run code: the Tycho environment can't reproduce the XPages environment fully, so some branches simply aren't reachable that way. I haven't looked into the possibility of gathering similar data from JUnit for XPages tests, but perhaps that's possible.
For now, though, this will have to do. And, like the other "code improvement" techniques I've integrated lately, it comes with a lot of potential tedium - deciding whether it's worth writing a test for code that will obviously always work just to improve the highlighting, or better to focus only on the low-hanging fruit - but I expect it will be a nice addition to my workflow over time.