Multi-project AspectJ builds with Gradle and Eclipse

Using Gradle for build/CI and Eclipse for development is a nice ecosystem with reasonable integration, but things get a bit trickier when we add multi-project builds and AspectJ into the mix. This post steps through some of the manual steps required to get it all working together.

Environment

Note: I am using Gradle's built-in eclipse plugin, but not the Eclipse-side Gradle integration plugin.

The multi-project build

For reasons beyond the scope of this post, I’m using three projects, in order of dependency:

model – a rich domain model
persistence – a project which uses AspectJ to layer a set of generic persistence-aware superclasses on top of model
modelp – a project which takes the emitted classes from persistence and adds all the necessary persistence plumbing, such as hibernate mappings, optimized DAOs, etc.

Gradle configuration

Details irrelevant to the multi-project configuration are omitted.

persistence project:

dependencies {
	ajInpath project(path: ':model', transitive: false)
}

The persistence project will then emit all of the classes from the model project woven with the aspects from persistence. Note that the upstream dependencies of model are not woven, nor are they automatically available to the persistence project. We need to use the normal gradle dependency mechanisms if we want to do that.
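For instance (a sketch, not from the original build; whether you want the old-style compile configuration or something finer-grained depends on your setup), making model and its transitive dependencies visible to persistence is just an ordinary project dependency alongside the inpath entry:

```groovy
// persistence/build.gradle (sketch)
dependencies {
	// Weave model's classes into this project's output,
	// without weaving model's transitive dependencies
	ajInpath project(path: ':model', transitive: false)

	// Separately expose model (and its transitive dependencies)
	// to the persistence compile classpath
	compile project(':model')
}
```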

modelp project:

Similarly:

dependencies {
	ajInpath project(path: ':persistence', transitive: false)
}

Eclipse configuration

So far so good. Gradle is pretty clever about wiring up multi-project builds. Eclipse is a little less clever, or maybe just different. So after

gradle eclipse

we still have some manual steps to do to recreate this setup in Eclipse.

AspectJ setup

Edited 7th June 2016, thanks to Daniel’s very helpful comment

Here we come to the first difference between Eclipse and Gradle. If we add the upstream project to the inpath, AspectJ will try to weave all of that project’s referenced libraries as well. In effect, Eclipse is missing the “transitive: false” argument we used in Gradle. This is (mostly) harmless (probably), but it’s slow and can throw spurious errors. So instead of adding the whole upstream project to the inpath, we add the project’s emitted class folder.

Together with adding the AspectJ nature to the project, the gradle code to configure eclipse looks like this for the modelp project:

eclipse {

	project {
		natures = ['org.eclipse.ajdt.ui.ajnature','org.eclipse.jdt.core.javanature']
		
		buildCommand 'org.eclipse.ajdt.core.ajbuilder'
	}
	
	// Add the inpath entry to the classpath
	classpath {
		file {
			withXml {
				def node = it.asNode();
				node.appendNode("classpathentry",  [kind:"lib", path:"/model/bin"])
					.appendNode("attributes", [:])
					.appendNode("attribute", [name:"org.eclipse.ajdt.inpath", value:"org.eclipse.ajdt.inpath"]);
			}
		}

	}
}

Dependent project setup

We still need the upstream project and its libraries to be available to the Eclipse compiler. The gradle eclipse plugin will take care of this if we have a normal compile project dependency in our gradle build (e.g. compile project(":model")), but we don’t necessarily need that for our gradle build. If we only have the inpath dependency the gradle eclipse plugin will miss it, so in Eclipse we also need to add the upstream project as a required project in the Java Build Path, like so:
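If you would rather not repeat this by hand after every regeneration, the same withXml hook can (in principle) add the required-project entry to the generated .classpath as well; a kind="src" entry whose path starts with "/" is how Eclipse records a project dependency. A sketch, for the modelp project:

```groovy
// modelp/build.gradle (sketch): mirror the manual "required project" step
eclipse {
	classpath {
		file {
			withXml {
				def node = it.asNode()
				// kind="src" with a leading-slash path marks a
				// required project in Eclipse's .classpath format
				node.appendNode('classpathentry', [kind: 'src', path: '/model'])
			}
		}
	}
}
```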

(screenshot: the upstream project added under Java Build Path → Projects)

Export exclusions

By default, adding the AspectJ nature to an Eclipse project causes it to export the AspectJ runtime (aspectjrt-x.x.x.jar). As all three of these projects are AspectJ projects, we end up with multiply defined runtimes, so we need to remove the runtime from the export list of the upstream projects.
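If the runtime shows up as a plain jar entry in the generated .classpath (rather than being added later by AJDT itself, in which case this won't help and the manual UI step remains), a withXml tweak along these lines could un-export it in the upstream projects. A sketch only:

```groovy
// model/build.gradle and persistence/build.gradle (sketch):
// strip the exported flag from any aspectjrt entry in .classpath
eclipse.classpath.file.withXml {
	it.asNode().'classpathentry'
		.findAll { entry -> entry.@path?.contains('aspectjrt') }
		.each { entry -> entry.attributes().remove('exported') }
}
```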

Gradle is much better than Eclipse at dealing with complex dependency graphs. In particular, if an upstream project depends on an older version of a jar and a downstream project depends on a newer version of the same jar, the newer version will win. In Eclipse, both jars will be included in the classpath, with all the corresponding odd behaviour. So you might also need to tweak the export exclusions to avoid these situations.

Run configuration

Once you’ve cleaned up the exports from upstream projects, Eclipse will cheerfully ignore your exclusions when creating run or debug configurations, for example when running a JUnit test. This seems to be a legacy behaviour that has been kept for backward compatibility, but fortunately you can change it at a global level in the Eclipse preferences:

(screenshot: Preferences → Run/Debug → Launching, with “only include exported classpath entries when launching” checked)

Make sure the last item, “only include exported classpath entries when launching”, is checked. Note that this applies to Run configurations as well, not just Debug configurations.

Conclusion

The manual Eclipse configuration needs to be redone whenever you do a gradle cleanEclipse eclipse, but usually not after just a plain gradle eclipse. It only takes a few minutes to redo from scratch, but it can be a hassle if you forget a step. Hence this blog post.

Gradle – copy to multiple destinations

TL;DR (edited):

def deployTargets = ["my/dest/ination/path/1","my/other/desti/nation"]
def zipFile = file("${buildDir}/distributions/dist.zip")

task deploy (dependsOn: distZip) {
	inputs.file zipFile
	deployTargets.each { outputDir ->
		outputs.dir outputDir
	}
	
	doLast {
		deployTargets.each { outputDir ->
			copy {
				from zipTree(zipFile).files
				into outputDir
			}
		}
	}
}

My specific use case is to copy the jars from a java library distribution to tomcat web contexts, so you can see the distZip dependency in there, along with zip file manipulation.

The multiple-destination copy seems to be a bit of a FAQ for gradle newcomers like myself. Gradle has a cool copy task, and lots of options to specify how to copy multiple sources into one destination. But what about copying one source into multiple destinations? There’s a fair bit of confusion around the fact that the copy task supports multiple “from” properties, but only one “into” property.
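To illustrate the asymmetry (a made-up example, not from any real build): a single Copy task happily merges several sources, but accepts only one destination.

```groovy
// Sketch: many "from"s, exactly one "into"
task gather(type: Copy) {
	from 'src/main/resources'   // first source
	from 'config'               // second source, merged in
	into "$buildDir/staging"    // a second into() would simply replace this one
}
```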

The answers I’ve found seem to fall into one of two classes. The first is to just do the copy imperatively, like so:

task justDoit << {
  destinations.each { dest ->
    copy {
      from 'src'
      into dest
    }
  }
}

which gives up Gradle’s up-to-date checking. The solution I’ve settled on fixes that by using the inputs and outputs properties. Unlike the copy task type’s “into” property, a generic task can have multiple outputs.

The other advice given is to create multiple copy tasks, one for each destination. That seems a little unsatisfactory, and un-dynamic. What if I have 100 destinations? Must I really clutter up my build script with 100 copy tasks? The following is my attempt to handle it dynamically.

def deployTargets = ["my/dest/ination/path/1","my/other/desti/nation"]
def zipFile = file("${buildDir}/distributions/dist.zip")

task deploy

// Set up a copy task for each deployment target
deployTargets.eachWithIndex { outputDir, index ->
	task "deploy${index}" (type: Copy, dependsOn: distZip) {
		from zipTree(zipFile).files
		into outputDir
	}
	
	deploy.dependsOn tasks["deploy${index}"]
}

This one suffers from the problem that it will not execute on the same build when the zip file changes, but it will execute on the next build. So in sequence:

  • Change a source file
  • Run “gradle deploy”
  • Sources compile, distZip executes, zip file is produced, but deploy tasks do not execute
  • Run “gradle deploy” again
  • Deploy tasks execute

Why is this so? I don’t know. This thread seems to imply that there could be some race condition in gradle, but beyond that – *shrug*. The multiple copy task approach is recommended by a lot of smart people, so I assume there’s a better way to do it, but for now the single custom task is working for me.
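One plausible culprit (an assumption on my part, not something that thread confirms): “zipTree(zipFile).files” is resolved eagerly while the tasks are being configured, before distZip has produced the new zip, so the Copy tasks see stale contents. Wrapping the expression in a closure defers the unzipping to execution time:

```groovy
// Sketch: defer zip resolution until the task actually runs
deployTargets.eachWithIndex { outputDir, index ->
	task "deploy${index}" (type: Copy, dependsOn: distZip) {
		from { zipTree(zipFile).files }  // closure: evaluated at execution time
		into outputDir
	}

	deploy.dependsOn tasks["deploy${index}"]
}
```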