Training sessions and talks

One of the things I like most about my work is the ability to help teams improve by sharing my experience and knowledge in training sessions. These sessions can be conducted in the form of interactive workshops or talks.

I offer training sessions on DevOps, Continuous Delivery, Event Sourcing, Microservices, and many other topics. Contact me if you’re interested in the details or pricing!

Using conditional build steps to speed up your Jenkins PHP builds

At my client Spotney, we have a pretty solid (and common) build infrastructure for our PHP projects: SVN commits are checked out by Jenkins, tests are run by PHPUnit, Sonar runs static code analysis, and artifacts are built and deployed to a staging environment by Phing. However, some of the code relies pretty heavily on (complex) database queries, which adds the need for DbUnit-style tests. The nature and quantity of these tests, combined with a slow VM (possibly related to this Xdebug issue), meant that our build times were becoming prohibitively long.

An interesting post by @pascaldevink triggered a conversation and sparked an idea. I started working on our build setup, eventually achieving a 60-70% decrease in our build times!

Here’s how I did it.

Starting point

Let’s assume we have a fairly standard Jenkins job. The job checks out an SVN repository and periodically scans it for changes, triggering a new build if any are detected.

Each build of the job performs three steps:

  • Run PHPUnit
  • Run Phing (target “build”)
  • Invoke Sonar (using the Jenkins Sonar plugin – this plugin also allows invoking Sonar as a post-build action, but that option requires Maven)

After the build steps, the job publishes the test and code coverage reports, and archives the generated artifacts.

Disabling code coverage and Sonar for regular builds

Two of the most obvious optimizations (also suggested by Pascal) are disabling code coverage on all tests and disabling Sonar runs during regular Jenkins builds. We define “regular” builds as those started manually by a user or triggered by an SCM change.

Disabling code coverage generation in PHPUnit is easy: simply remove the “coverage-xxx” elements from the logging section of your phpunit.xml configuration file (see this section of the PHPUnit manual). Disabling Sonar is trivial too: just remove the last build step from the job.
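For example, a logging section with coverage enabled might look like this (the target paths are examples); the “no coverage” variant of the file simply omits the coverage-* elements:

<logging>
	<log type="coverage-html" target="build/coverage"/>
	<log type="coverage-clover" target="build/logs/clover.xml"/>
	<log type="junit" target="build/logs/junit.xml"/>
</logging>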

However, this is not an ideal solution: we do want to generate code coverage information and run Sonar at some point, such as during nightly builds, preferably without cloning our existing job. This means that we’ll need to skip code coverage and Sonar invocations on all but the scheduled (nightly) builds.

The Sonar plugin supports excluding SCM-triggered builds (“Skip if triggered by SCM Changes”), but that option only works if you use the post-build action. Additionally, we need two PHPUnit configuration files – one with code coverage generation enabled, one without.

Conditional build steps

The Conditional BuildStep plugin wraps one or more build steps in a conditional execution block. One of the possible conditions is the build cause, i.e. whether the build was triggered by a user, an upstream project, a timer, an SCM change, and so on.

First we define the steps that should be taken for each nightly build of our job. These steps should only be executed when the build is triggered by a timer.

We add a “Conditional steps (multiple)” build step, setting the condition to “Build Cause” and the Build Cause to “TimerTrigger”.

The conditional build step, with the condition set to the TimerTrigger build cause

Then we add our three (original) build steps:

The three original build steps inside the conditional block

As stated before, regular builds are those that are triggered by a user or an SCM change.

We again add a “Conditional steps (multiple)” build step. The condition for this step is a little more interesting, as seen below. We combine two Build Cause conditions (one set to “UserCause”, the other to “SCMTrigger”) using the Or condition.

The UserCause and SCMTrigger build causes, combined with an Or condition

We then add two build steps: the first will run PHPUnit without code coverage (note that we are specifying a different configuration file here), the second one will run Phing.

The build steps for regular builds: PHPUnit without code coverage, followed by Phing

Note that in the above build steps we’re invoking Phing from the shell instead of using the Phing plugin. Unfortunately this plugin is currently not supported as a conditional build step (probably because of this JIRA issue).
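For reference, the combined “Execute shell” step might look something like this (the name of the alternate configuration file is an example):

phpunit --configuration phpunit-nocoverage.xml
phing build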

Build schedule

As a final step we need to update our build schedule.

The nightly build schedule

This ensures our job runs once a night, somewhere after midnight (between 12:00 AM and 2:59 AM, to be precise).
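Assuming a Jenkins version that supports the H (hash) syntax, such a schedule could be expressed as:

H H(0-2) * * *

This picks a stable but arbitrary minute and hour within the 00:00-02:59 range, which also spreads the load if multiple jobs use the same schedule.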

The end result:

  • A nightly scheduled build, with all the bells and whistles enabled
  • User and SCM triggered builds run (a lot) faster

Please let me know if you think this post is useful, or if you have any other Jenkins/PHP optimization tips!

Building and deploying Java WebSphere applications with Jenkins CI

Jenkins CI (the new name of Hudson) is a very popular continuous integration system. It can be used to monitor the execution of various jobs, including but not limited to the compilation, packaging, testing and deployment of software. It is also very easy to configure and comes with a great set of third-party plugins.

I use Jenkins in a number of ways: to monitor, prepare and test new releases of Phing, to monitor various internal processes (such as backup logs), and to build and deploy various other projects that I work on.

In this post I will expand on some of the techniques discussed in an earlier IBM developerWorks article to (automatically) build and deploy Java J2EE applications to a WebSphere server. The code fragments listed below are contained in an archive that you can download at the end of this post.

Requirements

To get started, you’ll need to have installed:

  • Jenkins CI with the following plugins (can be installed via “Manage Jenkins” -> “Manage Plugins”):
    • Copy Artifact
    • Blame Subversion
    • Parameterized Trigger
    • RAD Builder
  • Ant Contrib
  • IBM Rational Application Developer
  • A test/staging installation of the WebSphere Application Server

This post assumes you have some knowledge of Ant, and are able to install Jenkins and IBM RAD.

Job configuration

For this particular case we will configure two Jenkins CI jobs: one job will build a number of artifacts (in this case, .ear files) from source code contained in a version control repository, and another job will deploy the generated artifacts to a WebSphere server. The deployment job will be triggered whenever the build job completes successfully.

Jenkins CI Dashboard

Build job

Create a new “free-style” job, and configure it as you normally would. Make sure you check out the source code to the src directory, within the job workspace.

Build Job - SVN config

Then, click “Add build step” and select the IBM RAD plugin. The field “build file” should contain the path to the build file we will use (Builder\build.xml in the archive). The field “RAD workspace” points to a directory (within the job’s workspace) where a RAD workspace will be created; in this case (see the build file below) we use the path “rad-workspace”. The other settings can be left at their default values.

Build Job - RAD builder

Build file

The Jenkins RAD builder plugin creates a fresh workspace, similar to the workspace that is used inside RAD (or Eclipse). To prepare this workspace with the right configuration settings, we use the workspacePreferenceFile task. The input for this task is a simple preferences file, either in text format (key=value pairs, see the sample below) or in the Eclipse .epf format.

compiler.compliance=1.5
compiler.source=1.5
classpath.SOMELIBRARY=D:\Development\somelibrary.jar

The task workspacePreferenceFile is then called in the setup-workspace target.

<target name="setup-workspace"
	description="Sets the preferences for the current workspace">

	<!-- Debug information -->
	<echo level="verbose" message="rad.preferences.filename=${rad.preferences.filename}"/>

	<!-- Set the workspace preferences -->
	<workspacePreferenceFile
		PreferenceFileName="${rad.preferences.filename}"
		useEclipsePrefs="false"
		overwrite="true"/>
	<echo level="verbose" message="workspacePreferenceFile done"/>
</target>

Next, the code that has previously been checked out by Jenkins will need to be copied to this new workspace. The properties copy.from.path and copy.excludes (optional, comma-separated list of excluded patterns) are set in the IBM RAD builder configuration (build job).

<target name="copy-projects"
description="Copies the content of a folder to the current workspace">

	<!-- Debug information -->
	<echo level="verbose" message="copy.from.path=${copy.from.path}"/>
	<echo level="verbose" message="workspace=${workspace}"/>

	<copy
		todir="${workspace}"
		includeEmptyDirs="true">
		<fileset dir="${copy.from.path}" excludes="${copy.excludes}">
			<include name="**/**"/>
		</fileset>
	</copy>
	<echo level="verbose" message="copy done"/>
</target>
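For illustration, the values passed to the build file might look something like this (hypothetical paths; the src directory is where Jenkins checked out the source code):

copy.from.path=D:\Jenkins\jobs\build-job\workspace\src
copy.excludes=**/.svn/**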

Now that the workspace is configured and contains the projects we’d like to build, it’s time to make RAD aware of the contents by actively importing each project – this is done by calling the projectImport task. The list of projects is generated by scanning the workspace for directories that contain a .project file.

<target name="import-projects"
	description="Imports a set of projects into the current workspace">
	
	<!-- Retrieve list of projects (folders containing a .project file) -->
	<dirset id="projects.list" dir="${workspace}">
		<include name="*"/>
		<present targetdir="${workspace}">
			<mapper type="glob" from="*" to="*/.project" />
		</present>
	</dirset>
	<pathconvert property="projects.name" refid="projects.list" pathsep=",">
		<map from="${workspace}" to=""/>
	</pathconvert>
	
	<!-- Debug information -->
	<echo level="verbose" message="projects.name=${projects.name}"/>

	<!-- Import the projects -->
	<foreach
		list="${projects.name}"
		target="import-project"
		param="project.name"/>
</target>

<target name="import-project">
	<!-- Debug information -->
	<echo level="verbose" message="project.name=${project.name}"/>
	<echo level="verbose" message="workspace=${workspace}"/>

	<projectImport
		projectName="${project.name}"
		projectLocation="${workspace}/${project.name}"/>
	<echo level="verbose" message="projectImport ${project.name} done"/>
</target>

The most important part of the build file is the target build-workspace, which calls the task workspaceBuild to perform a full build. By default, this task will fail the build if any (compiler) errors are encountered – this is what we want.

<target name="build-workspace"
	depends="setup-workspace,copy-projects,import-projects"
	description="Builds the current workspace">

	<!-- Fully build the workspace -->
	<workspaceBuild
		BuildType="Full"/>
	<echo level="verbose" message="workspaceBuild done"/>
</target>

Hopefully, there are no errors, and we are in a situation where all the projects have been built successfully. Time to generate some artifacts!

The target export-ear first updates the (generated) manifest file with a few Jenkins parameters, such as the build number, SVN revision, job name, and the current date. This data is a useful extra aid in identifying the version and origin of deployed code (note that you can also use the fingerprinting functionality for this, see below).

We then call the earExport task to create a .ear file, identical to choosing “Export” -> “EAR file” within RAD.

<target name="export-ear"
	description="Exports the EAR defined by the ear.project.name/ear.filename properties">
	
	<property name="ear.filename" value="${workspace}${ear.project.name}-${env.BUILD_NUMBER}-${env.BUILD_ID}.ear"/>
		
	<!-- Update the manifest with Jenkins build info -->
	<echo>Updating manifest</echo>
	<tstamp>
		<format property="TODAY" pattern="yyyy-MM-dd HH:mm:ss"/>
	</tstamp>
	<manifest
		file="${workspace}${ear.project.name}/META-INF/MANIFEST.MF"
		mode="update">
		<attribute name="Built-By" value="Jenkins CI"/>
		<attribute name="Implementation-Version" value="#${env.BUILD_NUMBER} - r${env.SVN_REVISION} - ${env.BUILD_ID}"/> 
		<attribute name="Implementation-Title" value="${env.JOB_NAME}"/>
		<attribute name="Built-Date" value="${TODAY}"/>
	</manifest>
	
	<!-- Debug information -->
	<echo level="verbose" message="ear.filename=${ear.filename}"/>
	<echo level="verbose" message="ear.project.name=${ear.project.name}"/>
	
	<!-- Export the EAR project as an EAR file -->
	<earExport
		EARProjectName="${ear.project.name}"
		EARExportFile="${ear.filename}"
		ExportSource="false"
		IncludeProjectMetaFiles="false"
		Overwrite="true"/>
	<echo level="verbose" message="earExport ${ear.filename} done"/>
</target>
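After a successful build, the updated manifest entries might read as follows (hypothetical build values):

Built-By: Jenkins CI
Implementation-Version: #42 - r1234 - 2011-05-01_02-15-33
Implementation-Title: myproject-build
Built-Date: 2011-05-01 02:15:33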

When the RAD builder finishes successfully, the build part of the job is complete and a number of artifacts (.ear files) will have been generated.

Build job - post-build actions

In the post-build actions we make sure the generated artifacts are scooped up and archived. This ensures that artifacts are kept even if the original build is (re)moved. Additionally, we enable the recording of fingerprints for each artifact. In essence, this calculates and stores a hash value (MD5 or similar) based on the contents of each file. Should we need to identify a particular artifact at some point in the future, we can simply upload that file to Jenkins, let it calculate a hash value, and match that hash value against its internal fingerprint database. If there’s a match, Jenkins will tell us the job name, build number, date, and any other useful information.

Finally, we call the deploy job using the parameterized trigger plugin. In this case, we do not override any of the default parameters (see below). Should you want to, click “Add parameter”, then “Predefined parameters”. Enter the parameters (key=value pairs) in the text area.

Deploy job

As stated before, the deployment job copies the generated artifacts from the build job and installs them on a (test/staging) WebSphere server. To achieve this, the job calls the wsadmin tool and executes a single JACL script.

An important part of this job is its set of predefined parameters, which tell the JACL script which SOAP connection to use and which node / cell / server / virtual host to install the application to. In this case, each of these parameters has a default value – pointing to a default (local) testing server.

Deploy Job - Build parameters
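For example, the defaults could look like this (placeholder values; the names match the variables used by the JACL script below):

host=localhost
port=8880
node=testNode01
cell=testCell01
server=server1
vhost=default_host
appname=MyApplication
earfile=MyApplication.ear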

The build phase of the job consists of three separate build steps:

  • Remove any artifacts left behind by previous builds
  • Copy the artifacts generated by the last successful run of the build job
  • Execute ws_ant (Ant with the WebSphere functionality/classes preloaded), which in turn uses wsadmin to run a JACL script (see the sketch below)

Deploy Job - Build steps
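A minimal sketch of that last step, invoking wsadmin through Ant’s exec task rather than a WebSphere-specific task (the executable path, the property names, and the argument handling expected by the JACL script are assumptions):

<target name="deploy-ear"
	description="Runs the JACL deployment script through wsadmin">

	<!-- Connect over SOAP and execute the script with the job parameters -->
	<exec executable="${was.home}/bin/wsadmin.bat" failonerror="true">
		<arg line="-lang jacl -conntype SOAP -host ${host} -port ${port}"/>
		<arg value="-f"/>
		<arg value="deploy.jacl"/>
		<arg line="${node} ${cell} ${server} ${vhost} ${appname} ${earfile}"/>
	</exec>
</target>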

The JACL script consists of two parts. First, it stops and uninstalls any previous version of the application we are trying to install. Errors that occur during this first part are ignored.

# Look up the application manager MBean for the target server
set appManager [$AdminControl queryNames cell=$cell,node=$node,type=ApplicationManager,process=$server,*]

# Stop and uninstall the previous version; errors (e.g. the application
# not being present yet) are ignored
catch { $AdminControl invoke $appManager stopApplication $appname } result

$AdminConfig save

catch { $AdminApp uninstall $appname } result

$AdminConfig save

In the second part of the script, the application is installed on the specified node/cell/server/virtual host. Then, after giving the application server some time to process the installed artifact, the script starts the application. If this completes without errors the application is ready to use!

# Install the new EAR on the specified node/cell/server/virtual host
$AdminApp install "$workspace/$earfile" "-node $node -cell $cell -server $server -verbose -defaultbinding.virtual.host $vhost -usedefaultbindings"

$AdminConfig save

# Wait (up to 20 retries of 5 seconds each) for the application to become ready
set ready false
set retries 0

while {$retries < 20} {
	incr retries
	set ready [$AdminApp isAppReady $appname]
	puts "AdminApp isAppReady: $ready ($retries)"

	if {$ready} { break }

	sleepDelay 5
}

# Start the application
set appManager [$AdminControl queryNames node=$node,cell=$cell,type=ApplicationManager,process=$server,*]

$AdminControl invoke $appManager startApplication $appname

$AdminConfig save

Conclusion / thoughts

In this post you’ve seen how to use Jenkins CI to build (through IBM RAD) and deploy (through IBM wsadmin) a J2EE application to a WebSphere server. I hope these examples can serve as a starting point for your forays into the exciting world of Jenkins CI.

Comments and suggestions are very welcome!

Downloads