Continuous Integration with Jenkins


Running webdriver-sync tests from Jenkins

Simply create a new job and add a build step using a shell command (for Linux slaves) or a Windows batch command.

You can kill any running browsers and update from source control before running the job:


# kill any browsers left over from a previous run
killall chrome || true
cd ../../../Testing
# update the test scripts from source control
hg pull
hg update -C "$hgbranch"
hg pull -u --force
# run the tests, passing the Jenkins job name and any job parameters through
node Portal/scripts/ptl.js Chrome "$JOB_NAME" $param1 $param2 $etc

There needs to be a function in the test scripts that matches the Jenkins job name (if you want to use the $JOB_NAME variable).
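
For example, ptl.js might look up a test function keyed on the job name and call it with the remaining arguments. This is only a minimal sketch of that idea; the tests map, the argument order and the job name used here are assumptions rather than the actual script:

var browser = process.argv[2]; // e.g. "Chrome"
var jobName = process.argv[3]; // the Jenkins $JOB_NAME
var params  = process.argv.slice(4);

// hypothetical map of Jenkins job names to test functions
var tests = {
    "Portal_Smoke_Tester": function (params) { /* ...run the smoke tests... */ }
    // one entry per Jenkins job
};

if (typeof tests[jobName] !== "function") {
    console.log("No test function matches job name: " + jobName);
    process.exit(1);
}
tests[jobName](params);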

Returning test results to Jenkins

Jenkins can understand JUnit-format test results, and we can fake these with XML.

I have a log object that handles writing out results to the Jenkins console log and creating a JUnit-compatible test results file containing error counts and individual test timings.


var fs = require("fs");
var xml = require("xml");

var HOME = process.env.HOME; // assumption: screenshots and results live under $HOME/Jenkins/workspace

var log = {
        testTag: null,
        caseTag: null,
        caseCount: 0,
        testErrors: 0,
        testName: "",
        testTime: 0,
        startTest: function log_startTest(testName) {
            console.log("TEST STARTING: " + testName);
            console.log(new Date());
            var workspace = HOME + "/Jenkins/workspace/" + testName + "/";
            if (!fs.existsSync(workspace)) {
                fs.mkdirSync(workspace);
            }
            // fs.unlinkSync does not expand wildcards, so delete old screenshots one by one
            fs.readdirSync(workspace).forEach(function (file) {
                if (/\.png$/.test(file)) {
                    fs.unlinkSync(workspace + file);
                }
            });
            this.testName = testName;
            // root <testsuite> element of the JUnit-style report; the error count and time are filled in by endTest
            this.testTag = [{"testsuite": [ { _attr: {"time": 1, "errors": 0, "name": testName}}]}];
            this.caseCount = 0;
            this.testTime = 0;
        },
        caseOpen: false,
        caseErrors: 0,
        caseName: "",
        caseTime: 0,
        startCase: function log_startCase(caseName) {
            if (this.caseOpen) {
                this.endCase();
            }
            this.caseErrors = 0;
            this.caseCount++;
            this.caseName = (this.caseCount < 10 ? "00" : this.caseCount < 100 ? "0" : "") 
                + this.caseCount + " " + caseName;
            this.caseTag = {"testcase": [ { _attr: {
                "errors": 0, 
                "name": this.caseName, 
                "time": 1}} ]};
            console.log("");
            console.log("TEST: " + caseName);
            this.caseOpen = true;
            this.caseTime = Date.now();
            console.log(new Date());
        },
        error: function log_error(errorText, screenshot) {
            this.caseErrors++;
            this.testErrors++;
            console.log("\n[ERROR] " + errorText + "\n");
            this.caseTag.testcase.push({"failure": [ { _attr: {"type": "Error"}}, errorText]});
            if (screenshot) {
                fs.writeFileSync(HOME + "/Jenkins/workspace/" + this.testName 
                    + "/[" + this.caseName.replace(/(\\|\/)/g, "#") + "] "
                    + ".png",
                    screenshot, "base64");
            }
        },
        endCase: function log_EndCase() {
            this.caseTime = (Date.now() - this.caseTime) / 1e3;
            console.log("TEST ENDED\n\n");
            this.caseTag.testcase[0]["_attr"].errors = this.caseErrors.toString();
            this.caseTag.testcase[0]["_attr"].time = this.caseTime.toString();
            if (!this.caseErrors) {
                this.caseTag.testcase.push("SUCCESS");
            }
            this.testTag[0].testsuite.push(this.caseTag);
            this.caseOpen = false;
            this.testTime += this.caseTime;
        },
        endTest: function log_endTest() {
            if (this.caseOpen) {
                this.endCase();
            }
            this.testTag[0].testsuite[0]["_attr"].errors = this.testErrors.toString();
            this.testTime = this.testTime.toFixed(0);
            this.testTag[0].testsuite[0]["_attr"].time = this.testTime.toString();
            console.log("");
            console.log("TESTS COMPLETE");
            console.log(new Date());
            console.log("Tests run: " + this.caseCount + ", Failures: " + this.testErrors);
            console.log("Testing time: " + this.testTime);
            if (require("os").hostname() !== "test-Latitude-E6500") {
                fs.writeFileSync(HOME + "/Jenkins/workspace/" + this.testName + "/log.xml", 
                    xml(this.testTag, {"declaration": {"encoding": "UTF-8"}}));
                console.log("Results saved to " + HOME + "/Jenkins/workspace/" 
                    + this.testName + "/log.xml");
            }
        },
        perfResult: function log_perfResult(load, metric, median, max) {
            median = (+median).toFixed(3);
            max = (+max).toFixed(3);
            fs.writeFileSync(HOME + "/Jenkins/workspace/" + this.testName + "/" + 
                load + "_" + metric + ".csv", 
                "Max " + load + ",Median " + load + "\n" + max + "," + median);
        }
};

At the end of the test there will be a log.xml file in the Jenkins job's workspace, so we just need to tell Jenkins to publish it as a JUnit test result report in the post-build actions.
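
For a run with one passing and one failing case, the generated log.xml looks roughly like the following (the case names, failure message and timings are made up for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<testsuite time="95" errors="1" name="Portal_Smoke_Tester">
    <testcase errors="0" name="001 Login" time="12.345">SUCCESS</testcase>
    <testcase errors="1" name="002 Search" time="83.12">
        <failure type="Error">Expected 10 results but found 0</failure>
    </testcase>
</testsuite>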

Showing WebPagetest performance charts in Jenkins

We can take the CSV files from the WebPagetest test and plot them in Jenkins to show any improvements or regressions between builds.

Install the Plot plugin and, in the WebPagetest job, add a "Plot build data" post-build action.

Add a new plot group and set the titles as you prefer. Select the Line plot style, then "Load data from csv file", and add the performance test result CSV file (it should be saved in the job's workspace for ease of access).
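
For reference, each perfResult call above writes a two-line CSV file. With hypothetical arguments load = "firstView" and metric = "loadTime", the file firstView_loadTime.csv would contain something like:

Max firstView,Median firstView
5.234,3.102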

Scheduling tests with a Jenkins Groovy script

You may want to have one master job in Jenkins that runs all of the tests and aggregates the results.

You can run (test) jobs in parallel or in series using a post-build Groovy script:


// schedule every job in the list, wait for them all to finish, then report their results
def PARALLEL(testList) {
    def queuedTests = testList.collect{
        Jenkins.instance.getJob("${it}").scheduleBuild2(5, new Cause.UpstreamCause(manager.build));
    };
    queuedTests.each() {
        it.get()
    };
    testList.each() {
        manager.listener.logger.println "${it} #" +
            Jenkins.instance.getJob("${it}").getLastBuild().number;
        manager.listener.logger.println Jenkins.instance.
            getJob("${it}").getLastBuild().getResult().toString();
    }
}

// run a single parameterised job and wait for it to finish before returning
def SERIALP(testName, params) {
    manager.listener.logger.println Jenkins.instance.getJob(testName).
        scheduleBuild2(5, new Cause.UpstreamCause(manager.build), 
        new ParametersAction(params)).get();
    manager.listener.logger.println Jenkins.instance.getJob(testName).
        getLastBuild().getResult().toString();
}

// run a single unparameterised job and wait for it to finish before returning
def SERIAL(testName) {
    manager.listener.logger.println Jenkins.instance.getJob(testName).
        scheduleBuild2(5, new Cause.UpstreamCause(manager.build)).get();
    manager.listener.logger.println Jenkins.instance.getJob(testName).
        getLastBuild().getResult().toString();
}



def deployParams = 
    [
        new StringParameterValue('build_number', 
                manager.build.buildVariables.get('build_number')),
        new StringParameterValue('archive_number', 
                manager.build.buildVariables.get('archive_number'))
    ];
SERIALP("Deploy", deployParams);

SERIAL("Smoke");

PARALLEL(["Some Tests", "That Can", "Be Run", "In Parallel"]);

SERIAL("Final Solo Test");

Jenkins Results Aggregation

It's nice to see the results of a test run aggregated together, so you can see if you are making progress overall.

The option to aggregate downstream test results as a post-build action never worked for me in Jenkins. To get it to work, I create a unique fingerprint file in my main test runner job and copy it to each test job.

Then you need to run a system Groovy script to load the test aggregator with the results.

Note that this appears in the job as a build step, so it would actually run before the test-running script above, but it does work:


import jenkins.model.*
import hudson.model.*

// comma-separated list of the jobs whose test results should be aggregated
def STjobs = "Deploy,Smoke,Some Tests,That Can,Be Run,In Parallel,Final Solo Test";
def jobName = this.binding.build.project.name;

// (re-)create a clean publishers list for the results
Jenkins.instance.getJob(jobName).getPublishersList().
        removeAll(hudson.tasks.test.AggregatedTestResultPublisher);

def atrp = new hudson.tasks.test.AggregatedTestResultPublisher(STjobs, true);
Jenkins.instance.getJob(jobName).getPublishersList().
        add(atrp);

Note that results aggregation may fail if the clocks on the Jenkins master and the slaves are out of sync.

Jenkins Slave Status Monitor

I have 14 slaves testing two projects, and as I share Jenkins with the developers and all of their build jobs, it is a pain keeping track of the tests through the standard Jenkins web UI.

I decided to knock up a small server that could interrogate the Jenkins API to show just the current status of the test jobs.

For Node, the Jenkins API is wrapped up by the nestor module. Adding this to a minimal server gives us:


var http   = require("http");
var nestor = new (require('nestor'))(''); // the Jenkins URL goes here (left blank in this example)
var fs     = require("fs");

var server = http.createServer(function (req, res) {
    if (req.url.match(/\.json/)) {
        //computer, parseFeed, info
        nestor[req.url.split(".json")[0].replace(/\//g, "")](
            function (err, result) {
                err && console.dir(err);
                res.writeHead(200, {"Content-Type": "application/json"});
                res.end(JSON.stringify(result));
                return;
            }
        );
    } else {
        var template = fs.readFileSync("index.html").toString();
        res.writeHead(200, {"Content-Type": "text/html"});
        res.end(template);
    }
    
});
server.listen(1337);

This gives us a server that listens on port 1337 and either delivers the JSON from the nestor computer, parseFeed and info calls or returns an HTML page to display the status monitor:



/* style the divs with borders and colours */
        div {
            display: inline-block;
            border: solid 1px gray;
            padding: 8px;
            margin: 4px;
            border-radius: 5px 5px 5px;
            font-size: 24px;
            font-family: sans-serif;
            box-shadow: 1px 1px 5px #808080;
        }
        
        div.active {
            background-color: white;
        }
        .idle {
            background-color: lightgray;
        }

        #portalTesters {
            background-color: #E0FFE0;
        }
        #portalTesters div{
            /*font-size: 32px;*/
            width: 45%;
        }
        #rssTesters {
            background-color: lightcyan;
        }
        #rssTesters div {
            width: 20%;
        }

<div id="portalTesters">
    <div id="Portal_Smoke_Tester"><h3>Portal_Smoke_Tester</h3><p></p></div>
    <div id="Portal_Web_Test_LNX"><h3>Portal_Web_Test_LNX</h3><p></p></div>
    <div id="Portal_Web_Test_OSX"><h3>Portal_Web_Test_OSX</h3><p></p></div>
    <div id="Portal_Web_Test_OSX_YOS"><h3>Portal_Web_Test_OSX_YOS</h3><p></p></div>
    <div id="Portal_Web_Test_W7"><h3>Portal_Web_Test_W7</h3><p></p></div>
    <div id="Portal_Web_Test_IE11"><h3>Portal_Web_Test_IE11</h3><p></p></div>
</div>

<div id="rssTesters">
    <div id="test_732_uk"><h3>test_732_uk</h3><p></p></div>
    <div id="test_732_us"><h3>test_732_us</h3><p></p></div>
    <div id="test_764_uk"><h3>test_764_uk</h3><p></p></div>
    <div id="test_764_us"><h3>test_764_us</h3><p></p></div>
    <div id="test_832_uk"><h3>test_832_uk</h3><p></div>
    <div id="test_832_us"><h3>test_832_us</h3><p></div>
    <div id="test_864_uk"><h3>test_864_uk</h3><p></div>
    <div id="test_864_us"><h3>test_864_us</h3><p></p></div>
</div>


    function xhrJSON(requestType, jURI, jParm, jCallback) {
        var xhr = new XMLHttpRequest();
        xhr.open(requestType, jURI, true);
        xhr.setRequestHeader("Content-Type", "application/json; charset=utf-8");
        xhr.onreadystatechange = function () {
            if ((xhr.readyState === 4) && (xhr.status === 200)) {
                jCallback(JSON.parse(xhr.responseText ));
            }
        };
        xhr.send(jParm);
    }
    function getJSON(jURI, jParm, jCallback) {
        xhrJSON("GET", jURI, jParm, jCallback);
    }

    function refreshStatus() {
        getJSON("computer.json", "", function (result) {
            result.computer.forEach(function (computer) {
                if (computer.displayName.match(/test/i)) {
                    var computerName = computer.displayName.replace(/ /g, "_");
                    if (computer.idle || !computer.executors[0].currentExecutable) {
                        document.querySelector("#" + computerName).style.backgroundImage = "";
                        document.querySelector("#" + computerName).className = "idle";
                        if (document.querySelector("#" + computerName + " p span")) {
                            if (document.querySelector("#" + computerName + " p span")
                                    .innerHTML.match(/(END|(10|9)\d)/)) {
                                document.querySelector("#" + computerName + " p span")
                                    .innerHTML = "END";
                            }
                        }
                    } else {
                        var jobUrl = computer.executors[0].currentExecutable.url;
                        var jobName = jobUrl
                            .split(/job\//)[1].replace(/(%20|\/)/g, " ").split(/ label/)[0];
                        var progress = computer.executors[0].progress + "%";
                        document.querySelector("#" + computerName + " p").innerHTML 
                            = "" + jobName + " " 
                            + progress + "";
                        document.querySelector("#" + computerName).className = "active";
                        document.querySelector("#" + computerName).style.backgroundImage 
                            = "linear-gradient(to right, yellow " + progress + ", white " 
                                + progress + ")"; 
                    }
                }
            });
        });
    }
    refreshStatus();
    window.setInterval(refreshStatus, 20e3);

Each slave's status is shown in its own div. Every 20 seconds the script requests the computer data from nestor and updates the page.

If the slave is idle, the div turns grey.

If the slave is running a job, it has a yellow linear-gradient background that expands to fill the div as the job's progress increases. This also serves as a useful visual alert: the div should not stay yellow for long, and if it does then the job may have stalled and needs attention.

Jenkins Log Parsing

I was interested to see where my tests were spending their time. Were they navigating to a lot of pages, finding a lot of elements, clicking many buttons?

This information is quite easy (and really quick) to pull out of each job's console log. It helps to have your test jobs grouped into a View in Jenkins and to have specific wording for interesting events in your log.

Go to the Manage Jenkins page, open the Script Console (this may require a plug-in), and enter a script like this:


// for each job in the "Web Tests" view, report the duration of the last completed build
// and count interesting events in its console log
def J = Jenkins.instance.getView("Web Tests").getItems()
for (job in J) {
  println(job.name);
  println("MINS " + (job.getLastCompletedBuild().getDuration()/60000).toInteger())

  def L = job.getLastCompletedBuild().getLog()
  println("GO " + (L =~ /NAVIGATING/).count)
  println("LOCATE "+ (L =~ /LOCATING ELEMENT/).count)
  println("CLICK " + (L =~ /CLICKING/).count)
  println("SET " + (L =~ /SETTING/).count)
  println("CHECK " + (L =~ /ROUGHLY/).count)
  println("REST " + (L =~ /status:/).count)
  println("SSH " + (L =~ /SSH to/).count)
  println("UPLOAD " + (L =~ /UPLOADING/).count)    
  println()
}

Then you get a list of all your tests with the time taken in minutes and the number of times each event or action occurred.
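
For each job, the output looks something like this (the job name and counts here are made up):

Portal_Web_Test_LNX
MINS 42
GO 35
LOCATE 220
CLICK 60
SET 18
CHECK 44
REST 7
SSH 2
UPLOAD 3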

Next: Tips and tricks