Fix the JenkinsAPI No valid crumb error

If you are working with the Python-based JenkinsAPI library you might run into the “No valid crumb was included in the request” error.  The traceback below will probably look familiar if you’ve run into this issue.

Traceback (most recent call last):
  File "myscript.py", line 47, in <module>
    deploy()
  File "myscript.py", line 24, in deploy
    jenkins.build_job('test')
  File "/usr/local/lib/python3.6/site-packages/jenkinsapi/jenkins.py", line 165, in build_job
    self[jobname].invoke(build_params=params or {})
  File "/usr/local/lib/python3.6/site-packages/jenkinsapi/job.py", line 209, in invoke
    allow_redirects=False
  File "/usr/local/lib/python3.6/site-packages/jenkinsapi/utils/requester.py", line 143, in post_and_confirm_status
    response.text.encode('UTF-8')
jenkinsapi.custom_exceptions.JenkinsAPIException: Operation failed. url=https://jenkins.example.com/job/test/build, data={'json': '{"parameter": [], "statusCode": "303", "redirectTo": "."}'}, headers={'Content-Type': 'application/x-www-form-urlencoded'}, status=403, text=b'<html>\n<head>\n<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>\n<title>Error 403 No valid crumb was included in the request</title>\n</head>\n<body><h2>HTTP ERROR 403</h2>\n<p>Problem accessing /job/test/build. Reason:\n<pre> No valid crumb was included in the request</pre></p><hr><a href="http://eclipse.org/jetty">Powered by Jetty:// 9.4.z-SNAPSHOT</a><hr/>\n\n</body>\n</html>\n'

It is good practice to enable additional security in Jenkins by turning on the “Prevent Cross Site Request Forgery exploits” option in the security settings, so if you see this error it is actually a good sign.  The screenshot below shows this security feature in Jenkins.

[screenshot: enabling CSRF protection in the Jenkins security settings]

The Fix

This error threw me off at first, but it didn’t take long to find a quick fix.  The jenkinsapi library provides a CrumbRequester class that you can use to create a crumbed authentication session.  You can use the following example as a guideline in your own code.

from jenkinsapi.jenkins import Jenkins
from jenkinsapi.utils.crumb_requester import CrumbRequester

JENKINS_USER = 'user'
JENKINS_PASS = 'pass'
JENKINS_URL = 'https://jenkins.example.com'

# We need to create a crumb for the request first
crumb = CrumbRequester(username=JENKINS_USER, password=JENKINS_PASS, baseurl=JENKINS_URL)

# Now use the crumb to authenticate against Jenkins
jenkins = Jenkins(JENKINS_URL, username=JENKINS_USER, password=JENKINS_PASS, requester=crumb)

...

The code looks very similar to creating a normal Jenkins authentication object; the only difference is that we create and pass in a crumb requester rather than just a username/password combination.  Once the crumbed authentication object has been created, you can continue writing your Python code as you would normally.  If you’re interested in learning more about crumbs and CSRF you can find more here, or just Google CSRF for more info.
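For completeness, here is roughly what the rest of the script might look like; the job name and parameter below are just placeholders borrowed from the traceback above:

# Trigger the job exactly as before; the crumb is now handled automatically
jenkins.build_job('test')

# Parameterized builds work the same way
jenkins.build_job('test', params={'git_tag': 'abcd123'})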

This issue was slightly confusing/annoying, but I’d rather deal with an extra few lines of code and know that my Jenkins server is secure.


Remote Jenkins builds using Github auth

Having the ability to call Jenkins jobs remotely is pretty slick and adds some extra flexibility, which allows for some interesting applications.  For example, you could use remote builds to call a script from a chat app or from some other web application.  I have chosen to write a quick bash script as a proof of concept, but this could easily be extended or written in a different language using one of the Jenkins language-specific libraries.

The instructions for the method I am using assume a Jenkins freestyle job, as I haven’t yet experimented much with remote builds for pipelines.

The first step is to enable remote builds for the Jenkins job that will be triggered.  There is an option in the job for “Trigger builds remotely” which allows the job to be called from a script.

[screenshot: the “Trigger builds remotely” option in the job configuration]

The authentication token can be any arbitrary string you choose.  Also note the URL below; you will need it later as part of the script that calls this job.
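For reference, the remote trigger URL that Jenkins displays in the job configuration follows this general pattern (double-check it against your own job):

https://jenkins.example.com/job/<jobname>/build?token=<token>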

With the authentication token configured and the Jenkins URL recorded, you can begin writing the script.  The first step is to populate some variables for kicking off the job.  Below is an example of how you might do this.

jenkins_url="https://jenkins.example.com"
job_name="my-jenkins-job"
job_token="xxxxx"
auth="username:token"
my_repo="some_git_repo"
git_tag="abcd123"

Be sure to fill in these variables with the values that correspond to your environment.  job_token should match the string you entered above in the Jenkins job, and auth should be your GitHub username/token combination.  If you are not familiar with GitHub tokens you can find more information about setting them up here.
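As an optional sanity check, you can verify the username/token pair directly against the GitHub API before wiring it into Jenkins; a successful response returns your user profile as json:

# Assumes the token is a GitHub personal access token
curl -u "$auth" https://api.github.com/user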

As part of the script, you will want to create a Jenkins “crumb” using your GitHub credentials; the crumb is what protects against cross-site request forgery (CSRF) attacks.  Here’s what the creation of the crumb looks like (borrowed from this Stack Overflow post).

crumb=$(curl -s --user "$auth" "$jenkins_url/crumbIssuer/api/xml?xpath=concat(//crumbRequestField,\":\",//crumb)")

Once you have your variables configured and your crumb all set up, you can test out the Jenkins job.

curl -X POST -H "$crumb" "$jenkins_url/job/$job_name/build?token=$job_token" \
  --user "$auth" \
  --data-urlencode json='
  {"parameter":
    [
      {"name":"parameter1", "value":"test1"},
      {"name":"parameter2", "value":"test2"},
      {"name":"git_repo", "value":"'"$my_repo"'"},
      {"name":"git_tag", "value":"'"$git_tag"'"}
    ]
  }'

In the example job above, I am using several Jenkins parameters as part of the build.  The json name values correspond to the parameters defined in the job.  Notice that I am using shell variables for a few of the values; make sure those variables are quoted so they expand outside the single-quoted json (the '"$var"' pattern above).  The syntax is slightly awkward but allows for some additional flexibility in the job configuration, and lets the script be called with dynamic values.

If you call this script now, it should kick off a Jenkins job for you with all of the values you have provided.
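If you would rather verify the result from the command line than in the UI, you can poll the job through the Jenkins JSON API.  This is a minimal check, assuming jq is installed and the job has completed at least one build:

# .result is null while the build is running, then SUCCESS/FAILURE/etc.
curl -s --user "$auth" "$jenkins_url/job/$job_name/lastBuild/api/json" | jq '.result'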


Curl on Windows using a Docker wrapper

Does the Windows built-in version of “curl” confuse or intimidate you?  Maybe you come from a Linux or Unix background and yearn for some of your favorite go-to tools?  Newer versions of PowerShell include a cmdlet for interacting with the web called Invoke-WebRequest, which is useful, but it is not a great drop-in replacement for those with experience in non-Windows environments.  The PowerShell cmdlets are a move in the right direction towards unifying CLI experiences, but there are still many folks, myself included, who have become attached to curl over the years.  It is worth noting that a Windows-compatible version of curl has existed for a long time; however, it has always been a nuisance dealing with the zip file, just as using SSH has always been a hassle on Windows.  It has always been possible to use the *nix equivalent tools; it is just clunky.

I found a low-effort solution for adding curl to my Windows CLI flow that acts as a nice middle ground between learning Invoke-WebRequest and installing curl binaries directly, which I’d like to share.  This alias trick is a simple way to use curl for working with APIs and other web testing in Windows environments without getting tangled up in managing versions or dealing with vulnerabilities.  Just pull the latest Docker image to update curl to the newest version, and don’t worry about its implementation across different systems.

Prerequisites are light.  First, make sure to have the Docker for Windows app installed (stable or beta are both fine) as well as a semi-recent version of PowerShell.

Next, if you haven’t set up a PowerShell profile, there are lots of links and resources about how to do it (I even wrote about it recently), so I’ll skip that step here.  Start by adding the following snippet to your PowerShell profile (by default located in C:\Users\<user>\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1) and saving.

# Curl alias using docker
function Docker-Curl {
   docker run --rm byrnedo/alpine-curl $args
}

# Aliases
New-Alias dcurl Docker-Curl

Then reload your profile (or just open a new terminal) and run the curl command that was just created.

dcurl -h
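To make sure everything is wired up, try a real request; httpbin.org below is just a convenient public test endpoint.  One caveat to keep in mind is that curl runs inside a container, so it cannot see local files unless you mount a directory into it:

# Simple GET to verify the alias works end to end
dcurl -s https://httpbin.org/get

# Mount the current directory if you need to send or save files
docker run --rm -v ${PWD}:/data -w /data byrnedo/alpine-curl -o /data/response.json https://httpbin.org/get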

One issue you might notice from the snippet above is that the Docker image is not an “official” image.  If this bothers you (security concerns, etc.), it is really easy to create your own image.  There are lots of examples of how to create minimal images with curl pre-installed.  Just be aware that your custom image will need to be maintained and occasionally rebuilt/published to guard against future vulnerabilities.  For brevity, I have skipped this process, but here’s an example of creating a custom image.
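As a rough sketch, a minimal image might look something like the following, using Alpine’s packaged curl; pin the base image version and rebuild on your own schedule:

# Minimal curl image based on Alpine
FROM alpine:3.20
# Install curl from the Alpine package repository
RUN apk add --no-cache curl
# Pass all container arguments straight to curl
ENTRYPOINT ["curl"]

Build it with docker build -t my-curl . and swap byrnedo/alpine-curl for my-curl in the profile function above.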

Optional

To update curl, just run the docker pull command.

docker pull byrnedo/alpine-curl

Now you have the best of both worlds.  The built-in Invoke-WebRequest cmdlet provided by Powershell is available, as well as the venerable curl command.

My number one use case for curl in a container is that curl has been around for such a long time (fewer bugs and edge cases) and it can be used for nearly any web-related task.  It is also much handier for those with a *nix background to reach for curl than to dig around in unfamiliar PowerShell docs for similar functionality.  Having the ability to run some of my favorite tools in an easy, reproducible way on Windows has been a refreshing experience while sliding back into the Windows world.


Backup Route 53 zones

We have all heard about DNS catastrophes.  I just read about a horror story on Reddit the other day, where an Azure root DNS zone was accidentally deleted with no backup.  I experienced a similar disaster a few years ago – a simple DNS change managed to knock out internal DNS for an entire domain, which contained hundreds of records.  Reading the post hit close to home, stirring up some of my own past anxiety, so I began poking around for solutions.  Immediately, I noticed that backing up DNS records is usually skipped over as part of the backup process.  Folks just tend to never do it, for whatever reason.

I did discover, though, that backing up DNS is easy.  So I decided to fix the problem.

I wrote a simple shell script that dumps all Route 53 zones for a given AWS account to json files and uploads them to an S3 bucket.  The script is only a handful of lines, which is perfect because it doesn’t take much effort to potentially save your bacon.

If you don’t host DNS in AWS, the script can be modified to work for other DNS providers (assuming they have public API’s).

Here’s the script:

#!/usr/bin/env bash

set -e

# Dump route 53 zones to a text file and upload to S3.

BACKUP_DIR=/home/<user>/dns-backup
BACKUP_BUCKET=<bucket>
# Use full paths for cron
CLIPATH="/usr/local/bin"

# Dump all zones to a file and upload to s3
function backup_all_zones () {
  local zones
  # Enumerate all zones
  zones=$($CLIPATH/aws route53 list-hosted-zones | jq -r '.HostedZones[].Id' | sed "s/\/hostedzone\///")
  for zone in $zones; do
    echo "Backing up zone $zone"
    $CLIPATH/aws route53 list-resource-record-sets --hosted-zone-id "$zone" > "$BACKUP_DIR/$zone.json"
  done

  # Upload backups to s3
  $CLIPATH/aws s3 cp "$BACKUP_DIR" "s3://$BACKUP_BUCKET" --recursive --sse
}

# Create backup directory if it doesn't exist
mkdir -p $BACKUP_DIR
# Back up all the things
time backup_all_zones

Be sure to update the <user> and <bucket> values in the script to match your own environment.  Dumping the DNS records to json is nice because it allows for a more programmatic way of working with the data.  This script can be run manually, but it is much more useful if run automatically.  Just add the script to a cronjob and schedule it to dump DNS periodically.
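For example, a crontab entry along these lines would dump and upload all zones nightly; the script path and log file are placeholders, so adjust them for your environment:

# Back up all Route 53 zones every night at 2:00 AM
0 2 * * * /home/<user>/dns-backup/backup-route53.sh >> /var/log/dns-backup.log 2>&1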

For this script to work, the aws cli and jq need to be installed.  The installation is skipped in this post, but is trivial.  Refer to the links for instructions.

The aws cli needs to be configured with an API key that has read access to Route 53 and write access to the S3 bucket.  Details are skipped for this step as well – be sure to consult the AWS documentation on setting up IAM permissions for help with API keys.  Another, simpler approach is to use a pre-existing key with admin credentials (not recommended).
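If you do scope the key down, the script only needs a few actions.  Here is a rough sketch of what such an IAM policy might look like; the bucket name is a placeholder:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "route53:ListHostedZones",
        "route53:ListResourceRecordSets"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket>/*"
    }
  ]
}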


Limit Jenkins Multibranch Pipeline Builds

Update: Thanks to Torben Kerr for adding instructions for setting up these properties at the multibranch build level rather than for each Jenkinsfile.  You can check out his “fix” here.

As Jenkins pipeline functionality continues to rapidly evolve, the project documentation (or lack thereof) has been a consistent pain point for users.  Invariably, the documentation is either out of date or completely missing.  I expect the docs to improve as the project matures, but for now, the cake is a lie.  I ran into this roadblock recently while looking for a way to limit the number of concurrent builds that happen in Jenkins using the pipeline.  After all of my anguish, I hope this post will help others avoid the tedium of hunting down this seemingly simple bit of functionality, and give some insight into strategies for finding undocumented features in Jenkins.

While this feature is fairly obvious for old-style Jenkins jobs – a simple checkbox in the job configuration – finding the same functionality for pipelines is seemingly impossible.  Through extensive Googling and Stack Overflowing, I discovered this feature was recently added to the Multibranch plugin.  Specifically, I found an issue in the (awful) issue tracker used by Jenkins, which in turn led me to some code in a semi-recent PR that basically allows concurrency to be turned on or off.  Of course, when I tried to use the code from the PR it didn’t work right away.  So I had to go deeper.

Eventually, I stumbled across a Stack Overflow post that discusses how to use the properties functionality of pipelines.  Equipped with this new piece of information, I finally had enough substance to start playing around with the code.  To make the creation of pipelines easier, Jenkins also recently added a snippet generator, which allows users to build out sample snippets quickly.

To use the snippet generator, either drill into an existing pipeline style job using a similar URL as below:

https://jenkins.example.com/job/<jobname>/pipeline-syntax/

Or create a new job, and click on the “Pipeline Syntax” link after it has been created to test out different snippets.

[screenshot: the “Pipeline Syntax” snippet generator]

Inside the snippet generator there are a number of “steps” to choose from.  From the information I had already gathered, I selected the properties step to create the basic skeleton of what I wanted, and was able to use the disableConcurrentBuilds() function I found earlier.  Below is a snippet of what the code in your Jenkinsfile might actually look like:

node {
    // This one-liner is what limits concurrent builds
    properties([disableConcurrentBuilds()])

    // Do stuff
    ...
}

Yep.  That’s it.  Just make sure to put the properties() step at the beginning of the node block, otherwise concurrency won’t be adjusted right away, which could lead to problems.  Another thing to note: since the code is just Groovy, the step to disable concurrency could easily be moved into a workflow library and applied at the beginning of all jobs if you wanted to limit concurrency for all pipeline builds (see the sketch below).  Finally, the code disables concurrent builds on a per-branch basis.  Essentially, if you push many different branches it will still build all of them; it will just limit each branch to one build at a time and will queue up jobs for any commits that get pushed after the initial job has been created.  I know that is a mouthful, so let me know in the comments if this explanation needs any clarification.
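For illustration, a shared library step wrapping this behavior might look something like the sketch below; the vars/ file name and the step name are hypothetical, following the standard global library layout:

// vars/limitedNode.groovy - hypothetical shared library step
def call(Closure body) {
    node {
        // Disable concurrent builds for every job that uses this wrapper
        properties([disableConcurrentBuilds()])
        body()
    }
}

A Jenkinsfile could then call limitedNode { ... } instead of a bare node block.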

While I love open source software, sometimes projects move so fast that certain areas get neglected.  I am thankful for things like GitHub, because I was able to use it to piece together all the other information I found to come up with a solution.  But I would argue that good documentation not only saves folks like me the time and energy of the crazy searches, it also makes it much easier for new users to look at a project and understand what is going on.  I will be 100% honest and say that Jenkins pipelines are not for the faint of heart, and I’m sure there are many others who will agree with this sentiment.  I know it is easier said than done, but anything would be an improvement at this point.
