- Defining Shared Libraries
- Trusted Versus Untrusted Libraries
- Internal Versus External Libraries
- Library Scope Within Jenkins Items
- Using Libraries
- Loading Libraries
- @Library syntax:
- The libraries directive
- Writing libraries
- Library Structure
- src
- vars
- Defining global variables
- Defining custom steps
- Defining a more structured DSL
- resources
- Using Third-Party Libraries
- Troubleshooting
- Replaying External Code and Libraries
- Sample Library Routine
- Summary
- Reference
Defining Shared Libraries
Trusted Versus Untrusted Libraries
Trusted Libraries run with full execution rights and are not restricted by the Groovy sandbox. This type of library can execute arbitrary Groovy code, reference external dependencies and tools, and use a version control system to manage its code and versions, so it can perform sensitive operations and access restricted resources.
Untrusted Libraries are less privileged and run inside the Groovy sandbox. This type of library can only execute restricted Groovy code, can only use the steps and functions provided by Jenkins, and cannot reference external dependencies and tools.
Internal Versus External Libraries
Library Scope Within Jenkins Items
Using Libraries
Loading Libraries
Load implicitly: If checked, scripts will automatically have access to this library without needing to request it via @Library.
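For example, with implicit loading turned on, a Pipeline can call the library's steps without any annotation at all (a minimal sketch, assuming the library defines the sayHello step described later in this chapter):
// No @Library annotation needed; the library is loaded implicitly
node {
    sayHello 'Joe'
}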
@Library syntax:
@Library('<libname>[@<version>]')_ [<import statement>]
A couple of points about the syntax:
- The library name is required.
- The version should be preceded by the @ sign.
- The version can be a tag, branch name, or other specification of a revision in the source code repository.
- Specific subsets of methods can be imported by including an import statement at the end of the annotation or on the next line.
- An import statement is not required. If one is not specified, all methods will be imported.
- If no import statement is specified, then an underscore (_) must be placed at the end of the annotation, directly after the closing parenthesis. (This is required since an annotation needs something to annotate by definition. In this case, the _ is simply serving as a placeholder.)
- Multiple library names (with respective versions if desired) can be specified in the same annotation. Just separate them with commas.
Here are some simple examples:
// Load the default version of a library
@Library('myLib')_
// Override the default version and load a specific version of a library
@Library('yourLib@2.0')_
// Accessing multiple libraries with one statement
@Library(['myLib', 'yourLib@master'])_
// Annotation with import
@Library('myLib@1.0') import static org.demo.Utilities.*
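The import can also be placed on the line following the annotation, which is sometimes easier to read (a small sketch complementing the examples above):
// Annotation with the import on the next line
@Library('myLib@1.0')
import org.demo.Utilities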
The libraries directive
Within a Declarative Pipeline, we have one other option for pulling in libraries.
pipeline {
    agent any
    libraries {
        lib("mylib@master")
        lib("alib")
    }
    // ... stages, etc.
}
Writing libraries
Library Structure
(root)
+- src # Groovy source files
| +- org
| +- foo
| +- Bar.groovy # for org.foo.Bar class
+- vars
| +- foo.groovy # for global 'foo' variable
| +- foo.txt # help for 'foo' variable
+- resources # resource files (external libraries only)
| +- org
| +- foo
| +- bar.json # static helper data for org.foo.Bar
src
The src directory should look like standard Java source directory structure. This directory is added to the classpath when executing Pipelines.
Any Groovy code is valid to use here. However, in most cases, you’ll probably want to invoke some kind of pipeline processing, using actual pipeline steps. There are several options for how to implement the step calls within the library, and correspondingly, how to invoke them from the script.
Here are some examples of things you could have in the src area:
- Library classes cannot directly call steps such as sh or git. They can however implement methods, outside of the scope of an enclosing class, which in turn invoke Pipeline steps, for example:
// src/org/foo/Zot.groovy
package org.foo

def checkOutFrom(repo) {
    git url: "git@github.com:jenkinsci/${repo}"
}

return this
Which can then be called from a Scripted Pipeline:
def z = new org.foo.Zot()
z.checkOutFrom(repo)
This approach has limitations; for example, it prevents the declaration of a superclass.
- Alternately, a set of DSL steps can be passed explicitly using this to a library class, in a constructor, or just one method:
package org.foo

class Utilities implements Serializable {
    def steps
    Utilities(steps) { this.steps = steps }
    def mvn(args) {
        steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
    }
}
When saving state on classes, such as above, the class must implement the Serializable interface. This ensures that a Pipeline using the class, as seen in the example below, can properly suspend and resume in Jenkins.
@Library('utils') import org.foo.Utilities

def utils = new Utilities(this)
node {
    utils.mvn 'clean package'
}
If the library needs to access global variables, such as env, those should be explicitly passed into the library classes, or methods, in a similar manner.
package org.foo

class Utilities implements Serializable {
    def steps
    def env
    Utilities(steps, env) {
        this.steps = steps
        this.env = env
    }
    def mvn(args) {
        steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
        steps.sh "echo Building for ${env.BUILD_TAG}"
    }
}
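A Pipeline using this variant would then pass both the step context and env into the constructor (a short sketch following the same pattern as the earlier utils example):
@Library('utils') import org.foo.Utilities

def utils = new Utilities(this, env)
node {
    utils.mvn 'clean package'
}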
- Instead of passing numerous variables from the Scripted Pipeline into a library, the entire script context (this) can be passed in and accessed as needed:
package org.foo

class Utilities {
    static def mvn(script, args) {
        script.sh "${script.tool 'Maven'}/bin/mvn -s ${script.env.HOME}/jenkins.xml -o ${args}"
    }
}
The above example shows the script being passed in to one static method, invoked from a Scripted Pipeline as follows:
@Library('utils') import static org.foo.Utilities.*

node {
    mvn this, 'clean package'
}
vars
The vars directory hosts script files that are exposed as a variable in Pipelines. The name of the file is the name of the variable in the Pipeline.
The basename of each .groovy file should be a Groovy (~ Java) identifier, conventionally camelCased. The matching .txt, if present, can contain documentation, processed through the system’s configured markup formatter (so may really be HTML, Markdown, etc., though the .txt extension is required). This documentation will only be visible on the Global Variable Reference pages that are accessed from the navigation sidebar of Pipeline jobs that import the shared library. In addition, those jobs must run successfully once before the shared library documentation will be generated.
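For instance, help for the sayHello step defined later in this chapter could live in a hypothetical vars/sayHello.txt (the content below is purely illustrative):
Greets the named user, defaulting to 'human' when no name is given.

Usage: sayHello 'Joe'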
The Groovy source files in these directories get the same “CPS transformation” as in Scripted Pipeline.
Defining global variables
Internally, scripts in the vars directory are instantiated on-demand as singletons. This allows multiple methods to be defined in a single .groovy file for convenience. For example:
// vars/log.groovy
def info(message) {
echo "INFO: ${message}"
}
def warning(message) {
echo "WARNING: ${message}"
}
@Library('utils') _
log.info 'Starting'
log.warning 'Nothing to do!'
Declarative Pipeline does not allow method calls on objects outside "script" blocks (JENKINS-42360). The method calls above would need to be put inside a script directive:
@Library('utils') _
pipeline {
agent none
stages {
stage ('Example') {
steps {
// log.info 'Starting'
script {
log.info 'Starting'
log.warning 'Nothing to do!'
}
}
}
}
}
Defining custom steps
Shared Libraries can also define global variables which behave similarly to built-in steps, such as sh or git. Global variables defined in Shared Libraries must be named with all lowercase or "camelCased" in order to be loaded properly by Pipeline.
For example, to define sayHello, the file vars/sayHello.groovy should be created and should implement a call method. The call method allows the global variable to be invoked in a manner similar to a step:
def call(String name = 'human') {
// Any valid steps can be called from this code, just like in other
// Scripted Pipeline
echo "Hello, ${name}."
}
The Pipeline would then be able to reference and invoke this variable:
sayHello 'Joe'
sayHello() /* invoke with default arguments */
If called with a block, the call method will receive a Closure. The type should be defined explicitly to clarify the intent of the step, for example:
// vars/windows.groovy
def call(Closure body) {
node('windows') {
body()
}
}
The Pipeline can then use this variable like any built-in step which accepts a block:
windows {
bat "cmd /?"
}
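A custom step can also accept named parameters together with a block; Groovy collects the named parameters into a Map that arrives ahead of the trailing Closure. Here is a minimal sketch using a hypothetical vars/runOnLabel.groovy (the step name and label parameter are illustrative, not part of the original examples):
// vars/runOnLabel.groovy (hypothetical)
def call(Map config = [:], Closure body) {
    // fall back to the 'windows' label when no label parameter is supplied
    node(config.label ?: 'windows') {
        body()
    }
}
It could then be invoked as runOnLabel(label: 'linux') { sh 'make' } or simply runOnLabel { bat 'cmd /?' }.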
Defining a more structured DSL
If you have a lot of Pipelines that are mostly similar, the global variable mechanism provides a handy tool to build a higher-level DSL that captures the similarity. For example, all Jenkins plugins are built and tested in the same way, so we might write a step named buildPlugin:
// vars/buildPlugin.groovy
def call(Map config) {
node {
git url: "https://github.com/jenkinsci/${config.name}-plugin.git"
sh 'mvn install'
mail to: '...', subject: "${config.name} plugin build", body: '...'
}
}
Assuming the script has either been loaded as a Global Shared Library or as a Folder-level Shared Library, the resulting Jenkinsfile will be dramatically simpler:
buildPlugin name: 'git'
There is also a “builder pattern” trick using Groovy’s Closure.DELEGATE_FIRST, which permits Jenkinsfile to look slightly more like a configuration file than a program, but this is more complex and error-prone and is not recommended.
// vars/timedCommand4.groovy
def call(body) {
    // collect assignments passed in into our mapping
    def settings = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = settings
    body()
    // now, time the command and capture its output
    def cmdOutput
    timestamps {
        cmdOutput = sh(script: "${settings.cmd}", returnStdout: true).trim()
    }
    echo cmdOutput
    writeFile file: "${settings.logFilePath}", text: "${cmdOutput}"
}
In this form, we declare a Groovy map via the def settings = [:] syntax. Then the values we pass in get mapped and we can execute whatever other steps we need to. The references to delegate here have to do with Groovy functionality. A complete discussion of delegation behavior in Groovy is beyond the scope of this section, but you can essentially think of it as telling Groovy to allow us to reference any values passed in utilizing the mapping we’re doing in this function.
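As a rough illustration of what Closure.DELEGATE_FIRST does, the same mapping can be reproduced in plain Groovy outside of a Pipeline (illustrative only; the body closure here stands in for the block passed to the step):
// Plain Groovy: with DELEGATE_FIRST, unqualified assignments inside the
// closure are routed to the delegate, which is our settings map
def settings = [:]
def body = {
    cmd = 'sleep 5'
    logFilePath = 'log.out'
}
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = settings
body()
assert settings.cmd == 'sleep 5'
assert settings.logFilePath == 'log.out'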
Note that here, as in other vars steps, you should only use valid pipeline steps. Non-step Groovy code may not work or may have uncertain behavior.
With this form, we can invoke the code from our pipeline script very simply, as shown here:
timedCommand4 {
    cmd = 'sleep 5'
    logFilePath = 'log.out'
}
resources
A resources directory allows the libraryResource step to be used from an external library to load associated non-Groovy files. Currently this feature is not supported for internal libraries.
def request = libraryResource 'com/mycorp/pipeline/somelib/request.json'
The file is loaded as a string, suitable for passing to certain APIs or saving to a workspace using writeFile.
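Once loaded, the string can be handed to other steps; for example, it can be written out so that tools in the workspace can read it (a minimal sketch reusing the request variable from above):
node {
    def request = libraryResource 'com/mycorp/pipeline/somelib/request.json'
    // make the bundled resource available as a file in the workspace
    writeFile file: 'request.json', text: request
}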
Using Third-Party Libraries
TBD
Troubleshooting
Replaying External Code and Libraries
Sample Library Routine
Summary
This document provides a comprehensive guide to using shared libraries in Jenkins. It covers the differences between trusted and untrusted libraries, internal and external libraries, and how to load and use libraries in your pipelines. It also includes information on defining global variables and custom steps, and troubleshooting tips.