Jenkins is an open source automation server that enables developers to reliably build, test, and deploy applications. Two of the primary ways to define workload automation in Jenkins are via jobs and pipelines.
Jobs in Jenkins
A Jenkins job is the basic buildable item in Jenkins. It is essentially an execution unit that defines the end-to-end build process for a particular task or set of tasks. Jobs are configured via the web UI or config files to trigger builds that execute a series of steps defined in the job.
Some key aspects of Jenkins jobs:
- Jobs are self-contained execution units, though they can be chained by configuring upstream or downstream job triggers
- Jobs are defined via XML config files or via the web UI
- Jobs consist of a series of discrete build steps executed in order
- Build triggers can be configured to automatically start jobs on a schedule, changes in source control, etc.
- Jobs can be parameterized to accept input values at runtime
- Jobs manage their own workspace on the Jenkins server to check out sources, save artifacts, etc.
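Under the hood, each job's configuration lives in a `config.xml` file on the controller. The sketch below shows roughly what a freestyle job's config might look like; the exact element names come from the freestyle project schema, and the trigger schedule and shell command here are purely illustrative:

```xml
<!-- config.xml for a freestyle job (illustrative sketch) -->
<project>
  <description>Example build job</description>
  <!-- Poll source control every ~15 minutes -->
  <triggers>
    <hudson.triggers.SCMTrigger>
      <spec>H/15 * * * *</spec>
    </hudson.triggers.SCMTrigger>
  </triggers>
  <!-- Build steps run in order; here, a single shell step -->
  <builders>
    <hudson.tasks.Shell>
      <command>make test</command>
    </hudson.tasks.Shell>
  </builders>
</project>
```

Because this XML lives on the Jenkins controller rather than in the project's repository, changes to it are made through Jenkins itself, a point the Configuration section below returns to.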
Some common examples of how Jenkins jobs are used:
- Execute a build process – compile code, run tests, create artifacts
- Deploy applications to various environments
- Run automated tests
- Package software
Pipelines in Jenkins
Jenkins Pipeline (or simply “Pipeline” with a capital “P”) is a suite of plugins that supports implementing and integrating continuous delivery pipelines into Jenkins. A Jenkins Pipeline is defined in a text file called a Jenkinsfile which contains the pipeline steps to be executed.
Some key aspects of Jenkins pipelines:
- Pipelines are defined in Jenkinsfile text files checked into source control
- Pipeline steps are written in a Groovy-based DSL in the Jenkinsfile
- Steps can execute shell scripts or call other Jenkins jobs
- Common conventions for structured pipelines: stages, parallel workflows, environment directives
- Rich ecosystem of plugins provides extensible building blocks for pipelines
- Pipelines support pausing for human input, resuming after controller restarts, and replaying runs
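Putting these aspects together, a minimal Declarative Jenkinsfile might look like the following; the `make` targets and artifact path are placeholders for a project's actual build commands:

```groovy
// Jenkinsfile (Declarative Pipeline) -- a minimal sketch
pipeline {
    agent any                       // run on any available agent
    environment {
        APP_ENV = 'ci'              // environment directive: visible in all stages
    }
    stages {
        stage('Build') {
            steps {
                sh 'make build'     // step executing a shell command
            }
        }
        stage('Test') {
            steps {
                sh 'make test'
            }
        }
    }
    post {
        always {
            // archive build outputs whether the run passed or failed
            archiveArtifacts artifacts: 'build/**', allowEmptyArchive: true
        }
    }
}
```

Checked into the repository alongside the code it builds, this one file defines the entire workflow: agents, environment, stages, steps, and post-build actions.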
Some common examples of how pipelines are used:
- Implement Continuous Delivery workflows – build, test, analysis, staging, production
- Model complex workflows with parallel stages
- Standardize and templatize workflows across projects
- Integrate testing, security, and compliance practices
Key Differences Between Jobs and Pipelines
While Jenkins jobs and pipelines have some overlap in capabilities, there are some key differences between the two:
| Jobs | Pipelines |
| --- | --- |
| Configured via the web UI or XML config files | Configured via Jenkinsfile files in source control |
| Discrete, largely standalone execution units | Model end-to-end workflows |
| Linear, step-by-step execution | Structured, stage-based execution |
| Older, freestyle-era approach | Newer pipeline-as-code approach |
| Suited to simple builds and tasks | Suited to complex workflows |
Configuration
One major difference is how jobs and pipelines are configured. Jenkins jobs are configured via the web UI or XML config files, while pipelines are configured via Jenkinsfile text files checked into source control.
Defining jobs in the UI or config files couples the job logic to the Jenkins controller. Changes require reconfiguring the job in Jenkins itself. But Jenkinsfiles are checked into source control and treat the pipeline as code. Changes are tracked in version control and automatically picked up by Jenkins.
Execution Model
Jobs provide an imperative, step-by-step execution model where each discrete build step runs linearly in order. Pipelines, by contrast, are defined in a structured script, written in either Declarative or Scripted syntax. The Jenkinsfile allows modeling complex workflows with stages, parallel flows, environment handling, and more.
Jobs operate independently while pipelines model entire workflows end-to-end. Jobs work best for simple build tasks while pipelines are better suited for orchestrating complex CD processes with multiple integrated steps.
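For instance, a pipeline can fan out independent test suites and run them concurrently, something a single linear job cannot express. A sketch of a parallel stage inside a Declarative Pipeline (the stage names and `make` targets are illustrative):

```groovy
// Fragment of a Declarative Pipeline: two test suites run concurrently.
// Both branches must succeed for the enclosing stage to pass.
stage('Tests') {
    parallel {
        stage('Unit') {
            steps { sh 'make unit-test' }
        }
        stage('Integration') {
            steps { sh 'make integration-test' }
        }
    }
}
```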
Extensibility
Both jobs and pipelines can be extended with plugins. But the pipeline ecosystem provides many more modular building blocks via plugins. Steps for tool integrations, notifications, pipeline stage features, etc. make pipelines highly extensible.
For example, the Pipeline Maven Integration plugin, the Pipeline Utility Steps plugin, and GitHub-oriented pipeline libraries provide reusable steps and functions that help standardize pipeline scripts.
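As a small illustration, the Pipeline Utility Steps plugin contributes steps such as `readJSON` for parsing files inside a pipeline; the file name and field below are hypothetical:

```groovy
// Requires the Pipeline Utility Steps plugin to be installed.
// Reads a JSON config file from the workspace into a map-like object.
def cfg = readJSON file: 'build-config.json'   // file name is illustrative
echo "Deploy target: ${cfg.target}"            // 'target' is an assumed field
```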
Reusability and Templatization
Jobs are typically created as standalone units with minimal reuse across jobs. But the pipeline-as-code model encourages reuse of Pipeline code across projects.
Shared libraries allow defining common functions to call from pipeline scripts. Template catalogs standardize and approve pipelines for reuse across teams. This templatization streamlines onboarding of new pipelines.
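A shared library exposes global steps through files in its `vars/` directory. A minimal sketch, where the library name, function name, and `make` target are all illustrative:

```groovy
// Shared library file: vars/standardBuild.groovy (names are illustrative)
// Defines a reusable step callable from any consuming Jenkinsfile.
def call(String makeTarget = 'build') {
    node {
        checkout scm                 // check out the calling project's source
        sh "make ${makeTarget}"      // run the standardized build step
    }
}

// In a consuming Jenkinsfile:
// @Library('my-shared-library') _
// standardBuild('test')
```

Teams maintain the library in one repository, so a fix or convention change propagates to every pipeline that calls it.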
When to Use Jobs vs Pipelines
In summary:
- Use Jobs for simple, linear build tasks and workflows
- Use Pipelines to model complex, structured CD workflows with reuse and templatization
In practice, many Jenkins instances use a combination of jobs and pipelines.
Jobs are useful for executing standalone tasks independent of any larger workflow – builds, tests, deployments, etc. Jobs integrate nicely with pipelines for reusable build steps.
Pipelines focus on end-to-end workflow orchestration. They provide the structure, reusable conventions, and templatization for advanced Continuous Delivery practices.
For simple build execution on legacy Jenkins instances, jobs may be sufficient. But pipelines facilitate pipeline-as-code approaches and advanced modern CD architectures.
Conclusion
Jenkins jobs and pipelines serve complementary purposes:
- Jobs – configurable build execution units for simple workflows
- Pipelines – structured pipeline-as-code for complex CD flows
Jobs have a discrete, linear execution model while pipelines provide declarative, structured scripts. Pipelines emphasize code reuse, templatization, and end-to-end workflows.
In practice, jobs and pipelines can interoperate. Jobs handle individual build tasks while pipelines focus on orchestrating reusable CD processes.
The pipeline model is key for implementing modern continuous delivery patterns. As Jenkins evolves, pipelines are becoming the primary method for automating complex development workflows.