Deploying with Ansible from GitLab

Published on 11-04-21

Many of us deal with custom, scripted operations in our CI/CD pipelines. These scripts are often engineered from an immediate need, without much thought for long-term maintenance, extensibility or transferability, and they are typically not well documented. One way to overcome these issues is to automate your IT operations using an industry-proven framework that accounts for all of the above. My framework of choice is Ansible, an open source community project sponsored by Red Hat, which delivers standardized and well-documented tools and plugins to implement enterprise-wide automation.

With Ansible in place, you can write so-called playbooks to "offer a repeatable, re-usable, simple configuration management and multi-machine deployment system, one that is well suited to deploying complex applications". Both the playbooks and the ecosystem you're deploying to are specific to the project you're automating.

To use Ansible in GitLab, we can leverage GitLab's ability to run pipeline jobs based on (custom) Docker images. Using this approach, we'll effectively set up two GitLab projects:

  1. A project to build a generic Ansible Docker image, to be used in downstream projects and jobs
  2. A project using the generic Ansible Docker image, to run an example playbook

Building a generic Ansible Docker image

To build the Docker image and provide it as a base image for other projects, we first set up a separate project in GitLab, containing a Dockerfile and the pipeline configuration that builds the image from it. To do so, follow the steps below or clone my sample repository as a start.

At the time of writing no community image was available, so I had to package Ansible and the required plugins myself. Here is my minimal Dockerfile to run Ansible in an Alpine-based Docker container:

FROM python:3-alpine

# Install Alpine build dependencies (compiler and headers needed to build Python packages on musl)
RUN apk add --no-cache build-base libffi-dev

# Install Python modules
RUN pip install --upgrade pip
RUN pip install ansible

# Install Ansible plugins
RUN ansible-galaxy collection install community.general

Next, we need to set up our pipeline configuration for GitLab, which builds a Docker image with this Dockerfile. A minimal version of the .gitlab-ci.yml achieving this:

build:
  stage: build
  image:
    name: gcr.io/kaniko-project/executor:debug
    entrypoint: [""]
  script:
    - mkdir -p /kaniko/.docker
    - echo "{\"auths\":{\"$CI_REGISTRY\":{\"username\":\"$CI_REGISTRY_USER\",\"password\":\"$CI_REGISTRY_PASSWORD\"}}}" > /kaniko/.docker/config.json
    - /kaniko/executor --context $CI_PROJECT_DIR --dockerfile $CI_PROJECT_DIR/Dockerfile --destination $CI_REGISTRY_IMAGE:${CI_COMMIT_REF_SLUG}

Combining the above, every commit to your main branch will create a new Docker image, tagged main, in your GitLab Container Registry. In production pipelines, more deliberate container tagging based on Git tags should be introduced to safeguard compatibility and stability, but that is out of scope for this article.
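That said, to give a rough idea of the direction, a tag-driven variant of the kaniko job could look like the sketch below, which only runs for Git tags and reuses the tag as the image tag. The job name and the rules clause are assumptions for illustration, not part of the minimal setup above:

build-release:
  stage: build
  image:
    name: gcr.io/kaniko-project/executor:debug
    entrypoint: [""]
  rules:
    - if: $CI_COMMIT_TAG            # only build versioned images for tagged commits
  script:
    - mkdir -p /kaniko/.docker
    - echo "{\"auths\":{\"$CI_REGISTRY\":{\"username\":\"$CI_REGISTRY_USER\",\"password\":\"$CI_REGISTRY_PASSWORD\"}}}" > /kaniko/.docker/config.json
    - /kaniko/executor --context $CI_PROJECT_DIR --dockerfile $CI_PROJECT_DIR/Dockerfile --destination $CI_REGISTRY_IMAGE:$CI_COMMIT_TAG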

Using the generic Ansible Docker image

To use the Ansible Docker image built above to automate tasks in our project, we need to set up the pipeline configuration and provide an initial Ansible playbook to work with. To do so, follow the steps below or clone my sample repository as a start.

Ansible requires two files at minimum to operate:

  1. A playbook in YAML format, specifying the Ansible tasks to execute
  2. An inventory in YAML format, listing the target hosts to run the Ansible playbook on

Building the playbook

To test our Ansible setup, we'll add a minimal playbook.yml to the project. To learn more about playbooks, consult the official Ansible documentation on playbooks. A minimal version of a playbook that just prints Hello world! in our GitLab pipeline looks like this:

---
- name: Run sample playbook in Docker
  hosts: localhost                          # run this Ansible playbook against localhost (i.e. in the GitLab Runner)
  connection: local                         # connect to Ansible using a local socket
  tasks:
    - name: Log message
      debug:
        msg: "Hello world!"

Setting up the inventory

In order to run our playbooks against one or more hosts in our ecosystem, we need to define our inventory in the inventory.yml file. To learn more about inventories, consult the official Ansible documentation on inventories. A minimal version of the inventory, which just allows us to run Ansible playbooks against localhost, looks like this:

all:
  hosts:
    localhost:
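
For reference, an inventory that also targets remote machines could look roughly like the sketch below; the group name, hostnames and SSH user are hypothetical placeholders for your own ecosystem:

all:
  hosts:
    localhost:
  children:
    webservers:                     # hypothetical group of remote hosts
      hosts:
        app1.example.com:           # placeholder hostnames
        app2.example.com:
      vars:
        ansible_user: deploy        # example SSH user

Note that running against remote hosts from a GitLab runner also requires SSH connectivity and credentials (for example provided through CI/CD variables), which is beyond the minimal localhost setup used in the rest of this article.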

Setting up the pipeline

Having both our playbook and inventory in place, we're now ready to fire up Ansible through our previously built Docker image. A minimal version of the .gitlab-ci.yml achieving this:

stages:
  - build

build:
  image: registry.gitlab.com/briansnijders/docker-ansible:main
  stage: build
  only:
    - main
  script:
    - >-
      ansible-playbook
      --inventory inventory.yml
      playbook.yml

Combining the above, every commit to your main branch will:

  • start a new pipeline job on a GitLab runner, using our Ansible Docker image as the job image
  • run the Ansible playbook defined in playbook.yml against the inventory defined in inventory.yml.

Further tailoring playbook.yml to your project-specific automation needs allows you to automate almost every task in your project. Consult the Ansible documentation or the Ansible collections to start extending your playbook.
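
As a rough illustration of what such tailoring could look like, the sketch below extends the sample playbook with a task that reads a predefined GitLab CI variable and a task that uses the community.general collection installed in our image. The file names and task names are hypothetical examples, not part of the setup above:

---
- name: Run tailored playbook in Docker
  hosts: localhost
  connection: local
  tasks:
    - name: Show which commit is being processed
      debug:
        msg: "Running for commit {{ lookup('env', 'CI_COMMIT_SHORT_SHA') }}"   # CI_COMMIT_SHORT_SHA is a predefined GitLab CI variable

    - name: Write a build info file                 # hypothetical example task
      copy:
        content: "Built from {{ lookup('env', 'CI_COMMIT_REF_SLUG') }}\n"
        dest: ./build-info.txt

    - name: Compress the build info file            # uses the community.general collection installed in the image
      community.general.archive:
        path: ./build-info.txt
        dest: ./build-info.txt.gz

Because these tasks only touch files inside the runner's workspace, this variant still runs unmodified in the pipeline defined above.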