Ansible roles and collections on Ansible Galaxy

Posted by oxi on Tue, Oct 31, 2023

Background

In my day job as a Linux System Engineer I write a lot of Linux automation (often Ansible things). Sadly, very little of it is made public, due to sensitive information, cost, or simply a lack of interest from the customer or employer. My personal home lab is not nearly as complex as the environments at work, but I would rather write a role once than do things manually multiple times, so I try to adopt the infrastructure-as-code idea in my home lab as much as possible.

I dreaded this step for a long time, since it was VERY time-consuming, but I migrated all my roles that are still in use and not “internal only” from my own GitLab instance to GitHub. All roles are called role-NAME and the collections containing the roles are called collection-NAME.

I split up the 68 roles into five collections within my Ansible Galaxy namespace:

  • oxivanisher.linux_base
  • oxivanisher.linux_desktop
  • oxivanisher.linux_server
  • oxivanisher.raspberry_pi
  • oxivanisher.windows_desktop
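
Once published, each of these can be installed straight from Ansible Galaxy, for example:

# Install one of the collections (and its declared dependencies) from Ansible Galaxy
ansible-galaxy collection install oxivanisher.linux_base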

Migration to GitHub

First, I decided which roles to migrate and bundled them up into collections. Once I had that, I used the following script to migrate the roles to GitHub. I did this in batches just to make sure everything was working as expected, but theoretically you could use it to migrate all roles at once. The script performs the following tasks for each role:

  1. Clones it to a temporary directory
  2. Renames the branch master to main if it exists
  3. Replaces the linter pipeline I used on GitLab with a corresponding GitHub action
  4. Updates the $ANSIBLE_DIR/requirements.yml file
  5. Archives the role on GitLab

migrate_roles.sh

#!/bin/bash
set -e

echo "Preparing environment"

GITHUB_TOKEN=github_pat_SOMETHING_VERY_SECURE
GITLAB_TOKEN=ALSO_VERY_SECURE
GITLAB_URL=https://gitlab.somewhere.lan/api/v4
TMP_DIR=~/.ansible
ANSIBLE_DIR=~/ansible

mkdir -p $TMP_DIR

REPOS=$'
  git@gitlab.somewhere.lan:ansible/role-win_debloat.git
  git@gitlab.somewhere.lan:ansible/role-win_explorer.git
  git@gitlab.somewhere.lan:ansible/role-win_startmenu.git
  git@gitlab.somewhere.lan:ansible/role-win_taskbar.git
'

for REPO in $REPOS
do
  echo -e "\n\nWorking on $REPO"
  cd $TMP_DIR

  REPO_NAME=$(echo $REPO | sed "s|git@gitlab.somewhere.lan:ansible/||g" | sed "s/.git//g")
  echo "> Discovered repo name: $REPO_NAME"

  git clone $REPO
  cd $REPO_NAME

  if [ "$(git rev-parse --abbrev-ref HEAD)" == "master" ];
  then
    echo "> Renaming master branch to main"
    git branch -m master main
  fi

  echo "> Creating github repo"
  curl -s -H "Authorization: token $GITHUB_TOKEN" --data "{\"name\":\"$REPO_NAME\"}" https://api.github.com/user/repos

  echo "> Updating repo origin url"
  git remote set-url origin git@github.com:YOUR_GITHUB_USER/$REPO_NAME.git

  echo "> Fix linting"
  git rm .gitlab-ci.yml
  mkdir -p .github/workflows/
  cp $TMP_DIR/push.yml .github/workflows/push.yml
  git add .github/workflows/push.yml
  git commit -m "migrate pipeline from gitlab to github"

  echo "> Pushing data to github"
  git push -u origin main

  NEW_ORIGIN_URL=$(git config --get remote.origin.url)
  echo "> Fix requirements file for ansible from $REPO to $NEW_ORIGIN_URL"
  sed -i "s|$REPO|$NEW_ORIGIN_URL|g" $ANSIBLE_DIR/requirements.yml

  echo "> Update ansible requirements file"
  cd $ANSIBLE_DIR
  git add requirements.yml
  git commit -m "Migrate role $REPO_NAME to github"
  git push

  echo "> Archive repo in gitlab"

  # Function to get the project ID by repository name
  get_project_id() {
    local search_response
    search_response=$(curl --request GET --header "PRIVATE-TOKEN: $GITLAB_TOKEN" "$GITLAB_URL/projects?search=$1")

    # Extract the project ID from the response
    project_id=$(echo "$search_response" | jq -r '.[0].id')
    echo "$project_id"
  }

  # Function to archive a project by project ID
  archive_project() {
    local project_id="$1"
    local archive_response
    archive_response=$(curl --request POST --header "PRIVATE-TOKEN: $GITLAB_TOKEN" "$GITLAB_URL/projects/$project_id/archive")

    # Check if the archive request was successful based on the "archived" field
    archived=$(echo "$archive_response" | jq -r '.archived')

    if [ "$archived" == "true" ]; then
      echo "> Repository '$REPO_NAME' (Project ID: $project_id) has been archived."
    else
      echo "> Failed to archive repository '$REPO_NAME' (Project ID: $project_id)."
      echo "> Response: $archive_response"
    fi
  }

  # Main script logic
  project_id=$(get_project_id "$REPO_NAME")

  if [ -n "$project_id" ]; then
    archive_project "$project_id"
  else
    echo "> Repository '$REPO_NAME' not found."
  fi

  echo "> Cleaning up $TMP_DIR/$REPO_NAME"
  rm -rf $TMP_DIR/$REPO_NAME
done
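
A quick note on what the script assumes (a sketch; package names are for Debian/Ubuntu):

# The script needs git, curl and jq on the machine running it ...
sudo apt install git curl jq
# ... SSH keys that are accepted by both the GitLab instance and GitHub,
# and the GitHub workflow file (push.yml, shown below) placed in $TMP_DIR,
# because the script copies it from there into each repository.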

.github/workflows/push.yml

name: ansible-lint

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    name: ansible-lint
    steps:
      - uses: actions/checkout@master
      - uses: actions/setup-python@v2
      - run: pip install ansible ansible-lint
      - run: ansible-lint --version
      - run: ansible-lint --show-relpath .

Ansible collections

The collections contain only some meta information plus all the included roles as git submodules, which are then packaged into collection releases. I started from an empty collection skeleton and then had to change several things, described in the following subsections.
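
Such an empty skeleton can be created with ansible-galaxy collection init; a minimal sketch (the target path is just an example):

# Create the collection skeleton for oxivanisher.linux_base in an example
# directory and turn it into a git repository with a main branch.
ansible-galaxy collection init oxivanisher.linux_base --init-path ~/collections
cd ~/collections/oxivanisher/linux_base
git init -b main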

Ansible version in meta/runtime.yml

Since my environment runs current versions, I was able to just pick a recent value here. You might have to adjust this depending on the age of your roles and environments.

[..]
requires_ansible: '>=2.9.10'
[..]

Galaxy configuration galaxy.yml

You can use this as a working example. Keep in mind that the requirements of the roles contained in this collection have to be defined here: if you define dependencies in the roles as well, it will fail and annoy you. See the top answer on Stack Overflow for more info.

### REQUIRED
# The namespace of the collection. This can be a company/brand/organization or product namespace under which all
# content lives. May only contain alphanumeric lowercase characters and underscores. Namespaces cannot start with
# underscores or numbers and cannot contain consecutive underscores
namespace: oxivanisher

# The name of the collection. Has the same character restrictions as 'namespace'
name: linux_base

# The version of the collection. Must be compatible with semantic versioning
version: 1.0.15

# The path to the Markdown (.md) readme file. This path is relative to the root of the collection
readme: README.md

# A list of the collection's content authors. Can be just the name or in the format 'Full Name <email> (url)
# @nicks:irc/im.site#channel'
authors:
  - Marc Urben

### OPTIONAL but strongly recommended
# A short summary description of the collection
description: Collection of linux base roles

# Either a single license or a list of licenses for content inside of a collection. Ansible Galaxy currently only
# accepts L(SPDX,https://spdx.org/licenses/) licenses. This key is mutually exclusive with 'license_file'
license:
  - GPL-2.0-or-later

# The path to the license file for the collection. This path is relative to the root of the collection. This key is
# mutually exclusive with 'license'
license_file: ""

# A list of tags you want to associate with the collection for indexing/searching. A tag name has the same character
# requirements as 'namespace' and 'name'
tags:
  - linux
  - base

# Collections that this collection requires to be installed for it to be usable. The key of the dict is the
# collection label 'namespace.name'. The value is a version range
# L(specifiers,https://python-semanticversion.readthedocs.io/en/latest/#requirement-specification). Multiple version
# range specifiers can be set and are separated by ','
dependencies:
  "ansible.posix": "*"
  "community.general": "*"

# The URL of the originating SCM repository
repository: https://github.com/oxivanisher/collection-linux_base.git

# The URL to any online docs
documentation: https://github.com/oxivanisher/collection-linux_base/blob/main/README.md

# The URL to the homepage of the collection/project
homepage: https://oxi.ch

# The URL to the collection issue tracker
issues: https://github.com/oxivanisher/collection-linux_base/issues

# A list of file glob-like patterns used to filter any files or directories that should not be included in the build
# artifact. A pattern is matched from the relative path of the file or directory of the collection directory. This
# uses 'fnmatch' to match the files or directories. Some directories and files like 'galaxy.yml', '*.pyc', '*.retry',
# and '.git' are always filtered. Mutually exclusive with 'manifest'
build_ignore: []
# A dict controlling use of manifest directives used in building the collection artifact. The key 'directives' is a
# list of MANIFEST.in style
# L(directives,https://packaging.python.org/en/latest/guides/using-manifest-in/#manifest-in-commands). The key
# 'omit_default_directives' is a boolean that controls whether the default directives are used. Mutually exclusive
# with 'build_ignore'
# manifest: null
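
To illustrate the dependency gotcha mentioned above: galaxy.yml declares ansible.posix and community.general for the whole collection, so the roles themselves keep their dependency list empty. A minimal sketch of what that can look like (the role name is just an example):

# Sketch only: a role inside the collection keeps its meta/main.yml dependency
# list empty; the collection's galaxy.yml above declares the real dependencies.
cat > roles/ntp/meta/main.yml <<'EOF'
---
dependencies: []
EOF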

Add the roles as submodules

Since I created the collection manually and had already configured several things, some lines are commented out.

#!/bin/bash

# Define the source role repositories (already migrated to GitHub above)
role_repositories=(
    "https://github.com/oxivanisher/role-apt_unattended_upgrade.git"
    "https://github.com/oxivanisher/role-apt_source.git"
    "https://github.com/oxivanisher/role-oxiscripts.git"
    "https://github.com/oxivanisher/role-ssh_server.git"
    "https://github.com/oxivanisher/role-os_upgrade.git"
    "https://github.com/oxivanisher/role-rebootcheck.git"
    "https://github.com/oxivanisher/role-nullmailer.git"
    "https://github.com/oxivanisher/role-smartd.git"
    "https://github.com/oxivanisher/role-packages.git"
    "https://github.com/oxivanisher/role-timezone.git"
    "https://github.com/oxivanisher/role-locales.git"
    "https://github.com/oxivanisher/role-ssh_keys.git"
    "https://github.com/oxivanisher/role-syslog.git"
    "https://github.com/oxivanisher/role-ntp.git"
    "https://github.com/oxivanisher/role-logwatch.git"
    "https://github.com/oxivanisher/role-realtime_clock.git"
    "https://github.com/oxivanisher/role-dnscache.git"
    "https://github.com/oxivanisher/role-hosts_file.git"
    "https://github.com/oxivanisher/role-hosts_override.git"
    "https://github.com/oxivanisher/role-nextcloud_davfs.git"
    "https://github.com/oxivanisher/role-keyboard_layout.git"
)

# Define the destination GitHub repository for the Ansible collection
#github_repo="https://github.com/your-user/ansible-collection.git"

# Clone the GitHub repository
#git clone $github_repo ansible-collection
#cd ansible-collection

# Initialize an empty Git repository if not already done
#git init

# Add a remote for the GitHub repository
#git remote add github $github_repo

# Loop through the role repositories
for repo_url in "${role_repositories[@]}"
do
    # Extract the role name from the repository URL (adjust this if needed)
    role_name=$(basename $repo_url .git | sed 's/role-//g')

    # Add the role repository as a submodule
    git submodule add $repo_url roles/$role_name

    # Commit the submodule addition
    git add .gitmodules roles/$role_name
    git commit -m "Add submodule for role $role_name from GitHub repository"

    # Push the changes to GitHub
    #git push origin master
done

# Clean up - remove local clones
#cd ..
#rm -rf ansible-collection

Automatically create releases and upload them to Ansible Galaxy

And now to the magic bit. Since I am (probably) the only one using these roles, my aim is to automate this as much as possible … I don’t have to check with anyone else and can do whatever I want. 🤪 So I created this GitHub workflow, which automatically updates all submodules to their latest commits, creates a release and then uploads it to Ansible Galaxy. This only happens if I increase the version in galaxy.yml.

.github/workflows/release-new-version.yml

name: Auto Release and Publish to Ansible Galaxy

on:
  push:
    branches:
      - main
  workflow_run:
    workflows: ["Check for Version Change"]
    types:
      - completed

jobs:
  check_version:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          submodules: false
          # Fetch the previous commit as well, so the old galaxy.yml can be compared
          fetch-depth: 2

      - name: Update Submodules
        run: |
          git submodule update --init --recursive
          git submodule update --remote --merge

      - name: Check for Submodule Changes
        id: submodule_changes
        run: |
          # Check if there are changes in submodules
          if git diff --quiet --exit-code && git diff --quiet --exit-code --cached; then
            echo "No submodule changes detected."
            exit 0
          fi
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"
          git commit -am "Update submodules from workflow"
          git push
          echo "Submodule changes detected and pushed to repository."
        continue-on-error: true

      - name: Check for Version Change
        id: version_change
        run: |
          # Get the current version in galaxy.yml
          CURRENT_VERSION=$(grep -Eo 'version: [0-9]+\.[0-9]+\.[0-9]+' galaxy.yml | cut -d ' ' -f 2)

          # Get the version in galaxy.yml from the previous commit
          PREVIOUS_VERSION=$(git show HEAD~1:galaxy.yml | grep -Eo 'version: [0-9]+\.[0-9]+\.[0-9]+' | cut -d ' ' -f 2)

          # Check if the versions are different
          if [ "$CURRENT_VERSION" != "$PREVIOUS_VERSION" ]; then
            echo "Version change detected."
          else
            echo "No version change detected."
            exit 0
          fi
        continue-on-error: true

  create_release:
    runs-on: ubuntu-latest
    outputs:
      my_output: ${{ steps.get_new_version.outputs.new_version }}
    needs: check_version

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
        with:
          ref: main
          submodules: true

      - name: Get New Version
        id: get_new_version
        run: |
          # Extract the new version from galaxy.yml
          NEW_VERSION=$(grep -Eo 'version: [0-9]+\.[0-9]+\.[0-9]+' galaxy.yml | cut -d ' ' -f 2)
          echo "New version: $NEW_VERSION"
          # Expose it to later steps (env) and as a step output (for the job output)
          echo "new_version=$NEW_VERSION" >> "$GITHUB_ENV"
          echo "new_version=$NEW_VERSION" >> "$GITHUB_OUTPUT"

      - name: Get Submodule Commit Messages
        if: env.new_version != ''
        id: submodule_commit_messages
        run: |
          # Get the latest commit messages from all submodules (multi-line value,
          # therefore written to GITHUB_ENV with the heredoc syntax)
          {
            echo "SUBMODULE_COMMIT_MESSAGES<<EOF"
            git submodule foreach --quiet 'git log -1 --pretty=format:"%s"'
            echo ""
            echo "EOF"
          } >> "$GITHUB_ENV"

      - name: Create Release on Github
        id: release_created
        if: env.new_version != ''
        run: |
          # Use the submodule commit messages as release notes
          RELEASE_NOTES=$(printf 'Release notes for version %s:\n%s' "$new_version" "$SUBMODULE_COMMIT_MESSAGES")
          echo "Release notes:"
          echo "$RELEASE_NOTES"

          # Create a new release using the GitHub API (jq takes care of the JSON escaping)
          RESULT=$(curl -X POST \
            -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
            -H "Accept: application/vnd.github.v3+json" \
            https://api.github.com/repos/${{ github.repository }}/releases \
            -d "$(jq -n --arg tag "v$new_version" --arg body "$RELEASE_NOTES" '{tag_name: $tag, name: ("Release " + $tag), body: $body}')")

      - name: Build ansible galaxy collection
        if: env.new_version != ''
        run: |
          ansible-galaxy collection build .
        working-directory: ${{ github.workspace }}

      - name: Publish collection to Ansible Galaxy
        if: env.new_version != ''
        env:
          ANSIBLE_GALAXY_API_TOKEN: ${{ secrets.ANSIBLE_GALAXY_API_TOKEN }}
        run: |
          ansible-galaxy collection publish --token $ANSIBLE_GALAXY_API_TOKEN *.tar.gz
        working-directory: ${{ github.workspace }}

Set ANSIBLE_GALAXY_API_TOKEN on GitHub

Add an Actions secret named ANSIBLE_GALAXY_API_TOKEN containing your Galaxy API token. The GitHub URL for this setting looks something like this: https://github.com/user/repo/settings/secrets/actions
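
If you prefer the command line over the web UI, the GitHub CLI can do the same (a sketch; the repository name is just an example):

# Store the Galaxy API token as an Actions secret; gh prompts for the value,
# so the token does not end up in your shell history.
gh secret set ANSIBLE_GALAXY_API_TOKEN --repo oxivanisher/collection-linux_base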

Set permissions on the collection repository

On https://github.com/user/repo/settings/actions you have to set the following (this can also be done via the API, as sketched after this list):

  • Actions permissions to Allow all actions and reusable workflows
  • Workflow permissions to Read and write permissions
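
The same two settings can also be applied with the GitHub CLI; a sketch, assuming the REST API endpoints for Actions permissions (the repository name is just an example):

# Allow all actions and reusable workflows for the repository ...
gh api -X PUT repos/oxivanisher/collection-linux_base/actions/permissions \
  -F enabled=true -f allowed_actions=all
# ... and give the GITHUB_TOKEN read and write permissions in workflows.
gh api -X PUT repos/oxivanisher/collection-linux_base/actions/permissions/workflow \
  -f default_workflow_permissions=write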

Push and upload

After all this is done, you should be able to push everything to GitHub and watch the magic happen. Since I had the luck of doing this during Red Hat’s surprise deployment of Ansible Galaxy NG, it took me more than a month to get these steps working. So if anything is missing or does not work, drop me a line so that I can update this page.
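
For reference, pushing and following the workflow run from the command line could look like this (a sketch using the GitHub CLI):

# Push the collection repository (with the submodules already committed) ...
git push origin main
# ... and follow the triggered workflow run interactively.
gh run watch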

Usage

To ensure all requirements are at the latest version, I run the following commands:

ansible-galaxy collection install --force --requirements-file requirements.yml
ansible-galaxy role install --force -r requirements.yml
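
The requirements.yml shown further down only lists the collections; roles that are consumed directly via git can be referenced in the same file. A hypothetical entry (name and URL are placeholders) could look like this:

# Hypothetical sketch: append a git-sourced role entry to requirements.yml,
# as consumed by "ansible-galaxy role install -r requirements.yml" above.
cat >> requirements.yml <<'EOF'
roles:
  - name: win_debloat
    src: git@github.com:YOUR_GITHUB_USER/role-win_debloat.git
    scm: git
    version: main
EOF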

I use a “master playbook” which contains all the different plays. This is how I configure all the basic things on all Linux systems.

site.yaml

---
- name: Basic system updates, packages and settings
  hosts: all,!windows
  collections:
    - oxivanisher.linux_base
  roles:
    - role: oxivanisher.linux_base.packages                           # install site specific packages
    - role: oxivanisher.linux_base.ssh_server                         # configure root login
    - role: oxivanisher.linux_base.os_upgrade                         # upgrade system packages
      tags:
        - upgrade
        - never
    - role: oxivanisher.linux_base.rebootcheck                        # check if a reboot is required and if requested with the reboot tag, do the reboot
    - role: oxivanisher.linux_base.syslog                             # configure syslog
      tags:
        - syslog
    - role: oxivanisher.linux_base.logwatch                           # configure logwatch
    - role: oxivanisher.linux_base.apt_unattended_upgrade             # configure automatic security updates
    - role: oxivanisher.linux_base.oxiscripts                         # configure oxiscripts
      tags:
        - oxiscripts
    - role: oxivanisher.linux_base.timezone                           # configure timezone
    - role: oxivanisher.linux_base.apt_source                         # configure apt-sources
      tags:
        - apt_sources
    - role: oxivanisher.linux_base.locales                            # ensure basic locales
    - role: oxivanisher.linux_base.keyboard_layout                    # configure keyboard layout
    - role: oxivanisher.linux_base.prometheus_node_exporter           # prometheus node exporter
    - role: oxivanisher.linux_base.smartd                             # configure smartd to not crash if no disks are available (i.e. on rpis)
      tags:
        - smartd

- name: Basic for the really limited OpenElec (busy box based)
  hosts: all,!windows
  collections:
    - oxivanisher.linux_base
  roles:
    - role: oxivanisher.linux_base.ssh_keys                           # install ssh keys and configure root login
      tags:
        - basic_access

- name: Desktop/client only roles
  hosts: client,!windows
  collections:
    - oxivanisher.linux_base
    - oxivanisher.linux_desktop
  roles:
    - role: oxivanisher.linux_desktop.nextcloud_client                # configure nextcloud client package
    - role: oxivanisher.linux_desktop.vivaldi                         # install vivaldi browser
    - role: oxivanisher.linux_desktop.vscode                          # install visual studio code
    - role: oxivanisher.linux_desktop.sublime                         # install sublime text
    - role: oxivanisher.linux_desktop.signal_desktop                  # install signal desktop
    - role: oxivanisher.linux_desktop.keepassxc                       # install keepass xc
    - role: oxivanisher.linux_desktop.howdy                           # install howdy
    - role: oxivanisher.linux_desktop.wol                             # install and setup wol
    - role: oxivanisher.linux_desktop.fonts                           # install fonts
    - role: oxivanisher.linux_desktop.nas_mounts                      # mount nas drives
    - role: oxivanisher.linux_base.realtime_clock                     # set clock to use local rtc (thanks windows -.-)
    - role: oxivanisher.linux_desktop.ubuntu_hide_amazon_link         # disable ubuntu amazon spam
    - role: oxivanisher.linux_desktop.gnome_disable_inital_setup      # disable gnome initial setup
    - role: oxivanisher.linux_desktop.gnome_loginscreen_configure     # configure gnome loginscreen
[..]

requirements.yml

[..]
collections:
  - oxivanisher.linux_base
  - oxivanisher.linux_desktop
  - oxivanisher.linux_server
  - oxivanisher.raspberry_pi
  - oxivanisher.windows_desktop
[..]

Development workflow for roles

To still be able to develop and test roles without creating releases all the time, I link a development version into roles/ and change the corresponding entry in the playbook. Adding a dev tag speeds up test runs massively, since only the tagged role has to run.

- name: Jump hosts
  hosts: jump
  collections:
    - oxivanisher.linux_base
    - oxivanisher.linux_server
  roles:
    # - role: oxivanisher.linux_server.jump_host                      # configure jumphost specific things
    - role: jump_host_dev                                             # configure jumphost specific things
      tags:
        - dev
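
The linking itself and a tag-limited test run could then look like this (paths and names are just examples, run from the playbook directory):

# Link a local working copy of the role into roles/ so the playbook picks it up
# instead of the released collection version ...
ln -s ~/src/role-jump_host roles/jump_host_dev
# ... and only run what is tagged "dev" to keep the test run fast.
ansible-playbook site.yaml --tags dev --limit jump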

Final thoughts

Please be aware that lots of roles don’t have a README yet and some are even missing default variables and such. I will update the roles as I touch them for other reasons, but use them at your own risk in their current state!

Also, the GitHub workflow should pin the newly created SHA when it updates the role submodules, since it is theoretically possible that a newer commit gets released when two pushes within a short time window mess things up. Since I am the only one pushing there, this is not really a concern for me.

Remember that you have to increase the version in galaxy.yml to trigger a release, for example after you have updated one of the roles.
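
A release therefore boils down to a version bump, something like this (the version number is just an example):

# Bump the collection version and push it to trigger the release workflow.
sed -i 's/^version: .*/version: 1.0.16/' galaxy.yml
git commit -am "Release 1.0.16"
git push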

Ansible Galaxy would like to have a changelog, and there are mechanisms to automate that for collections as well, but this is not implemented yet.