Vid Bregar
Automating Google Cloud Image Upgrades with a Custom Renovate Datasource

Tired of manually updating your Google Cloud (GCP) images? This post dives into how you can use Renovate to automate the process, ensuring your infrastructure always runs up-to-date images.

Versioning Images

Building VM images with tools like Packer allows you to bake in all necessary dependencies, treating them like container images for easy recreation. Versioning such images is important: you gain precise control over what's deployed and what you're upgrading to, while also paving the way for automation.

While Google Cloud doesn't provide a standard way to version images, I propose a simple yet effective image naming convention:

<image_name>---<image_version>

For instance, with semver: my-app---1-0-0 or using a Unix timestamp: my-other-app---1746168402.

The triple hyphen (---) acts as a clear separator between the image name and version. This format adheres to Google Cloud image naming restrictions, which only allow lowercase letters, numbers, and hyphens.
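As an illustration, the convention maps a human-readable version to a GCP-safe image name like so (to_versioned_image_name is a hypothetical helper for this post, not part of the linked code):

```python
def to_versioned_image_name(name: str, version: str) -> str:
    # GCP image names may only contain lowercase letters, numbers,
    # and hyphens, so dots in the version become hyphens.
    return f"{name}---{version.replace('.', '-')}"


print(to_versioned_image_name("my-app", "1.0.0"))             # my-app---1-0-0
print(to_versioned_image_name("my-other-app", "1746168402"))  # my-other-app---1746168402
```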

Code

Explore the example source code here.

Overcoming Renovate Limitations with a Custom Datasource

Renovate provides numerous datasources, but a dedicated one for GCP images is currently missing. While some existing datasources (such as aws-machine-image) come close, we can bridge this gap by building our own.

You can find the official documentation for building custom datasources here: Custom Datasource.

Instead of following the examples in the documentation, we'll take a different approach: creating a cloud function that Renovate can query.

Renovate will send a request like:

GET /v1/releases/images/my-gcp-project/my-app

The cloud function then interacts with the GCP Compute API to retrieve relevant images:

import re

from google.cloud import compute_v1

images_client = compute_v1.ImagesClient()


def fetch_image_releases(project: str, name: str) -> dict:
    # All versioned image names are of the format:
    # <name>---<version>
    regex = re.compile(f"{name}---.*")

    request = compute_v1.ListImagesRequest(project=project)
    images = images_client.list(request=request)

    matching_images = [
        image for image in images
        if regex.match(image.name) and image.status == "READY"
    ]

    return images_to_renovate_response(matching_images, project, name)

and extracts the versions from the versioned image names:

def extract_image_version(versioned_image_name: str, image_name: str) -> str:
    gcp_image_version = versioned_image_name.removeprefix(f"{image_name}---")

    match gcp_image_version:
        case _ if re.match(r"^\d{10,}$", gcp_image_version):
            # If there are 10 or more digits,
            # the image is most likely versioned with a unix timestamp.
            # For example, image-name---1744888624
            return gcp_image_version
        case _:
            # Otherwise we assume, for example,
            # image-name---1-2-3 or image-name---1-5, etc.
            # 1-2-3 becomes `1.2.3`, and 1-5 becomes `1.5`.
            return gcp_image_version.replace("-", ".", 3)

Finally, it formats the output into a JSON structure Renovate understands:

{
	"releases": [
		{ "version": "1.1.0" },
		{ "version": "1.0.0" }
	]
}
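A standalone sketch of what such a helper could look like (the real images_to_renovate_response lives in the linked repository and reuses extract_image_version; this version inlines the conversion for brevity):

```python
from types import SimpleNamespace


def images_to_renovate_response(matching_images, project: str, name: str) -> dict:
    """Shape matching images into the JSON structure Renovate expects."""
    releases = []
    for image in matching_images:
        # Recover the version from the "<name>---<version>" convention:
        # pure digits stay as-is (unix timestamps), otherwise hyphens
        # are turned back into dots (semver-like versions).
        raw = image.name.removeprefix(f"{name}---")
        version = raw if raw.isdigit() else raw.replace("-", ".", 3)
        releases.append({"version": version})
    return {"releases": releases}


# Stand-ins for compute_v1.Image objects:
images = [SimpleNamespace(name="my-app---1-1-0"), SimpleNamespace(name="my-app---1-0-0")]
response = images_to_renovate_response(images, "my-gcp-project", "my-app")
print(response)  # {'releases': [{'version': '1.1.0'}, {'version': '1.0.0'}]}
```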

You can find the complete cloud function code here.

Deploying this cloud function on GCP is straightforward with Terraform. First, store the function's source code in a GCP bucket:

resource "google_storage_bucket" "this" {
  name          = "${var.project}-renovate-gcp-datasource"
  project       = var.project
  location      = local.region
  force_destroy = true

  versioning {
    enabled = false
  }
}

data "archive_file" "this" {
  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "/tmp/renovate-gcp-datasource.zip"
}

resource "google_storage_bucket_object" "this" {
  bucket       = google_storage_bucket.this.name
  name         = "${data.archive_file.this.output_md5}.zip"
  content_type = "application/zip"
  source       = data.archive_file.this.output_path
}

Then, define the cloud function, referencing the stored source code:

locals {
  region = "europe-west1"
}

resource "google_cloudfunctions2_function" "gcp_datasource" {
  name        = "renovate-gcp-datasource"
  project     = var.project
  location    = local.region
  description = "Function that acts as a Renovate custom datasource for GCP"

  build_config {
    runtime     = "python311"
    entry_point = "main"

    source {
      storage_source {
        bucket = google_storage_bucket.this.name
        object = google_storage_bucket_object.this.name
      }
    }
  }

  service_config {
    max_instance_count = 1
    available_memory   = "512Mi"
    timeout_seconds    = 30
    ingress_settings   = "ALLOW_ALL"

    environment_variables = {
      PROJECT = var.project
    }

    service_account_email = google_service_account.this.email
  }

  depends_on = [
    google_project_iam_member.disk_viewer,
  ]
}

Crucially, the cloud function's service account needs the roles/compute.imageUser role so it can list and read your GCP images.

Connecting Renovate to Google Cloud

With the cloud function deployed, we need to enable Renovate to securely access it. This involves creating a dedicated service account for Renovate:

resource "google_service_account" "renovate" {
  project      = var.project
  account_id   = "renovate"
  display_name = "Renovate Service Account"
}

and granting it permission to invoke the cloud function:

resource "google_cloud_run_service_iam_member" "gcp_datasource_invoker" {
  project  = var.project
  location = var.region
  service  = google_cloudfunctions2_function.gcp_datasource.name
  role     = "roles/run.invoker"
  member   = "serviceAccount:${google_service_account.renovate.email}"
}

Next, generate a service account key and configure your Renovate pipeline to authenticate when calling the cloud function. For example, in GitLab pipelines:

renovate:auth:
  # ...
  variables:
    GCP_PROJECT: "my-project"
    SERVICE_ACCOUNT_EMAIL: "renovate@my-project.iam.gserviceaccount.com"
  script:
    - gcloud config set project "$GCP_PROJECT"
    - gcloud auth activate-service-account $SERVICE_ACCOUNT_EMAIL --key-file "$RENOVATE_SA_KEY"
    - echo "GOOGLE_IDENTITY_TOKEN=$(gcloud auth print-identity-token)" >> auth.env
  artifacts:
    reports:
      dotenv:
        - auth.env

renovate:
  # ...
  needs:
    - job: renovate:auth
      artifacts: true
  image:
    name: renovate/renovate:latest
  before_script:
    - >-
      export RENOVATE_HOST_RULES="[
      {\"matchHost\": \"europe-west1-my-project.cloudfunctions.net\", \"token\": \"$GOOGLE_IDENTITY_TOKEN\" }
      ]"
  script:
    - renovate "${PROJECT_PATH}"
  parallel:
    matrix:
      - PROJECT_PATH: "some-org/some-project"
        RENOVATE_LABELS: "renovate"

Creating a Custom Manager for Google Cloud Images

The custom datasource is ready, and Renovate has the necessary permissions. Now, we need to tell Renovate where to look for GCP image definitions in your code and which datasource to use. This is where a custom Renovate manager comes in (refer to the official documentation).

This custom manager will scan files matching the fileMatch regex and apply a specific regex to extract details like the dependency name and current version:

"customManagers": [
	{
		"customType": "regex",
		// This will check all .tf and .pkr.hcl files for the regex match
		"fileMatch": [
		".*\\.(tf|pkr\\.hcl)$"
		],
		// Extracts the necessary information from, for example:
		// locals {
		//   # renovate: gcp-images versioning=loose
		//   source_image_name       = "my-gcp-project/my-image-name"
		//   source_image_version    = "28.1"
		// }
		//
		// The actual full image name in GCP would be `my-image-name---28-1`
		// because the name must consist only of lowercase letters (a-z), numbers and hyphens.
		"matchStrings": [
		"#\\s*?renovate: gcp-images( versioning=(?<versioning>.*?))?\\s+.*image_name\\s+=\\s+\\\"(?<depName>.*?)\\\"\\s+.*image_version\\s+=\\s+\\\"(?<currentValue>.*?)\\\""
		],
		"versioningTemplate": "{{#if versioning}}{{{versioning}}}{{else}}semver{{/if}}",
		"datasourceTemplate": "custom.gcp-disk-image"
	}
],
"customDatasources": {
	"gcp-disk-image": {
		"defaultRegistryUrlTemplate": "https://europe-west1-my-project.cloudfunctions.net/renovate-gcp-datasource/v1/releases/images/{{packageName}}"
	}
}

See It in Action

With everything configured, simply add the renovate: gcp-images comment to your Terraform or Packer HCL files:

locals {
    # renovate: gcp-images
    source_image_name    = "blockportal-production/my-image-semver"
    source_image_version = "1.5.3"
}
locals {
    # renovate: gcp-images versioning=loose
    boot_image_name    = "blockportal-production/my-image-timestamp"
    boot_image_version = "1744958585"
}

Renovate will now identify *_image_name and *_image_version, query the custom datasource, and propose upgrades when new image versions are available.
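If you want to sanity-check the matchStrings pattern locally, you can exercise it with Python's re module (note that Renovate's regex flavor writes named groups as (?<name>), while Python needs (?P<name>)):

```python
import re

# The matchStrings pattern from above, with named groups translated
# to Python syntax.
PATTERN = re.compile(
    r'#\s*?renovate: gcp-images( versioning=(?P<versioning>.*?))?'
    r'\s+.*image_name\s+=\s+"(?P<depName>.*?)"'
    r'\s+.*image_version\s+=\s+"(?P<currentValue>.*?)"'
)

sample = '''locals {
    # renovate: gcp-images versioning=loose
    boot_image_name    = "blockportal-production/my-image-timestamp"
    boot_image_version = "1744958585"
}'''

m = PATTERN.search(sample)
print(m.group("depName"))       # blockportal-production/my-image-timestamp
print(m.group("currentValue"))  # 1744958585
print(m.group("versioning"))    # loose
```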

You can get the image's project and full image name within your Terraform configurations like this:

locals {
  image_project   = split("/", local.source_image_name)[0]
  full_image_name = "${split("/", local.source_image_name)[1]}---${replace(local.source_image_version, ".", "-")}"
}

Conclusion

By introducing a versioning convention and combining a cloud-function-based custom datasource with a Renovate regex manager, you can effectively version your GCP images and automate their upgrades.

