Trigger Bitbucket Pipeline Only If Certain Files Are Changed (With Google Cloud Functions)

Overview

[Diagram: overview of what you will do]

Adopting DevOps practices can make you far more productive as a developer. DevOps is a culture within an organization, and there are countless tools you can use to implement it.

In this article we will implement a Bitbucket CI/CD pipeline that deploys Google Cloud Functions.

Assumption: you have some familiarity with DevOps, CI/CD, Cloud Functions, Bitbucket, and Docker. If not, you may visit the references.

Scenario

You have multiple Cloud Functions in a repository and want an individual build-test-deploy pipeline for each function.

[Screenshot: the Bitbucket repository]
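Roughly, the repository layout assumed in this article looks like this (bitbucket-pipelines.yml and the credentials template are created in the steps below):

.
├── bitbucket-pipelines.yml
├── sa-PROJECT.dist.json
├── hello_world/
│   └── main.py
└── hello_mars/
    └── main.py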

Note: these example functions are extended from the sample provided in the GCP documentation.

Let’s say you have two functions which print

Hello World!

and

Hello Mars!
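For reference, a minimal sketch of what hello_world/main.py could look like, based on the HTTP function sample in the GCP documentation (hello_mars/main.py would differ only in the returned string):

# hello_world/main.py -- minimal sketch based on the GCP HTTP sample.
def hello_world(request):
    """HTTP Cloud Function entry point.

    Args:
        request (flask.Request): the incoming request object.
    Returns:
        The response text.
    """
    return 'Hello World!'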

These functions are hosted in Bitbucket, and initially you can deploy them from your local environment:

gcloud functions deploy hello_world --allow-unauthenticated --runtime=python37 --memory=128MB --source hello_world/ --timeout=300 --trigger-http --entry-point=hello_world

But this is not the ideal process. The solution is to use version control with automated build, test, and deployment.

Below is a feature of Bitbucket Pipelines, new at the time of writing, which triggers a step only when certain files are changed:

condition:
  changesets:
    includePaths:
      - "path/*"

Hands-on

Steps

From the GCP console

Your service account is created; now you need to grant it some permissions for the Bitbucket pipeline to work properly.

From IAM & Admin
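The roles can be granted here in the console. If you prefer the command line, a minimal gcloud sketch follows; the account name bitbucket-deployer and the project my-project are assumptions, and depending on your setup you may also need the Service Account User role:

# Create the service account (the name is an assumption; pick your own).
gcloud iam service-accounts create bitbucket-deployer \
    --display-name "Bitbucket Pipelines deployer"

# Let it deploy Cloud Functions.
gcloud projects add-iam-policy-binding my-project \
    --member "serviceAccount:bitbucket-deployer@my-project.iam.gserviceaccount.com" \
    --role "roles/cloudfunctions.developer"

# Download a JSON key; its fields map to the environment variables below.
gcloud iam service-accounts keys create key.json \
    --iam-account bitbucket-deployer@my-project.iam.gserviceaccount.com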

Next up is configuring Bitbucket pipelines.

You will have to set up the environment variables first, because hard-coding credentials is a bad practice.

Try to make the variable names as verbose as possible; it helps when you have multi-environment variables. Think, for example, of having 10 functions from 10 different projects.

The following variables should be added.

GCP_PROJECT_PROJECT_ID
GCP_PROJECT_PRIVATE_KEY_ID
GCP_PROJECT_PRIVATE_KEY
GCP_PROJECT_EMAIL
GCP_PROJECT_CLIENT_ID
GCP_PROJECT_CERT_URL
GCP_PROJECT_REGION

Substitute ‘PROJECT’ with your project name.

As this is a how-to guide, I will not be using multi-environment variables; the default naming is unchanged.
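You can add these variables in the Bitbucket UI under Repository settings, or script it with the Bitbucket REST API. A hedged sketch, assuming a workspace myworkspace, a repository myrepo, and an app password with permission to manage pipeline variables:

# Adds one secured repository variable; repeat for each variable above.
curl -X POST -u "username:app_password" \
  -H "Content-Type: application/json" \
  -d '{"key": "GCP_PROJECT_PROJECT_ID", "value": "my-project", "secured": true}' \
  "https://api.bitbucket.org/2.0/repositories/myworkspace/myrepo/pipelines_config/variables/"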

In Bitbucket repository

Create a file named “sa-PROJECT.dist.json”, where ‘sa’ stands for ‘service account’ and ‘PROJECT’ for your project name.

The contents of the file are a copy of your credentials from the service account, but with the secret values replaced by the environment variables:

{
  "type": "service_account",
  "project_id": "$GCP_PROJECT_PROJECT_ID",
  "private_key_id": "$GCP_PROJECT_PRIVATE_KEY_ID",
  "private_key": "$GCP_PROJECT_PRIVATE_KEY",
  "client_email": "$GCP_PROJECT_EMAIL",
  "client_id": "$GCP_PROJECT_CLIENT_ID",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "$GCP_PROJECT_CERT_URL"
}
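Before committing, you can sanity-check the substitution locally. A minimal sketch, assuming the variables are exported in your shell, gettext (which provides envsubst) is installed, and the same assumed names as above:

# Export the variables first (the remaining values are elided here).
export GCP_PROJECT_PROJECT_ID="my-project"
export GCP_PROJECT_EMAIL="bitbucket-deployer@my-project.iam.gserviceaccount.com"

# Render the template and try authenticating with the result.
envsubst < sa-PROJECT.dist.json > service-account.json
gcloud auth activate-service-account --key-file=service-account.json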

Now you will work with the bitbucket-pipelines.yml file.

Bitbucket has a sweet tool to validate your pipelines.

bitbucket-pipelines.yml:

pipelines:
  branches:
    master:
      - step:
          name: Deploy hello_world
          # deployment: test/ staging/ production
          image: google/cloud-sdk:234.0.0
          script:
            - apt-get update && apt-get install -y gettext-base
            - envsubst < sa-PROJECT.dist.json > service-account.json
            - gcloud auth activate-service-account --key-file=service-account.json
            - gcloud functions deploy hello_world --runtime python37 --trigger-http --project $GCP_PROJECT_PROJECT_ID --region $GCP_PROJECT_REGION --source hello_world/
          condition:
            changesets:
              includePaths:
                - "hello_world/*"
      - step:
          name: Deploy hello_mars
          # deployment: test/ staging/ production
          image: google/cloud-sdk:234.0.0
          script:
            - apt-get update && apt-get install -y gettext-base
            - envsubst < sa-PROJECT.dist.json > service-account.json
            - gcloud auth activate-service-account --key-file=service-account.json
            - gcloud functions deploy hello_mars --runtime python37 --trigger-http --project $GCP_PROJECT_PROJECT_ID --region $GCP_PROJECT_REGION --source hello_mars/
          condition:
            changesets:
              includePaths:
                - "hello_mars/*"

You can find the code at this Bitbucket repository, and you can use it as a template.

So now, if you want to experiment: modify only ‘hello_mars’ and only ‘hello_mars’ will be deployed, and vice versa.
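A quick way to try it, assuming you work directly on master:

# Touch only hello_mars; the push should trigger only the 'Deploy hello_mars' step.
echo "# trivial change" >> hello_mars/main.py
git commit -am "Tweak hello_mars"
git push origin master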

And you are done triggering Bitbucket pipelines when certain files are changed!

Thanks for reading!

If you have any feedback, you may comment below or tweet @omar161000. Connect with me on LinkedIn.

Thanks to everyone who reviewed the drafts.

References

