Developing a web application involves many time-consuming steps that can derail your planning if you have to perform them manually every time. That is why we try to automate as much of the development process as possible. In this blog post, we will explore how our deployment pipeline is set up to take human error out of the equation.
Of containers and repositories
Let's start with some background information about the services we use, from hosting to our preferred source control implementation. The projects we make are kept safe in Bitbucket's Git repositories. For hosting, we use DigitalOcean droplets. Docker is also an important technology to us, since a single app can consist of several containerised background services. We use Portainer to manage our containers, and Portus, a self-hosted private Docker registry, to host our Docker images.
While these technologies and services have their undeniable benefits, it would be preferable if we didn't have to tinker too much with any of them after some basic configuration. After all, manual steps are prone to mistakes, and take up valuable time which could be used more productively.
Our solution in a nutshell
Thanks to the services above, the deployment process can be automated completely. The only manual step required to update a live web application to its latest version is a push to the project's repository.
After a push, Bitbucket's continuous delivery kicks in. Here is what a basic pipeline needs to do after receiving an update:
- log in to our Docker registry (Portus)
- build the Docker image, and push it to the registry
- trigger the Portainer webhook of the relevant Docker service
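As a sketch, these three steps could look something like the following in a `bitbucket-pipelines.yml`. The registry host, image name, and Portainer URL are placeholders for illustration, and `$DOCKER_USER`, `$DOCKER_PASS`, and `$WEBHOOK_TOKEN` would be defined as secured repository variables in Bitbucket:

```yaml
image: atlassian/default-image:2

pipelines:
  branches:
    master:
      - step:
          name: Build, push and deploy
          services:
            - docker  # enables the Docker daemon inside the pipeline step
          script:
            # 1. log in to the private registry (Portus in our case)
            - docker login -u $DOCKER_USER -p $DOCKER_PASS registry.example.com
            # 2. build the image and push it to the registry
            - docker build -t registry.example.com/my-app:latest .
            - docker push registry.example.com/my-app:latest
            # 3. trigger the Portainer webhook of the relevant service
            - curl -X POST https://portainer.example.com/api/webhooks/$WEBHOOK_TOKEN
```

Restricting the step to the `master` branch keeps feature-branch pushes from redeploying the live application.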
Assuming the build has succeeded, Portainer receives the cURL request made in the last step of Bitbucket's build pipeline. This request is the signal to start pulling the newly built Docker image from Portus. To make that possible, we defined our Portus instance as a custom registry in Portainer's settings.
Once the image has been pulled, the relevant service restarts automatically, and we can visit the web application to inspect the result of the update.
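For completeness, a service deployed this way might be defined in a Portainer stack roughly as follows. This is a minimal Compose sketch, with the registry host, image name, and ports assumed for illustration; the webhook itself is created on the service through Portainer's UI:

```yaml
version: "3.7"

services:
  web:
    # The image lives in our private registry (Portus in our case);
    # Portainer pulls the new version from there when the webhook fires.
    image: registry.example.com/my-app:latest
    ports:
      - "80:3000"
    restart: unless-stopped
```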
In the deployment pipeline I just summarised, the key components are Portainer's new(ish) webhook functionality and Bitbucket's powerful continuous delivery mechanisms, which can also be used to test your builds.
It's certainly nice to have a self-hosted Docker registry, but it is not a prerequisite for automating a deployment process. More accessible solutions like Docker Hub make a fine substitute for Portus.
It may take some time to configure a deployment process you are satisfied with, but my advice to any development team is to automate everything you can: many small manual steps add up, and each one chips away at the time you could spend developing.