[CI/CD] A Guide to Automated Deployment of FastAPI Servers Using Docker, Jenkins, and Git Webhooks

Implementing Production-Ready Automated Deployment

Work-in-progress notes > First update completed as of May 2023. For deployment-related content, please refer to Part 2, which is more polished.

Intro

While working on a current project, I was building a service that exposes outputs powered by an AI model.

I wanted to build that feature on Firebase Cloud Functions, but loading the model with Node.js + TensorFlow.js required a model conversion step, and the runtime was heavy enough to keep hitting timeouts. Blocked for these and other reasons, I decided instead to build a small server that returns results from the AI model.

That meant building and deploying the server fell to me. As a first goal I aimed to get automated deployment working, with zero-downtime deployment as a later goal to reach with my own hands.

This post covers only setting up the system on top of an instance (such as AWS) and the automated deployment itself.

Background concepts

CI: Continuous Integration

An automated process in which developers' code changes are built, tested, and regularly merged into a shared repository. Automating this process is a way to reduce integration risk during development.

Composed of CI Server, Source Control Management, Build Tool, Test Tool, etc.

CD: Continuous Delivery / Deployment

Refers to the automation of the deployment process. By automating the later pipeline stages, changes can flow all the way to production without manual intervention. It builds on Continuous Integration, and once a reliable environment is in place, multiple releases per day become possible.

As a result, going through a CI/CD process shortens the time to release, increases team efficiency, and reduces the chance of major issues, because every change moves through small, continuous, incremental steps.

Setting up the Docker environment

  • Instance info

Instead of the EC2 free tier, I went with Oracle Cloud's free tier, which offers relatively better terms, and allocated an arm64 Ampere A1 Compute instance with 2 cores and 12 GB of RAM.

It also provides things like backups for free to some extent — worth checking out!

I gathered the references I used. There are plenty of articles about installing Docker and setting up the instance, so look them up and set up Docker accordingly.

Setting up Docker in Docker (failed > switched to Docker out of Docker via the API)

Why use it

The point of running Docker on top of Docker is to control containers from inside a container. The controlling container ends up with very strong privileges, which raises security concerns, but the approach itself is undeniably appealing.
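As a minimal sketch of the Docker-out-of-Docker idea (my own illustration, not from the original setup; `docker:cli` is the official CLI-only image of the `docker` image), mounting the host's socket is all it takes:

```shell
# Docker-out-of-Docker sketch: mount the host's Docker socket into a container
# so the "docker" CLI inside it talks to the HOST daemon, not a nested one.
SOCK=/var/run/docker.sock

if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  # "docker ps" here lists the HOST's containers, showing that the inner
  # container controls the host daemon (which is why its privileges are strong)
  docker run --rm -v "$SOCK":"$SOCK" docker:cli docker ps \
    || echo "demo failed (image pull or daemon issue)"
else
  echo "docker unavailable; skipping demo"
fi
```

With true Docker in Docker, by contrast, a full second daemon runs inside a `--privileged` container; the socket mount is the lighter-weight alternative.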

The setup I tried

1. Build a Jenkins container on Docker

First, build the Jenkins container with Docker. Using Docker Compose makes the build a bit easier, so keep that in mind.

docker pull jenkins/jenkins:lts

First pull the Jenkins LTS image.

vim docker-compose.yml

Create or open the yml file with that command.

version: "3.8" # specify the compose file format version

services:

  jenkins:
    container_name: jenkins
    image: jenkins/jenkins:lts # use the Jenkins LTS image
    restart: always # auto-restart option

    ports:
      - "8181:8080" # map host port 8181 to Jenkins port 8080

    volumes: # share host volumes through this option
      - /home/opendocs/jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock

    user: root
    privileged: true # the key option to enable Docker in Docker

Fill in the yml file with the necessary content like the example.

docker-compose -f docker-compose.yml up

This command runs the setup based on docker-compose.yml. If you keep the default name docker-compose.yml, you can omit the -f option. See the Docker Compose Docs for details.
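Before bringing it up, a quick sanity check can confirm the key settings made it into the file. This is only a rough sketch using grep rather than a YAML parser (and it writes to /tmp so it does not touch your real file); `docker-compose config` does proper validation if it is installed:

```shell
# recreate the compose file from this post and verify its key settings
cat > /tmp/docker-compose.yml <<'EOF'
version: "3.8"
services:
  jenkins:
    container_name: jenkins
    image: jenkins/jenkins:lts
    restart: always
    ports:
      - "8181:8080"
    volumes:
      - /home/opendocs/jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock
    user: root
    privileged: true
EOF

# crude textual checks for the two settings this post depends on
grep -q '8181:8080' /tmp/docker-compose.yml && echo "port mapping ok"
grep -q 'docker.sock' /tmp/docker-compose.yml && echo "socket mount ok"
```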

Docker Compose documentation

+ How to use docker without sudo → add the current user to the Docker group

# this command adds the current user to the Docker group
sudo usermod -aG docker $USER
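One gotcha (from general Docker experience, not the original post): the group change only applies to new login sessions. A quick way to check whether it has taken effect:

```shell
# check whether the current session already has the docker group;
# if not, log out and back in, or start a subshell with "newgrp docker"
if id -nG | tr ' ' '\n' | grep -qx docker; then
  echo "docker group active in this session"
else
  echo "not yet active: log out/in or run 'newgrp docker'"
fi
```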
2. Setting up the Jenkins Container
  1. Open the instance's IP and the mapped port (8181 in the compose file above) in a browser
  2. Enter the unlock key shown in the Jenkins install log
# this command shows the container logs, including the unlock key
docker logs jenkins
  3. Install the recommended plugins

You may run into install failures, but finish the installation first and resolve them afterwards. If you abort midway, dependency errors from uninstalled plugins keep coming up...
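If the unlock key has scrolled out of the log, it can also be read straight from a file inside the container (the path below is the standard one for the official jenkins image):

```shell
# the same unlock key that "docker logs" shows is persisted under jenkins_home
KEY_FILE=/var/jenkins_home/secrets/initialAdminPassword

if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker exec jenkins cat "$KEY_FILE" || echo "jenkins container not running"
else
  echo "docker daemon unavailable; skipping"
fi
```

Because jenkins_home is volume-mounted in the compose file above, the same file is also visible on the host under the mounted directory.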

Workaround for plugin install timeouts

  4. Install the GitHub API Plugin

A plugin that makes webhook triggers work, so install it. After installing, register your GitHub credentials as a Secret in Jenkins (GitHub no longer accepts account passwords for API access, so the token created in the next section serves as the password).

3. GitHub configuration
  1. Create a token under Settings > Developer settings. As of writing I use a classic token; only the repo and admin:repo_hook scopes are needed.

  2. In the settings of the repo to be auto-deployed, add a webhook under Webhooks. Setting the Payload URL to http://{jenkins_url}/github-webhook/ works. Leaving off the trailing / can cause problems, so keep it.
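A quick reachability sketch for that endpoint (the address below is a placeholder, and remember that GitHub itself, not your machine, must be able to reach the URL over the public internet):

```shell
WEBHOOK_URL="http://203.0.113.10:8181/github-webhook/"  # placeholder address

if command -v curl >/dev/null 2>&1; then
  # any HTTP status at all at least proves the port is open and Jenkins answers;
  # GitHub's webhook "Recent Deliveries" tab is the authoritative check
  curl -s --max-time 3 -o /dev/null -w "%{http_code}\n" "$WEBHOOK_URL" || true
else
  echo "curl unavailable; skipping"
fi
```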

4. Setting up the Jenkins project
  1. Add a project.

  2. Under the build configuration > Source Code Management, choose Git and pick the previously registered Secret so webhook signals can be received.

  3. You must select the GitHub hook trigger for GITScm polling option for the trigger to fire.

  4. Pick the branch as needed, finish the configuration, and run a test.
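To make the test actually deploy something, the project's build step ("Execute shell") can rebuild and restart the app container. This is only a sketch under assumptions not in the original post: it assumes the repo contains a Dockerfile, and the name fastapi-app and port 8000 (uvicorn's default) are made up:

```shell
# hypothetical Jenkins "Execute shell" build step: rebuild the image from the
# checked-out repo and swap the running container
IMAGE=fastapi-app   # hypothetical image/container name, not from the post
PORT=8000           # uvicorn's default port, also an assumption

if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker build -t "$IMAGE:latest" .            # Dockerfile from the checked-out repo
  docker rm -f "$IMAGE" 2>/dev/null || true    # remove the old container (brief downtime)
  docker run -d --name "$IMAGE" -p "$PORT:$PORT" "$IMAGE:latest" \
    || echo "deploy step failed"
else
  echo "docker unavailable; skipping"
fi
```

The rm/run pair means a short downtime window on every deploy, which is exactly the gap the zero-downtime follow-up mentioned in the intro aims to close.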
