Different ways & strategies to build and deploy Node.js for Production

Have you ever wondered about all the different ways to build and deploy a Node.js application for production? If you're a backend or Node.js developer, knowing how to build and deploy your application for production helps you in many ways. For example, it can raise your profile value, give you a different perspective on building applications, or even shape your product.

Let me ask you something. How do you usually build and deploy your application?

If your answer is 'PM2', I recommend you read this article till the end. I was once at the same stage, thinking PM2 was the only way to deploy Node.js applications. Then I realized there are several more advanced ways to deploy a Node.js application, and we will cover a few of them here. Let's start with the different ways to build and deploy Node.js in production.

PM2 Deployment

Before we move on to more advanced deployment strategies, let's look at a PM2 deployment along with test coverage. That way, you'll understand the entire flow of the strategy.

Here's the source code for the Node.js application: Nodejs API. You can follow along with the tutorial.

To create the configuration for PM2, run the init command,

pm2 init

It creates a configuration file, ecosystem.config.js, for the PM2 deployment. Change it to the following config,

module.exports = {
  apps: [{
    name: "app",
    script: "./server.js",
    instances: "max",
    env: {
      NODE_ENV: "development",
    },
    env_production: {
      NODE_ENV: "production",
    }
  }]
}

After that, you need to change the start script in package.json to run the application via PM2.

1"start": "pm2-runtime start ecosystem.config.js --env production",

Since we are using Heroku, create a new app, either via the Heroku CLI or the dashboard.
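
For example, with the Heroku CLI (the app name below is just a placeholder), creating the app also adds a heroku git remote when run inside the repository:

# Create a new Heroku app (replace the name with your own)
heroku create my-node-api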


An important thing to note here is that we need to run the test suites before merging the change to the main branch, i.e., the Continuous Integration (CI) process. Fortunately, Heroku provides a CI pipeline as an add-on.


Once you create an application in Heroku, go to Settings ⇒ Connect to GitHub and select the repository. Make sure to select "Wait for CI to pass before deploy". It's important because we are deploying the application to the production environment.


Now, it's time to configure the CI pipeline for the application. Go to Pipelines and connect the GitHub repository.


Once you connect the GitHub repository, you need to configure app.json to run tests on this pipeline. It declares whether the test environment needs any add-ons, such as test databases or caching, for the test suites. Also, make sure you enable Heroku CI.


Add app.json with the following configuration,

{
  "environments": {
    "test": {
      "addons": [
        "heroku-postgresql:in-dyno"
      ]
    }
  }
}
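
Heroku CI runs the npm test script from package.json against this test environment. As a minimal sketch, assuming the project uses Jest (the actual repo may use a different test runner), the script could look like:

"scripts": {
  "test": "jest --coverage"
}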

After that, create a pull request and merge the changes to the main branch. Heroku CI will run the test cases and then deploy the app to production.


Docker Container Deployment

Let's see how to deploy the Node.js application by containerizing it, along with a CI/CD pipeline. There are different ways to deploy Docker containers to a production environment. For example, you can run them yourself on AWS, use managed services like AWS ECS to deploy and manage containers, or use hosting services like DigitalOcean or Heroku.

Here we are going to use Heroku and GitHub Actions to deploy Docker containers to production. We will see how to deploy to AWS ECS in upcoming articles.

If you're new to the Docker/containerization world as a Node.js developer, I recommend you check out this article first.

First and foremost, we need a Dockerfile for the application. Create a Dockerfile and add the following configuration.

FROM node:14-alpine
WORKDIR /usr/src/app
# Copy package.json and install dependencies first so this layer is cached
COPY package.json ./
RUN npm install --production
COPY . /usr/src/app/
USER node
# The app listens on 8080 (see docker-compose.yml and the Kubernetes config below)
EXPOSE 8080

CMD ["node", "index.js"]
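
To check that the image builds before wiring up a pipeline, a plain docker build is enough (node-api is just an example tag). Note that running the container on its own also needs the DATABASE_URL environment variable, which the docker-compose setup below takes care of.

# Build the image from the Dockerfile in the current directory
docker build -t node-api .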

To run the containers for development, we can use docker-compose, which works well while developing an application.

Here's a docker-compose.yml for development purposes,

version: "3.6"
services:
  web:
    build:
      context: .
    environment:
      DATABASE_URL: postgres://postgres:postgres@postgres:5432/postgres
    depends_on:
      - postgres
    ports:
      - '8080:8080'
  postgres:
    image: postgres
    restart: always
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - '5432:5432'
    volumes:
      - todo-db:/var/lib/postgresql/data

volumes:
  todo-db:

Make sure you add a .dockerignore to avoid copying unnecessary files into the Docker image.

**/node_modules/
**/.git
**/README.md
**/LICENSE
**/.vscode
**/npm-debug.log
**/coverage
**/.env
**/.editorconfig
**/.aws
**/dist

For Development

To run the application on your local machine, you can use the docker-compose up command, which brings the containers up locally.
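
For example, from the project root (the --build flag rebuilds the image after code changes):

# Build the image if needed and start the web + postgres containers locally
docker-compose up --build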


For Production

Deploying Docker containers to production is not as straightforward as running docker-compose up. In fact, running Docker Compose in production on AWS or any other cloud provider is generally not recommended. Instead, you push the images to a container registry such as AWS Elastic Container Registry, DigitalOcean Container Registry, or the Heroku Container Registry.

We will deploy our application to Heroku by pushing the image to the Heroku Container Registry using GitHub Actions.
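
For context, the GitHub Action below automates roughly what you would otherwise do by hand with the Heroku CLI (a sketch, assuming you're logged in and the app already exists):

heroku container:login                            # authenticate Docker against the Heroku Container Registry
heroku container:push web --app <your-app>        # build the Dockerfile and push the image for the web process
heroku container:release web --app <your-app>     # release the pushed image to the app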

If you're new to GitHub Actions, check out this great tutorial on GitHub Actions from TechWorld with Nana.

Create a file .github/workflows/deploy.yml and add the following configuration.

# This is a basic workflow to help you get started with Actions
name: Deploy container to Heroku Server in Production

# Controls when the workflow will run
on:
  # Triggers the workflow on push or pull request events, but only for the docker-aws branch
  push:
    branches: [ docker-aws ]
  pull_request:
    branches: [ docker-aws ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      - name: Checkout Repo
        uses: actions/checkout@v2
        with:
          ref: docker-aws
      # Build, push, and release the container to Heroku via the Container Registry
      - name: Build, Push and Release a Docker container to Heroku
        uses: gonuit/heroku-docker-deploy@v1.3.3
        with:
          email: ${{ secrets.HEROKU_EMAIL }}
          heroku_api_key: ${{ secrets.HEROKU_API_KEY }}
          heroku_app_name: ${{ secrets.HEROKU_APP_NAME }}
          dockerfile_directory: ./
          dockerfile_name: Dockerfile
          docker_options: "--no-cache"
          process_type: web

Let's break down the config file. The on hook defines the trigger point for the action. In our case, we need to run the GitHub Action when someone pushes or creates a pull request to the docker-aws branch.

on:
  # Triggers the workflow on push or pull request events, but only for the docker-aws branch
  push:
    branches: [ docker-aws ]
  pull_request:
    branches: [ docker-aws ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs defines the jobs that need to run when there is a code push or a pull request merge. An interesting thing about GitHub Actions is that there are many predefined actions that we can import into our config and use. Here, for example, we use the actions/checkout@v2 action to check out the branch on the runner.

steps:
  - name: Checkout Repo
    uses: actions/checkout@v2
    with:
      ref: docker-aws

Finally, we have the gonuit/heroku-docker-deploy@v1.3.3 action to build, push, and release the Docker container to Heroku.

- name: Build, Push and Release a Docker container to Heroku
  uses: gonuit/heroku-docker-deploy@v1.3.3
  with:
    email: ${{ secrets.HEROKU_EMAIL }}
    heroku_api_key: ${{ secrets.HEROKU_API_KEY }}
    heroku_app_name: ${{ secrets.HEROKU_APP_NAME }}
    dockerfile_directory: ./
    dockerfile_name: Dockerfile
    docker_options: "--no-cache"
    process_type: web

Once you configure it in your application, you can see the job running in GitHub Actions when you create and merge a PR.


Note: Make sure to set up the secrets (HEROKU_EMAIL, HEROKU_API_KEY, and HEROKU_APP_NAME) in the GitHub repo.
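
If you prefer the command line over the repository settings UI, the GitHub CLI can set these (a sketch, assuming the gh CLI is installed and authenticated):

gh secret set HEROKU_EMAIL       # prompts for the value
gh secret set HEROKU_API_KEY     # the API key from the Heroku account settings page
gh secret set HEROKU_APP_NAME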


Kubernetes Deployment

So far, we have seen how to deploy a Node.js application using PM2 and as a Docker container. Now, we are going to see an enhanced version of container deployment. We are still going to deploy a Docker container, but in a different way.

When we deploy containers to servers ourselves, we need to consider and manage several aspects. For example, how do the containers scale up or down based on traffic?

Orchestrating containers manually is a tedious task. To solve this problem, we have solutions like Kubernetes. So let's see how to deploy a Node.js application using Docker and Kubernetes.

Note: If you're new to Kubernetes, check out this article written specifically for Node.js devs: Kubernetes for Nodejs developers

TL;DR: to deploy a Node.js application on Kubernetes, we need configurations for a Deployment and a Service. The Deployment creates and manages the application's containers (Pods); the Service load-balances traffic across the different replicas of those containers. There are different types of Services in Kubernetes.

For a database, we need another configuration to persist the data beyond the container's lifetime. It's called a PersistentVolumeClaim.

Application

Create a directory infra and add app-depl.yml. First, create the Deployment configuration for the application.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: node-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: node-app
  template:
    metadata:
      labels:
        app: node-app
    spec:
      containers:
        - name: node-app
          image: ganeshmani009/node-deployment-web
          resources:
            limits:
              memory: "128Mi"
              cpu: "500m"
          ports:
            - containerPort: 8080
          env:
            - name: DATABASE_URL
              value: postgres://postgres:postgres@postgres:5432/postgres
          imagePullPolicy: Always
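
The Deployment above only runs the Pods; to reach the app, it also needs a Service, as described earlier. Here's a minimal sketch, assuming a NodePort Service for local testing with Minikube; the name node-app-srv is an assumption, not something from the original repo:

apiVersion: v1
kind: Service
metadata:
  name: node-app-srv   # hypothetical name, adjust to your own convention
spec:
  type: NodePort        # exposes the app on a node port, handy with Minikube
  selector:
    app: node-app       # matches the labels in the Deployment above
  ports:
    - port: 8080
      targetPort: 8080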


Database in Kubernetes

Create pg-depl.yml and add the following configuration.

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: pg-claim
spec:
  resources:
    requests:
      storage: 3Gi
  accessModes:
    - ReadWriteOnce
---
apiVersion: v1
kind: Service
metadata:
  name: postgres
spec:
  selector:
    app: postgres
  ports:
    - port: 5432
      targetPort: 5432
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: postgres
spec:
  selector:
    matchLabels:
      app: postgres
  template:
    metadata:
      labels:
        app: postgres
    spec:
      containers:
        - name: postgres
          image: postgres:13.4-alpine
          resources:
            limits:
              memory: "128Mi"
              cpu: "500m"
          ports:
            - containerPort: 5432
          env:
            - name: POSTGRES_PASSWORD
              value: postgres
          volumeMounts:
            # Postgres keeps its data under /var/lib/postgresql/data (same path as in docker-compose.yml)
            - mountPath: /var/lib/postgresql/data
              name: storage
      volumes:
        - name: storage
          persistentVolumeClaim:
            claimName: pg-claim

One of the major differences between the application config and the database configuration is the PersistentVolumeClaim. It binds the database's data to a persistent volume in Kubernetes, so that even if no container is running, the data persists.

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: pg-claim
spec:
  resources:
    requests:
      storage: 3Gi
  accessModes:
    - ReadWriteOnce

To test our application deployment on a local machine, you can use Minikube to create and run a Kubernetes cluster. Minikube creates a single-node Kubernetes cluster running in a virtual machine.

Note: Minikube is only intended for testing purposes, not for production. Make sure you install kubectl and Minikube.

Once you install minikube, you can create a cluster using the command,

minikube start --vm

To deploy the application locally, go to the root directory of your application and apply the config directory (here, it's infra) with kubectl,

kubectl apply -f infra
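
You can verify the rollout with the usual kubectl commands,

kubectl get pods       # the node-app and postgres pods should reach the Running state
kubectl get services   # lists the postgres Service (and any Service you added for the app)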


Here's the complete source code
