In this article, we will see how to build a scalable API rate limiter in a Node.js application.
Let's say you are building a public API service that users access over the network; you need to protect that service from DDoS attacks.
Or consider that we built a product and want to give users a free trial with limited access to its service.
In both scenarios, a rate limiting algorithm is a way to solve the problem.
A rate limiting algorithm limits access to APIs. For example, if a user hits an API at a rate of 100 requests/second, it can overload the server.
To avoid this, we can cap access to the API, say at 20 requests/minute per user.
Let's look at the types of rate limiting algorithms used in software application development:
Token Bucket: each user gets a bucket that stores the maximum number of tokens available for requests per minute. For example, if the rate limiter sets 5 tokens per minute for a user, that user can send at most 5 requests to the server per minute; after that, the server drops the requests.
Redis is commonly used to store this information so the request data can be accessed quickly.
Let's say User1 sends a request to the server. The server checks whether more than a minute has passed since the previous request. If less than a minute has passed, it checks how many tokens remain for that user.
If zero tokens remain, the server drops the request. A minimal sketch of the idea is shown below.
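To make the idea concrete, here is a minimal in-memory token bucket sketch; the 5-token limit, the one-minute refill interval, and the Map-based store are illustrative assumptions, and a real deployment would keep this state in Redis as described above.

```js
const buckets = new Map() // user -> { tokens, lastRefill }
const MAX_TOKENS = 5 // tokens handed out per user per minute (assumed value)
const REFILL_INTERVAL_MS = 60 * 1000

function allowRequest(user) {
  const now = Date.now()
  let bucket = buckets.get(user)
  // more than a minute since the last refill (or a new user): reset the tokens
  if (!bucket || now - bucket.lastRefill >= REFILL_INTERVAL_MS) {
    bucket = { tokens: MAX_TOKENS, lastRefill: now }
    buckets.set(user, bucket)
  }
  if (bucket.tokens === 0) return false // no tokens left, drop the request
  bucket.tokens-- // spend one token for this request
  return true
}
```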
Leaky Bucket: this algorithm uses a queue that takes requests in First In, First Out (FIFO) order.
Once the queue is full, the server drops incoming requests until the queue has space again.
For example, suppose the queue size is 4 and the server receives requests 1, 2, 3, and 4. Based on the queue size, it accepts all four.
When request 5 arrives, the server drops it.
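A rough sketch of this queue-based approach, assuming a queue size of 4 and a drain rate of one request per second (both made-up numbers); handle is a placeholder for whatever actually processes the request.

```js
const QUEUE_SIZE = 4 // maximum number of requests waiting in the bucket
const queue = []

function enqueue(request) {
  if (queue.length >= QUEUE_SIZE) return false // queue is full: drop the request
  queue.push(request) // FIFO: new requests go to the back
  return true
}

// "leak" one request per second and process it
setInterval(() => {
  const request = queue.shift() // oldest request leaves first
  if (request) handle(request)
}, 1000)
```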
Fixed Window Counter: the server increments a per-user request counter for a fixed time window. If the counter crosses a threshold, the server drops the request. Redis is used to store the request information.
For example, when the server gets a request from a user and that user's request info is already present in Redis, it increments the counter as long as the request falls inside the current window.
Once the threshold is reached, the server drops further requests until the window resets.
The drawback: if the server gets a burst of requests at the 55th second of one window and another burst at the start of the next, both bursts pass, so this approach won't work as expected around window boundaries.
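A hedged sketch of a fixed window counter backed by Redis, written in the same callback style as the middleware later in this article; the 5-request limit and the per-minute key format are assumptions for illustration.

```js
const redis = require("redis")
const redisClient = redis.createClient()
const LIMIT = 5 // allowed requests per one-minute window (assumed value)

function fixedWindowLimiter(req, res, next) {
  // one counter key per user per minute, e.g. "user1:28752671"
  const windowKey = `${req.headers.user}:${Math.floor(Date.now() / 60000)}`
  redisClient.incr(windowKey, (err, count) => {
    if (err) return next(err)
    if (count === 1) redisClient.expire(windowKey, 60) // window expires after 60s
    if (count > LIMIT) {
      return res.json({ error: 1, message: "throttle limit exceeded" })
    }
    next()
  })
}
```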
Sliding Logs: the server stores a log entry for every request with its timestamp, in Redis or in memory. For each new request, it counts how many log entries the user has accumulated over the last minute.
If the count exceeds the threshold, the server drops the incoming request.
On the other hand, this approach has a disadvantage: if the application receives millions of requests, maintaining a log entry for every request in memory is expensive.
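An illustrative in-memory version of sliding logs is below; in practice the log would live in Redis, and the 5-request limit is again an assumed value.

```js
const logs = new Map() // user -> array of request timestamps (ms)
const LIMIT = 5

function allowRequest(user) {
  const now = Date.now()
  const oneMinuteAgo = now - 60 * 1000
  // keep only the log entries from the last minute
  const recent = (logs.get(user) || []).filter(t => t > oneMinuteAgo)
  if (recent.length >= LIMIT) return false // over the threshold: drop the request
  recent.push(now) // log this request
  logs.set(user, recent)
  return true
}
```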
Sliding Window Counter: this approach is similar to sliding logs. The only difference is that instead of storing every log entry, we group a user's request data by timestamp and keep a counter per group.
For example, when the server receives a request from a user, we look up the request's timestamp in the store. If an entry for that timestamp exists, we increment its counter; if not, we insert a new record.
This way, we don't need to store each request as a separate entry; we group requests by timestamp and maintain a counter for each group, roughly in the shape shown below.
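For example, the per-user data stored in Redis by the middleware we build below looks roughly like this (the timestamps are made up):

```js
// one entry per timestamp (unix seconds), each with a counter,
// instead of one entry per individual request
[
  { requestTime: 1614850275, counter: 3 }, // 3 requests arrived in this second
  { requestTime: 1614850301, counter: 1 },
]
```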
Now let's implement the sliding window counter. The complete source code can be found here.
Create a directory and initialize package.json using the following command:
```sh
npm init --yes
```
After that, install Express, redis, and moment for the application using the following command:
```sh
npm i express redis moment
```
The redis client is used to connect to the Redis server, and moment is used to work with the request timestamps.
First, create a file server.js and add the following code:
```js
const express = require("express")
const rateLimiter = require("./slidingWindowCounter")

const app = express()
const router = express.Router()

router.get("/", (req, res) => {
  res.send("<h1>API response</h1>")
})

// apply the rate limiter to every request before it reaches the routes
app.use(rateLimiter)
app.use("/api", router)

app.listen(5000, () => {
  console.log("server is running on port 5000")
})
```
Next, create a file slidingWindowCounter.js and add the following code:
```js
const redis = require("redis")
const moment = require("moment")

const redisClient = redis.createClient()

module.exports = (req, res, next) => {
  // the user is identified by the "user" request header
  redisClient.exists(req.headers.user, (err, reply) => {
    if (err) {
      console.log("problem with redis")
      process.exit(1)
    }

    if (reply === 1) {
      // the user already has request data stored in redis
      redisClient.get(req.headers.user, (err, redisResponse) => {
        let data = JSON.parse(redisResponse)

        let currentTime = moment().unix()
        let lessThanMinuteAgo = moment()
          .subtract(1, "minute")
          .unix()

        // keep only the entries from the last minute
        let requestsInLastMinute = data.filter(item => {
          return item.requestTime > lessThanMinuteAgo
        })

        // sum the counters of those entries to get the request count
        let requestCount = 0
        requestsInLastMinute.forEach(item => {
          requestCount = requestCount + item.counter
        })

        if (requestCount >= 5) {
          // threshold of 5 requests per minute reached: drop the request
          return res.json({ error: 1, message: "throttle limit exceeded" })
        } else {
          // increment the counter for the current timestamp,
          // or insert a new entry if there is none for this second
          let isFound = false
          data.forEach(element => {
            if (element.requestTime === currentTime) {
              isFound = true
              element.counter++
            }
          })
          if (!isFound) {
            data.push({
              requestTime: currentTime,
              counter: 1,
            })
          }

          redisClient.set(req.headers.user, JSON.stringify(data))

          next()
        }
      })
    } else {
      // first request from this user: create a new entry
      let data = []
      let requestData = {
        requestTime: moment().unix(),
        counter: 1,
      }
      data.push(requestData)
      redisClient.set(req.headers.user, JSON.stringify(data))

      next()
    }
  })
}
```
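To try it out, start the server with node server.js and fire a few requests with a user header. The small script below is one way to do that, assuming Node 18+ for the global fetch; the sixth request within a minute should come back with the throttle message.

```js
// test.js - hits the rate-limited endpoint six times for the same user
async function test() {
  for (let i = 1; i <= 6; i++) {
    const res = await fetch("http://localhost:5000/api", {
      headers: { user: "user1" }, // the middleware keys the limit on this header
    })
    console.log(`request ${i}:`, res.status, await res.text())
  }
}

test()
```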
The complete source code of this scalable API rate limiter in Node.js can be found here.
Demo: https://youtu.be/qlQ5XSDFe9c