API with NestJS #187. Rate limiting using Throttler


This entry is part 187 of 187 in the API with NestJS series

When building a REST API, we must prepare for various cases. At first, everything might run smoothly, but as our API gains traction, we might encounter some issues. Some users might begin sending hundreds or thousands of requests per second, pulling data far more often than necessary. As a result, our server can struggle to keep up, making the API slow or unresponsive for other users. We can also encounter attackers launching a Denial of Service (DoS) attack, flooding our API with millions of requests to crash our system intentionally.

To solve the above issues, we can implement rate limiting. It ensures that a single user can’t overwhelm our API by allowing us to limit how frequently they can make API requests. With this approach, our API can remain fast and reliable, keeping attackers at bay while allowing legitimate users to access the data they need.

Configuring the Throttler module

One of the ways to implement rate limiting with NestJS is to use the Throttler module provided by the official @nestjs/throttler package.

With it, we can specify how many HTTP requests a particular user can make within a given time.

app.module.ts
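The original code listing is not included here. Below is a minimal sketch of what the configuration might look like, based on the @nestjs/throttler API; the exact values match the limits described in the next paragraph.

```typescript
import { Module } from '@nestjs/common';
import { ThrottlerModule } from '@nestjs/throttler';

@Module({
  imports: [
    ThrottlerModule.forRoot([
      {
        ttl: 60000, // time window in milliseconds
        limit: 10, // maximum number of requests within the window
      },
    ]),
  ],
})
export class AppModule {}
```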

With the above configuration, a particular user is limited to 10 requests within 60000 milliseconds. To make it easier to read, we can use built-in helpers such as seconds or minutes to specify the Time To Live (TTL) parameter.

app.module.ts
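A sketch of the same configuration using one of the helpers exported by @nestjs/throttler:

```typescript
import { Module } from '@nestjs/common';
import { ThrottlerModule, minutes } from '@nestjs/throttler';

@Module({
  imports: [
    ThrottlerModule.forRoot([
      {
        ttl: minutes(1), // equivalent to 60000 milliseconds
        limit: 10,
      },
    ]),
  ],
})
export class AppModule {}
```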

Applying the Throttler

To apply the Throttler, we need to use the ThrottlerGuard class provided by NestJS. In NestJS, guards determine whether or not a given request should be allowed. The most common way of applying the ThrottlerGuard is to use it globally by adding an APP_GUARD provider.

app.module.ts
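The listing is missing here; a minimal sketch of registering the guard globally through the APP_GUARD token might look like this:

```typescript
import { Module } from '@nestjs/common';
import { APP_GUARD } from '@nestjs/core';
import { ThrottlerGuard, ThrottlerModule, minutes } from '@nestjs/throttler';

@Module({
  imports: [ThrottlerModule.forRoot([{ ttl: minutes(1), limit: 10 }])],
  providers: [
    {
      // registering the guard under APP_GUARD applies it to every route
      provide: APP_GUARD,
      useClass: ThrottlerGuard,
    },
  ],
})
export class AppModule {}
```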

Thanks to the above, if more than 10 requests come from a particular IP within 60 seconds, the excess requests will be blocked. Instead of the desired response, the API will respond with the status code “429 Too Many Requests”.

Customizing the behavior

Even if the guard is applied globally, we can change its behavior by turning off the rate limiting for a particular controller with the @SkipThrottle() decorator.

articles.controller.ts
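The original listing is not included; a minimal sketch of skipping the throttler for a whole controller, using a hypothetical ArticlesController, might look like this:

```typescript
import { Controller, Get } from '@nestjs/common';
import { SkipThrottle } from '@nestjs/throttler';

// disables rate limiting for every route in this controller
@SkipThrottle()
@Controller('articles')
export class ArticlesController {
  @Get()
  getAll() {
    // ...
  }
}
```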

We can also override the Throttler settings we set up globally using the @Throttle() decorator.

articles.controller.ts
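A sketch of overriding the global settings for a single route; the limit of 20 requests is an illustrative value, not taken from the original article:

```typescript
import { Controller, Get } from '@nestjs/common';
import { Throttle, seconds } from '@nestjs/throttler';

@Controller('articles')
export class ArticlesController {
  // overrides the global Throttler settings for this route only
  @Throttle({ default: { limit: 20, ttl: seconds(60) } })
  @Get()
  getAll() {
    // ...
  }
}
```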

Dealing with proxies

If our application runs behind a proxy server, HTTP requests from various users share the same IP from the perspective of our NestJS application. This prevents the Throttler from working correctly.

Fortunately, we can identify the IP address of a client connecting through a proxy thanks to the X-Forwarded-For header. To use it, we need to turn on the trust proxy setting.

main.ts
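The listing is missing; assuming the app uses the default Express adapter, enabling the trust proxy setting might look like the sketch below. The 'loopback' value is an example and should match your actual proxy setup.

```typescript
import { NestFactory } from '@nestjs/core';
import { NestExpressApplication } from '@nestjs/platform-express';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create<NestExpressApplication>(AppModule);
  // trust the proxy so the client IP is read from the X-Forwarded-For header
  app.set('trust proxy', 'loopback');
  await app.listen(3000);
}
bootstrap();
```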

Specifying the configuration per environment

There is a good chance that we would like to use different values in different environments. We can do that by specifying environment variables.

environment-variables.ts
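The original listing is not included. Assuming the project validates environment variables with class-validator, as earlier parts of this series do, the sketch below adds two variables; the names THROTTLE_TTL and THROTTLE_LIMIT are illustrative, not confirmed by the source:

```typescript
import { IsNumber } from 'class-validator';

export class EnvironmentVariables {
  // ...other variables omitted

  // hypothetical variable names for the Throttler configuration
  @IsNumber()
  THROTTLE_TTL: number;

  @IsNumber()
  THROTTLE_LIMIT: number;
}
```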

.env
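A matching .env fragment, using the same illustrative variable names and the values from the earlier configuration:

```
THROTTLE_TTL=60000
THROTTLE_LIMIT=10
```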

app.module.ts
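To read those variables, the module can be configured asynchronously. A sketch using ThrottlerModule.forRootAsync with the ConfigService from @nestjs/config, again assuming the illustrative variable names:

```typescript
import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { ThrottlerModule } from '@nestjs/throttler';

@Module({
  imports: [
    ConfigModule.forRoot(),
    ThrottlerModule.forRootAsync({
      inject: [ConfigService],
      useFactory: (configService: ConfigService) => [
        {
          // hypothetical variable names; adjust to your environment schema
          ttl: Number(configService.get('THROTTLE_TTL')),
          limit: Number(configService.get('THROTTLE_LIMIT')),
        },
      ],
    }),
  ],
})
export class AppModule {}
```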

Working with multiple app instances

By default, the Throttler module keeps track of the requests in the application’s memory. This works fine as long as only one instance of our application exists. However, if we have multiple instances of our NestJS app, we should create a shared cache to store the information about requests using Redis.

Fortunately, the @nest-lab/throttler-storage-redis library provides a Redis storage that the Throttler can share across multiple instances of our NestJS application.
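A minimal sketch of plugging the Redis storage into the Throttler; the connection URL is a placeholder for a local Redis instance:

```typescript
import { Module } from '@nestjs/common';
import { ThrottlerModule, minutes } from '@nestjs/throttler';
import { ThrottlerStorageRedisService } from '@nest-lab/throttler-storage-redis';

@Module({
  imports: [
    ThrottlerModule.forRoot({
      throttlers: [{ ttl: minutes(1), limit: 10 }],
      // requests are now counted in Redis, shared by all app instances
      storage: new ThrottlerStorageRedisService('redis://localhost:6379'),
    }),
  ],
})
export class AppModule {}
```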

If you want to know more about caching with Redis, check out API with NestJS #24. Cache with Redis. Running the app in a Node.js cluster.

Summary

In this article, we explained why our REST API might need a rate limiter. To implement it, we used the Throttler module developed by the NestJS team. We also learned how to change the Throttler configuration per environment. Besides that, we also know how to deal with web proxies and multiple application instances. All of that gives us a solid understanding of how rate limiting with Throttler works and how to use it.

Series Navigation: << API with NestJS #186. What’s new in Express 5?