API with NestJS #23. Implementing in-memory cache to increase the performance

JavaScript NestJS TypeScript

This entry is part 23 of 175 in the API with NestJS series.

There are quite a few things we can do when tackling our application’s performance. Sometimes we can make our code faster or optimize the database queries. To make our API even more performant, we might want to avoid running some of the code altogether.

Accessing data stored in the database is often time-consuming. It adds up if we also perform some data manipulation on top of it before returning it to the user. Fortunately, we can improve our approach with caching. By storing a copy of the data in a way that allows it to be served faster, we can significantly speed up the response.

Implementing in-memory cache

The most straightforward way to implement a cache is to store the data in the memory of our application. Under the hood, NestJS uses the cache-manager library. We need to start by installing it with npm install cache-manager.

To enable the cache, we need to import the CacheModule in our app.

posts.module.ts
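The original code block isn’t reproduced in this extract, so here is a minimal sketch of what the module might look like. The CacheModule import path matches the NestJS version used at the time of writing (newer releases move it to @nestjs/cache-manager), and the entity, controller, and service names are assumptions carried over from earlier parts of the series.

```typescript
import { CacheModule, Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import Post from './post.entity';
import PostsController from './posts.controller';
import PostsService from './posts.service';

@Module({
  imports: [
    // registers an in-memory cache store for this module
    CacheModule.register(),
    TypeOrmModule.forFeature([Post]),
  ],
  controllers: [PostsController],
  providers: [PostsService],
})
export class PostsModule {}
```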

By default, a response is cached for 5 seconds before it is removed. Also, the maximum number of elements in the cache is 100 by default. We can change those values by passing additional options to the register() method.
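As a sketch, overriding those defaults could look like this; the values are arbitrary, and keep in mind that cache-manager v4 interprets ttl in seconds while v5 expects milliseconds.

```typescript
// replacing the plain CacheModule.register() call in the imports array above
CacheModule.register({
  ttl: 120, // how long a response stays cached
  max: 100, // maximum number of items kept in the in-memory store
}),
```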

Automatically caching responses

NestJS comes equipped with the CacheInterceptor. With it, NestJS handles the cache automatically.

posts.controller.ts
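The controller code isn’t included in this extract either, so below is a hedged sketch of applying the interceptor to a single route. The search query parameter and the searchForPosts/getAllPosts service methods are assumptions based on parts 12 and 17 of the series; pagination parameters are omitted for brevity.

```typescript
import {
  CacheInterceptor,
  Controller,
  Get,
  Query,
  UseInterceptors,
} from '@nestjs/common';
import PostsService from './posts.service';

@Controller('posts')
export default class PostsController {
  constructor(private readonly postsService: PostsService) {}

  // responses from this handler are cached automatically
  @UseInterceptors(CacheInterceptor)
  @Get()
  getPosts(@Query('search') search?: string) {
    if (search) {
      return this.postsService.searchForPosts(search);
    }
    return this.postsService.getAllPosts();
  }
}
```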

If we call this endpoint two times, NestJS does not invoke the route handler twice. Instead, it returns the cached data the second time.

In the twelfth part of this series, we’ve integrated Elasticsearch into our application. Also, in the seventeenth part, we’ve added pagination. Therefore, our endpoint accepts quite a few query params.

A very important thing that the official documentation does not mention is that NestJS stores the response of the method separately for every combination of query params. Thanks to that, calling the endpoint with different query strings can yield different responses.

Although above we use the CacheInterceptor for a particular endpoint, we can also apply it to a whole controller, or even register it globally for the whole application. Using cache might sometimes cause us to return stale data, though. Therefore, we need to be careful about which endpoints we cache.

Using the cache store manually

Aside from using the automatic cache, we can also interact with the cache manually. Let’s inject the cache manager into our service.

posts.service.ts
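A minimal sketch of the injection, with the service’s other dependencies left out; CACHE_MANAGER comes from @nestjs/common in this NestJS version and the Cache type from cache-manager.

```typescript
import { CACHE_MANAGER, Inject, Injectable } from '@nestjs/common';
import { Cache } from 'cache-manager';

@Injectable()
export default class PostsService {
  constructor(
    // the same store that the CacheInterceptor writes to
    @Inject(CACHE_MANAGER) private readonly cacheManager: Cache,
  ) {}
}
```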

An important concept to grasp is that the cache manager provides a key-value store. We can:

  • retrieve values using the get method,
  • add items using set,
  • remove elements with del,
  • clear the whole cache using reset.

It can come in handy for more sophisticated cases. We can even use it together with the automatic cache.
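For illustration only, a hypothetical method inside PostsService could combine those calls as follows; the method name and keys are made up.

```typescript
// a made-up example exercising the key-value API of the injected cache manager
async cacheDemo() {
  await this.cacheManager.set('posts-count', 42); // add an item
  const count = await this.cacheManager.get('posts-count'); // retrieve it
  await this.cacheManager.del('posts-count'); // remove a single entry
  await this.cacheManager.reset(); // clear the whole cache
  return count;
}
```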

Invalidating cache

If we would like to increase the time our cache lives, we need to figure out a way to invalidate it. If we want to cache the list of our posts, we need to refresh it every time a post is added, modified, or removed.

To use the del function to remove cached data, we need to know its key. Under the hood, the CacheInterceptor creates a key for every route we cache. This means that it creates separate cache keys for requests with different URLs and query strings.

Instead of relying on the CacheInterceptor to generate a key for every route, we can define it ourselves with the @CacheKey() decorator. We can also use @CacheTTL() to increase the time during which the cache lives.

postsCacheKey.constant.ts
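The constant file isn’t shown in this extract; a plausible sketch is a single exported string. The exact name GET_POSTS_CACHE_KEY is an assumption, but the later snippets here use it consistently.

```typescript
export const GET_POSTS_CACHE_KEY = 'GET_POSTS_CACHE_KEY';
```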

posts.controller.ts
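A hedged sketch of the decorated handler; the TTL value is arbitrary (seconds with cache-manager v4), and the handler shape matches the earlier sketch.

```typescript
import {
  CacheInterceptor,
  CacheKey,
  CacheTTL,
  Controller,
  Get,
  Query,
  UseInterceptors,
} from '@nestjs/common';
import { GET_POSTS_CACHE_KEY } from './postsCacheKey.constant';
import PostsService from './posts.service';

@Controller('posts')
export default class PostsController {
  constructor(private readonly postsService: PostsService) {}

  @UseInterceptors(CacheInterceptor)
  @CacheKey(GET_POSTS_CACHE_KEY)
  @CacheTTL(120) // keep the cached response for 120 seconds
  @Get()
  getPosts(@Query('search') search?: string) {
    if (search) {
      return this.postsService.searchForPosts(search);
    }
    return this.postsService.getAllPosts();
  }
}
```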

The above creates a big issue, though. Because our custom key is now always used for this handler, different query parameters yield the same result. Requests with different query strings now share the same cache entry.

To fix this, we need to extend the CacheInterceptor class and change its behavior slightly. The trackBy method of the CacheInterceptor returns a key that is used within the store. Instead of returning just the cache key, let’s append the query params to it.

To view the original trackBy method, check out the CacheInterceptor source in the NestJS repository.

httpCache.interceptor.ts
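Since the interceptor code isn’t included here, below is a sketch of how such an HttpCacheInterceptor could look. Depending on the NestJS version, CACHE_KEY_METADATA may need to be imported from @nestjs/common/cache/cache.constants rather than the main barrel.

```typescript
import {
  CACHE_KEY_METADATA,
  CacheInterceptor,
  ExecutionContext,
  Injectable,
} from '@nestjs/common';

@Injectable()
export class HttpCacheInterceptor extends CacheInterceptor {
  trackBy(context: ExecutionContext): string | undefined {
    // read the key provided through the @CacheKey() decorator, if any
    const cacheKey = this.reflector.get(
      CACHE_KEY_METADATA,
      context.getHandler(),
    );

    if (cacheKey) {
      const request = context.switchToHttp().getRequest();
      // append the query string so every combination of query params
      // gets its own entry under our custom key
      return `${cacheKey}-${request._parsedUrl.query}`;
    }

    // no @CacheKey() provided: fall back to the default key tracking
    return super.trackBy(context);
  }
}
```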

The _parsedUrl property is created by the parseurl library.

If we don’t provide the @CacheKey() decorator with a key, NestJS falls back to the original behavior through super.trackBy(context).

Otherwise, the HttpCacheInterceptor creates keys that combine our custom cache key with the query string.

Now we can create a method that clears this cache and call it whenever we create, update, or delete posts.
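As a sketch assuming the constant from above and an in-memory store that exposes keys(), such a method could look like the following; the clearCache name and the createPost stub are illustrative, and with strict cache-manager typings you may need to guard against store.keys being undefined.

```typescript
import { CACHE_MANAGER, Inject, Injectable } from '@nestjs/common';
import { Cache } from 'cache-manager';
import { GET_POSTS_CACHE_KEY } from './postsCacheKey.constant';

@Injectable()
export default class PostsService {
  constructor(
    @Inject(CACHE_MANAGER) private readonly cacheManager: Cache,
  ) {}

  // removes every cached entry whose key starts with our posts cache key
  async clearCache() {
    const keys: string[] = await this.cacheManager.store.keys();
    await Promise.all(
      keys
        .filter((key) => key.startsWith(GET_POSTS_CACHE_KEY))
        .map((key) => this.cacheManager.del(key)),
    );
  }

  async createPost(/* ...post data */) {
    // ...save the new post to the database first, then invalidate the list cache
    await this.clearCache();
  }
}
```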

By doing the above, we invalidate our cache whenever the list of posts should change. With that, we can increase the Time To Live (TTL) and improve our application’s performance.

Summary

In this article, we’ve implemented an in-memory cache, both by using automatic caching and by interacting with the cache store manually. Thanks to adjusting the way NestJS tracks cached responses, we’ve also been able to invalidate our cache appropriately.

While the in-memory cache is valid in a lot of cases, it has its disadvantages. For example, it is not shared between multiple instances of our application. To deal with this issue, we can use Redis. We will cover this topic in upcoming articles, so stay tuned!

Series Navigation: << API with NestJS #22. Storing JSON with PostgreSQL and TypeORM | API with NestJS #24. Cache with Redis. Running the app in a Node.js cluster >>
9 Comments
John Schmitz
3 years ago

Really enjoyed your series, I haven’t read all of it but it sure made me understand more about NestJs, I hope to be a regular reader.

Jones
3 years ago

Thank you very much,
Really appreciated.

Andri
3 years ago

Please make UDEMY course about NestJs and microservices, and a simple implementation in React app (such as pub/sub) method.
Really enjoying the series so far!
Keep up the good work.

Genesis Bertiz
3 years ago

Thank you so much.

Mitch
3 years ago

Cannot invoke an object which is possibly ‘undefined’.

await this.cacheManager.store.keys()

David
2 years ago

What about caching graphql queries?

Prabu Lakshmanan
1 year ago

In-memory cache default 100 element size never mentioned in official website here > https://docs.nestjs.com/techniques/caching#in-memory-cache

I am using like this ?

CacheModule.registerAsync({
  useFactory: () => ({
   isGlobal: true,
  }),
 })

ttl setting i am using as key level (8 hrs)
.set(key,value,28800000)

Still after sometime (just 6 users in my application) all keys are removed

can any one facing same issue ?

Aleks
9 months ago

Check warning on same page you linked:

cache-manager version 4 uses seconds for TTL (Time-To-Live). The current version of cache-manager (v5) has switched to using milliseconds instead.