Introduction
Security is one of the most important concerns web applications and APIs have to address. API security encompasses quite an array of tactics, one of which is rate limiting. In this article, you will learn how to set rate limits in Strapi 5, a flexible, open-source headless CMS.
Rate limiting is simply the process of controlling the number of requests a specific client can make to your API within a given time period. This is important to the security of your API, especially in the following aspects:
- Protection against abuse
- Fairness
- Protection against some types of attacks, such as DDoS.

Strapi does not limit your choice of methods for enabling rate limits, and we will consider several ways you can use this to your advantage.
Prerequisites
The following are required to get the best out of this tutorial:
- Node.js installed on your local machine.
- A good understanding of Strapi.
- A new Strapi 5 project, created with:

```shell
npx create-strapi-app@rc my-project --quickstart
```
Understanding Rate Limiting
Rate limiting is a way of limiting how many requests a client makes to an API within a specified amount of time. For instance, you may allow a client to make no more than 100 requests per minute. This technique ensures that your API is not overloaded while the server's resources are shared fairly among all clients.
Common Use Cases for Rate Limiting
Rate limiting is necessary for several reasons. It helps prevent overuse of your API and treats users fairly in terms of access to the service. Let's look at a few use cases where rate limiting is helpful.
- It limits the number of requests a user can make within a period, preventing spam and overload of your system.
- It protects your infrastructure from crashes by preventing sudden spikes in traffic.
- It ensures fair consumption of your API, giving each of your clients equal access.
- It delays or blocks malicious traffic.
Benefits of Implementing Rate Limiting
In addition to preventing abuse, rate limiting brings a few significant benefits:
- It improves the stability and availability of your API, which will make it more reliable for all users.
- It saves server costs by regulating unwanted traffic, preventing unnecessary load on your infrastructure.
- It strengthens security against attacks such as brute force and DDoS.
Rate Limiting Methods
There are several common rate-limiting strategies, which are as follows.

- Fixed Window: requests are counted within fixed time intervals; once the counter reaches the limit, further requests are rejected until the window resets. Example: 100 requests per minute.
- Sliding Window: uses a rolling time window instead of fixed intervals, which smooths out bursts at window boundaries.
- Token Bucket: each client draws a token per request from a bucket that refills at a steady rate; requests are rejected when the bucket is empty.
- Leaky Bucket: incoming requests are queued and processed at a constant rate, regardless of how quickly they arrive.
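To make the token-bucket idea concrete, here is a minimal, framework-free sketch. It is not taken from any package; the class and method names are illustrative:

```typescript
// Minimal token-bucket sketch: each client has a bucket that refills at
// `refillPerSec` tokens per second, up to `capacity` tokens.
type Bucket = { tokens: number; lastRefill: number };

class TokenBucketLimiter {
  private buckets = new Map<string, Bucket>();

  constructor(private capacity: number, private refillPerSec: number) {}

  // Returns true if the request is allowed, false if it should be rejected.
  allow(clientId: string, now: number = Date.now()): boolean {
    const bucket =
      this.buckets.get(clientId) ?? { tokens: this.capacity, lastRefill: now };
    // Refill proportionally to the time elapsed since the last check.
    const elapsedSec = (now - bucket.lastRefill) / 1000;
    bucket.tokens = Math.min(
      this.capacity,
      bucket.tokens + elapsedSec * this.refillPerSec
    );
    bucket.lastRefill = now;
    if (bucket.tokens < 1) {
      this.buckets.set(clientId, bucket);
      return false; // bucket is empty: reject
    }
    bucket.tokens -= 1; // spend one token for this request
    this.buckets.set(clientId, bucket);
    return true;
  }
}
```

With a capacity of 3 and a refill rate of 1 token per second, a client can burst 3 requests at once, then gets one additional request per second.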
Approaches to Implement Rate Limiting in Strapi v5
Strapi allows for various approaches to rate limiting, and each has its advantages depending on the kind of project you are working on. Here are the approaches you can follow to implement rate limiting.
- Koa Packages Directly
- Strapi Global Middleware
- Strapi Route Middleware
- Using Strapi Policy
- Using Redis for Rate Limiting
- Using Express Limiter with Strapi
- Using Third-party Services, e.g., Cloudflare
Before you start using any of the approaches above, create an Article content-type in your Strapi backend.
Add some fields to the Article collection and you are set. This content-type will be used for the demonstrations.
Using Koa Packages Directly
Since Strapi is built on top of Koa, you can use Koa-specific packages for rate limiting, such as koa2-ratelimit. You can set up rate limiting at the global level, meaning for your whole Strapi server application, or use route-level middleware to enable rate limiting only on the specific routes you want.
First, navigate to your project folder on your command line and install the package with the command:
```shell
npm install koa2-ratelimit
```
Using Strapi Global Middleware with koa2-ratelimit
Strapi middleware intercepts a request before it reaches the controller, making it ideal for implementing rate limiting across all routes in the application. To implement rate limiting globally, create a `middlewares` folder in the `src` folder of your application. Then create a `rate-limit.ts` file and add the code snippet below:
```typescript
// src/middlewares/rate-limit.ts
import { RateLimit } from "koa2-ratelimit";
import type { Core } from "@strapi/strapi";
import { Context, Next } from "koa";

export default (_config: any, {}: { strapi: Core.Strapi }) => {
  return async (ctx: Context, next: Next) => {
    return RateLimit.middleware({
      interval: { min: 5 }, // 5 minutes
      max: 100, // limit each IP to 100 requests per 5-minute window
      message: "Too many requests, please try again later.",
      headers: true,
    })(ctx, next);
  };
};
```
This code uses the fixed-window rate-limiting method, restricting each client IP to 100 requests every five minutes for all requests to your APIs and your Strapi admin. By setting `headers` to `true`, we inform clients about their remaining quota and reset time.
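For illustration, a client that has exhausted its quota might receive a response like the sketch below; the exact header names and values depend on the koa2-ratelimit version and your configuration:

```
HTTP/1.1 429 Too Many Requests
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1735689600
Retry-After: 300

Too many requests, please try again later.
```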
Now enable the rate-limiting middleware in Strapi’s configuration:
```typescript
// config/middlewares.ts

export default [
  "strapi::logger",
  "strapi::errors",
  "strapi::security",
  "strapi::cors",
  "strapi::poweredBy",
  "strapi::query",
  "strapi::body",
  "strapi::session",
  "strapi::favicon",
  "strapi::public",
  "global::rate-limit",
];
```
Here are screenshots of what the output will look like when you hit the limits.
Using Strapi Route Middleware with koa2-ratelimit
You may need finer control over which routes are rate limited. Applying limits only to the routes that need them, such as login routes or routes that make expensive API calls, is often better than rate limiting every request in the application. To do that, first create a new middleware file for rate limiting:
```shell
npx strapi generate middleware rate-limit
```
Then, you will be prompted to select where you want to add this middleware and which API it is for. Your selection should look like the screenshot below:
Add the following rate-limiting logic using the `koa2-ratelimit` package. This middleware limits requests to 100 every 15 minutes.
```typescript
// src/api/article/middlewares/rate-limit.ts

/**
 * `rate-limit` middleware
 */

import { RateLimit } from "koa2-ratelimit";
import type { Core } from "@strapi/strapi";
import { Context, Next } from "koa";

export default (_config: any, {}: { strapi: Core.Strapi }) => {
  return async (ctx: Context, next: Next) => {
    return RateLimit.middleware({
      interval: { min: 15 },
      max: 100,
      message: "Too many requests, please slow down.",
      headers: true,
    })(ctx, next);
  };
};
```
This Strapi middleware uses koa2-ratelimit's fixed-window limiting. It is configured with:

- `interval`: set to 15 minutes, the time window over which requests are counted.
- `max`: allows up to 100 requests per 15-minute interval.
- `message`: returns "Too many requests, please slow down." when the rate limit is exceeded.
- `headers`: includes rate-limit headers in the response to inform clients about their remaining quota and reset time.

Next, register the middleware on the routes it should protect:
```typescript
// src/api/article/routes/article.ts

/**
 * article router
 */

import { factories } from "@strapi/strapi";

export default factories.createCoreRouter("api::article.article", {
  only: ["find", "findOne", "create", "update", "delete"],
  config: {
    find: {
      middlewares: ["api::article.rate-limit"],
    },
    findOne: {
      middlewares: ["api::article.rate-limit"],
    },
    create: {
      middlewares: ["api::article.rate-limit"],
    },
    update: {
      middlewares: ["api::article.rate-limit"],
    },
    delete: {
      middlewares: ["api::article.rate-limit"],
    },
  },
});
```
Now rate limiting is applied to the article routes. Whatever your implementation, it's important to target the routes that are most sensitive or resource-intensive.
Implementing Rate Limit Using Strapi Policy
Aside from using middleware, you can implement rate limiting using Strapi policies.
"Policies are functions that execute specific logic on each request before it reaches the controller. They are mostly used for securing business logic."
This approach allows you to apply rate-limiting logic at a more granular level, such as specific routes or actions.
Step 1: Create a Rate Limit Policy
First, create a new policy file for rate limiting. For example, let's create a policy called `rate-limit.ts` inside the `policies` folder at the path `./src/api/article/`.
```typescript
// src/api/article/policies/rate-limit.ts
import { Core } from "@strapi/strapi";
import { errors } from "@strapi/utils";

const { PolicyError } = errors;

const rateLimitMap = new Map<string, { count: number; startTime: number }>();

const RATE_LIMIT = 5; // Maximum number of requests
const TIME_WINDOW = 60000; // Time window in milliseconds (1 minute)

export default async (
  policyContext: any,
  config: any,
  { strapi }: { strapi: Core.Strapi }
) => {
  const ip = policyContext.request.ip;
  const currentTime = Date.now();

  if (!rateLimitMap.has(ip)) {
    rateLimitMap.set(ip, { count: 1, startTime: currentTime });
  } else {
    const { count, startTime } = rateLimitMap.get(ip)!;
    if (currentTime - startTime < TIME_WINDOW) {
      if (count >= RATE_LIMIT) {
        throw new PolicyError("Too many requests, please try again later.", {
          policy: "rate-limit",
        });
      }
      rateLimitMap.set(ip, { count: count + 1, startTime });
    } else {
      rateLimitMap.set(ip, { count: 1, startTime: currentTime });
    }
  }

  return true;
};
```
In this code, we maintain a simple in-memory map to track the number of requests from each IP address within a specified time window. If the request count exceeds the limit, a `PolicyError` is thrown.
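One caveat of this in-memory approach is that the map grows with every new IP and entries are never removed. A small pruning helper, hypothetical and not part of Strapi, could be run periodically (for example via `setInterval`) to evict expired windows:

```typescript
// Hypothetical cleanup for the in-memory rate-limit map: removes entries
// whose time window has already expired, so the map does not grow forever.
const TIME_WINDOW = 60_000; // must match the window used by the policy

type Entry = { count: number; startTime: number };

function pruneExpired(map: Map<string, Entry>, now: number = Date.now()): number {
  let removed = 0;
  for (const [ip, { startTime }] of map) {
    if (now - startTime >= TIME_WINDOW) {
      map.delete(ip); // window over: this IP's counter is stale
      removed++;
    }
  }
  return removed;
}
```

You might schedule it with something like `setInterval(() => pruneExpired(rateLimitMap), TIME_WINDOW)`.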
Step 2: Apply the Rate Limit Policy to Routes
Next, apply this policy to the desired routes in your Strapi application. For example, you can add it to the `find`, `create`, `update`, and `delete` actions of the `article` route configuration.
```typescript
// ./src/api/article/routes/article.ts
import { factories } from "@strapi/strapi";

export default factories.createCoreRouter("api::article.article", {
  config: {
    find: {
      policies: ["api::article.rate-limit"],
    },
    create: {
      policies: ["api::article.rate-limit"],
    },
    update: {
      policies: ["api::article.rate-limit"],
    },
    delete: {
      policies: ["api::article.rate-limit"],
    },
  },
});
```
By adding the `rate-limit` policy to specific routes, you ensure that the rate-limiting logic applies only to those routes, giving you more control over your API's behavior. Now, when you make more than 5 API requests within 1 minute to the `/articles` endpoint, a `PolicyError` is thrown.
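The counting behavior can also be sanity-checked outside Strapi by extracting the same fixed-window logic into a plain function using the policy's constants (5 requests per 60-second window). This is an illustrative sketch, not a Strapi API:

```typescript
// Standalone replica of the policy's fixed-window counting logic.
const RATE_LIMIT = 5;
const TIME_WINDOW = 60_000;

const hits = new Map<string, { count: number; startTime: number }>();

// Returns true while the client is within quota, false once it is spent.
function checkLimit(ip: string, now: number): boolean {
  const entry = hits.get(ip);
  if (!entry || now - entry.startTime >= TIME_WINDOW) {
    hits.set(ip, { count: 1, startTime: now }); // start a new window
    return true;
  }
  if (entry.count >= RATE_LIMIT) return false; // quota exhausted
  entry.count++;
  return true;
}
```

Five calls within the window succeed; the sixth is rejected until the window resets.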
Using Redis for Rate Limiting
Redis is an open-source in-memory data store that can be used as a database, cache, or message broker.
Using Redis for rate limiting adds persistence and scalability to your system. Redis can store request counts across multiple servers, which is useful in distributed environments where you want to count requests consistently.
First, if you haven't done so already, install Redis on your system, then install the Redis client with the commands below:

```shell
npm install ioredis
npm install --save-dev @types/node
```
Set up the Redis client in your Strapi project. You’ll use Redis to track request counts.
```typescript
// src/middlewares/rate-limit.ts or src/api/article/middlewares/rate-limit.ts

import { RateLimit, Stores } from "koa2-ratelimit";
import Redis from "ioredis";

const redisClient = new Redis({
  host: "127.0.0.1",
  port: 6379,
  username: "YOUR_REDIS_USERNAME",
  password: "YOUR_REDIS_PASSWORD",
  db: 0,
});

// Configure RateLimit with the Redis store
RateLimit.defaultOptions({
  message: "Too many requests, please try again later.",
  store: new Stores.Redis({
    client: redisClient,
  }),
});

export default () => {
  return RateLimit.middleware({
    interval: { min: 15 },
    max: 100,
  });
};
```
This configuration defines the maximum number of requests a user is allowed to make within a given interval and uses Redis to store and count each user's requests. Redis is set up with the local address 127.0.0.1, port 6379, and database 0, so the request counts remain available even when the server restarts or when multiple instances are running.

Now, when a user's requests exceed the rate limit, an error is thrown and the user is blocked from the endpoint for the remainder of the interval.
Now, if you open the Redis CLI with the command `redis-cli` (or `redis-cli -h <host> -p <port>` if your Redis server runs on a different host or port), you can type `KEYS *` to list all the keys stored in your Redis database. This lets you see the keys your application uses to track rate limits.

Note: it is bad practice to run `KEYS *` against a production database with a large dataset, because it can block the server; prefer the non-blocking `SCAN` command instead. Also, check for the key before your configured interval expires; otherwise it will have been cleared and you won't see it.
Using Express Limiter with Strapi
Even though Strapi is built on top of Koa, you can still use Express tools like express-rate-limit by bridging the two frameworks. This is handy for developers coming from Express.js, who can reuse their experience with Express tooling.
To get started, you would install express-rate-limit with the command below:
```shell
npm install express-rate-limit
```
Then, configure the rate limiter to your desired specifications.
```typescript
// src/api/article/routes/article.ts

import { factories } from "@strapi/strapi";
import { rateLimit } from "express-rate-limit";
import { Context } from "koa";

// Define the rate limiter configuration
const limiter = rateLimit({
  windowMs: 3 * 60 * 1000, // 3 minutes
  max: 2, // limit each IP to 2 requests per windowMs
  handler: async () => {
    const ctx = (strapi as any).requestContext.get() as Context;
    ctx.status = 429;
    ctx.body = {
      message: "Too many requests",
      policy: "rate limit",
    };
    ctx.res.end();
  },
});

// Reusable middleware bridging the Express limiter into Koa
const rateLimitMiddleware = async (ctx: Context, next: () => Promise<void>) => {
  await new Promise<void>((resolve) => {
    // Cast req and res to Node's IncomingMessage and ServerResponse
    limiter(ctx.req as any, ctx.res as any, (error: any) => {
      if (error) {
        ctx.status = 429;
        ctx.body = { error: error.message };
      }
      // Resolve either way so Koa can still send the response we set
      resolve();
    });
  });
  if (ctx.status !== 429) {
    await next();
  }
};

// Apply the rate-limiting middleware to all routes
export default factories.createCoreRouter("api::article.article", {
  config: {
    find: {
      middlewares: [rateLimitMiddleware],
    },
    findOne: {
      middlewares: [rateLimitMiddleware],
    },
    create: {
      middlewares: [rateLimitMiddleware],
    },
    update: {
      middlewares: [rateLimitMiddleware],
    },
    delete: {
      middlewares: [rateLimitMiddleware],
    },
  },
});
```
The code above specifies how frequently a user is allowed to make requests within a given period of time.

The `windowMs` option defines the time period over which requests are measured; in our case, 3 minutes in milliseconds. The `max` option is the number of requests allowed in that time frame; we limited it to 2 requests per 3 minutes purely for the sake of this example. If requests exceed this number, the `handler` function is called, which returns a response with HTTP status code 429 and a message stating that the rate limit has been reached.
Using Cloudflare for Rate Limiting in Strapi
Let's move on to how you can use third-party services for rate limiting. One of the most popular is Cloudflare, which offers a comprehensive rate-limiting solution that can be easily integrated with Strapi.
Cloudflare Integration
To implement rate limiting in Strapi using Cloudflare, follow the steps below:
- Log in to your Cloudflare dashboard and select your account.
- Add your domain to Cloudflare if you haven't already.
- Select your domain from the dashboard and go to Security > WAF > Rate limiting rules.
- Click on the Create rule button.
- Enter a descriptive name for the rule.
- Choose the URI Path field from the Field drop-down list to match incoming requests.
- Set the characteristics that define the request counter, for example, any request path matching /api/articles.
- Specify the maximum number of requests and time period under When rate exceeds.
- Select an action (e.g., Block) under Then take action.
- Set a mitigation timeout duration for when the action will stop.
Click on Deploy to activate your ruleset immediately, or select Save as Draft if you are not ready.
With this configuration in place, you'll need to update your Strapi configuration to work with Cloudflare:
```typescript
// config/middlewares.ts
export default [
  // ... other middlewares
  "global::cloudflare-headers",
];
```

This assumes you have created a custom `cloudflare-headers` middleware in `src/middlewares`, for example one that reads Cloudflare's `CF-Connecting-IP` header so Strapi sees the real client IP.
Then update your Strapi security settings to include the Cloudflare headers:
```typescript
// config/security.ts
export default {
  cors: {
    enabled: true,
    origin: ["https://your-domain.com"],
    headers: [
      "Content-Type",
      "Authorization",
      "X-RateLimit-Limit",
      "X-RateLimit-Remaining",
    ],
  },
};
```
With this, you have successfully implemented rate-limiting in your Strapi project using Cloudflare.
Comparing Different Rate Limiting Approaches
Now that you’ve seen the various methods of implementing rate limiting in Strapi, here’s a breakdown of their pros and cons.
| Approach | Pros | Cons |
|---|---|---|
| Middleware | Provides global control over requests | Requires a more complex setup |
| Koa Packages | Simple to use and well-maintained | May lack options for advanced customization |
| Redis | Scalable and persists across instances | Requires Redis setup and ongoing management |
| Express Limiter | Familiar for Express.js developers | Needs a middleware bridge for Koa |
| Cloudflare | Comprehensive solution; easy to set up | Requires a Cloudflare account and configuration |
Best Practices for Rate Limiting in Strapi
To make the most of rate limiting in Strapi, follow these best practices:
- Set reasonable limits based on how your API is actually used.
- Use sliding windows for smoother control of traffic.
- Make sure proper error responses (for example, 429 Too Many Requests) are returned to users.
- Log rate-limit violations so you can assess possible abuse.
- Use different rate limits for authenticated and unauthenticated users.
- Combine rate limiting with other security mechanisms such as API keys and authentication.
- Document your rate limits for API users so they know how many requests they can make.
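As a sketch of the authenticated-versus-anonymous practice, a small helper can pick the rate-limit key and quota from the request context. The user shape and the limits below are assumptions for illustration; in Strapi, the authenticated user is typically found on `ctx.state.user`:

```typescript
// Hypothetical helper: authenticated users are tracked by user id with a
// higher quota; anonymous clients are tracked by IP with a stricter one.
type User = { id: number } | undefined;

function limitFor(user: User, ip: string): { key: string; max: number } {
  if (user) {
    return { key: `user:${user.id}`, max: 1000 }; // generous limit for known users
  }
  return { key: `ip:${ip}`, max: 100 }; // stricter limit for anonymous traffic
}
```

Inside a middleware you might call `limitFor(ctx.state.user, ctx.request.ip)` and feed the resulting `key` and `max` into whichever limiter you chose above.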
Conclusion
In this tutorial, we discussed different ways of rate limiting in Strapi v5, including middleware, Koa packages, Redis, and Express Limiter. All the approaches have their benefits and drawbacks, so it is crucial to evaluate your project’s needs to choose the right option.
Lastly, rate limiting is a vital component of API security: it helps prevent abuse and keeps every client's usage reasonable. Choose the approach that best suits your project's complexity and the degree of customization your organization requires, and take the time to implement it properly so your API stays reliable under load.
I am a Software Engineer and Technical Writer, proficient in server-side scripting and database setup, with working knowledge of Python, NodeJS, ReactJS, and PHP. When I am not coding, I share my knowledge and experience with other developers through technical articles.