Is there a way to rate limit connections to a site for certain user agent strings?
-
Hello,
I have a particular website that for the last 2+ days has been reaching max memory and restarting frequently, a dozen times a day. I've tried increasing the memory, which has helped of course, but that's only a temporary workaround. The issue started when (according to the logs) the site started receiving an onslaught of traffic from Facebook crawler bots, specifically their Meta-ExternalAgent/1.1 one.
What I'd like to do is rate limit (within Cloudron if possible) the requests from certain user agents, to maybe 10 a minute for example instead of several a second (which is what I'm currently seeing). If this is possible, I'd love to know.
I may be able to use a WordPress plugin to do that, but my thinking is it would still tie up Apache connections, which can still become saturated. In fact I tried to do this in .htaccess using something ChatGPT recommended, but that just slows down the data rate and doesn't really slow down the indexing from Facebook / Meta, so I suspect it will actually increase the connection saturation if each request takes a bit longer to respond to.
# BEGIN Meta-ExternalHit Throttling
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Detect Meta-ExternalHit user agent
  RewriteCond %{HTTP_USER_AGENT} "Meta-ExternalHit" [NC]
  # Set an env var if matched
  RewriteRule ^ - [E=IS_META_BOT:1]
</IfModule>
<IfModule mod_ratelimit.c>
  # Apply rate limit if Meta bot detected
  SetEnvIf IS_META_BOT 1 META_BOT
  <IfModule mod_filter.c>
    AddOutputFilterByType RATE_LIMIT text/html text/plain text/xml application/json application/xml image/jpeg image/png image/webp image/avif
  </IfModule>
  # Limit to ~50 KB/s (value is KB per second)
  SetEnvIf META_BOT 1 RATE_LIMIT 50
</IfModule>
# END Meta-ExternalHit Throttling

This is also leading to the health checks taking over 7000ms, which I see in the logs.
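What I really want is request-rate limiting rather than bandwidth throttling. As a sketch of the idea only (this is not something Cloudron exposes; the server name and upstream address below are placeholders), the equivalent in plain nginx would look something like this, using `limit_req` keyed on a `map` of the user agent:

```nginx
# Sketch: user-agent based request rate limiting (http{} context).
# "meta-external" matches Meta-ExternalAgent and Meta-ExternalHit;
# every other user agent maps to "" and is not rate limited at all,
# because empty limit_req_zone keys are not accounted.
map $http_user_agent $meta_bot {
    default           "";
    ~*meta-external   $binary_remote_addr;
}

# ~10 requests per minute per client IP, only for matching agents
limit_req_zone $meta_bot zone=metabots:10m rate=10r/m;

server {
    listen 80;
    server_name example.com;               # placeholder

    location / {
        limit_req zone=metabots burst=5 nodelay;
        limit_req_status 429;              # tell the bot to back off
        proxy_pass http://127.0.0.1:8000;  # placeholder upstream
    }
}
```

Unlike mod_ratelimit, this rejects excess requests with 429 instead of serving them slowly, so connections are freed immediately rather than held open longer.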
Thank you in advance for any advice.
-
And of course, Meta ignores your robots.txt? It's such a sh*t company… Just FYI, if you were in the EU and/or Germany:
-
You can legally prevent AI companies from using your website's data. The legal basis is the right to opt-out of Text and Data Mining (TDM) under Art. 4 of the EU Copyright Directive.
-
Germany has implemented the directive in Art. 44b UrhG: „Uses in accordance with subsection (2) sentence 1 are permitted only if they have not been reserved by the rightholder. A reservation of use in the case of works which are available online is effective only if it is made in a machine-readable format.“
-
Your objection must be machine-readable. A simple text disclaimer on your site (e.g., in the legal notice) is legally insufficient.
-
The standard method is to use the above-mentioned robots.txt.
-
A comprehensive, community-maintained list can be found at projects like ai.robots.txt on GitHub.
-
While major companies respect robots.txt, compliance is not guaranteed from all crawlers, but it is the recognized legal and technical standard for opting out.
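For example, a minimal machine-readable opt-out in robots.txt could look like this (the crawler tokens below are examples; check each vendor's published crawler documentation or the ai.robots.txt project for the current names):

```
# robots.txt — machine-readable TDM/AI opt-out (example tokens)
User-agent: meta-externalagent
Disallow: /

User-agent: meta-externalfetcher
Disallow: /

User-agent: GPTBot
Disallow: /
```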
-
@d19dotca Install your own WAF. We have been testing https://www.bunkerweb.io/ for almost a month. And it works.
-
@luckow said in Is there a way to rate limit connections to a site for certain user agent strings?:
@d19dotca Install your own WAF. We have been testing https://www.bunkerweb.io/ for almost a month. And it works.
Would that be interesting as a Cloudron service?
-
@luckow said in Is there a way to rate limit connections to a site for certain user agent strings?:
@d19dotca Install your own WAF. We have been testing https://www.bunkerweb.io/ for almost a month. And it works.
Sounds good. How?
-
@jdaviescoates
The good old traditional method: https://docs.bunkerweb.io/latest/integrations/#linux
Runs on a CX22 at https://www.hetzner.com/cloud/. BunkerWeb acts as a reverse proxy for a Cloudron app that is 'behind it'. Currently, we only use it in front of our own website (mainly because we are still learning, e.g. what happens when we block bots? Oh, there is no longer support for link previews in Rocket.Chat). In my next spare moment, I'll try out what happens when a complete Cloudron instance is behind BunkerWeb. It should work. From what I've heard, this is the case with Cloudflare, and BunkerWeb is similar (only self-hosted).
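As a rough sketch of what that looks like for a single proxied site in the Linux integration (setting names are taken from the BunkerWeb docs, but verify them against your version; all hostnames below are placeholders):

```ini
# /etc/bunkerweb/variables.env — minimal single-site sketch
SERVER_NAME=www.example.com
USE_REVERSE_PROXY=yes
REVERSE_PROXY_URL=/
REVERSE_PROXY_HOST=https://app.my-cloudron.example

# per-client request limiting, roughly the "10 a minute" asked about above
USE_LIMIT_REQ=yes
LIMIT_REQ_URL=/
LIMIT_REQ_RATE=10r/m
```

DNS for www.example.com then points at the BunkerWeb host instead of the Cloudron server, and BunkerWeb forwards filtered traffic on to the app.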

-
@d19dotca Think what you want of Cloudflare, but their caching is pretty good, plus they also hate AI bots and have specific options to block them: https://developers.cloudflare.com/ai-crawl-control/
-
Thank you all for the suggestions! Good ideas!
It turns out the bot traffic finally went back to normal levels overnight after my message to the forum, and the app has been stable ever since. But this definitely reminded me that getting a good WAF (or improving the robots.txt at a minimum) can be important and needs to be evaluated.
Hopefully Cloudron can integrate a simple WAF into the system directly in the future (maybe even using that BunkerWeb if possible).

-
@luckow said in Is there a way to rate limit connections to a site for certain user agent strings?:
Bunkerweb acts as a reverse proxy for a Cloudron app that is ‘behind it’. Currently, we only use it in front of our own website (mainly because we are still learning, e.g. what happens when we block bots? Oh, there is no longer support for previews in rocket.chat). In my next spare moment, I'll try out what happens when a complete Cloudron instance is behind Bunkerweb. It should work. From what I've heard, this is the case with Cloudflare, and Bunkerweb is similar (only self-hosted)

Hi @luckow, I'm really curious how it went with BunkerWeb in front of Cloudron?
I am moving domains from Cloudflare to deSEC but can't move them all, because I use the Cloudflare WAF for some Cloudron apps (geoblocking and/or IP whitelisting with DDNS/API at the app level). Since Cloudron doesn't have anything like a WAF (what a pity), could BunkerWeb be a workaround?
-
@imc67 I never found the time to delve deeper into the test system (a WAF, here BunkerWeb, in front of a dedicated Cloudron instance). In other words: I completely forgot my own challenge from months ago. Thanks for bringing that up again.

With BunkerWeb we're on the free tier. One thing missing in the free tier is reporting/monitoring over a longer period, so we have no direct insight into the numbers the WAF filters out. But from our experience with one app on Cloudron (our own website): no downtime, no stress, nothing. Everything as expected after some manual configuration.
-
From a marketing perspective: filtering out bots causes problems. No link previews in LinkedIn, Rocket.Chat, Signal... We solved that with allowlists for some User-Agents. But in the long run it feels wrong to rely on User-Agents alone: bad bots can simply adopt the "good" User-Agents, and in that case I don't think the WAF will catch them. We'll see.
-
Our website runs on Drupal. We added custom rules to forbid certain URL structures. What we learned: some editors use workflows that generate exactly those forbidden URL structures, so they asked the BunkerWeb administrators to adjust the rules so they could keep working.
-
Our first BunkerWeb update ended in disaster right away. The maintainers rolled back the update, we reverted to the previous version, and a few days later a new update was released that worked. The last two updates have gone through without problems.
-
Is the time investment worth it? I think so. We have so few good answers on alternatives to Cloudflare, and we need a solid free and open-source one. What we learned: when it works, it works. You have to learn some new terminology and technology, but it makes us stronger in decisions and better at consulting. Is it as good as Cloudflare? Maybe later, from my point of view. Don't forget that one important issue is DoS attacks, and that is not solved with BunkerWeb's free version on a Hetzner VPS.
Once I find time to dedicate myself to the test system (Cloudron instance behind BunkerWeb) again, I will post an update. Many thanks for the reminder.
-