Cloudron Forum

How to protect instances from Bots, Crawlers, Requests & Co.?

Discuss · 5 Posts · 3 Posters · 26 Views
p44 (translator) wrote, last edited by p44
#1

    Only recently I realized that about 20% of my Cloudron instances’ resources are being “given away” to various parties.

    Once upon a time, only search engines accessed sitemaps, but now web-facing instances – like WordPress – are constantly bombarded.

    At this point, I took action to block these requests, and where possible, I worked at the .htaccess level (again, WordPress).
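
A minimal .htaccess sketch of this kind of user-agent blocking (the bot names are just examples, and mod_rewrite is assumed to be available, as it usually is for WordPress):

```apache
# Return 403 Forbidden to requests whose User-Agent matches known aggressive crawlers
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12bot) [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

Note this only protects the one app whose document root holds the .htaccess file, which is exactly the per-app overhead the post is trying to avoid.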

    Nevertheless, I realized that it might be better to take a centralized approach and have a single point of control.

    I don’t want to be forced to use external applications (like Cloudflare).

    How could this aspect be improved on Cloudron?

    This consideration might also be useful for adding one or more feature requests to Cloudron, given how the web is evolving, to improve existing blocking features.

    In the meantime, I had thought of using Fail2Ban and setting rules to read the various logs of specific installed apps, and from there, setting limitations.

I’ve already read about all the limitations of Fail2Ban on Cloudron, but, for example, in WordPress I would block every request that 404s on pages like xyz.php. I would also block access from very aggressive bots such as AhrefsBot, Semrush, MJ12bot, and Sentibot.
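
A rough sketch of what such a Fail2Ban setup could look like (file names, log path, and thresholds are hypothetical; on Cloudron the app logs live inside containers, so the logpath would need adjusting):

```ini
; /etc/fail2ban/filter.d/wp-404.conf  (hypothetical filter name)
[Definition]
; Match combined-log lines where a *.php request returned 404
failregex = ^<HOST> -.*"(GET|POST) /[^"]*\.php[^"]*" 404

; /etc/fail2ban/jail.d/wp-404.conf  (hypothetical jail)
[wp-404]
enabled  = true
port     = http,https
filter   = wp-404
logpath  = /path/to/wordpress/access.log   ; adjust to the real log location
maxretry = 5
findtime = 600
bantime  = 3600
```

This bans an IP for an hour after five matching 404s within ten minutes; whether Fail2Ban can see and act on container traffic on Cloudron is exactly the limitation discussed elsewhere.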

    I’d be interested in understanding how you block anomalous requests centrally (not on individual apps).

    Thanks a lot for your patience.

timconsidine (App Dev) wrote
#2

Interesting questions. I shall be watching for answers - wish I had them.

      Indie app dev, scratching my itches, lover of Cloudron PaaS, communityapps.appx.uk

p44 (translator) wrote, last edited by p44
#3

@timconsidine Great, I’m glad I’m not alone. I also saw your posts in this discussion related to the specific problem of DDoS attacks.

I started approaching this problem by examining how VPS resources are “wasted” on a daily basis after migrating from bare metal to a VPS... At some peaks I even had connection timeouts on incoming port 25, and then slowly I saw what was going on... most of the accesses at that time weren’t “human”...

fbartels (App Dev) wrote
#4

If the bots comply with it, https://en.wikipedia.org/wiki/Robots.txt would be the tool you are looking for. This file can already be managed through the Cloudron UI.
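
For example, a robots.txt that turns away the crawlers named earlier in the thread (only compliant bots will honor it, and Crawl-delay is a non-standard extension that not all crawlers respect):

```text
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: *
Crawl-delay: 10
```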

When it comes to preventing bad actors, https://docs.crowdsec.net/ could be worthwhile to look into.

p44 (translator) wrote
#5

@fbartels Yes, robots.txt, .htaccess, all good... but it would be great to manage rules in a central (and simple) way, especially on Cloudron instances with multiple apps installed.

It seems a little bit complicated for my skills. I had a look at this post.

            Are you using Crowdsec?
