Cloudron Forum
AI on Cloudron

Discuss · tagged a.i
245 posts · 15 posters · 81.5k views
LoudLemur wrote (#241):

      StabilityMatrix

If you use a lot of inference software, one thing StabilityMatrix can do is keep all your models in a single shared folder, which saves storage and bandwidth.

      https://github.com/LykosAI/StabilityMatrix/releases/tag/v2.14.0
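For anyone curious how the shared-folder idea works: the usual trick is to keep one canonical models directory and symlink it into each tool's expected path, so each checkpoint lives on disk exactly once. A minimal sketch in Python (the folder layout and tool paths below are illustrative assumptions, not StabilityMatrix's actual scheme):

```python
# Sketch: one canonical models folder, symlinked into each tool's
# directory so a checkpoint is stored on disk exactly once.
from pathlib import Path

SHARED = Path.home() / "ai-models" / "checkpoints"  # assumed layout

def link_models(tool_model_dir: Path) -> None:
    """Symlink every shared checkpoint into a tool's model folder."""
    tool_model_dir.mkdir(parents=True, exist_ok=True)
    for model in SHARED.glob("*.safetensors"):
        link = tool_model_dir / model.name
        if not link.exists():
            link.symlink_to(model)  # one copy on disk, many consumers

# e.g. point ComfyUI at the shared folder (path is illustrative)
link_models(Path.home() / "ComfyUI" / "models" / "checkpoints")
```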

LoudLemur wrote (#242):

AI Search with Lepton

This gives you an AI language model that performs internet searches to answer your queries.

        An interesting feature that Cloudron might like is its one-click deployment tool.

        https://github.com/leptonai/search_with_lepton?tab=readme-ov-file
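The core pattern behind this kind of tool is simple: run a web search, pack the results into the prompt as numbered sources, and ask the model to answer with citations. A hedged sketch of that loop (the search backend and model call are stubbed out here; the real project wires them to an actual search API and LLM endpoint):

```python
# Sketch of search-grounded answering: search results become
# numbered sources the model is asked to cite.

def web_search(query: str) -> list[dict]:
    # Stub standing in for a real search API call.
    return [{"title": "Example result",
             "url": "https://example.com",
             "snippet": "..."}]

def build_prompt(query: str, results: list[dict]) -> str:
    sources = "\n".join(
        f"[{i + 1}] {r['title']} ({r['url']}): {r['snippet']}"
        for i, r in enumerate(results))
    return ("Answer using the numbered sources below, citing them "
            f"like [1].\n\nSources:\n{sources}\n\n"
            f"Question: {query}\nAnswer:")

query = "What is Cloudron?"
prompt = build_prompt(query, web_search(query))
print(prompt)  # this is what would be sent to the LLM endpoint
```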

jdaviescoates wrote (#243):

          Apparently this is one of the best open source models yet:

          https://qwenlm.github.io/blog/qwen3/

          I use Cloudron with Gandi & Hetzner

LoudLemur wrote (#244):

            LLM VRAM Calculator

If you would like to know which models your VRAM can handle, you can check with this online tool:

            https://apxml.com/tools/vram-calculator
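If you just want a rough number without the tool, the arithmetic is simple: weight memory is parameter count times bytes per weight, plus headroom for activations and the KV cache. A back-of-the-envelope sketch (the 20% overhead factor is a rough assumption of mine; the calculator above models context length and KV cache far more precisely):

```python
# Rough VRAM estimate: weights + ~20% headroom for activations
# and a modest KV cache. Illustrative only.

def estimate_vram_gib(params_billion: float, bits_per_weight: int = 16,
                      overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

# A 32B model: 4-bit quantized vs. full fp16.
print(f"32B @ 4-bit: ~{estimate_vram_gib(32, 4):.0f} GiB")   # ~18 GiB
print(f"32B @ fp16:  ~{estimate_vram_gib(32, 16):.0f} GiB")  # ~72 GiB
```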

LoudLemur wrote (#245), in reply to jdaviescoates:

> Apparently this is one of the best open source models yet:
> https://qwenlm.github.io/blog/qwen3/

              @jdaviescoates I just tried it.
              Qwen 3 is very good!
