
Cloudron Forum

Ollama + Claude Code making coding free.

Off-topic
11 Posts 4 Posters 224 Views 4 Watching
timconsidine (App Dev)
#2

    Wow !
    More to play with !

I use the Ollama desktop app with their cloud qwen3-coder 480B model. I'm hoping that connecting Claude Code to Ollama can make use of that. Will give it a shot.

    Indie app dev, scratching my itches, lover of Cloudron PaaS

nebulon (Staff)
#3

I had a go at that recently as well, on my local setup with a GPU. Ollama works well (also with OpenWebUI) for the models I can run; however, I never succeeded in getting anything out of Claude. If anyone succeeds, I would be very happy to hear the steps. Using qwen3-coder, Claude was endlessly spinning the GPU even on small tasks, eventually giving up without ever producing anything šŸ˜•
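For anyone hitting the same wall, a first debugging step is to confirm the model answers outside of Claude Code entirely. This is only a sketch, assuming a default Ollama install listening on port 11434 and a pulled `qwen3-coder` model:

```shell
# Sanity check: confirm the Ollama server and model respond at all,
# independent of Claude Code.
curl -s http://localhost:11434/api/tags          # list locally available models
curl -s http://localhost:11434/api/generate -d '{
  "model": "qwen3-coder",
  "prompt": "Write a one-line hello world in Python.",
  "stream": false
}'
```

If these return promptly but Claude Code still spins, the problem is more likely in the Claude Code integration (e.g. the very large prompts it sends) than in the model or GPU itself.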

joseph (Staff)
#4

        @nebulon which GPU?

nebulon (Staff)
#5

I got an Intel Arc Battlemage B580 to test local LLMs for real. Nvidia is quite trivial to play with, since one can get a VPS with an Nvidia card where GPU support is mostly set up and ready to use (it can get very expensive though!).

So it may very well be that this card is woefully underpowered for this use-case, though other similarly sized general-purpose models work just fine with Ollama. But maybe I'm missing something obvious with Claude Code here.


timconsidine (App Dev)
#6

            @nebulon said in Ollama + Claude Code making coding free.:

            VPS with some nvidia card

I use Koyeb for this kind of thing.
It's a good proposition, but I don't need it much, so I may discontinue. Others might find it helpful.

Generally I am happy with TRAE and the desktop Ollama client using one of their cloud models (to avoid pulling it locally).

But everyone raves about Claude Code, so I will check it out with Ollama.

My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?


robi
#7

              @timconsidine said in Ollama + Claude Code making coding free.:

My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?

              Yes, only the API needs a sub.

              Conscious tech
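As a concrete sketch of the no-subscription route: Claude Code reads standard environment variables for its endpoint and model, so it can be pointed at a local server that speaks the Anthropic Messages API (a translation proxy such as LiteLLM can sit between Claude Code and Ollama if the Ollama version in use does not expose such an endpoint itself). The port and model name below are assumptions for a default local install:

```shell
# Point Claude Code at a local Anthropic-compatible endpoint instead of
# Anthropic's cloud API (no subscription involved).
export ANTHROPIC_BASE_URL="http://localhost:11434"   # local server (assumed default Ollama port)
export ANTHROPIC_AUTH_TOKEN="ollama"                 # placeholder; a local server ignores it
export ANTHROPIC_MODEL="qwen3-coder"                 # any model already pulled into Ollama
claude
```

Only the hosted Anthropic API is metered; the `claude` CLI itself installs and runs without a plan when it talks to a local endpoint.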


timconsidine (App Dev)
#8

@robi cool!
I'm reluctant to switch my dev setup mid-project, but I will try it out on the next one.
Hoping to get Appflowy and ZeroNet out the door soon.


robi
#9

                  @timconsidine don't switch. Add another instance to test.

I would like to see AirLLM performance with medium-sized models for Ralph Wiggum usage.

timconsidine (App Dev)
#10

                    Got Claude Code working with Ollama šŸ‘
                    Thank you for the information @robi


robi
#11

                      You are welcome @timconsidine

Last night I experimented with having Grok manage an agent I installed in a LAMP container.

The setup was a custom PHP script that could call a custom Python script, which in turn could invoke an agent via API.

The idea is for it to be autonomous and manage itself.

I ran out of steam due to Grok limitations and complications with Google libraries.

A useful new Cloudron app could package all of this up with Ollama, AirLLM, and a local model, so it cannot run out of steam so easily.

Then just give it a goal and let it Ralph out.
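The "Ralph" pattern mentioned here is essentially a loop that re-feeds the same goal to the agent until it declares itself done, so progress survives individual failed runs. A minimal sketch, where the file names and the DONE marker are illustrative and `claude -p` is Claude Code's non-interactive print mode:

```shell
# Hypothetical "Ralph" loop: keep handing the agent the same goal
# until it writes a DONE marker into a status file.
until grep -q "DONE" status.txt 2>/dev/null; do
  claude -p "Work on the goal in GOAL.md. Append DONE to status.txt when finished."
done
```

Backed by a local model (so there are no rate limits to run out of), a loop like this can grind on a goal indefinitely.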
