Cloudron Forum


Ollama + Claude Code making coding free.

Off-topic | 7 Posts | 4 Posters | 41 Views
robi
#1

    š—¢š—¹š—¹š—®š—ŗš—® + š—–š—¹š—®š˜‚š—±š—² š—–š—¼š—±š—² / OpenCode

    š—”š—»š—± š—¶š˜ š—æš˜‚š—»š˜€ š—°š—¼š—ŗš—½š—¹š—²š˜š—²š—¹š˜† š—¹š—¼š—°š—®š—¹.

    No subscriptions. No limits. No data leaving your machine.

    Claude Code used to cost $3-$15 per million tokens.

    Now you can run it with free open-source models on your computer.

    January 16th, 2026: Ollama version 0.14.0 became compatible with Anthropic's messages API.

    That means Claude Code now works with any Ollama model.

    Zero ongoing costs. Your code never leaves your computer.
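The compatibility claim above can be smoke-tested directly, before involving Claude Code at all. This is a sketch: it assumes a local Ollama (0.14.0 or newer) listening on its default port 11434 and exposing an Anthropic-style `/v1/messages` route; the exact path and header requirements may differ by version.

```shell
# Send a minimal Anthropic-messages-format request to a running Ollama server.
# The API key is a placeholder; a local Ollama does not validate it.
curl -s http://localhost:11434/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: ollama" \
  -d '{
        "model": "qwen3-coder",
        "max_tokens": 128,
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```

If the endpoint is live, the reply comes back in Anthropic's message shape rather than Ollama's native `/api/chat` shape.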

    Here's the setup:

    Download Ollama. Run ollama pull qwen3-coder. Install Claude Code.

    Set two environment variables to connect them.

    Then run: claude --model qwen3-coder
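The steps above can be sketched as a single shell session. The variable names are the ones Claude Code documents for pointing it at a custom endpoint (`ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`); treat the exact values as assumptions to check against your installed versions.

```shell
# 1. Fetch a local coding model.
ollama pull qwen3-coder

# 2. Point Claude Code at the local Ollama server instead of Anthropic's API.
export ANTHROPIC_BASE_URL="http://localhost:11434"  # Ollama's default port
export ANTHROPIC_AUTH_TOKEN="ollama"                # any non-empty placeholder

# 3. Launch Claude Code with the local model.
claude --model qwen3-coder
```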

    Give it tasks in plain English:

    "Write a function to process images." "Debug why my API calls are failing."

    It reads your files. Makes changes. Tests the code. Shows you what it did.

Top models: Qwen3-Coder for everyday tasks; GPT-OSS 20B for complex projects.

If you have an older GPU or limited VRAM, use AirLLM to load only the needed layers and run bigger models on the same hardware!

    This is completely open and free.
    Explicitly supported by both tools.
    AnyLLM works too.

    Conscious tech

timconsidine (App Dev)
#2

Wow! More to play with!

I use Ollama desktop with their cloud qwen 3 coder 480b. I'm hoping that connecting Claude Code to Ollama can make use of that. Will give it a shot.

      Indie app dev, scratching my itches, lover of Cloudron PaaS

nebulon (Staff)
#3

I recently had a go at this on my local setup with a GPU. Ollama works well, also with OpenWebUI, for the models I can run, but I never managed to get anything out of Claude Code. If anyone succeeds, I would be very happy to hear the steps. Using qwen3-coder, Claude Code was endlessly spinning the GPU even on small tasks, eventually giving up without ever producing anything 😕

joseph (Staff)
#4

          @nebulon which GPU?

nebulon (Staff)
#5

I got an Intel Arc Battlemage B580 to test local LLMs for real. Nvidia is quite trivial to play with, since one can get a VPS with an Nvidia card where GPU support is mostly set up and ready to use (it can get very expensive though!).

So it may well be that this card is woefully underpowered for this use case, though other similarly sized general-purpose models work just fine with Ollama. But maybe I'm missing something obvious with Claude Code here.

timconsidine (App Dev)
#6

              @nebulon said in Ollama + Claude Code making coding free.:

              VPS with some nvidia card

I use Koyeb for this kind of thing.
It's a good proposition, but I don't need it much, so I may discontinue. Others might find it helpful.

Generally I am happy with my TRAE and desktop Ollama client using one of their cloud models (to avoid pulling it locally).

But everyone raves about Claude Code, so I will check it out with Ollama.

My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?


robi
#7

                @timconsidine said in Ollama + Claude Code making coding free.:

My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?

Yes, you misunderstood: only the API needs a sub.

