<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Ollama + Claude Code &#x2F; OpenCode making coding free.]]></title><description><![CDATA[<p dir="auto"><strong>𝗢𝗹𝗹𝗮𝗺𝗮 + 𝗖𝗹𝗮𝘂𝗱𝗲 𝗖𝗼𝗱𝗲 / OpenCode</strong></p>
<p dir="auto">𝗔𝗻𝗱 𝗶𝘁 𝗿𝘂𝗻𝘀 𝗰𝗼𝗺𝗽𝗹𝗲𝘁𝗲𝗹𝘆 𝗹𝗼𝗰𝗮𝗹.</p>
<p dir="auto">No subscriptions. No limits. No data leaving your machine.</p>
<p dir="auto">Claude Code used to cost $3-$15 per million tokens.</p>
<p dir="auto">Now you can run it with free open-source models on your computer.</p>
<p dir="auto">January 16th, 2026: Ollama version 0.14.0 became compatible with Anthropic's messages API.</p>
<p dir="auto">That means Claude Code now works with any Ollama model.</p>
<p dir="auto">Zero ongoing costs. Your code never leaves your computer.</p>
<p dir="auto">Here's the setup:</p>
<p dir="auto">Download Ollama. Run <code>ollama pull qwen3-coder</code>. Install Claude Code.</p>
<p dir="auto">Set two environment variables to connect them.</p>
<p dir="auto">Then run: <code>claude --model qwen3-coder</code></p>
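<p dir="auto">A minimal sketch of those two variables, assuming Claude Code's standard override variables and Ollama's default port (names and ports may differ in your versions):</p>

```shell
# Point Claude Code at the local Ollama server instead of Anthropic's API.
# ANTHROPIC_BASE_URL / ANTHROPIC_AUTH_TOKEN are Claude Code's standard
# overrides; http://localhost:11434 is Ollama's default listen address.
export ANTHROPIC_BASE_URL="http://localhost:11434"
# A local Ollama does not check the token value; it just has to be set.
export ANTHROPIC_AUTH_TOKEN="ollama"

# Then pull a model and start Claude Code against it (commented out here,
# since it assumes Ollama and Claude Code are already installed):
#   ollama pull qwen3-coder
#   claude --model qwen3-coder
```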
<p dir="auto">Give it tasks in plain English:</p>
<p dir="auto">"Write a function to process images." "Debug why my API calls are failing."</p>
<p dir="auto">It reads your files. Makes changes. Tests the code. Shows you what it did.</p>
<p dir="auto">Top models: Qwen 3-Coder for everyday tasks. GPT-OSS 20B for complex projects.</p>
<p dir="auto">Have an older GPU or limited VRAM? Use AirLLM to load only the layers needed at each step and run bigger models on the same hardware!</p>
<p dir="auto">This is completely open and free.<br />
Explicitly supported by both tools.<br />
AnyLLM works too.</p>
]]></description><link>https://forum.cloudron.io/topic/14937/ollama-claude-code-opencode-making-coding-free.</link><generator>RSS for Node</generator><lastBuildDate>Fri, 17 Apr 2026 18:35:40 GMT</lastBuildDate><atom:link href="https://forum.cloudron.io/topic/14937.rss" rel="self" type="application/rss+xml"/><pubDate>Fri, 23 Jan 2026 06:46:36 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Sun, 22 Mar 2026 03:01:26 GMT]]></title><description><![CDATA[<p dir="auto">New update making it even simpler to code for free.<br />
<a href="https://forum.cloudron.io/topic/15277/ai-devops-opencode-make-coding-free-with-zen-models-no-claude-openai-google">https://forum.cloudron.io/topic/15277/ai-devops-opencode-make-coding-free-with-zen-models-no-claude-openai-google</a></p>
]]></description><link>https://forum.cloudron.io/post/122310</link><guid isPermaLink="true">https://forum.cloudron.io/post/122310</guid><dc:creator><![CDATA[robi]]></dc:creator><pubDate>Sun, 22 Mar 2026 03:01:26 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Sun, 25 Jan 2026 03:07:59 GMT]]></title><description><![CDATA[<p dir="auto">not a fan of claude code, closed source and buggy</p>
<p dir="auto">Loving <a href="http://opencode.ai" target="_blank" rel="noopener noreferrer nofollow ugc">opencode.ai</a>: the TUI and desktop app both sync via its client/server setup, and it's easy enough to have many instances running locally and on VPSs.</p>
]]></description><link>https://forum.cloudron.io/post/119110</link><guid isPermaLink="true">https://forum.cloudron.io/post/119110</guid><dc:creator><![CDATA[marcusquinn]]></dc:creator><pubDate>Sun, 25 Jan 2026 03:07:59 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Sat, 24 Jan 2026 18:15:10 GMT]]></title><description><![CDATA[<p dir="auto">You are welcome <a class="plugin-mentions-user plugin-mentions-a" href="/user/timconsidine" aria-label="Profile: timconsidine">@<bdi>timconsidine</bdi></a></p>
<p dir="auto">Last night I experimented with having Grok manage an agent I installed in a LAMP container.</p>
<p dir="auto">The setup was a custom PHP script that could run a custom Python script, which in turn could invoke an agent via API.</p>
<p dir="auto">The idea is to have it run autonomously and manage itself.</p>
<p dir="auto">I ran out of steam due to Grok limitations and complications with Google libraries.</p>
<p dir="auto">A useful new Cloudron app could be all of this packaged as a custom app with Ollama, AirLLM, and a local model, so it cannot run out of steam so easily.</p>
<p dir="auto">Then just give it a goal... and let it Ralph out.</p>
]]></description><link>https://forum.cloudron.io/post/119096</link><guid isPermaLink="true">https://forum.cloudron.io/post/119096</guid><dc:creator><![CDATA[robi]]></dc:creator><pubDate>Sat, 24 Jan 2026 18:15:10 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Sat, 24 Jan 2026 17:41:55 GMT]]></title><description><![CDATA[<p dir="auto">Got Claude Code working with Ollama <img src="https://forum.cloudron.io/assets/plugins/nodebb-plugin-emoji/emoji/android/1f44d.png?v=665e13d50c8" class="not-responsive emoji emoji-android emoji--+1" style="height:23px;width:auto;vertical-align:middle" title=":+1:" alt="👍" /><br />
Thank you for the information <a class="plugin-mentions-user plugin-mentions-a" href="/user/robi" aria-label="Profile: robi">@<bdi>robi</bdi></a></p>
]]></description><link>https://forum.cloudron.io/post/119093</link><guid isPermaLink="true">https://forum.cloudron.io/post/119093</guid><dc:creator><![CDATA[timconsidine]]></dc:creator><pubDate>Sat, 24 Jan 2026 17:41:55 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 21:01:06 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/timconsidine" aria-label="Profile: timconsidine">@<bdi>timconsidine</bdi></a> don't switch. Add another instance to test.</p>
<p dir="auto">I would like to see AirLLM performance with medium-size models for Ralph Wiggum usage.</p>
]]></description><link>https://forum.cloudron.io/post/119056</link><guid isPermaLink="true">https://forum.cloudron.io/post/119056</guid><dc:creator><![CDATA[robi]]></dc:creator><pubDate>Fri, 23 Jan 2026 21:01:06 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 16:25:58 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/robi" aria-label="Profile: robi">@<bdi>robi</bdi></a> cool !<br />
Reluctant to switch dev setup mid-project but next one will try it out.<br />
Hoping to get Appflowy and ZeroNet out the door soon.</p>
]]></description><link>https://forum.cloudron.io/post/119046</link><guid isPermaLink="true">https://forum.cloudron.io/post/119046</guid><dc:creator><![CDATA[timconsidine]]></dc:creator><pubDate>Fri, 23 Jan 2026 16:25:58 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 15:49:28 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/timconsidine" aria-label="Profile: timconsidine">@<bdi>timconsidine</bdi></a> said in <a href="/post/119042">Ollama + Claude Code making coding free.</a>:</p>
<blockquote>
<p dir="auto">My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?</p>
</blockquote>
<p dir="auto">Yes, you misunderstood: only Anthropic's API needs a subscription. The Claude Code tool itself is free, and with Ollama you skip the API entirely.</p>
]]></description><link>https://forum.cloudron.io/post/119043</link><guid isPermaLink="true">https://forum.cloudron.io/post/119043</guid><dc:creator><![CDATA[robi]]></dc:creator><pubDate>Fri, 23 Jan 2026 15:49:28 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 15:24:32 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/nebulon" aria-label="Profile: nebulon">@<bdi>nebulon</bdi></a> said in <a href="/post/119039">Ollama + Claude Code making coding free.</a>:</p>
<blockquote>
<p dir="auto">VPS with some nvidia card</p>
</blockquote>
<p dir="auto">I use Koyeb for this kind of thing.<br />
It's a good proposition, but I'm not using it much, so I may discontinue. Others might find it helpful.</p>
<p dir="auto">Generally I am happy with my TRAE and desktop Ollama client using one of their cloud models (to avoid pulling it locally).</p>
<p dir="auto">But everyone raves about Claude Code, so I will check it out with Ollama.</p>
<p dir="auto">My quick look suggested you still need a subscription for Claude Code even if you hook up an Ollama model. Did I misunderstand this?</p>
]]></description><link>https://forum.cloudron.io/post/119042</link><guid isPermaLink="true">https://forum.cloudron.io/post/119042</guid><dc:creator><![CDATA[timconsidine]]></dc:creator><pubDate>Fri, 23 Jan 2026 15:24:32 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 14:09:32 GMT]]></title><description><![CDATA[<p dir="auto">I got an Intel Arc Battlemage B580 to test local LLMs for real. Nvidia is quite trivial to play with, since one can get a VPS with an Nvidia card where things are mostly set up and ready to use regarding GPU support (it can get very expensive, though!)</p>
<p dir="auto">So it may well be that this card is woefully underpowered for this use case, though other similarly sized general-purpose models work just fine with Ollama. But maybe I am missing something obvious with Claude Code here.</p>
]]></description><link>https://forum.cloudron.io/post/119039</link><guid isPermaLink="true">https://forum.cloudron.io/post/119039</guid><dc:creator><![CDATA[nebulon]]></dc:creator><pubDate>Fri, 23 Jan 2026 14:09:32 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 11:21:45 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/nebulon" aria-label="Profile: nebulon">@<bdi>nebulon</bdi></a> which GPU?</p>
]]></description><link>https://forum.cloudron.io/post/119031</link><guid isPermaLink="true">https://forum.cloudron.io/post/119031</guid><dc:creator><![CDATA[joseph]]></dc:creator><pubDate>Fri, 23 Jan 2026 11:21:45 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 10:11:33 GMT]]></title><description><![CDATA[<p dir="auto">I had a go at this lately as well, on my local setup with a GPU. Ollama works well, also with OpenWebUI, for the models I can run; however, I never managed to get anything out of Claude. If anyone succeeds, I would be very happy to hear the steps. Using qwen3-coder, Claude was endlessly spinning the GPU even on small tasks, eventually giving up without ever producing anything <img src="https://forum.cloudron.io/assets/plugins/nodebb-plugin-emoji/emoji/android/1f615.png?v=665e13d50c8" class="not-responsive emoji emoji-android emoji--confused" style="height:23px;width:auto;vertical-align:middle" title=":/" alt="😕" /></p>
]]></description><link>https://forum.cloudron.io/post/119028</link><guid isPermaLink="true">https://forum.cloudron.io/post/119028</guid><dc:creator><![CDATA[nebulon]]></dc:creator><pubDate>Fri, 23 Jan 2026 10:11:33 GMT</pubDate></item><item><title><![CDATA[Reply to Ollama + Claude Code &#x2F; OpenCode making coding free. on Fri, 23 Jan 2026 08:35:36 GMT]]></title><description><![CDATA[<p dir="auto">Wow !<br />
More to play with !</p>
<p dir="auto">I use Ollama desktop with their cloud qwen3-coder 480b. I’m hoping that connecting Claude Code to Ollama can make use of that. Will give it a shot.</p>
]]></description><link>https://forum.cloudron.io/post/119014</link><guid isPermaLink="true">https://forum.cloudron.io/post/119014</guid><dc:creator><![CDATA[timconsidine]]></dc:creator><pubDate>Fri, 23 Jan 2026 08:35:36 GMT</pubDate></item></channel></rss>