# Is local Ollama installation updatable by Cloudron?

Hi, as discussed [here](https://forum.cloudron.io/topic/14471/ollama-is-now-available/13) and [here](https://forum.cloudron.io/topic/12401/eta-for-gpu-support-could-we-contribute-to-help-it-along/8): using a local GPU (if available) with Ollama is critical for performance.
<p dir="auto">Therefore: Is it possible to install Ollama locally on the server (outside of Cloudron for direct GPU Support) and then install the Ollama Cloudron package to continuously update Ollama? Or is there any other way to automatically keep Ollama up to date?</p>
<p dir="auto">I love the professional level Cloudron reached! <img src="https://forum.cloudron.io/assets/plugins/nodebb-plugin-emoji/emoji/android/1f642.png?v=d2038e51828" class="not-responsive emoji emoji-android emoji--slightly_smiling_face" style="height:23px;width:auto;vertical-align:middle" title=":)" alt="🙂" /></p>