mononym
Posts
-
Lago
Open-source metering and usage-based billing -
Upgrade to umami 2.10.0 failed
Thanks. That worked. I increased the memory from the initial 2GB to 4GB.
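As a side note, when raising the app's memory limit is not an option, the V8 heap limit can sometimes be raised directly for the build. A minimal sketch, assuming the build is driven by yarn as in the logs in this thread; the 4096 value is an assumption matching the 4GB limit above:

```shell
# Sketch only: allow the Node build process a heap of up to 4 GB (value is an assumption).
export NODE_OPTIONS="--max-old-space-size=4096"
# yarn build   # then re-run the failing build command
echo "NODE_OPTIONS=$NODE_OPTIONS"
```

`NODE_OPTIONS` is read by every node process the build spawns, which matters here because the crash happens in a child process (`build-app`) rather than in yarn itself.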
Strange that this happens to the instance with a tracker on only one website; my other instance, with trackers on four websites, had no issues. Then again, counting distinct URLs/pages tracked, the first instance has far more individual pages to track. -
Upgrade to umami 2.10.0 failed
Hello. I have two Umami instances and one failed after the update.
Feb 29 09:27:02 error Command failed with exit code 134.
Feb 29 09:27:02 11: 0xf147a0 v8::internal::Factory::AllocateRaw(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 12: 0xf0bd6c v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawArray(int, v8::internal::AllocationType) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 13: 0xf0bee5 v8::internal::FactoryBase<v8::internal::Factory>::NewFixedArrayWithFiller(v8::internal::Handle<v8::internal::Map>, int, v8::internal::Handle<v8::internal::Oddball>, v8::internal::AllocationType) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 14: 0x11c6f4e v8::internal::MaybeHandle<v8::internal::OrderedHashMap> v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, 2>::Allocate<v8::internal::Isolate>(v8::internal::Isolate*, int, v8::internal::AllocationType) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 15: 0x11c7003 v8::internal::MaybeHandle<v8::internal::OrderedHashMap> v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, 2>::Rehash<v8::internal::Isolate>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::OrderedHashMap>, int) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 16: 0x12d134d v8::internal::Runtime_MapGrow(int, unsigned long*, v8::internal::Isolate*) [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 17: 0x170a079 [/usr/local/node-18.18.0/bin/node]
Feb 29 09:27:02 Aborted (core dumped)
Feb 29 09:27:03 error Command failed with exit code 1.
Feb 29 09:27:03 info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Feb 29 09:27:03 info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Feb 29 09:27:03 ERROR: "build-app" exited with 134.
Feb 29 09:27:05==> Changing ownership Feb 29 09:27:05=> Running build script that generates the migrations Feb 29 09:27:05CREATE EXTENSION Feb 29 09:27:05NOTICE: extension "pgcrypto" already exists, skipping Feb 29 09:27:06yarn run v1.22.19 Feb 29 09:27:06$ npm-run-all check-env build-db check-db build-tracker build-geo build-app Feb 29 09:27:06warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:06warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:06warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:07$ node scripts/check-env.js Feb 29 09:27:07$ npm-run-all copy-db-files build-db-client Feb 29 09:27:07warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:07warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:07warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:07warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:07warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:07warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:08$ node scripts/copy-db-files.js Feb 29 09:27:08warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:08warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:08warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. 
Feb 29 09:27:08Copied /app/code/db/postgresql to /app/code/prisma Feb 29 09:27:08Database type detected: postgresql Feb 29 09:27:09$ prisma generate Feb 29 09:27:09warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:09warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:09warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:10=> Healtheck error: Error: connect ECONNREFUSED 172.18.17.25:3000 Feb 29 09:27:11Prisma schema loaded from prisma/schema.prisma Feb 29 09:27:12 Feb 29 09:27:12 Feb 29 09:27:12 Feb 29 09:27:12 Feb 29 09:27:12 Feb 29 09:27:12``` Feb 29 09:27:12``` Feb 29 09:27:12``` Feb 29 09:27:12``` Feb 29 09:27:12const prisma = new PrismaClient() Feb 29 09:27:12const prisma = new PrismaClient() Feb 29 09:27:12import { PrismaClient } from '@prisma/client' Feb 29 09:27:12import { PrismaClient } from '@prisma/client/edge' Feb 29 09:27:12┌─────────────────────────────────────────────────────────────┐ Feb 29 09:27:12│ https://pris.ly/cli/accelerate │ Feb 29 09:27:12│ Deploying your app to serverless or edge functions? │ Feb 29 09:27:12│ Try Prisma Accelerate for connection pooling and caching. │ Feb 29 09:27:12See other ways of importing Prisma Client: http://pris.ly/d/importing-client Feb 29 09:27:12Start using Prisma Client in Node.js (See: https://pris.ly/d/client) Feb 29 09:27:12or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate) Feb 29 09:27:12└─────────────────────────────────────────────────────────────┘ Feb 29 09:27:12✔ Generated Prisma Client (v5.9.1) to ./node_modules/@prisma/client in 379ms Feb 29 09:27:13$ node scripts/check-db.js Feb 29 09:27:13warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:13warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". 
Feb 29 09:27:13warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:13✓ DATABASE_URL is defined. Feb 29 09:27:13✓ Database connection successful. Feb 29 09:27:13✓ Database version check successful. Feb 29 09:27:16 Feb 29 09:27:16 Feb 29 09:27:16 Feb 29 09:27:16 Feb 29 09:27:16$ rollup -c rollup.tracker.config.mjs Feb 29 09:27:16warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:16warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:16warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:16✓ Database is up to date. Feb 29 09:27:164 migrations found in prisma/migrations Feb 29 09:27:16Datasource "db": PostgreSQL database "dbe501b1e859ba491baf25a3def0b83dfe", schema "public" at "postgresql" Feb 29 09:27:16No pending migrations to apply. Feb 29 09:27:16Prisma schema loaded from prisma/schema.prisma Feb 29 09:27:17src/tracker/index.js → public/script.js... Feb 29 09:27:17created public/script.js in 583ms Feb 29 09:27:17 Feb 29 09:27:18$ next build Feb 29 09:27:18$ node scripts/build-geo.js Feb 29 09:27:18warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:18warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:18warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:18warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:18warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:18warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:18Vercel environment detected. Skipping geo setup. 
Feb 29 09:27:20 Feb 29 09:27:20 ▲ Next.js 14.1.0 Feb 29 09:27:20 Creating an optimized production build ... Feb 29 09:27:20=> Healtheck error: Error: connect ECONNREFUSED 172.18.17.25:3000 Feb 29 09:27:31box:taskworker Starting task 4362. Logs are at /home/yellowtent/platformdata/logs/e501b1e8-59ba-491b-af25-a3def0b83dfe/apptask.log Feb 29 09:27:31box:apptask run: startTask installationState: pending_restart runState: running Feb 29 09:27:31box:tasks update 4362: {"percent":10,"message":"Starting app services"} Feb 29 09:27:31box:tasks update 4362: {"percent":20,"message":"Restarting container"} Feb 29 09:27:42==> Changing ownership Feb 29 09:27:42box:tasks update 4362: {"percent":80,"message":"Configuring reverse proxy"} Feb 29 09:27:42box:reverseproxy providerMatchesSync: subject=CN = clicks.eupalinos.eu domain=clicks.eupalinos.eu issuer=C = US, O = Let's Encrypt, CN = R3 wildcard=false/false prod=true/true issuerMismatch=false wildcardMismatch=false match=true Feb 29 09:27:42box:reverseproxy expiryDate: subject=CN = clicks.eupalinos.eu notBefore=Feb 16 09:55:44 2024 GMT notAfter=May 16 09:55:43 2024 GMT daysLeft=77.06111457175926 Feb 29 09:27:42box:reverseproxy needsRenewal: false. 
force: false Feb 29 09:27:42box:reverseproxy ensureCertificate: clicks.eupalinos.eu acme cert exists and is up to date Feb 29 09:27:42NOTICE: extension "pgcrypto" already exists, skipping Feb 29 09:27:42CREATE EXTENSION Feb 29 09:27:42=> Running build script that generates the migrations Feb 29 09:27:42box:reverseproxy writeAppLocationNginxConfig: writing config for "clicks.eupalinos.eu" to /home/yellowtent/platformdata/nginx/applications/e501b1e8-59ba-491b-af25-a3def0b83dfe/clicks.eupalinos.eu.conf with options {"sourceDir":"/home/yellowtent/box","vhost":"clicks.eupalinos.eu","hasIPv6":true,"ip":"172.18.17.25","port":3000,"endpoint":"app","redirectTo":null,"certFilePath":"/home/yellowtent/platformdata/nginx/cert/clicks.eupalinos.eu.cert","keyFilePath":"/home/yellowtent/platformdata/nginx/cert/clicks.eupalinos.eu.key","robotsTxtQuoted":"\"# Disable search engine indexing\\n\\nUser-agent: *\\nDisallow: /\"","cspQuoted":null,"hideHeaders":[],"proxyAuth":{"enabled":false,"id":"e501b1e8-59ba-491b-af25-a3def0b83dfe","location":"/"},"upstreamUri":"","ocsp":true,"hstsPreload":false} Feb 29 09:27:42box:shell reload spawn: /usr/bin/sudo -S /home/yellowtent/box/src/scripts/restartservice.sh nginx Feb 29 09:27:42box:shell reload (stderr): sudo: unable to resolve host 1001507-513: Name or service not known Feb 29 09:27:43yarn run v1.22.19 Feb 29 09:27:43box:tasks update 4362: {"percent":100,"message":"Done"} Feb 29 09:27:43box:taskworker Task took 11.805 seconds Feb 29 09:27:43box:tasks setCompleted - 4362: {"result":null,"error":null} Feb 29 09:27:43box:tasks update 4362: {"percent":100,"result":null,"error":null} Feb 29 09:27:43warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:43warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". 
Feb 29 09:27:43$ npm-run-all check-env build-db check-db build-tracker build-geo build-app Feb 29 09:27:43warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:44warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:44warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:44$ node scripts/check-env.js Feb 29 09:27:44warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:44warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:44warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:44$ npm-run-all copy-db-files build-db-client Feb 29 09:27:44warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:45warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:45warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:45$ node scripts/copy-db-files.js Feb 29 09:27:45warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:45Database type detected: postgresql Feb 29 09:27:45Copied /app/code/db/postgresql to /app/code/prisma Feb 29 09:27:46warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:46warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:46$ prisma generate Feb 29 09:27:46warning Cannot find a suitable global folder. 
Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:47Prisma schema loaded from prisma/schema.prisma Feb 29 09:27:49 Feb 29 09:27:49✔ Generated Prisma Client (v5.9.1) to ./node_modules/@prisma/client in 482ms Feb 29 09:27:49 Feb 29 09:27:49Start using Prisma Client in Node.js (See: https://pris.ly/d/client) Feb 29 09:27:49``` Feb 29 09:27:49import { PrismaClient } from '@prisma/client' Feb 29 09:27:49const prisma = new PrismaClient() Feb 29 09:27:49``` Feb 29 09:27:49or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate) Feb 29 09:27:49``` Feb 29 09:27:49import { PrismaClient } from '@prisma/client/edge' Feb 29 09:27:49const prisma = new PrismaClient() Feb 29 09:27:49``` Feb 29 09:27:49 Feb 29 09:27:49See other ways of importing Prisma Client: http://pris.ly/d/importing-client Feb 29 09:27:49 Feb 29 09:27:49┌─────────────────────────────────────────────────────────────┐ Feb 29 09:27:49│ Deploying your app to serverless or edge functions? │ Feb 29 09:27:49│ Try Prisma Accelerate for connection pooling and caching. │ Feb 29 09:27:49│ https://pris.ly/cli/accelerate │ Feb 29 09:27:49└─────────────────────────────────────────────────────────────┘ Feb 29 09:27:49 Feb 29 09:27:49warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:49warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:49$ node scripts/check-db.js Feb 29 09:27:49warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:49✓ DATABASE_URL is defined. Feb 29 09:27:50=> Healtheck error: Error: connect ECONNREFUSED 172.18.17.25:3000 Feb 29 09:27:50✓ Database connection successful. Feb 29 09:27:50✓ Database version check successful. 
Feb 29 09:27:52Prisma schema loaded from prisma/schema.prisma Feb 29 09:27:52Datasource "db": PostgreSQL database "dbe501b1e859ba491baf25a3def0b83dfe", schema "public" at "postgresql" Feb 29 09:27:52 Feb 29 09:27:524 migrations found in prisma/migrations Feb 29 09:27:52 Feb 29 09:27:52 Feb 29 09:27:52No pending migrations to apply. Feb 29 09:27:52 Feb 29 09:27:52✓ Database is up to date. Feb 29 09:27:53warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:53warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:53$ rollup -c rollup.tracker.config.mjs Feb 29 09:27:53warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:53 Feb 29 09:27:53src/tracker/index.js → public/script.js... Feb 29 09:27:54created public/script.js in 559ms Feb 29 09:27:54warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:54warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:54$ node scripts/build-geo.js Feb 29 09:27:54warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:54Vercel environment detected. Skipping geo setup. Feb 29 09:27:55warning Skipping preferred cache folder "/usr/local/share/.cache/yarn" because it is not writable. Feb 29 09:27:55warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-0". Feb 29 09:27:55$ next build Feb 29 09:27:55warning Cannot find a suitable global folder. Tried these: "/usr/local, /usr/local/share/.yarn" Feb 29 09:27:56 ▲ Next.js 14.1.0 Feb 29 09:27:56 Feb 29 09:27:56 Creating an optimized production build ... Feb 29 09:28:00=> Healtheck error: Error: connect ECONNREFUSED 172.18.17.25:3000
-
Pinry
Pinry is potentially an alternative to https://www.are.na/. I don't know whether it matches up on all the points, though.
Arena is a social network with the particularity that blocks (posts) are organized inside channels (lists), and that channels can be part of other channels. These connections establish meaningful links between the content blocks. Cleverly done. -
Element Starter Core -
Twill CMS
A simple CMS that a talented web designer brought to my attention:
https://twillcms.com/
https://github.com/area17/twill -
mirror application: one app, two domains
Seems impossible:
https://meta.discourse.org/t/how-can-i-use-multiple-hostnames/136306
Maybe it would be nice to show this in the App Store, like with badges or something... We chose Discourse over NodeBB for no apparent reason. Maybe it would have made a difference.
-
mirror application: one app, two domains
Hello.
Our community bumped into a stupid issue. We run an independent Discourse instance to federate users from different academic institutions working in the same field. One of those stuffy institutions has decided to block access to our site from their network. Why, you ask? Most probably because their system doesn't like our fancy TLD. Emails still go through, funnily enough.
I'm an optimist, so I think that this roadblock will eventually be cleared (after all, we're dealing with bureaucrats here), but experience tells me it might take an eternity. In the meantime, I've come up with a brilliant plan to keep things running smoothly for everyone.
For now, I'm trying to switch the forum to a more conventional domain name and redirect the nice URL to the boring one. But ideally, what I really want is for the entire Discourse app to be mirrored – identical twins, if you will – each living under a separate domain name. That way, users from Snore Institution can stick to the boring URL while the rest of us enjoy the platform under its original identity. Is this possible by any chance? Maybe by sharing the same storage volume between two different apps?
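For the interim redirect (not the full mirror), a plain web-server rule outside Cloudron would look roughly like this. `forum-nice.example` and `forum-boring.example` are placeholder domains, and Cloudron normally manages nginx itself, so treat this only as a sketch of the idea:

```nginx
# Hypothetical: send every request for the fancy domain to the conventional one.
server {
    listen 443 ssl;
    server_name forum-nice.example;                       # placeholder: the blocked/fancy domain
    # ssl_certificate / ssl_certificate_key for forum-nice.example go here
    return 301 https://forum-boring.example$request_uri;  # placeholder: the boring target domain
}
```

The `$request_uri` part preserves the path and query string, so deep links to topics keep working after the redirect.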
Thanks for your help!
-
NC Assistant + LLM python venv issue
I don't fully understand the question, but there's a possible alternative setup that skips the LLM integration:
- Nextcloud Assistant
- OpenAI/LocalAI integration
- LocalAI server (Cloudron wishlist?)
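The first two pieces of that stack can be installed from the app's terminal with Nextcloud's occ tool. A sketch only: the occ path comes from the logs further down in this thread (`/app/code/occ`), while the `www-data` user and the app ids (`assistant`, `integration_openai` as listed on apps.nextcloud.com) are assumptions:

```shell
# Sketch: small wrapper around Nextcloud's occ command-line tool.
# Path taken from the logs in this thread; user is an assumption.
occ() { sudo -u www-data php /app/code/occ "$@"; }
# occ app:install assistant            # Nextcloud Assistant UI
# occ app:install integration_openai   # OpenAI-compatible backend; can point at a LocalAI server
type occ >/dev/null
```

The integration app talks to any OpenAI-compatible endpoint, which is why a self-hosted LocalAI server can stand in for the cloud API.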
-
NC Assistant + LLM python venv issue
Hello.
I tried to install the Nextcloud Assistant app together with the LLM integration. It requires access to some Python environment which seems not to be available, or at least the app cannot set the environment up. As the interface does not give away much, I ran
occ maintenance:repair
where I saw the python issue:
# occ maintenance:repair ... - Install dependencies for llm app {"reqId":"dL8d7Onb88EjTGpwg1Qd","level":3,"time":"2024-01-23T21:06:35+00:00","remoteAddr":"","user":"--","app":"llm","method":"","url":"--","message":"Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3","userAgent":"--","version":"27.1.4.1","data":{"app":"llm"}} {"reqId":"dL8d7Onb88EjTGpwg1Qd","level":3,"time":"2024-01-23T21:06:35+00:00","remoteAddr":"","user":"--","app":"llm","method":"","url":"--","message":"Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3","userAgent":"--","version":"27.1.4.1","exception":{"Exception":"Exception","Message":"Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. 
After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3","Code":0,"Trace":[{"file":"/app/data/apps/llm/lib/Migration/InstallDeps.php","line":28,"function":"runPoetryInstall","class":"OCA\\Llm\\Migration\\InstallDeps","type":"->"},{"file":"/app/code/lib/private/Repair.php","line":127,"function":"run","class":"OCA\\Llm\\Migration\\InstallDeps","type":"->"},{"file":"/app/code/core/Command/Maintenance/Repair.php","line":123,"function":"run","class":"OC\\Repair","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Command/Command.php","line":298,"function":"execute","class":"OC\\Core\\Command\\Maintenance\\Repair","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":1040,"function":"run","class":"Symfony\\Component\\Console\\Command\\Command","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":301,"function":"doRunCommand","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":171,"function":"doRun","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/lib/private/Console/Application.php","line":211,"function":"run","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/console.php","line":100,"function":"run","class":"OC\\Console\\Application","type":"->"},{"file":"/app/code/occ","line":11,"args":["/app/code/console.php"],"function":"require_once"}],"File":"/app/data/apps/llm/lib/Migration/InstallDeps.php","Line":43,"message":"Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. 
After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3","exception":{},"CustomMessage":"Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3"}} {"reqId":"dL8d7Onb88EjTGpwg1Qd","level":3,"time":"2024-01-23T21:06:35+00:00","remoteAddr":"","user":"--","app":"no app in context","method":"","url":"--","message":"Exception while executing repair step Install dependencies for llm app","userAgent":"--","version":"27.1.4.1","exception":{"Exception":"Exception","Message":"Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not\navailable. On Debian/Ubuntu systems, you need to install the python3-venv\npackage using the following command.\n\n apt install python3.10-venv\n\nYou may need to use sudo with that command. 
After installing the python3-venv\npackage, recreate your virtual environment.\n\nFailing command: /app/data/apps/llm/python/bin/python3","Code":0,"Trace":[{"file":"/app/data/apps/llm/lib/Migration/InstallDeps.php","line":28,"function":"runPoetryInstall","class":"OCA\\Llm\\Migration\\InstallDeps","type":"->"},{"file":"/app/code/lib/private/Repair.php","line":127,"function":"run","class":"OCA\\Llm\\Migration\\InstallDeps","type":"->"},{"file":"/app/code/core/Command/Maintenance/Repair.php","line":123,"function":"run","class":"OC\\Repair","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Command/Command.php","line":298,"function":"execute","class":"OC\\Core\\Command\\Maintenance\\Repair","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":1040,"function":"run","class":"Symfony\\Component\\Console\\Command\\Command","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":301,"function":"doRunCommand","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/3rdparty/symfony/console/Application.php","line":171,"function":"doRun","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/lib/private/Console/Application.php","line":211,"function":"run","class":"Symfony\\Component\\Console\\Application","type":"->"},{"file":"/app/code/console.php","line":100,"function":"run","class":"OC\\Console\\Application","type":"->"},{"file":"/app/code/occ","line":11,"args":["/app/code/console.php"],"function":"require_once"}],"File":"/app/data/apps/llm/lib/Migration/InstallDeps.php","Line":74,"message":"Exception while executing repair step Install dependencies for llm app","exception":{},"CustomMessage":"Exception while executing repair step Install dependencies for llm app"}} - ERROR: Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not available. 
On Debian/Ubuntu systems, you need to install the python3-venv package using the following command. apt install python3.10-venv You may need to use sudo with that command. After installing the python3-venv package, recreate your virtual environment. Failing command: /app/data/apps/llm/python/bin/python3 ...
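The diagnosis can be confirmed from inside the app's environment by checking whether the interpreter the llm app uses ships `ensurepip` at all. A sketch; the `/tmp` path is just a throwaway test location:

```shell
# Does this python3 have ensurepip (required by `python3 -m venv`)?
python3 -c "import ensurepip; print('ensurepip available')" || echo "ensurepip missing"
# Try creating a throwaway venv the same way the llm app does:
python3 -m venv /tmp/venv-test && echo "venv created" || echo "venv creation failed"
```

If the first command prints "ensurepip missing", the venv creation is bound to fail exactly as in the repair log above, regardless of anything the Nextcloud app does.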
Thanks for any info on how to solve this.
https://apps.nextcloud.com/apps/llm
https://apps.nextcloud.com/apps/assistant -
Element Starter Core
Maybe of interest, and could be an app by itself, I guess: https://github.com/vector-im/ess-starter-edition-core#element-starter-core
-
Cleanup Backups failure
Thanks. OK, I misunderstood the following logs as if Cloudron couldn't access the bucket server in some way:
@mononym said in Cleanup Backups failure:
sudo: unable to resolve host 1001507-513: Name or service not known
Could you expand on what you mean by this:
@girish said in Cleanup Backups failure:
[…] because the backup was created not before the preservation time.
I guess the issue is in the settings then?
-
Cleanup Backups failure
Hello. I finally set up external storage for the automatic backups: a bucket at scaleway.com. Backing up from Cloudron works perfectly, but afterwards the "Cleanup Backups" process seems to have trouble removing old backups. The consequence is that the storage fills up over time.
Below are the logs when launching the Cleanup process.
Thanks for your help.
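Independently of Cloudron, the bucket contents can be listed with any S3-compatible client to see what the cleanup actually left behind. A sketch with the aws CLI; the bucket name is a placeholder and `s3.fr-par.scw.cloud` assumes the Paris region, so adjust both to your setup:

```shell
BUCKET="s3://my-cloudron-backups"        # placeholder bucket name
ENDPOINT="https://s3.fr-par.scw.cloud"   # assumed Scaleway region endpoint
if command -v aws >/dev/null; then
  # Recursive listing; --summarize appends total object count and size.
  aws s3 ls "$BUCKET" --endpoint-url "$ENDPOINT" --recursive --summarize
else
  echo "aws CLI not installed"
fi
```

Comparing the total size reported here before and after a cleanup run shows directly whether old backups are really being deleted.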
Mar 05 10:41:58 box:tasks startTask - starting task 1996 with options {}. logs at /home/yellowtent/platformdata/logs/tasks/1996.log
Mar 05 10:41:58 box:shell startTask spawn: /usr/bin/sudo -S -E /home/yellowtent/box/src/scripts/starttask.sh 1996 /home/yellowtent/platformdata/logs/tasks/1996.log 0 400
Mar 05 10:41:58 box:shell startTask (stdout): sudo: unable to resolve host 1001507-513: Name or service not known
Mar 05 10:41:58 box:shell startTask (stdout): Running as unit: box-task-1996.service
Mar 05 10:42:03 box:apphealthmonitor app health: 2 running / 0 stopped / 0 unresponsive
Mar 05 10:42:04 box:apphealthmonitor app health: 2 running / 0 stopped / 0 unresponsive
Mar 05 10:42:08 box:shell startTask (stdout): Finished with result: success processes terminated with: code=exited/status=0 runtime: 9.427s
Mar 05 10:42:08 box:shell startTask (stdout): Service box-task-1996 finished with exit code 0
Mar 05 10:42:08 box:tasks startTask: 1996 completed with code 0
Mar 05 10:42:08 box:tasks startTask: 1996 done. error: null
2023-03-05T10:42:00.135Z box:settings initCache: pre-load settings
2023-03-05T10:42:00.205Z box:taskworker Starting task 1996. Logs are at /home/yellowtent/platformdata/logs/tasks/1996.log
2023-03-05T10:42:00.481Z box:backupcleaner clean: mount point status is {"state":"active"}
2023-03-05T10:42:00.482Z box:tasks update 1996: {"percent":10,"message":"Cleaning box backups"}
2023-03-05T10:42:00.525Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.6_ab7b1d41649953eae220178abeffc1721cd95d2a6edc208c3d2c32195d77900c box keepWithinSecs
2023-03-05T10:42:00.525Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_17e59faebaa63bf5b92b99affe19644da805b1cbd87958be235b2856479bde15 box preserveSecs
2023-03-05T10:42:00.525Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_aa76eb1bb0577e1e8de134ad69094b6ede01d6922534538253245b8f2b20787f box preserveSecs
2023-03-05T10:42:00.525Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_4fdf94eeb2760c5657bbf829a8fde66594d374be68a449aca977a875ced87ec7 box preserveSecs
2023-03-05T10:42:00.526Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_11a38cf91e4eefa4b196b31f8f747054a9a7c10815c03380041dc1daf62851f2 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_e19ec533fd0033edd0ded6df6ea8b849f02498554d8e3ea0bb28259f03157821 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_dc1f76cde12b88d24c9ecd4b88c238086ff57807031288900a943764d7ac5683 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_a465c2346a236acf64e63832c4b976925508dd5b59b01c6279a6674db756997c box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_e211300def60af75aa02116aaff4cd6b7100692edadf79ce7aa67a8a1e41bb85 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_f9260b1ac9bab1a92f010c3be29eb57805eb18b7bee456f4a11af827d10b4999 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_f2eb7526246634811e8bf298e17c8cbefb677eb5e8c6d577b9369ca0a141be62 box preserveSecs
2023-03-05T10:42:00.530Z box:backupcleaner applyBackupRetentionPolicy: box_v7.3.5_c0cf112be0d26c7a123b9dc9c2355ad0fb6e74da9b9c36a67d61aadcb6f3c98d box preserveSecs
2023-03-05T10:42:00.532Z box:backupcleaner cleanupBoxBackups: done
2023-03-05T10:42:00.532Z box:tasks update 1996: {"percent":20,"message":"Cleaning mail backups"}
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.6_0146159f52fc86cbab328e72ac014f23659a28efee094f97e05cf6f6ddadc92f mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_3181cf4eb848035a4deacc2cdda660cff6733a03c171fe11e9d2240b909a16ae mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_fd6720f6028447e31f510dff2de07a9a8fb5a04ffb696c7fd7543ca0b3032656 mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_5d932e7fc8c3d38f9745c5d6d17f816c5410d2a92ec66c22ccf3fb515080233e mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_8f2868923a011a0666ec61449b953a068bc791b38228e6199fbe9b97cc9d3dc1 mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_5685501a867dcdb45abf829a9894737513c55d54ecd2ab7b4964d50fdda3847a mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_b165a753eac3c1d66ea46e9fb1410d15b0145f7d748370ad9ff4a3d85d33f585 mail reference
2023-03-05T10:42:00.568Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_31c3822f09879c92bbc1599fedc9360e1dc70ea1a6bdc8c68abd92f833f9bdc0 mail reference
2023-03-05T10:42:00.570Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_9cec5456ea61f06df125594fb2866109d99f86072d2ac8b39e3912760a7fe750 mail reference
2023-03-05T10:42:00.570Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_f56cb92e0fd24ff852a6f1e354dcc3c8c01110597e1f5627d87438550830b1ff mail reference
2023-03-05T10:42:00.570Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_2ca10305ebebbc93b9fc95dea590b9dad36b6ebabfcf8b79ca9c4d8dbdfb2d9d mail reference
2023-03-05T10:42:00.570Z box:backupcleaner applyBackupRetentionPolicy: mail_v7.3.5_4130885afcd6be79100c83795a1e12102eb122a9c7ac607e5eae5f11adedb203 mail reference
2023-03-05T10:42:00.570Z box:backupcleaner cleanupMailBackups: done
2023-03-05T10:42:00.570Z box:tasks update 1996: {"percent":40,"message":"Cleaning app backups"}
2023-03-05T10:42:00.643Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.3_795cd20e973f7335fdc632150aeeafd65caa674da34e1e79b69148b2cc3d1ca4 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_7c46657af6d3715b922212c1451422baae6deefee57178b0c6ce71102b87eb9a app preserveSecs
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_d28420ab522c0c653e1eb8b2314aa0cc7c4e00c0324bdd5766aa044f45c823ab app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_0fe179a751ed448eb8eebdbe902e5d6f06f2c57dcd4e30f4c15fbbe9155942ac app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_95ce1382e82a41810636f343ed41307300dd13110982d28ee3e7cff886807ba3 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_b851372dc2b38728d0463c6bbc7a6b06e1589807907cb49e89993275f5208fcf app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_6b32a06cacfb397a40e496e0b2ac551cf73b3272115f5013284dd83e1edff36b app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_741c5d3563ecd09dbe01db8a2a56d3c70f583db1a82a7e7efd822933ea6804fe app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_f7f118462b2dbd5e2d917f8cab1ef4d99617ef8a70649eeee7a7188e66f51bcf app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_128bbde01808d30216cd92f54bc46d06fe399a271049120c1827e9e2fbc12b1c app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_8036676bcc2a5b8aef4d8fe6193862be87a578bd6a21c0eef5354ae2ba01421c app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_7e2f1991884d18c3510acf4e4902cb218729f7b0cd8ec60f2515b9c334ad81e1 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_f407bd77-c312-4bb9-bf8e-78c6504be031_v4.16.2_60227a037a9d45364277b235137756caa41cb3403ee7209cc66c35ea06ad5e32 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_bf75bb3fc4033174817e284aef386e34775a487c0a1817bd465682647376ba66 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_9bcca97ad29ab211afb3b84c69bd8a6bca40eae3a5d2da2641c94166dc623e72 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_baf1324d6c3611db6ce21008af2ceba8b0a5bd6d7c07b5930ec8f2a0efb7286f app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_dc42c65abf37962388af91c1ee92a40f90c1d4e0a090dda094a1fdc94c72e4d7 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_df02571e4a2692327b2c020616117b61bf65cff2337e7b096d72b41ec48ece0c app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_c6f957ceffa8fe97748d45f432405b29e33061ca0164f2c2395440af732f1ab7 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_e99160e984f481b81563af72c4cd9d5359ec1b75ad75225579dfdb6281767988 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_d7ad9a094f3b90388fcaf7f84ae33f1d1668a126358a9336c52f96c7950a1ba5 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_0244560b6bbf5c3219b30596ec2092e72feb1b8ba12e10d552c58bd8ac9a8884 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_349d8dec578274199ca2b83660d252e1cdc9da876450518903255e6609614aa5 app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_c32fcd1b3767192c74ef0c8cec8a9b7c70796cedb180751b5f880370321a0cda app reference
2023-03-05T10:42:00.644Z box:backupcleaner applyBackupRetentionPolicy: app_1f819db3-6496-4976-acb8-6bf9a69c7f19_v1.7.17_a22012b44607dd8cfefc52c9cbe8838fd031400d0d1ea2b73dd9fa68ed23136e app reference
2023-03-05T10:42:00.644Z box:backupcleaner cleanupAppBackups: done
2023-03-05T10:42:00.645Z box:tasks update 1996: {"percent":70,"message":"Checking storage backend and removing stale entries in database"}
2023-03-05T10:42:08.172Z box:backupcleaner cleanupMissingBackups: done
2023-03-05T10:42:08.172Z box:tasks update 1996: {"percent":90,"message":"Cleaning snapshots"}
2023-03-05T10:42:08.208Z box:backupcleaner cleanupSnapshots: done
2023-03-05T10:42:08.208Z box:taskworker Task took 8.194 seconds
2023-03-05T10:42:08.209Z box:tasks setCompleted - 1996: {"result":{"removedBoxBackupPaths":[],"removedMailBackupPaths":[],"removedAppBackupPaths":[],"missingBackupPaths":[]},"error":null}
2023-03-05T10:42:08.209Z box:tasks update 1996: {"percent":100,"result":{"removedBoxBackupPaths":[],"removedMailBackupPaths":[],"removedAppBackupPaths":[],"missingBackupPaths":[]},"error":null}
-
Update to version 7.3.6 failed
@girish said in Update to version 7.3.6 failed:

systemctl kill cloudron-updater

This solved it! Thank you!
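For anyone else hitting the "Unit cloudron-updater.service already exists" error from the logs below, a minimal recovery sketch follows. The unit name comes from the update logs in this thread; the `status`/`reset-failed` steps around the fix are an assumption about typical systemd transient-unit cleanup, not official Cloudron guidance.

```shell
# Assumption: a previous update attempt left a stale transient unit behind.
# Inspect the leftover unit first
systemctl status cloudron-updater.service

# Kill the stuck unit (this is the command that resolved it in this thread)
systemctl kill cloudron-updater

# If the unit lingers in a failed state, clear it so systemd-run can
# create a fresh transient cloudron-updater unit on the next update attempt
systemctl reset-failed cloudron-updater.service
```

After this, retrying the update from the Cloudron dashboard should be able to start the updater service again.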
-
Update to version 7.3.6 failed
2023-02-18T11:02:54.535Z box:settings initCache: pre-load settings
2023-02-18T11:02:54.550Z box:taskworker Starting task 1897. Logs are at /home/yellowtent/platformdata/logs/tasks/1897.log
2023-02-18T11:02:54.553Z box:tasks update 1897: {"percent":1,"message":"Checking disk space"}
2023-02-18T11:02:54.564Z box:tasks update 1897: {"percent":5,"message":"Downloading and verifying release"}
2023-02-18T11:02:54.567Z box:updater Downloading https://releases.cloudron.io/versions.json to /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:54.567Z box:updater downloadUrl: curl -s --fail https://releases.cloudron.io/versions.json -o /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:54.567Z box:shell downloadUrl spawn: /usr/bin/curl -s --fail https://releases.cloudron.io/versions.json -o /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:55.548Z box:updater downloadUrl: downloaded https://releases.cloudron.io/versions.json to /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:55.548Z box:updater Downloading https://releases.cloudron.io/versions.json.sig to /home/yellowtent/platformdata/update/versions.json.sig
2023-02-18T11:02:55.548Z box:updater downloadUrl: curl -s --fail https://releases.cloudron.io/versions.json.sig -o /home/yellowtent/platformdata/update/versions.json.sig
2023-02-18T11:02:55.548Z box:shell downloadUrl spawn: /usr/bin/curl -s --fail https://releases.cloudron.io/versions.json.sig -o /home/yellowtent/platformdata/update/versions.json.sig
2023-02-18T11:02:55.936Z box:updater downloadUrl: downloaded https://releases.cloudron.io/versions.json.sig to /home/yellowtent/platformdata/update/versions.json.sig
2023-02-18T11:02:55.936Z box:updater gpgVerify: /usr/bin/gpg --status-fd 1 --no-default-keyring --keyring /home/yellowtent/box/src/releases.gpg --verify /home/yellowtent/platformdata/update/versions.json.sig /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:55.936Z box:shell gpgVerify exec: /usr/bin/gpg --status-fd 1 --no-default-keyring --keyring /home/yellowtent/box/src/releases.gpg --verify /home/yellowtent/platformdata/update/versions.json.sig /home/yellowtent/platformdata/update/versions.json
2023-02-18T11:02:55.961Z box:shell gpgVerify (stdout):
[GNUPG:] NEWSIG
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] SIG_ID h6xU8U2yvdlagNvYd/tQyU/RQ0A 2023-02-02 1675330575
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] GOODSIG 0A372F8703C493CC Cloudron UG <admin@cloudron.io>
[GNUPG:] VALIDSIG 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 2023-02-02 1675330575 0 4 0 1 10 00 0EADB19CDDA23CD0FE71E3470A372F8703C493CC
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] TRUST_UNDEFINED 0 pgp
[GNUPG:] VERIFICATION_COMPLIANCE_MODE 23
2023-02-18T11:02:55.961Z box:shell gpgVerify (stderr):
gpg: Signature made Thu Feb 2 09:36:15 2023 UTC
gpg: using RSA key 0EADB19CDDA23CD0FE71E3470A372F8703C493CC
gpg: Good signature from "Cloudron UG <admin@cloudron.io>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 0EAD B19C DDA2 3CD0 FE71 E347 0A37 2F87 03C4 93CC
2023-02-18T11:02:55.963Z box:updater Downloading https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz to /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:55.963Z box:updater downloadUrl: curl -s --fail https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz -o /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:55.963Z box:shell downloadUrl spawn: /usr/bin/curl -s --fail https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz -o /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:58.021Z box:updater downloadUrl: downloaded https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz to /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:58.022Z box:updater Downloading https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz.sig to /home/yellowtent/platformdata/update/box.tar.gz.sig
2023-02-18T11:02:58.022Z box:updater downloadUrl: curl -s --fail https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz.sig -o /home/yellowtent/platformdata/update/box.tar.gz.sig
2023-02-18T11:02:58.022Z box:shell downloadUrl spawn: /usr/bin/curl -s --fail https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz.sig -o /home/yellowtent/platformdata/update/box.tar.gz.sig
2023-02-18T11:02:58.411Z box:updater downloadUrl: downloaded https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz.sig to /home/yellowtent/platformdata/update/box.tar.gz.sig
2023-02-18T11:02:58.411Z box:updater gpgVerify: /usr/bin/gpg --status-fd 1 --no-default-keyring --keyring /home/yellowtent/box/src/releases.gpg --verify /home/yellowtent/platformdata/update/box.tar.gz.sig /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:58.411Z box:shell gpgVerify exec: /usr/bin/gpg --status-fd 1 --no-default-keyring --keyring /home/yellowtent/box/src/releases.gpg --verify /home/yellowtent/platformdata/update/box.tar.gz.sig /home/yellowtent/platformdata/update/box.tar.gz
2023-02-18T11:02:58.594Z box:shell gpgVerify (stdout):
[GNUPG:] NEWSIG
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] SIG_ID Z7LAKQiRgIE5mAceEzF7mgTqV6w 2023-02-01 1675289078
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] GOODSIG 0A372F8703C493CC Cloudron UG <admin@cloudron.io>
[GNUPG:] VALIDSIG 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 2023-02-01 1675289078 0 4 0 1 10 00 0EADB19CDDA23CD0FE71E3470A372F8703C493CC
[GNUPG:] KEY_CONSIDERED 0EADB19CDDA23CD0FE71E3470A372F8703C493CC 0
[GNUPG:] TRUST_UNDEFINED 0 pgp
[GNUPG:] VERIFICATION_COMPLIANCE_MODE 23
2023-02-18T11:02:58.594Z box:shell gpgVerify (stderr):
gpg: Signature made Wed Feb 1 22:04:38 2023 UTC
gpg: using RSA key 0EADB19CDDA23CD0FE71E3470A372F8703C493CC
gpg: Good signature from "Cloudron UG <admin@cloudron.io>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 0EAD B19C DDA2 3CD0 FE71 E347 0A37 2F87 03C4 93CC
2023-02-18T11:02:58.596Z box:updater extractTarball: tar -zxf /home/yellowtent/platformdata/update/box.tar.gz -C /tmp/box-4230898975
2023-02-18T11:02:58.596Z box:shell extractTarball spawn: /bin/tar -zxf /home/yellowtent/platformdata/update/box.tar.gz -C /tmp/box-4230898975
2023-02-18T11:02:59.839Z box:updater extractTarball: extracted /home/yellowtent/platformdata/update/box.tar.gz to /tmp/box-4230898975
2023-02-18T11:02:59.839Z box:updater Updating box with https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz
2023-02-18T11:02:59.839Z box:tasks update 1897: {"percent":70,"message":"Installing update"}
2023-02-18T11:02:59.840Z box:shell update spawn: /usr/bin/sudo -S /home/yellowtent/box/src/scripts/update.sh /tmp/box-4230898975
2023-02-18T11:02:59.847Z box:shell update (stdout): sudo: unable to resolve host 1001507-513: Name or service not known
2023-02-18T11:02:59.859Z box:shell update (stdout): Updating Cloudron with /tmp/box-4230898975
=> reset service cloudron-updater status (of previous update)
2023-02-18T11:02:59.871Z box:shell update (stdout): Failed to install cloudron. See log for details
2023-02-18T11:02:59.872Z box:shell update code: 1, signal: null
2023-02-18T11:02:59.873Z box:taskworker Task took 5.385 seconds
2023-02-18T11:02:59.873Z box:tasks setCompleted - 1897: {"result":null,"error":{"stack":"BoxError: update exited with code 1 signal null\n at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:78:17)\n at ChildProcess.emit (node:events:513:28)\n at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)","name":"BoxError","reason":"Spawn Error","details":{},"message":"update exited with code 1 signal null","code":1,"signal":null}}
2023-02-18T11:02:59.873Z box:tasks update 1897: {"percent":100,"result":null,"error":{"stack":"BoxError: update exited with code 1 signal null\n at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:78:17)\n at ChildProcess.emit (node:events:513:28)\n at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)","name":"BoxError","reason":"Spawn Error","details":{},"message":"update exited with code 1 signal null","code":1,"signal":null}}
2023-02-18T11:02:59.864Z box:shell update (stdout): => Run installer.sh as cloudron-updater.
2023-02-18T11:02:59.865Z box:shell update (stdout): => starting service cloudron-updater. see logs at /home/yellowtent/platformdata/logs/updater/cloudron-updater-2023-02-18_11-02-59.log
2023-02-18T11:02:59.871Z box:shell update (stdout): Failed to start transient service unit: Unit cloudron-updater.service already exists.
BoxError: update exited with code 1 signal null
    at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:78:17)
    at ChildProcess.emit (node:events:513:28)
    at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)
-
Update to version 7.3.6 failed
Hello.
The latest update fails on my server. The last notification reads "Failed to update Cloudron: Task 1873 timed out." Many more failed attempts show up in the Event Log. The pattern is:
> Cloudron update to version 7.3.6 was started
> Backup cleaner removed 0 backups
> Cloudron update errored. Error: update exited with code 1 signal null
Or:

{
  "taskId": "1894",
  "boxUpdateInfo": {
    "version": "7.3.6",
    "changelog": [
      "Fix display of box backups",
      "aws: add melbourne region",
      "mail usage: fix issue caused by deleted mailboxes",
      "reverseproxy: fix issue where renewed certs are not written to disk",
      "support: fix crash when opening tickets with 0 length files"
    ],
    "sourceTarballUrl": "https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz",
    "sourceTarballSigUrl": "https://releases.cloudron.io/box-7df1399f17-b36f59f481-7.3.6.tar.gz.sig",
    "boxVersionsUrl": "https://releases.cloudron.io/versions.json",
    "boxVersionsSigUrl": "https://releases.cloudron.io/versions.json.sig",
    "unstable": false
  }
}

{ "taskId": "1895", "errorMessage": null, "removedBoxBackupPaths": [], "removedMailBackupPaths": [], "removedAppBackupPaths": [], "missingBackupPaths": [] }

{ "taskId": "1894", "errorMessage": "update exited with code 1 signal null", "timedOut": false }
I tried again, skipping the backup:
Feb 18 11:02:53 box:locker Acquired : box_update
Feb 18 11:02:53 box:tasks startTask - starting task 1897 with options {"timeout":72000000,"nice":15,"memoryLimit":800}. logs at /home/yellowtent/platformdata/logs/tasks/1897.log
Feb 18 11:02:53 box:shell startTask spawn: /usr/bin/sudo -S -E /home/yellowtent/box/src/scripts/starttask.sh 1897 /home/yellowtent/platformdata/logs/tasks/1897.log 15 800
Feb 18 11:02:53 box:shell startTask (stdout): sudo: unable to resolve host 1001507-513: Name or service not known
Feb 18 11:02:53 box:shell startTask (stdout): Running as unit: box-task-1897.service
Feb 18 11:02:59 box:shell startTask (stdout): Finished with result: exit-code processes terminated with: code=exited/status=50 runtime: 6.005s
Feb 18 11:02:59 box:shell startTask (stdout): Service box-task-1897 failed to run
Feb 18 11:02:59 box:shell startTask (stdout): Service box-task-1897 finished with exit code 1
Feb 18 11:02:59 box:shell startTask code: 1, signal: null
Feb 18 11:02:59 box:tasks startTask: 1897 completed with code 1
Feb 18 11:02:59 box:locker Released : box_update
Feb 18 11:02:59 box:updater Update failed with error { stack: 'BoxError: update exited with code 1 signal null\n' + ' at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:78:17)\n' + ' at ChildProcess.emit (node:events:513:28)\n' + ' at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)', name: 'BoxError', reason: 'Spawn Error', details: {}, message: 'update exited with code 1 signal null', code: 1, signal: null }
Feb 18 11:02:59 box:tasks startTask: 1897 done. error: { stack: 'BoxError: update exited with code 1 signal null\n' + ' at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:78:17)\n' + ' at ChildProcess.emit (node:events:513:28)\n' + ' at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)', name: 'BoxError', reason: 'Spawn Error', details: {}, message: 'update exited with code 1 signal null', code: 1, signal: null }
Thank you for your reply.
-
Hermes | Open Source Document Management System
@fbartels
Definitely. Let’s see what is meant by ‘currently’.
Anyway, +1 for hypothes.is -
Hermes | Open Source Document Management System
https://www.hashicorp.com/blog/introducing-hermes-an-open-source-document-management-system
- Featured on HN
- Possible alternative: hypothes.is
-
Hypothesis web annotation tool
@nebulon
In relation to hypothes.is as a self-hosted app, one scenario that may be missing from the cloudron.io landing page is Academia (next to Personal, Web hosting, Business).
It could be interesting to imagine which apps would be promoted there and what might be missing. -
How to run a scheduled job?
Thanks, @girish! It worked. The changes took effect when I ran the job manually.