Cloudron Forum

archos
@archos

Posts: 400 · Topics: 78 · Shares: 0 · Groups: 0 · Followers: 0 · Following: 0

Posts


  • [GUIDE] Move PeerTube video storage to Hetzner S3
    archos

    @girish said in [GUIDE] Move PeerTube video storage to Hetzner S3:

    @archos should I use the email on this forum?

    Yes, please use the same email I use for this forum. 👍

    PeerTube guides

  • [GUIDE] Move PeerTube video storage to Hetzner S3
    archos

    I don’t have a GitLab account yet.
    @girish could you please send me an invite? 🙂

    PeerTube guides

  • [GUIDE] Move PeerTube video storage to Hetzner S3
    archos

    @joseph Hi, glad to hear you liked the guide! 🙂
    Where can I request access or registration for your GitLab, so I can add the contribution there?
    Thanks!

    PeerTube guides

  • [GUIDE] Move PeerTube video storage to Hetzner S3
    archos

    Hi everyone,
    after a few failed attempts with other S3 providers like iDrive and Backblaze,
    I tried Hetzner Object Storage (S3) — where we also host our Cloudron servers.
    Here’s my working setup and migration process — maybe it helps someone.
    Everything works great: I’ve successfully moved ~240 GB of videos, all without issues.

    This guide shows how to move PeerTube video storage to Hetzner Object Storage (S3-compatible) on a Cloudron instance. Tested with PeerTube 7.3.0 and Cloudron v8.3.2 (Ubuntu 24.04.1 LTS).

    1️⃣ Create your S3 bucket(s)

    • Region: fsn1 (Falkenstein)
    • Visibility: Public (read)
    • Block Public Access: off
      Example buckets:
    • peertube-1

    2️⃣ Set CORS configuration

    Create a file called example-cors.xml:

    <CORSConfiguration>
      <CORSRule>
        <AllowedHeader>*</AllowedHeader>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedOrigin>*</AllowedOrigin>
      </CORSRule>
    </CORSConfiguration>
    

    Apply it to your bucket(s):

    s3cmd --config=/dev/null --no-check-certificate \
      --access_key=YOUR_ACCESS_KEY \
      --secret_key=YOUR_SECRET_KEY \
      --host=fsn1.your-objectstorage.com \
      --host-bucket="%(bucket)s.fsn1.your-objectstorage.com" \
      setcors example-cors.xml s3://peertube-1
    

    Check it:

    s3cmd --config=/dev/null --no-check-certificate \
      --access_key=YOUR_ACCESS_KEY \
      --secret_key=YOUR_SECRET_KEY \
      --host=fsn1.your-objectstorage.com \
      --host-bucket="%(bucket)s.fsn1.your-objectstorage.com" \
      info s3://peertube-1 | grep CORS -A1
    

    You should see:

    CORS: <CORSConfiguration ...><AllowedOrigin>*</AllowedOrigin>...</CORSConfiguration>
    

    3️⃣ Edit PeerTube configuration

    Open /app/data/production.yaml (Cloudron path) and add or modify this block:

    object_storage:
      enabled: true
      endpoint: 'https://fsn1.your-objectstorage.com'
      region: 'eu-central'
      credentials:
        access_key_id: 'YOUR_ACCESS_KEY'
        secret_access_key: 'YOUR_SECRET_KEY'
      videos:
        bucket_name: 'peertube-1'
        prefix: 'videos/'
        base_url: 'https://peertube-1.fsn1.your-objectstorage.com'
        upload_acl: 'public-read'
      streaming_playlists:
        bucket_name: 'peertube-1'
        prefix: 'hls/'
        base_url: 'https://peertube-1.fsn1.your-objectstorage.com'
        upload_acl: 'public-read'
      previews:
        bucket_name: 'peertube-1'
        prefix: 'previews/'
        base_url: 'https://peertube-1.fsn1.your-objectstorage.com'
        upload_acl: 'public-read'
      thumbnails:
        bucket_name: 'peertube-1'
        prefix: 'thumbnails/'
        base_url: 'https://peertube-1.fsn1.your-objectstorage.com'
        upload_acl: 'public-read'
      captions:
        bucket_name: 'peertube-1'
        prefix: 'captions/'
        base_url: 'https://peertube-1.fsn1.your-objectstorage.com'
        upload_acl: 'public-read'
    

    Save and restart PeerTube from the Cloudron dashboard.

    4️⃣ Move videos to S3

    From the Cloudron Web Terminal:

    cd /app/code/server
    
    gosu cloudron:cloudron npm run create-move-video-storage-job -- --to-object-storage
    

    This creates jobs that migrate all videos to your S3 bucket. Progress can be monitored in Cloudron → App → Logs.

    5️⃣ Verify

    Check the directory size before/after:

    du -sh /app/data/storage
    du -sh /app/data/storage/* | sort -h
    

    When migration finishes, most data (videos, HLS, previews) should move to S3. Local disk usage should drop to a few GB.
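    The before/after check can be sketched end to end. This is only an illustration: a throwaway temp directory stands in for /app/data/storage, and removing a file stands in for the move to S3.

    ```shell
    # Sketch: measure local usage, simulate the migration, report freed space.
    # The temp directory and file are stand-ins, not real PeerTube paths.
    dir=$(mktemp -d)
    dd if=/dev/zero of="$dir/video.bin" bs=1024 count=100 2>/dev/null
    before=$(du -sk "$dir" | cut -f1)   # KiB before the "migration"
    rm "$dir/video.bin"                 # stand-in for moving the file to S3
    after=$(du -sk "$dir" | cut -f1)    # KiB afterwards
    echo "freed: $((before - after)) KiB"
    rm -r "$dir"
    ```

    On the real instance, comparing `du -sh /app/data/storage` before and after the jobs finish gives the same kind of number at scale.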

    Tested setup

    • PeerTube 7.3.0
    • Cloudron v8.3.2
    • Hetzner Object Storage (fsn1)
    PeerTube guides

  • PeerTube – 413 error when uploading thumbnail
    archos

    @james Thanks for the quick reply and especially for the fast update, you’re the best! 🙌

    PeerTube

  • PeerTube – 413 error when uploading thumbnail
    archos

    Hi,
    we are running a PeerTube instance and I’ve run into a problem when uploading a thumbnail:

    uploading a thumbnail (~4.4 MB) fails with 413 Request Entity Too Large

    PeerTube UI shows that up to 8 MB is allowed

    uploading an avatar (~6 MB) works without problems

    uploading videos (GB size) also works fine

    The error comes directly from nginx on the host (see HTML error page).

    <html>
    <head><title>413 Request Entity Too Large</title></head>
    <body>
    <center><h1>413 Request Entity Too Large</h1></center>
    <hr><center>nginx/1.24.0 (Ubuntu)</center>
    </body>
    </html>
    

    How can I increase client_max_body_size for PeerTube on Cloudron so that thumbnails up to 8 MB can be uploaded?
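    For context, the nginx directive behind a 413 is `client_max_body_size`. On Cloudron the reverse-proxy configuration is managed by the platform rather than edited by hand, so this fragment is only an illustration of the limit involved, not a file to change yourself:

    ```nginx
    # Illustrative only: the upload size cap nginx enforces per location/server.
    client_max_body_size 8m;
    ```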

    PeerTube

  • Import YouTube
    archos

    @robi Ok, I’ll give it a try. I’ll also search the PeerTube GitHub to see if someone has already reported this issue.
    Thanks a lot for your time and the information.

    PeerTube

  • Import YouTube
    archos

    @robi said in Import YouTube:

    --cookies-from-browser
    

    did you try this?
    Yes, I also tried --cookies-from-browser, but I uploaded a cookies.txt file into PeerTube.
    I added the path to the file in the import section of production.yaml and restarted the instance.
    Maybe I’m doing something wrong, but I have no idea how else to do it.

    PeerTube

  • Import YouTube
    archos

    @robi In the logs I always see this:

    ERROR: [youtube] ***: Sign in to confirm you’re not a bot.  
    Use --cookies-from-browser or --cookies for the authentication.  
    See https://github.com/yt-dlp/yt-dlp/wiki/FAQ#how-do-i-pass-cookies-to-yt-dlp
    
    PeerTube

  • Import YouTube
    archos

    @robi Thanks for the reply 👍
    All files are owned by the cloudron user. I also restarted the app, but the problem still persists.

    PeerTube

  • Import YouTube
    archos

    Hi,
    I’m running PeerTube on Cloudron and I’m trying to get YouTube imports working, but they now require login (yt-dlp error: Sign in to confirm you’re not a bot).

    What I tried:

    Exported cookies.txt from my browser (Netscape format, starts with # Netscape HTTP Cookie File).

    Placed the file at /app/data/storage/cookies.txt.

    Edited /app/data/config/production.yaml and added:

    import:
      youtube_dl:
        args: "--cookies /app/data/storage/cookies.txt"
    
    
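    One thing worth ruling out is the cookie file format itself: yt-dlp's `--cookies` option expects the Netscape format, whose first line is fixed. A quick sanity check (a hypothetical demo file here; the real path in my setup is /app/data/storage/cookies.txt):

    ```shell
    # Sketch: verify the export really is a Netscape-format cookie file.
    # The mktemp file is a stand-in for the real cookies.txt.
    cookies=$(mktemp)
    printf '# Netscape HTTP Cookie File\n' > "$cookies"   # stand-in export
    if head -n 1 "$cookies" | grep -q '^# Netscape HTTP Cookie File'; then
      result="format: ok"
    else
      result="format: unexpected"
    fi
    echo "$result"
    ```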

    Has anyone tried to set this up?
    Maybe I’m doing something wrong. Any advice would be appreciated.

    PeerTube

  • Docker volumes are filling up disk
    archos

    @robi I only added the recommended commands:

    0 13,23 * * * rm -rf /app/data/public/wp-content/cache/* > /dev/null 2>&1  
    0 13,23 * * * rm -rf /tmp/magick-* /tmp/imagick* > /dev/null 2>&1
    
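    If the blunt `rm -rf` ever deletes a temp file that ImageMagick is still writing, a more cautious variant would remove only files older than an hour. A sketch (a demo directory stands in for /tmp, and GNU `touch -d` fakes the file ages):

    ```shell
    # Sketch: delete only magick-* temp files older than 60 minutes, so a
    # conversion that is still running keeps its working file.
    tmpdir=$(mktemp -d)
    touch -d '2 hours ago' "$tmpdir/magick-old"   # stale temp file
    touch "$tmpdir/magick-new"                    # recent, possibly in use
    find "$tmpdir" -name 'magick-*' -mmin +60 -delete
    ls "$tmpdir"
    ```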
    Support docker volumes

  • Docker volumes are filling up disk
    archos

    The cron job is working perfectly, temporary files are no longer accumulating. Many thanks to everyone for the advice. ✅

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    Thanks a lot everyone for the advice 🙏
    I’ve added some cleanup jobs to the WP cron for now.
    I’ll let you know if it helps.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    @joseph said in Docker volumes are filling up disk:

    Is it still WordPress and imagemagick temp files?

    Yes, still in the same location and same naming pattern – seems to be temporary files from WordPress using ImageMagick.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    Unfortunately, I just checked and the files are back again – only two for now. It seems something is still generating them.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    I tried deleting the files gradually starting from the oldest ones, and everything seems fine so far. Looks like they were really just leftover tmp files.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    @timconsidine Thanks! I’ll try to search around a bit more with that in mind. Good to know it might not be just a one-off issue.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    I tried docker inspect. Looks like it's a WordPress instance from a single user. Shouldn't WordPress clean this up on its own? There are thousands of magick-*.txt files taking up hundreds of GB. I think the oldest one is from around June 16th.

    Support docker volumes

  • Docker volumes are filling up disk
    archos

    Largest Docker volumes on the system

    437G    /var/lib/docker/volumes/2cd35318182d2ca6e0a83bfa4967707b33201c9d0389b54dc67b1da13727ece3
    3.3G    /var/lib/docker/volumes/21ae2f0cdd518b6beb761b3d8f690573b88854456b1f9b8b62a59533950a36b0
    2.8G    /var/lib/docker/volumes/69b8db67d33f9b732414a2a903b2c31b5af7900bed64ae1ba15a36361672c724
    2.2G    /var/lib/docker/volumes/578df1d49aeae3d49ec7eb0d1b3fe247d8485e1b6bc007dc39e9bb2492ea2dbf
    1.9G    /var/lib/docker/volumes/18659aaf1c1a71f000167fabeb5de42147b417840968edf857b0a28d49c059c5
    1.8G    /var/lib/docker/volumes/d47e05be1ab7c9afebc44a7472b51c9123abfc0cb8dbee90efdde6f986b2b10a
    1.3G    /var/lib/docker/volumes/c48e870f9326a2420d3484f9a8c9d6e101ed1b43155a072f680d4dbeb8c6d6dc
    1.2G    /var/lib/docker/volumes/92ceda1eef40d4331bdc0b40dee90fa244c6322f709b17c9c057e2c23b27f7e3
    879M    /var/lib/docker/volumes/4835a5f6bd07bc0ccbc48d9984188ea4cba19827c38475ff96379428fe8573e5
    766M    /var/lib/docker/volumes/c52c6083cac12f83f5d1c7ac3844d29a8aa196607821d594fcea31c6efcca544
    734M    /var/lib/docker/volumes/7a4d464132751933477aaaea0ac84f5b9d8c6675f14740b3bd52e68cf11a71b6
    716M    /var/lib/docker/volumes/fe8398266b2ee41d4936326db1d148110c6c0246d2d55abb56ecd779982c40a7
    660M    /var/lib/docker/volumes/3c73c37c138981b00cfbde782127f91876859659fe22e5795909930e44fa46ff
    592M    /var/lib/docker/volumes/a7529bb1adcf3c008bb991b4b8d53071b1b978d78fc1dbf4d41400f1c2136583
    573M    /var/lib/docker/volumes/c07bb67d0c6d2f7ac5f6b16379d2217fc04342bdbe50d647d8f2e348824952fe
    439M    /var/lib/docker/volumes/1783e3b02509013f9e5967d24cb0308c429d316e7bf1fe91319d1d13c3d52d5f
    370M    /var/lib/docker/volumes/fad8b9e0768542ebb33f3c43f37a5ab4d38bf78396f9fcbcbc26eccfaf599b2f
    351M    /var/lib/docker/volumes/789e68c10bd2281d2728948d03121939e7465bca108d5fb0f4b3a4b637e57507
    191M    /var/lib/docker/volumes/fdf7c2c7d257742135a5a02ff6326e12b08f990d8c68a043778a99022f45debe
    177M    /var/lib/docker/volumes/c434bfac6cc7eb90390082546dfdbb95f7c9d516e69dff2f4a3f0d8d28bfadd0
    

    The biggest volume alone is 437 GB.

    I checked the contents of that volume and it’s filled with thousands of magick-* files, for example:

    magick-00nzf1efCQtQKRL2bCCzds2a7Kt6cLIK
    

    I'm not sure if these magick-* files can be safely deleted.
    They look like leftover temp files from ImageMagick, but I want to make sure before removing ~400 GB of them.
    Any guidance or confirmation would be appreciated.
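    For anyone wanting to size the problem before deleting anything, a sketch that counts and totals the magick-* files (a demo directory stands in for the real /var/lib/docker/volumes/... path):

    ```shell
    # Sketch: count magick-* files and total their size before cleanup.
    # The mktemp directory and dd-created files are stand-ins.
    vol=$(mktemp -d)
    for i in 1 2 3; do
      dd if=/dev/zero of="$vol/magick-demo$i" bs=1024 count=10 2>/dev/null
    done
    count=$(find "$vol" -name 'magick-*' -type f | wc -l)
    kib=$(find "$vol" -name 'magick-*' -type f -exec du -k {} + | awk '{s+=$1} END {print s}')
    echo "files=$count total=${kib}KiB"
    ```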

    Support docker volumes