Cloudron makes it easy to run web apps like WordPress, Nextcloud, and GitLab on your server.



PeerTube

66 Topics 633 Posts
  • PeerTube - Package Updates

    Pinned
    2 Votes
    45 Posts
    1k Views
nebulon

    [2.15.2]

Update PeerTube to 6.0.4 (full changelog). Important: Prevent XSS injection in embed. Thanks Syst3m0ver!
  • 1 Votes
    4 Posts
    37 Views
jdaviescoates

    @scooke you could also experiment with taking off the https:// from your endpoint as I don't think that's supposed to be there 🙂
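For reference, a sketch of what that looks like in production.yaml, assuming an AWS-style endpoint (the hostname is a placeholder): PeerTube expects the endpoint as a bare host, without the scheme.

```yaml
object_storage:
  # Bare hostname -- no 'https://' prefix
  endpoint: 's3.us-east-1.amazonaws.com'
```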

  • 2 Votes
    7 Posts
    126 Views
    S

    @scooke @LoudLemur @nebulon

    Sure.

My data is stored in 3 places:
a. local storage (only this is included in the 'App backups')
b. mounted volume (for certain options that are temporarily space-intensive, like tmp and streaming_playlists)
c. object storage (primary/permanent storage for the video files)

The config in production.yaml:

```yaml
storage:
  tmp: '/media/my-mounted-volume/my-project/storage/tmp/' # Use to download data (imports etc), store uploaded files before processing...
  avatars: '/app/data/storage/avatars/'
  streaming_playlists: '/media/my-mounted-volume/my-project/storage/streaming-playlists/'
  redundancy: '/app/data/storage/redundancy/'
  logs: '/app/data/storage/logs/'
  previews: '/app/data/storage/previews/'
  thumbnails: '/app/data/storage/thumbnails/'
  torrents: '/app/data/storage/torrents/'
  captions: '/app/data/storage/captions/'
  cache: '/app/data/storage/cache/'
  plugins: '/app/data/storage/plugins/'
  client_overrides: '/app/data/storage/client-overrides/'
  bin: /app/data/storage/bin/
  well_known: /app/data/storage/well_known/
  tmp_persistent: /app/data/storage/tmp_persistent/
  storyboards: /app/data/storage/storyboards/
  web_videos: /app/data/storage/web-videos/

object_storage:
  enabled: true
  # Example AWS endpoint in the us-east-1 region
  endpoint: 'region.my-s3-domain'
  # Needs to be set to the bucket region when using AWS S3
  region: 'region'
  # Use two different buckets for Web videos and HLS videos on AWS S3
  web_videos:
    bucket_name: 'my-bucket-name'
    prefix: 'direct/'
  streaming_playlists:
    bucket_name: 'my-bucket-name'
    prefix: 'playlist/'
  AWS_ACCESS_KEY_ID: 'my-key-ID'
  AWS_SECRET_ACCESS_KEY: 'my-access-key'
  credentials:
    aws_access_key_id: 'my-key-ID'
    aws_secret_access_key: 'my-access-key'
    access_key_id: 'my-key-ID'
    secret_access_key: 'my-access-key'
  max_upload_part: '1GB'
```

For Remote Runners:

a. [screenshot]

b. [screenshot]

    c. Set up remote machine(s) using the Peertube CLI to connect to your app, using the Runner registration tokens.

    https://docs.joinpeertube.org/maintain/tools#peertube-runner
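As a rough sketch of step (c), assuming the `@peertube/peertube-runner` package from the docs linked above (the URL, token, and runner name below are placeholders to replace with your own):

```shell
# Install the runner CLI on the remote machine (requires Node.js)
npm install -g @peertube/peertube-runner

# Start the runner daemon
peertube-runner server

# In another shell, register it against your PeerTube instance,
# using a Runner registration token from the PeerTube admin UI
peertube-runner register \
  --url https://peertube.example.com \
  --registration-token '<registration-token>' \
  --runner-name my-remote-runner
```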

For Matomo, I'm using a plugin: https://www.npmjs.com/package/peertube-plugin-matomo



  • Poor network throughput, limited to PeerTube

    1 Votes
    9 Posts
    122 Views
    V

    I found the issue with some help from their Git. I'm posting it here because it's not well-documented and doesn't seem to be mentioned anywhere else. (EDIT: Hats off to the dev, who upon seeing my Git post went and updated the documentation to specifically recommend reviewing this directive!)

    Peertube DOES throttle network speed! A fresh install is limited to 5 MB/s. This is fine for 1080P but barely adequate for typical 30 FPS 4K footage, and unworkable for high-quality 4K or 4K at 60 FPS. For perspective, YouTube recommends nearly 9 MB/s for 60FPS 4K video.

    You can adjust this hard limit by modifying peertube.conf and adjusting the values listed below:

    proxy_limit_rate $peertube_limit_rate

    You should take your network performance, desired video quality, and download functionality into consideration before modifying these. For most people, 10M should give you ample 4K quality without excessive buffering. If you want to allow downloads, consider setting them to 25M or higher.
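For illustration, the relevant snippet in peertube.conf looks roughly like this (the 10M value is a suggestion, not the shipped default; adjust to your own bandwidth and quality targets):

```nginx
# peertube.conf -- inside the location block(s) that serve video files.
# Raise the per-connection rate cap; values here are illustrative.
set $peertube_limit_rate 10M;    # ~10 MB/s; consider 25M+ if you allow downloads

# The variable is consumed by directives such as:
proxy_limit_rate $peertube_limit_rate;
limit_rate       $peertube_limit_rate;
```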

    I'm glad this wasn't a Cloudron issue, but I appreciate the effort of anyone who took the time to stop and think about this one. I suspect this value was different in previous versions and carries over with upgrades; this may only affect new installations.

  • Peertube CLI V-6 change?

    4 Votes
    2 Posts
    35 Views
girish

    I fixed the docs, thanks for reporting.

  • 0 Votes
    6 Posts
    230 Views
jdaviescoates

    @LoudLemur said in How to setup Object Storage for Peertube on Cloudron (iDrive e2):

    Do you think this might be to enable other instances following / subscribing to your own, so they might need access to those files?

    Good question, possibly, but I've really no idea.

Try subscribing to https://uniteddiversity.tv and/or https://bridport.tv and let's see what happens...?

  • 0 Votes
    4 Posts
    99 Views
scooke

@jdaviescoates Good catch, I meant AWS S3, as that seems to be the standard and most tutorials use it, and most S3-compatible providers use similar phrasing for their options. But as the number of posts even just here on this forum shows, non-AWS S3 is a crapshoot.

  • 2 Votes
    7 Posts
    90 Views
girish

@shrey great work figuring out the root cause! I have published a new package now that highlights the breaking changes in 6.0.

    For others reading, the main issue was that some of the configuration keys have changed in peertube 6. See https://github.com/Chocobozzz/PeerTube/blob/develop/CHANGELOG.md#v600 for more information. For example, if you use object storage to store videos, the storage paths have changed.
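For example, one of the renames mentioned in the v6.0.0 changelog, sketched as a before/after (paths and bucket names are placeholders; check the changelog for the full list):

```yaml
# PeerTube 5.x
storage:
  videos: '/app/data/storage/videos/'
object_storage:
  videos:
    bucket_name: 'my-bucket-name'

# PeerTube 6.x -- 'videos' keys became 'web_videos'
storage:
  web_videos: '/app/data/storage/web-videos/'
object_storage:
  web_videos:
    bucket_name: 'my-bucket-name'
```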

  • 2 Votes
    3 Posts
    75 Views
L

    @girish said in Peertube audio-only transcoding with HLS, AV1, VP9 Transcoding - 2024 roadmap:

    Hardware support for transcoding is a different beast though (of which I don't have much idea about)

On AMD, RDNA3-capable CPUs can do great transcoding. Contabo have AMD Ryzen 9 7900 VPSes which can do AV1 encode.

Peertube also supports runners, so you can offload transcoding to another server. There is some great new kit for video transcoding now:

    https://www.xilinx.com/applications/data-center/video-imaging/alveo-ma35d.html

  • 4 Votes
    1 Posts
    53 Views
    No one has replied
  • 3 Votes
    1 Posts
    44 Views
    No one has replied
  • 0 Votes
    1 Posts
    34 Views
    No one has replied
  • 1 Votes
    15 Posts
    231 Views
    L

    @robi said in How to run these Peertube CLI commands to move videos to object storage on Cloudron?:

    @LoudLemur If you contribute one, it can be added.

    I suppose I could make a video for it, but it would be more like a Laurel and Hardy "How not to..." than a "How to...", I think! Custard pies, planks of wood and banana skins everywhere!

  • 0 Votes
    1 Posts
    408 Views
    No one has replied
  • Peertube 6.0

    1 Votes
    3 Posts
    79 Views
girish

They have renamed a bunch of storage-related parameters, which will break existing installations. So we have to make a major release.

  • Peertube, I tried it today...

    Moved
    1 Votes
    9 Posts
    152 Views
girish

    @AartJansen Good idea, I will bump the default memory limit.

  • Regenerate video thumbnails

    Unsolved
    1 Votes
    19 Posts
    252 Views
girish

    @mdc773 said in Regenerate video thumbnails:

    i don't fully understand max upload part

On S3-compatible storage, you can upload large files, up to 5TB on AWS. It would be infeasible to upload such a large file in a single shot, so there are APIs to upload files in "parts": for example, 1,000 parts of 5GB each gets you to 5TB. The maximum number of parts and the size of each part depend on the S3 provider.

Usually, I leave the upload part size at something like 10MB-50MB. There is actually nothing wrong with a 100MB part size, but for some reason peertube is complaining about it.
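The part-count arithmetic can be sketched in a few lines of Python (the 10,000-part cap is AWS S3's limit; other providers may differ):

```python
import math

S3_MAX_PARTS = 10_000  # AWS S3 cap on parts per multipart upload
GiB = 1024 ** 3
TiB = 1024 * GiB

def parts_needed(file_size: int, part_size: int) -> int:
    """How many parts a multipart upload of file_size uses at a given part size."""
    return math.ceil(file_size / part_size)

# A 5 TiB file in 1 GiB parts stays well under the 10k-part cap:
print(parts_needed(5 * TiB, GiB))          # 5120
# A typical 2 GiB video with a 50 MiB part size:
print(parts_needed(2 * GiB, 50 * 2**20))   # 41
```

This also shows why very small part sizes can be a problem for huge files: the part count, not the total size, is usually what hits the provider's limit first.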

  • Peertube and S3/Minio Objectstorage

    2 Votes
    41 Posts
    2k Views
benborges

I wish it could be used like the LAMP containers, with Apache handling the file access. I'm using it a lot in that context and it works perfectly! But yeah, I get it 🙂

  • 0 Votes
    8 Posts
    175 Views
jdaviescoates

Just to say, I think the issue here was that the server just didn't have enough RAM or CPU to process the large video someone tried to upload. Things have been working nicely since moving my Cloudron to a dedicated server.

  • Error when installing

    Unsolved
    0 Votes
    11 Posts
    164 Views
mhgcic

For the moment we have just changed it to Manual DNS until we can get it sorted, as their community is terrible for support.