Cloudron Forum

gotenberg process failing to start in paperless container

Paperless-ngx | Solved
ChristopherMag (#1)

    root@e40353c4-d8bb-484c-8147-aeeb639c716a:/app/code# supervisorctl
    gotenberg                        FATAL     Exited too quickly (process log may have details)
    paperless-consumer.service       RUNNING   pid 40, uptime 1:16:53
    paperless-scheduler.service      RUNNING   pid 41, uptime 1:16:53
    paperless-task-queue.service     RUNNING   pid 42, uptime 1:16:53
    paperless-webserver.service      RUNNING   pid 43, uptime 1:16:53
    tika                             RUNNING   pid 44, uptime 1:16:53
    supervisor> restart gotenberg
    gotenberg: ERROR (not running)
    gotenberg: ERROR (spawn error)
    supervisor> maintail
    1 15:07:00,261 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 15:07:03,266 INFO spawned: 'gotenberg' with pid 201
    2025-09-01 15:07:03,351 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 15:07:04,352 INFO gave up: gotenberg entered FATAL state, too many start retries too quickly
    2025-09-01 16:02:26,198 INFO reaped unknown pid 680 (exit status 1)
    2025-09-01 16:03:53,293 INFO reaped unknown pid 711 (exit status 1)
    2025-09-01 16:03:59,299 INFO reaped unknown pid 718 (exit status 1)
    2025-09-01 16:05:40,868 INFO spawned: 'gotenberg' with pid 788
    2025-09-01 16:05:40,953 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:05:41,956 INFO spawned: 'gotenberg' with pid 793
    2025-09-01 16:05:42,041 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:05:44,046 INFO spawned: 'gotenberg' with pid 798
    2025-09-01 16:05:44,126 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:05:47,132 INFO spawned: 'gotenberg' with pid 803
    2025-09-01 16:05:47,208 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:05:47,391 INFO gave up: gotenberg entered FATAL state, too many start retries too quickly
    2025-09-01 16:24:29,228 INFO spawned: 'gotenberg' with pid 959
    2025-09-01 16:24:29,310 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:24:30,312 INFO spawned: 'gotenberg' with pid 965
    2025-09-01 16:24:30,391 WARN exited: gotenberg (exit status 1; not expected)
    2025-09-01 16:24:32,396 INFO spawned: 'gotenberg' with pid 970
    2025-09-01 16:24:32,472 WARN exited: gotenberg (exit status 1; not expected)
    

    I have not yet found a way to see gotenberg's output to determine what is causing the error: the config file directs logs to stdout, but that output does not appear to be captured by supervisord:

    root@e40353c4-d8bb-484c-8147-aeeb639c716a:/app/code# cat /etc/supervisor/conf.d/gotenberg.conf 
    [program:gotenberg]
    directory=/app/code
    environment=HOME=/home/cloudron,USER=cloudron,PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
    command=gotenberg
    user=cloudron
    autostart=true
    autorestart=true
    stdout_logfile=/dev/stdout
    stdout_logfile_maxbytes=0
    stderr_logfile=/dev/stderr
    stderr_logfile_maxbytes=0
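
    One way to capture the crash message (a debugging sketch, assuming the conf inside the running container is writable; the edit is lost when the app restarts) is to point the program logs at regular files temporarily and reload:

    # temporarily swap /dev/stdout and /dev/stderr for real files
    sed -i 's|stdout_logfile=/dev/stdout|stdout_logfile=/tmp/gotenberg.out.log|; s|stderr_logfile=/dev/stderr|stderr_logfile=/tmp/gotenberg.err.log|' /etc/supervisor/conf.d/gotenberg.conf
    supervisorctl reread            # pick up the edited program section
    supervisorctl update gotenberg  # apply the change and respawn the program
    cat /tmp/gotenberg.err.log      # the startup error should now be preserved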
    

    I have tried the following to see the output of gotenberg and find the error:

    root@e40353c4-d8bb-484c-8147-aeeb639c716a:/app/code# su -s /bin/bash - cloudron
    cloudron@e40353c4-d8bb-484c-8147-aeeb639c716a:~$ cd /app/code
    cloudron@e40353c4-d8bb-484c-8147-aeeb639c716a:/app/code$ gotenberg 
    
      _____     __           __               
     / ___/__  / /____ ___  / /  ___ _______ _
    / (_ / _ \/ __/ -_) _ \/ _ \/ -_) __/ _ '/
    \___/\___/\__/\__/_//_/_.__/\__/_/  \_, / 
                                       /___/
    
    A containerized API for seamless PDF conversion.
    Version: 8.23.0
    -------------------------------------------------------
    [SYSTEM] modules: api chromium exiftool libreoffice libreoffice-api libreoffice-pdfengine logging pdfcpu pdfengines pdftk prometheus qpdf webhook 
    [FATAL] provision module api: get routers: provision module chromium: CHROMIUM_BIN_PATH environment variable is not set
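
    Supplying the variable manually confirms it is the immediate blocker. The chromium binary name here is an assumption (check what the image actually ships), and gotenberg resolves its other helper binaries the same way, so further *_BIN_PATH fatals may surface one at a time:

    # sketch: binary name is an assumption -- locate it first with command -v
    CHROMIUM_BIN_PATH="$(command -v chromium || command -v chromium-browser)" gotenberg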
    

    A line in the Dockerfile sets this environment variable, and it is visible in root's environment, but it is not included in the environment= line in /etc/supervisor/conf.d/gotenberg.conf.

    This might just be a symptom of how I launched it manually (su - starts a clean login environment), but whatever the cause, gotenberg is having a problem starting.

    I have confirmed the same behavior on the demo Cloudron instance: running supervisorctl there also shows the gotenberg process in a FATAL state.
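
    If the variable simply never reaches supervisord's children, a possible stopgap until a package fix lands is to restate it on the environment= line (the /usr/bin/chromium value below is an assumption; copy whatever the image's ENV line actually uses):

    ; in /etc/supervisor/conf.d/gotenberg.conf
    ; CHROMIUM_BIN_PATH value is an assumption -- match the image's ENV line
    environment=HOME=/home/cloudron,USER=cloudron,CHROMIUM_BIN_PATH=/usr/bin/chromium,PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

    After editing, supervisorctl reread followed by supervisorctl update gotenberg should respawn the process with the new environment.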

james marked this topic as a regular topic
james moved this topic from Support
james (Staff) (#2)

    Hello @ChristopherMag
    Thanks for reporting.
    I will have to look into it.

nebulon (Staff) (#3)

    Indeed easy to reproduce and a package regression. Working on a fix...

nebulon marked this topic as a question
nebulon (Staff) (#4)

    This should be fixed now with the latest package.
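
    A quick way to verify from a terminal into the app container (a sketch; gotenberg should now come up cleanly):

    # after updating, gotenberg should report RUNNING instead of FATAL
    supervisorctl status gotenberg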

nebulon has marked this topic as solved