Cloudron Forum

balticpenguin (@balticpenguin)

Posts: 15 · Topics: 2 · Shares: 0 · Groups: 0 · Followers: 0 · Following: 0

Posts


  • Backup failed ERR_CHILD_PROCESS_STDIO_MAXBUFFER
    balticpenguin

    Hi,

    my last backup failed with:

    saveFsMetadata errored with code ERR_CHILD_PROCESS_STDIO_MAXBUFFER message stdout maxBuffer length exceeded

    I tried it again, but it crashes with the same error every time.
    Can anyone tell me what is going wrong and how I can fix it?

    Support backups

  • Jupyterhub - single-user notebook server shuts down
    balticpenguin

    I would like to use the Jupyter Scheduler extension.

    https://jupyter-scheduler.readthedocs.io/en/latest/users/index.html#installation

    The installation works fine.
    I then set up a job that runs once a day; that also works without any problems.

    However, I have noticed one issue: if I don't use the app for a few days, my notebook server is shut down, so the job will not run again until I start the notebook server manually.

    Is there a way to prevent the notebook server from shutting down in the JupyterHub settings?

    JupyterHub
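The shutdown described above is typically JupyterHub's idle culler stopping inactive single-user servers. A minimal sketch of how the culling timeout could be extended in jupyterhub_config.py, assuming the stock jupyterhub-idle-culler service is what stops the servers (whether and where the Cloudron package exposes this configuration is an assumption):

```python
# jupyterhub_config.py -- sketch, assuming the stock jupyterhub-idle-culler
# service is what shuts down idle single-user servers.
import sys

# Grant the culler service the scopes it needs to inspect and stop servers.
c.JupyterHub.load_roles = [
    {
        "name": "jupyterhub-idle-culler-role",
        "scopes": ["list:users", "read:users:activity", "read:servers", "delete:servers"],
        "services": ["jupyterhub-idle-culler-service"],
    }
]

c.JupyterHub.services = [
    {
        "name": "jupyterhub-idle-culler-service",
        "command": [
            sys.executable, "-m", "jupyterhub_idle_culler",
            "--timeout=604800",  # cull only after 7 days idle instead of the default
        ],
    }
]
```

Removing the culler service entirely would keep notebook servers running indefinitely, at the cost of idle containers holding memory and CPU.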

  • Backup failed ERR_CHILD_PROCESS_STDIO_MAXBUFFER
    balticpenguin

    We are on version v7.7.2.

    Most of the files are probably located within Nextcloud. Counting the user directories, we have around 165,000 files under /app/data/<user>/files/, totalling about 160 GB.

    How can I change the buffer size? Is there a command, or do I have to change something directly in a file?

    Support backups

  • Accessing mounted volumes in jupyter notebooks
    balticpenguin

    In the settings of Jupyterhub there are
    c.DockerSpawner.volumes and
    c.SwarmSpawner.read_only_volumes
    (https://jupyterhub-dockerspawner.readthedocs.io/en/latest/api/index.html)

    Would this perhaps be a possibility? Currently, however, these config settings do not work.

    I have daily files with data that I need to read and edit, so having to upload them manually would be very inconvenient. My colleagues also work with the data, so it would be great if there were a "shared" directory in the home directory, e.g. via a setting in customconfig.py like
    c.DockerSpawner.volumes = {'/media/shareddata' : '/home/jovyan/shared'}

    see also https://jupyterhub-dockerspawner.readthedocs.io/en/latest/data-persistence.html#volume-mapping

    JupyterHub
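The mapping asked for above matches how DockerSpawner documents volume mounts. A minimal jupyterhub_config.py sketch, assuming DockerSpawner is the active spawner and /media/shareddata is the (hypothetical) host path of the mounted volume:

```python
# jupyterhub_config.py -- sketch, assuming DockerSpawner and a hypothetical
# host path /media/shareddata that should appear in every user's container.

# Read-write mount: host path (or named volume) : path inside the container.
c.DockerSpawner.volumes = {
    "/media/shareddata": "/home/jovyan/shared",
}

# Read-only variant, if the data should only be writable from outside:
c.DockerSpawner.read_only_volumes = {
    "/media/shareddata": "/home/jovyan/shared-ro",
}
```

Note that these mappings apply to the spawned single-user containers, not to the hub container itself, so the host path has to be reachable by the Docker daemon that starts them.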

  • Accessing mounted volumes in jupyter notebooks
    balticpenguin

    @girish I have tested it and it works: I can share data with other users. But is it possible to access the mounted volume from the user container? The mounted volume is only accessible in the Jupyterhub container, not in the user containers.

    JupyterHub

  • Accessing mounted volumes in jupyter notebooks
    balticpenguin

    If you want to share a volume with Jupyterlab, this is unfortunately not enough. Each user gets their own Jupyterlab container from a notebook image, by default jupyter/datascience-notebook. No media group is created in this container, and there is no cloudron user either.
    Since I build my own image anyway, I could simply create the media group and add the jovyan user to it. Now it works.

    JupyterHub