Large JSON file upload getting nginx 413 error
-
@tkd do you have a test JSON file for us to work with? If there is non-public data in there, you can also send it to us by email at support@cloudron.io
So far I haven't found a quick way to generate an export of over 100 MB.
-
Ok, thanks, that helped. There are two memory-related issues I can reproduce:
- Memory limit on the Cloudron side: for that JSON import I had to temporarily bump the memory limit of the app instance to at least 1 GB via the Cloudron dashboard.
- Even with more memory, the import still fails with
JavaScript heap out of memory
from Node.js.
Both issues result in status code 413, which makes Ghost show the error message
Request is larger than the maximum file size the server allows
This message is misleading: it is not about the actual upload size, but about the memory needed to process the import. I don't have a solution yet, but wanted to give an update here.
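For reference, a quick way to see the effective V8 heap limit from inside the app's web terminal is the one-liner below (just a sketch, assuming node is on the PATH; it prints the limit in MB):
node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
With the default limit this prints far less than what is needed to parse a 100 MB+ JSON export in memory.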
-
Thanks. This is quite important to us to get this loaded into a Ghost instance. If anyone has any ideas or a temporary workaround I'd be happy to know - or if Ghost has another method of importing content that would work with the Cloudron package.
-
So the fix with which I was able to import this file successfully is to give the JavaScript engine used by Node.js more heap space:
node --max-old-space-size=4096 current/index.js
In this case that allows up to 4 GB. The problem is that I am not sure how to set this properly in the package, since the right value depends heavily on the memory limit configured for the whole app instance.
Hopefully we can come up with a good solution later today.
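To verify that the flag is actually picked up, the same heap-limit one-liner from above can be run with the flag (again just a sketch with a plain node binary):
node --max-old-space-size=4096 -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
That should report roughly 4096 MB instead of the default.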
-
We will make a new release which contains an env file at
/app/data/env
There the max-old-space value can be set/overridden temporarily for the import. The test JSON in this case worked with a 2 GB limit for me. The file can be changed using the file manager in the Cloudron dashboard or the web terminal. Once changed, the app has to be restarted.
Of course, first update your Ghost instance to the latest package version, 3.126.1.
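For example, assuming the package sources /app/data/env as a shell file before starting Node.js, a line like the one below would raise the heap limit to 2 GB via Node's own NODE_OPTIONS mechanism (the exact variable the package expects may differ, so check the package docs):
export NODE_OPTIONS="--max-old-space-size=2048"
After editing the file, restart the app from the dashboard so the new limit takes effect.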
-
@tkd Docs are here for the new package - https://cloudron.io/documentation/apps/ghost/#importing
-
Thanks guys, I managed to get this working with the new package.
-
Truly heroic support!
Happy to know this is covered and tested