Installing awscli to move media to S3
-
Splitting from thread https://forum.cloudron.io/post/105859
I'm trying to install awscli in order to copy the local media files to S3.
Following the instructions found at https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html, I end up with the following error when trying to start the installation:
mkdir: cannot create directory ‘/usr/local/aws-cli’: Read-only file system
Seems logical. But I'm running in circles. How do I get the files out of the container? Is there another way?
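For what it's worth, the AWS CLI v2 installer accepts --install-dir and --bin-dir flags, so a sketch like the following could target a writable path instead of /usr/local (this assumes /tmp is writable inside the container):

# Download and unpack the official installer
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o /tmp/awscliv2.zip
unzip /tmp/awscliv2.zip -d /tmp
# Install into a writable location instead of the default /usr/local/aws-cli
/tmp/aws/install --install-dir /tmp/aws-cli --bin-dir /tmp/bin
/tmp/bin/aws --version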
-
@nichu42 All apps run in a mostly read-only container environment. I assume this is a one-time job where you need the aws cli. You could put the app into recovery mode; Mastodon will then not run, but the container gets a transient read-write filesystem. You can then install the aws-cli safely and copy the media cache to S3. Afterwards, disable recovery mode: your filesystem changes will be undone and the app will start up in read-only mode again.
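Once the CLI is installed to a writable path (as sketched above), the copy itself might look like the following. The bucket name and media path are assumptions here; check where your Mastodon package actually keeps its media (often under /app/data):

# Credentials for the target bucket (placeholders)
export AWS_ACCESS_KEY_ID='...'
export AWS_SECRET_ACCESS_KEY='...'
# Sync the local media cache to S3; path and bucket name are hypothetical
/tmp/bin/aws s3 sync /app/data/public/system s3://my-mastodon-media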
-
@nebulon That would mean taking my public instance offline for a whole working day. Can you think of another way to solve this?
I am currently wondering whether I could rsync the files to a Hetzner Storage Box volume and then rclone them from there to S3.
Run it once to prepare, then do the move, then repeat to sync any missing files.
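A rough sketch of that two-hop approach, with hypothetical Storage Box credentials, remote names, and paths:

# 1) Push the media to the Storage Box over SSH (user/host are placeholders)
rsync -avz --progress /app/data/public/system/ u123456@u123456.your-storagebox.de:media/
# 2) From any machine with rclone remotes configured ("storagebox" as sftp, "s3" for the bucket)
rclone sync storagebox:media s3:my-mastodon-media --progress
# 3) Repeat both steps after the switch to pick up any files added in the meantime
-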
How much media data do you have in that instance that you expect it to take a whole day?
That said, downloading the data via other means (maybe even with the Cloudron CLI) to some temporary location is an option. But then again, if you have that much data, it may not be feasible.
You could also (at your own risk) try to install the AWS CLI on the host Ubuntu system and perform the upload/sync there.
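A sketch of that host-side approach. The on-disk app data location is an assumption (Cloudron typically keeps it under /home/yellowtent/appsdata/<app-id>/data), so verify the path before syncing:

# On the Ubuntu host, at your own risk
sudo apt-get install -y awscli    # or use the official v2 installer
export AWS_ACCESS_KEY_ID='...'
export AWS_SECRET_ACCESS_KEY='...'
# App id, media path, and bucket name are placeholders
aws s3 sync /home/yellowtent/appsdata/<app-id>/data/public/system s3://my-mastodon-media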
-
I find ‘rclone’ an excellent utility for this kind of thing. Is it installed on your system?
Is it possible to use rclone or awscli locally to connect to both source and destination?
It might not be a great strategy, but it might get the job done.
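If rclone is available locally, a sketch with both ends configured as remotes (remote names and paths are hypothetical):

rclone config    # interactively set up an sftp remote "src" for the server and an s3 remote "s3"
rclone copy src:/path/to/media s3:my-mastodon-media --progress
rclone check src:/path/to/media s3:my-mastodon-media    # verify nothing is missing
-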
I'm such a noob on the Linux command line. I now have an Ubuntu environment on Windows (WSL) with the Cloudron CLI connected. Do you think it's possible to mount the S3 bucket in the WSL file system and pull the files from the app directly into the bucket via the Cloudron CLI?
I could also download a current backup and push those files to S3 from my local PC first. Sorry for thinking out loud in public, but I am open to suggestions.
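In principle that combination should work. A sketch assuming s3fs-fuse for the mount and the Cloudron CLI's pull command; the bucket name, app domain, and media path are all placeholders:

# Mount the bucket in WSL (credentials in ~/.passwd-s3fs)
mkdir -p ~/s3 && s3fs my-mastodon-media ~/s3 -o passwd_file=~/.passwd-s3fs
# Pull the app's media straight into the mounted bucket
cloudron pull --app mastodon.example.com /app/data/public/system/ ~/s3/
# Alternatively, pull to a local folder first and push with the AWS CLI
cloudron pull --app mastodon.example.com /app/data/public/system/ ./media/
aws s3 sync ./media s3://my-mastodon-media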