nichu42
Posts
-
Email forwarding with Amazon SES
You want Rainloop to redirect messages so that they appear as if they came from the original sender?
-
DNS providers offering DNSSEC (Swarm intelligence (and help) needed)
I moved from Cloudflare to Infomaniak (registrar) + Bunny.net (DNS). DNSSEC works well.
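For anyone wanting to verify their own setup: a quick dig query shows whether validation works (example.com is a placeholder for the real domain):

```bash
# Query with DNSSEC enabled; a validating resolver sets the "ad"
# (authenticated data) flag and returns RRSIG records.
dig +dnssec example.com A @1.1.1.1 | grep -E '^;; flags|RRSIG'
```
-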
Licensing
My interpretation is that you are not allowed to remove the AGPL 3.0 license / copyright notice, not the payment nag notice.
I know these tools don't always tell the truth, but this is what Gemini Pro says:
"- You can fork the project.- You can remove the payment reminders/nagging feature.
- You can make this modified version available for others to deploy.
Crucially, you must do so under the terms of the AGPL 3.0, including making the complete source code of your modified version available to those who use or deploy it.
The original author chose the AGPL 3.0, which grants these freedoms. While they can request payment or add reminders in their version, the license itself allows users to modify and share those modifications. Your actions would be a demonstration of the freedoms provided by the license."
-
Licensing
@BrutalBirdie I'm not a license expert and I don't know about Valkey, but wouldn't the AGPL-3.0 allow me to create a soft fork, i.e., fork every new version and remove the license banner? Couldn't even Cloudron do that?
-
Licensing
While I can fully understand it, I really don't like the move.
-
DMARC Visualizer
This one is actively developed and is available for Docker: https://github.com/cry-inc/dmarc-report-viewer
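Running it via Docker could look roughly like this; the image name, port, and volume layout are assumptions on my part, so check the project's README for the actual values:

```bash
# Hypothetical invocation -- image name, exposed port, and config
# volume are assumptions; see the project's README for the real ones.
docker run -d \
  -p 8080:8080 \
  -v "$(pwd)/config:/config" \
  ghcr.io/cry-inc/dmarc-report-viewer
```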
-
Run s3_media_upload script
@andreasdueren Still running it as described above. Everything is fine.
-
Installing awscli to move media to S3
I'm such a noob on the Linux command line. I now have an Ubuntu environment in Windows (WSL) and have connected the Cloudron CLI. Do you think it's possible to mount the S3 bucket in the WSL file system and pull the files via the Cloudron CLI directly into the S3 bucket? A rough sketch of what I mean is below.
I could also download a current backup and push those files to S3 from my local PC first. Sorry for thinking out loud in public, but I'm open to suggestions.
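Something like this is what I have in mind, assuming the media sits under /app/data/public/system in the app container (the remote path, app domain, and bucket name are placeholders):

```bash
# Pull the media out of the app to the local (WSL) file system,
# then push it to the bucket. The remote path is an assumption;
# adjust the app domain and bucket name to your setup.
cloudron pull --app mastodon.example.com /app/data/public/system/ ./media/
aws s3 sync ./media/ s3://my-mastodon-media/
```
-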
Installing awscli to move media to S3
The Mastodon backup takes roughly 6 hours each day. That includes the cache, though, which is limited to 7 days.
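For context, that 7-day limit is enforced with the usual media cleanup, along the lines of (run from the Mastodon app's shell, e.g. via cloudron exec):

```bash
# Drop cached remote media older than 7 days.
bin/tootctl media remove --days=7
```
-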
Installing awscli to move media to S3
@nebulon That would mean taking my public instance offline for a whole working day. Can you think of another way to solve this?
I am currently wondering whether I could rsync the files to a Hetzner Storage Box volume and then rclone them from there to S3 (rough sketch below). Run it once to prepare, then do the actual move, then repeat to sync any missing files.
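Roughly what I have in mind; user, host, paths, and the rclone remote names are placeholders:

```bash
# Hop 1: rsync the local media to a Hetzner Storage Box over SSH
# (Storage Boxes expect SSH on port 23 for rsync).
rsync -avz --progress -e 'ssh -p23' ./public/system/ \
  u123456@u123456.your-storagebox.de:media/

# Hop 2: copy from the Storage Box on to S3; "storagebox" and "s3"
# are rclone remotes set up beforehand with "rclone config".
rclone sync storagebox:media s3:my-mastodon-media

# Re-run both steps after the switch to catch any files added
# in the meantime.
```
-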
Backing up media cache
@robi Good idea, thanks. Continuing in https://forum.cloudron.io/post/105874
-
Installing awscli to move media to S3
Splitting from thread https://forum.cloudron.io/post/105859
I'm trying to install awscli in order to copy the local media files to S3.
Following the instructions at https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html, I end up with the following error when trying to start the installation:
mkdir: cannot create directory ‘/usr/local/aws-cli’: Read-only file system
Seems logical. But I'm running in circles. How do I get the files out of the container? Is there another way?
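For reference, the AWS installer accepts --install-dir and --bin-dir flags, so pointing both at a writable location such as /tmp may get around the error (untested sketch; /tmp does not survive restarts):

```bash
# Per Amazon's instructions, download and unpack the installer.
cd /tmp
curl -s "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
unzip -q awscliv2.zip

# Install into a writable location instead of the read-only /usr/local.
./aws/install --install-dir /tmp/aws-cli --bin-dir /tmp/bin
export PATH="/tmp/bin:$PATH"
aws --version
```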
-
Backing up media cache
I think the thread title might have caused some confusion. I originally started this thread because I was having issues with backups. I managed to resolve those for the most part by aggressively clearing the cache, so I postponed the S3 topic.
However, my (public) instance has continued to grow since then, and some users are posting plenty of pictures and videos. As a result, the local storage usage keeps climbing relentlessly, so now I really need to make the move to S3.
The guides linked here in the thread all mention migrating the cache folder. That's why I had intended to do that too. But honestly, I don't really care about the cache. As @nebulon rightly pointed out, it'll just get repopulated anyway. Sure, that might lead to some performance hits, but that's a secondary concern.
My main concern is really the assets that are permanently stored on my instance. I can't just leave those behind. Since there's no fallback mechanism, all existing media files would inevitably result in 404 errors: once S3 is activated, Mastodon will only look for files in the S3 storage. So uploading everything to the S3 bucket is essential (sketch below). I intended to use awscli for this, but installing it in the container following Amazon's instructions failed. It's possible I did something wrong, though. I'd be really grateful if someone could point me in the right direction here.
Thanks a lot!
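Once awscli runs, the upload itself should boil down to an s3 sync that preserves Mastodon's existing directory layout (paths, bucket name, and the ACL are assumptions based on the migration guides):

```bash
# Copy the permanently stored assets to the bucket, keeping the
# key layout Mastodon already uses locally; adjust paths, bucket
# name, and ACL to your setup.
aws s3 sync /app/data/public/system/ s3://my-mastodon-media/ --acl public-read
```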
-
Backing up media cache
Well, all the guides linked in this thread say you should do this. I want to get rid of my locally stored assets altogether.
-
Backing up media cache
Bonus question @nebulon: Would you be able to include awscli in the Mastodon package?
-
Backing up media cache
@doodlemania2 Could you please elaborate a bit more on how you managed to get awscli working inside the Cloudron container? I'm having trouble installing it (getting "file system is read-only" errors all the way).
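An alternative I'm considering is to skip Amazon's bundled installer entirely and let pip install awscli into a writable target directory (assumes python3/pip3 exist in the container; /tmp is not persistent):

```bash
# Install awscli into a writable directory instead of the
# read-only system paths.
pip3 install --target=/tmp/awscli awscli
export PYTHONPATH="/tmp/awscli:$PYTHONPATH"
export PATH="/tmp/awscli/bin:$PATH"
aws --version
```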
-
Best way to change DNS providers from Cloudflare to an EU provider?
I am currently trying bunny.net for the same reason.
-
postiz won't log out
@nebulon Still excited. This is the way!