v7.10.3 / 1.75.1 lost/erased all files from S3
-
I have Mattermost configured against SOS storage at Exoscale, and all - all - of my files were lost there.
As support didn't get back to me, I started googling and only found a known issue in that release, in Focalboard: https://mattermost.atlassian.net/browse/MM-53240 - which seems to have a fix: https://github.com/mattermost/focalboard/pull/4785/commits/8f97fedad7cb14a14a198f805fbf2602231e76bf
At the same time Mattermost lost connection to S3 and I can't bring it back.
I wonder if anyone has any related experience?
I'm not sure if it happened exactly after the update; it might have been the day before the update, and I wouldn't expect an update to remove all files from an S3 bucket...
-
Are the files removed from S3 itself? Or are the links just not working from Mattermost?
The Cloudron package itself of course has no code to delete files, and it has no S3 access either. Does Mattermost have any S3 file retention settings? Just wondering why it would even have code to delete files.
-
@girish both - the bucket is empty and the S3 connection is broken.
Yeah - I don't think this could be a Cloudron issue in any way; the only thing that changed in the packaging is a patch version update.
Retention policy - not on the Mattermost side, but I hadn't thought of that; thanks for a new line of thinking!
-
I had something similar happen at a different S3 provider, and after 2 months the solution was to delete that bucket and create a new one, as their infrastructure had upgraded past support for the original bucket code version, heh.
@robi wtf
?!
Could you possibly share the provider's name? And how on earth can S3 content get lost because of a version update?!
-
-
@potemkin_ai nah, too much drama
-
@potemkin_ai nope just regular backups
-
@robi who cares about backups, right?
I would appreciate it, though, if you could share (here or any other way) the provider's name; not a problem if not, but I would feel better knowing what to avoid.
-
In my specific case, Exoscale's support was quite helpful: they checked the logs of the script that calculates the bucket size, and it showed the bucket had always been empty.
After additional checks, I realized that Mattermost doesn't respect the file storage settings and doesn't upload files to S3, even when the S3 connection works; I believe something goes wrong when applying those settings, which leads Mattermost to silently fall back to local storage. I understand this should be reported upstream, but I have no capacity for that right now, so I'm just leaving this note here in case someone else encounters this issue as well.
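For anyone hitting the same thing: one quick sanity check is to read the active storage driver straight out of Mattermost's config.json. This is a minimal sketch - the sample values below are illustrative, not from my instance (FileSettings.DriverName and FileSettings.Directory are real Mattermost config keys, the rest of the values are made up). If DriverName says amazons3 in the config but new uploads keep appearing under Directory on the server's disk, the silent local-storage fallback described above is likely in play.

```python
import json

# Illustrative excerpt of a Mattermost config.json (values are examples,
# not from a real instance). In practice you would json.load() the actual
# file from the server instead of this embedded sample.
sample_config = """
{
  "FileSettings": {
    "DriverName": "amazons3",
    "Directory": "./data/",
    "AmazonS3Bucket": "my-bucket",
    "AmazonS3Endpoint": "sos-ch-gva-2.exo.io"
  }
}
"""

settings = json.loads(sample_config)["FileSettings"]

# What the config *claims* is in use:
print("configured driver:", settings["DriverName"])
# Where files land if Mattermost silently falls back to local storage:
print("local fallback dir:", settings["Directory"])
```

If the bucket stays empty while that local directory keeps growing, the files never left the box, regardless of what the System Console reports.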
-
@scooke yes, the files were stored locally all along, despite my settings specifying the opposite (S3).