@girish Thanks so much for the great answers.
I look forward to installing the Cloudron CLI on my computer running Linux Mint.
@girish wrote "I remember mentioning NAT loopback in https://blog.cloudron.io/installing-cloudron-on-a-home-server/."
That is indeed where I learned about it, and I was diligent when I set up the old router... but forgot about it completely when setting up the new one. Always a good idea to work from docs and not just your own head.
@girish Thanks. It was really helpful that you said you didn't think the problem was with the cert. It got me thinking outside that box.
The problem was that I hadn't properly set up NAT loopback support on the new router (the config for it was different from the one on the old router).
I had to un-tick the "Stop DNS Rebind" box in order to get my site to resolve on my local computers (access from outside my network was fine, which I hadn't realized at first).
There was a big warning when I unticked that setting. Any thoughts on this? It's possible that I have some other local network or router setting that isn't quite right.
I had the same problem and searched the forum.
Got the solution here, thanks! @alkomy
Since finding where to add the white-listed IP on Namecheap can cause you to pull your hair out, I thought I'd document it here for others:
Via the UI: Account => Profile => Tools => Business & Dev Tools => Namecheap API Access (click "Manage"), then click "Edit" across from "Whitelisted IPs".
Yesterday I was having trouble with a cert. It turns out the underlying problem may have been Let's Encrypt being down for some time, which is probably a pretty rare event.
In my troubleshooting attempts I tried switching to a staging cert. It was after I made that switch that Let's Encrypt seemed to come back online, and so I got a staging cert. That was no help, since the site was actually a production site and the browser warnings are ominous.
The log message when I clicked "Renew all certs" said that no cert was issued because one already existed. I had already edited the domain and chosen "Wildcard prod", but that didn't make a difference.
In fact, deleting the domain from my.example.com/#/domains and re-adding it (also with "Wildcard prod") did nothing.
Then I SSH'd into the Ubuntu 20.04 server Cloudron runs on, went to /home/yellowtent/boxdata/certs, and removed the stale certs:
sudo rm exampleapp.com*
sudo rm _.exampleapp.com*
I went back to my.example.com/#/domains, clicked "Renew all certs", and all was good.
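For anyone wanting to confirm which cert is actually in place, the issuer line distinguishes staging from production. A sketch of the check (it generates a throwaway self-signed cert in /tmp just to demonstrate the command; on the server you would point `-in` at the relevant cert file in Cloudron's certs directory instead):

```shell
# Create a throwaway self-signed cert purely for demonstration purposes
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=demo.example.com" \
    -keyout /tmp/demo.key -out /tmp/demo.cert 2>/dev/null

# Print the issuer of the cert
openssl x509 -in /tmp/demo.cert -noout -issuer
# A Let's Encrypt staging cert carries "STAGING" in its issuer name,
# e.g. "(STAGING) Let's Encrypt"; a production cert does not.
```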
While I was in /home/yellowtent/boxdata/certs I noticed that any domain I had previously deleted still had certs there.
Is this by design? If so, why?
Also, how is one supposed to replace a staging cert with a prod one?
I'd like to not use Nextcloud's encryption-at-rest.
So I would love to be able to provide users with server logs that would show anytime a file was accessed by someone who did not own it or have share-access to it.
In other words, I want to do more than promise, "I won't access user files."
Ideas? I know there is a Linux program, auditd, but I wanted to get other folks' perspectives before I dive into it.
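For context, here is the kind of auditd rule I had in mind — a sketch only, where the data directory path and the rule key are assumptions you'd adjust for your install. The idea is to log file access in the Nextcloud data directory by any real logged-in user (auid >= 1000, i.e. an admin poking around over SSH) while ignoring the daemons themselves:

```
# Hypothetical /etc/audit/rules.d/nextcloud.rules
# Log read/write/attribute access to the (assumed) Nextcloud data dir
# by any logged-in user, skipping processes with no login uid (daemons):
-a always,exit -F dir=/path/to/nextcloud/data -F perm=rwa -F auid>=1000 -F auid!=unset -k nextcloud-files
```

Matches could then be pulled with `ausearch -k nextcloud-files`, which might be the basis of the transparent access log I'm after.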
Trying to understand your question. Are you referring to users somehow accessing the files by SSH'ing into your server? If not, how can someone access files they do not own?
My interest is in auditing admin access and making those audits transparent.