Cloudron 5.3.3 crashed for me, unclear why.
-
I saw exactly the same stack traces on one Cloudron today. Did you also see a "2020-07-02T18:28:00.273Z ERROR (node:20686) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit" somewhere? It should appear right after the code restarted.
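For background, Node prints that warning when more than the default 10 listeners for one event get attached to a single EventEmitter, which usually means a handler is added per connection and never removed. A minimal illustration (my own sketch, not box code):

```javascript
// Attaching an 11th 'error' listener to one emitter triggers the exact
// warning quoted above. setMaxListeners() only raises the threshold; the
// real fix is removing listeners when a connection closes (or sharing one).
const { EventEmitter } = require('events');

const emitter = new EventEmitter();
for (let i = 0; i < 11; i++) {
    // e.g. one handler registered per incoming connection, never removed
    emitter.on('error', (err) => console.error(err.message));
}
// => MaxListenersExceededWarning: Possible EventEmitter memory leak detected.
//    11 error listeners added. Use emitter.setMaxListeners() to increase limit
```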
I made a few fixes for this earlier today - https://git.cloudron.io/cloudron/box/-/commit/70743bd2850031d89422dbd77ef07703c5ed09af and https://git.cloudron.io/cloudron/box/-/commit/d1ff8e9d6bdd3f67df3295603b6f6438858b5b65 . Since I made those changes, that Cloudron has not seen any errors. We will probably make a 5.3.4 release with this fix.
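The gist (a hypothetical sketch of the kind of guard involved, not the literal commit contents - see the commits above for the real changes) is to stop writing LDAP responses to sockets the client has already closed:

```javascript
// Hypothetical sketch, NOT the actual commits: before writing an LDAP
// result, check whether the client socket is still usable. Writing after
// the peer sent FIN is what raises the writeAfterFIN / EPIPE errors below.
function sendIfWritable(socket, data) {
    if (!socket || socket.destroyed || !socket.writable) {
        return false; // peer already closed the connection; drop the response
    }
    socket.write(data);
    return true;
}
```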
Finally, the "Too many concurrent connections" issue is something else. Let me know if you see it again.
-
The server held up fine overnight, so we will push out a new 5.3.4 with the fix.
-
@girish Sounds great, because one of my clients just informed me of this issue again too. Unfortunately, my alerting doesn't seem to catch it; I suspect the requests are served fine for the first while, just really slowly. It almost seems like a performance issue for the first few hours, and then it breaks and needs the restart, so I will have to improve my alerting down the road to find these types of issues.
-
@d19dotca Do you see the same errors in the logs? For some reason, they don't appear on most servers (like our own Cloudron installations).
-
@girish I just double-checked and yes it seems I do. For example:
2020-07-03T17:33:41.781Z box:ldap 172.18.0.40:48204 unexpected connection error { Error: This socket has been ended by the other party
    at Socket.writeAfterFIN [as write] (net.js:396:12)
    at SearchResponse.LDAPResult.end (/home/yellowtent/box/node_modules/ldapjs/lib/messages/result.js:58:21)
    at sendError (/home/yellowtent/box/node_modules/ldapjs/lib/server.js:401:22)
    at /home/yellowtent/box/node_modules/ldapjs/lib/server.js:418:11
    at /home/yellowtent/box/src/ldap.js:262:70
    at Query.<anonymous> (/home/yellowtent/box/src/mailboxdb.js:200:46)
    at Query.<anonymous> (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:526:10)
    at Query._callback (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:488:16)
    at Query.Sequence.end (/home/yellowtent/box/node_modules/mysql/lib/protocol/sequences/Sequence.js:83:24)
    at Query._handleFinalResultPacket (/home/yellowtent/box/node_modules/mysql/lib/protocol/sequences/Query.js:149:8)
  code: 'EPIPE' }
2020-07-03T17:35:01.509Z box:ldap 172.18.0.40:48312 unexpected connection error { Error: This socket has been ended by the other party
    at Socket.writeAfterFIN [as write] (net.js:396:12)
    at BindResponse.LDAPResult.end (/home/yellowtent/box/node_modules/ldapjs/lib/messages/result.js:58:21)
    at /home/yellowtent/box/src/ldap.js:633:25
    at AsyncWrap.<anonymous> (/home/yellowtent/box/src/users.js:278:21)
    at AsyncWrap.wrap.ondone (internal/crypto/pbkdf2.js:36:48)
  code: 'EPIPE' }
2020-07-03T17:39:57.697Z box:ldap 172.18.0.40:51824 unexpected connection error { Error: This socket has been ended by the other party
    at Socket.writeAfterFIN [as write] (net.js:396:12)
    at SearchResponse.send (/home/yellowtent/box/node_modules/ldapjs/lib/messages/search_response.js:88:21)
    at /home/yellowtent/box/src/ldap.js:111:17
    at Array.forEach (<anonymous>)
    at finalSend (/home/yellowtent/box/src/ldap.js:110:17)
    at /home/yellowtent/box/src/ldap.js:281:17
    at Query.<anonymous> (/home/yellowtent/box/src/mailboxdb.js:202:13)
    at Query.<anonymous> (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:526:10)
    at Query._callback (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:488:16)
    at Query.Sequence.end (/home/yellowtent/box/node_modules/mysql/lib/protocol/sequences/Sequence.js:83:24)
  code: 'EPIPE' }
2020-07-03T17:39:57.710Z box:ldap 172.18.0.40:51824 unexpected connection error { Error: This socket has been ended by the other party
    at Socket.writeAfterFIN [as write] (net.js:396:12)
    at SearchResponse.LDAPResult.end (/home/yellowtent/box/node_modules/ldapjs/lib/messages/result.js:58:21)
    at finalSend (/home/yellowtent/box/src/ldap.js:116:9)
    at /home/yellowtent/box/src/ldap.js:281:17
    at Query.<anonymous> (/home/yellowtent/box/src/mailboxdb.js:202:13)
    at Query.<anonymous> (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:526:10)
    at Query._callback (/home/yellowtent/box/node_modules/mysql/lib/Connection.js:488:16)
    at Query.Sequence.end (/home/yellowtent/box/node_modules/mysql/lib/protocol/sequences/Sequence.js:83:24)
    at Query._handleFinalResultPacket (/home/yellowtent/box/node_modules/mysql/lib/protocol/sequences/Query.js:149:8)
  code: 'EPIPE' }
And yes, I now also see the MaxListeners error you mentioned:
2020-07-03T17:40:02.373Z ERROR (node:13987) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit [ internal/process/warning.js:25:20 ]
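For anyone reading along, the EPIPE entries above come from Node's writeAfterFIN guard; a minimal script like this (my own sketch, not Cloudron code, matching the Node version shown in the traces - newer Node reports the write-after-end case with a different error) reproduces the same error:

```javascript
// Minimal repro of the EPIPE above: the client half-closes the connection,
// then the server writes to it anyway.
const net = require('net');

const server = net.createServer((socket) => {
    socket.on('error', (err) => console.error(err.message, err.code)); // EPIPE
    socket.on('end', () => {
        // The peer sent FIN and allowHalfOpen is false, so this write lands in
        // net.js writeAfterFIN: "This socket has been ended by the other party".
        socket.write('too late\n');
        server.close();
    });
});

server.listen(0, () => {
    const client = net.connect(server.address().port, () => client.end());
});
```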
-
@d19dotca Thanks. The new release is going through CI and should be available soon.
-
@girish Thank you. Are we talking days, though, or maybe just hours? I am asking only because this seems to be happening daily now and I have to issue credits to my customers for the outages. Basically, I just want to know whether I should temporarily set up a reboot cron every 12 hours or so, or just wait for the update to come later today.
-
@d19dotca I have pushed it now. Can you update and keep me posted on how it's working? I have updated our own servers and will keep monitoring it over the weekend and make the release available for all next week.
-
@girish Sounds good, I see the update. I will install it in a few hours (just need to wait for the "business day" to be done so it has the least impact). I'll monitor it and see what becomes of it. Thanks a lot for the quick help, Girish!
I suppose I could install an agent in the future to look for certain errors and email me, eh? Just thinking about how to improve my alerting for these types of issues going forward.
-
@d19dotca You can set up Statping, which @fbartels recently packaged, on another server: https://cloudron.io/store/com.statping.cloudronapp.html . You can poll
https://my.<domain>/api/v1/cloudron/status
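If you want something tiny and self-hosted instead of a full Statping install, a minimal poller along these lines would also work (my sketch; my.example.com and the notify() stub are placeholders to fill in):

```javascript
// Minimal status poller sketch: hit the Cloudron status endpoint once a
// minute and alert when it stops answering with 200. Placeholders:
// my.example.com is your dashboard domain, notify() is whatever alerts you.
const https = require('https');

const STATUS_URL = 'https://my.example.com/api/v1/cloudron/status';

function notify(reason) {
    console.error(new Date().toISOString(), 'status check failed:', reason); // stub
}

function check() {
    const req = https.get(STATUS_URL, { timeout: 10000 }, (res) => {
        res.resume(); // drain the body; only the status code matters here
        if (res.statusCode !== 200) notify(`HTTP ${res.statusCode}`);
    });
    req.on('timeout', () => req.destroy(new Error('request timed out')));
    req.on('error', (err) => notify(err.message));
}

setInterval(check, 60 * 1000);
check();
```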
-
@girish Funny enough, I was actually already using that, but only for testing, so it was still on the same server. I temporarily disabled it, thinking it might be causing the issue, since it was set up to poll a few things on the server; when I saw the "too many concurrent connections" errors, I wondered if Statping was to blame, because this issue started around the same time I set it up. But I now think the real culprit was 5.3.3: the issue began the day after that version was installed, I believe, and I set up Statping on Monday or so, before the update.
-
@girish - I just updated my server a bit ago, so we'll see how it holds up. I'll try to monitor it as best I can and will report back.
Out of curiosity, I find it odd that you and I both saw the issue on our own servers around the same day. Was this caused by 5.3.3 then, or by something different?