Yeah, it makes me concerned that the package updates got messed up somewhere along the way, duplicating some content in the ENV file and removing other entries, but at least it's an easy fix (manually).
d19dotca
Posts
-
Invoice ninja server 500 internal error
Interestingly, I just ran across this issue. I was able to fix it by simply adding in the following:
PDF_GENERATOR=snappdf
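Since the broader concern here was package updates duplicating some ENV entries and dropping others, a tiny helper can make the manual fix less error-prone. This is only a sketch of the idea (the function name and the sample keys are my own, not anything Invoice Ninja or Cloudron ships): it rewrites .env-style text so a given key appears exactly once with the desired value, dropping any duplicates.

```python
# Sketch: idempotently set a key in .env-style text, dropping any
# duplicated entries for that key along the way (the sort of manual
# cleanup described above). Key names and values are illustrative.
def set_env_var(text: str, key: str, value: str) -> str:
    out, written = [], False
    for line in text.splitlines():
        if line.split("=", 1)[0].strip() == key:
            if not written:            # keep a single, updated copy
                out.append(f"{key}={value}")
                written = True
            # any further duplicates of the key are skipped
        else:
            out.append(line)           # unrelated lines pass through
    if not written:                    # key was missing entirely
        out.append(f"{key}={value}")
    return "\n".join(out) + "\n"
```

Reading the app's .env, running it through something like this, and writing it back is essentially the manual fix in script form.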
Odd though, as I see others didn't get that working and needed a value of hosted_ninja instead of snappdf. So YMMV, as they say.
-
Hetzner PTR Record Invalid
Confirmed, it seems to be much better now. Thank you so much for the quick turnaround on that! Great job!
-
Hetzner PTR Record Invalid
Hi @girish, any chance that the fix could be released today or tomorrow for those of us on 8.2.x?
Also, I'm sure you already have this tracked for the future, but I wanted to at least write down my suggestion: if possible, it may be a great time to add some more automated test cases for the email functionality, for example to make sure the DKIM signature exists in outgoing messages.
-
413 "Content Too Large" error code in Surfer when using WebDav storage with Strongbox app
The client is the Strongbox app, but I'm not sure what dependency they may be using under the hood for the WebDAV connection itself. I may open a support case with them to get more details. It's just something I thought was strange, and I wanted to see what possible restrictions may exist in the Surfer app since it was responding with a 413. I'll get more details out of Strongbox and come back here if I learn anything new. Thanks so much.
-
413 "Content Too Large" error code in Surfer when using WebDav storage with Strongbox app
I was recently testing Strongbox (a password manager on macOS and iOS) and wanted to save the KeePass database file over the Surfer app's WebDAV connection. After the file grew to a certain size, however (specifically after adding favicons to all the entries), it continually errored out with a 413 "Content Too Large" error code.
The file was approaching 7.8 MB, but was only about 6.3 MB or so in Surfer before syncing started to fail. I know I've stored larger files than that before, so my thinking is this may come down to how the POST/PUT requests work or something like that.
Is there a particular upload size limitation in the Surfer app, or in how its WebDAV support works? I checked and didn't see any such limitations, but it would be good to understand where the restriction currently is.
If anyone else has encountered this before, I'd appreciate it if you could share how you fixed it. I haven't had this issue before in Surfer and didn't see any other reports, so it may be a rare event. Any guidance would be appreciated.
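One way to pin down where a size limit kicks in is to probe the endpoint with uploads of increasing size and bisect the threshold. Below is a hedged sketch of that approach: `upload_ok` is a placeholder callable you would implement yourself (e.g. an authenticated WebDAV PUT against the Surfer URL that returns False on a 413); the endpoint, bounds, and limit value are purely illustrative.

```python
# Sketch: bisect the largest upload size a server accepts, to pinpoint
# where a 413 "Content Too Large" limit kicks in. `upload_ok` is any
# callable taking a byte count and returning True if the upload of that
# many bytes succeeded (in practice, a PUT against the WebDAV URL --
# hypothetical; substitute your own endpoint and credentials).
def max_accepted_size(upload_ok, lo: int = 0, hi: int = 32 * 1024 * 1024) -> int:
    """Largest size in [lo, hi] for which upload_ok(size) is True."""
    if not upload_ok(lo):
        raise ValueError("even the smallest probe was rejected")
    while lo < hi:
        mid = (lo + hi + 1) // 2  # round up so the loop terminates
        if upload_ok(mid):
            lo = mid              # mid-sized upload accepted: limit is higher
        else:
            hi = mid - 1          # rejected: limit is below mid
    return lo

# Simulated server rejecting anything over ~6.3 MB, as observed here:
LIMIT = 6_600_000
print(max_accepted_size(lambda n: n <= LIMIT))  # -> 6600000
```

Comparing the threshold found this way against common defaults (a reverse proxy's body-size limit, for instance) can tell you which layer is returning the 413.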
-
"Default" way to change timezone?
It would be nice to have these times converted to the local time zone. When we set a timezone in Cloudron, there shouldn't (in my opinion) be a need to still use UTC-0 when setting crons in Cloudron.
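Until something like that exists, the conversion itself is easy to do by hand. A hedged sketch using Python's standard zoneinfo module, assuming (as described above) that the cron scheduler runs in UTC; note the offset shifts with DST, so a cron entry computed this way drifts by an hour across DST changes, which is exactly why doing it server-side would be nicer:

```python
# Sketch: convert a local wall-clock time to the UTC hour/minute you'd
# enter in a cron field, assuming the scheduler runs in UTC as described.
# The `on` date picks which UTC offset (standard vs daylight) applies.
from datetime import datetime
from zoneinfo import ZoneInfo

def local_to_utc_cron(hour: int, minute: int, tz: str, on: datetime) -> tuple:
    local = on.replace(hour=hour, minute=minute, second=0, microsecond=0,
                       tzinfo=ZoneInfo(tz))
    utc = local.astimezone(ZoneInfo("UTC"))
    return (utc.hour, utc.minute)

# e.g. a 2:30 AM Eastern job in winter (UTC-5) lands at 07:30 UTC
print(local_to_utc_cron(2, 30, "America/Toronto", datetime(2024, 12, 19)))  # -> (7, 30)
```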
-
Hetzner PTR Record Invalid
@nebulon / @girish, I think unfortunately this is coming down to a defect in 8.2.x where outgoing messages are not being signed with DKIM.
I'm thinking we will need a patch for that as soon as possible, please, as it's having a big impact on delivery to certain mail providers (it seems to be mostly Google at the moment, but I'm sure others are affected to a degree too). In the meantime, I may need to switch to a different SMTP server / relay service temporarily.
If there's anything I can do to help, please let me know. I can offer an SSH connection into my server if you require it, too.
-
Hetzner PTR Record Invalid
The Cloudron status shows everything green in my instance. The DNS records are perfectly fine. The issue is that the Haraka SMTP service in Cloudron seems to no longer be signing the messages properly, so they are missing DKIM signatures.
-
Haraka Mailserver Logs show loopback_is_rejected errors
@nebulon I can confirm Sync DNS doesn't do anything to fix the issue. The issue isn't DNS; I think it's on the Haraka SMTP side. Please refer to the other thread too, where there are some more details. I think these two threads are effectively the same issue around DKIM not working.
-
Haraka Mailserver Logs show loopback_is_rejected errors
No VPN for me either, and I saw this same loopback_is_rejected error in my logs, which I pasted for a possibly related issue here: https://forum.cloudron.io/topic/12974/hetzner-ptr-record-invalid/12
By any chance, are you on 8.2.0 or 8.2.1?
-
Hetzner PTR Record Invalid
Last thing to add... here is a screenshot from Google Postmaster Tools which shows that the DKIM success rate went down after the upgrade to Cloudron 8.2.0, when taking into account the event dates.
It seems like Cloudron isn't signing the mail with DKIM signatures at all, as if it's been disabled or something. I think we need this patched ASAP, please.
-
Hetzner PTR Record Invalid
Not sure if related, but I do see DKIM mentioned in the logs as being replaced, and I'm unsure if this is part of the reason or not. It may be a reach, but I wanted to share this just in case:
Dec 29 21:40:15 70:M 30 Dec 2024 05:40:15.252 * Server initialized
Dec 29 21:40:15 70:M 30 Dec 2024 05:40:15.252 * Ready to accept connections tcp
Dec 29 21:40:15 doveconf: Warning: service auth { client_limit=1000 } is lower than required under max. load (1300). Counted for protocol services with service_count != 1: service managesieve-login { process_limit=100 } + service pop3-login { process_limit=500 } + service lmtp { process_limit=100 } + service imap-urlauth-login { process_limit=100 } + service imap-login { process_limit=500 }
Dec 29 21:40:15 doveconf: Warning: service anvil { client_limit=1000 } is lower than required under max. load (1203). Counted with: service managesieve-login { process_limit=100 } + service pop3-login { process_limit=500 } + service imap-urlauth-login { process_limit=100 } + service imap-login { process_limit=500 } + service auth { process_limit=1 }
Dec 29 21:40:15 Warning: service auth { client_limit=1000 } is lower than required under max. load (1300). Counted for protocol services with service_count != 1: service managesieve-login { process_limit=100 } + service pop3-login { process_limit=500 } + service lmtp { process_limit=100 } + service imap-urlauth-login { process_limit=100 } + service imap-login { process_limit=500 }
Dec 29 21:40:15 Warning: service anvil { client_limit=1000 } is lower than required under max. load (1203). Counted with: service managesieve-login { process_limit=100 } + service pop3-login { process_limit=500 } + service imap-urlauth-login { process_limit=100 } + service imap-login { process_limit=500 } + service auth { process_limit=1 }
Dec 29 21:40:15 loaded TLD files:
Dec 29 21:40:15 1=1445
Dec 29 21:40:15 2=8416
Dec 29 21:40:15 3=3642
Dec 29 21:40:15 loaded 9773 Public Suffixes
Dec 29 21:40:15 Mail service endpoint listening on http://:::3000
Dec 29 21:40:15 loglevel: INFO
Dec 29 21:40:15 log format: DEFAULT
Dec 29 21:40:15 Starting up Haraka version 3.0.5
Dec 29 21:40:15 [INFO] [-] [plugins] loading delay_deny
Dec 29 21:40:15 [INFO] [-] [plugins] loading dns-list
Dec 29 21:40:15 [INFO] [-] [plugins] loading helo.checks
Dec 29 21:40:15 [INFO] [-] [plugins] loading headers
Dec 29 21:40:15 [INFO] [-] [plugins] loading tls
Dec 29 21:40:15 [INFO] [-] [core] loading tls.ini
Dec 29 21:40:15 [INFO] [-] [plugins] loading spf
Dec 29 21:40:15 [INFO] [-] [plugins] loading cloudron
Dec 29 21:40:15 [INFO] [-] [plugins] loading rcpt_to.in_host_list
Dec 29 21:40:15 [NOTICE] [-] [plugins] dkim_sign has been replaced by 'dkim'. Please update config/plugins
Dec 29 21:40:15 [INFO] [-] [plugins] loading dkim
Dec 29 21:40:15 [INFO] [-] [plugins] loading spamassassin
Dec 29 21:40:15 [INFO] [-] [plugins] loading queue/smtp_forward
Dec 29 21:40:15 [INFO] [-] [plugins] loading limit
Dec 29 21:40:15 [NOTICE] [-] [server] Listening on [::0]:2525
Dec 29 21:40:15 [INFO] [-] [server] getting SocketOpts for SMTPS server
Dec 29 21:40:15 TypeError: Cannot read properties of undefined (reading 'loopback_is_rejected')
Dec 29 21:40:15 at exports.checkZoneNegative (/app/code/haraka/node_modules/haraka-plugin-dns-list/index.js:347:22)
Dec 29 21:40:15 at exports.check_zone (/app/code/haraka/node_modules/haraka-plugin-dns-list/index.js:372:20)
Dec 29 21:40:15 at async Promise.all (index 0)
Dec 29 21:40:15 at async exports.check_zones (/app/code/haraka/node_modules/haraka-plugin-dns-list/index.js:393:5)
Dec 29 21:40:15 [INFO] [-] [dns-list] will re-test list zones every 30 minutes
Dec 29 21:40:15 [INFO] [-] [server] Creating TLS server on [::0]:2465
Dec 29 21:40:15 [NOTICE] [-] [server] Listening on [::0]:2465
Dec 29 21:40:15 [NOTICE] [-] [server] Listening on [::0]:2587
Dec 29 21:40:15 [INFO] [-] [cloudron] Initializing queue server on port 6000
Dec 29 21:40:15 [INFO] [-] [limit] connected to redis://127.0.0.1:6379/4
Dec 29 21:40:15 [INFO] [-] [outbound/queue] Loading outbound queue from /app/data/haraka-queue
Dec 29 21:40:15 [INFO] [-] [outbound/queue] Loading the queue...
Dec 29 21:40:15 [INFO] [-] [outbound/queue] [pid: undefined] 0 files in my delivery queue
Dec 29 21:40:15 [INFO] [-] [outbound/queue] [pid: undefined] 0 files in my load queue
Dec 29 21:40:15 [INFO] [-] [outbound/queue] [pid: undefined] 2 files in my temp fail queue
Dec 29 21:40:16 INFO [main] 05:40:16,091 org.apache.tika.server.core.TikaServerProcess Starting Apache Tika 3.0.0 server
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: dovecot entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: haraka entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: mail-service entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: redis entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: solr entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: spamd entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Dec 29 21:40:16 2024-12-30 05:40:16,311 INFO success: tika entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
-
Hetzner PTR Record Invalid
FWIW, I have noticed recently that my emails to Gmail addresses in particular are being rate limited, and it appears to be because Google sees them as possible spam.
When I ran some tests, I saw that my mail is no longer being DKIM-signed properly even though there have been no DNS changes. I've verified my DNS and everything looks good, but the date this started, according to Google Postmaster Tools, is December 19th. I updated Cloudron to 8.2.0 on the night of December 18th (so basically the 19th in any Eastern Time zone or UTC-0 time).
https://unspam.email/results/plVX3ZXGoX shows bad DKIM ("Existing DKIM Signature: The email is not signed with DKIM, whether or not it is a valid signature; Verified DKIM Signature: The email is not signed with a valid DKIM signature.") and a warning for DMARC ("The email "from" domain not matches the DKIM signature "from" domain.").
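The DMARC warning above is about alignment: the d= domain in the DKIM-Signature header has to match the From: domain. A quick local sanity check of raw headers can distinguish "not signed at all" from "signed but misaligned." This is only a hedged sketch (the function and regexes are my own, and it does not validate the cryptographic signature the way the testing services do):

```python
# Sketch: given a blob of raw message headers, report whether a
# DKIM-Signature header exists and whether its d= tag aligns with the
# From: domain (exact match, or From is a subdomain of d=). This does
# NOT verify the signature itself -- it only checks presence/alignment.
import re

def dkim_alignment(raw_headers: str):
    """Return (has_dkim, aligned) for a blob of raw message headers."""
    sig = re.search(r'^DKIM-Signature:(.+?)(?=^\S|\Z)', raw_headers,
                    re.MULTILINE | re.DOTALL | re.IGNORECASE)
    if not sig:
        return (False, False)   # unsigned: the failure mode reported above
    d_tag = re.search(r'\bd=([^;\s]+)', sig.group(1))
    frm = re.search(r'^From:.*?@([A-Za-z0-9.-]+)', raw_headers,
                    re.MULTILINE | re.IGNORECASE)
    if not d_tag or not frm:
        return (True, False)
    sig_domain = d_tag.group(1).lower()
    from_domain = frm.group(1).rstrip('>').lower()
    return (True, from_domain == sig_domain or
                  from_domain.endswith('.' + sig_domain))
```

Feeding it the headers of a message Gmail received (via "Show original") would return (False, False) in the unsigned case described in this thread.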
It's too coincidental that, with no manual changes being made to DNS, issues started occurring with DKIM as soon as I upgraded Cloudron to 8.2.0, so I am of the mindset that something is wrong in Cloudron.
Here's a screenshot of status from Google Postmaster Tools:
Side note: I think the DNS records part about the missing PTR record has been fixed now, but Postmaster Tools hasn't updated yet. It turned out this issue exposed a missing PTR on my IPv6 record (it was set correctly for IPv4).
I should also add that Cloudron shows no errors at all when it runs the DNS checks; it's all green. I also ran host commands and see the PTR records just fine. So I think the issue at this point is more related to DKIM in Cloudron.
% host -t PTR <ip_address>
<ip_address>.in-addr.arpa domain name pointer mail.d19.ca.
% host -t PTR <ipv6_address>
<ipv6_address>.ip6.arpa domain name pointer mail.d19.ca.
Just FYI, @girish .
-
Which Domain Name Registrars do you recommand in 2025 ?
I use OVHcloud as it's one of the few that invoice in my local currency, so it helps with exchange rates. With the low Canadian dollar against the USD, it's even more helpful in saving me money. Plus, it has the nice feature of allowing zone file editing directly.
-
Tika errors in Mail indexing
Another example here too (a much longer one):
Dec 22 23:24:14 WARN [qtp24334184-588] 07:24:14,205 org.apache.tika.server.core.resource.TikaResource tika/: Text extraction failed (=?utf-8?B?Tm8gMjEgeGlhb3FpbmcgemhhbmcuZG9jeA==?=)
Dec 22 23:24:14 org.apache.tika.exception.TikaException: TIKA-198: Illegal IOException from org.apache.tika.parser.pkg.PackageParser@6166fb02
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:304) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:204) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.server.core.resource.TikaResource.parse(TikaResource.java:363) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.server.core.resource.TikaResource.lambda$produceText$1(TikaResource.java:509) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.jaxrs.provider.BinaryDataProvider.writeTo(BinaryDataProvider.java:176) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.jaxrs.utils.JAXRSUtils.writeMessageBody(JAXRSUtils.java:1651) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.serializeMessage(JAXRSOutInterceptor.java:249) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.processResponse(JAXRSOutInterceptor.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.handleMessage(JAXRSOutInterceptor.java:84) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.interceptor.OutgoingChainInterceptor.handleMessage(OutgoingChainInterceptor.java:90) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:265) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:244) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:80) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:223) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1381) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:178) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1303) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:149) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.Server.handle(Server.java:563) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.runTask(AdaptiveExecutionStrategy.java:421) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.consumeTask(AdaptiveExecutionStrategy.java:390) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.tryProduce(AdaptiveExecutionStrategy.java:277) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.run(AdaptiveExecutionStrategy.java:199) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:411) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
Dec 22 23:24:14 Caused by: org.apache.commons.io.TaggedIOException: invalid code lengths set
Dec 22 23:24:14 at org.apache.commons.io.input.TaggedInputStream.handleIOException(TaggedInputStream.java:93) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:228) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.gagravarr.tika.OggDetector.detect(OggDetector.java:68) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:179) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.DelegatingParser.parse(DelegatingParser.java:71) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.extractor.ParsingEmbeddedDocumentExtractor.parseEmbedded(ParsingEmbeddedDocumentExtractor.java:111) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntry(PackageParser.java:481) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntries(PackageParser.java:386) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser._parse(PackageParser.java:336) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parse(PackageParser.java:259) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 ... 41 more
Dec 22 23:24:14 Caused by: org.apache.commons.io.TaggedIOException: invalid code lengths set
Dec 22 23:24:14 at org.apache.commons.io.input.TaggedInputStream.handleIOException(TaggedInputStream.java:93) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:228) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.gagravarr.tika.OggDetector.detect(OggDetector.java:68) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:179) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.DelegatingParser.parse(DelegatingParser.java:71) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.extractor.ParsingEmbeddedDocumentExtractor.parseEmbedded(ParsingEmbeddedDocumentExtractor.java:111) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntry(PackageParser.java:481) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntries(PackageParser.java:386) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser._parse(PackageParser.java:336) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parse(PackageParser.java:259) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 ... 41 more
Dec 22 23:24:14 Caused by: java.util.zip.ZipException: invalid code lengths set
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.readFromInflater(ZipArchiveInputStream.java:1083) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.readDeflated(ZipArchiveInputStream.java:998) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:929) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:244) ~[?:?]
Dec 22 23:24:14 at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:263) ~[?:?]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.gagravarr.tika.OggDetector.detect(OggDetector.java:68) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:179) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.DelegatingParser.parse(DelegatingParser.java:71) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.extractor.ParsingEmbeddedDocumentExtractor.parseEmbedded(ParsingEmbeddedDocumentExtractor.java:111) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntry(PackageParser.java:481) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntries(PackageParser.java:386) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser._parse(PackageParser.java:336) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parse(PackageParser.java:259) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 ... 41 more
Dec 22 23:24:14 Caused by: java.util.zip.DataFormatException: invalid code lengths set
Dec 22 23:24:14 at java.base/java.util.zip.Inflater.inflateBytesBytes(Native Method) ~[?:?]
Dec 22 23:24:14 at java.base/java.util.zip.Inflater.inflate(Inflater.java:378) ~[?:?]
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.readFromInflater(ZipArchiveInputStream.java:1081) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.readDeflated(ZipArchiveInputStream.java:998) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:929) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:244) ~[?:?]
Dec 22 23:24:14 at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:263) ~[?:?]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.commons.io.input.ProxyInputStream.read(ProxyInputStream.java:224) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.gagravarr.tika.OggDetector.detect(OggDetector.java:68) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:179) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.DelegatingParser.parse(DelegatingParser.java:71) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.extractor.ParsingEmbeddedDocumentExtractor.parseEmbedded(ParsingEmbeddedDocumentExtractor.java:111) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntry(PackageParser.java:481) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parseEntries(PackageParser.java:386) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser._parse(PackageParser.java:336) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.pkg.PackageParser.parse(PackageParser.java:259) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:24:14 ... 41 more
Dec 22 23:24:14 Dec 23, 2024 7:24:14 AM org.apache.cxf.jaxrs.utils.JAXRSUtils logMessageHandlerProblem
Dec 22 23:24:14 SEVERE: Problem with writing the data, class org.apache.tika.server.core.resource.TikaResource$$Lambda$391/0x00007f41fc2ac740, ContentType: text/plain
-
Tika errors in Mail indexing
I recently enabled Full Text Search (FTS) via Solr/Tika in Cloudron 8.2, and while it seems to be working overall, I did see quite a few instances of this stack trace in the logs too. It's likely more of a Tika issue, but I'm raising it here in case some tweaks are needed in Cloudron for it:
Dec 22 23:22:48 WARN [qtp24334184-3119] 07:22:48,741 org.apache.tika.server.core.resource.TikaResource tika/: Text extraction failed (null)
Dec 22 23:22:48 org.apache.tika.exception.TikaException: TesseractOCRParser bad exit value 1 err msg: Tesseract Open Source OCR Engine v4.1.1 with Leptonica
Dec 22 23:22:48 Error in fopenReadStream: file not found
Dec 22 23:22:48 Error in pixRead: image file not found: �
Dec 22 23:22:48 Image file � cannot be read!
Dec 22 23:22:48 Error during processing.
Dec 22 23:22:48 at org.apache.tika.parser.ocr.TesseractOCRParser.runOCRProcess(TesseractOCRParser.java:493) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.ocr.TesseractOCRParser.doOCR(TesseractOCRParser.java:447) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.ocr.TesseractOCRParser.parse(TesseractOCRParser.java:334) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.ocr.TesseractOCRParser.parse(TesseractOCRParser.java:276) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:204) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.image.AbstractImageParser.parse(AbstractImageParser.java:106) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:298) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:204) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.server.core.resource.TikaResource.parse(TikaResource.java:363) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.tika.server.core.resource.TikaResource.lambda$produceText$1(TikaResource.java:509) ~[tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.jaxrs.provider.BinaryDataProvider.writeTo(BinaryDataProvider.java:176) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.jaxrs.utils.JAXRSUtils.writeMessageBody(JAXRSUtils.java:1651) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.serializeMessage(JAXRSOutInterceptor.java:249) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.processResponse(JAXRSOutInterceptor.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.jaxrs.interceptor.JAXRSOutInterceptor.handleMessage(JAXRSOutInterceptor.java:84) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.interceptor.OutgoingChainInterceptor.handleMessage(OutgoingChainInterceptor.java:90) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:265) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:244) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:80) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:223) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1381) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:178) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1303) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:149) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.Server.handle(Server.java:563) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.runTask(AdaptiveExecutionStrategy.java:421) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.consumeTask(AdaptiveExecutionStrategy.java:390) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.tryProduce(AdaptiveExecutionStrategy.java:277) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.strategy.AdaptiveExecutionStrategy.run(AdaptiveExecutionStrategy.java:199) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:411) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [tika-server-standard.jar:3.0.0]
Dec 22 23:22:48 at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
Dec 22 23:22:48 Dec 23, 2024 7:22:48 AM org.apache.cxf.jaxrs.utils.JAXRSUtils logMessageHandlerProblem
Dec 22 23:22:48 SEVERE: Problem with writing the data, class org.apache.tika.server.core.resource.TikaResource$$Lambda$391/0x00007f41fc2ac740, ContentType: text/plain
-
Searching mail error "Server Error: UID SEARCH: Internal error" when Full Text Search (Solr) indexing is enabled, resolves when disabled
That's great, thanks @girish! It seems to work now. I enabled it and followed the steps in the new documentation you linked to, and then ran
doveadm -c /run/dovecot.conf index -A '*'
from the mail container, which started indexing everything. Just waiting for the indexing to finish to confirm it'll be good.
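For anyone following along, the reindex step can be sketched end-to-end as below. The per-user variant and the example address are illustrative assumptions, not part of the original steps:

```sh
# Run from a shell inside the Cloudron mail container.
# -c points doveadm at the running Dovecot's configuration,
# -A targets all users, and '*' matches every mailbox.
doveadm -c /run/dovecot.conf index -A '*'

# Illustrative variant (assumption): reindex a single user's
# mailboxes instead of everyone's.
# doveadm -c /run/dovecot.conf index -u user@example.com '*'
```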
What's coming in 8.2
I'm on 8.2.1 now but not understanding how the archive feature works. I don't see any way to archive an app, even with recent backups made after upgrading to 8.2.1. Am I missing something?

Sorry, I figured it out. I didn't realize it was on the Uninstall page; I thought it was part of the backups section. My bad.
-
Minor UI issue with backup file long names
I noticed there's a minor issue with the CSS (or something) on the Backups page: when copying a large file name, it overflows:
When I adjust this CSS class in the inspector as a test, it seems to look a little nicer:
.ng-binding { overflow: auto; }
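An alternative sketch would be to wrap the long file name rather than scroll it. Note this assumes the `.ng-binding` class seen in the inspector, which Angular applies to many elements, so a selector scoped to just the file-name cell would be safer in practice:

```css
/* Sketch: wrap long backup file names instead of overflowing.
   .ng-binding is broad (Angular adds it to many bindings);
   a more specific selector would avoid side effects elsewhere. */
.ng-binding {
  overflow-wrap: anywhere; /* allow breaks inside long unbroken names */
}
```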