Problem with loading images
-
Hi, my search engine won't load images or previews. It behaves as if the internet connection were slow, but I don't have this problem with any other app. I tried restarting the app, but it's still the same. I'm also not sure whether what the graphs show is okay. Thank you very much for any information.
-
@nebulon Thank you for your reply. Here is the latest log dump:
File "/app/code/searx/network/__init__.py", line 165, in get Jul 07 16:29:02 File "/app/code/searx/network/__init__.py", line 96, in request Jul 07 16:29:02 File "/app/code/searx/network/network.py", line 247, in patch_response Jul 07 16:29:02 File "/app/code/searx/network/network.py", line 274, in call_client Jul 07 16:29:02 File "/app/code/searx/network/network.py", line 291, in request Jul 07 16:29:02 File "/app/code/searx/network/raise_for_httperror.py", line 77, in raise_for_httperror Jul 07 16:29:02 File "/app/code/searx/search/processors/online.py", line 118, in _send_http_request Jul 07 16:29:02 File "/app/code/searx/search/processors/online.py", line 149, in _search_basic Jul 07 16:29:02 File "/app/code/searx/search/processors/online.py", line 165, in search Jul 07 16:29:02 File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result Jul 07 16:29:02 File "/usr/lib/python3.10/concurrent/futures/_base.py", line 458, in result Jul 07 16:29:02 Traceback (most recent call last): Jul 07 16:29:02 raise SearxEngineTooManyRequestsException() Jul 07 16:29:02 raise self._exception Jul 07 16:29:02 raise_for_httperror(response) Jul 07 16:29:02 response = req(params['url'], **request_args) Jul 07 16:29:02 response = self._send_http_request(params) Jul 07 16:29:02 return Network.patch_response(response, do_raise_for_httperror) Jul 07 16:29:02 return await self.call_client(False, method, url, **kwargs) Jul 07 16:29:02 return future.result(timeout) Jul 07 16:29:02 return request('get', url, **kwargs) Jul 07 16:29:02 return self.__get_result() Jul 07 16:29:02 search_results = self._search_basic(query, params) Jul 07 16:29:02 searx.exceptions.SearxEngineTooManyRequestsException: Too many request, suspended_time=3600 Jul 07 16:29:18 2023-07-07 14:29:18,769 WARNING:searx.engines.google news: ErrorContext('searx/search/processors/online.py', 127, 'count_error(', None, '2 redirects, maximum: 0', ('200', 'OK', 'consent.google.com')) True Jul 07 16:38:35 2023-07-07 14:38:35,255 WARNING:searx.engines.google news: ErrorContext('searx/search/processors/online.py', 127, 'count_error(', None, '2 redirects, maximum: 0', ('200', 'OK', 'consent.google.com')) True Jul 07 16:38:35 2023-07-07 14:38:35,512 WARNING:searx.engines.solidtorrents: ErrorContext('searx/search/processors/online.py', 127, 'count_error(', None, '1 redirects, maximum: 0', ('200', 'OK', 'solidtorrents.to')) True Jul 07 16:38:49 2023-07-07 14:38:49,214 WARNING:searx.engines.google news: ErrorContext('searx/search/processors/online.py', 127, 'count_error(', None, '2 redirects, maximum: 0', ('200', 'OK', 'consent.google.com')) True Jul 07 19:11:52 2023-07-07 17:11:52,281 WARNING:searx.engines.google news: ErrorContext('searx/search/processors/online.py', 127, 'count_error(', None, '2 redirects, maximum: 0', ('200', 'OK', 'consent.google.com')) True Jul 07 19:11:53 2023-07-07 17:11:53,475 ERROR:searx.engines.btdigg: Too many requests Jul 07 19:11:53 2023-07-07 17:11:53,475 WARNING:searx.engines.btdigg: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'searx.exceptions.SearxEngineTooManyRequestsException', None, ('Too many request',)) False Jul 07 19:11:53 File "/app/code/searx/network/__init__.py", line 165, in get Jul 07 19:11:53 File "/app/code/searx/network/__init__.py", line 96, in request Jul 07 19:11:53 File "/app/code/searx/network/network.py", line 247, in patch_response Jul 07 19:11:53 File "/app/code/searx/network/network.py", line 274, in call_client Jul 07 
19:11:53 File "/app/code/searx/network/network.py", line 291, in request Jul 07 19:11:53 File "/app/code/searx/network/raise_for_httperror.py", line 77, in raise_for_httperror Jul 07 19:11:53 File "/app/code/searx/search/processors/online.py", line 118, in _send_http_request Jul 07 19:11:53 File "/app/code/searx/search/processors/online.py", line 149, in _search_basic Jul 07 19:11:53 File "/app/code/searx/search/processors/online.py", line 165, in search Jul 07 19:11:53 File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result Jul 07 19:11:53 File "/usr/lib/python3.10/concurrent/futures/_base.py", line 458, in result Jul 07 19:11:53 Traceback (most recent call last): Jul 07 19:11:53 raise SearxEngineTooManyRequestsException() Jul 07 19:11:53 raise self._exception Jul 07 19:11:53 raise_for_httperror(response) Jul 07 19:11:53 response = req(params['url'], **request_args) Jul 07 19:11:53 response = self._send_http_request(params) Jul 07 19:11:53 return Network.patch_response(response, do_raise_for_httperror) Jul 07 19:11:53 return await self.call_client(False, method, url, **kwargs) Jul 07 19:11:53 return future.result(timeout) Jul 07 19:11:53 return request('get', url, **kwargs) Jul 07 19:11:53 return self.__get_result() Jul 07 19:11:53 search_results = self._search_basic(query, params) Jul 07 19:11:53 searx.exceptions.SearxEngineTooManyRequestsException: Too many request, suspended_time=3600
-
@archos said in Problem with loading images:
Jul 07 19:11:53 searx.exceptions.SearxEngineTooManyRequestsException: Too many request, suspended_time=3600
It seems you are getting rate limited by the upstream search engine.
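For what it's worth, the suspended_time=3600 in that traceback means the engine that answered with "too many requests" is put on hold for about an hour before it is queried again, which is consistent with its results (for example image thumbnails) not showing up in the meantime. Below is a minimal sketch of that suspension idea, not the actual SearXNG code; the names RateLimitedError, EngineState, SUSPEND_SECONDS, and handle_error are invented for illustration only.

```python
import time

# Illustrative names only; this is NOT the real SearXNG implementation.
SUSPEND_SECONDS = 3600  # matches "suspended_time=3600" in the traceback above


class RateLimitedError(Exception):
    """Stands in for searx.exceptions.SearxEngineTooManyRequestsException."""
    suspended_time = SUSPEND_SECONDS


class EngineState:
    """Tracks whether a single upstream engine is temporarily suspended."""

    def __init__(self, name: str):
        self.name = name
        self.suspend_until = 0.0

    def handle_error(self, exc: Exception) -> None:
        # If the exception carries a suspension window, stop querying this
        # engine until that window has passed; while suspended, the engine
        # contributes no results at all.
        suspend = getattr(exc, "suspended_time", 0)
        if suspend:
            self.suspend_until = time.time() + suspend

    def is_suspended(self) -> bool:
        return time.time() < self.suspend_until


engine = EngineState("btdigg")
engine.handle_error(RateLimitedError())
print(engine.is_suspended())  # True for roughly the next hour
```

So even after the rate limiting stops, the affected engines stay quiet until their suspension window runs out.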
-
Checking upstream, this appears to be https://github.com/searxng/searxng/issues/2515 (also https://github.com/searxng/searxng/issues/2516).
-
@archos I updated SearXNG, which reportedly includes a fix for your situation.