How can I get urllib3 to try to reach a site for 10 seconds and return an error or data if it can't?
import urllib3
http = urllib3.PoolManager()
page = http.request('GET', link)
CodePudding user response:
Based on the Timeout configuration section of the
[urllib3-docs],
you can set a timeout for your request when you're using urllib3
in any of the following ways:
Timeouts can be defined as a default for a pool:
timeout = Timeout(connect=2.0, read=7.0)
http = PoolManager(timeout=timeout)
response = http.request('GET', 'http://example.com/')
Or per-request (which overrides the default for the pool):
response = http.request('GET', 'http://example.com/', timeout=Timeout(10))
Timeouts can be disabled by setting all the parameters to None:
no_timeout = Timeout(connect=None, read=None)
response = http.request('GET', 'http://example.com/', timeout=no_timeout)
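If the goal is a hard 10-second cap over the whole request (connect plus read combined), `Timeout` also accepts a `total` argument. A minimal sketch (`example.com` is just a placeholder):

```python
from urllib3 import PoolManager, Timeout

# Cap the entire request (connect + read combined) at 10 seconds.
timeout = Timeout(total=10.0)
http = PoolManager(timeout=timeout)
# response = http.request('GET', 'http://example.com/')
```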
And I think your solution is something like the code snippet below:
from urllib3 import Timeout, PoolManager
timeout = Timeout(connect=10.0, read=None)
http = PoolManager(timeout=timeout)
response = http.request('GET', 'http://example.com/')
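Since the question also asks to get back either the data or an error, here is a rough sketch that wraps the request in a try/except. The `fetch` helper name and the 10-second `total` cap are my choices, not from the docs:

```python
from urllib3 import PoolManager, Timeout
from urllib3.exceptions import HTTPError

def fetch(url):
    """Return (data, error); exactly one of the two is None."""
    # retries=False makes urllib3 raise the underlying error immediately
    # instead of retrying and wrapping it in MaxRetryError.
    http = PoolManager(timeout=Timeout(total=10.0), retries=False)
    try:
        response = http.request('GET', url)
        return response.data, None
    except HTTPError as err:  # covers timeouts and connection failures
        return None, err
```

For example, `fetch('http://example.com/')` should return the page body and `None` on success, or `None` and the exception if the site can't be reached within 10 seconds.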
CodePudding user response:
From the docs
https://urllib3.readthedocs.io/en/stable/reference/urllib3.util.html#urllib3.util.Timeout
from urllib3 import Timeout, PoolManager
timeout = Timeout(connect=2.0, read=7.0)
http = PoolManager(timeout=timeout)
response = http.request('GET', 'http://example.com/')