Official website: https://docs.python-requests.org/en/latest/user/install/
Requests is an HTTP library for Python, built on urllib and released under the Apache2 License. Compared with urllib, Requests is more convenient and saves a lot of work.
Use the requests package to send HTTP requests and get response data.
Usage details of the requests library in Python
Reference URL: https://zhuanlan.zhihu.com/p/137649301
Install command: pip install requests
Import it into your project: import requests
Quick start, straight from the official website:
import requests
r = requests.get('https://api.github.com/events')
r = requests.post('https://httpbin.org/post', data={
'key': 'value'})
r = requests.put('https://httpbin.org/put', data={
'key': 'value'})
r = requests.delete('https://httpbin.org/delete')
r = requests.head('https://httpbin.org/get')
r = requests.options('https://httpbin.org/get')
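Each of the calls above returns a Response object carrying the status and body. A minimal sketch of reading it back (httpbin.org is just the example host used above; it echoes the request data, so args mirrors the query parameters we sent):

```python
import requests

# Send a GET with query parameters; httpbin.org echoes them back as JSON.
r = requests.get('https://httpbin.org/get', params={'key': 'value'})

print(r.status_code)              # numeric HTTP status, e.g. 200
print(r.headers['Content-Type'])  # response headers behave like a dict
print(r.text)                     # body decoded as text
data = r.json()                   # body parsed as JSON (raises ValueError if not JSON)
print(data['args'])               # httpbin echoes the query parameters here
```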
Use the timeout parameter to set the number of seconds to wait for a connection; if the wait times out, Requests raises an exception:
>>> requests.get('http://github.com', timeout=0.001)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
requests.exceptions.Timeout: HTTPConnectionPool(host='github.com', port=80): Request timed out. (timeout=0.001)
>>> requests.get('https://www.baidu.com', timeout=0.5)
<Response [200]>
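Rather than letting the exception propagate as in the traceback above, a program can catch it. A minimal sketch, assuming github.com cannot complete a connection within 0.001 seconds:

```python
import requests

# requests.exceptions.Timeout covers both connect and read timeouts,
# so one except clause handles either kind of stall.
try:
    r = requests.get('https://github.com', timeout=0.001)
    print(r.status_code)
except requests.exceptions.Timeout:
    print('request timed out')
```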
Problem background:
When opening an HTTPS connection with response = requests.get(url), the following error occurred:
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)>
The solution is to add verify=False:
response= requests.get(url=link, verify=False).json()
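Note that verify=False disables TLS certificate validation, so it should only be used for hosts you trust (e.g. internal services with self-signed certificates), and each such request emits an InsecureRequestWarning. A sketch that also silences that warning (httpbin.org stands in for the target URL):

```python
import requests
import urllib3

# Skipping certificate validation triggers an InsecureRequestWarning
# on every request; disable it explicitly so logs stay clean.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

data = requests.get('https://httpbin.org/get', verify=False, timeout=10).json()
print(data['url'])
```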
Problem background:
When accessing a non-HTTP port, the request fails with: raise ConnectionError(err, request=request)
Solution:
Catch and handle the exception. A demo follows:
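One way to sketch that exception handling (the safe_get helper and the localhost port-1 probe are illustrative choices, not part of the original text; port 1 is almost never serving HTTP, so it exercises the ConnectionError path):

```python
import requests

def safe_get(url):
    """Return the response, or None if the connection fails."""
    try:
        return requests.get(url, timeout=3)
    except requests.exceptions.ConnectionError as err:
        print(f'connection failed: {err}')
        return None

# Hitting a port that is not speaking HTTP raises ConnectionError,
# which safe_get turns into a None return instead of a crash.
safe_get('http://127.0.0.1:1/')
```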