System environment: Ubuntu 20.04
requests is an easy-to-use HTTP library for Python. It is friendlier than the built-in urllib module, makes handling URL resources particularly convenient, and is widely used for crawler-related work.
requests installation:
pip install requests
Common methods:
requests.request()
# Constructs a request; the base method underlying all of the methods below
requests.get()
# The main method for fetching an HTML page; corresponds to HTTP GET
requests.head()
# Fetches only the headers of an HTML page; corresponds to HTTP HEAD
requests.post()
# Submits a POST request to an HTML page; corresponds to HTTP POST
requests.put()
# Submits a PUT request to an HTML page; corresponds to HTTP PUT
requests.patch()
# Submits a partial-modification request to an HTML page; corresponds to HTTP PATCH
requests.delete()
# Submits a delete request to an HTML page; corresponds to HTTP DELETE
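The CGI-style URLs used in the code below carry their parameters in the query string. Rather than concatenating strings by hand, the query can be built with the standard library's urllib.parse.urlencode; requests.get(url, params=...) does the same internally. A minimal sketch (the host is the one from the code below, and "device.cgi" is a hypothetical placeholder for the device's real CGI path):

```python
from urllib.parse import urlencode

src_ip_addr = "192.168.1.1"
params = {"action": "get", "object": "ethernet_all"}

# Hand-build the query string; requests.get(url, params=params) is equivalent.
query = urlencode(params)
# "device.cgi" is a hypothetical placeholder path.
url = f"http://{src_ip_addr}/device.cgi?{query}"
print(url)  # http://192.168.1.1/device.cgi?action=get&object=ethernet_all
```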
import requests
import json

src_ip_addr = '192.168.1.1'

# Query the device's control-IP settings (the "...." in the CGI path is elided in the original)
web_control_url = "http://" + src_ip_addr + "/....cgi?action=get&object=ethernet_all"
web_control_res = requests.get(web_control_url)
web_control_body = json.loads(web_control_res.text)['Body']['Control_IP']
control_ip_addr = web_control_body['IPv4']
control_ip_mask = web_control_body['Mask']
control_ip_gateway = web_control_body['Gateway']
vlan_id = web_control_body['VlanID']

# Query the device's destination configuration
device_config_url = "http://" + src_ip_addr + "/....cgi?action=get&object=device_config"
device_config_res = requests.get(device_config_url)
device_config_body = json.loads(device_config_res.text)['Body']
dest_ip_addr = device_config_body["DestIp"]
dest_port = device_config_body["DestPort"]

# Submit new settings as form data (the second positional argument of post() is `data`)
new_data = {'name': 'haha'}
res = requests.post(device_config_url, data=new_data)
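The responses above are parsed with json.loads(res.text); on a requests Response object, res.json() does the same thing. Since the device at 192.168.1.1 is not generally reachable, the field extraction can be exercised offline against a hand-written payload shaped like the ethernet_all response (all values below are made up):

```python
import json

# A sample payload mirroring the structure the code above expects (values are assumptions).
sample_text = json.dumps({
    "Body": {
        "Control_IP": {
            "IPv4": "192.168.1.10",
            "Mask": "255.255.255.0",
            "Gateway": "192.168.1.1",
            "VlanID": 100,
        }
    }
})

# Same extraction as above; with a real Response, res.json() replaces json.loads(res.text).
control = json.loads(sample_text)["Body"]["Control_IP"]
print(control["IPv4"], control["VlanID"])  # 192.168.1.10 100
```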
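Note that passing a dict as the data argument of requests.post() sends it form-encoded; to submit JSON instead, requests provides the json= keyword. The two encodings can be compared offline with the standard library, which mirrors what requests produces (no network involved; only the request body is shown):

```python
import json
from urllib.parse import urlencode

new_data = {"name": "haha"}

# data=new_data: requests form-encodes the dict
# (Content-Type: application/x-www-form-urlencoded)
form_body = urlencode(new_data)
print(form_body)  # name=haha

# json=new_data: requests serialises the dict as JSON
# (Content-Type: application/json)
json_body = json.dumps(new_data)
print(json_body)  # {"name": "haha"}
```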