The standard database interface for Python is the Python DB-API, which provides developers with a common database application programming interface. The Python database interface supports a great many databases, so you can choose the right one for your project.
You can visit the Python Database Interfaces and APIs page for a detailed list of supported databases.
Different databases require different DB-API modules: for example, to access an Oracle database and MySQL data, you need to download the Oracle and the MySQL database modules respectively.
DB-API is a specification. It defines a set of required objects and database access methods, providing a consistent access interface for a variety of underlying database systems and database interface programs.
Python's DB-API has been implemented for most databases; once you connect to a database through it, you can operate every database in the same way.
The typical Python DB-API workflow is (a minimal sketch follows this list):
Import the API module.
Acquire a connection to the database.
Execute SQL statements and stored procedures.
Close the database connection.
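A minimal sketch of these four steps, using PyMySQL as the DB-API module; the connection parameters below are placeholders, not values from the examples later in this article:

```python
import pymysql  # 1. import the DB-API module

# 2. acquire a connection (placeholder credentials)
conn = pymysql.connect(host="127.0.0.1", port=3306,
                       user="user", passwd="password", db="test")
try:
    with conn.cursor() as cursor:
        # 3. execute SQL statements (or stored procedures via cursor.callproc)
        cursor.execute("select version()")
        print(cursor.fetchone())
finally:
    # 4. close the database connection
    conn.close()
```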
There are two main ways to operate MySQL from Python:
a DB module (native SQL), such as PyMySQL or MySQLdb;
an ORM framework (a brief, optional sketch follows this list).
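For illustration only, here is a minimal sketch of the ORM route using SQLAlchemy (assuming version 1.4+ and the same USER table); this library, the model, and the engine URL below are not covered further by this article:

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()


class User(Base):
    __tablename__ = 'USER'
    id = Column(Integer, primary_key=True)
    name = Column('NAME', String(32))


# The engine URL reuses the credentials from the PyMySQL examples below
engine = create_engine('mysql+pymysql://zff:zff123@127.0.0.1:3306/zff?charset=utf8mb4')

with Session(engine) as session:
    users = session.query(User).all()
    print([u.name for u in users])
```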
2.1 The PyMySQL module
This article mainly introduces the PyMySQL module; MySQLdb is used in a similar way.
2.1.1 Installing PyMySQL
PyMySQL is a MySQL driver written in Python that lets us operate a MySQL database with the Python language.
```
pip install PyMySQL
```
2.2 Basic use
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql

# Create a connection
conn = pymysql.connect(host="127.0.0.1", port=3306, user='zff', passwd='zff123', db='zff', charset='utf8mb4')
# Create a cursor (query results are returned as tuples)
# cursor = conn.cursor()
# Create a cursor (query results are returned as dictionaries)
cursor = conn.cursor(pymysql.cursors.DictCursor)

# 1. Execute SQL and return the number of affected rows
effect_row1 = cursor.execute("select * from USER")
# 2. Execute SQL and return the number of affected rows; insert multiple rows of data
effect_row2 = cursor.executemany("insert into USER (NAME) values(%s)", [("jack"), ("boom"), ("lucy")])

# 3. Fetch all the data; with DictCursor the rows come back as dictionaries
result = cursor.fetchall()
# Inserts/deletes/updates must be committed before they are saved
conn.commit()
# Close the cursor
cursor.close()
# Close the connection
conn.close()
print(result)
"""
[{'id': 6, 'name': 'boom'}, {'id': 5, 'name': 'jack'}, {'id': 7, 'name': 'lucy'}, {'id': 4, 'name': 'tome'}, {'id': 3, 'name': 'zff'}, {'id': 1, 'name': 'zhaofengfeng'}, {'id': 2, 'name': 'zhaofengfeng02'}]
"""
```
2.3 Getting the auto-increment ID of newly inserted data
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql

# Create a connection
conn = pymysql.connect(host="127.0.0.1", port=3306, user='zff', passwd='zff123', db='zff', charset='utf8mb4')
# Create a cursor (query results are returned as tuples)
cursor = conn.cursor()

# Insert a row; the auto-increment ID of the new record is exposed via cursor.lastrowid
effect_row = cursor.executemany("insert into USER (NAME) values(%s)", [("eric")])
# Inserts/deletes/updates must be committed
conn.commit()
# Read lastrowid before closing the cursor
new_id = cursor.lastrowid
# Close the cursor
cursor.close()
# Close the connection
conn.close()
print(new_id)
"""
8
"""
```
2.4 Query operation
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql

# Create a connection
conn = pymysql.connect(host="127.0.0.1", port=3306, user='zff', passwd='zff123', db='zff', charset='utf8mb4')
# Create a cursor
cursor = conn.cursor()

cursor.execute("select * from USER")
# Fetch the first row of data
row_1 = cursor.fetchone()
# Fetch the next n rows of data
row_2 = cursor.fetchmany(3)
# Fetch all the remaining data
row_3 = cursor.fetchall()

# Close the cursor
cursor.close()
# Close the connection
conn.close()
print(row_1)
print(row_2)
print(row_3)
```
:warning: Data is fetched in order; you can use cursor.scroll(num, mode) to move the cursor position, for example:
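A minimal sketch, assuming the cursor from the query example above still holds an open result set (the offsets here are only illustrative):

```python
cursor.scroll(1, mode='relative')   # move the cursor relative to its current position
cursor.scroll(2, mode='absolute')   # move the cursor relative to the start of the result set
```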
2.5 Preventing SQL injection
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql

# Create a connection
conn = pymysql.connect(host="127.0.0.1", port=3306, user='zff', passwd='zff123', db='zff', charset='utf8mb4')
# Create a cursor
cursor = conn.cursor()

# SQL-injection-prone version (do not build SQL with string formatting/concatenation)
sql = "insert into USER (NAME) values('%s')" % ('zhangsan',)
effect_row = cursor.execute(sql)

# Correct way 1:
# execute() accepts a tuple/list as the SQL parameters;
# there is one placeholder here, so the sequence holds exactly one element
sql = "insert into USER (NAME) values(%s)"
effect_row1 = cursor.execute(sql, ['wang6'])
effect_row2 = cursor.execute(sql, ('wang7',))

# Correct way 2: named placeholders with a dict
sql = "insert into USER (NAME) values(%(name)s)"
effect_row1 = cursor.execute(sql, {'name': 'wudalang'})

# Insert multiple rows of data
effect_row2 = cursor.executemany("insert into USER (NAME) values(%s)", [('ermazi'), ('dianxiaoer')])

# Commit
conn.commit()
# Close the cursor
cursor.close()
# Close the connection
conn.close()
```
This way, SQL operations are much safer. If you need more detailed documentation, refer to the PyMySQL docs. Note that placeholder conventions differ between database modules: PyMySQL uses the C-style %s format specifier, while Python's built-in sqlite3 module uses a question mark (?). So read the documentation carefully when using other databases. See "Welcome to PyMySQL's documentation".
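To illustrate the difference, here is a minimal sketch using the standard-library sqlite3 module with the ? placeholder; the in-memory database and table below are only illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # illustrative in-memory database
cursor = conn.cursor()
cursor.execute("create table USER (id integer primary key, name text)")
# sqlite3 uses ? as the parameter placeholder instead of %s
cursor.execute("insert into USER (name) values (?)", ("zhangsan",))
conn.commit()
cursor.close()
conn.close()
```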
There is a problem with the approach above: it is fine in the single-threaded case, but the program has to create and release connections frequently to get its database work done. What happens when our program or script becomes multi-threaded? At that point we need a database connection pool to solve the problem.
3.1 The DBUtils module
DBUtils is a Python module that implements database connection pooling.
It offers two connection modes:
Mode 1: PersistentDB, which gives each thread its own dedicated, persistent connection.
Mode 2: PooledDB, which maintains a pool of connections that all threads draw from.
3.2 Mode 1: PersistentDB
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
from DBUtils.PersistentDB import PersistentDB
import pymysql

POOL = PersistentDB(
    creator=pymysql,    # database module used to create connections
    maxusage=None,      # maximum number of times a connection is reused; None means unlimited
    setsession=[],      # commands executed before starting a session, e.g. ["set datestyle to ...", "set time zone ..."]
    ping=0,             # ping the MySQL server to check whether the connection is alive:
                        # 0 = None = never, 1 = default = whenever it is requested,
                        # 2 = when a cursor is created, 4 = when a query is executed, 7 = always
    closeable=False,    # if False, conn.close() is ignored and the connection is kept for the next use;
                        # it is closed automatically when the thread ends. If True, conn.close() really
                        # closes the connection, and calling pool.connection() again raises an error
                        # (pool.steady_connection() can still obtain a new connection)
    threadlocal=None,   # thread-local object used to hold each thread's connection object
    host='127.0.0.1',
    port=3306,
    user='zff',
    password='zff123',
    database='zff',
    charset='utf8',
)


def func():
    conn = POOL.connection(shareable=False)
    cursor = conn.cursor()
    cursor.execute('select * from USER')
    result = cursor.fetchall()
    cursor.close()
    conn.close()
    return result


result = func()
print(result)
```
3.3 Mode 2: PooledDB
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import time
import pymysql
import threading
from DBUtils.PooledDB import PooledDB, SharedDBConnection

POOL = PooledDB(
    creator=pymysql,      # database module used to create connections
    maxconnections=6,     # maximum number of connections allowed in the pool; 0 and None mean no limit
    mincached=2,          # minimum number of idle connections created at initialization; 0 means none
    maxcached=5,          # maximum number of idle connections kept in the pool; 0 and None mean no limit
    maxshared=3,          # maximum number of shared connections; 0 and None mean share all.
                          # NOTE: this has no effect here, because the threadsafety of pymysql,
                          # MySQLdb, etc. is 1, so _maxshared stays 0 and connections are never
                          # actually shared; every request gets a dedicated connection.
    blocking=True,        # whether to block and wait when no connection is available in the pool;
                          # True = wait, False = raise an error instead of waiting
    maxusage=None,        # maximum number of times a connection is reused; None means unlimited
    setsession=[],        # commands executed before starting a session, e.g. ["set datestyle to ...", "set time zone ..."]
    ping=0,               # ping the MySQL server to check whether the connection is alive:
                          # 0 = None = never, 1 = default = whenever it is requested,
                          # 2 = when a cursor is created, 4 = when a query is executed, 7 = always
    host='127.0.0.1',
    port=3306,
    user='zff',
    password='zff123',
    database='zff',
    charset='utf8'
)


def func():
    # If the number of connections currently in use has reached the maximum, either block and
    # wait or raise TooManyConnections, depending on `blocking`.
    # Otherwise, preferentially take a SteadyDBConnection created during initialization,
    # wrap it in a PooledDedicatedDBConnection and return it.
    # If no initialized connection is free, create a new SteadyDBConnection, wrap it and return it.
    # Once the connection is closed, it goes back to the pool for other threads to reuse.
    conn = POOL.connection()

    # print('connection taken:', conn._con)
    # print('currently idle in the pool:', POOL._idle_cache, '\r\n')

    cursor = conn.cursor()
    cursor.execute('select * from USER')
    result = cursor.fetchall()
    conn.close()
    return result


result = func()
print(result)
```
:warning: Because the threadsafety value of pymysql, MySQLdb, etc. is 1, the connections this pool hands out are dedicated to one thread at a time rather than shared, which is what keeps it thread-safe. Without a connection pool, using pymysql to connect to the database is perfectly fine for a single-threaded application, but a multi-threaded application needs a lock; once you lock, connections inevitably queue up, and performance drops as the number of requests grows.
3.4 With a lock
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql
import threading
from threading import RLock

LOCK = RLock()
CONN = pymysql.connect(host='127.0.0.1', port=3306, user='zff', password='zff123', database='zff', charset='utf8')


def task(arg):
    with LOCK:
        cursor = CONN.cursor()
        cursor.execute('select * from USER ')
        result = cursor.fetchall()
        cursor.close()
        print(result)


for i in range(10):
    t = threading.Thread(target=task, args=(i,))
    t.start()
```
3.5 Without a lock (raises errors)
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# __author__ = "shuke"
# Date: 2018/5/13
import pymysql
import threading

CONN = pymysql.connect(host='127.0.0.1', port=3306, user='zff', password='zff123', database='zff', charset='utf8')


def task(arg):
    cursor = CONN.cursor()
    cursor.execute('select * from USER ')
    # cursor.execute('select sleep(10)')
    result = cursor.fetchall()
    cursor.close()
    print(result)


for i in range(10):
    t = threading.Thread(target=task, args=(i,))
    t.start()
```
At this point, you can check the connections on the database side with: show status like 'Threads%';
```python
# cat sql_helper.py
import pymysql
import threading
from DBUtils.PooledDB import PooledDB, SharedDBConnection

POOL = PooledDB(
    creator=pymysql,       # database module used to create connections
    maxconnections=20,     # maximum number of connections allowed in the pool; 0 and None mean no limit
    mincached=2,           # minimum number of idle connections created at initialization; 0 means none
    maxcached=5,           # maximum number of idle connections kept in the pool; 0 and None mean no limit
    # maxshared=3,         # maximum number of shared connections; has no effect because the
                           # threadsafety of pymysql/MySQLdb is 1, so connections are never shared
    blocking=True,         # whether to block and wait when no connection is available; True = wait, False = raise an error
    maxusage=None,         # maximum number of times a connection is reused; None means unlimited
    setsession=[],         # commands executed before starting a session, e.g. ["set datestyle to ...", "set time zone ..."]
    ping=0,                # ping the MySQL server to check whether the connection is alive:
                           # 0 = None = never, 1 = default = whenever it is requested,
                           # 2 = when a cursor is created, 4 = when a query is executed, 7 = always
    host='192.168.11.38',
    port=3306,
    user='root',
    passwd='apNXgF6RDitFtDQx',
    db='m2day03db',
    charset='utf8'
)


def connect():
    # Get a connection from the pool
    # conn = pymysql.connect(host='192.168.11.38', port=3306, user='root', passwd='apNXgF6RDitFtDQx', db='m2day03db')
    conn = POOL.connection()
    # Create a cursor that returns rows as dictionaries
    cursor = conn.cursor(pymysql.cursors.DictCursor)
    return conn, cursor


def close(conn, cursor):
    # Close the cursor
    cursor.close()
    # Return the connection to the pool
    conn.close()


def fetch_one(sql, args):
    conn, cursor = connect()
    # Execute the SQL and get the number of affected rows
    effect_row = cursor.execute(sql, args)
    result = cursor.fetchone()
    close(conn, cursor)
    return result


def fetch_all(sql, args):
    conn, cursor = connect()
    cursor.execute(sql, args)
    result = cursor.fetchall()
    close(conn, cursor)
    return result


def insert(sql, args):
    """
    Insert data
    :param sql: SQL with placeholders
    :return:
    """
    conn, cursor = connect()
    effect_row = cursor.execute(sql, args)
    conn.commit()
    close(conn, cursor)


def delete(sql, args):
    """
    Delete data
    :param sql: SQL with placeholders
    :return:
    """
    conn, cursor = connect()
    effect_row = cursor.execute(sql, args)
    conn.commit()
    close(conn, cursor)
    return effect_row


def update(sql, args):
    conn, cursor = connect()
    effect_row = cursor.execute(sql, args)
    conn.commit()
    close(conn, cursor)
    return effect_row
```
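A hypothetical caller module might use this helper as sketched below; the query and the inserted value are only illustrative:

```python
import sql_helper

# Parameterized query through the pooled helper
rows = sql_helper.fetch_all("select * from USER where id > %s", (0,))
print(rows)

# Insert through the pooled helper
sql_helper.insert("insert into USER (NAME) values (%s)", ("new_user",))
```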
PS: These helpers can also be encapsulated as static methods of a class, which makes them easier to reuse; a sketch follows.
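A minimal sketch of that encapsulation, assuming the connect()/close() helpers from sql_helper.py above; the class name SQLHelper is illustrative:

```python
class SQLHelper(object):
    """Thin static wrapper around the pooled helpers defined above."""

    @staticmethod
    def fetch_one(sql, args):
        conn, cursor = connect()
        cursor.execute(sql, args)
        result = cursor.fetchone()
        close(conn, cursor)
        return result

    @staticmethod
    def fetch_all(sql, args):
        conn, cursor = connect()
        cursor.execute(sql, args)
        result = cursor.fetchall()
        close(conn, cursor)
        return result


# Usage:
# row = SQLHelper.fetch_one("select * from USER where id=%s", (1,))
```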