One. The concept of generators
Two. Defining generator functions
1. Similarities and differences between the yield and return keywords
(1) Differences between yield and return
(2) Similarities between yield and return
2. A first look at generator functions
(1) What is a generator function
(2) The benefits of generator functions
Three. Generator functions: the next step
1. Two ways to get values from a generator
(1) Method 1: the next method
(2) Method 2: the send method
2. A decorator that primes the generator
3. yield from, new in Python 3
4. Review
Four. Advanced generator functions
1. Generator expressions and the various comprehensions
(1) List comprehensions
(2) Generator expressions
(3) Dictionary comprehensions
(4) Set comprehensions (results are deduplicated)
2. Summary
We know two kinds of iterators: those returned directly by calling a method (such as the native iterators a Python for loop consumes), and those obtained by calling the iter method on an iterable object. The advantage of iterators is that they save memory: they avoid the errors and memory exhaustion that can come from loading a large amount of data at once.
Sometimes we need that same memory saving but have to implement the iteration ourselves. The thing we write ourselves to provide iterator behavior is called a generator.
In short: a generator is an iterator we write ourselves.
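To make the distinction concrete, here is a minimal sketch (the function name `count_up_to` is illustrative, not from the original text) showing both kinds side by side: an iterator obtained with `iter()`, and a generator we write ourselves with `yield`:

```python
# An iterator derived from an iterable via iter()
nums = [1, 2, 3]
it = iter(nums)            # built-in iterator over the list
print(next(it))            # 1

# A generator: an iterator we write ourselves with yield
def count_up_to(n):
    """Yield 1..n one value at a time, without building a list in memory."""
    i = 1
    while i <= n:
        yield i
        i += 1

g = count_up_to(3)
print(next(g))             # 1
print(list(g))             # [2, 3] -- the first value was already consumed
```

Both objects obey the same iterator protocol; the generator just lets us define the iteration logic ourselves.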
yield exists for generators. What is a generator? Loosely speaking, a function that uses yield in the place where return would normally appear is a generator.
(1) Differences between the yield and return keywords:
They differ in how the function is used. A function uses return to hand back a value: every call runs the body and returns one freshly processed result. yield is different: calling the function produces a generator object, and you then pull values out of it with next() when you need them, strictly in order and irreversibly. You can loosely call it a "rotating container" and picture it as a waterwheel: yield first loads the data and produces the generator object (calling a function that contains the yield keyword does not give you a return value, but an iterable "generator object"), and next() releases the values.
It is like a waterwheel turning: the buckets on the wheel fill with water, and as the wheel turns, each bucket empties into the trough below, which carries the water into the channels and on into the fields. The waterwheel analogy is quite apt: every value to be produced is first placed into a bucket in order, and each later call to next is like drawing water from the trough; the water comes out in the same order it entered the wheel (first in, first out).
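The waterwheel picture can be sketched directly (the function name is illustrative): values leave a generator in exactly the order they were yielded.

```python
def waterwheel():
    # Each yield loads one "bucket" in order
    yield "bucket 1"
    yield "bucket 2"
    yield "bucket 3"

wheel = waterwheel()       # calling the function only builds the generator object
print(next(wheel))         # bucket 1 -- released in the original order
print(next(wheel))         # bucket 2
print(next(wheel))         # bucket 3
```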
(2) Similarities between the yield and return keywords:
Similarity: both return results from inside a function.
Difference: return ends the function after returning its result, while yield turns the function into a generator (an iterable object). The generator produces one value per yield statement, then the function is frozen; when woken again, it runs until the next value is produced. Summing it up with an example:
```python
def f1():
    return 'aaa'
    return 'bbb'

def f2():
    yield 'aaa'
    yield 'bbb'

print(f1())
print(f2())

for i in f2():
    print(i)
```

Output:

```
aaa
<generator object f2 at 0x000001731AFFFC10>
aaa
bbb
```
(1) What is a generator function
A function that contains the yield keyword is a generator function. yield can return values from the function for us, but yield differs from return: executing return means the program is done, whereas calling a generator function does not return a concrete value; it returns an iterable object. Each time you fetch a value from that iterable object, you drive the function forward and obtain a new return value, until the function finishes executing.
In short, a function that contains the yield keyword is a generator function.
(2) The benefits of generator functions
The advantage of a generator is that it does not build all the data in memory at once. Instead it hands you a value when you come and ask for one; if you never ask, it never produces one.
An example:
Suppose I order 2,000,000 autumn shirts. I tell the factory; the factory agrees first and only then starts producing. I can pick the shirts up one by one, or go to the factory for a batch at a time. It would make no sense for the factory to produce all 2,000,000 shirts in one go and hand them over together; by the time I got them home it would be winter...

```python
def produce():
    """Make shirts."""
    for i in range(2000000):
        yield "Produced shirt %s" % i

product_g = produce()
print(product_g.__next__())   # I want a shirt
print(product_g.__next__())   # Another shirt
print(product_g.__next__())   # Another shirt

num = 0
for i in product_g:           # A batch of shirts, say 5
    print(i)
    num += 1
    if num == 5:
        break

# Here we went to the factory and took only 8 shirts in total; we never made the
# production function (the produce generator function) produce all 2000000 at once.
# There are plenty of shirts left: we can keep taking them, or come back for more
# whenever we want.
```
```python
def generator():
    print(123)
    yield 1
    print(456)
    yield 2

g1 = generator()              # generator g1
g2 = generator()              # generator g2
print('*', g1.__next__())     # take the first value from generator g1
print('**', g1.__next__())    # take the second value from generator g1
print('***', g2.__next__())   # take the first value from generator g2
```

Output:

```
123
* 1
456
** 2
123
*** 1
```
- send fetches the next value, with almost the same effect as next
- The difference is that while fetching the next value, send also passes a piece of data to the position of the previous yield
- Notes on using send:
- Fetch the first value from a fresh generator with next
- The last yield cannot receive an external value

Summary:

send cannot be the first call on a generator; while fetching the next value, it passes a new value into the previous yield position.
```python
def generator():
    print(123)
    content = yield 1
    print('=======', content)
    print(456)
    yield 2

# send fetches the next value, with almost the same effect as next.
# The difference: while fetching the next value, it passes a piece of data
# to the previous yield position.
# Notes on using send:
#   - fetch the first value from a fresh generator with next
#   - the last yield cannot receive an external value

g = generator()
print('*', g.__next__())
ret = g.send('hello')   # send fetches a value just like next
print('***', ret)
```

Output:

```
123
* 1
======= hello
456
*** 2
```
What is a decorator that primes the generator?
Simply put, it is a generator wrapped in a decorator that performs the first next call for you (see the introduction to decorators).
An example of a priming decorator:
```python
# This decorator saves the two steps:
#   avg_g = average()
#   avg_g.__next__()

def init(func):   # func = average
    """Decorator that primes a generator function.

    :param func: a generator function
    :return: a wrapper that returns an already-primed generator
    """
    def inner(*args, **kwargs):
        g = func(*args, **kwargs)   # g = average()
        g.__next__()                # prime the generator
        return g
    return inner

@init   # average = init(average) = inner
def average():
    sum = 0
    count = 0
    avg = 0
    while 1:
        # num = yield
        num = yield avg
        sum += num
        count += 1
        avg = sum / count

avg_g = average()        # ===> inner
ret = avg_g.send(10)
print(ret)

# avg_g.__next__()
# avg1 = avg_g.send(10)
# avg2 = avg_g.send(20)
# print(avg1, avg2)
```

Output:

```
10.0
```
yield from: can be followed by a list, a generator, or a coroutine. Used together with asyncio.coroutine it defines a coroutine function; from Python 3.5 onward this role was taken over by await. When the expression after yield from performs a time-consuming IO operation, control can switch to another yield from.
Here we will keep it simple: yield from can replace the for loop a function would otherwise use to iterate over the data it returns.
```python
def generator():
    a = 'ab'
    b = '12'
    for i in a:
        yield i
    for i in b:
        yield i

g = generator()
# Get the return values from the generator
for i in g:
    print(i)
```

Output:

```
a
b
1
2
```

```python
def generator():
    a = 'cd'
    b = '34'
    yield from a   # each yield from takes a single iterable; b needs its own
    yield from b

g = generator()
# Get the return values from the generator
for i in g:
    print(i)
```

Output:

```
c
d
3
4
```
Review:
One. send:
1. The scope of send is exactly the same as next (it runs from one yield to the next yield)
2. send cannot be the first call on a generator
3. The last yield in the function cannot receive a new value
Two. A decorator that primes the generator (see the example above)
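The three send rules above can be checked directly. This sketch (the generator name `echo` is illustrative) shows that a non-None send cannot be the first call, and that a value sent after the last yield simply ends the generator with StopIteration:

```python
def echo():
    received = yield "ready"
    yield f"got {received}"

g = echo()
try:
    g.send("too early")          # rule 2: cannot send a non-None value first
except TypeError as e:
    print("TypeError:", e)

g2 = echo()
print(next(g2))                  # "ready" -- prime the generator first
print(g2.send("hello"))          # "got hello"
try:
    g2.send("again")             # rule 3: no yield left to receive this value
except StopIteration:
    print("StopIteration: generator exhausted")
```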
Templates:
[operation on each element for element in iterable]   # process every element of the traversal
[operation on each element for element in iterable if condition on element]   # with filtering

```python
egg_list = ['egg %s' % i for i in range(10)]   # this is a list comprehension
print(egg_list)

# Equivalent to:
egg_list = []
for i in range(10):
    egg_list.append('egg %s' % i)
print(egg_list)

print([i for i in range(10)])
```

Output:

```
['egg 0', 'egg 1', 'egg 2', 'egg 3', 'egg 4', 'egg 5', 'egg 6', 'egg 7', 'egg 8', 'egg 9']
['egg 0', 'egg 1', 'egg 2', 'egg 3', 'egg 4', 'egg 5', 'egg 6', 'egg 7', 'egg 8', 'egg 9']
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```
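The second template, with a filtering condition, can be sketched like this (the variable name is illustrative): only elements that pass the `if` clause are processed.

```python
# Keep only the even numbers, squaring each one that passes the filter
evens_squared = [i ** 2 for i in range(10) if i % 2 == 0]
print(evens_squared)   # [0, 4, 16, 36, 64]
```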
- A generator expression (also called a generator comprehension or generator parsing) is used very much like a list comprehension; formally, a generator expression uses parentheses as its delimiters instead of the square brackets used by a list comprehension.
- The biggest difference from a list comprehension is that the result of a generator expression is a generator object. A generator object is similar to an iterator object and is lazily evaluated: it produces new elements only when they are needed. It is more efficient than a list comprehension, takes very little space, and is especially suitable for processing big data.
- To use the elements of a generator object, you can convert it to a list or tuple as needed, traverse it with the generator object's next() method or the built-in function next(), or iterate over it directly with a for loop. However it is accessed, each element can be visited only once, from front to back; elements already consumed cannot be accessed again, and subscript access is not supported. Once all elements have been consumed, the generator object must be recreated if the elements are needed again. Iterator objects such as enumerate, filter, map, and zip share the same characteristic.
```python
g = (i for i in range(10))
print(g)   # a generator is returned
for i in g:
    print(i)
```

Output:

```
<generator object <genexpr> at 0x000002075E40FC10>
0
1
2
3
4
5
6
7
8
9
```
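The single-pass behaviour described above is easy to verify: once a generator expression's elements have been consumed, iterating again yields nothing, and subscripting is not supported.

```python
g = (i for i in range(3))
print(list(g))    # [0, 1, 2]
print(list(g))    # [] -- already exhausted; recreate the generator to iterate again

g = (i for i in range(3))
try:
    g[0]          # generators do not support subscript access
except TypeError as e:
    print("TypeError:", e)
```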
Example: swap a dictionary's keys and values
```python
dic = {'a': 97, 'b': 98}
dic_exchange = {dic[k]: k for k in dic}
print(dic_exchange)
```

Output:

```
{97: 'a', 98: 'b'}
```
```python
# Note: avoid naming the result "set", which would shadow the built-in
squares = {x ** 2 for x in [1, -1, 2, -2, 3, 4]}   # duplicates collapse: 1 and -1 both give 1
print(squares)
```

Output:

```
{16, 1, 4, 9}
```
Note that there is no tuple comprehension; to get a tuple, directly convert the result of a comprehension (or generator expression) into a tuple.
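A quick sketch of that conversion: parentheses alone give a generator expression, so a tuple is built by passing one to tuple().

```python
t = tuple(i * 2 for i in range(4))
print(t)                              # (0, 2, 4, 6)
print(type((i for i in range(4))))    # a generator object, not a tuple
```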
Summary:
One. The various comprehensions (generator, list, dictionary, set) all support:
1. Traversal operations
2. Filtering operations
Two. Lazy evaluation
Generators and iterators are both lazy:
(the practical difference is only that you can see inside a generator, because you wrote it yourself, while an iterator's internals are hidden)
1. The data in a generator can be retrieved only once; after that it is gone
2. Lazy evaluation: if you do not ask for a value, none is produced
```python
# List comprehension
sum([i for i in range(100000000)])   # large memory footprint; the machine can easily freeze

# Generator expression
sum(i for i in range(100000000))     # uses almost no memory
```

1. Changing the [] of a list comprehension to () gives a generator expression.
2. List comprehensions and generator expressions are both convenient programming idioms, but generator expressions save more memory.
3. Python uses the iterator protocol not only to make for loops more general; most built-in functions also use the iterator protocol to access objects.
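That last point is easy to see with a few built-ins: max, any, and dict all accept any object that follows the iterator protocol, so a generator expression can feed them directly without building an intermediate list.

```python
print(max(i % 7 for i in range(100)))        # largest remainder mod 7 is 6
print(any(i > 95 for i in range(100)))       # True, found without scanning everything
print(dict((i, i * i) for i in range(3)))    # {0: 0, 1: 1, 2: 4}
```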
Thank you for reading this far. In an unpublished article, Lu Xun said: "If you understand the code, you must run it yourself; only then can you truly understand and absorb it."
One last sentence: a person can succeed at anything for which he has boundless enthusiasm. Let's make progress together.