Coroutines in Python are very similar to generators, with one subtle difference: a generator produces values, while a coroutine consumes them.
First, let's review how generators are created. We can create a generator like this:
def fib():
a, b = 0, 1
while True:
yield a
a, b = b, a+b
We then typically use it in a for loop like this:
for i in fib():
    print(i)
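Note that fib() is an infinite generator, so a plain for loop over it never terminates on its own. One way to take just a finite slice is itertools.islice (a quick sketch):

```python
from itertools import islice

def fib():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take only the first ten values from the infinite generator.
first_ten = list(islice(fib(), 10))
print(first_ten)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```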
This is not only fast, it is also easy on memory, because the values we need are generated on the fly instead of being stored in a list. More generally, if we use yield in the example above to receive values rather than produce them, we get a coroutine. A coroutine consumes values that are sent to it. A grep implemented in Python is a good example:
def grep(pattern):
print("Searching for", pattern)
while True:
line = (yield)
if pattern in line:
print(line)
Wait! What happened to yield's return value? We have turned the function into a coroutine. It no longer produces an initial value; instead, values are passed to it from the outside. We can pass a value in through the send() method. Here's an example:
search = grep('coroutine')
next(search)
# Output: Searching for coroutine
search.send("I love you")
search.send("Don't you love me?")
search.send("I love coroutine instead!")
# Output: I love coroutine instead!
The sent value is received by the yield expression. Why do we call the next() method first? It primes the coroutine. Like a generator, the body of a coroutine does not execute when it is created; calling next() advances it to the first yield expression, and only then can it respond to send().
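Because forgetting that initial next() call is a common mistake, a popular pattern is a small priming decorator that advances every new coroutine automatically. This is a sketch, not part of the standard library:

```python
from functools import wraps

def coroutine(func):
    """Decorator that primes a coroutine by advancing it to the first yield."""
    @wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # advance to the first yield so send() works immediately
        return gen
    return primer

@coroutine
def grep(pattern):
    print("Searching for", pattern)
    while True:
        line = (yield)
        if pattern in line:
            print(line)

search = grep("coroutine")  # already primed; no explicit next() needed
search.send("I love coroutine instead!")
```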
We can close a coroutine by calling its close() method, like this:
search = grep('coroutine')
search.close()
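Under the hood, close() raises a GeneratorExit exception inside the coroutine. If the coroutine needs to clean up before it terminates, it can catch that exception (a sketch):

```python
def grep(pattern):
    print("Searching for", pattern)
    try:
        while True:
            line = (yield)
            if pattern in line:
                print(line)
    except GeneratorExit:
        # close() raises GeneratorExit at the paused yield,
        # giving the coroutine a chance to clean up before exiting.
        print("Going away. Goodbye!")

search = grep("coroutine")
next(search)
search.close()  # prints the goodbye message
```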
Function caching lets us cache a function's return value for a given set of arguments.
When an I/O-bound function is called frequently with the same arguments, caching can save a lot of time.
Before Python 3.2, we had to write a custom implementation. In Python 3.2 and later, there is an lru_cache decorator that lets us quickly cache (and clear the cache of) a function's return values.
Let's see how this is done before and after Python 3.2.
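Before Python 3.2, a custom implementation typically looked something like the dictionary-based memoize decorator below. This is a minimal sketch; the name `memoize` is illustrative, not a standard API:

```python
def memoize(func):
    """A minimal hand-rolled cache, the kind of helper one might
    write before functools.lru_cache existed (Python < 3.2)."""
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```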
Python 3.2 and later
Let's implement a Fibonacci calculator using lru_cache.
from functools import lru_cache
@lru_cache(maxsize=32)
def fib(n):
if n < 2:
return n
return fib(n-1) + fib(n-2)
print([fib(n) for n in range(10)])
Output: [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
The maxsize argument tells lru_cache the maximum number of recent return values to cache.
We can also easily clear the cached return values, like this:
fib.cache_clear()
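lru_cache also exposes cache_info(), which reports hits, misses, and the current cache size; cache_clear() resets these counters along with the cache. A quick sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(10)
print(fib.cache_info())   # hits/misses/currsize statistics
fib.cache_clear()
print(fib.cache_info())   # counters reset, cache empty again
```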