2025-04-01 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
This article explains how to understand coroutines in Python. It walks through the generator mechanism that underlies them and the asyncio tools built on top, with simple, easy-to-follow examples.
We briefly introduced the concept of coroutines in our earlier article on golang's goroutines, so let's quickly review. A coroutine, also called a micro-thread (English: Coroutine), can be scheduled much like a thread. The difference is that thread startup and scheduling are handled by the operating system: creating and destroying a thread involves allocating and releasing operating system resources, which takes time. A coroutine's scheduling and destruction, by contrast, are controlled by the program itself, which makes it lighter and more flexible.
With all these advantages come some drawbacks, the biggest being that coroutines need support from the programming language itself; otherwise developers have to implement them by hand, and most languages offer no such mechanism. Go's natural, first-class support for coroutines is a large part of why it has been so widely praised and has become so popular in just a few years.
Python has a significant built-in problem: the GIL, Python's global interpreter lock. Under its restriction, a Python process can execute only one thread at a time, even on a multi-core machine. This hurts Python's performance considerably, especially for CPU-intensive work. To improve performance, many developers settled on a multi-process + coroutine approach. At first coroutines were implemented by developers themselves; the feature was then officially added in Python 3.4, so it is now fair to say that Python is a language that supports coroutines.
Generators
We also introduced generators in a previous article. Why bring them up again when introducing coroutines? Because Python's coroutines are implemented on top of generators under the hood.
The reason is simple: a coroutine needs to be able to suspend and switch, and the generator's yield keyword happens to provide exactly that ability. So the developers who built coroutine support into Python did it with generators, and if we want to understand how coroutines work in Python, we have to start from the humble generator.
A generator, as we know, is essentially a function containing the keyword yield:
```python
def test():
    n = 0
    while n < 10:
        val = yield n
        print('val = {}'.format(val))
        n += 1
```
Without the yield statement this would be an ordinary Python function. What changes when we add the line val = yield n?
Let's try to run it:
```python
# calling the test function gives us a generator
g = test()
print(next(g))
print(next(g))
print(next(g))
```
We get a result like this:

```
0
val = None
1
val = None
2
```

The 0, 1, 2 returned by next(g) are easy to understand; that is standard generator usage. The strange part is: why is val None? Shouldn't val be equal to n?
It's normal to be confused here, because this involves a new piece of generator functionality: the send method. When we put a variable name before the yield statement, it means: return the value after yield, then receive a value from the outside when resumed. That is, next(g) gives us the value after yield, while g.send() passes in a value that gets assigned to the variable before yield. For example, suppose we change the calling code to this:
```python
g = test()
print(next(g))
g.send('abc')
print(next(g))
print(next(g))
```
Running it, we find output like this:

```
0
val = abc
val = None
2
val = None
3
```

The first val printed is no longer None, but the 'abc' we just passed in with send.
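To see the whole next/send interaction in one place, here is a self-contained sketch of the same generator, annotated with the value each call returns:

```python
def test():
    n = 0
    while n < 10:
        val = yield n          # yield n out; receive a value into val
        print('val = {}'.format(val))
        n += 1

g = test()
print(next(g))        # runs to the first yield, prints 0
print(g.send('abc'))  # resumes with val = 'abc'; prints val = abc, then 1
print(next(g))        # resumes with val = None; prints val = None, then 2
```

Note that send can only be used once the generator has been started with next (or send(None)); sending a non-None value into a just-created generator raises a TypeError.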
Queue scheduling
A generator naturally suspends after each yield statement, and we can exploit this to schedule generators as coroutines. A simple queue is enough to simulate the process.
First we declare a double-ended queue. Each time, we take a task from the left end of the queue, run it until it suspends, and put it back at the right end. This polls all tasks in round-robin fashion, and the whole process involves no thread creation or destruction.
```python
from collections import deque

# a parameterized version of the earlier generator
def test(k):
    n = 0
    while n < k:
        yield n
        n += 1

class Scheduler:
    def __init__(self):
        self._queue = deque()

    def new_task(self, task):
        self._queue.append(task)

    def run(self):
        while self._queue:
            # take a task from the left end of the queue
            task = self._queue.popleft()
            try:
                # run it to its next yield, then put it back on the right
                next(task)
                self._queue.append(task)
            except StopIteration:
                pass

sch = Scheduler()
sch.new_task(test(5))
sch.new_task(test(10))
sch.new_task(test(8))
sch.run()
```
This is only a very simple scheduling scheme; combining yield from and send, we could build far more sophisticated coroutine schedulers. But there is no need to exhaust every variant. Understanding the basic mechanism is enough, since in practice we rarely implement coroutines ourselves and instead use the official standard library.
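As a taste of what yield from adds, here is a small sketch (the worker/supervisor names are just for illustration) showing that the round-robin queue above handles nested generators for free, because next() on a generator that delegates with yield from drives the sub-generator underneath:

```python
from collections import deque

log = []

def worker(name, steps):
    # a leaf task: records a step, then yields control to the scheduler
    for i in range(steps):
        log.append('{} step {}'.format(name, i))
        yield

def supervisor(name):
    # yield from delegates to a sub-generator; the scheduler only ever
    # drives the outermost generator, so nesting comes for free
    yield from worker(name, 2)
    log.append('{} done'.format(name))
    yield

queue = deque([supervisor('a'), worker('b', 3)])
while queue:
    task = queue.popleft()
    try:
        next(task)
        queue.append(task)
    except StopIteration:
        pass

print(log)
# ['a step 0', 'b step 0', 'a step 1', 'b step 1', 'a done', 'b step 2']
```

The two tasks interleave step by step, even though the scheduler has no idea that supervisor is itself driving a nested worker.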
@asyncio.coroutine
Starting with Python 3.4, we can use the @asyncio.coroutine decorator to wrap a function into a generator that executes as a coroutine. (Note: this decorator was later deprecated in Python 3.8 and removed in 3.11 in favor of async def, covered below.)
Having absorbed the coroutine concept, Python distinguishes between generators and coroutines. A function decorated with @asyncio.coroutine is called a coroutine function, and we can check for one with asyncio.iscoroutinefunction(). The generator object returned by a coroutine function is called a coroutine object, and we can check for one with asyncio.iscoroutine().
For example, if we decorate the function we just wrote and then run both checks, we get True:
```python
import asyncio

@asyncio.coroutine
def test(k):
    n = 0
    while n < k:
        yield
        print('n = {}'.format(n))
        n += 1

print(asyncio.iscoroutinefunction(test))  # True
print(asyncio.iscoroutine(test(10)))      # True
```
Now that the decorator has turned our function into a coroutine, how do we actually run it?
The better way is to use the event loop tools provided by the asyncio library. For example:
```python
loop = asyncio.get_event_loop()
loop.run_until_complete(test(10))
loop.close()
```
We create an event loop (a scheduler) with asyncio.get_event_loop and execute a coroutine object through the loop's run methods. We can use either run_until_complete or run_forever, depending on the actual usage scenario.
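For instance, run_forever keeps the loop alive until loop.stop() is called, which is the shape a long-running server wants, while run_until_complete is for one-shot jobs. A minimal sketch (using new_event_loop so it does not disturb any existing loop):

```python
import asyncio

loop = asyncio.new_event_loop()
# schedule loop.stop() to fire 50 ms after the loop starts running
loop.call_later(0.05, loop.stop)
loop.run_forever()          # returns once stop() has been called
loop.close()
print('closed:', loop.is_closed())   # closed: True
```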
async, await and Future
Starting with Python 3.5, the async and await keywords were introduced. Briefly: async def replaces @asyncio.coroutine and is used in exactly the same way, while await replaces yield from and means waiting for another coroutine to finish.
With these two changes, the code above becomes:
```python
async def test(k):
    n = 0
    while n < k:
        await asyncio.sleep(0.5)
        print('n = {}'.format(n))
        n += 1
```
Because we added await asyncio.sleep(0.5), we now wait half a second before each print. The generator-based equivalent is yield from asyncio.sleep(0.5); await is simply more intuitive and closer to the meaning of a coroutine.
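await also lets coroutines compose: one coroutine can await another and receive its return value. A minimal sketch (the double and main names are just for illustration; asyncio.run, added in Python 3.7, is a simpler alternative to the explicit loop calls shown above):

```python
import asyncio

async def double(x):
    # stand-in for some I/O-bound work
    await asyncio.sleep(0.01)
    return x * 2

async def main():
    # main is suspended until double finishes, then gets its return value
    result = await double(21)
    print('result = {}'.format(result))
    return result

value = asyncio.run(main())   # prints: result = 42
```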
A Future can be thought of as something like a semaphore. We create a global Future; when one coroutine finishes its work, it stores its result in the Future, and other coroutines can block on it with await future. An example makes this clear:
```python
import asyncio

future = asyncio.Future()

async def test(k):
    n = 0
    while n < k:
        await asyncio.sleep(0.5)
        print('n = {}'.format(n))
        n += 1
    future.set_result('success')

async def log():
    result = await future
    print(result)

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait([log(), test(5)]))
loop.close()
```
In this example we create two coroutines. The first prints a number every 0.5 seconds and, when finished, writes 'success' into the future. The second waits on the future's result and then prints it.
In the loop, we now schedule two coroutine objects rather than one, so we wrap them with asyncio.wait. The wait finishes only when both coroutines inside it have finished: the loop waits for the wait, and the wait waits for the coroutines passed in, forming a chain of dependencies, so the loop ends once both coroutine objects have ended.
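One caveat for newer Python versions: since 3.8, passing bare coroutine objects to asyncio.wait is deprecated (they are expected to be wrapped in tasks first), and asyncio.gather is the usual replacement; it also returns results in argument order. A small sketch (work and main are illustrative names):

```python
import asyncio

async def work(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    # gather runs both coroutines concurrently and returns their
    # results in argument order, regardless of which finishes first
    return await asyncio.gather(work('a', 0.02), work('b', 0.01))

results = asyncio.run(main())
print(results)   # ['a', 'b']
```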
This concludes our study of how to understand coroutines in Python. I hope it has resolved your doubts; pairing theory with practice is the best way to learn, so go and try it! To keep learning more on related topics, please continue to follow this site.