2025-01-18 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
Today, I will talk to you about how to use the queue library in Python. Many people may not know much about it, so to help you understand better, the editor has summarized the following. I hope you can get something out of this article.
The queue module provides thread-safe, first-in-first-out data structures suitable for multithreaded programming. It can be used to safely pass messages or data between producer and consumer threads; all of the required locking is handled inside the queue itself, so many threads can share the same queue object safely and conveniently.
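The thread-safe hand-off described above can be illustrated with a minimal producer/consumer sketch. This is an illustrative example, not part of the article's code: the `None` sentinel used to stop the consumer and the doubling of each item are assumptions made here.

```python
import queue
import threading

q = queue.Queue()
results = []

def consumer():
    while True:
        item = q.get()        # blocks until the producer puts something
        if item is None:      # sentinel value signals shutdown
            q.task_done()
            break
        results.append(item * 2)
        q.task_done()

t = threading.Thread(target=consumer)
t.start()

for i in range(3):            # the main thread acts as the producer
    q.put(i)
q.put(None)                   # tell the consumer to stop

q.join()                      # wait until every item has been processed
t.join()
print(results)                # [0, 2, 4]
```

Because the queue does the locking internally, neither thread needs any explicit synchronization of its own.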
FIFO queue
The Queue class implements a basic FIFO queue: the put method adds an element at one end, and the get method removes it from the other.
import queue

def queue_fifo():
    q = queue.Queue()
    for i in range(5):
        q.put(i)
    while not q.empty():
        print(q.get(), end=' ')
    print()

LIFO stack
Unlike the standard FIFO queue, LifoQueue implements last-in, first-out ordering, more commonly known as a stack.
def queue_lifo():
    q = queue.LifoQueue()
    for i in range(5):
        q.put(i)
    while not q.empty():
        print(q.get(), end=' ')
    print()
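To make the difference between the two orderings concrete, the two functions above can be compared side by side. The `drain` helper below is a name introduced here for illustration:

```python
import queue

def drain(q):
    """Remove and return every item currently in the queue."""
    items = []
    while not q.empty():
        items.append(q.get())
    return items

fifo, lifo = queue.Queue(), queue.LifoQueue()
for i in range(5):
    fifo.put(i)
    lifo.put(i)

fifo_order = drain(fifo)
lifo_order = drain(lifo)
print(fifo_order)   # [0, 1, 2, 3, 4]
print(lifo_order)   # [4, 3, 2, 1, 0]
```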
Priority queue
Sometimes, the order in which elements are processed needs to be based on properties of those elements, not just the order in which they were added to the queue. For example, a print job from the finance department may take precedence over a developer's code listing. PriorityQueue uses the sort order of the queue's contents to decide which element to retrieve next.
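A lightweight idiom worth knowing first: since Python tuples compare element by element, you can put `(priority, item)` tuples into a PriorityQueue directly. The job names below are illustrative. Note that when two priorities tie, the comparison falls through to the items themselves, which is one reason the fuller example that follows defines its own class with `__eq__` and `__lt__`.

```python
import queue

q = queue.PriorityQueue()
q.put((10, 'low-priority code listing'))
q.put((1, 'finance report'))
q.put((5, 'routine batch'))

order = []
while not q.empty():
    priority, task = q.get()   # lowest priority value comes out first
    order.append(task)
print(order)
# ['finance report', 'routine batch', 'low-priority code listing']
```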
class Job:
    def __init__(self, priority, description):
        self.priority = priority
        self.description = description
        print(description)

    def __eq__(self, other):
        return self.priority == other.priority

    def __lt__(self, other):
        return self.priority < other.priority
def priority_queue():
    import threading
    print('initial')
    q = queue.PriorityQueue()
    q.put(Job(5, 'Mid Job'))
    q.put(Job(10, 'Low Job'))
    q.put(Job(1, 'Imp Job'))

    def process_job(q):
        while True:
            next_job = q.get()
            print(next_job.description)
            q.task_done()

    workers = [
        threading.Thread(target=process_job, args=(q,)),
        threading.Thread(target=process_job, args=(q,)),
    ]
    print('get')
    for w in workers:
        w.daemon = True  # setDaemon() is deprecated; assign the attribute instead
        w.start()
    q.join()

Queue and multithreading
This section walks through the source code of a podcast client to demonstrate using Queue together with multithreading. The program reads one or more RSS feeds, puts the five latest enclosures from each feed into a Queue to wait for download, and processes the downloads in parallel with multiple threads. The skeleton implementation demonstrates the use of the queue module.
def podcast_client():
    # 0. Initialization
    import threading
    num_fetch_threads = 2
    enclosure_queue = queue.Queue()
    feed_urls = [
        'http://talkpython.fm/episodes/rss',
    ]

    # 1. Helper function to print messages
    def message(s):
        print('{}: {}'.format(threading.current_thread().name, s))

    # 2. Target function for the worker threads
    def download_enclosures(q):
        import urllib.request
        while True:
            message('looking for the next enclosure')
            url = q.get()
            filename = url.rpartition('/')[-1]
            message('downloading {}'.format(filename))
            response = urllib.request.urlopen(url)
            data = response.read()
            message('writing to {}'.format(filename))
            with open(filename, 'wb') as outfile:
                outfile.write(data)
            q.task_done()

    # 3. Start the worker threads
    for i in range(num_fetch_threads):
        worker = threading.Thread(
            target=download_enclosures,
            args=(enclosure_queue,),
            name='worker-{}'.format(i),
        )
        worker.daemon = True
        worker.start()

    # 4. Add the URLs to the queue
    import feedparser
    from urllib.parse import urlparse
    for url in feed_urls:
        response = feedparser.parse(url, agent='queue_module.py')
        for entry in response['entries'][:5]:
            for enclosure in entry.get('enclosures', []):
                parsed_url = urlparse(enclosure['url'])
                message('queuing {}'.format(
                    parsed_url.path.rpartition('/')[-1]))
                enclosure_queue.put(enclosure['url'])

    # 5. Main thread waits for all downloads to finish
    message('* main thread waiting')
    enclosure_queue.join()
    message('* done')
First, determine the operating parameters, which would normally come from user input. The example hard-codes the number of threads and the list of feed URLs to fetch, and defines a helper function, message, to print output.
The download_enclosures function runs in the worker threads and uses urllib to handle the download. Once the thread's target function is defined, the workers can be started: inside download_enclosures, the statement url = q.get() blocks until the queue returns an item, which means it is safe to start the threads before there is anything in the queue.
The next step is to use the feedparser module (which must be installed separately) to retrieve the feed contents and insert the enclosure URLs into the queue. As soon as a URL is added, one of the worker threads picks it up and starts downloading it; the loop keeps adding entries until the feed is exhausted, and the worker threads take turns pulling URLs off the queue to download.
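The blocking get / task_done / join pattern used by the client can be sketched without the RSS machinery. The function name `run_pool`, the worker count, and the uppercasing "processing" step below are illustrative assumptions, not part of the article's program:

```python
import queue
import threading

def run_pool(items, num_workers=2):
    """Worker-pool sketch: workers pull items off a shared queue."""
    q = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            item = q.get()              # blocks until an item is available
            with lock:                  # results list is shared, so guard it
                results.append(item.upper())
            q.task_done()               # one task_done() per get()

    for _ in range(num_workers):
        t = threading.Thread(target=worker, daemon=True)
        t.start()                       # safe to start before items exist

    for item in items:
        q.put(item)
    q.join()                            # wait until every item is processed
    return results

print(sorted(run_pool(['a', 'b', 'c'])))  # ['A', 'B', 'C']
```

Because the workers are daemon threads stuck in an infinite loop, the program relies on q.join() to know the work is finished, exactly as the podcast client does.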
After reading the above, do you have a better understanding of how to use the queue library in Python? If you want to learn more, please follow the industry information channel. Thank you for your support.