
How to understand Python Asynchronous IO

2025-04-06 Update From: SLTechnology News&Howtos


This article explains Python asynchronous IO in simple, clear terms through a worked example: a long-connection benchmark tool built on asyncio.

The Python 3.4 standard library gained a new module, asyncio, to support asynchronous IO. Its API status is provisional, which means backward compatibility is not guaranteed and the module could even be removed from the standard library (very unlikely). If you follow the PEPs and python-dev, you will know this module was brewing for a long time, and its API and implementation may still be adjusted, but there is no doubt that asyncio is practical and powerful, and worth learning and digging into.
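For orientation before the full example: the article's code uses the original generator-based coroutine style of Python 3.4, but the core idea (suspend at IO points and let the event loop run other coroutines in the meantime) is easier to see in today's async def/await syntax, available since Python 3.5. A minimal sketch:

```python
import asyncio

async def fetch(delay, value):
    # Simulates an IO-bound operation; await yields control to the event loop
    await asyncio.sleep(delay)
    return value

async def main():
    # Run two "requests" concurrently: total time is roughly the
    # longest delay, not the sum of both
    return await asyncio.gather(fetch(0.1, 'a'), fetch(0.2, 'b'))

print(asyncio.run(main()))  # → ['a', 'b']
```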

Example

asyncio mainly targets TCP/UDP socket communication: it manages a large number of connections without creating a large number of threads, which improves system efficiency. Here, an example from the official documentation is slightly modified into an HTTP long-connection benchmark tool for diagnosing a web server's long-connection handling capacity.

Function overview:

Create 10 connections every 10 ms up to the target number of connections (say 10k); each connection periodically sends HEAD requests to the server to maintain HTTP keepalive.

The code is as follows:

import argparse
import asyncio
import functools
import logging
import random
import urllib.parse

loop = asyncio.get_event_loop()

@asyncio.coroutine
def print_http_headers(no, url, keepalive):
    url = urllib.parse.urlsplit(url)
    wait_for = functools.partial(asyncio.wait_for, timeout=3, loop=loop)
    query = ('HEAD {url.path} HTTP/1.1\r\n'
             'Host: {url.hostname}\r\n'
             '\r\n').format(url=url).encode('utf-8')
    rd, wr = yield from wait_for(asyncio.open_connection(url.hostname, 80))
    while True:
        wr.write(query)
        while True:
            line = yield from wait_for(rd.readline())
            if not line:  # end of connection
                wr.close()
                return no
            line = line.decode('utf-8').rstrip()
            if not line:  # end of header
                break
            logging.debug('(%d) HTTP header> %s' % (no, line))
        yield from asyncio.sleep(random.randint(1, keepalive // 2))

@asyncio.coroutine
def do_requests(args):
    conn_pool = set()
    waiter = asyncio.Future()

    def _on_complete(fut):
        conn_pool.remove(fut)
        exc = fut.exception()  # check before result(): result() re-raises any exception
        if exc is not None:
            logging.info('conn#{} exception'.format(exc))
        else:
            logging.info('conn#{} result'.format(fut.result()))
        if not conn_pool:
            waiter.set_result('event loop is done')

    for i in range(args.connections):
        fut = asyncio.async(print_http_headers(i, args.url, args.keepalive))
        fut.add_done_callback(_on_complete)
        conn_pool.add(fut)
        if i % 10 == 0:
            yield from asyncio.sleep(0.01)
    logging.info((yield from waiter))

def main():
    parser = argparse.ArgumentParser(description='asyncli')
    parser.add_argument('url', help='page address')
    parser.add_argument('-c', '--connections', type=int, default=1,
                        help='number of connections simultaneously')
    parser.add_argument('-k', '--keepalive', type=int, default=60,
                        help='HTTP keepalive timeout')
    args = parser.parse_args()
    logging.basicConfig(level=logging.INFO, format='%(asctime)s %(message)s')
    loop.run_until_complete(do_requests(args))
    loop.close()

if __name__ == '__main__':
    main()
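Note the example uses the Python 3.4-era generator-based style (@asyncio.coroutine, yield from, asyncio.async), which was deprecated and then removed in Python 3.11. On current Python, the same connect-and-read-headers loop is written with async def/await on the streams API. A self-contained sketch (it starts a throwaway local server so it can run anywhere; the names handle and head_request are illustrative, not from the original tool):

```python
import asyncio

async def handle(reader, writer):
    # Minimal stand-in for nginx: read the request head, answer with an empty 200
    while True:
        line = await reader.readline()
        if not line or line == b'\r\n':
            break
    writer.write(b'HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n')
    await writer.drain()
    writer.close()

async def head_request(host, port):
    # Modern equivalent of one iteration of print_http_headers' inner loop
    reader, writer = await asyncio.wait_for(
        asyncio.open_connection(host, port), timeout=3)
    writer.write(('HEAD / HTTP/1.1\r\nHost: %s\r\n\r\n' % host).encode())
    await writer.drain()
    headers = []
    while True:
        line = await reader.readline()
        if not line or line == b'\r\n':  # connection closed or end of header
            break
        headers.append(line.decode().rstrip())
    writer.close()
    return headers

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 0)
    port = server.sockets[0].getsockname()[1]  # OS-assigned free port
    headers = await head_request('127.0.0.1', port)
    server.close()
    await server.wait_closed()
    return headers

print(asyncio.run(main()))  # → ['HTTP/1.1 200 OK', 'Content-Length: 0']
```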

Test and analysis

Hardware: CPU 2.3GHz / 2 cores, RAM 2GB

Software: CentOS 6.5(kernel 2.6.32), Python 3.3 (pip install asyncio), nginx 1.4.7

Parameter settings: ulimit -n 10240; nginx worker_connections raised to 10240

Start the web server with just one worker process:

# ../sbin/nginx
# ps ax | grep nginx
2007 ?  Ss  0:00 nginx: master process ../sbin/nginx
2008 ?  S   0:00 nginx: worker process

Start the benchmark tool, pointing the URL at nginx's default test page, and initiate 10k connections:

$ python asyncli.py http://10.211.55.8/ -c 10000

From the nginx access log, compute the average requests per second:

# tail -1000000 access.log | awk '{ print $4 }' | sort | uniq -c | \
    awk '{ cnt+=1; sum+=$1 } END { printf "avg = %d\n", sum/cnt }'
avg = 548
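The pipeline counts requests per distinct timestamp ($4, the second-resolution time field of the combined log format) and averages the counts. The same statistic in Python, on hypothetical sample lines (the IPs and timestamps below are made up for illustration):

```python
from collections import Counter

def avg_requests_per_second(lines):
    # Field 4 of the combined log format is the timestamp,
    # e.g. [02/Jun/2014:10:00:01 — one-second resolution
    per_second = Counter(line.split()[3] for line in lines)
    return sum(per_second.values()) // len(per_second)

log = [
    '10.0.0.1 - - [02/Jun/2014:10:00:01 +0800] "HEAD / HTTP/1.1" 200 0',
    '10.0.0.2 - - [02/Jun/2014:10:00:01 +0800] "HEAD / HTTP/1.1" 200 0',
    '10.0.0.1 - - [02/Jun/2014:10:00:02 +0800] "HEAD / HTTP/1.1" 200 0',
]
print(avg_requests_per_second(log))  # → 1 (3 requests over 2 distinct seconds, integer average)
```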

Top partial output:

VIRT   RES   SHR  S %CPU %MEM  TIME+   COMMAND
657m   115m  3860 R 60.2  6.2  4:30.02 python
54208  10m   848  R  7.0  0.6  0:30.79 nginx

Thank you for reading. I hope this article has given you a deeper understanding of Python asynchronous IO; how it behaves in your own setting still needs to be verified in practice.
