asyncio asynchronous programming [with video tutorials]

You may have noticed that more and more people around you are talking about asynchronous frameworks, such as FastAPI, Tornado, Sanic, Django 3, aiohttp, and so on.

You hear how fast asynchronous code is, how its performance is off the charts... but what is it, really?

This article chats with you about asyncio and asynchronous programming!

 

Video tutorials: https://study.163.com/instructor/3525856.htm

Companion wiki: https://pythonav.com/wiki/

 

1. Coroutines

If you want to learn asyncio, you first have to understand coroutines. They are fundamental!

A coroutine, also known as a micro-thread, is a user-space context-switching technique. In short, it lets a single thread switch execution between blocks of code. For example:

def func1():
    print(1)
    ...
    print(2)
def func2():
    print(3)
    ...
    print(4)
func1()
func2()

The above code defines and runs two ordinary functions; the code in each runs in order, and the output is: 1, 2, 3, 4. With coroutine techniques, however, you can switch between the functions mid-execution, and the final output becomes: 1, 3, 2, 4.
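The "1, 3, 2, 4" interleaving can be sketched with plain generators, using one thread and no extra libraries (an illustrative stand-in for coroutine switching, not asyncio itself):

```python
# Each yield is a point where a function gives up control;
# the zip loop below decides who runs next -- all in one thread.
def func1():
    yield 1
    yield 2

def func2():
    yield 3
    yield 4

for a, b in zip(func1(), func2()):
    print(a)
    print(b)
# prints 1, 3, 2, 4
```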

There are many ways to implement coroutines in Python, for example:

  • greenlet, a third-party module used to write coroutine code (gevent's coroutines are built on greenlet)
  • yield, the generator keyword; generator features can also be used to write coroutine code.
  • asyncio, a module introduced in Python 3.4 for writing coroutine code.
  • async & await, two keywords introduced in Python 3.5 that, combined with the asyncio module, make coroutine code even easier to write.

1.1 greenlet

greenlet is a third-party module; install it in advance with pip3 install greenlet before use.

from greenlet import greenlet
def func1():
    print(1)        # Step 2: Output 1
    gr2.switch()    # Step 3: Switch to the func2 function
    print(2)        # Step 6: Output 2
    gr2.switch()    # Step 7: Switch to the func2 function and continue executing backwards from where you last executed
def func2():
    print(3)        # Step 4: Output 3
    gr1.switch()    # Step 5: Switch to the func1 function and continue executing backwards from where you last executed
    print(4)        # Step 8: Output 4
gr1 = greenlet(func1)
gr2 = greenlet(func2)
gr1.switch() # Step 1: Execute the func1 function

Note: you can also pass arguments to switch, to exchange values between coroutines when switching.

1.2 yield

Coroutine code here is implemented with the Python generator keywords yield and yield from.

def func1():
    yield 1
    yield from func2()
    yield 2
def func2():
    yield 3
    yield 4
f1 = func1()
for item in f1:
    print(item)

Note: The yield from keyword was introduced in Python 3.3.

1.3 asyncio

Before Python 3.4 there was no official coroutine library; coroutines were generally implemented with greenlet and similar modules. Python 3.4 brought official coroutine support in the form of the asyncio module.

import asyncio
@asyncio.coroutine
def func1():
    print(1)
    yield from asyncio.sleep(2)  # Automatically switch to other tasks in tasks when IO time-consuming operations are encountered
    print(2)
@asyncio.coroutine
def func2():
    print(3)
    yield from asyncio.sleep(2) # Automatically switch to other tasks in tasks when IO time-consuming operations are encountered
    print(4)
tasks = [
    asyncio.ensure_future( func1() ),
    asyncio.ensure_future( func2() )
]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))

Note: implementing coroutines with the asyncio module is more powerful than the earlier approaches, because it also integrates automatic task switching on time-consuming IO operations.

1.4 async & await

The async & await keywords were officially introduced in Python 3.5. Code based on them is really just a cleaner version of the previous example, making coroutine code simpler.

Starting with Python 3.8 the @asyncio.coroutine decorator is deprecated (it was removed in Python 3.11); the async & await keywords are the recommended way to write coroutine code.

import asyncio
async def func1():
    print(1)
    await asyncio.sleep(2)
    print(2)
async def func2():
    print(3)
    await asyncio.sleep(2)
    print(4)
tasks = [
    asyncio.ensure_future(func1()),
    asyncio.ensure_future(func2())
]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))

1.5 Summary

There are many ways to implement coroutines. The current mainstream is the officially recommended combination of the asyncio module and the async & await keywords, as supported in tornado, sanic, fastapi, Django 3, and others.

Next, we will look at the asyncio module and the async & await keywords in more detail.

 

2. The Significance of Coroutines

From the above, we know that a coroutine lets a single thread switch back and forth between multiple contexts.

But what is the point of switching back and forth between executions? (Plenty of articles online hype coroutines. Where do coroutines actually shine?)

  1. For compute-bound operations, switching back and forth with coroutines is pointless; the switching and state-saving overhead actually degrades performance.
  2. For IO-bound operations, a coroutine can switch to other tasks during the IO wait and automatically resume when the IO finishes. This saves resources and improves performance, giving you asynchronous programming (other code runs instead of blocking until a task ends).

2.1 Crawler Example

For example, use code to download pictures from url_list.

  • Mode 1: Synchronous programming implementation
    """
    //Download pictures using third-party module requests, please install in advance: pip3 install requests
    """
    import requests
    def download_image(url):
        print("Start downloading:",url)
        # Send network requests, download pictures
        response = requests.get(url)
        print("Download complete")
        # Save Picture to Local File
        file_name = url.rsplit('_')[-1]
        with open(file_name, mode='wb') as file_object:
            file_object.write(response.content)
    if __name__ == '__main__':
        url_list = [
            'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
            'https://www2.autoimg.cn/newsdfs/g30/M01/3C/E2/120x90_0_autohomecar__ChcCSV2BBICAUntfAADjJFd6800429.jpg',
            'https://www3.autoimg.cn/newsdfs/g26/M0B/3C/65/120x90_0_autohomecar__ChcCP12BFCmAIO83AAGq7vK0sGY193.jpg'
        ]
        for item in url_list:
            download_image(item)
  • Mode 2: Asynchronous programming implementation based on Protocol
    """
    //Download pictures using third-party module aiohttp, please install in advance: pip3 install aiohttp
    """
    #!/usr/bin/env python
    # -*- coding:utf-8 -*-
    import aiohttp
    import asyncio
    async def fetch(session, url):
        print("Send request:", url)
        async with session.get(url, verify_ssl=False) as response:
            content = await response.content.read()
            file_name = url.rsplit('_')[-1]
            with open(file_name, mode='wb') as file_object:
                file_object.write(content)
    async def main():
        async with aiohttp.ClientSession() as session:
            url_list = [
                'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
                'https://www2.autoimg.cn/newsdfs/g30/M01/3C/E2/120x90_0_autohomecar__ChcCSV2BBICAUntfAADjJFd6800429.jpg',
                'https://www3.autoimg.cn/newsdfs/g26/M0B/3C/65/120x90_0_autohomecar__ChcCP12BFCmAIO83AAGq7vK0sGY193.jpg'
            ]
            tasks = [asyncio.create_task(fetch(session, url)) for url in url_list]
            await asyncio.wait(tasks)
    if __name__ == '__main__':
        asyncio.run(main())

Comparing the two implementations, you will find that coroutine-based asynchronous programming is far more efficient than synchronous programming, because:

  • Synchronous programming executes downloads one by one; if each picture takes 2 minutes to download, completing all of them takes 6 minutes.
  • Asynchronous programming issues all three download requests almost simultaneously (on an IO request it automatically switches to send the next task's request); if each picture takes 2 minutes to download, everything finishes in roughly 2 minutes.

2.2 Summary

Coroutines are generally used in programs with IO operations, because they use IO wait time to run other code, improving execution efficiency.

Life works the same way. Suppose you run a shop with such a machine: after an employee presses the device's Start button, he has to wait 30 minutes in front of it before pressing the End button. As the boss, you would surely want the employee to do something else during those 30 minutes of waiting.

3. Asynchronous programming

Programs based on the async & await keywords achieve asynchronous programming; this is currently the mainstream technology for Python async.

To really understand Python's built-in asynchronous programming, work through the sections below in order.

3.1 Event Loop

An event loop can be thought of as a while loop that runs forever, checking on and executing tasks, and terminating under specific conditions.

# Pseudocode
task_list = [Task 1, Task 2, Task 3, ...]
while True:
    ready_list, done_list = check every task in task_list and return the 'ready' and 'done' tasks
    for task in ready_list:
        run the task
    for task in done_list:
        remove the task from task_list
    terminate the loop once every task in task_list is done
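The pseudocode can be made runnable as a toy loop over generator-based tasks. This is only a sketch with made-up names (toy_event_loop, make_task); asyncio's real loop additionally manages timers, IO selectors, and callbacks:

```python
# A toy "event loop": generators stand in for tasks, and each yield
# is a pause point where the task pretends to wait on IO.
def toy_event_loop(task_list):
    tasks = list(task_list)
    while tasks:                        # terminate once every task is done
        for task in list(tasks):
            try:
                next(task)              # run the task to its next pause point
            except StopIteration:
                tasks.remove(task)      # drop completed tasks from the list

def make_task(name, steps):
    for i in range(steps):
        print(name, i)
        yield                           # "pause": pretend to wait on IO

toy_event_loop([make_task("task1", 2), make_task("task2", 2)])
# prints: task1 0 / task2 0 / task1 1 / task2 1
```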

When writing a program, you can get or create an event loop with the following code.

import asyncio
loop = asyncio.get_event_loop()

3.2 Coroutines and Asynchronous Programming

Coroutine function: a function defined with async def.

Coroutine object: the object returned by calling a coroutine function.

# Define a coroutine function
async def func():
    pass
# Calling the coroutine function returns a coroutine object
result = func()

Note: calling a coroutine function does not execute the function body; it only returns a coroutine object.
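A quick way to see this for yourself (the function name demo is just for illustration):

```python
import asyncio

async def demo():
    print("running")
    return 42

obj = demo()                 # nothing printed yet: the body has not run
print(type(obj).__name__)    # coroutine
print(asyncio.run(obj))      # "running" is printed, then 42
```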

3.2.1 Basic Applications

In a program, if you want to execute the body of a coroutine function, you need an event loop together with the coroutine object, for example:

import asyncio
async def func():
    print("Coroutine body")
# Call the coroutine function to get a coroutine object.
result = func()
# Method 1
# loop = asyncio.get_event_loop() # Create an event loop
# loop.run_until_complete(result) # Submit the coroutine as a task to the event loop's task list; the loop terminates when the coroutine finishes.
# Method 2
# Essentially the same as Method 1: internally it creates an event loop and calls run_until_complete. Just a simpler way to write it.
# The asyncio.run function was added to the asyncio module in Python 3.7.
asyncio.run(result)

This process can be understood simply as: the coroutine is added as a task to the event loop's task list, then the loop checks whether each task in the list is ready (tasks are considered ready by default) and, when ready, executes its code.

3.2.2 await

await is a keyword that can only be used inside coroutine functions. When an IO operation is encountered, it suspends the current coroutine (task); while it is suspended, the event loop can run other coroutines (tasks). Once the current coroutine's IO completes, execution switches back and continues with the code after await. The code is as follows:

Example 1:

import asyncio
async def func():
    print("Executing the coroutine function body")
    # An IO operation: the current coroutine (task) is suspended until the IO completes.
    # While it is suspended, the event loop can run other coroutines (tasks).
    response = await asyncio.sleep(2)
    print("IO request finished, result:", response)
result = func()
asyncio.run(result)

Example 2:

import asyncio
async def others():
    print("start")
    await asyncio.sleep(2)
    print('end')
    return 'Return value'
async def func():
    print("Executing the coroutine function body")
    # An IO operation suspends the current coroutine (task) until the IO completes. While it is suspended, the event loop can run other coroutines (tasks).
    response = await others()
    print("IO request finished, result:", response)
asyncio.run( func() )

Example 3:

import asyncio
async def others():
    print("start")
    await asyncio.sleep(2)
    print('end')
    return 'Return value'
async def func():
    print("Executing the coroutine function body")
    # An IO operation suspends the current coroutine (task) until the IO completes. While it is suspended, the event loop can run other coroutines (tasks).
    response1 = await others()
    print("IO request finished, result:", response1)
    response2 = await others()
    print("IO request finished, result:", response2)
asyncio.run( func() )

All the examples above create only one task; that is, the event loop's task list holds a single task, so switching to another task during the IO wait cannot be demonstrated.

To create multiple tasks in a program, you need the Task object.

3.2.3 Task object

Tasks are used to schedule coroutines concurrently.

When a coroutine is wrapped into a Task with functions like asyncio.create_task() the coroutine is automatically scheduled to run soon.

Task objects are used to schedule coroutines concurrently. Creating a Task with asyncio.create_task(coroutine object) lets the coroutine join the event loop and wait to be scheduled. Besides asyncio.create_task(), you can also use the lower-level loop.create_task() or asyncio.ensure_future() functions. Manually instantiating Task objects is not recommended.

Essentially, this wraps a coroutine object in a Task object, immediately adds the coroutine to the event loop, and tracks the coroutine's state.

Note: the asyncio.create_task() function was added in Python 3.7. Before Python 3.7, you can use the lower-level asyncio.ensure_future() function instead.

Example 1:

import asyncio
async def func():
    print(1)
    await asyncio.sleep(2)
    print(2)
    return "Return value"
async def main():
    print("main start")
    # Create a coroutine, wrap it in a Task object, and immediately add it to the event loop's task list, waiting to be executed (ready by default).
    task1 = asyncio.create_task(func())
    # Create a coroutine, wrap it in a Task object, and immediately add it to the event loop's task list, waiting to be executed (ready by default).
    task2 = asyncio.create_task(func())
    print("main End")
    # When a coroutine hits an IO operation, execution automatically switches to other tasks.
    # Here await waits for each coroutine to finish and collects its result
    ret1 = await task1
    ret2 = await task2
    print(ret1, ret2)
asyncio.run(main())

Example 2:

import asyncio
async def func():
    print(1)
    await asyncio.sleep(2)
    print(2)
    return "Return value"
async def main():
    print("main start")
    # Create coroutines, wrap them in Task objects, add them to the event loop's task list, and wait for execution (ready by default).
    task_list = [
        asyncio.create_task(func(), name="n1"),
        asyncio.create_task(func(), name="n2")
    ]
    print("main End")
    # When a coroutine hits an IO operation, execution automatically switches to other tasks.
    # Here await waits for all the coroutines to finish, collecting the finished tasks into done.
    # If a timeout is set, it is the maximum number of seconds to wait: finished tasks go into done, unfinished ones into pending.
    done, pending = await asyncio.wait(task_list, timeout=None)
    print(done, pending)
asyncio.run(main())

Note: asyncio.wait internally calls ensure_future on each coroutine in the list to wrap it in a Task object, so task_list = [func(), func()] also works with wait.

Example 3:

import asyncio
async def func():
    print("Executing the coroutine function body")
    # An IO operation suspends the current coroutine (task) until the IO completes. While it is suspended, the event loop can run other coroutines (tasks).
    response = await asyncio.sleep(2)
    print("IO request finished, result:", response)
coroutine_list = [func(), func()]
# Wrong: coroutine_list = [asyncio.create_task(func()), asyncio.create_task(func())]
# asyncio.create_task cannot be used directly here, because a Task is immediately added to the event loop's task list,
# but at this point the event loop has not been created yet, so an error would be raised.
# asyncio.wait wraps the list as one coroutine, and calling asyncio.run on it executes both coroutines.
# Inside asyncio.wait, ensure_future is called on each coroutine in the list, wrapping it in a Task object.
done, pending = asyncio.run( asyncio.wait(coroutine_list) )
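One caveat to the note above: in newer Python, passing bare coroutine objects to asyncio.wait is deprecated (since 3.8) and rejected from 3.11 on. A sketch of the modern alternative, asyncio.gather, which accepts coroutine objects directly and returns results in argument order (names here are illustrative):

```python
import asyncio

async def work(n):
    await asyncio.sleep(0.1)
    return n * 10

async def main():
    # gather wraps each coroutine in a Task itself and preserves order
    results = await asyncio.gather(work(1), work(2), work(3))
    print(results)  # [10, 20, 30]

asyncio.run(main())
```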

3.2.4 asyncio.Future object

A Future is a special low-level awaitable object that represents an eventual result of an asynchronous operation.

The Future object in asyncio is a relatively low-level object. We don't normally use it directly; instead we use Task objects to run work and track state. (Task is a subclass of Future.)

Future gives asynchronous programming its handling of eventual results (the Task class also has this state-handling ability).

Example 1:

import asyncio
async def main():
    # Get the current event loop
    loop = asyncio.get_running_loop()
    # Create a task (a Future object) that does nothing.
    fut = loop.create_future()
    # Wait for the task's (the Future object's) final result; since no result is ever set, this waits forever.
    await fut
asyncio.run(main())

Example 2:

import asyncio
async def set_after(fut):
    await asyncio.sleep(2)
    fut.set_result("666")
async def main():
    # Get the current event loop
    loop = asyncio.get_running_loop()
    # Create a task (a Future object) with no behavior bound; this task never knows when it will end.
    fut = loop.create_future()
    # Create a task (a Task object) bound to the set_after function, which sets fut's result after 2 seconds.
    # That is, the Future's final result is set manually, after which fut can finish.
    await loop.create_task(set_after(fut))
    # Wait for the Future object's final result; otherwise wait forever
    data = await fut
    print(data)
asyncio.run(main())

A Future object has no behavior bound to it, so if you want the event loop to obtain the Future's result, you must set it manually. The Task object inherits from Future and effectively extends it by calling set_result automatically once its bound coroutine finishes.

So although Task objects are usually what we use, the handling of results is essentially based on Future objects.

Extension: any object whose class supports the await syntax can be an awaitable object, so coroutine objects, Task objects, and Future objects are all awaitable.
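To illustrate that last point, here is a minimal custom awaitable (the class name WaitValue is made up): any object whose class defines __await__ returning an iterator can be used with await:

```python
import asyncio

class WaitValue:
    """A minimal custom awaitable."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        # Delegate the actual waiting to a real coroutine's iterator.
        yield from asyncio.sleep(0.1).__await__()
        return self.value

async def main():
    print(await WaitValue(123))  # 123

asyncio.run(main())
```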

3.2.5 futures.Future object

Python's concurrent.futures module also has a Future object; it is used to implement asynchronous operations based on thread pools and process pools.

import time
from concurrent.futures import Future
from concurrent.futures.thread import ThreadPoolExecutor
from concurrent.futures.process import ProcessPoolExecutor
def func(value):
    time.sleep(1)
    print(value)
pool = ThreadPoolExecutor(max_workers=5)
# Or pool = ProcessPoolExecutor(max_workers=5)
for i in range(10):
    fut = pool.submit(func, i)
    print(fut)

The two Future classes are different; they were designed for different scenarios. For example, concurrent.futures.Future does not support the await syntax.

Python provides the function asyncio.wrap_future, which wraps a concurrent.futures.Future object into an asyncio.Future object.

The next question is: why does Python provide this functionality?

In program development we generally either use asyncio coroutines for asynchronous work or use process pools and thread pools. But when a project needs to mix coroutine-based async with process-pool or thread-pool async, this functionality comes into play.

import time
import asyncio
import concurrent.futures
def func1():
    # A time-consuming operation
    time.sleep(2)
    return "SB"
async def main():
    loop = asyncio.get_running_loop()
    # 1. Run in the default loop's executor (default ThreadPoolExecutor)
    # Step 1: ThreadPoolExecutor's submit method is called internally to have a thread-pool thread run func1, returning a concurrent.futures.Future object
    # Step 2: asyncio.wrap_future is called to wrap the concurrent.futures.Future object as an asyncio.Future object,
    # because concurrent.futures.Future does not support await syntax and must be wrapped as an asyncio.Future before use.
    fut = loop.run_in_executor(None, func1)
    result = await fut
    print('default thread pool', result)
    # 2. Run in a custom thread pool:
    # with concurrent.futures.ThreadPoolExecutor() as pool:
    #     result = await loop.run_in_executor(
    #         pool, func1)
    #     print('custom thread pool', result)
    # 3. Run in a custom process pool:
    # with concurrent.futures.ProcessPoolExecutor() as pool:
    #     result = await loop.run_in_executor(
    #         pool, func1)
    #     print('custom process pool', result)
asyncio.run(main())
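The example above goes through run_in_executor, which performs both steps for you. Calling asyncio.wrap_future directly looks like this (a minimal sketch; blocking_io is a made-up stand-in for any blocking call):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_io():
    time.sleep(0.1)                      # a blocking call that cannot be awaited
    return "done"

async def main():
    with ThreadPoolExecutor(max_workers=1) as pool:
        cf = pool.submit(blocking_io)    # a concurrent.futures.Future
        af = asyncio.wrap_future(cf)     # now an awaitable asyncio.Future
        result = await af                # the event loop stays free meanwhile
    print(result)  # done

asyncio.run(main())
```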

Scenario: this feature is needed when a project uses coroutine-based asynchronous programming but wants a third-party module that does not support it, for example:

import asyncio
import requests
async def download_image(url):
    # Send network requests, download pictures (encounter IO requests to download pictures on the network, switch to other tasks automatically)
    print("Start downloading:", url)
    loop = asyncio.get_event_loop()
    # The requests module does not support asynchronous operations by default, so it is implemented using a thread pool.
    future = loop.run_in_executor(None, requests.get, url)
    response = await future
    print('Download complete')
    # Save Picture to Local File
    file_name = url.rsplit('_')[-1]
    with open(file_name, mode='wb') as file_object:
        file_object.write(response.content)
if __name__ == '__main__':
    url_list = [
        'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
        'https://www2.autoimg.cn/newsdfs/g30/M01/3C/E2/120x90_0_autohomecar__ChcCSV2BBICAUntfAADjJFd6800429.jpg',
        'https://www3.autoimg.cn/newsdfs/g26/M0B/3C/65/120x90_0_autohomecar__ChcCP12BFCmAIO83AAGq7vK0sGY193.jpg'
    ]
    tasks = [download_image(url) for url in url_list]
    loop = asyncio.get_event_loop()
    loop.run_until_complete( asyncio.wait(tasks) )

3.2.6 Asynchronous Iterator

What is an asynchronous iterator?

An object that implements the __aiter__() and __anext__() methods. __anext__() must return an awaitable object. async for consumes the awaitables returned by the asynchronous iterator's __anext__() method until it raises a StopAsyncIteration exception. Introduced by PEP 492.

What is an asynchronous iterable?

An object that can be used in an async for statement. It must return an asynchronous iterator from its __aiter__() method. Introduced by PEP 492.

import asyncio
class Reader(object):
    """ Custom asynchronous iterators (also asynchronous Iterable objects) """
    def __init__(self):
        self.count = 0
    async def readline(self):
        # await asyncio.sleep(1)
        self.count += 1
        if self.count == 100:
            return None
        return self.count
    def __aiter__(self):
        return self
    async def __anext__(self):
        val = await self.readline()
        if val is None:
            raise StopAsyncIteration
        return val
async def func():
    # Create Asynchronous Iterable Objects
    async_iter = Reader()
    # async for must be placed inside the async def function, otherwise the syntax is incorrect.
    async for item in async_iter:
        print(item)
asyncio.run(func())

Asynchronous iterators themselves don't do much; they simply make the async for syntax possible.
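For comparison, an async generator (Python 3.6+) gives the same async for support with far less code than the hand-written __aiter__/__anext__ class above (reader here is an illustrative name):

```python
import asyncio

# An async generator: async for works on it directly,
# no __aiter__/__anext__ boilerplate required.
async def reader(n):
    for i in range(1, n + 1):
        await asyncio.sleep(0)   # a stand-in for a real IO wait
        yield i

async def func():
    async for item in reader(3):
        print(item)              # 1, 2, 3

asyncio.run(func())
```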

3.2.7 Asynchronous Context Manager

An object that controls the environment of an async with statement by defining __aenter__() and __aexit__() methods. Introduced by PEP 492.

import asyncio
class AsyncContextManager:
    def __init__(self):
        self.conn = None
    async def do_something(self):
        # Asynchronous Operation Database
        return 666
    async def __aenter__(self):
        # Asynchronously Linked Database
        self.conn = await asyncio.sleep(1)
        return self
    async def __aexit__(self, exc_type, exc, tb):
        # Close database links asynchronously
        await asyncio.sleep(1)
async def func():
    async with AsyncContextManager() as f:
        result = await f.do_something()
        print(result)
asyncio.run(func())

The asynchronous context manager is quite handy: you can use it whenever development involves open, process, close steps.
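As a shorter alternative to the class above, contextlib.asynccontextmanager (Python 3.7+) builds an async context manager from a single async generator; the connection here is faked with sleep:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def open_conn():
    await asyncio.sleep(0.1)      # pretend to connect
    try:
        yield "fake-conn"         # the value bound by "async with ... as"
    finally:
        await asyncio.sleep(0.1)  # pretend to close

async def func():
    async with open_conn() as conn:
        print(conn)  # fake-conn

asyncio.run(func())
```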

3.3 Summary

Whenever you see the async and await keywords in a program, its internals are coroutine-based asynchronous programming: a single thread uses IO wait time to run other tasks, achieving concurrency.

These are the common operations of asynchronous programming; see the official documentation for more.

4. uvloop

Python's standard library provides the asyncio module to support coroutine-based asynchronous programming.

uvloop is a drop-in replacement for asyncio's event loop that improves asyncio performance. In fact, uvloop is claimed to be at least twice as fast as nodejs, gevent, and other Python async frameworks, with performance approaching that of Go.

Install uvloop

pip3 install uvloop

Replacing asyncio's event loop with uvloop in a project is also easy; just add this to your code:

import asyncio
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
# Write asyncio code exactly as before.
# The event loop used internally automatically becomes uvloop
asyncio.run(...)

Note: the well-known ASGI server uvicorn uses the uvloop event loop internally.

5. Practical Examples

For easier understanding, the IO in all the examples above was simulated with asyncio.sleep; in real project development there is plenty of genuine IO.

5.1 Asynchronous Redis

When operating redis from Python, connecting, setting values, and getting values all involve network IO requests. With asyncio you can do other work during the IO waits to improve performance.

Install the Python module for asynchronous redis operations

pip3 install aioredis

Example 1: Asynchronous operation redis.

#!/usr/bin/env python
# -*- coding:utf-8 -*-
import asyncio
import aioredis
async def execute(address, password):
    print("Start execution", address)
    # Network IO operations: Create redis connections
    redis = await aioredis.create_redis(address, password=password)
    # Network IO operation: Set the hash value car in redis, and set three key-value pairs internally: redis = {car:{key1:1, key2:2, key3:3}}
    await redis.hmset_dict('car', key1=1, key2=2, key3=3)
    # Network IO operations: get values in redis
    result = await redis.hgetall('car', encoding='utf-8')
    print(result)
    redis.close()
    # Network IO operations: close redis connection
    await redis.wait_closed()
    print("End", address)
asyncio.run(execute('redis://47.93.4.198:6379', "root!2345"))

Example 2: connect to multiple redis servers and operate on them (switching to other tasks during IO improves performance).

import asyncio
import aioredis
async def execute(address, password):
    print("Start execution", address)
    # Network IO operations: connect 47.93.4.197:6379 first, switch tasks automatically when IO is encountered, connect 47.93.4.198:6379
    redis = await aioredis.create_redis_pool(address, password=password)
    # Network IO Operations: Automatically switch tasks when IO is encountered
    await redis.hmset_dict('car', key1=1, key2=2, key3=3)
    # Network IO Operations: Automatically switch tasks when IO is encountered
    result = await redis.hgetall('car', encoding='utf-8')
    print(result)
    redis.close()
    # Network IO Operations: Automatically switch tasks when IO is encountered
    await redis.wait_closed()
    print("End", address)
task_list = [
    execute('redis://47.93.4.197:6379', "root!2345"),
    execute('redis://47.93.4.198:6379', "root!2345")
]
asyncio.run(asyncio.wait(task_list))

More redis operations refer to aioredis website: https://aioredis.readthedocs.io/en/v1.3.0/start.html

5.2 Asynchronous MySQL

When operating MySQL from Python, connecting, executing SQL, and closing all involve network IO requests. With asyncio you can do other work during the IO waits to improve performance.

Install the Python module for asynchronous MySQL operations

pip3 install aiomysql

Example 1:

import asyncio
import aiomysql
async def execute():
    # Network IO operations: Connect MySQL
    conn = await aiomysql.connect(host='127.0.0.1', port=3306, user='root', password='123', db='mysql', )
    # Network IO operations: Create CURSOR
    cur = await conn.cursor()
    # Network IO operations: Execute SQL
    await cur.execute("SELECT Host,User FROM user")
    # Network IO operations: Get SQL results
    result = await cur.fetchall()
    print(result)
    # Network IO operations: close link
    await cur.close()
    conn.close()
asyncio.run(execute())

Example 2:

#!/usr/bin/env python
# -*- coding:utf-8 -*-
import asyncio
import aiomysql
async def execute(host, password):
    print("start", host)
    # Network IO operation: connect to 47.93.40.197 first; when IO is encountered, switch tasks automatically and connect to the next host
    conn = await aiomysql.connect(host=host, port=3306, user='root', password=password, db='mysql')
    # Network IO Operations: Automatically switch tasks when IO is encountered
    cur = await conn.cursor()
    # Network IO Operations: Automatically switch tasks when IO is encountered
    await cur.execute("SELECT Host,User FROM user")
    # Network IO Operations: Automatically switch tasks when IO is encountered
    result = await cur.fetchall()
    print(result)
    # Network IO Operations: Automatically switch tasks when IO is encountered
    await cur.close()
    conn.close()
    print("End", host)
task_list = [
    execute('47.93.40.197', "root!2345"),
    execute('47.93.40.197', "root!2345")
]
asyncio.run(asyncio.wait(task_list))

5.3 FastAPI Framework

FastAPI is a high-performance web framework for building APIs based on Python 3.6+ type hints.

The following asynchronous examples use FastAPI and uvicorn (uvicorn is an ASGI server that supports async).

Install the FastAPI web framework,

pip3 install fastapi

Install uvicorn, an ASGI server that essentially provides socket server support for the web app (ASGI servers generally support async; WSGI does not)

pip3 install uvicorn

Example:

#!/usr/bin/env python
# -*- coding:utf-8 -*-
import asyncio
import uvicorn
import aioredis
from aioredis import Redis
from fastapi import FastAPI
app = FastAPI()
REDIS_POOL = aioredis.ConnectionsPool('redis://47.193.14.198:6379', password="root123", minsize=1, maxsize=10)
@app.get("/")
def index():
    """ General Operating Interface """
    return {"message": "Hello World"}
@app.get("/red")
async def red():
    """ Asynchronous Operating Interface """
    print("A request has come")
    await asyncio.sleep(3)
    # Connection pool to get a connection
    conn = await REDIS_POOL.acquire()
    redis = Redis(conn)
    # Set Value
    await redis.hmset_dict('car', key1=1, key2=2, key3=3)
    # Read Value
    result = await redis.hgetall('car', encoding='utf-8')
    print(result)
    # Connection Return Connection Pool
    REDIS_POOL.release(conn)
    return result
if __name__ == '__main__':
    uvicorn.run("luffy:app", host="127.0.0.1", port=5000, log_level="info")

When many users send concurrent requests, an endpoint written asynchronously can handle other requests during IO waits, improving performance.

For example, two users simultaneously request http://127.0.0.1:5000/red. The server has a single thread and processes one request at a time. Asynchronous processing provides concurrency: while the view function handling the first request waits on IO, the thread automatically switches to receive and process the second request; whenever an IO wait is hit, it switches to other requests, and once the IO completes it returns to the pending request and continues executing its code.

5.4 Crawlers

Writing crawler applications requires network IO to request target data, a scenario well suited to asynchronous programming for better performance. Below we use the aiohttp module, which supports async.

Install aiohttp module

pip3 install aiohttp

Example:

import aiohttp
import asyncio
async def fetch(session, url):
    print("Send request:", url)
    async with session.get(url, verify_ssl=False) as response:
        text = await response.text()
        print("Result:", url, len(text))
async def main():
    async with aiohttp.ClientSession() as session:
        url_list = [
            'https://python.org',
            'https://www.baidu.com',
            'https://www.pythonav.com'
        ]
        tasks = [asyncio.create_task(fetch(session, url)) for url in url_list]
        await asyncio.wait(tasks)
if __name__ == '__main__':
    asyncio.run(main())

Summary

To improve performance, more and more frameworks are embracing asynchronous programming, such as sanic, tornado, Django 3.0 and the django channels component, and so on. Why not do more with fewer resources?

Tags: Python Redis network Programming

Posted on Tue, 05 May 2020 21:47:37 -0400 by jonstu