I think you have the right general idea, but I want to clarify a few points.
- Coroutines in Python are not specifically associated with asyncio. Since the implementation of PEP 492 [1] (and, before it, the legacy generator-based coroutines of PEP 342 [2]), any library or framework can make use of them, as well as the associated async/await syntax.
The main purpose of asyncio is to provide a high-level API for implementing IO-bound concurrency through asynchronous programming. This often comes in the form of coroutines or other objects that use them, but my point is that coroutines are not dependent on asyncio.
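To illustrate that point, here’s a minimal sketch of driving a coroutine object without asyncio at all, using the coroutine protocol directly (the names greet and run are made up for this example):

```python
async def greet(name):
    return f"Hello, {name}!"

def run(coro):
    # Step the coroutine manually. A coroutine that never suspends
    # finishes on the first send() and raises StopIteration, with its
    # return value attached to the exception.
    try:
        coro.send(None)
    except StopIteration as exc:
        return exc.value

print(run(greet("world")))  # Hello, world!
```

This is essentially what an event loop does under the hood, with scheduling and IO multiplexing layered on top.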
- While they can be used for a similar purpose, I wouldn’t say that coroutines necessarily “represent the same thing as a thread”. OS threads each have their own program counter and their own stack, separate from one another; this is not true for coroutines.
A clearer way to describe coroutines at a high level is that they’re essentially objects that represent the state of a function/method (subroutine), and can be suspended and resumed (through usage of await) at multiple points. This is unlike a subroutine [3], which has only one point of entry and exit.
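As a sketch of that suspend/resume behavior, a coroutine can be stepped manually through multiple suspension points (the Suspend awaitable here is purely illustrative, not part of any library):

```python
class Suspend:
    def __await__(self):
        # Yielding here suspends the coroutine; the value later passed
        # to send() becomes the result of the await expression.
        value = yield "suspended"
        return value

async def multi_step():
    a = await Suspend()  # first suspension point
    b = await Suspend()  # second suspension point
    return a + b

coro = multi_step()
print(coro.send(None))   # suspended (paused at the first await)
print(coro.send(10))     # suspended (paused at the second await)
try:
    coro.send(32)
except StopIteration as exc:
    print(exc.value)     # 42
```

The same coroutine object is entered and exited three times here, which is exactly what a plain subroutine cannot do.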
Also, OS threads still have a use case within asyncio. Specifically, if you need to run an IO-bound subroutine without blocking the event loop, it can be run in the event loop’s ThreadPoolExecutor (from concurrent.futures) through loop.run_in_executor() [4]. This is especially useful when implementing concurrency for existing code or libraries that were not written with async in mind.
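For example, a rough sketch of offloading a blocking call to the loop’s default executor (blocking_io here is a stand-in for any legacy blocking function):

```python
import asyncio
import time

def blocking_io():
    # Simulates a blocking IO-bound call, e.g. from a library that
    # wasn't written with async in mind.
    time.sleep(0.1)
    return "done"

async def main():
    loop = asyncio.get_running_loop()
    # Passing None as the executor uses the loop's default
    # ThreadPoolExecutor, so the event loop stays responsive while
    # blocking_io() runs in a worker thread.
    return await loop.run_in_executor(None, blocking_io)

print(asyncio.run(main()))  # done
```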
Not only does asyncio not solve the GIL problem, it’s also not a significant factor when dealing with IO-bound tasks. The GIL only becomes significant when implementing concurrency for CPU-bound tasks, which is not the primary focus of asyncio.
For CPU-bound concurrency in Python, we have subprocesses. Process pools can be used in asyncio via loop.run_in_executor(), by passing an instance of concurrent.futures.ProcessPoolExecutor to the executor parameter (instead of using the default one, which is ThreadPoolExecutor).
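A sketch of that pattern (cpu_bound is an illustrative stand-in for real CPU-heavy work):

```python
import asyncio
import concurrent.futures

def cpu_bound(n):
    # CPU-bound work; running it in a separate process sidesteps the
    # GIL of the main interpreter.
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    # Unlike the default ThreadPoolExecutor, an explicit
    # ProcessPoolExecutor is passed as the executor argument.
    with concurrent.futures.ProcessPoolExecutor() as pool:
        return await loop.run_in_executor(pool, cpu_bound, 10_000)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Note that the function and its arguments must be picklable, since they are sent to the worker process, and the `__main__` guard is required on platforms that spawn new processes.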
Note: We’re currently planning on improving the API for using pools in asyncio in Python 3.9. The goal is to provide a more intuitive and user-friendly way of using thread pools and process pools than loop.run_in_executor(). I’m currently in the early stages of implementing an asyncio.ThreadPool().
[1] PEP 492 – Coroutines with async and await syntax | peps.python.org
[2] PEP 342 – Coroutines via Enhanced Generators | peps.python.org
[3] A generator also has more than one point of entry/exit and can suspend via yield. Prior to PEP 342, though, a generator couldn’t have values passed to it or exceptions raised in it at the point where it was resumed; the send() and throw() methods added that capability, which is what made generator-based coroutines possible.