Please enlighten me as to what this "asynchronous execution mode" is about :(

I first met the titular term when I was using JavaScript. Now I see that Python has also got an async keyword, so it must have got asynchronous execution too (I’m such a master of deduction :wink: )

After a lot of thinking I concluded that I really don’t understand how this asynchronous-thing works and how it is different from concurrent execution :frowning:

So, I have questions:

  • If we have task T1 taking time t1 and tasks T2 and T3 taking times t2 and t3 to execute, then how long does it take to run tasks T1, T2, and T3 concurrently, and how long asynchronously?
  • Is there a difference between running asynchronous tasks on a single-core CPU and on a multi-core one?
  • What is the use of asynchronous execution?

Unfortunately this isn’t enough information to answer the question.

  • When running “concurrently” (I assume you mean “with threads”?) how much time do the tasks spend holding the GIL (and other locks)?
  • When running asynchronously, how much time do the tasks spend on the event loop, and how much time do they spend waiting on I/O?

Depending on the answers to both, the time could be t1 + t2 + t3 (rough worst case) or max(t1, t2, t3) (rough best case). Though when using threads, if you mess up, it could end up even worse than the worst case, as each task ends up spending longer than it should fighting for shared resources. Or even taking forever if you deadlock.
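
To make the “rough best case” concrete, here is a minimal sketch (the task names and durations are made up, and asyncio.sleep stands in for real I/O waits) where the three waits overlap, so the total wall-clock time is roughly max(t1, t2, t3) rather than t1 + t2 + t3:

```python
import asyncio
import time

async def task(name: str, seconds: float) -> None:
    # asyncio.sleep stands in for "waiting on I/O": it yields to the
    # event loop instead of blocking the whole program.
    await asyncio.sleep(seconds)
    print(f"{name} finished after {seconds}s")

async def main() -> None:
    start = time.perf_counter()
    # T1, T2, and T3 all wait at the same time.
    await asyncio.gather(task("T1", 1.0), task("T2", 2.0), task("T3", 3.0))
    print(f"total: {time.perf_counter() - start:.1f}s")  # ~3s, i.e. max(t1, t2, t3)

asyncio.run(main())
```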

In general, pre-emptive threading will exploit more available concurrency, but it is more prone to bad behaviours causing worse-than-serial performance, and it also loses more potential performance because it needs expensive mechanisms to avoid data races that cooperative asynchronous code can avoid.

Threads and coroutines both provide concurrency. The difference is that threads are scheduled preemptively (the thread manager can suspend a thread at any time in order to allow another thread to run) while coroutines are scheduled cooperatively (a coroutine can run until it voluntarily yields to another coroutine.)

On one hand, threads can guarantee fairness better than coroutines, because the thread manager can ensure that each thread that wants the CPU can get an equal slice of time. Coroutines have to rely on other coroutines to not “hog” the CPU.

On the other hand (assuming I’m not oversimplifying this too much), coroutines don’t need to use complicated locking protocols to ensure that their “reservations” for shared resources are respected if they are preempted before they are done with a resource; they simply don’t yield until they are done. As such, locks are implicitly “acquired” when a coroutine is selected to run, and implicitly “released” when the coroutine yields.

In neither case can you really predict how long it will take a set of tasks to complete, because execution time depends in either case on the exact blocking behavior needed by all tasks. But in both cases you can assume that if one task or thread can’t use the CPU, control can/will be given to one that can, eliminating unnecessary idle time.
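
As a small illustration of the “hogging” point (all function names here are invented): a coroutine only gives up the CPU at an await, so a busy loop with no awaits stalls every other coroutine, while sprinkling in await asyncio.sleep(0) lets the event loop interleave them:

```python
import asyncio

async def hog() -> None:
    # No await inside the loop: once scheduled, this coroutine keeps the
    # event loop to itself until it finishes.
    total = 0
    for i in range(10_000_000):
        total += i

async def polite() -> None:
    # await asyncio.sleep(0) voluntarily hands control back to the event
    # loop, so other coroutines can make progress between chunks of work.
    total = 0
    for i in range(10_000_000):
        total += i
        if i % 1_000_000 == 0:
            await asyncio.sleep(0)

async def heartbeat() -> None:
    for _ in range(5):
        print("still responsive")
        await asyncio.sleep(0.1)

async def main() -> None:
    # Swap polite() for hog() and the heartbeat freezes until hog() is done.
    await asyncio.gather(polite(), heartbeat())

asyncio.run(main())
```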

Async is designed for cases where a process has multiple I/O-bound tasks, such as a web server or an online service like the one we are using right now.

Currently, using multiple cores with Python for multiple compute-bound tasks requires using multiple processes, as with the multiprocessing module. This should change when the experimental free-threading build is more advanced.
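
For completeness, a minimal sketch of that approach with the multiprocessing module (the prime-counting task and the pool size are just illustrative):

```python
from multiprocessing import Pool

def count_primes(limit: int) -> int:
    # A deliberately compute-bound task: no I/O to wait on, just arithmetic.
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Each worker is a separate process with its own interpreter (and its
    # own GIL), so the four tasks can genuinely use four cores at once.
    with Pool(processes=4) as pool:
        results = pool.map(count_primes, [20_000, 30_000, 40_000, 50_000])
    print(results)
```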

Hmm…you all shed some light on the topic for me. Thanks!

*takes his favorite trench coat and goes out. Then, suddenly he appears again in the doors*

One more thing…

In a ye oldie post I once made on SO about “multithread asynchronous execution in JS”, someone commented that JS is not “multithreaded”. Looking at that thread and at this one, I have to ask:
Is “async” a way, like multithreading, to do things all at the same time? (Assuming, of course, that we omit all the locks, semaphores, and fighting for shared resources.) If, e.g., we had some simple tasks like π-counting, then in a multithreaded execution they would run in parallel. Is it in any way similar with async?

Sorry for such noob questions, but I really want to know how the concept of asynchronous execution works inside >_>

Yes and no. Some things can happen in parallel, mostly things that are specifically designated, like waiting on a socket to read from a web server. Everything else, at least generally, happens synchronously on the same thread. In Python, with the default setup of the event loop, only one coroutine will ever be executing at a time, although multiple coroutines might be conceptually active, with some of them waiting on something (by having reached an await expression/statement).
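
To tie that back to the π-counting example from the question, here is a rough sketch (the term counts and timings are arbitrary) showing that two compute-only coroutines run one after the other on the single event-loop thread, while two waits overlap:

```python
import asyncio
import time

async def compute_pi(terms: int) -> float:
    # Pure computation with no await inside: once started, this coroutine
    # keeps the event loop until it returns.
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

async def main() -> None:
    start = time.perf_counter()
    # Both coroutines are scheduled "concurrently", but they execute one
    # after the other on the same thread, so the times add up.
    await asyncio.gather(compute_pi(2_000_000), compute_pi(2_000_000))
    print(f"compute-bound: {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    # Waiting does overlap: two 1-second sleeps take about 1 second total.
    await asyncio.gather(asyncio.sleep(1), asyncio.sleep(1))
    print(f"I/O-bound: {time.perf_counter() - start:.2f}s")

asyncio.run(main())
```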

OK, let me sum up what I’ve learned:

“Async” is not a way to run things in parallel

Although tasks can be put into threads and run there, by definition async is not multitasking.

It is a method of “slicing” time and tasks and assigning them to each other in pairs

I understand it as a kind of reordering of the execution of pieces of tasks, just like what we had in the ye oldie times of the first operating systems with GUI applications.

If someone doesn’t know what I mean: back then, OSes gave an app an infinite amount of time to do what it wanted to do. Yet in order to give the user an impression of “multitasking”, whenever an app asked the OS for something, the OS suspended the app and let another app run, until it also called a system routine, and so on. The times of OSes doing preemptive execution of apps came after that.

It has an advantage over multitasking when tasks are operating on shared resources

Because only one task is running at a time, there are no race conditions (although I can still imagine a deadlock situation)

It requires some kind of task manager to make decisions about which task to run at a time

I suppose in JS there is a built-in manager, because people put async/await all over the place. However, in Python one has to create a manager on one’s own (by using some library like “asyncio”).
The manager then listens for which task has await-ed.
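
For reference, in Python that manager is asyncio’s event loop, started with asyncio.run(); a minimal sketch of handing tasks to it (the worker names and delays are made up):

```python
import asyncio

async def worker(name: str, delay: float) -> None:
    print(f"{name} waiting")
    # At this await the coroutine tells the event loop it is waiting, and
    # the loop switches to whichever other task is ready to run.
    await asyncio.sleep(delay)
    print(f"{name} resumed")

async def main() -> None:
    # create_task() hands a coroutine to the event loop (the "manager");
    # from then on the loop decides when each task gets to run.
    t1 = asyncio.create_task(worker("A", 0.2))
    t2 = asyncio.create_task(worker("B", 0.1))
    await t1
    await t2

# asyncio.run() starts the event loop, runs main() to completion, then closes the loop.
asyncio.run(main())
```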

In Python an async function does not create a promise object automatically

This is in contrast with JS, where every call to an async function makes a promise unless it is preceded with the “await” keyword.
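
A tiny sketch of the Python side of that (the function names are invented): calling an async function only builds a coroutine object, which runs once it is awaited or wrapped in a Task:

```python
import asyncio

async def answer() -> int:
    return 42

async def main() -> None:
    # Calling an async function does NOT start it; it only creates a
    # coroutine object.
    coro = answer()
    print(type(coro))  # <class 'coroutine'>

    # The body runs only when the coroutine is awaited...
    print(await coro)  # 42

    # ...or when it is wrapped in a Task, which schedules it on the event
    # loop right away (closer to how a JS promise starts running).
    task = asyncio.create_task(answer())
    print(await task)  # 42

asyncio.run(main())
```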

I think that is all. I’m enlightened :smiley: Thank you all!

That depends on your definition of multitasking and what exactly you are trying to do. If you consider waiting on multiple sockets at the same time a kind of multitasking, then yes, async does allow for running things in parallel. asyncio in particular isn’t just about slicing up time between multiple different operations, but about taking advantage of slow I/O operations to do other work.