You might want to check out aiometer, which encapsulates a lot of the tricky parts of “run a bunch of jobs, but not too many”.
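As a rough sketch of what that looks like (from memory of aiometer’s README, so check the docs for the current API; the URLs and limits here are made up, and httpx is just for illustration):

```python
import asyncio
import functools

import aiometer
import httpx

async def fetch(client: httpx.AsyncClient, url: str) -> int:
    # One unit of work: fetch a URL and return its status code.
    response = await client.get(url)
    return response.status_code

async def main() -> None:
    urls = [f"https://example.com/{i}" for i in range(100)]  # hypothetical
    async with httpx.AsyncClient() as client:
        # amap runs fetch over all urls, but never more than
        # max_at_once concurrently, and never starts jobs faster
        # than max_per_second.
        async with aiometer.amap(
            functools.partial(fetch, client),
            urls,
            max_at_once=5,
            max_per_second=10,
        ) as results:
            async for status in results:
                print(status)

asyncio.run(main())
```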
Looks nice. One oddity is that if I hit Ctrl-C in my test harness (for N in ... try amap except Exception), the current loop quits but there’s no traceback and the next loop starts. Hitting Ctrl-C a few times repeatedly exits via a traceback including GetQueuedCompletionStatus.
Before I go off and try debugging intermittent Ctrl-C errors, and report the problem to aiometer, is this somehow “to be expected”? I recall from a different conversation you suggested that asyncio isn’t entirely Ctrl-C safe. But I’d consider “handling user interrupts” one of the “tricky parts” that I’d want a library to handle for me behind the scenes – is that a reasonable expectation? The reason I’m interested is that when I’m downloading so many files in one job, being able to stop it cleanly to give the server a break is reasonably important.
I’m not sure what the current state of the art is for control-C handling in asyncio. I doubt aiometer does anything special with it; it’s just a utility library, and control-C handling is more of a platform thing. So the issue might be a bug in aiometer, or a more fundamental issue with the asyncio loop, I dunno – I’ll let those who are more familiar with asyncio answer that one.
Cool. And I can always just raise it with aiometer and let them decide. Thanks.
https://www.python-httpx.org/advanced/#pool-limit-configuration
I believe that’s all you need to limit concurrency; no queues are needed.
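For illustration, pool limits look something like this (the numbers and URLs are made up). One caveat: requests queued behind a full pool are still subject to httpx’s pool timeout, so for long jobs you may want to relax it:

```python
import asyncio

import httpx

async def main() -> None:
    urls = [f"https://example.com/{i}" for i in range(100)]  # hypothetical

    # At most 10 connections total, so at most 10 requests in flight.
    limits = httpx.Limits(max_connections=10, max_keepalive_connections=5)

    # Requests waiting for a free connection would otherwise raise
    # PoolTimeout after the default 5 seconds; disable the pool timeout.
    timeout = httpx.Timeout(10.0, pool=None)

    async with httpx.AsyncClient(limits=limits, timeout=timeout) as client:
        responses = await asyncio.gather(*(client.get(u) for u in urls))
        print([r.status_code for r in responses])

asyncio.run(main())
```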
A while ago I did a small proof-of-concept TaskPoolExecutor: asynconcurrent/futures.py at main · berislavlopac/asynconcurrent · GitHub – based on Making an Unlimited Number of Requests with Python aiohttp + pypeln | by Cristian Garcia | Medium (FWIW).
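Not the linked implementation, but as a rough illustration of the general idea, a pool like that can be built on little more than a semaphore:

```python
import asyncio

class TaskPoolExecutor:
    """Illustrative sketch only: a semaphore-bounded task pool,
    not the implementation linked above."""

    def __init__(self, max_workers: int = 10) -> None:
        self._semaphore = asyncio.Semaphore(max_workers)
        self._tasks: list[asyncio.Task] = []

    async def _bounded(self, coro):
        # Each task waits for a free slot before running its coroutine.
        async with self._semaphore:
            return await coro

    def submit(self, coro) -> asyncio.Task:
        task = asyncio.create_task(self._bounded(coro))
        self._tasks.append(task)
        return task

    async def join(self) -> list:
        # Wait for everything submitted so far and collect the results.
        return await asyncio.gather(*self._tasks)

async def main() -> None:
    async def work(n: int) -> int:
        await asyncio.sleep(0.1)
        return n * n

    pool = TaskPoolExecutor(max_workers=5)
    for n in range(20):
        pool.submit(work(n))
    print(await pool.join())

asyncio.run(main())
```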