Not sure this is the right place to ask this question, but I'll try anyway. I'm running a simulation that uses the SciPy optimizer `differential_evolution` with `workers=8`. That means it uses parallel processing, and in my case it consumes about 50% of the CPU capacity (12th Gen Intel(R) Core(TM) i9-12900, 2.40 GHz). The problem is that the rest of my applications, like the Edge browser and other programs, become non-responsive for longer or shorter periods. I'm running the script from an IPython shell within the Spyder IDE.
Has anybody experienced something similar, and do you have any idea what to do about it?
How much memory is it using? I suspect that’s the culprit, rather than CPU.
This isn't really a Python thing; it's just a resource issue. If you're using 50% of your CPUs and some large amount of RAM, things are gonna start chugging.
The total memory usage is 16 of 32 GB, so that shouldn't be the problem. I agree it's not a Python problem directly, but I was thinking in terms of how `workers` distributes execution across cores, and whether some execution somehow ends up on a core that is already being used by other apps, etc.
I know it's a long shot, and that I may have to live with it, but I find it strange that I can't practically utilize my high-performance PC to more than 50%…
One option is to set a CPU affinity for the Python processes in Windows. In other words, you would be forcing them to run only on certain cores. That way you can leave a core or two completely free for other processes to schedule against.
This Stack Overflow link explains how you can do that: windows - Launch program with python on a specific core - Stack Overflow
Note that you would need to do it in each worker process. I don't know enough about SciPy to say whether that's easily doable. If you have a way to run a function in each worker, you could just run a function there that sets the affinity.