Multiprocessing queue uses more cores than specified

I have been following the examples in the multiprocessing section of the documentation.

You can specify NUMBER_OF_PROCESSES = 2 and then do

for i in range(NUMBER_OF_PROCESSES):
    Process(target=doDemGofntQ, args=(task_queue, done_queue)).start()

but it still seems to use all available cores (4 in my case).


Regarding your code: for each loop iteration, what is the code doing differently with respect to the previous iteration, or, by extension, the subsequent iteration?


I have a Python project on GitHub that calculates various radiative properties of atomic ions such as C III (twice-ionized carbon) using the CHIANTI atomic database. A user of my package found that the classes that use multiprocessing took up too many of his cores. These use the multiprocessing queue technique. I have been testing it out and find that one can set the number of processes, but that does not mean the work is limited to that number of cores.
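If the goal is to actually cap the cores used, rather than just the number of worker processes, one Linux-only option is to restrict the CPU affinity of the parent before starting workers; affinity is inherited by child processes. A sketch:

```python
import os

# Linux-only: restrict this process (pid 0 means "self") to CPUs 0 and 1.
# Worker processes started afterwards inherit this affinity mask, so the
# whole multiprocessing job is confined to those two cores.
os.sched_setaffinity(0, {0, 1})

print(os.sched_getaffinity(0))
```

This does not reduce the number of processes, only which cores the scheduler may place them on.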

To answer your question: in each task I instantiate a class that reads all of the atomic data it needs for subsequent calculations and keeps these data as attributes. The next step in the task is to compute the wavelengths and intensities of spectral lines for that ion as a function of temperature and density. These are also stored as attributes.
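To make the shape of such a task concrete, here is a hypothetical sketch; the names IonModel, line_spectrum, and do_task are illustrative stand-ins, not the actual API of my package:

```python
from dataclasses import dataclass, field

@dataclass
class IonModel:
    # Illustrative stand-in for the real ion class.
    ion_name: str
    atomic_data: dict = field(init=False)

    def __post_init__(self):
        # In the real code this step reads the CHIANTI database files;
        # here a placeholder dict is kept as an attribute instead.
        self.atomic_data = {'ion': self.ion_name, 'levels': [1, 2, 3]}

    def line_spectrum(self, temperature, density):
        # Placeholder for the wavelength/intensity calculation; the
        # result is stored as an attribute, as described above.
        self.spectrum = {
            'T': temperature,
            'n_e': density,
            'intensity': [lv * temperature for lv in self.atomic_data['levels']],
        }
        return self.spectrum

def do_task(ion_name, temperature, density):
    # One task: build the object (reads data), then compute the spectrum.
    ion = IonModel(ion_name)
    return ion.line_spectrum(temperature, density)
```

The point is that each task object accumulates the atomic data and the computed spectrum as attributes, so the object handed back through the queue can get large.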

Looking around at various discussions of multiprocessing, it seems that multiprocessing carries a large overhead, and the task objects are probably pretty large. So, in the end, I am not getting much benefit from multiprocessing.
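One way to see where that overhead comes from: everything sent through a multiprocessing.Queue is serialized with pickle, so objects that keep large datasets as attributes are expensive to move between processes. A rough sketch of measuring that cost:

```python
import pickle

class Task:
    def __init__(self, n):
        # Simulate an object that keeps a large dataset as an attribute.
        self.data = list(range(n))

def pickled_size(obj):
    # Every object passed through a multiprocessing.Queue is pickled;
    # this measures the number of bytes that transfer would move.
    return len(pickle.dumps(obj))

small = Task(10)
large = Task(1_000_000)
print(pickled_size(small), pickled_size(large))
```

If the per-task computation is short relative to the time spent pickling and transferring these objects, the parallel speedup largely disappears.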

Probably more than you wanted to hear, but it does explain what is going on.