Memory usage management with the garbage collector vs. concurrent.futures

Dear Python experts,

I am trying to run something that looks like:

import concurrent.futures as cf

#-----------------------------------------
def high_memory_usage():
    # stuff here
    ...
#-----------------------------------------
for i_run in range(10):
    with cf.ProcessPoolExecutor(max_workers=1) as executor:
        executor.submit(high_memory_usage)

to keep my memory usage under 4 GB as the loop progresses. However, the code inside high_memory_usage no longer seems to work when run through concurrent.futures. If I don't use an executor at all, the memory keeps growing. I am thinking of calling the garbage collector:

gc.collect()

after each iteration. Is this a good idea? Is there a better way to do this?
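One thing worth noting about the pattern above: `executor.submit()` returns a future, and any exception raised inside the worker stays hidden until you call `.result()` on it. That may be why the code "does not seem to work anymore". A minimal sketch of the per-iteration-executor pattern with error propagation (the allocation inside `high_memory_usage` is a placeholder standing in for the real work):

```python
import concurrent.futures as cf
import gc

def high_memory_usage(i_run):
    # placeholder for the real allocation-heavy work
    data = [0] * 1_000_000
    return len(data)

def main():
    results = []
    for i_run in range(3):
        # A fresh executor per iteration, so the worker process (and all
        # memory it allocated) is torn down when the `with` block exits.
        with cf.ProcessPoolExecutor(max_workers=1) as executor:
            future = executor.submit(high_memory_usage, i_run)
            # .result() blocks until the worker finishes and re-raises any
            # exception from the child; without it, failures pass silently.
            results.append(future.result())
        gc.collect()  # optional: clears parent-side reference cycles between runs
    return results

if __name__ == "__main__":
    main()
```

Since the heavy allocation happens in a child process that exits each iteration, the parent's footprint should stay roughly flat; `gc.collect()` in the parent mostly matters if the parent itself builds up cyclic garbage between runs.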

Cheers.

Try using a memory profiler like memray first, so you can see where the memory is actually being held before reaching for gc.collect().
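For reference, a typical memray session looks roughly like this (`my_script.py` is a placeholder for your own entry point):

```shell
# install the profiler
pip install memray

# run the script under memray, writing an allocation trace
python3 -m memray run -o output.bin my_script.py

# turn the trace into an interactive flame graph (HTML)
python3 -m memray flamegraph output.bin
```

The flame graph should show which call paths are holding memory as the loop progresses, which tells you whether the growth is in the parent process or the workers.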