I’m writing a complex Python project using the multiprocessing library, and I’m running into memory problems. It’s difficult to describe the entire project, but I’d like to ask for suggestions. The code often stops when I copy or read many big files in different processes (launched via multiprocessing).

Is it possible to add a check for available RAM and retry an operation once memory is available again? For example, if a file copy fails with a bad-allocation memory error, could the copy command (like many other commands) be repeated until it succeeds, without stopping the entire project? Unfortunately I don’t have any more RAM available.
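Something like the sketch below is what I have in mind (names like `retry_on_memory_error` are just placeholders, not from any library): a wrapper that catches `MemoryError`, waits a bit so other processes can free RAM, and retries instead of crashing the whole run.

```python
import shutil
import time

def retry_on_memory_error(func, *args, retries=5, wait=10.0, **kwargs):
    """Call func(*args, **kwargs); on MemoryError, wait and retry.

    Re-raises the error only after the last attempt fails, so the
    rest of the project keeps running unless memory never frees up.
    """
    for attempt in range(retries):
        try:
            return func(*args, **kwargs)
        except MemoryError:
            if attempt == retries - 1:
                raise  # out of attempts, give up on this operation
            time.sleep(wait)  # give other worker processes time to release RAM

# intended usage inside a worker process:
# retry_on_memory_error(shutil.copy, "/path/src.bin", "/path/dst.bin")
```

Would an approach like this be reliable, or is there a better pattern for this with multiprocessing?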
Thank you very much.