Limiting the memory usage of the process

Is there a way to limit the memory usage of the Python process when running a script?

Even though unused memory isn't a problem in itself, the garbage collector seems to do only the bare minimum until memory usage hits the limit, and the process then clashes with the browser, IDE, etc.

But the memory actually referenced at any given time is far less than this maximum. It causes delays when I start another high-memory process, and small lags when the GC only then seems to start working.
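To answer the literal question: on Unix-like systems (not Windows), the standard-library `resource` module can cap a process's address space, so runaway allocation raises `MemoryError` instead of thrashing the machine. A minimal sketch, assuming a Linux host and an arbitrary 4 GiB cap:

```python
import resource  # Unix-only; there is no equivalent on Windows

# Cap this process's virtual address space at 4 GiB (soft limit).
# Allocations beyond the cap fail at the C level, which CPython
# surfaces as a MemoryError instead of swapping the machine to death.
cap = 4 * 1024 ** 3
_soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (cap, hard))
```

This limits the whole interpreter, not individual objects, so the script needs to be prepared to catch `MemoryError` at sensible points.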

Thanks for your answer

That’s simply not the case. Python will release memory as soon as it doesn’t need it (broadly speaking). I’ve no idea how you’re calculating how much memory is “actually referenced”, but chances are, all that memory really is being used.

Those sound like exactly the sorts of issues that running out of RAM causes, and they have nothing to do with garbage collection.

I'm looking at the memory reported as allocated to the process in Task Manager.

It is highly unlikely that all of that memory is actually needed, rather than just waiting to be GC'd later, because the script runs through an amount of data vastly exceeding the total capacity of my RAM. In fact, I initially made a mistake where the smallest unit of data in each iteration was never dereferenced; after billions of iterations this eventually made my laptop lag permanently until I killed the process.

Now the memory spikes to basically the maximum every time, but whenever I open another process requiring a large chunk of memory, the Python process readily frees it. That apparently causes the minor lag, but afterwards everything runs without a hitch at the lower memory level (observed through Task Manager).
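One way to check how much memory the interpreter itself still references, independent of what Task Manager shows for the whole process, is the standard-library `tracemalloc` module (a sketch; the ~10 MiB figure is just for illustration):

```python
import tracemalloc

tracemalloc.start()

# Allocate roughly 10 MiB and keep it referenced.
data = [bytearray(1024) for _ in range(10_000)]
referenced, _peak = tracemalloc.get_traced_memory()

# Drop the only reference: CPython frees these objects immediately
# via reference counting; no cycle-collector pass is needed.
del data
after, _peak = tracemalloc.get_traced_memory()

tracemalloc.stop()
```

If `after` drops back down while Task Manager still shows a high working set, the gap is memory held by the allocator layers below Python rather than by live Python objects.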

There are layers of memory management.

  1. The OS (Windows, Linux, etc.)
  2. The C runtime's memory management
  3. The Python runtime's memory management

It is often the case that Python has returned the memory to the C runtime, but the C runtime cannot return it to the OS.
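As an illustration of that boundary: on glibc-based Linux you can explicitly ask the C runtime to hand free heap memory back to the OS via `malloc_trim`. This is a glibc-specific sketch via `ctypes`; it won't work on Windows or with other C libraries such as musl:

```python
import ctypes
import ctypes.util

# glibc's malloc_trim(0) releases free memory from the heap back to
# the OS where it can. On non-glibc platforms this symbol may not exist.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
released = libc.malloc_trim(0)  # 1 if some memory was released, else 0
```

That such a call exists at all shows the decision is made below Python: the interpreter can free an object without the pages ever leaving the process.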

If you think it really is a matter of the GC needing to run, you can run it yourself to check.
Note that the GC is only needed to delete Python objects involved in cyclic references; everything else is freed immediately by reference counting.

import gc
gc.collect()
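To see the cycle collector at work, here is a small sketch with a deliberate reference cycle (the `Node` class is made up for illustration):

```python
import gc

class Node:
    """Toy object that can point at another Node."""
    def __init__(self):
        self.other = None

a, b = Node(), Node()
a.other, b.other = b, a  # reference cycle: refcounts never reach zero

del a, b             # the cycle is now unreachable, but not yet freed
found = gc.collect() # returns the number of unreachable objects found
```

Without the cycle, `del` alone would have freed both objects on the spot; only cycles need this extra pass.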

Hm, I see. Why is that, though?
Why does the C runtime not return the memory to the system if it is free? So is it CPython's fault, or is it a flaw of the C runtime?

AFAIK this is not CPython's fault, and not even just the "fault" of the C runtime; it's also related to the OS and the hardware.
When you have a low-level memory-management layer, it's easy to see, I think, that immediately giving memory back doesn't always make sense. For one, it can hurt performance. It can also cause memory fragmentation. And even if the memory were given back, it might not really be usable by the program (or other programs). Dealing with those kinds of issues is not at all trivial.
"Why?" is always a good question 🙂 If you want to know why in more detail, it might be best to search for how memory management works in modern OSes.