current scenario -

```
def func():
    from sympy import *
    return 1
```

it gives me error,

```
SyntaxError: import * only allowed at module level
```

expected scenario -

`import *`

is allowed from inside function also

My guess is that when Python compiles your code it needs a list of all variables the function will use. The set of names that `import *` would bind can change after the code is compiled, so it cannot be known at compile time.
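
The compile-time nature of the restriction can be seen directly: `compile()` rejects such a function body before anything runs. A minimal sketch:

```python
# `from ... import *` inside a function is rejected at compile time,
# before the function is ever called.
src = """
def func():
    from math import *
    return 1
"""
try:
    compile(src, "<example>", "exec")
except SyntaxError as e:
    print("SyntaxError:", e.msg)
```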

Correct. Before Python 3, `import *` was allowed in functions, and CPython had a separate, slower function implementation that was used for function code containing it.

Correct. Consider:

```
foo = 123

def func():
    from some_module import *
    print(foo)
```

Is `foo` a local name, imported from some_module? Or is it the global?

For that matter, what about the name `print`? That, too, could have been overwritten by the import. This kind of code is VERY hard to deal with, for humans and for the Python interpreter, which had to fall back to much slower run-time lookups for all names.

I strongly recommend just importing the module itself. If you have to use individual names without dots, be explicit - `from sympy import X, Y, Z` - so that both Python and your future self can be confident of what’s getting imported.

Good to know from documentation:

Note that in general the practice of importing `*` from a module or package is frowned upon, since it often causes poorly readable code. However, it is okay to use it to save typing in interactive sessions.

Interactive sessions are the intended use of `import *`. I tend to use `import longname as letter_or_2` anyway.

BITD, some modules were meant to be imported using `from something import *`. The old `Tkinter` comes to mind. Did something change when it was renamed to `tkinter`? A quick `dir()` suggests not. That means Terry’s `import tkinter as tk` is probably the best route to less typing without namespace pollution.

Hi Folks

Is there ever going to be a statement, let’s say “unimport”, to release a previously imported module that is no longer needed?

Suppose I wanted to “unimport” the `tk` name I imported. I would execute:

```
del tk
```

That would remove it from the namespace in which I imported it. You normally don’t want to do that, as any future references (say, in functions which are called later) will fail.
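
A quick sketch of that failure mode, using `json` in place of `tk`:

```python
import json as j

print(j.dumps({"a": 1}))  # works while the name is bound
del j                     # unbinds the name in this namespace only

try:
    j.dumps({})           # any later reference now fails
except NameError as e:
    print("after del:", e)
```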

Oh, that is very clever.

The only thing that accomplishes is that you can no longer access those functions; the resources they took remain in use.

Try the following and let me know what you see:

1- Check the memory usage of Python in Task Manager (Windows) before the import.

2- Execute `import numpy as np`, then check memory usage as per step 1.

3- Execute `del np`, then check memory usage as per step 1.

What I see when I do this is that I lose the way to access numpy as np.

Thanks for your time.

Alexei

The `del` statement simply removes the name from the current namespace.

I think attempting to completely unload the module from memory is pretty much pointless. You can try deleting the name from the `sys.modules` dictionary, but that probably won’t result in memory pages being released either. Here’s what I see in sys.modules after the tkinter import I suggested:

```
>>> import sys
>>> before = set(sys.modules.keys())
>>> import tkinter as tk
>>> after = set(sys.modules.keys())
>>> after - before
{'tkinter', 'tkinter.constants', '_tkinter'}
```

You probably weren’t expecting those other two names. They all came along “for free” with the `tkinter` import. At minimum, you’d have to `del` all three of them. And your memory footprint will likely not be reduced.

Numpy is even worse. Way worse.

```
>>> import sys
>>> before = set(sys.modules.keys())
>>> import numpy as np
>>> after = set(sys.modules.keys())
>>> after - before
{'fcntl', '_posixsubprocess', 'numpy.core.numerictypes', 'numpy.core.fromnumeric', 'selectors', 'numpy.core._dtype', 'threading', 'datetime', 'numpy.version', 'numpy._distributor_init', 'numpy._version', 'numpy._pytesttester', 'numpy.__config__', 'numpy.matrixlib.defmatrix', 'numpy.random._bounded_integers', 'numpy.lib.shape_base', 'numpy.random._generator', 'numpy._globals', '_struct', 'ctypes', 'numpy.lib.utils', 'platform', 'numpy.lib._iotools', 'numpy.core.arrayprint', 'numpy.core.overrides', 'numpy.random._mt19937', 'numpy.linalg', 'numpy.polynomial.hermite', 'numpy.compat', 'numpy.core.einsumfunc', 'numpy.ma.core', 'pickle', 'numpy.core.records', 'numpy.core.memmap', 'numpy.linalg._umath_linalg', 'errno', 'numpy.core._methods', 'weakref', 'numpy.core._internal', 'numpy.lib.histograms', 'numpy.lib.index_tricks', 'numpy.core.umath', 'numpy.lib.function_base', 'numbers', 'numpy.core.numeric', 'numpy.random._sfc64', 'numpy.random.bit_generator', 'select', 'secrets', 'base64', 'numpy.core.defchararray', 'numpy.fft', 'numpy.linalg.linalg', 'numpy.polynomial.chebyshev', 'struct', 'numpy.fft._pocketfft', 'numpy.random._common', 'numpy.lib.scimath', 'numpy.core._machar', 'numpy.core.getlimits', 'numpy.polynomial.polynomial', 'urllib.parse', 'numpy.compat._pep440', 'binascii', 'urllib', 'numpy.lib.npyio', '_pickle', 'pathlib', 'numpy.lib.arraysetops', 'numpy.polynomial.laguerre', 'numpy.lib.nanfunctions', 'numpy.random._pcg64', 'numpy.core._exceptions', 'ntpath', 'json.encoder', 'numpy.core._type_aliases', 'numpy.random.mtrand', '_compat_pickle', 'numpy.random', 'numpy.random._philox', 'numpy.lib.format', 'numpy.core._multiarray_tests', 'json.decoder', 'numpy.lib.type_check', 'numpy.lib._version', '_blake2', 'numpy.polynomial.polyutils', 'hashlib', 'subprocess', 'random', 'numpy.polynomial._polybase', 'numpy.fft._pocketfft_internal', 'numpy.core._asarray', 'numpy.lib.polynomial', '_bisect', 'numpy.polynomial.hermite_e', 'math', 'ctypes._endian', 'numpy.lib.ufunclike', 
'numpy.ctypeslib', 'numpy.core._dtype_ctypes', 'json.scanner', 'numpy.lib.twodim_base', 'numpy.random._pickle', 'numpy.core._add_newdocs_scalars', 'numpy.core.multiarray', 'numpy.core._string_helpers', 'numpy.core', 'cython_runtime', 'numpy', 'numpy.lib._datasource', 'numpy.compat._inspect', '_json', 'numpy.lib.stride_tricks', 'numpy.polynomial.legendre', 'numpy.core.function_base', 'numpy.ma.extras', 'json', '_ctypes', 'numpy.polynomial', 'signal', 'numpy.lib', 'numpy.fft.helper', 'bisect', 'numpy.core._ufunc_config', 'numpy.lib.mixins', '_weakrefset', 'numpy.matrixlib', '_datetime', 'numpy.lib.arrayterator', '_random', 'hmac', '_hashlib', '_cython_0_29_32', 'numpy.ma', 'numpy.core._add_newdocs', 'textwrap', '_sha512', 'numpy.compat.py3k', 'numpy.core.shape_base', 'fnmatch', 'numpy.lib.arraypad', 'numpy.core._multiarray_umath'}
```

Also, at the Python level you have no idea how the memory allocation system works. Even if you succeed in getting the memory to be freed (e.g., via the Python `obmalloc` system and the stdlib `malloc/free` system), there’s no guarantee that the even lower level kernel memory management system (`brk`, `sbrk`, etc.) will release pages back to the operating system. Oh, and this is all very system-dependent. I’ve never done anything on Windows; I’m not sure what its equivalent of `brk` is.

That’s not what contemporary C libraries on Linux use for allocating memory. Memory is allocated in chunks by glibc using mmap; `man malloc` talks about this.

Large mallocs will have their pages released to the system when you free the block, but small allocations will not free pages. Loading a module results in lots of small mallocs (< 256k), so even if you could manage to drop all the object refs, no pages would be freed back to the system.

Thanks for the correction. It’s been quite a few years since I looked at things at that level; they’ve obviously changed. In any case, the basic notion of returning memory from the process back to the kernel is a challenging one, not likely to be accomplished by deleting a few (or many) names at the Python level.

No. Depending on what you mean by “unimport”, it is either redundant and unnecessary, or dangerous.

Python is a garbage collected language. We don’t release blocks of memory manually; we allow the garbage collector to track which objects are being used, and when they are no longer used, the garbage collector will remove them and re-use the memory. That includes imported objects and modules.
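
That reclamation can be observed with a weak reference, at least for a pure-Python module that nothing else holds onto (a sketch; `colorsys` is chosen only because it is a small stdlib module unlikely to be referenced elsewhere):

```python
import gc
import sys
import weakref

import colorsys               # small, pure-Python stdlib module
ref = weakref.ref(colorsys)   # watch the module object itself

del sys.modules['colorsys']   # drop the cache reference
del colorsys                  # drop our own name binding
gc.collect()                  # break any lingering reference cycles

print(ref() is None)          # the module object has been reclaimed
```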

To help manage objects (not memory directly!) Python already has the `del` statement. You can delete the imported module, and don’t forget the reference held in the module cache:

```
import math  # Module we want to release.
import sys
del math
del sys.modules['math']
```

but honestly that’s just likely to cause issues with object identity. Don’t do it.
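
The object-identity problem can be seen with a pure-Python module such as `json` (a sketch):

```python
import sys

import json
first = json             # keep a reference to the original module object

del sys.modules['json']  # drop the cached module
import json              # forces a fresh import

# Two distinct module objects for "the same" module now exist;
# anything still holding `first` disagrees with new importers.
print(json is first)
```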

If your “unimport” command unconditionally releases the memory used by a module, that is dangerous, and will almost certainly lead to segmentation faults and crashes when the interpreter tries to access objects in the released memory.

If your “unimport” command merely removes references to the module, then it is redundant, as we already have `del`.

Manually removing modules is rarely needed, and then only in extremely long-running processes like web servers, where you expect to hot-patch running modules. That’s something only experts should attempt.

Thanks to all of you for your time.

I understand that Python is a garbage collected language, and that is fine. But I dug into the Python 3.12 source code (profiled it with MSVS2019): **they use obmalloc at the C level** to allocate what the designers called **“arena pools”**, among other requested memory objects. Well, I can guarantee you that it contains a few memory leaks, and not small ones (4 MB for the more significant ones). And then my doubt jumps out: if Python has been around for some time now, why has none of the famed “core developers” fixed those? It looks to me that Python developers leave that task (releasing memory resources) to the operating system at program end. Now think about a long-running service with such inefficient memory management: it will soon exhaust the server’s memory, even on an HP server with 4 Xeon processors (24 threads of execution) and 125 GB of memory. Python can get very memory hungry.

Thanks again

Alexei

If you can provide an example using just the Python standard library which demonstrates such leakage, I suggest you open an issue on GitHub. The obmalloc arena code has been in Python for a long while and is quite well vetted. I’d be surprised to find an easily demonstrable memory leak, though I wouldn’t claim the implementation is completely leak-free.

Thanks for your attention,

This is the simplest one:

```
#include <Python.h>

int main(void)
{
    Py_InitializeEx(0);
    Py_Finalize();
    /* From here on, the OS will release all memory used by
       arenas whenever the application ends. */
    return 0;
}
```

What makes you think that leaks memory? I suppose you and I might disagree on what constitutes a memory leak, so perhaps a definition is in order.

Memory leaks if, during the course of running a program, that memory can’t be reused. For example, consider this little Python loop:

```
while True:
    some_leaky_operation()
```

If memory were actually leaking as a result of calling `some_leaky_operation`, you would see the memory footprint of the program continue to grow. Your example can’t (in my mind) reveal memory leaks because it does nothing but initialize and finalize the Python runtime, then exit. It’s not surprising that the program would have some memory left in its `obmalloc` arenas. That doesn’t mean it leaked.

Thanks,

I understand that. **Please notice: your code is in Python, while mine was in C.** The memory leaks I am talking about are not happening in the Python code being run but in the Python core (in C).

thanks a lot for your time

I think you need to demonstrate why you think the code you posted leaks memory. Just saying “this leaks” isn’t sufficient.

I realize my code was in Python and yours was in C. I was showing how ~~you~~ I would demonstrate a memory leak, namely by showing that performing the same operation over time results in increased memory usage without holding onto references to that memory. I’m pretty certain your code doesn’t leak memory. If it does, you need to explain why you believe that to be the case.
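
A sketch of that methodology using `tracemalloc` from the standard library (the operation here is hypothetical and deliberately well-behaved, so traced memory stays roughly flat across iterations):

```python
import tracemalloc

def some_operation():
    # Allocates a sizeable list, which goes out of scope on return.
    return sum([0] * 100_000)

tracemalloc.start()
some_operation()                               # warm up any caches
baseline, _ = tracemalloc.get_traced_memory()
for _ in range(100):
    some_operation()
current, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# A leaking operation would show `current` growing with the loop count.
print(current - baseline < 500_000)
```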

Also, as I indicated, I think your definition of “memory leak” and mine are different. That’s fine, but we need to have a common definition if we aren’t going to continue talking past one another.

edit - referred to the wrong person in second paragraph.