Introducing my library to share objects across processes

Hello everyone,

In the ongoing discussion about using Python 3.8's shared memory to pass objects between processes without serialization, I noticed some challenges mentioned regarding the inability to share arbitrary objects directly, since they must first be pickled. The native multiprocessing.shared_memory module exposes only a raw byte buffer (plus ShareableList, which covers a handful of primitive types), so handling complex data structures efficiently remains an open problem.
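To make the limitation concrete, here is a minimal sketch using only the standard library: ShareableList accepts int, float, bool, str, bytes, and None, but nothing richer (dicts, custom classes), which would have to be pickled into bytes first.

```python
from multiprocessing import shared_memory

# ShareableList only supports int, float, bool, str, bytes, and None.
# A dict or custom object cannot be stored here without pickling it
# into bytes yourself.
sl = shared_memory.ShareableList([42, 3.14, "hello", b"raw", None])
try:
    values = list(sl)  # read the primitives back out
finally:
    sl.shm.close()
    sl.shm.unlink()
```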

I’ve explored a few alternatives that try to address this problem, but they all still rely on some form of serialization.

In response to these limitations, I’ve developed a package that facilitates sharing and modifying complex Python objects across processes without repeated serialization. It integrates with the existing Python multiprocessing framework and supports various data types, including NumPy arrays and PyTorch tensors, through shared memory. The goal is to eliminate the overhead of pickling large data structures every time they are passed between processes.
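The underlying idea, sketched with the standard library and NumPy rather than my package's API (which is documented in the repository), is to back an array with a shared-memory block so another process can attach by name and read or modify the data with zero copies and no pickling. This assumes NumPy is installed; the names below are illustrative only.

```python
import numpy as np
from multiprocessing import shared_memory

# "Parent": allocate a shared block and view it as a NumPy array (zero-copy).
shm = shared_memory.SharedMemory(create=True, size=4 * 8)  # 4 x int64
a = np.ndarray((4,), dtype=np.int64, buffer=shm.buf)
a[:] = [1, 2, 3, 4]

# "Child": attach to the same block by name; the array data itself
# is never pickled or copied, only the small name string is passed.
shm2 = shared_memory.SharedMemory(name=shm.name)
b = np.ndarray((4,), dtype=np.int64, buffer=shm2.buf)
b[0] = 99  # the change is immediately visible through the parent's view
result = int(a[0])

# Release the buffer views before closing, then free the block.
del a, b
shm2.close()
shm.close()
shm.unlink()
```

This same technique generalizes to any object whose layout can be mapped onto a flat buffer, which is the harder problem the package tries to solve for richer Python objects.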

This tool is part of an open-source project, and I welcome feedback or contributions from the community to improve its functionality, as well as reports of any integration issues you encounter. For those interested in the technical details or in contributing, here’s the link: GitHub - InterProcessPyObjects.

Looking forward to your thoughts and any feedback you might have!