I’ve got a Python script that connects to a bunch of webcams and downloads still images for a legacy system. The script spawns a process per camera, pulls each still with the requests module, and stores it in an in-memory IO object. It then opens a file on a remote SFTP server with the paramiko module and writes the IO object out to that remote file. This repeats as fast as the system can handle.
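Stripped down, each camera worker currently does something like this (simplified; the hostnames, credentials, and paths are placeholders):

```python
import io

import paramiko
import requests

def camera_worker(camera_url, sftp_host, sftp_user, sftp_pass, remote_path):
    # Right now every worker opens its own SSH/SFTP session.
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(sftp_host, username=sftp_user, password=sftp_pass)
    sftp = ssh.open_sftp()

    while True:
        # Pull the still into memory instead of touching the local disk.
        resp = requests.get(camera_url, timeout=10)
        resp.raise_for_status()
        buf = io.BytesIO(resp.content)

        # Write the in-memory image straight to a file on the SFTP server.
        with sftp.open(remote_path, "wb") as remote_file:
            remote_file.write(buf.getvalue())
```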
I can get it to work, but not efficiently, so I’m back to the drawing board and wondering what general approach I should take. The real kicker is that I’m trying to limit the script to a single SFTP connection and have every process upload files in parallel over that one connection / login.
I tried passing a single paramiko SFTP object from the main process to each subordinate process, but that didn’t seem to work right. I also tried creating a dedicated SFTP process and passing the data to it via queues, but the files are large enough that the queue simply backed up and grew without bound.
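The dedicated-uploader attempt looked roughly like this (again simplified with placeholder names). Because each queue item carries the full image payload, the backlog grows whenever the single uploader falls behind:

```python
import multiprocessing

import paramiko
import requests

def uploader(queue, sftp_host, sftp_user, sftp_pass):
    # The one and only SFTP connection lives in this process.
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(sftp_host, username=sftp_user, password=sftp_pass)
    sftp = ssh.open_sftp()

    while True:
        remote_path, data = queue.get()  # data is the entire image as bytes
        with sftp.open(remote_path, "wb") as remote_file:
            remote_file.write(data)

def camera_worker(queue, camera_url, remote_path):
    while True:
        resp = requests.get(camera_url, timeout=10)
        resp.raise_for_status()
        # The whole image body goes onto the queue, so the backlog grows by
        # one full image per fetch while the uploader can't keep up.
        queue.put((remote_path, resp.content))
```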
I’m looking for a broad or general direction on how to get the SFTP uploads working over a shared connection. When I give each process its own login, the firewall treats the traffic as an attack and blocks the link preemptively.