Executor.shutdown method is not working as I expected. Why?

Hey, pythonians! :snake:

Could someone tell me where I’m going wrong, please? :pray:

I would like to shut down a pool of processes when the result of one of the processes meets a specific condition. I’m using the shutdown method to try to do this, but it’s not working as I expected: the processes are not killed. Shouldn’t the method do this? :thinking:

I’m using Python 3.9.7.

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import subprocess
import glob

from natsort import natsorted
from tqdm import tqdm
from concurrent.futures import ProcessPoolExecutor, as_completed

def run(commands):
    executor = ProcessPoolExecutor()
    futures = []
    for cmd in commands:
        f = executor.submit(subprocess.run, cmd)
        futures.append(f)
    for f in tqdm(as_completed(futures), total=len(commands)):
        if f.result().returncode == 1:
            print('subprocess.CompletedProcess.returncode == 1')
            executor.shutdown(wait=False, cancel_futures=True)

if __name__ == '__main__':
    executable = 'ffmpeg.exe'
    filenames = natsorted(glob.glob('*.mp4'))
    filenames[3] = 'non-existent-filename.mp4'  # add an error
    commands = []
    for f in filenames:
        cmd = [executable, '-y', '-i', f'{f}', f'{f}.mkv']
        commands.append(cmd)
    run(commands)

I don’t see anything in the documentation that suggests it will kill running processes. Just that it will stop submitting new ones.

Signal the executor that it should free any resources that it is using when the currently pending futures are done executing.

If cancel_futures is True, this method will cancel all pending futures that the executor has not started running. Any futures that are completed or running won’t be cancelled, regardless of the value of cancel_futures.
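A small sketch makes that documented behaviour visible. It uses ThreadPoolExecutor only so the example is self-contained and quick to run; ProcessPoolExecutor handles cancel_futures the same way:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow(x):
    time.sleep(0.5)
    return x

ex = ThreadPoolExecutor(max_workers=1)
futures = [ex.submit(slow, i) for i in range(4)]
time.sleep(0.1)  # give the first task time to start running

ex.shutdown(wait=False, cancel_futures=True)

print(futures[0].cancelled())   # False: it was already running, so it finishes
print(futures[-1].cancelled())  # True: it was still pending, so it got cancelled
```

The first task is left alone and runs to completion; only the tasks the executor had not yet started are cancelled.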

I have misunderstood. :man_facepalming: Thank you, @BowlOfRed!

There is a solution here: multiprocessing - How do I run multiple subprocesses in parallel and wait for them to finish in Python - Stack Overflow. Thanks for finding this, Anderson!
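The gist of that approach is to launch each command yourself with subprocess.Popen and keep the handles, so that when one process reports a failure you can call terminate() on the others directly, which is the part ProcessPoolExecutor won’t do for you. A minimal sketch, where the sleep commands are placeholders standing in for the real ffmpeg invocations:

```python
import subprocess
import sys

# Placeholder long-running commands (stand-ins for the ffmpeg calls).
commands = [[sys.executable, '-c', 'import time; time.sleep(30)']
            for _ in range(3)]

# Start everything up front and keep the Popen handles.
procs = [subprocess.Popen(cmd) for cmd in commands]

# On the error condition, kill the still-running processes directly.
for p in procs:
    p.terminate()

# Reap the terminated processes.
for p in procs:
    p.wait()

print([p.returncode for p in procs])  # non-zero: the processes were killed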