ProcessPoolExecutor hangs with default chunksize

Posting here because I'm not sure whether this is a bug or just a performance issue.
When mapping over a long list with a ProcessPoolExecutor, the program randomly hangs. The following code is an example that hangs intermittently (sometimes it completes, sometimes it stops):

from concurrent.futures import ProcessPoolExecutor


def worker_task(text):
    return text.capitalize()


if __name__ == '__main__':
    data = ['a brown fox jumps over the lazy dog'] * 16390
    with ProcessPoolExecutor(max_workers=24) as executor:
        for result in executor.map(worker_task, data, chunksize=1):
            print(result)

If I set chunksize to 128, the program ran several times without hanging.
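For now, rather than hard-coding 128, I am experimenting with deriving a chunksize from the input size. The heuristic below (about 4 chunks per worker) is just my own guess to keep chunks large enough to amortize the pickling/IPC overhead, not something from the documentation:

```python
from concurrent.futures import ProcessPoolExecutor


def worker_task(text):
    return text.capitalize()


if __name__ == '__main__':
    data = ['a brown fox jumps over the lazy dog'] * 16390
    workers = 24
    # Guessed heuristic: aim for roughly 4 chunks per worker so each
    # chunk is big enough to amortize pipe and pickling overhead.
    chunksize = max(1, len(data) // (workers * 4))  # 170 for this input
    with ProcessPoolExecutor(max_workers=workers) as executor:
        results = list(executor.map(worker_task, data, chunksize=chunksize))
    print(len(results))
```

With this heuristic the program has also run several times without hanging, though I don't know whether it merely makes the hang less likely.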

The program also seems to work when the input is short. If the length of the list 'data' is 10000, the program runs without stopping; at 17000 or more, it starts to hang.

I would like to hear what the problem could be.

Thanks in advance.