Always batch size, in my usage. I don’t even know how you would split into a fixed number of batches without knowing the exact number of elements up front (and iterators can be infinite).
In my usage, either I can guarantee that the number of elements is always a multiple of the batch size, or I’m working with an infinite iterator, or in the case that it’s not divisible, I just have a smaller final batch. The latter most often arises when processing a dataset of unknown/variable size (such as a Django queryset) and I simply want to process the data in groups that aren’t too large but without the overhead of doing it one at a time. (Real-world example: I have code at work that creates a Django queryset, then sends the results of that query via messages on a RabbitMQ message bus. We batch the results in groups of 10 records per message, to cut down on the number of messages but keep individual messages at a manageable size.)
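To make the "smaller final batch" behavior concrete, here's a minimal sketch of size-based batching along the lines of the well-known itertools recipe (the function name and exact signature here are illustrative, not a specific proposal):

```python
from itertools import islice

def batched(iterable, n):
    """Yield successive tuples of up to n items from iterable.

    Works on any iterator, including infinite ones, and never needs to
    know the total length; only the final batch may be shorter than n.
    """
    if n < 1:
        raise ValueError("n must be at least one")
    it = iter(iterable)
    while batch := tuple(islice(it, n)):
        yield batch

# The RabbitMQ case above would then look something like
# (publish() and queryset are placeholders):
#   for batch in batched(queryset.iterator(), 10):
#       publish(batch)
```

With seven elements and a batch size of three, this yields two full batches and one partial batch of a single element, which is exactly the behavior I rely on for variable-size datasets.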
As for the first question: as noted, I don’t think the fixed-number-of-batches option is even possible for arbitrary iterators, and if it were, it seems it should be a totally different function (it’s more of a partition than a batch) that probably wouldn’t cross the threshold for inclusion in itertools proper.
The second one could easily be adjusted via a keyword argument, or by having two related functions. At minimum I would imagine an option to either raise an exception or return a partial batch when the iterator is exhausted without ending on a full batch.
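The keyword-argument version could look something like this (the `strict` name and semantics here are purely hypothetical, just to illustrate the shape of the option):

```python
from itertools import islice

def batched(iterable, n, *, strict=False):
    """Yield successive tuples of up to n items from iterable.

    strict is a hypothetical flag: when True, raise ValueError if the
    data does not divide evenly into batches of n; when False (the
    default), just return a smaller final batch.
    """
    if n < 1:
        raise ValueError("n must be at least one")
    it = iter(iterable)
    while batch := tuple(islice(it, n)):
        if strict and len(batch) != n:
            raise ValueError("incomplete final batch")
        yield batch
```

Callers who can guarantee divisibility (or want to be told when the guarantee fails) pass `strict=True`; the queryset-to-message-bus case above would keep the default and accept the partial batch.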