but it’s really not the most obvious thing in the world. Plus, code that wants to take advantage of this needs a clear place to put it, and it’s not at all difficult to mess things up and lose the desired performance by re-creating the deque. (Not to mention all the toy example code you’ll find out there that doesn’t optimize by grabbing the extend method - even in the more_itertools documentation, as already discovered. I reject the hypothesis that they “don’t prioritize speed over simplicity”, because the example is extensively documented and commented to explain that they’ve explicitly selected approaches that iterate at C speed and don’t collect the skipped-over items.)
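For reference, the pattern under discussion is the well-known consume recipe from the itertools documentation; a sketch of it looks roughly like this:

```python
from collections import deque
from itertools import islice

def consume(iterator, n=None):
    """Advance *iterator* n steps; if n is None, consume it entirely."""
    if n is None:
        # Feeding into a zero-length deque exhausts the iterator at C speed
        # without storing any of the items it yields.
        deque(iterator, maxlen=0)
    else:
        # Advance to the empty slice starting at position n.
        next(islice(iterator, n, n), None)
```

Note that each full consume here pays the cost of constructing a fresh deque, which is exactly the re-creation overhead being complained about above.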
I know, “not every three-line recipe needs to be in the standard library”. But in my mind, this particular example ticks all the boxes that e.g. random.choice does - maybe more.
Such a change isn’t purely a speed benefit. It’s also generally a nicer way to do it. Instead of calling deque(..., maxlen=0), which hurts readability and confuses anyone who sees it for the first time, one can now bind it to something “nice”. E.g.:
black_hole = deque(maxlen=0).extend
And performance is a benefit: no initialization cost on each use, and no wrapper overhead.
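A minimal sketch of that idiom (black_hole is just the name proposed above, not an existing API):

```python
from collections import deque

# Bind the bound method once; each call then drains an iterable at C speed,
# and the maxlen=0 deque discards every item instead of storing it.
black_hole = deque(maxlen=0).extend

def noisy():
    # hypothetical generator run purely for its side effects
    for i in range(3):
        print("side effect", i)
        yield i

black_hole(noisy())  # exhausts the generator, keeping nothing
```

Because the deque is created once and reused, repeated calls avoid both the per-call construction cost and any Python-level wrapper function overhead.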
To me, this is mostly gain with little to no cost.
Maybe simplicity is not the right word here. What it does result in is the consumer no longer being self-contained, so the function is less portable: if someone wants to copy it, there is the extra step of finding the definition of black_hole.
Which is an aspect of simplicity, I guess, but whatever simplicity it subtracts here, it adds back in other respects.
For what it’s worth, I welcome all sorts of ideas to more-itertools. “It’s somewhat faster on this synthetic benchmark, but makes the code more complex” is usually something I’ll ultimately pass on. But if you come with examples from GitHub of people running whatever function in tight loops, I’ll be much more receptive.