Usually one wants to limit the number of parallel connections. Even if
the server copes, there is usually a performance peak: up to some number
of connections, adding more increases your throughput; beyond that the
server slows down and everyone loses. And of course most servers are
shared, so you don't want to starve other users. So you may want a
semaphore to limit the request calls, for example:
import requests
from threading import Semaphore

S = Semaphore(16)

def post_request(req_data, header):
    with S:
        requests.post('http://127.0.0.1:8060/cv/rest/v2/analytics',
                      json=req_data, headers=header, timeout=20)
This doesn't limit the number of threads; it just ensures that no more
than 16 requests.post calls run at once.
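To see the cap in action without hitting a real server, here's a self-contained sketch (a sleep stands in for requests.post, and the cap is 4 instead of 16 to keep the demo quick) that records the peak number of calls in flight at once:

```python
import threading
import time

S = threading.Semaphore(4)   # cap of 4 for the demo
active = 0                   # "requests" currently in flight
peak = 0                     # highest value active ever reached
lock = threading.Lock()      # protects the two counters

def fake_post(i):
    global active, peak
    with S:                  # at most 4 threads are inside this block
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.05)     # stands in for requests.post(...)
        with lock:
            active -= 1

threads = [threading.Thread(target=fake_post, args=(i,)) for i in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)                  # never exceeds 4
```

All 20 threads start immediately; the semaphore just makes the excess ones block until a slot frees up.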