It would obviously need work before it would be ready for prime time, but if there’s interest, it’s something I can take a look at. I would envision that a usable product would gather the data asynchronously and probably display the results as a dashboard of graphs instead of just pretty-printed text. As I said, I hadn’t really researched whether something already existed before I did this, because it gave me a chance to play with the GitHub API a little.
It depends on how long this takes to run and how many API calls it makes (are we going to hit the rate limit if we run this on demand as a webhook?).
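One way to sanity-check that question is a quick back-of-envelope helper. GitHub's documented REST API limit is 5,000 requests/hour for authenticated clients (60/hour unauthenticated); the event volume and headroom fraction below are made-up assumptions, not measurements:

```python
# Rough feasibility check for running the stats gathering as a webhook.
# 5,000 req/hour is GitHub's documented authenticated rate limit; the
# event counts used below are illustrative assumptions only.

AUTHENTICATED_LIMIT = 5000  # requests per hour

def fits_in_rate_limit(events_per_hour, calls_per_event, headroom=0.5):
    """Return True if expected hourly API usage stays under a
    conservative fraction (headroom) of the rate limit."""
    return events_per_hour * calls_per_event <= AUTHENTICATED_LIMIT * headroom

# e.g. ~40 PR events/hour, each triggering ~2 API calls:
print(fits_in_rate_limit(40, 2))  # True -> a webhook would be comfortable
```

If the real numbers come in anywhere near that, the webhook approach shouldn't get close to the limit.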
If there is no concern with the rate limit, a webhook will definitely work. We’ll want it hosted on Heroku, using celery or a similar library for the background task. The catch is that celery is not compatible with Python 3.7 yet; that’ll come in celery 5.0.
It can also work as a cron job, running it periodically every hour or so. This way we probably don’t need to worry about rate limiting.
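For the cron approach, the schedule could be as simple as a single crontab line; the interpreter path, script name, and log location here are placeholders, not the actual deployment layout:

```
# Run the PR-stats script at the top of every hour (paths are placeholders)
0 * * * * /usr/bin/python3 /path/to/pr_stats.py >> /var/log/pr_stats.log 2>&1
```

Dropping the frequency to daily or weekly is just a matter of changing the schedule fields.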
I agree with @Mariatta that this would probably be better as a cron job than a webhook. When I created it, it was more to see what was already out there than to get up-to-the-minute information on newly added PRs. Based on my original intent, I think running it once a day or even once a week would be sufficient.
However, this was a fun script and not an idea that had any actual design. Taking that into consideration, the method and frequency used for grabbing the data is certainly something that can be looked at. If a base dataset is created (with existing PRs), then each touch on a PR would probably result in only one additional GitHub API call, so it probably wouldn’t have much of an impact on the rate limit.
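That back-of-envelope reasoning can be made concrete with a tiny cost model; all of the counts below are illustrative assumptions, not measured numbers:

```python
# Estimate total GitHub API calls: one initial pass to build the base
# dataset, then roughly one extra call per subsequent PR touch.
# The counts used in the example are illustrative assumptions.

def total_api_calls(open_prs, calls_per_pr, touches):
    """Cost of building the base dataset plus one call per PR touch."""
    return open_prs * calls_per_pr + touches

# e.g. 1,000 open PRs at 2 calls each, then 50 touches afterwards:
print(total_api_calls(1000, 2, 50))  # 2050
```

The initial build is the expensive part; once the base dataset exists, the ongoing cost is just the trickle of per-touch calls, which is well inside the rate limit.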