How to use the GitHub V3 API to get the commit count for a repo?

小鲜肉 2020-12-05 03:26

I am trying to count commits for many large GitHub repos using the API, so I would like to avoid fetching the entire list of commits just to count them.

7 Answers
  •  再見小時候
    2020-12-05 03:46

    Using the GraphQL API v4 is probably the best way to handle this if you're starting a new project, but if you're still on the REST API v3 you can work around the pagination problem by asking for just one result per page. With per_page=1, the page number in the "last" relation of the Link response header is equal to the total number of commits.

    For example, using Python 3 and the requests library:

    import os
    import urllib.parse

    import requests

    def commit_count(project, sha='master', token=None):
        """
        Return the number of commits to a project, e.g. commit_count('owner/repo')
        """
        token = token or os.environ.get('GITHUB_API_TOKEN')
        url = f'https://api.github.com/repos/{project}/commits'
        headers = {'Accept': 'application/vnd.github.v3+json'}
        # unauthenticated requests also work, but are rate-limited far more aggressively
        if token:
            headers['Authorization'] = f'token {token}'
        # per_page=1 makes the page number of the "last" Link equal the commit count
        params = {
            'sha': sha,
            'per_page': 1,
        }
        resp = requests.get(url, params=params, headers=headers)
        if (resp.status_code // 100) != 2:
            raise Exception(f'invalid github response: {resp.content}')
        # fall back to the length of the response body, in case there are
        # 0 commits and therefore no pagination links
        commit_count = len(resp.json())
        last_page = resp.links.get('last')
        # if there is no "last" page, the count must be 0 or 1
        if last_page:
            # extract the page number from the query string of the last page url
            qs = urllib.parse.urlparse(last_page['url']).query
            commit_count = int(dict(urllib.parse.parse_qsl(qs))['page'])
        return commit_count
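
    For completeness, the GraphQL API v4 mentioned above can return the count in a single request via history.totalCount, with no pagination at all. Below is a minimal sketch under the same assumptions (a GITHUB_API_TOKEN environment variable; GraphQL requires a token). The function name commit_count_graphql and the choice to count the default branch are just for illustration, not part of the original answer.

    import os
    import requests

    def commit_count_graphql(owner, name, token=None):
        """
        Return the number of commits on the default branch, via the GraphQL API v4
        """
        token = token or os.environ.get('GITHUB_API_TOKEN')
        query = '''
        query($owner: String!, $name: String!) {
          repository(owner: $owner, name: $name) {
            defaultBranchRef {
              target {
                ... on Commit {
                  history { totalCount }
                }
              }
            }
          }
        }
        '''
        resp = requests.post(
            'https://api.github.com/graphql',
            json={'query': query, 'variables': {'owner': owner, 'name': name}},
            headers={'Authorization': f'bearer {token}'},
        )
        resp.raise_for_status()
        data = resp.json()
        # defaultBranchRef is null for an empty repository
        ref = data['data']['repository']['defaultBranchRef']
        return ref['target']['history']['totalCount'] if ref else 0

    Either helper is called the same way, e.g. commit_count('torvalds/linux') or commit_count_graphql('torvalds', 'linux').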
    
