
Starlette 1.0 is here!

A few weeks after the 1.0.0rc1 release, we are ready to welcome the long-awaited 1.0. 🎉

Starlette 1.0 is not about reinventing the framework or introducing a wave of breaking changes. It is mostly a stability and versioning milestone. The changes in 1.0 were limited to removing old deprecated code that had been on the way out for years, along with a few bug fixes. From now on we'll follow SemVer strictly.

Acknowledgement

Before we continue, I'd like to thank the people who helped shape the project into what it is today.

First and foremost, thank you to Tom Christie for creating Starlette! 🙏

Over the years, many people have shaped this project. Thomas Grainger and Alex Grönholm taught me so much about async Python, and have always been ready to help and mentor me along the way. Adrian Garcia Badaracco, one of the smartest people I know, is someone I have the pleasure of working with at Pydantic. Aber Sheeran has been my go-to person when I need help on many subjects. Florimond Manca was always present in the early days of both Starlette and Uvicorn, and helped a lot in the ecosystem. Amin Alaee contributed a lot with file-related PRs, and Alex Oleshkevich helped on templates and in many discussions. Sebastián Ramírez maintains FastAPI downstream, and has always been in contact to help with upstream issues. Jordan Speicher worked on making Starlette anyio-compatible.

On the support side, Seth Michael Larson has been someone I've relied on for help with security vulnerabilities, and Pydantic, my company, has supported me in maintaining these open source projects. A special thanks to our sponsors as well: @tiangolo, @huggingface, and @elevenlabs.

Starlette in the last year

Since the 2024 Open Source Report, here's what happened (data gathered from the GitHub API and PyPI Stats):

| Downloads/month | Releases | Closed issues | Merged PRs | Closed unmerged PRs | Answered discussions |
|---|---|---|---|---|---|
| 325 million | 19 | 50 | 144 | 77 | 49 |

Compared to last year (57 million downloads/month), Starlette has grown to 325 million downloads/month - almost 6x growth!
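For what it's worth, the multiplier is a quick calculation from the two figures quoted in this post:

```python
# Sanity check on the growth figure, using the numbers from the report.
downloads_last_year = 57_000_000   # 2024 Open Source Report
downloads_now = 325_000_000        # current figure
growth = downloads_now / downloads_last_year
print(f"{growth:.1f}x")  # → 5.7x
```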

Open Source in the Age of AI

A lot of my work at Pydantic lately, building Logfire, has been focused on AI, and that has influenced my day-to-day work quite a bit, including how I maintain Starlette.

In practice, that has mostly meant using coding agents to speed up issue triage and pull request review.

The most negative side has been the number of issues, pull requests, and advisories opened via coding agents that are just noise. Issues and pull requests are easy to close, but advisories are trickier: sometimes they look real, and making a judgement usually takes a long time.

What's next?

Looking ahead, we'll probably focus on improving the performance of our routing and multipart parsing. The number of open issues in Starlette is down to 15, so the idea is to keep maintaining the project as is. We'll be following SemVer from now on, and I don't foresee a version 2 any time soon, but I'm also not afraid of releasing one if a breaking change is worth it.

Go ahead and bump your Starlette version, and if you'd like to support the continued development of Starlette, consider sponsoring me on GitHub. ❤
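Since releases now follow SemVer, a compatible-release pin in your requirements keeps you on 1.x while still picking up fixes (a sketch; adapt to your own tooling):

```
starlette>=1.0,<2.0
```

Under SemVer, every release below 2.0 is expected to stay backwards compatible with 1.0.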

Oh, and Sebastián, Starlette is now out of your way to release FastAPI 1.0. 😉

2024 Open Source Report

This is my first yearly report on Open Source! 🎉

I dedicate a lot of my free time to Open Source work, and I would like to share some numbers with you. I hope you find them interesting!

| Project | Downloads/month | Time spent | Releases | Closed issues | Merged PRs | Closed unmerged PRs | Answered discussions |
|---|---|---|---|---|---|---|---|
| Starlette | 57 million | 70 hrs 29 mins | 29 | 76 | 182 | 64 | 93 |
| Uvicorn | 49 million | 48 hrs 3 mins | 20 | 61 | 100 | 65 | 38 |
| Python Multipart | 25 million | 17 hrs 29 mins | 13 | 37 | 87 | 12 | 0 |
| Total | 131 million | 136 hrs 1 min | 62 | 174 | 369 | 141 | 131 |

Most of the time dedicated to maintaining open source projects is actually not spent coding, as most people think. It mainly goes into interacting with people: answering questions, reviewing pull requests, and investigating issues.

Sponsors

I would like to thank all the sponsors that supported me in 2024! ❤

Data Analysis

I got this data from a script I created that queries the GitHub API and WakaTime API.


Most of the script was created with the help of Claude AI, but I had to tweak it a bit to get the data I wanted.

If you want to use it, make sure you have the following environment variables set:

  • WAKATIME_API_KEY: Your WakaTime API key.
  • GH_TOKEN: Your GitHub token.
import os
import httpx
from datetime import datetime, timedelta
from wakatime_client import WakatimeClient


def main():
    client = WakatimeClient(api_key=os.getenv("WAKATIME_API_KEY"))
    for project in client.stats(range="last_year")["data"]["projects"]:
        if project["name"] in ("starlette", "uvicorn", "python-multipart"):
            print(f'{project["name"]}: {project["text"]}')
    print()

    print(f"starlette releases: {count_releases('encode', 'starlette')}")
    print(f"uvicorn releases: {count_releases('encode', 'uvicorn')}")
    print(f"python-multipart releases: {count_releases('Kludex', 'python-multipart')}")
    print()
    print(f"starlette stats: {get_repo_stats('encode', 'starlette')}")
    print(f"uvicorn stats: {get_repo_stats('encode', 'uvicorn')}")
    print(f"python-multipart stats: {get_repo_stats('Kludex', 'python-multipart')}")
    print()
    print(f"starlette activity: {get_repo_activity('encode', 'starlette')}")
    print(f"uvicorn activity: {get_repo_activity('encode', 'uvicorn')}")
    print(f"python-multipart activity: {get_repo_activity('Kludex', 'python-multipart')}")


def count_releases(owner: str, repo: str):
    url = f"https://api.github.com/repos/{owner}/{repo}/releases"
    headers = {"Accept": "application/vnd.github.v3+json", "Authorization": f"Bearer {os.getenv('GH_TOKEN')}"}

    with httpx.Client() as client:
        # Request up to 100 releases per page; the default page size of 30
        # could miss releases for projects with many releases in a year.
        response = client.get(url, headers=headers, params={"per_page": 100})
        response.raise_for_status()

        one_year_ago = datetime.now() - timedelta(days=365)
        return sum(
            1
            for release in response.json()
            if datetime.strptime(release["published_at"], "%Y-%m-%dT%H:%M:%SZ") > one_year_ago
        )


def get_repo_stats(owner: str, repo: str):
    headers = {"Accept": "application/vnd.github.v3+json", "Authorization": f"Bearer {os.getenv('GH_TOKEN')}"}

    base_url = f"https://api.github.com/repos/{owner}/{repo}"
    since = (datetime.now() - timedelta(days=365)).isoformat()

    try:
        with httpx.Client() as client:
            # Get issues (excluding PRs)
            issues_count = 0
            issues_url = f"{base_url}/issues"
            issues_params = {"state": "closed", "since": since}

            issues_response = client.get(issues_url, headers=headers, params=issues_params)
            issues_response.raise_for_status()

            while issues_response.status_code == 200:
                issues_count += sum(1 for issue in issues_response.json() if "pull_request" not in issue)

                if "Link" in issues_response.headers and 'rel="next"' in issues_response.headers["Link"]:
                    next_url = [
                        link.split(";")[0].strip("<> ")
                        for link in issues_response.headers["Link"].split(",")
                        if 'rel="next"' in link
                    ][0]
                    issues_response = client.get(next_url, headers=headers)
                else:
                    break

            # Get PRs
            prs_url = f"{base_url}/pulls"
            merged_count = 0
            closed_count = 0

            # First get merged PRs
            pr_params = {"state": "closed", "sort": "updated", "direction": "desc"}
            pr_response = client.get(prs_url, headers=headers, params=pr_params)
            pr_response.raise_for_status()

            while pr_response.status_code == 200:
                for pr in pr_response.json():
                    # Check if PR was updated in the last year
                    if datetime.strptime(pr["updated_at"], "%Y-%m-%dT%H:%M:%SZ") < datetime.now() - timedelta(days=365):
                        break

                    if pr["merged_at"]:
                        merged_count += 1
                    else:
                        closed_count += 1

                if "Link" in pr_response.headers and 'rel="next"' in pr_response.headers["Link"]:
                    next_url = [
                        link.split(";")[0].strip("<> ")
                        for link in pr_response.headers["Link"].split(",")
                        if 'rel="next"' in link
                    ][0]
                    pr_response = client.get(next_url, headers=headers)
                else:
                    break

            return {"closed_issues": issues_count, "merged_prs": merged_count, "closed_unmerged_prs": closed_count}

    except httpx.HTTPError as e:
        print(f"Error fetching repository stats: {e}")
        return None


def get_repo_activity(owner: str, repo: str):
    headers = {"Accept": "application/vnd.github.v3+json", "Authorization": f"Bearer {os.getenv('GH_TOKEN')}"}

    # GraphQL query for discussions (REST API doesn't support discussions)
    graphql_url = "https://api.github.com/graphql"
    query = """
    query($owner: String!, $repo: String!) {
      repository(owner: $owner, name: $repo) {
        discussions(first: 100, answered: true, orderBy: {field: UPDATED_AT, direction: DESC}) {
          totalCount
          nodes {
            answerChosenAt
          }
        }
      }
    }
    """

    with httpx.Client() as client:
        # Get discussions via GraphQL
        response = client.post(
            graphql_url, json={"query": query, "variables": {"owner": owner, "repo": repo}}, headers=headers
        )
        response.raise_for_status()

        one_year_ago = datetime.now() - timedelta(days=365)
        data = response.json()

        return sum(
            1
            for discussion in data["data"]["repository"]["discussions"]["nodes"]
            if datetime.strptime(discussion["answerChosenAt"], "%Y-%m-%dT%H:%M:%SZ") > one_year_ago
        )


if __name__ == "__main__":
    main()