mattesilver

A set of workshop tools

I've been looking for a tool to manage development tools: linters, test runners, etc. These are typically standalone developer tools, so pipx, for example, would qualify as such a manager.

A tool manager should make the environment reproducible and portable. The installed tools should work not only on my machine but also on the machines of everyone else who works on the same project, and they should not conflict with other tools anyone has installed on their systems.

Ideally the tool would have a declarative configuration, so it's clear what's used and easy to upgrade.

The problem with task managers (that aren't also tool managers) is that they don't control the tools, so a task that works on one system may fail on another. Worse, a task that works at one point in time may fail later because of a new tool version. So a task manager should either also be a tool manager, or it should be used together with one.

On the other hand, managing tools is a big subject in itself – it's pretty much a package manager, needing some sort of descriptor for a lot of tools in various languages that have to be installable on various systems and architectures.

In short, the requirements boil down to:

  • declarative tool SBOM
  • tool isolation (but with access to project dependencies)
  • task management

Installing tools directly in the system

The old way, where the user installs tools by following the project README. Not declarative, not reproducible, definitely not isolated.

❌ declarative ❌ isolated ❌ task management

Declarative package managers – Nix, Guix

Environments are reproducible and declarative (both use functional languages). Rather difficult to use and very intrusive – you pretty much install another operating system inside your operating system. I don't see any technical obstacle to making them single-user applications, but they chose against it. Unless you already use one of them, it's total overkill.

Maybe they work better with dev containers, but even then there are alternatives.

✅ declarative ✅ isolated ❌ task management

pipx

pip for executable packages. It supports installing multiple versions of the same tool, but there's no way to activate the required versions for a given project. And it only works for Python tools (and those with a Python wrapper).

❌ declarative ❌ isolated (tools are isolated from each other, but shared across projects) ❌ task management

uvx, npx, pnpx, etc

A set of tools that let you run packages from their respective language ecosystems as programs. Each works only with tools in its own language, but that's not the biggest problem, since the tools are typically written in the same language as the project, or have wrappers.

They aren't really designed to manage a toolbox, but rather for one-off execution. There's no declarative versioning.

uvx is a part of uv, which is a package manager and also a Python version manager, but somehow a tool manager is out of scope 🤷.

❌ declarative ✅ isolated ❌ task management

Tools as dev dependencies

It's really not what dev dependencies are meant for: everything is installed in a shared environment (venv, node_modules). In the Node.js world this is mostly fine – every package maintains its own dependencies, even if versions differ, which is why you end up with a huge node_modules. In Python you might end up with conflicting dependencies (this somehow doesn't stop Poetry projects from using the dev dependency group this way). Again, these only work for tools written in the same language as the project.

✅ declarative ❌ isolated ❓ task management (not in python, package.json has scripts)

pre-commit

Technically ticks many boxes, but not really intended as a dev tool manager.

Uses git repositories as a distributed source of packages, keeps them isolated and cached per project. Versions are fully declarative – you can even pin full SHA hashes.

This is all great, but it's literally a manager for git's pre-commit hooks, which means it's designed for fast and predictable execution. It's not intended as a general-purpose tool manager: it doesn't provide the tools with the full project environment (dependencies are not included), it doesn't support test runners (also because testing is too slow for a pre-commit hook), and even support for some linters (see Python's mypy) is suboptimal (again, mypy works best with access to dependencies).

✅ declarative ✅ isolated (actually too isolated for my purpose) ❌ task management

Mise

Works both as a package manager (tools) and task manager (replacement for package.json scripts, rather than for make). The only tool with the express goal of being a tool and task manager.

It doesn't handle tools with plugins well (e.g. pydantic.mypy) – it uses a shared directory for installs, but only the tool name and version make up a tool's “identity”, which means that if you have two projects that both use mypy, and one of them uses pydantic with its mypy plugin, there's a conflict (there's a workaround, but it's not good for collaboration).

It also does funny things to $PATH, so it doesn't play well with other tools, like IDEs or pre-commit.

✅ declarative ❌ isolated (buggy) ✅ task management

Dev containers

This isn't even a tool, it's simply a container (docker, podman, you name it) that has your tools installed and has access to your project directory.

It's great for isolation and reproducibility, as long as your installs are versioned. The isolation also makes curl | sudo sh - a bit more acceptable.

I haven't worked with it yet, but the concept seems very reasonable, especially for occasional builds of applications that you only use (but don't develop).

Conclusion

I don't see a perfect tool here, and I'm not even looking for a polyglot multi-project / monorepo manager.

Mise comes very close, and I'm currently using it, but it has a problem with isolation and the project doesn't seem interested in fixing it, so I wouldn't recommend it on its own in a collaborative environment.

I'll take a closer look at mise in a dev container. Dev containers would provide the isolation and mise covers declarative tool and task management quite well.


I'm developing a small package for writing typed Web API clients, similar to Uplink, only compatible with OpenAPI (lapidary). I use these clients mostly in Web servers, so I decided to make it async. So far I've settled on HTTPX, but I want to keep tabs on available python packages.

urllib.request

Even the standard library docs point elsewhere: "The Requests package is recommended for a higher-level HTTP client interface."

It accepts request headers only as a dict, and I've decided that this doesn't count as proper HTTP support.
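
For illustration, a minimal sketch of what that looks like (URL and headers are made up):

    import urllib.request

    # Headers can only be passed as a dict, so duplicate header names
    # (e.g. several Cookie headers) cannot be expressed.
    req = urllib.request.Request(
        "https://example.com",
        headers={"Accept": "application/json", "User-Agent": "demo/1.0"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()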

❌ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio

urllib3

Fast and robust, slowly acquiring HTTP/2 support, but only sync.

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio

requests

This is the one every Python example tells you to use. It's been under a feature freeze since forever – HTTP moves on, but requests is apparently as perfect as it can get. It also “broke the internet” when it moved to urllib3 2.0.0, which dropped support for older TLS algorithms, in a minor version release. As long as the API doesn't change in an incompatible way, right?

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio

pycURL

It's super fast, since it's a super-thin wrapper over libcURL.

Also a great showcase of how not to write python wrappers. Might as well use ctypes.
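
A rough sketch of basic usage (URL made up) – options are set one by one with libcURL constants, and there are no Pythonic request/response objects:

    import io
    import pycurl

    buf = io.BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, "https://example.com")  # libcurl option constants
    c.setopt(pycurl.WRITEDATA, buf)              # write the response body into a buffer
    c.perform()                                  # blocking transfer
    print(c.getinfo(pycurl.RESPONSE_CODE), len(buf.getvalue()))
    c.close()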

✅ HTTP/1.1 ✅ HTTP/2 ✅ HTTP/3 (I think it's supported; libcURL says it's experimental) ❌ asyncio, but non-blocking, since it's native code.

HTTPX

The most popular async HTTP client package for Python. Supports HTTP/2, but the docs say it's “not as robust” as HTTP/1.1. Both sync and async.

The API is mostly compatible with requests (it's not a feature). Depending on the usage pattern, it's slightly slower or much slower than aiohttp.
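
A minimal async sketch (URL made up); HTTP/2 is opt-in – you need the httpx[http2] extra and http2=True:

    import asyncio
    import httpx

    async def main() -> None:
        # HTTP/2 must be enabled explicitly; without the h2 extra installed this raises.
        async with httpx.AsyncClient(http2=True) as client:
            resp = await client.get("https://example.com")
            print(resp.http_version, resp.status_code)

    asyncio.run(main())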

✅ HTTP/1.1 ✅ HTTP/2 (discouraged, opt-in) ❌ HTTP/3 ✅ sync & async

aiohttp

The API is close to the protocol, so it's a bit more complex than the alternatives, but I like it. One of the fastest HTTP packages in Python.
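
What “close to the protocol” means in practice – you manage the session and each response explicitly (URL made up):

    import asyncio
    import aiohttp

    async def main() -> None:
        # Explicit connection pool; each response is itself a context manager.
        async with aiohttp.ClientSession() as session:
            async with session.get("https://example.com") as resp:
                print(resp.status, await resp.text())

    asyncio.run(main())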

The community seems more focused on the server side (more features and plugins).

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ✅ asyncio-only

Aiosonic

Looks and feels like HTTPX.

Depending on usage pattern, it's either a bit slower or a bit faster than aiohttp. Probably there's something wrong with it, because nobody seems to be using it. Definitely worth a closer look.

✅ HTTP/1.1 ✅ HTTP/2 – beta ❌ HTTP/3 ✅ sync & asyncio

Aioquic

A really low-level HTTP/3 & QUIC package. Only for genuine H3 users. It could use better examples – the example client has 600+ lines. Or maybe it's just that hard to use...

❌ HTTP/1.1 ❌ HTTP/2 ✅ HTTP/3 ✅ asyncio-only

Niquests

An async fork of requests with HTTP/2 and HTTP/3 support. Fast and full-featured, but not user-friendly.

It uses a forked urllib3 (by the same author) that shadows the original, so if you happen to have a transitive dependency on the original, it will mess up your environment. The author is fine with that. I recommend against using it.

✅ HTTP/1.1 ✅ HTTP/2 ✅ HTTP/3 ✅ sync & asyncio

Reqwest wrappers

This makes a lot of sense – Rust is a popular language, reqwest is a popular Rust package, keep it DRY. None of the wrappers are widely popular, but some projects are live and definitely worth watching:

  • gufo-http – I think it's the oldest and still maintained
  • pyreqwesthttpr

I found a few more – just search for reqwest on PyPI.

✅ HTTP/1.1 ✅ HTTP/2 ❌ HTTP/3 – maybe, it's still experimental in reqwest ✅ sync and asyncio

Conclusion

Honestly, the state of clients for the protocol that makes the Web makes me wonder: is Python still alive? There isn't a single popular and stable package with HTTP/2 (2015!) or HTTP/3 (2022) support, either sync or async. Is Python only for Django and LLMs? Has everyone else switched to Rust and/or Deno?


Based on the Litestar Fullstack Inertia showcase project.

Prerequisites

For social authorization to work, you first need to register your application with the identity provider (e.g., GitHub).

Implementation

Frontend

You need a login or registration page with a button or link that sends the browser to a backend route which starts the OAuth2 flow. For social login it doesn't matter whether you call it “login” or “registration”, since the first login typically creates the user.

Implementation details:

  • In the showcase project it's implemented in two partials: the Login form and the Registration form.
  • Both partials use the same button, which handles clicks with router.post(route("github.register")).
  • route() is an Inertia.js helper function which simply resolves handler names to paths.
  • A plain link and an HTTP GET would also work here, since no data is sent at this stage.

Login/Register handler (backend)

Create a route that redirects to the provider's authorization URL, including a callback URL that points to your completion handler. For production you should also generate an OAuth2 state and a PKCE code verifier, and store them in the session.

Implementation details:

  • See the authorization controller.
  • You can use httpx_oauth to get auth-related URLs for popular OAuth2 providers.
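
A minimal sketch of such a handler, assuming Litestar with server-side sessions enabled and httpx_oauth's GitHubOAuth2 client; the route path, credentials, callback URL and session key are illustrative, not the showcase's actual names:

    import secrets

    from httpx_oauth.clients.github import GitHubOAuth2
    from litestar import Request, get
    from litestar.response import Redirect

    github = GitHubOAuth2("MY_CLIENT_ID", "MY_CLIENT_SECRET")
    CALLBACK_URL = "https://myapp.example/login/github/complete"

    @get("/login/github")
    async def github_register(request: Request) -> Redirect:
        # Random state ties the callback back to this browser session.
        state = secrets.token_urlsafe(32)
        request.session["oauth_state"] = state
        # Redirect the browser to GitHub's authorization page.
        auth_url = await github.get_authorization_url(CALLBACK_URL, state=state)
        return Redirect(auth_url)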

Authorization callback handler (backend)

Your callback (named “github.complete” in the showcase) should:

  1. Validate the stored OAuth2 state (required for security; missing in the showcase)
  2. Exchange the authorization code for an access token
  3. Create a user object in the local database and store the user ID in the session
  4. Redirect to a post-login page (e.g., dashboard)

Optionally:

  • Fetch user details (e.g., email) from the provider
  • Provide user data to the frontend

Implementation details:

  • The showcase doesn't use OAuth2 state or PKCE.
  • GitHubOAuth2.get_access_token is used to exchange the auth code for a token (wrapped as a Litestar dependency, to facilitate reuse).
  • GitHubOAuth2.get_id_email is used to retrieve the user's email.
  • Inertia's share() is used to pass user details to the frontend.
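
Putting the steps above together, a minimal sketch of the callback handler under the same assumptions as the login sketch (same github client and CALLBACK_URL); save_user is a hypothetical helper standing in for your user storage:

    from litestar import Request, get
    from litestar.exceptions import NotAuthorizedException
    from litestar.response import Redirect

    @get("/login/github/complete", name="github.complete")
    async def github_complete(request: Request, code: str, state: str) -> Redirect:
        # 1. Validate the stored OAuth2 state.
        if state != request.session.pop("oauth_state", None):
            raise NotAuthorizedException("OAuth2 state mismatch")
        # 2. Exchange the authorization code for an access token.
        token = await github.get_access_token(code, CALLBACK_URL)
        # Optionally fetch user details from the provider.
        account_id, email = await github.get_id_email(token["access_token"])
        # 3. Create/load the local user and store its id in the session.
        user_id = await save_user(github_id=account_id, email=email)  # hypothetical helper
        request.session["user_id"] = user_id
        # 4. Redirect to a post-login page.
        return Redirect("/dashboard")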

© 2025 Matte Silver

CC BY-NC-SA 4.0