Nobody’s Coming to Save Your Dependencies

Why I still recommend constraint-dependencies for litellm - even after PyPI deleted the compromised versions
python
security
opinion
Author

Dan O’Leary

Published

March 29, 2026

A follow-up to TIL: uv Settings I Changed after LiteLLM.

After my last post, a couple of commenters pushed back on recommendation 2 - the constraint-dependencies entry blocking litellm 1.82.7 and 1.82.8. The argument, paraphrased: PyPI pulled those versions, so they’re no longer installable. The constraint is unnecessary work defending against something that can’t happen.

I appreciate the pushback, and it's not an unreasonable position. PyPI's security response to the litellm incident was genuinely fast - the malicious versions were live for only about three hours before quarantine, and the maintainers rotated credentials and engaged Google's Mandiant for forensics. That's not nothing. If you believe PyPI's remediation is sufficient and permanent, a redundant constraint looks like noise.

But after looking more closely, I think that view is too narrow on both the technical facts and the risk reasoning.

What “Pulled” Actually Means

“Pulled” isn’t a PyPI term, and that vagueness obscures something important. Looking across incident reports from the litellm compromise, the same versions get described as quarantined, yanked, and permanently deleted - sometimes in the same thread, because the state changed multiple times over several hours.

These are meaningfully different things. PyPI’s yank mechanism, defined in PEP 592, is a soft removal. Yanked versions disappear from general resolution - pip install litellm won’t touch them - but they remain installable if explicitly requested by exact pin: litellm==1.82.7 will still resolve a yanked version. That’s intentional. Yanking exists precisely so that maintainers can flag a bad release without breaking downstream projects that have already pinned to it.
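To make the yank semantics concrete, here's a toy resolver sketch. The version numbers are from the incident, but the resolver itself is illustrative - a minimal model of the PEP 592 behavior, not pip's or uv's actual implementation:

```python
# Toy model of PEP 592 yank semantics: yanked releases are skipped
# during open-ended resolution, but still satisfy an exact == pin.

releases = {
    "1.82.6": {"yanked": False},
    "1.82.7": {"yanked": True},   # compromised, then yanked
    "1.82.8": {"yanked": True},   # compromised, then yanked
}

def resolve(pin=None):
    """Return the version a PEP 592-aware resolver would pick."""
    if pin is not None:
        # An exact pin may still resolve a yanked release.
        return pin if pin in releases else None
    # Without a pin, yanked releases are excluded from candidates.
    candidates = [v for v, meta in releases.items() if not meta["yanked"]]
    return max(candidates, key=lambda v: tuple(map(int, v.split("."))))

print(resolve())              # latest non-yanked release: 1.82.6
print(resolve(pin="1.82.7"))  # yanked, but the exact pin still resolves it
```

The asymmetry in the last two lines is the whole point: yanking protects fresh installs while deliberately leaving pinned installs alone.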

Hard deletion is different; it makes a version truly unreachable from the index. Based on the litellm team’s own communications and third-party analysis, permanent deletion did eventually happen for 1.82.7 and 1.82.8. As of this writing, both versions return 404 on the PyPI JSON API and are completely absent from the Simple API index. Gone as if they never existed. But deletion happened after quarantine and after yanking, as a later escalation.

The practical upshot: “PyPI pulled it” is not a precise claim, and the level of protection it implies depends on exactly which action was taken, when, and from which index you’re resolving. The constraint sidesteps all of that ambiguity.

The Risk Math

Even granting that the versions are now unreachable from PyPI proper, “very unlikely to happen” is not the same as “cannot happen.” Risk has two dimensions: likelihood and impact.

The likelihood of accidentally resolving a known-compromised litellm version today is low. The impact - credential exfiltration across SSH keys, cloud provider tokens, Kubernetes configs, and CI secrets, plus a persistent backdoor polling for further instructions - is catastrophically high. When likelihood is low and impact is catastrophic, the standard engineering response isn't to rely on circumstance. It's to put a control in place.

PyPI deletion only controls PyPI. Corporate package mirrors like Artifactory, Nexus, and AWS CodeArtifact may have cached the wheel during the exposure window and won’t automatically honor an upstream deletion. That’s the clearest residual risk and a well-documented concern. Local uv and pip caches are another reason not to treat PyPI deletion as a magic eraser, though this is less of a risk.

That distinction matters because the litellm versions were ultimately deleted, but a future compromised release may not be. On PyPI, the standard response is for maintainers to yank the offending release rather than delete it, and PEP 592 explicitly states that the yanked attribute "is not immutable once set, and may be rescinded in the future." So when the next compromised release gets yanked but not deleted, an exact pin can still resolve it unless you block it yourself.

The constraint is the control. As I noted in my correction to the original post, constraint-dependencies only works at the project level in pyproject.toml - not in global uv.toml. That means adding it to each project that depends on litellm, which is more friction than I originally suggested. But it’s still a one-line addition per project, and it doesn’t slow down resolution or complicate your workflow once it’s there. The argument against it amounts to: you don’t have to. That’s true right up until it isn’t.
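For reference, the control in question is a short entry in each project's pyproject.toml (the table name follows uv's configuration schema; the blocked versions are the two from the incident):

```toml
# pyproject.toml - project-level only; uv does not read this from global uv.toml
[tool.uv]
constraint-dependencies = [
    "litellm!=1.82.7",
    "litellm!=1.82.8",
]
```

Unlike an ordinary dependency specifier, a constraint only takes effect if litellm appears in the resolution at all - it adds no new dependency, it just forbids those two versions from ever being chosen.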

The Swiss Cheese

James Reason’s Swiss cheese model of accident causation describes safety as a stack of imperfect layers - each one has holes, but when the holes don’t align, incidents don’t get through. The failure mode isn’t any single layer failing; it’s multiple layers failing simultaneously, holes lined up in a row.

The Swiss cheese model of accident causation. Multiple defensive layers each have holes; an incident occurs when the holes align.

PyPI’s quarantine is one layer. Admin deletion is another. Your lockfile is another. Your corporate mirror’s policies are another. Your teammate’s awareness of the incident is another. None of these layers is perfect, and none of them knows about the others. The constraint-dependencies entry is a thin, cheap layer that you fully control. It doesn’t need PyPI to have done the right thing. It doesn’t need your mirror to have honored the yank metadata. It doesn’t need your teammate to have noticed the incident. It just blocks the versions, unconditionally, every time uv resolves.

The goal of defense-in-depth isn’t to address threats that are certain. It’s to close gaps that are cheap to close, because you never know in advance which holes are going to line up.

I greatly respect what PyPI does and think the litellm response was handled about as well as it could have been given the circumstances. But “they handled it well” and “I should rely on them handling it” are different conclusions. I’d rather keep that layer in my own hands.

Explicit is better than implicit.

The Zen of Python