Python Packaging Footguns

Mistakes that break environments, create unreproducible builds, or cause mysterious import failures.


1. pip install without virtualenv pollutes system Python

You run pip install requests without a virtualenv. The package installs into the system Python's site-packages. Your next system update overwrites it. Or worse, a package you install conflicts with a system package and breaks apt or yum tools that depend on Python.

# This installs into system Python — dangerous
$ pip install requests

# On Debian/Ubuntu, this can break apt
$ pip install PyYAML  # Conflicts with system python3-yaml

# Since pip 23.0, on distros that mark the interpreter as
# externally managed (PEP 668), pip refuses this by default:
# "error: externally-managed-environment"

Fix: Always use a virtualenv. Always.

$ python3 -m venv .venv
$ source .venv/bin/activate
$ pip install requests  # Safe: installs into .venv/

On modern Debian/Ubuntu with PEP 668, pip blocks system installs by default. Never use --break-system-packages to override this unless you truly understand the consequences.
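A script can also refuse to run outside a virtualenv. A minimal sketch, using the sys.prefix vs sys.base_prefix distinction that the venv module documents (the in_virtualenv name is just an illustration):

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the environment while
    # sys.base_prefix still points at the interpreter the env was
    # created from. Outside any venv, the two are equal.
    return sys.prefix != sys.base_prefix

if not in_virtualenv():
    print("warning: not in a virtualenv; pip installs will land "
          "in this interpreter's site-packages")
```

Tools like tox and pipx use the same prefix comparison to detect virtual environments.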


2. requirements.txt without pinned versions gives you different builds every time

You write requests in requirements.txt instead of requests==2.31.0. Today's build works. Next week, a new version is released with a breaking change. Your build breaks. Or worse, it installs silently and fails at runtime with a subtle behavior change you don't catch until production.

# Bad: unpinned or loosely pinned
requests
flask>=2.0
psycopg2-binary

# Good: fully pinned with hashes
requests==2.31.0
flask==3.0.2
psycopg2-binary==2.9.9

Fix: Use pip-compile from pip-tools to generate pinned requirements from loose specs:

$ pip install pip-tools

# requirements.in — your direct dependencies (loose)
requests>=2.28
flask>=3.0

# Generate pinned requirements.txt with all transitive deps
$ pip-compile requirements.in --generate-hashes
# Output: requirements.txt with exact versions + hashes for every package
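A quick audit for loose pins can also be scripted. This is a hypothetical helper, not part of pip-tools; it simply flags any requirement line lacking an exact == pin:

```python
def unpinned(requirements_text: str) -> list[str]:
    """Return requirement lines that are not pinned with ==."""
    bad = []
    for line in requirements_text.splitlines():
        # Strip inline comments and whitespace; skip blank lines
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        if "==" not in line:
            bad.append(line)
    return bad

print(unpinned("requests\nflask==3.0.2\npsycopg2-binary>=2.9\n"))
# -> ['requests', 'psycopg2-binary>=2.9']
```

Running a check like this in CI turns "someone forgot to pin" from a production surprise into a failed build.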

3. Mixing pip and system package manager breaks both

You install python3-numpy via apt, then pip install numpy installs a newer version over it. Now apt thinks numpy is at version X but pip has version Y. apt autoremove may delete files pip depends on. Or your code imports the wrong version depending on sys.path order.

# Installed via apt
$ apt install python3-numpy  # /usr/lib/python3/dist-packages/numpy

# Then pip overwrites it
$ pip install numpy --upgrade  # /usr/local/lib/python3.11/dist-packages/numpy

# Which numpy do you get? Depends on sys.path order.
$ python3 -c "import numpy; print(numpy.__file__)"

Fix: System packages for system tools. Pip packages inside virtualenvs for your applications. Never mix the two in the same Python path.


4. Forgetting to rebuild Docker image after requirements change

You update requirements.txt, test locally, and deploy. But the Docker image still has the old requirements because you forgot to rebuild. The CI pipeline runs the old image. You get runtime import errors or wrong behavior that works locally but fails in the container.

# Docker layer caching: if requirements.txt hasn't changed (from Docker's
# perspective), this layer is cached and pip install is skipped
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

Fix: Always rebuild after changing dependencies. The Dockerfile above already copies requirements.txt before the rest of the source, so Docker invalidates the pip layer whenever the file's content changes; make sure CI actually runs docker build on every deploy. If you suspect a stale cache anyway, force a clean build:

# Force rebuild of every layer, including pip install
$ docker build --no-cache -t myapp:latest .
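One way to wire "did the dependencies change?" into a CI cache key is to hash the requirements file. A minimal sketch (the pip- prefix and 16-char truncation are arbitrary choices):

```python
import hashlib
from pathlib import Path

def requirements_cache_key(path: str) -> str:
    # The key changes exactly when the file's bytes change,
    # so any dependency edit busts the cache
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return f"pip-{digest[:16]}"
```

This is the same idea behind cache keys like hashFiles('requirements.txt') in GitHub Actions.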

5. setup.py vs pyproject.toml confusion

Your project has both setup.py and pyproject.toml with conflicting metadata. Some tools read one, some read the other. pip install . uses one set of dependencies while python -m build uses another. Your package installs different dependencies depending on how it's built.

# pyproject.toml says:
[project]
dependencies = ["requests>=2.28"]

# setup.py says:
install_requires=["requests>=2.25"]
# Which one wins? Depends on the build tool and its version.

Fix: Pick one. For new projects, use pyproject.toml exclusively (PEP 621). Remove setup.py and setup.cfg. The only reason to keep setup.py is for old tools that don't support pyproject.toml.

# pyproject.toml — single source of truth
[build-system]
requires = ["setuptools>=68.0"]
build-backend = "setuptools.build_meta"

[project]
name = "mypackage"
version = "1.0.0"
dependencies = [
    "requests>=2.28,<3.0",
    "pydantic>=2.0",
]

6. Missing __init__.py makes your package invisible

You create a directory structure that looks correct but forget __init__.py in a subdirectory. Imports often still work from the source tree (Python 3.3+ falls back to treating the directory as an implicit namespace package), which makes this especially confusing: the package installs, but import mypackage.submodule then fails with ModuleNotFoundError.

mypackage/
    __init__.py
    core.py
    utils/           # Missing __init__.py!
        helpers.py
# This fails:
from mypackage.utils.helpers import format_date
# ModuleNotFoundError: No module named 'mypackage.utils'

Fix: Add __init__.py (can be empty) to every directory that should be a Python package. Or use implicit namespace packages (PEP 420) if you specifically need them, but be aware this changes import semantics.

Note: setuptools with find_packages() will silently skip directories without __init__.py. Use find_namespace_packages() if you intentionally use namespace packages.
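A pre-release sanity check can catch the missing file before packaging silently drops a directory. This is a hypothetical helper, not a setuptools feature:

```python
from pathlib import Path

def dirs_missing_init(package_root: str) -> list[str]:
    # Flag every subdirectory that contains .py files
    # but has no __init__.py of its own
    missing = []
    for d in sorted(p for p in Path(package_root).rglob("*") if p.is_dir()):
        has_py = any(f.suffix == ".py" for f in d.iterdir() if f.is_file())
        if has_py and not (d / "__init__.py").exists():
            missing.append(str(d))
    return missing
```

Run against the mypackage/ tree above, it would report the utils/ directory.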


7. Relative vs absolute imports cause "works here, breaks there"

Code using relative imports (from .utils import helper) works when installed as a package but fails when run directly as a script (python mypackage/module.py). Conversely, absolute imports (from mypackage.utils import helper) fail inside the package if mypackage isn't on sys.path.

# mypackage/api/views.py

# Relative import — works when installed, fails when run directly
from ..utils import sanitize

# Absolute import — works when installed, fails from working directory
from mypackage.utils import sanitize

# Running directly always breaks relative imports:
$ python mypackage/api/views.py
# ImportError: attempted relative import with no known parent package

# Correct way to run:
$ python -m mypackage.api.views

Fix: Use absolute imports consistently. Always run code via python -m package.module or install the package with pip install -e .. Never run modules directly with python path/to/module.py if they use relative imports.


8. pip install from git without pinning a commit

You put a git URL in requirements.txt without pinning to a specific commit. Every install gets whatever is on the default branch at that moment. Builds become unreproducible. A force-push to the repo silently changes what you get.

# Bad: installs whatever is on main right now
git+https://github.com/org/internal-lib.git

# Bad: branch ref, still moves
git+https://github.com/org/internal-lib.git@main

# Good: pinned to exact commit
git+https://github.com/org/internal-lib.git@a1b2c3d4e5f6

# Also good: pinned to tag
git+https://github.com/org/internal-lib.git@v1.2.3

Fix: Always pin to a commit hash or immutable tag. Better yet, publish internal packages to a private PyPI index (devpi, Artifactory, CodeArtifact) instead of installing from git.
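A requirements audit can enforce the pinning rule mechanically. This sketch is deliberately strict, accepting only hex commit hashes (since even tags can be deleted or re-pointed), and assumes plain git+ URLs with no #egg= fragment:

```python
import re

# A trailing 7-40 character hex ref is treated as a pinned commit
PINNED_COMMIT = re.compile(r"@[0-9a-f]{7,40}$")

def git_refs_unpinned(lines: list[str]) -> list[str]:
    # Flag every git+ requirement not pinned to a commit hash
    bad = []
    for line in lines:
        line = line.strip()
        if line.startswith("git+") and not PINNED_COMMIT.search(line):
            bad.append(line)
    return bad
```

Branch refs like @main fail the hex check, as does a bare URL with no ref at all, so both unreproducible forms from the examples above get flagged.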