
Python Packaging — Street-Level Ops

Real-world workflows for fixing broken environments, auditing dependencies, building Docker images with Python, and managing packages in production.

Debugging "ModuleNotFoundError"

The single most common Python packaging problem. The module usually IS installed — just not on the search path of the Python that's actually running.

# Step 1: which Python is actually running?
which python3
python3 -c "import sys; print(sys.executable)"
# /usr/bin/python3  ← system Python
# /home/deploy/.venv/bin/python3  ← virtualenv Python

# Step 2: where does THIS Python look for packages?
python3 -c "import sys; print('\n'.join(sys.path))"
# Output:
# /home/deploy/app
# /usr/lib/python3.11
# /usr/lib/python3.11/lib-dynload
# /home/deploy/.venv/lib/python3.11/site-packages  ← packages go here

# Step 3: where is the package actually installed?
python3 -m pip show requests
# Location: /home/deploy/.venv/lib/python3.11/site-packages  ← match?

# Step 4: is the virtualenv even activated?
echo $VIRTUAL_ENV
# Empty = not activated. That's your problem.
source /home/deploy/.venv/bin/activate

# Step 5: check for namespace package conflicts
python3 -c "import mypackage; print(mypackage.__file__)"
# None = namespace package (directory without __init__.py)
# If __file__ points to the wrong location, you have multiple installs

# Step 6: nuclear option — see every import attempt
python3 -v -c "import mypackage" 2>&1 | grep -i "mypackage\|trying\|error"
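The six steps above can be wrapped into one snippet run inside the failing interpreter. This is a minimal sketch; `diagnose` is a hypothetical helper name, not part of pip or any stdlib tool.

```python
# Sketch: print where THIS interpreter lives, where it searches, and
# whether the module resolves — all from inside the failing Python.
import importlib.util
import sys

def diagnose(module_name: str) -> None:
    print(f"interpreter: {sys.executable}")
    print("search path:")
    for entry in sys.path:
        print(f"  {entry or '(script directory)'}")
    spec = importlib.util.find_spec(module_name)
    if spec is None:
        print(f"{module_name}: NOT importable from this interpreter")
    elif spec.origin is None:
        # Namespace packages (no __init__.py) report origin=None
        print(f"{module_name}: namespace package at {list(spec.submodule_search_locations)}")
    else:
        print(f"{module_name}: found at {spec.origin}")

diagnose("json")  # stdlib package; should always resolve
```

If `find_spec` returns a spec but the location is wrong, you have multiple installs; if it returns None, the package simply isn't visible to this interpreter.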

The sys.path Fix for Scripts

# Script that needs to import from a parent directory
# DON'T do this:
import sys
sys.path.insert(0, '/absolute/path/to/project')  # fragile, non-portable

# DO this — use relative path from script location:
import sys
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

# Or better yet: install your project as a package
# pip install -e .
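For clarity, here is what the pathlib approach computes. The path below is a fabricated stand-in for `__file__`; `PurePosixPath` is used only so the example is deterministic, and `resolve()` is skipped because the path doesn't exist. Keep `resolve()` in real code so symlinks and relative invocations are handled.

```python
# Demonstrates the parent-directory computation from the snippet above.
from pathlib import PurePosixPath

script = PurePosixPath("/home/deploy/app/scripts/job.py")  # stand-in for __file__
project_root = script.parent.parent
print(project_root)  # /home/deploy/app
```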

Fixing Broken Virtualenvs

# Symptom: pip broken, imports failing, Python version mismatch after OS upgrade

# Option 1: recreate the virtualenv (safest)
# Save current requirements first
pip freeze > /tmp/requirements-backup.txt
deactivate

# Nuke and rebuild
rm -rf .venv
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt  # use your pinned requirements, not the backup

# Option 2: upgrade the venv's core files in place (after an in-place Python upgrade)
python3 -m venv --upgrade .venv

# Option 3: force reinstall everything (keeps venv, fixes corrupt packages)
pip install --force-reinstall -r requirements.txt

# Option 4: fix pip itself when pip is broken
python3 -m ensurepip --upgrade
python3 -m pip install --upgrade pip setuptools wheel

# Symptom: "No module named pip" after OS upgrade
curl -sS https://bootstrap.pypa.io/get-pip.py | python3

# Symptom: SSL errors from pip
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org requests
# Permanent fix: update ca-certificates
sudo apt-get update && sudo apt-get install -y ca-certificates

Auditing Dependencies for Vulnerabilities

# pip-audit — maintained by PyPA; checks PyPI's advisory data by default
# (or the OSV service via --vulnerability-service osv)
pip install pip-audit

# Scan current environment
pip-audit
# Output:
# Name         Version  ID                  Fix Versions
# ------------ -------- ------------------- ------------
# requests     2.28.0   PYSEC-2023-74       2.31.0
# cryptography 38.0.0   GHSA-xxx-yyy-zzz    41.0.0

# Scan a requirements file without installing
pip-audit -r requirements.txt

# JSON output for CI pipelines
pip-audit --format json --output audit-results.json

# Fix: upgrade the vulnerable packages
pip-audit --fix  # attempts automatic upgrade

# safety — alternative scanner (pyup.io's Safety DB)
pip install safety
safety check
safety check -r requirements.txt --json

# Scan in CI (GitHub Actions example)
# - name: Audit dependencies
#   run: |
#     pip install pip-audit
#     pip-audit -r requirements.txt --strict

# Include full vulnerability descriptions in the output
pip-audit --desc

# Spot typosquats manually — scan the list for packages you don't recognize
pip list | sort
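For CI gating, pip-audit's JSON output can be post-processed with a few lines of Python. The dependencies/name/version/vulns shape below matches pip-audit's JSON format; the report itself is a fabricated sample for illustration.

```python
# Sketch of a CI gate over pip-audit JSON output.
import json

sample_report = json.loads("""
{"dependencies": [
  {"name": "requests", "version": "2.28.0",
   "vulns": [{"id": "PYSEC-2023-74", "fix_versions": ["2.31.0"]}]},
  {"name": "click", "version": "8.1.7", "vulns": []}
]}
""")

def vulnerable_packages(report: dict) -> list:
    """Return (name, version, vuln_id) for every finding in the report."""
    findings = []
    for dep in report.get("dependencies", []):
        for vuln in dep.get("vulns", []):
            findings.append((dep["name"], dep["version"], vuln["id"]))
    return findings

findings = vulnerable_packages(sample_report)
for name, version, vuln_id in findings:
    print(f"{name}=={version} is affected by {vuln_id}")
# In CI, fail the build when anything is found:
# raise SystemExit(1 if findings else 0)
```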

Docker Multi-Stage Patterns for Python

Pattern 1: Builder + Runtime (Most Common)

# Stage 1: build dependencies (includes compilers, headers)
FROM python:3.11 AS builder

WORKDIR /build
COPY requirements.txt .

# Install into a specific directory so we can copy just the packages
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Stage 2: minimal runtime image
FROM python:3.11-slim

# Copy only the installed packages, not the build tools
COPY --from=builder /install /usr/local

# Copy application code
WORKDIR /app
COPY . .

# Non-root user
RUN useradd -r -u 1000 appuser
USER appuser

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

Pattern 2: Compiled Dependencies (numpy, cryptography, etc.)

FROM python:3.11 AS builder

# System dependencies for compilation
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential \
    libpq-dev \
    libffi-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /build
COPY requirements.txt .

# Build wheels so runtime doesn't need compilers
RUN pip wheel --no-cache-dir --wheel-dir=/wheels -r requirements.txt

FROM python:3.11-slim

# Runtime-only system deps (no compilers)
RUN apt-get update && apt-get install -y --no-install-recommends \
    libpq5 \
    && rm -rf /var/lib/apt/lists/*

# Install from pre-built wheels
COPY --from=builder /wheels /wheels
RUN pip install --no-cache-dir --no-index --find-links=/wheels /wheels/*.whl \
    && rm -rf /wheels

WORKDIR /app
COPY . .
RUN useradd -r -u 1000 appuser
USER appuser

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

Pattern 3: Poetry / PDM in Docker

FROM python:3.11 AS builder

# poetry-plugin-export is required for "poetry export" on Poetry 2.x
RUN pip install poetry poetry-plugin-export
WORKDIR /build
COPY pyproject.toml poetry.lock ./

# Export to requirements.txt — avoids needing poetry in runtime
RUN poetry export -f requirements.txt --without-hashes -o requirements.txt
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

FROM python:3.11-slim
COPY --from=builder /install /usr/local
WORKDIR /app
COPY . .
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

Managing Multiple Python Versions

# Option 1: pyenv (recommended for dev machines)
curl https://pyenv.run | bash

# Add to shell profile:
# export PYENV_ROOT="$HOME/.pyenv"
# export PATH="$PYENV_ROOT/bin:$PATH"
# eval "$(pyenv init -)"

pyenv install 3.11.8
pyenv install 3.12.2

# Set global default
pyenv global 3.11.8

# Set per-project version (creates .python-version file)
cd /my/project
pyenv local 3.12.2

# List installed versions
pyenv versions
#   system
#   3.11.8
# * 3.12.2 (set by /my/project/.python-version)

# Option 2: update-alternatives (Debian/Ubuntu servers)
# Caution: repointing /usr/bin/python3 can break apt and other OS tooling;
# prefer invoking python3.12 explicitly in scripts and services
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.11 1
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.12 2
sudo update-alternatives --config python3

# Option 3: deadsnakes PPA (Ubuntu — get newer Python versions)
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt-get update
sudo apt-get install python3.12 python3.12-venv python3.12-dev

# Create a venv with a specific Python version
python3.12 -m venv .venv312
source .venv312/bin/activate
python --version  # Python 3.12.2
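With several interpreters on one machine, a fail-fast version guard catches a service launched with the wrong Python. A minimal sketch; the (3, 8) floor is illustrative and should mirror your project's requires-python.

```python
# Abort early (with a pointer to the offending interpreter) when the
# running Python is older than the project supports.
import sys

REQUIRED = (3, 8)  # illustrative minimum

if sys.version_info < REQUIRED:
    raise SystemExit(
        f"need Python {REQUIRED[0]}.{REQUIRED[1]}+, "
        f"got {sys.version.split()[0]} at {sys.executable}"
    )
print(f"OK: {sys.version.split()[0]} at {sys.executable}")
```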

Building and Publishing Internal Packages

# Project structure for an internal library
# mylib/
# ├── pyproject.toml
# ├── src/
# │   └── mylib/
# │       ├── __init__.py
# │       └── core.py
# └── tests/
#     └── test_core.py

# pyproject.toml (modern, PEP 621)
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "mylib"
version = "1.2.0"
description = "Internal shared library"
requires-python = ">=3.11"
dependencies = [
    "requests>=2.31",
    "pydantic>=2.0",
]

[project.optional-dependencies]
dev = ["pytest", "ruff", "mypy"]

# Build the package
pip install build
python -m build
# Creates dist/mylib-1.2.0.tar.gz and dist/mylib-1.2.0-py3-none-any.whl

# Publish to a private PyPI (e.g., AWS CodeArtifact, Artifactory, devpi)
pip install twine
twine upload --repository-url https://pypi.internal.example.com/ dist/*
# (the upload endpoint is usually NOT the /simple/ index pip installs from)

# Install from private PyPI
pip install --index-url https://pypi.internal.example.com/simple/ mylib

# Or install directly from git (for small teams / early stage)
pip install git+https://github.com/myorg/mylib.git@v1.2.0

# Install from git in requirements.txt
# mylib @ git+https://github.com/myorg/mylib.git@v1.2.0

Vendoring Dependencies

When you need fully offline installs or don't trust your package index to be available.

# Download all dependencies as wheels into a vendor directory
# (targets the CURRENT platform/Python — run on a machine matching your
#  deploy target, or use --platform/--python-version with --only-binary=:all:)
pip download -r requirements.txt -d vendor/

# Install from the vendor directory (no network needed)
pip install --no-index --find-links=vendor/ -r requirements.txt

# Docker pattern: vendor in the build, install offline in runtime
# Dockerfile:
# COPY vendor/ /vendor/
# COPY requirements.txt .
# RUN pip install --no-cache-dir --no-index --find-links=/vendor -r requirements.txt

# Keep vendor up to date
pip download -r requirements.txt -d vendor/
# Then commit vendor/ to git (or store in artifact storage)

# Verify vendored packages satisfy requirements (--dry-run needs pip >= 22.2)
pip install --no-index --find-links=vendor/ --dry-run -r requirements.txt
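Before an offline install, it is cheap to check that every pinned requirement has a matching wheel in vendor/. A sketch assuming wheel filenames start with the PEP 503/427 normalized project name; `missing_wheels` and the temp directory demo are hypothetical, not a pip feature.

```python
# Pre-flight check: which requirements have no wheel in the vendor dir?
import re
import tempfile
from pathlib import Path

def normalize(name: str) -> str:
    # PEP 503 name normalization: runs of -, _, . become a single -
    return re.sub(r"[-_.]+", "-", name).lower()

def missing_wheels(requirements: list, vendor_dir: Path) -> list:
    # Wheel filenames look like name-version-...whl; the first dash-separated
    # field is the (escaped) project name.
    have = {normalize(whl.name.split("-")[0]) for whl in vendor_dir.glob("*.whl")}
    missing = []
    for line in requirements:
        line = line.split("#")[0].strip()
        if not line:
            continue
        name = re.split(r"[=<>!~\s\[;]", line)[0]  # strip version specifiers/extras
        if normalize(name) not in have:
            missing.append(name)
    return missing

with tempfile.TemporaryDirectory() as tmp:
    vendor = Path(tmp)
    (vendor / "requests-2.31.0-py3-none-any.whl").touch()  # fake vendored wheel
    print(missing_wheels(["requests==2.31.0", "httpx==0.26.0"], vendor))  # ['httpx']
```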

Reproducible Builds with pip-compile

# Install pip-tools
pip install pip-tools

# Create requirements.in with top-level dependencies only
cat > requirements.in << 'EOF'
fastapi>=0.104
uvicorn[standard]>=0.24
sqlalchemy>=2.0
pydantic>=2.0
httpx>=0.25
EOF

# Compile to fully pinned requirements.txt with hashes
pip-compile --generate-hashes --output-file=requirements.txt requirements.in

# Output (requirements.txt) looks like:
# fastapi==0.109.2 \
#     --hash=sha256:2c1... \
#     --hash=sha256:f4e...
# pydantic==2.6.1 \
#     --hash=sha256:0b6...
# ... (every transitive dependency pinned with hashes)

# Install from the pinned file
pip-sync requirements.txt  # removes packages not in requirements.txt

# Upgrade a single package
pip-compile --upgrade-package fastapi requirements.in

# Upgrade all packages
pip-compile --upgrade requirements.in

# Separate dev dependencies
cat > requirements-dev.in << 'EOF'
-c requirements.txt
pytest>=7.0
ruff>=0.1
mypy>=1.0
EOF

pip-compile --generate-hashes requirements-dev.in

# CI workflow:
# 1. Developer edits requirements.in
# 2. pip-compile generates requirements.txt (committed to git)
# 3. CI runs: pip install --require-hashes -r requirements.txt
# 4. If hashes don't match (supply chain attack), install fails

Diagnosing Dependency Conflicts

# See the full dependency tree
pip install pipdeptree
pipdeptree

# Output:
# fastapi==0.109.2
#   - pydantic [required: !=1.8,!=1.8.1,>=1.7.4, installed: 2.6.1]
#   - starlette [required: >=0.36.3,<0.37.0, installed: 0.36.3]
# uvicorn==0.27.1
#   - click [required: >=7.0, installed: 8.1.7]

# Find conflicts
pipdeptree --warn fail
# ERROR: package 'foo' requires 'bar>=2.0' but 'bar==1.9' is installed

# Show reverse dependencies (who depends on this package?)
pipdeptree --reverse --packages requests
# Output:
# requests==2.31.0
#   - httpx==0.26.0 [requires: requests]
#   - myapp==1.0 [requires: requests>=2.28]

# Show as JSON for scripting
pipdeptree --json | python3 -m json.tool

# When you have a conflict:
# 1. Check which packages need conflicting versions
pipdeptree --reverse --packages problematic-package
# 2. Try upgrading the higher-level package
pip install --upgrade parent-package
# 3. If stuck, check if there's a compatible version range
pip install "parent-a>=X" "parent-b>=Y"

Quick Emergency Fixes

# Package install hangs on "Building wheel for X"
# It's compiling C code. Check if there's a binary wheel:
pip install --only-binary=:all: package-name
# If no binary wheel exists for your platform, you need build deps:
sudo apt-get install -y build-essential python3-dev

# "ERROR: Could not install packages due to an EnvironmentError: [Errno 13]"
# Permission denied — you're trying to install into system Python
python3 -m venv .venv && source .venv/bin/activate  # use a venv

# "error: subprocess-exited-with-error" during install
# Check if you need system libs:
sudo apt-get install -y libpq-dev  # for psycopg2
sudo apt-get install -y libffi-dev  # for cffi/cryptography
sudo apt-get install -y libxml2-dev libxslt-dev  # for lxml

# pip itself is broken
python3 -m ensurepip --upgrade
python3 -m pip install --upgrade pip

# Clear the pip cache (corrupt cache causes weird errors)
pip cache purge

# Install a specific version to unbreak a regression
pip install package-name==1.2.3

# Downgrade to last known working set
pip install -r requirements.txt --force-reinstall

Quick Reference: Common Tasks

Task                         Command
Pin all deps with hashes     pip-compile --generate-hashes requirements.in
Sync env to lock file        pip-sync requirements.txt
Audit for vulnerabilities    pip-audit  (or: safety check)
Show dependency tree         pipdeptree
Find who depends on X        pipdeptree --reverse --packages X
Vendor deps for offline      pip download -r requirements.txt -d vendor/
Recreate broken venv         rm -rf .venv && python3 -m venv .venv
Multiple Python versions     pyenv install 3.12.2 && pyenv local 3.12.2
Debug import failures        python3 -v -c "import X"
Check package location       python3 -m pip show X