Dagger - Street-Level Ops¶
Quick Diagnosis Commands¶
# Check Dagger CLI version
dagger version
# List available functions in a module
dagger functions
# Call a specific function
dagger call build
# Run with verbose output to see what's happening
dagger call --debug build
# Run a pipeline file (if using dagger run with a script)
dagger run node ci.mjs
dagger run python ci.py
# View Dagger engine logs (the engine runs as a Docker container)
docker logs $(docker ps -q -f name=dagger-engine)
# Check active Dagger engine
docker ps | grep dagger-engine
# Upgrade the Dagger CLI (the matching engine image is pulled automatically on next call)
cd /usr/local && curl -L https://dl.dagger.io/dagger/install.sh | sh
# Module development
dagger init --sdk=go --name=mymodule
dagger init --sdk=python --name=mymodule
dagger init --sdk=typescript --name=mymodule
# Develop module (hot reload)
dagger develop
# Install a module from registry or git
dagger install github.com/shykes/dagger-demo
# List installed dependencies
cat dagger.json
# Re-sync generated bindings and dependencies after editing dagger.json
dagger develop
# Calling module functions with arguments
dagger call build --source=. --dockerfile=./Dockerfile
# Chain function calls
dagger call build --source=. publish --address=ttl.sh/myimage:latest
# Pass a secret
export DOCKER_PASSWORD=mysecret
dagger call push --password=env:DOCKER_PASSWORD
# Pass a directory from local filesystem
dagger call lint --source=.
# Interactive shell inside a container step
dagger call build --source=. terminal
Common Scenarios¶
Scenario 1: Debugging a failing pipeline step¶
Pipeline fails but the error message is vague. Need to inspect the container state.
# Run with debug mode to see full execution trace
dagger call --debug build --source=.
# Drop into a terminal at the point of failure
# In your Dagger function, add:
#   .Terminal()   (Go)
#   .terminal()   (Python / TypeScript)
# Example Go module snippet:
#   return dag.Container().From("golang:1.22").
#       WithMountedDirectory("/src", src).
#       WithWorkdir("/src").
#       Terminal()  // <-- drops you into a shell here
# Check engine logs for low-level errors
docker logs -f $(docker ps -q -f name=dagger-engine)
# Inspect the Dagger engine container directly
docker exec -it $(docker ps -q -f name=dagger-engine) sh
Fix: Use .terminal() strategically in your pipeline definition during debugging. Remove before committing.
Scenario 2: Cache not being used — slow rebuilds¶
Under the hood: Dagger's cache keys are based on the content hash of function inputs. If you pass the entire source directory as an argument, any file change (even a README edit) invalidates the cache for dependency installation. Separate your dependency files from source code in the pipeline graph.
Pipeline runs full build every time despite no code changes.
# Verify cache mounts are defined correctly
# In Go SDK:
# dag.CacheVolume("go-mod-cache")
# .WithMountedCache("/root/go/pkg/mod", dag.CacheVolume("go-mod-cache"))
# Check if cache is being populated
dagger call --debug build 2>&1 | grep -i "cache\|CACHED\|RUN"
# Force cache invalidation (dagger call has no global --no-cache flag;
# bust the cache by changing an input, e.g. a throwaway env var in the function)
#   .WithEnvVariable("CACHEBUSTER", time.Now().String())
# Clear local Dagger cache (destructive — slow next run)
docker volume ls | grep dagger
docker volume rm dagger_cache_XXX # use actual volume name
# Cache keys are content-addressed: function arguments affect them,
# but two paths pointing at the same directory contents hit the same entry
dagger call build --source=$(pwd)  # absolute path
dagger call build --source=.       # same contents, same cache entry
Fix: Mount dependency files (go.sum, package-lock.json) separately before copying full source. Cache breaks only when those files change.
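As a sketch, the same split looks like this in a Go module function; the "go-mod" cache volume name is arbitrary, and WithDirectory is used for the final copy so it does not shadow the downloaded modules:

```go
// Sketch only: dependency-split caching for a Go project.
// Assumes source is the *dagger.Directory passed into the module function.
return dag.Container().
	From("golang:1.22-alpine").
	WithMountedCache("/root/go/pkg/mod", dag.CacheVolume("go-mod")).
	// Copy only the module files first: this layer busts only when
	// go.mod or go.sum changes
	WithFile("/src/go.mod", source.File("go.mod")).
	WithFile("/src/go.sum", source.File("go.sum")).
	WithWorkdir("/src").
	WithExec([]string{"go", "mod", "download"}).
	// Copy the full source afterwards; the download step stays cached
	WithDirectory("/src", source).
	WithExec([]string{"go", "build", "./..."})
```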
Scenario 3: Integrating with GitHub Actions¶
Dagger pipeline needs to run in GitHub Actions CI with proper caching.
# .github/workflows/ci.yml
name: CI
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Dagger CLI
        run: cd /usr/local && curl -L https://dl.dagger.io/dagger/install.sh | sh
      - name: Run pipeline
        run: dagger call build --source=. test publish
        env:
          REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}
      # Optional: Dagger Cloud for distributed caching
      - name: Run pipeline with Dagger Cloud
        run: dagger call build --source=.
        env:
          DAGGER_CLOUD_TOKEN: ${{ secrets.DAGGER_CLOUD_TOKEN }}
# Test locally before pushing
# Dagger runs identically locally and in CI
dagger call build --source=.
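To keep CI and local machines on the same engine version, the install script can be pinned. The DAGGER_VERSION variable is what the installer documents; the version number below is a placeholder to replace:

```yaml
# Hypothetical pinned install step (replace 0.12.0 with your version)
- name: Install Dagger CLI (pinned)
  run: cd /usr/local && curl -L https://dl.dagger.io/dagger/install.sh | DAGGER_VERSION=0.12.0 sh
```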
Scenario 4: Secrets management in pipelines¶
Gotcha: If you pass a secret as a plain string argument (e.g., WithEnvVariable("TOKEN", os.Getenv("TOKEN"))), it appears in the Dagger execution graph and may be visible in logs or cache keys. Always use dag.SetSecret() to wrap secrets; Dagger redacts Secret objects from all output.
Secrets must be passed as Dagger Secret objects, not plain strings.
# Pass secret from environment variable
export GITHUB_TOKEN=ghp_...
dagger call deploy --token=env:GITHUB_TOKEN
# Pass secret from file
dagger call deploy --token=file:/home/user/.github-token
# Pass secret from command output
dagger call deploy --token=cmd:"vault read -field=value secret/github"
# In Go SDK — create a secret in the pipeline
# secret := dag.SetSecret("github-token", os.Getenv("GITHUB_TOKEN"))
# container.WithSecretVariable("GITHUB_TOKEN", secret)
# Verify secret is not leaked in logs
dagger call --debug deploy 2>&1 | grep -i "token\|secret\|password"
# Should show [secret] or similar masking
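The commented SDK lines above can be fleshed out into a full module function. This is a sketch, with the function name, image, and env var name chosen for illustration; when a function takes a *dagger.Secret parameter, the CLI's env:/file:/cmd: syntax builds the Secret for you:

```go
// Sketch: module function with a secret parameter (names are illustrative).
func (m *MyCIModule) Deploy(ctx context.Context, token *dagger.Secret) (string, error) {
	return dag.Container().
		From("alpine:3.19").
		// Secret is injected at runtime and redacted from logs and cache keys
		WithSecretVariable("GITHUB_TOKEN", token).
		WithExec([]string{"sh", "-c", "test -n \"$GITHUB_TOKEN\" && echo authenticated"}).
		Stdout(ctx)
}
```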
Key Patterns¶
Basic Go Module Structure¶
// dagger/main.go
package main

import (
	"context"

	"dagger.io/dagger"
)

type MyCIModule struct{}

func (m *MyCIModule) Build(ctx context.Context, source *dagger.Directory) *dagger.Container {
	return dag.Container().
		From("golang:1.22-alpine").
		WithMountedDirectory("/src", source).
		WithWorkdir("/src").
		WithExec([]string{"go", "build", "./..."})
}

func (m *MyCIModule) Test(ctx context.Context, source *dagger.Directory) (string, error) {
	return dag.Container().
		From("golang:1.22-alpine").
		WithMountedCache("/root/go/pkg/mod", dag.CacheVolume("go-mod")).
		WithMountedDirectory("/src", source).
		WithWorkdir("/src").
		WithExec([]string{"go", "test", "./..."}).
		Stdout(ctx)
}
Python Module Structure¶
# dagger/src/main.py
import dagger
from dagger import dag, function, object_type


@object_type
class MyCIModule:
    @function
    async def build(self, source: dagger.Directory) -> dagger.Container:
        return (
            dag.container()
            .from_("python:3.12-slim")
            .with_mounted_directory("/src", source)
            .with_workdir("/src")
            .with_exec(["pip", "install", "-r", "requirements.txt"])
            .with_exec(["python", "-m", "pytest"])
        )
Caching Strategy¶
// Mount cache volumes before source to maximize cache hits
return dag.Container().
	From("node:20-alpine").
	// Cache npm packages — only busts when package-lock.json changes
	WithMountedCache("/root/.npm", dag.CacheVolume("npm-cache")).
	// Copy only package files first
	WithFile("/src/package.json", source.File("package.json")).
	WithFile("/src/package-lock.json", source.File("package-lock.json")).
	WithWorkdir("/src").
	WithExec([]string{"npm", "ci"}).
	// Now copy the full source with WithDirectory (a mount here would shadow
	// the node_modules installed above; a copy merges on top of it)
	WithDirectory("/src", source).
	WithExec([]string{"npm", "run", "build"})
Multi-Platform Builds¶
# Build for multiple platforms
dagger call build \
--platform=linux/amd64 \
--platform=linux/arm64 \
--source=. \
publish --address=myrepo/myimage:latest
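A hedged sketch of the module side of such a build in Go; the function signature and the dagger.ContainerOpts{Platform: ...} usage are assumptions based on the Go SDK's platform support:

```go
// Sketch: build one container variant per requested --platform flag.
func (m *MyCIModule) Build(ctx context.Context, source *dagger.Directory, platforms []dagger.Platform) []*dagger.Container {
	variants := make([]*dagger.Container, 0, len(platforms))
	for _, platform := range platforms {
		// Select the base image for each target platform
		ctr := dag.Container(dagger.ContainerOpts{Platform: platform}).
			From("golang:1.22-alpine").
			WithDirectory("/src", source).
			WithWorkdir("/src").
			WithExec([]string{"go", "build", "./..."})
		variants = append(variants, ctr)
	}
	return variants
}
```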
GitLab CI Integration¶
# .gitlab-ci.yml
stages:
  - build

dagger-pipeline:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  before_script:
    - apk add curl
    - cd /usr/local && curl -L https://dl.dagger.io/dagger/install.sh | sh
  script:
    - dagger call build --source=. test
  variables:
    DOCKER_HOST: tcp://docker:2376
    DOCKER_TLS_CERTDIR: "/certs"
Checking Engine Health¶
Debug clue: If dagger call hangs for 30+ seconds before doing anything, the engine is likely starting up. The first call after a reboot or a docker rm of the engine container triggers a cold start. Subsequent calls reuse the running engine and are fast.
# Verify Dagger engine is running
docker ps | grep dagger-engine
# Restart if stuck
docker rm -f $(docker ps -q -f name=dagger-engine)
# Next dagger call will start a fresh engine
# Check engine version matches CLI
dagger version
docker inspect $(docker ps -q -f name=dagger-engine) | jq '.[0].Config.Image'