
Docker in CI/CD: The Patterns That Cut My Pipeline Time by 82%

March 12, 2026 · 9 min read
Docker · CI/CD · GitHub Actions · DevOps · Performance · Kubernetes

My CI pipeline used to take 45 minutes. It now takes 8. The biggest wins came from Docker optimization — not faster hardware.

The Problem

Every CI run was:

  1. Pull base image (2 min)
  2. Install OS dependencies (5 min)
  3. Install Python packages (8 min)
  4. Install Node packages (6 min)
  5. Build application (4 min)
  6. Run tests (15 min)
  7. Build production image (5 min)

Total: ~45 minutes. Developers stopped running the full pipeline. Bugs slipped through.

Fix 1: Multi-Stage Builds (45 → 30 min)

# Stage 1: Dependencies (cached aggressively)
FROM python:3.11-slim AS deps
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Stage 2: Test (uses deps cache)
FROM deps AS test
COPY . .
RUN pytest tests/ -v --tb=short

# Stage 3: Production (clean, minimal image)
FROM python:3.11-slim AS production
WORKDIR /app
COPY --from=deps /usr/local/lib/python3.11/site-packages /usr/local/lib/python3.11/site-packages
COPY --from=deps /usr/local/bin /usr/local/bin
COPY . .
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0"]

Why this helps: Dependencies only reinstall when requirements.txt changes. Code changes skip the 8-minute pip install.

Fix 2: Layer Caching in CI (30 → 15 min)

GitHub Actions doesn't cache Docker layers between runs by default. Enable BuildKit's GitHub Actions cache backend; it requires Buildx, so set that up first:

- name: Set up Buildx
  uses: docker/setup-buildx-action@v3

- name: Build and test
  uses: docker/build-push-action@v5
  with:
    context: .
    target: test
    cache-from: type=gha
    cache-to: type=gha,mode=max

mode=max exports every layer from every stage, not just the final image's, so the deps stage stays cached across runs.

Fix 3: Parallel Test Execution (15 → 8 min)

Split the test suite across multiple containers:

strategy:
  matrix:
    test-group: [unit, integration, e2e, security]

steps:
  - name: Run ${{ matrix.test-group }} tests
    run: pytest tests/${{ matrix.test-group }}/ -v --tb=short

Four parallel jobs finishing in 4 minutes each beats one serial job taking 15 minutes.
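Directory-based groups are the simple case; a suite without that layout can be sharded from the collected test list instead. A minimal sketch of round-robin sharding (the test names are illustrative, not from the article's suite):

```python
def shard(tests, num_shards):
    """Round-robin split: shard i takes items i, i+num_shards, i+2*num_shards, ..."""
    return [tests[i::num_shards] for i in range(num_shards)]

tests = [f"test_{i}" for i in range(10)]
groups = shard(tests, 4)
# groups[0] == ["test_0", "test_4", "test_8"]; every test lands in exactly one shard
```

Each CI job then runs pytest on just its shard, so wall-clock time approaches the longest shard rather than the whole suite.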

The .dockerignore That Saves Minutes

.git
node_modules
__pycache__
*.pyc
.env
.pytest_cache
coverage/
dist/
*.md

Without this, Docker copies your entire .git directory (potentially GBs) into the build context. I've seen this add 3-5 minutes to builds.
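A quick way to sanity-check what those patterns exclude is to replay them over a file list. A rough Python approximation (real .dockerignore matching follows Go's filepath rules from the context root; fnmatch's `*` also crosses `/` here, so this sketch over-matches slightly):

```python
from fnmatch import fnmatch

# The article's .dockerignore patterns.
IGNORE = [".git", "node_modules", "__pycache__", "*.pyc", ".env",
          ".pytest_cache", "coverage/", "dist/", "*.md"]

def excluded(path, patterns=IGNORE):
    # Match the full path and its top-level directory against each pattern;
    # trailing slashes on directory patterns are stripped before matching.
    top = path.split("/", 1)[0]
    return any(
        fnmatch(candidate, pat.rstrip("/"))
        for candidate in (path, top)
        for pat in patterns
    )

files = [".git/objects/ab12", "app/main.py", "requirements.txt",
         "README.md", "dist/app.tar", "tests/test_app.py"]
context = [f for f in files if not excluded(f)]
# context keeps app/main.py, requirements.txt, tests/test_app.py
```

Running something like this against your repo's file list shows exactly which heavy directories would otherwise ride along in the build context.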

Results

| Metric | Before | After | Improvement |
| --- | --- | --- | --- |
| Full pipeline | 45 min | 8 min | 82% faster |
| Cache hit rate | 0% | 85% | Deps rarely rebuilt |
| Prod image size | 1.2 GB | 180 MB | 85% smaller |
| Developer adoption | "I'll push and hope" | "I run CI locally" | Priceless |

The 82% reduction wasn't one big fix; it was four patterns stacked together: multi-stage builds, layer caching, parallel tests, and a lean build context. Each one shaved off a chunk.
