
Your Definition of Done Is Probably Incomplete: Here Is How to Fix It

A weak Definition of Done leads to incomplete features, technical debt, and quality issues. Learn how to craft a robust DoD that aligns teams, prevents rework, and ensures every story meets your quality standards.


Related articles: see shifting quality left as the operational expression of a strong DoD, embedding a DoD into the quality culture of a growing team, and metrics that reveal whether your Definition of Done is working.


"Is this story done?"
"Well, the code is written..."
"But is it tested?"
"Umm, kind of..."
"Is it deployed?"
"Not yet..."
"So... is it done?"

This conversation happens in sprint reviews everywhere. The root cause? No clear Definition of Done.

A strong Definition of Done (DoD) is one of the most powerful quality tools in agile development. It creates a shared understanding of "done," prevents incomplete work from accumulating, and ensures every feature meets your team's quality standards before it's called complete.

This guide shows you how to craft a Definition of Done that actually improves quality, not just checks boxes.

What is a Definition of Done?

The Definition of Done is a checklist of criteria that a user story, feature, or increment must meet before it's considered complete. It's a quality gate: a contract between the team and stakeholders about what "done" means.

Why It Matters

| Without DoD | With DoD |
| --- | --- |
| "Done" means different things to different people | Everyone agrees on what "done" means |
| Features declared done but still have bugs | Quality is non-negotiable |
| Technical debt accumulates | Technical quality is part of "done" |
| No documentation, tests, or monitoring | All aspects of quality addressed |
| Surprises in production | Predictable, reliable releases |

DoD at Different Levels

```mermaid
graph TD
    A[Team-Level DoD] --> B[Feature-Level DoD];
    B --> C[Story-Level DoD];
    C --> D[Task-Level DoD];

    A --> E[Applies to: Sprint deliverables];
    B --> F[Applies to: Major features/epics];
    C --> G[Applies to: Individual user stories];
    D --> H[Applies to: Technical tasks];
```

Most teams need at least a Story-Level DoD and optionally a Sprint-Level DoD (what the entire increment must satisfy).

Crafting Your Definition of Done

Step 1: Start with the Basics

Every DoD should include foundational quality practices:

## Story-Level Definition of Done

- [ ] Code written and follows team coding standards
- [ ] Code reviewed and approved by at least one team member
- [ ] Unit tests written with >80% coverage for new code
- [ ] All tests pass (unit, integration, E2E)
- [ ] No critical or high-severity bugs
- [ ] Documentation updated (README, API docs, user guides)
- [ ] Acceptance criteria met and demoed to Product Owner
- [ ] Deployed to staging environment
- [ ] PO acceptance obtained

Step 2: Add Domain-Specific Criteria

Tailor your DoD to your context:

For Backend APIs:

  • API documentation updated (OpenAPI/Swagger)
  • Performance benchmarks met (p95 latency < 200ms)
  • Security review completed for auth changes
  • Database migrations tested and reversible
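
A latency criterion like the one above is only useful if it's computed the same way every time. A minimal Python sketch of a p95 check (the 200ms budget comes from the list above; the function names and sample data are illustrative assumptions):

```python
import statistics

def p95(latencies_ms):
    """Return the 95th-percentile latency from a list of samples (ms)."""
    # statistics.quantiles with n=20 yields cut points at 5% steps;
    # index 18 is the 95th percentile.
    return statistics.quantiles(latencies_ms, n=20)[18]

def meets_latency_budget(latencies_ms, budget_ms=200):
    """True if p95 latency is under the DoD's budget."""
    return p95(latencies_ms) < budget_ms

# Example: 100 samples, mostly fast with a small slow tail
samples = [50] * 98 + [500] * 2
print(meets_latency_budget(samples))  # True
```

Running the same percentile function in CI and in local benchmarks keeps "benchmarks met" from being a judgment call.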

For Frontend Features:

  • Responsive design tested on mobile, tablet, desktop
  • Cross-browser compatibility verified (Chrome, Firefox, Safari, Edge)
  • Accessibility audit passed (WCAG 2.1 AA)
  • Loading states and error handling implemented

For Infrastructure Changes:

  • Changes tested in non-production environment
  • Rollback plan documented and tested
  • Monitoring and alerts configured
  • Runbook updated with troubleshooting steps

Step 3: Include Non-Functional Requirements

Don't forget quality attributes:

## Non-Functional Requirements in DoD

- [ ] Performance: Response time < 2 seconds for 95% of requests
- [ ] Security: No new vulnerabilities introduced (SAST/DAST scans pass)
- [ ] Scalability: Tested with 2x expected load
- [ ] Observability: Logging, metrics, and tracing implemented
- [ ] Reliability: Error rate < 0.1%
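
Thresholds like the error-rate ceiling above are easy to gate on mechanically. A hedged Python sketch (the 0.1% ceiling comes from the checklist; the function names and request counts are illustrative assumptions):

```python
def error_rate(total_requests, failed_requests):
    """Fraction of failed requests, e.g. 0.001 == 0.1%."""
    if total_requests == 0:
        return 0.0
    return failed_requests / total_requests

def meets_reliability_target(total, failed, max_rate=0.001):
    """True if the error rate is below the DoD's 0.1% ceiling."""
    return error_rate(total, failed) < max_rate

print(meets_reliability_target(100_000, 50))   # 0.05% -> True
print(meets_reliability_target(100_000, 200))  # 0.2%  -> False
```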

Example Definitions of Done

Startup (Early Stage)

## Definition of Done

- [ ] Code written and pushed to main branch
- [ ] Manually tested in local environment
- [ ] Demoed to founder/product lead
- [ ] Deployed to production
- [ ] No obvious bugs

Why it's minimal: Early-stage startups prioritize speed to market. As the team grows, add rigor.

Enterprise (Mature Product)

## Definition of Done

**Code Quality**

- [ ] Code adheres to style guide (linter passes)
- [ ] Code reviewed by 2 engineers (1 senior)
- [ ] Unit test coverage >85%
- [ ] Integration tests cover main scenarios
- [ ] E2E tests updated for new user flows

**Security & Compliance**

- [ ] SAST/DAST scans pass (no high/critical findings)
- [ ] Dependencies updated to non-vulnerable versions
- [ ] PII handling reviewed for GDPR/CCPA compliance
- [ ] Security team sign-off for auth/payment changes

**Documentation**

- [ ] API documentation updated (OpenAPI)
- [ ] User-facing docs updated (Help Center)
- [ ] Changelog entry added
- [ ] Architecture decision record (ADR) created if applicable

**Testing & Quality**

- [ ] All acceptance criteria met
- [ ] Tested in staging environment
- [ ] Cross-browser tested (latest 2 versions: Chrome, Firefox, Safari, Edge)
- [ ] Mobile responsive (320px - 1920px)
- [ ] Accessibility audit (axe DevTools, no violations)
- [ ] Performance tested (Lighthouse score >90)

**Deployment & Monitoring**

- [ ] Feature flag configured (if applicable)
- [ ] Deployed to staging via CI/CD
- [ ] Smoke tests pass in staging
- [ ] Monitoring dashboards updated
- [ ] Alerts configured for error rates/latency
- [ ] Rollback plan documented

**Product Sign-Off**

- [ ] Product Owner reviewed and accepted
- [ ] UX Designer reviewed (for UI changes)
- [ ] Customer success team notified (for user-facing changes)

Why it's comprehensive: Mature products have more stakeholders, compliance requirements, and risk intolerance.

DoD vs. Acceptance Criteria

They're related but different:

| Aspect | Definition of Done | Acceptance Criteria |
| --- | --- | --- |
| Scope | Applies to all stories | Specific to one story |
| Purpose | Quality gate for "done" | Functional requirements for the story |
| Set by | Team (collaborative) | Product Owner |
| Changes | Rarely (quarterly reviews) | Per story |
| Example | "Code reviewed, tests pass" | "User can filter products by price range" |

Example in Practice

User Story: "As a customer, I want to filter products by price so I can find items in my budget."

Acceptance Criteria (story-specific):

  • Price range slider on products page
  • Min/max price inputs with validation
  • Filters apply immediately without page reload
  • URL updates with price parameters
  • Works with other filters (category, brand)

Definition of Done (applies to all stories):

  • Code reviewed
  • Unit + E2E tests written
  • Cross-browser tested
  • Deployed to staging
  • Product Owner approved

Common DoD Pitfalls

1. Too Vague

❌ Bad: "Code is tested"
✅ Good: "Unit tests written with >80% coverage, E2E tests cover main flow, all tests pass in CI"

2. Too Prescriptive

❌ Bad: "Every function must have a JSDoc comment with @param and @returns"
✅ Good: "Public APIs are documented"

Why: The first approach wastes time on low-value documentation. The second focuses on what matters (external interfaces).

3. Not Measurable

❌ Bad: "Performance is good"
✅ Good: "Page load time < 2 seconds (p95), Lighthouse score > 90"

4. Ignoring Rework

If your DoD doesn't prevent production bugs, it's too weak. Track:

  • Escaped defects: Bugs found in production that should have been caught
  • Rework rate: Stories reopened after being marked "done"

If either metric is high, strengthen your DoD.
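
Both signals are simple ratios worth tracking sprint over sprint. A minimal Python sketch (the sprint numbers are purely illustrative, not from the article):

```python
def escaped_defect_rate(bugs_found_in_prod, total_bugs_found):
    """Share of all bugs that escaped to production."""
    if total_bugs_found == 0:
        return 0.0
    return bugs_found_in_prod / total_bugs_found

def rework_rate(stories_reopened, stories_done):
    """Share of 'done' stories that were later reopened."""
    if stories_done == 0:
        return 0.0
    return stories_reopened / stories_done

# Illustrative sprint numbers
print(f"Escaped defects: {escaped_defect_rate(4, 20):.0%}")  # 20%
print(f"Rework rate:     {rework_rate(3, 25):.0%}")          # 12%
```

A rising trend in either number is the cue to strengthen the DoD at the next quarterly review.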

Evolving Your DoD Over Time

Your DoD should mature with your team and product.

Quarterly DoD Retrospective

Ask:

  1. What bugs escaped to production? Do we need new DoD criteria to catch these earlier?
  2. What slowed us down? Are any DoD criteria overkill? (Rare, but possible)
  3. What best practices emerged? Should we standardize them in the DoD?
  4. What new risks do we face? (New compliance requirements, scale issues, etc.)

Signs Your DoD Needs Updating

  • Production bugs are increasing: DoD too weak
  • Velocity is dropping without quality improving: DoD too burdensome
  • Team debates whether stories are "done": DoD not clear enough
  • New technology/process adopted: DoD doesn't cover it

Enforcing the Definition of Done

A DoD is only valuable if it's followed. Make it hard to ignore:

1. Tool Integration

```yaml
# GitHub Actions: Enforce DoD checklist
name: DoD Check
on: pull_request

jobs:
  check-dod:
    runs-on: ubuntu-latest
    steps:
      - name: Check PR description for DoD checklist
        env:
          PR_BODY: ${{ github.event.pull_request.body }}
        run: |
          if ! grep -q "\[x\] Code reviewed" <<< "$PR_BODY"; then
            echo "::error::DoD checklist not completed"
            exit 1
          fi
```

2. Pull Request Templates

## Definition of Done Checklist

- [ ] Code follows style guide (linter passes)
- [ ] Code reviewed by at least one team member
- [ ] Unit tests written (>80% coverage)
- [ ] E2E tests updated
- [ ] All tests pass in CI
- [ ] Documentation updated
- [ ] Deployed to staging and smoke tested
- [ ] Acceptance criteria met and demoed

## Acceptance Criteria

- [ ] [Criterion 1 from story]
- [ ] [Criterion 2 from story]
      ...
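
A template like this can also be checked mechanically before merge. A hedged Python sketch that scans a PR body for unchecked DoD items; the `- [ ]` / `- [x]` markers follow GitHub's task-list syntax, and the exact section heading is an assumption:

```python
import re

def unchecked_dod_items(pr_body: str) -> list[str]:
    """Return DoD checklist items still marked '- [ ]' in a PR body."""
    in_dod = False
    missing = []
    for line in pr_body.splitlines():
        if line.startswith("## "):
            # Only inspect the Definition of Done section, not acceptance criteria
            in_dod = line.strip() == "## Definition of Done Checklist"
            continue
        if in_dod:
            m = re.match(r"- \[( |x)\] (.+)", line.strip())
            if m and m.group(1) == " ":
                missing.append(m.group(2))
    return missing

body = """## Definition of Done Checklist
- [x] Code reviewed by at least one team member
- [ ] E2E tests updated

## Acceptance Criteria
- [ ] User can filter by price
"""
print(unchecked_dod_items(body))  # ['E2E tests updated']
```

The same logic could back the CI check shown earlier, reporting exactly which items are missing instead of failing on a single grep.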

3. Sprint Review Protocol

  • Show the DoD: Display it on-screen during demo
  • Walk through it: Tester or developer confirms each item
  • Don't accept incomplete work: If DoD isn't met, story isn't "done"

Benefits of a Strong DoD

| Benefit | Impact |
| --- | --- |
| Shared understanding | Eliminates ambiguity about "done" |
| Quality consistency | Every story meets the same standards |
| Prevents technical debt | Quality is enforced, not deferred |
| Predictable velocity | "Done" means truly done, no surprises |
| Reduced rework | Fewer bugs escape to production |
| Better estimates | DoD is factored into story estimation |
| Team confidence | Everyone knows the bar for quality |

Conclusion

A Definition of Done is more than a checklist: it's a quality philosophy, codified. It aligns your team on what "done" means, prevents incomplete work from piling up, and ensures every feature meets your standards before it ships.

Start simple: code review, tests, and product owner approval. Evolve from there based on your team's needs, pain points, and maturity. Review it quarterly, enforce it consistently, and watch your quality improve.

"Done" isn't when the code is written. It's when the checklist is complete.

Ready to build a culture of quality? Sign up for ScanlyApp and integrate systematic testing into your development process.
