# Test Environment Management: End the "It Works on My Machine" Problem for Good
"It works on my machine" is the bane of software testing. Environment differences—subtle variations in configuration, dependencies, or data—cause tests to pass locally but fail in CI, or work in staging but break in production. This comprehensive guide provides battle-tested strategies for managing test environments reliably.
## The Environment Hierarchy
Modern applications typically have multiple environments, each serving different purposes.
```mermaid
graph TD
    A[Local Development] --> B[CI/CD]
    B --> C[Dev Environment]
    C --> D[QA/Test Environment]
    D --> E[Staging]
    E --> F[Production]

    style A fill:#6bcf7f
    style B fill:#95e1d3
    style C fill:#ffd93d
    style D fill:#ff9a76
    style E fill:#ff6b6b
    style F fill:#a78bfa
```
### Environment Characteristics Matrix
| Environment | Purpose | Data | Uptime | Cost | Deploy Frequency |
|---|---|---|---|---|---|
| Local | Development & unit tests | Mocked/minimal | Variable | Low | Continuous |
| CI | Automated tests | Synthetic/seeded | Per-run | Low | Every commit |
| Dev | Integration testing | Synthetic | 95% | Low | Multiple/day |
| QA | Manual & automated testing | Realistic | 99% | Medium | Daily |
| Staging | Pre-production validation | Production-like | 99.5% | High | Per release |
| Production | Live users | Real | 99.9%+ | High | Controlled |
## Core Principles of Environment Management

### 1. Environment Parity
Environments should be as similar as possible, especially staging and production.
The 12-Factor App guidelines:
- Dev/prod parity: Minimize differences
- Configuration: Store config in environment variables
- Dependencies: Explicitly declare and isolate
- Backing services: Treat as attached resources
```typescript
// ❌ BAD: Environment-specific code
function getDatabaseConnection() {
  if (process.env.NODE_ENV === 'production') {
    return new PostgresConnection({
      host: 'prod-db.example.com',
      port: 5432,
      ssl: true,
    });
  } else if (process.env.NODE_ENV === 'staging') {
    return new PostgresConnection({
      host: 'staging-db.example.com',
      port: 5432,
      ssl: true,
    });
  } else {
    return new SQLiteConnection({
      filename: './dev.db',
    });
  }
}

// ✅ GOOD: Configuration-driven, same code path
function getDatabaseConnection() {
  return new PostgresConnection({
    host: process.env.DATABASE_HOST!,
    port: parseInt(process.env.DATABASE_PORT!, 10),
    database: process.env.DATABASE_NAME!,
    username: process.env.DATABASE_USER!,
    password: process.env.DATABASE_PASSWORD!,
    ssl: process.env.DATABASE_SSL === 'true',
  });
}
```
```bash
# .env.local
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_NAME=myapp_dev
DATABASE_USER=dev
DATABASE_PASSWORD=dev123
DATABASE_SSL=false

# .env.staging
DATABASE_HOST=staging-db.example.com
DATABASE_PORT=5432
DATABASE_NAME=myapp_staging
DATABASE_USER=staging_user
DATABASE_PASSWORD=${STAGING_DB_PASSWORD} # From secrets
DATABASE_SSL=true

# .env.production
DATABASE_HOST=prod-db.example.com
DATABASE_PORT=5432
DATABASE_NAME=myapp_prod
DATABASE_USER=prod_user
DATABASE_PASSWORD=${PRODUCTION_DB_PASSWORD} # From secrets
DATABASE_SSL=true
```
### 2. Infrastructure as Code (IaC)
Define environments in code for reproducibility.
```yaml
# docker-compose.test.yml
version: '3.8'

services:
  app:
    build:
      context: .
      target: test
    environment:
      - NODE_ENV=test
      - DATABASE_URL=postgresql://test:test@postgres:5432/test_db
      - REDIS_URL=redis://redis:6379
      - API_BASE_URL=http://app:3000
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_healthy
    command: npm run test:integration

  postgres:
    image: postgres:16-alpine
    environment:
      - POSTGRES_USER=test
      - POSTGRES_PASSWORD=test
      - POSTGRES_DB=test_db
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U test']
      interval: 5s
      timeout: 5s
      retries: 5
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./scripts/db/seed-test-data.sql:/docker-entrypoint-initdb.d/01-seed.sql

  redis:
    image: redis:7-alpine
    healthcheck:
      test: ['CMD', 'redis-cli', 'ping']
      interval: 5s
      timeout: 3s
      retries: 5
    volumes:
      - redis_data:/data

  mailhog:
    image: mailhog/mailhog:latest
    ports:
      - '1025:1025' # SMTP
      - '8025:8025' # Web UI

volumes:
  postgres_data:
  redis_data:
```
```bash
# Start test environment
docker-compose -f docker-compose.test.yml up -d

# Check that services are healthy
docker-compose -f docker-compose.test.yml ps

# Run tests
docker-compose -f docker-compose.test.yml run app npm test

# Tear down
docker-compose -f docker-compose.test.yml down -v
```
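Note that `docker-compose ps` only reports health status; it does not block. If your test setup code needs to wait until a service actually accepts connections, a small TCP probe works for any service. A minimal sketch (the port, host, and timeout values are illustrative):

```typescript
import net from 'node:net';

// Resolve once a TCP port accepts connections; reject after a deadline.
// Retries every 500ms, which is fine for test setup.
export function waitForPort(
  port: number,
  host = '127.0.0.1',
  timeoutMs = 30_000,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  return new Promise((resolve, reject) => {
    const attempt = (): void => {
      const socket = net.connect({ port, host });
      socket.once('connect', () => {
        socket.destroy();
        resolve();
      });
      socket.once('error', () => {
        socket.destroy();
        if (Date.now() >= deadline) {
          reject(new Error(`port ${host}:${port} not ready after ${timeoutMs}ms`));
        } else {
          setTimeout(attempt, 500);
        }
      });
    };
    attempt();
  });
}

// Usage in setup code: await waitForPort(5432) before running migrations.
```

Probing the port is cruder than the container healthcheck (Postgres can accept TCP connections slightly before it is query-ready), so treat it as a complement to `pg_isready`, not a replacement.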
### 3. Isolated and Ephemeral Environments
Tests should not interfere with each other or leave residual state.
```typescript
// tests/setup/test-environment.ts
import { exec as execCallback } from 'child_process';
import { promisify } from 'util';

const exec = promisify(execCallback);

export class TestEnvironment {
  private static instanceCounter = 0;
  private containerId?: string;

  async setup(): Promise<void> {
    const instanceId = ++TestEnvironment.instanceCounter;

    // Create isolated database
    this.containerId = await this.createIsolatedDatabase(instanceId);

    // Set environment variables
    process.env.DATABASE_URL = `postgresql://test:test@localhost:${5432 + instanceId}/test_${instanceId}`;
    process.env.REDIS_URL = `redis://localhost:${6379 + instanceId}`;

    // Run migrations
    await this.runMigrations();

    // Seed data
    await this.seedData();
  }

  async teardown(): Promise<void> {
    // Clean up the database container
    if (this.containerId) {
      await exec(`docker stop ${this.containerId}`);
      await exec(`docker rm ${this.containerId}`);
    }
  }

  private async createIsolatedDatabase(instanceId: number): Promise<string> {
    const result = await exec(`
      docker run -d \
        --name test-db-${instanceId} \
        -e POSTGRES_USER=test \
        -e POSTGRES_PASSWORD=test \
        -e POSTGRES_DB=test_${instanceId} \
        -p ${5432 + instanceId}:5432 \
        postgres:16-alpine
    `);

    // Wait for the database to be ready
    await this.waitForDatabase(instanceId);

    return result.stdout.trim();
  }

  private async waitForDatabase(instanceId: number, maxAttempts = 30): Promise<void> {
    for (let i = 0; i < maxAttempts; i++) {
      try {
        await exec(`docker exec test-db-${instanceId} pg_isready -U test`);
        return;
      } catch {
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
    }
    throw new Error('Database failed to start');
  }

  private async runMigrations(): Promise<void> {
    await exec('npm run db:migrate');
  }

  private async seedData(): Promise<void> {
    // Insert baseline test data (db is your test database client)
    await db.users.insert({
      id: 'test-user-1',
      email: 'test@example.com',
      name: 'Test User',
    });
  }
}
```
```typescript
// vitest.config.ts
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    globalSetup: './tests/setup/global-setup.ts',
    setupFiles: ['./tests/setup/test-environment.ts'],
    // Run tests serially to avoid port conflicts
    // OR: Use random ports for full parallelization
    testTimeout: 60000,
  },
});
```
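For the "random ports" route, one hypothetical scheme derives each worker's ports deterministically from Vitest's `VITEST_WORKER_ID` so parallel workers never collide on the host. The base port numbers below are arbitrary:

```typescript
// tests/setup/worker-ports.ts (hypothetical helper)
// Base host ports chosen to avoid the services' default ports.
const BASE_PORTS = { postgres: 15432, redis: 16379 } as const;

// Map a zero-based worker index to a unique port per service.
export function portsForWorker(workerIndex: number): { postgres: number; redis: number } {
  if (!Number.isInteger(workerIndex) || workerIndex < 0) {
    throw new Error(`invalid worker index: ${workerIndex}`);
  }
  return {
    postgres: BASE_PORTS.postgres + workerIndex,
    redis: BASE_PORTS.redis + workerIndex,
  };
}

// Vitest sets VITEST_WORKER_ID starting at 1; fall back to worker 0 outside Vitest.
const workerIndex = Number(process.env.VITEST_WORKER_ID ?? '1') - 1;
export const ports = portsForWorker(workerIndex);
```

Each worker then starts its containers on `ports.postgres` and `ports.redis`, which removes the need to serialize the suite.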
## Environment Configuration Management

### Strategy 1: .env Files with Validation
```typescript
// src/config/env.ts
import { z } from 'zod';
import dotenv from 'dotenv';

// Load environment variables
dotenv.config({ path: `.env.${process.env.NODE_ENV}` });

// Define schema
const envSchema = z.object({
  NODE_ENV: z.enum(['development', 'test', 'staging', 'production']),

  // Server
  PORT: z.string().transform(Number).pipe(z.number().min(1).max(65535)),
  HOST: z.string().default('localhost'),

  // Database
  DATABASE_URL: z.string().url(),
  DATABASE_POOL_MIN: z.string().transform(Number).default('2'),
  DATABASE_POOL_MAX: z.string().transform(Number).default('10'),

  // Redis
  REDIS_URL: z.string().url(),

  // External Services
  API_KEY: z.string().min(1),
  API_SECRET: z.string().min(1),

  // Feature Flags
  ENABLE_ANALYTICS: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('false'),
  ENABLE_DEBUG_LOGGING: z
    .enum(['true', 'false'])
    .transform((v) => v === 'true')
    .default('false'),
});

// Validate and export
export type Env = z.infer<typeof envSchema>;

function validateEnv(): Env {
  try {
    return envSchema.parse(process.env);
  } catch (error) {
    if (error instanceof z.ZodError) {
      console.error('❌ Invalid environment variables:');
      console.error(error.errors.map((e) => `  - ${e.path.join('.')}: ${e.message}`).join('\n'));
    }
    process.exit(1);
  }
}

export const env = validateEnv();

// Usage in application
import { env } from './config/env';

const server = app.listen(env.PORT, env.HOST, () => {
  console.log(`Server running on ${env.HOST}:${env.PORT}`);
});
```
### Strategy 2: Secrets Management
```typescript
// src/config/secrets.ts
import { SecretsManagerClient, GetSecretValueCommand } from '@aws-sdk/client-secrets-manager';
// VaultClient here stands in for whichever Vault client library you use

interface SecretsProvider {
  get(key: string): Promise<string>;
}

class LocalSecretsProvider implements SecretsProvider {
  async get(key: string): Promise<string> {
    // For local development, use .env
    return process.env[key] || '';
  }
}

class VaultSecretsProvider implements SecretsProvider {
  private client: VaultClient;

  constructor(vaultUrl: string, token: string) {
    this.client = new VaultClient({ url: vaultUrl, token });
  }

  async get(key: string): Promise<string> {
    const result = await this.client.read(`secret/data/${key}`);
    return result.data.data.value;
  }
}

class AWSSecretsProvider implements SecretsProvider {
  private client: SecretsManagerClient;

  constructor(region: string) {
    this.client = new SecretsManagerClient({ region });
  }

  async get(key: string): Promise<string> {
    const command = new GetSecretValueCommand({ SecretId: key });
    const result = await this.client.send(command);
    return result.SecretString || '';
  }
}

// Factory
function createSecretsProvider(): SecretsProvider {
  switch (process.env.NODE_ENV) {
    case 'production':
      return new AWSSecretsProvider(process.env.AWS_REGION!);
    case 'staging':
      return new VaultSecretsProvider(process.env.VAULT_URL!, process.env.VAULT_TOKEN!);
    default:
      return new LocalSecretsProvider();
  }
}

export const secrets = createSecretsProvider();

// Usage
const apiKey = await secrets.get('STRIPE_API_KEY');
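Remote secret lookups add latency and are often rate-limited, so it is common to wrap whichever provider the factory returns in a small in-memory cache. A sketch, reusing the `SecretsProvider` interface defined above (redeclared here so the snippet stands alone):

```typescript
interface SecretsProvider {
  get(key: string): Promise<string>;
}

// Cache successful lookups so repeated get() calls hit the backing
// provider only once per key for the process lifetime.
class CachingSecretsProvider implements SecretsProvider {
  private cache = new Map<string, string>();

  constructor(private readonly inner: SecretsProvider) {}

  async get(key: string): Promise<string> {
    const cached = this.cache.get(key);
    if (cached !== undefined) return cached;
    const value = await this.inner.get(key);
    this.cache.set(key, value);
    return value;
  }
}

// Usage: const secrets = new CachingSecretsProvider(createSecretsProvider());
```

If your secrets rotate, add a TTL to each entry instead of caching forever; the no-TTL version above assumes secrets are stable for the life of a test run.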
## Test Data Management

### Strategy 1: Database Seeding
```sql
-- scripts/db/seed-test-data.sql

-- Clean existing data
TRUNCATE TABLE users, projects, scans CASCADE;

-- Seed users
INSERT INTO users (id, email, name, role, created_at) VALUES
  ('user-1', 'admin@test.com', 'Admin User', 'admin', NOW()),
  ('user-2', 'user@test.com', 'Regular User', 'user', NOW()),
  ('user-3', 'viewer@test.com', 'Viewer User', 'viewer', NOW());

-- Seed projects
INSERT INTO projects (id, name, user_id, created_at) VALUES
  ('proj-1', 'Test Project 1', 'user-1', NOW()),
  ('proj-2', 'Test Project 2', 'user-2', NOW());

-- Seed scans
INSERT INTO scans (id, project_id, url, status, created_at) VALUES
  ('scan-1', 'proj-1', 'https://example.com', 'completed', NOW() - INTERVAL '1 day'),
  ('scan-2', 'proj-1', 'https://example.com', 'running', NOW() - INTERVAL '5 minutes'),
  ('scan-3', 'proj-2', 'https://test.com', 'completed', NOW() - INTERVAL '2 days');
```
```typescript
// tests/helpers/database.ts
import fs from 'fs/promises';

export class TestDatabase {
  async reset(): Promise<void> {
    // Drop all data
    await db.query('TRUNCATE TABLE users, projects, scans CASCADE');
  }

  async seed(scenario: string): Promise<void> {
    const seedFile = `./tests/fixtures/${scenario}.sql`;
    const sql = await fs.readFile(seedFile, 'utf8');
    await db.query(sql);
  }

  async seedFromFactory(factories: TestDataFactory[]): Promise<void> {
    for (const factory of factories) {
      await factory.create();
    }
  }
}

// Usage in tests
let testDb: TestDatabase;

beforeEach(async () => {
  testDb = new TestDatabase();
  await testDb.reset();
  await testDb.seed('basic-users');
});

test('user can create project', async () => {
  const user = await db.users.findOne({ email: 'user@test.com' });
  // ... test logic
});
```
### Strategy 2: Test Data Builders with Factories
```typescript
// tests/factories/user.factory.ts
export class UserFactory {
  private overrides: Partial<User> = {};

  withEmail(email: string): this {
    this.overrides.email = email;
    return this;
  }

  withRole(role: UserRole): this {
    this.overrides.role = role;
    return this;
  }

  asAdmin(): this {
    this.overrides.role = 'admin';
    return this;
  }

  build(): User {
    return {
      id: `user-${Date.now()}-${Math.random()}`,
      email: this.overrides.email || `test-${Date.now()}@example.com`,
      name: this.overrides.name || 'Test User',
      role: this.overrides.role || 'user',
      createdAt: new Date(),
      ...this.overrides,
    };
  }

  async create(): Promise<User> {
    const user = this.build();
    await db.users.insert(user);
    return user;
  }

  // Create multiple users at once
  async createMany(count: number): Promise<User[]> {
    const users = Array.from({ length: count }, () => this.build());
    await db.users.insertMany(users);
    return users;
  }
}
```

```typescript
// tests/factories/project.factory.ts
export class ProjectFactory {
  private overrides: Partial<Project> = {};

  forUser(userId: string): this {
    this.overrides.userId = userId;
    return this;
  }

  withName(name: string): this {
    this.overrides.name = name;
    return this;
  }

  async create(): Promise<Project> {
    // If no user specified, create one
    if (!this.overrides.userId) {
      const user = await new UserFactory().create();
      this.overrides.userId = user.id;
    }

    const project = {
      id: `proj-${Date.now()}-${Math.random()}`,
      name: this.overrides.name || `Test Project ${Date.now()}`,
      userId: this.overrides.userId,
      createdAt: new Date(),
      ...this.overrides,
    };

    await db.projects.insert(project);
    return project;
  }
}
```

```typescript
// Usage in tests
test('user can view their projects', async () => {
  // Create user
  const user = await new UserFactory().withEmail('user@example.com').create();

  // Create projects for this user
  const project1 = await new ProjectFactory().forUser(user.id).withName('Project 1').create();
  const project2 = await new ProjectFactory().forUser(user.id).withName('Project 2').create();

  // Create a project for a different user (shouldn't be visible)
  const otherUser = await new UserFactory().create();
  await new ProjectFactory().forUser(otherUser.id).create();

  // Test
  const response = await api.get('/api/projects', {
    headers: { Authorization: `Bearer ${user.token}` },
  });

  expect(response.data.length).toBe(2);
  expect(response.data.map((p) => p.id)).toContain(project1.id);
  expect(response.data.map((p) => p.id)).toContain(project2.id);
});
```
## CI/CD Environment Integration
```yaml
# .github/workflows/integration-tests.yml
name: Integration Tests

on:
  push:
    branches: [main, develop]
  pull_request:

jobs:
  integration-tests:
    runs-on: ubuntu-latest

    # Service containers
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test_db
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 5432:5432
      redis:
        image: redis:7
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 6379:6379

    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '22'
          cache: 'npm'
      - run: npm ci

      # Set up test environment
      - name: Set up test database
        env:
          DATABASE_URL: postgresql://postgres:postgres@localhost:5432/test_db
        run: |
          npm run db:migrate
          npm run db:seed:test

      # Run integration tests
      - name: Run tests
        env:
          NODE_ENV: test
          DATABASE_URL: postgresql://postgres:postgres@localhost:5432/test_db
          REDIS_URL: redis://localhost:6379
          API_KEY: ${{ secrets.TEST_API_KEY }}
        run: npm run test:integration

      # Upload results
      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: test-results
          path: test-results/
```
## Environment Debugging Strategies
```typescript
// scripts/diagnose-environment.ts
interface EnvironmentDiagnostics {
  environment: string;
  timestamp: Date;
  services: ServiceStatus[];
  configuration: ConfigStatus[];
  issues: EnvironmentIssue[];
}

interface ServiceStatus {
  name: string;
  status: 'healthy' | 'unhealthy' | 'unknown';
  responseTime?: number;
  message?: string;
}

interface ConfigStatus {
  key: string;
  status: 'present' | 'missing';
  valuePreview?: string;
}

interface EnvironmentIssue {
  severity: 'critical' | 'high' | 'medium' | 'low';
  component: string;
  message: string;
}

async function diagnoseEnvironment(): Promise<EnvironmentDiagnostics> {
  const diagnostics: EnvironmentDiagnostics = {
    environment: process.env.NODE_ENV || 'unknown',
    timestamp: new Date(),
    services: [],
    configuration: [],
    issues: [],
  };

  // Check database
  try {
    const start = Date.now();
    await db.query('SELECT 1');
    diagnostics.services.push({
      name: 'database',
      status: 'healthy',
      responseTime: Date.now() - start,
    });
  } catch (error) {
    const message = (error as Error).message;
    diagnostics.services.push({
      name: 'database',
      status: 'unhealthy',
      message,
    });
    diagnostics.issues.push({
      severity: 'critical',
      component: 'database',
      message: `Database connection failed: ${message}`,
    });
  }

  // Check Redis
  try {
    const start = Date.now();
    await redis.ping();
    diagnostics.services.push({
      name: 'redis',
      status: 'healthy',
      responseTime: Date.now() - start,
    });
  } catch (error) {
    const message = (error as Error).message;
    diagnostics.services.push({
      name: 'redis',
      status: 'unhealthy',
      message,
    });
    diagnostics.issues.push({
      severity: 'high',
      component: 'redis',
      message: `Redis connection failed: ${message}`,
    });
  }

  // Verify configuration
  const requiredEnvVars = ['DATABASE_URL', 'REDIS_URL', 'API_KEY', 'API_SECRET'];
  for (const envVar of requiredEnvVars) {
    if (!process.env[envVar]) {
      diagnostics.issues.push({
        severity: 'critical',
        component: 'configuration',
        message: `Missing required environment variable: ${envVar}`,
      });
    } else {
      diagnostics.configuration.push({
        key: envVar,
        status: 'present',
        valuePreview: process.env[envVar]!.substring(0, 10) + '...',
      });
    }
  }

  return diagnostics;
}

// Run diagnostics
diagnoseEnvironment().then((diagnostics) => {
  console.log('\n🔍 Environment Diagnostics\n');
  console.log(`Environment: ${diagnostics.environment}`);
  console.log(`Timestamp: ${diagnostics.timestamp.toISOString()}\n`);

  console.log('Services:');
  for (const service of diagnostics.services) {
    const emoji = service.status === 'healthy' ? '✅' : '❌';
    console.log(
      `  ${emoji} ${service.name}: ${service.status}${service.responseTime ? ` (${service.responseTime}ms)` : ''}`,
    );
  }

  if (diagnostics.issues.length > 0) {
    console.log('\n⚠️ Issues Found:');
    for (const issue of diagnostics.issues) {
      console.log(`  [${issue.severity.toUpperCase()}] ${issue.component}: ${issue.message}`);
    }
  } else {
    console.log('\n✅ No issues found!');
  }
});
```
## Best Practices Summary
| Practice | Implementation | Benefit |
|---|---|---|
| Environment Parity | Same services, config-driven | Reduces "works on my machine" issues |
| IaC | Docker Compose, Terraform | Reproducible environments |
| Isolated Environments | Per-test databases | No test interference |
| Config Validation | Zod schemas | Early error detection |
| Secrets Management | Vault, AWS Secrets Manager | Security |
| Test Data Factories | Builder pattern | Flexible, maintainable test data |
| Health Checks | Automated service verification | Fast failure detection |
| Diagnostics | Environment inspection scripts | Quick troubleshooting |
## Conclusion: Stability Through Consistency
Reliable test environments require:
- Parity: Minimize differences between environments
- Automation: Code your infrastructure
- Isolation: Tests don't interfere
- Validation: Catch configuration errors early
- Observability: Know your environment's state
With these practices, "it works on my machine" becomes "it works everywhere."
## Build Reliable Test Environments with ScanlyApp
ScanlyApp provides comprehensive environment management features including configuration validation, service health monitoring, and test data management.
Start Your Free Trial and eliminate environment-related test failures.
Related articles: Also see Docker as the foundation for reliable, repeatable test environments, ephemeral Kubernetes environments as the scalable evolution of test env management, and managing test data alongside the environments it lives in.
