Test management with CLI

While the Keystone CLI primarily handles local test execution and recording tunnels, test management itself happens in Keystone Studio. The CLI still provides useful capabilities for working with tests in your development workflow.

Local test execution

Running individual tests

# Execute a specific test locally
# (Tests are managed in Keystone Studio, executed via CLI runner)

# Start runner to receive test execution commands
keystone start --proxy --api-key your-api-key

# Tests are triggered from Studio and executed locally
# Full visibility into test execution
# Real-time debugging and inspection

Development testing workflow

# 1. Start your development environment
npm run dev

# 2. Start Keystone runner
keystone start --proxy --debug

# 3. From Keystone Studio:
#    - Select tests to run against localhost
#    - Execute tests with full local visibility
#    - Debug failures with local browser tools
#    - Iterate quickly on test fixes
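
If you prefer a single command for this loop, a small wrapper script can keep the dev server and the runner tied together. This is a sketch: the script name is illustrative, it assumes your app starts with npm run dev, and stopping the script also stops the dev server.

#!/bin/bash
# dev-test.sh (illustrative name): start the app and the Keystone runner together
npm run dev &                               # app in the background
DEV_PID=$!
trap 'kill "$DEV_PID" 2>/dev/null' EXIT     # stop the dev server when the runner exits

keystone start --proxy --debug              # foreground: tests triggered from Studio run here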

Test file provisioning

Working with test files

The CLI can provision test files for local development:
# Set test files directory
export TEST_FILES_DIR="./tests"

# Runner provisions test files from cloud
# Files are synced automatically during execution
# Local files are used for debugging and inspection
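
Once files have been synced, ordinary shell tools are enough to inspect them locally. The paths below follow the directory layout shown in the next section; jq is optional and only used for pretty-printing.

# List the test definitions the runner has synced
find "$TEST_FILES_DIR" -name '*.json' | sort

# Pretty-print a single definition for inspection (requires jq)
jq . "$TEST_FILES_DIR/auth/login.json"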

File structure

tests/
├── auth/
│   ├── login.json          # Test definitions
│   ├── registration.json
│   └── password-reset.json
├── e-commerce/
│   ├── checkout.json
│   ├── product-search.json
│   └── cart-management.json
└── admin/
    ├── user-management.json
    └── analytics.json
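
If you want to create this layout before the first sync, a single command is enough:

# Scaffold the directories shown above
mkdir -p tests/{auth,e-commerce,admin}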

Environment-specific testing

Testing against different environments

# Local development
export APP_ENV="development"
export BASE_URL="http://localhost:3000"
keystone start --proxy

# Staging environment
export APP_ENV="staging"
export BASE_URL="https://staging.example.com"
keystone start --proxy

# Feature branch testing
git checkout feature/new-dashboard
npm run dev
export BASE_URL="http://localhost:3000"
keystone start --proxy

Environment configuration

# .env.development
KEYSTONE_ENVIRONMENT=development
BASE_URL=http://localhost:3000
API_URL=http://localhost:8000
DATABASE_URL=postgresql://localhost:5432/myapp_dev

# .env.staging
KEYSTONE_ENVIRONMENT=staging
BASE_URL=https://staging.example.com
API_URL=https://api-staging.example.com
DATABASE_URL=postgresql://staging-db:5432/myapp
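
A small wrapper can load the matching .env file and start the runner in one step. The script name and argument handling are illustrative; it assumes the .env files follow the naming shown above.

#!/bin/bash
# run-env.sh (illustrative name)
# Usage: ./run-env.sh staging
ENV_NAME="${1:-development}"

set -a                      # export every variable sourced below
source ".env.${ENV_NAME}"
set +a

keystone start --proxy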

Test organization strategies

Project structure integration

Organize tests to match your application structure:
# Frontend tests
tests/frontend/
├── components/
├── pages/
└── workflows/

# API tests  
tests/api/
├── auth/
├── users/
└── orders/

# Integration tests
tests/integration/
├── auth-flow/
├── purchase-flow/
└── admin-flow/

Branch-based testing

# Feature branch workflow
git checkout -b feature/user-profiles

# Record tests specific to this feature
keystone start --proxy
# Record in Studio against localhost

# Tests are saved with branch context
# Can be re-run when feature is deployed
# Helps validate feature before merge
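
To keep track of which branch a recording session belongs to, you can surface the branch name before starting the runner. Storing the branch context is handled by Studio, so this snippet only echoes it for your own reference.

# Record against the currently checked-out feature branch
BRANCH="$(git rev-parse --abbrev-ref HEAD)"
echo "Recording against branch: ${BRANCH}"

keystone start --proxy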

Debugging and inspection

Local test debugging

# Start runner with maximum debugging
keystone start --proxy --debug --headed

# Benefits of local execution:
# - Real browser with dev tools
# - Console access and debugging
# - Network tab inspection  
# - Step-by-step execution visibility
# - Local file system access

Test failure analysis

When tests fail locally:
# Runner provides detailed logs
tail -f ~/.keystone/logs/execution.log

# Screenshots are captured automatically
ls ~/.keystone/screenshots/

# Browser console output is logged
grep "console" ~/.keystone/logs/browser.log

Performance monitoring

Local performance testing

# Monitor test performance locally
keystone start --proxy --debug

# Captures:
# - Page load times
# - JavaScript execution time
# - Network request timing
# - Memory usage patterns
# - Resource loading metrics

Optimization insights

# Performance logs location
tail -f ~/.keystone/logs/performance.log

# Key metrics tracked:
# - Step execution time
# - Element selection time
# - Network request duration
# - Page rendering time
# - Total test execution time
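
For a quick manual review, plain grep over the performance log is usually enough. The exact log format isn't documented here, so treat the pattern below as a starting point and adjust it to the lines you actually see.

# Pull timing-related lines for manual review (pattern is an assumption)
grep -iE "duration|load|render" ~/.keystone/logs/performance.log | tail -n 50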

Integration with development tools

Git hooks integration

#!/bin/bash
# .git/hooks/pre-commit
# Run critical tests before commit

keystone start --proxy --headless &
RUNNER_PID=$!

# Wait for runner to start
sleep 5

# Trigger critical tests via API
# (Implementation depends on your setup)

# Clean up
kill $RUNNER_PID
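
A slightly more defensive variant of the same hook uses a trap so the runner is stopped even if the trigger step fails; nothing else about the flow changes.

#!/bin/bash
# .git/hooks/pre-commit (variant with cleanup via trap)
set -euo pipefail

keystone start --proxy --headless &
RUNNER_PID=$!
trap 'kill "$RUNNER_PID" 2>/dev/null' EXIT

# Wait for runner to start
sleep 5

# Trigger critical tests via API
# (Implementation depends on your setup)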

CI/CD preparation

# Validate tests locally before CI
export KEYSTONE_HEADLESS=true
export KEYSTONE_DEBUG=false

keystone start --proxy --headless

# Run same tests that will run in CI
# Ensure they pass locally first
# Debug any issues before pushing
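
One way to keep local validation and CI identical is to put these steps into a single script that both call; the script path is illustrative.

#!/bin/bash
# scripts/run-headless.sh (illustrative name): same entry point locally and in CI
set -euo pipefail

export KEYSTONE_HEADLESS=true
export KEYSTONE_DEBUG=false

keystone start --proxy --headless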

Test data management

Local test data

# Set up consistent test data
npm run db:seed:test

# Record tests against known data
keystone start --proxy

# Tests become reproducible and reliable
# Data state is consistent across runs
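
Chaining the two commands keeps the data state and the recording session in lockstep:

# Seed a known dataset, then start the runner against it
npm run db:seed:test && keystone start --proxy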

Data isolation

# Reset data between test runs
# Ensures clean state for each test
# Prevents test interdependencies

# Example cleanup script
cleanup_test_data() {
  npm run db:reset:test
  npm run db:seed:test
}
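
A typical way to use the function is to reset before a run and again when the runner exits, so the next run always starts clean:

# Example usage of the cleanup function above
cleanup_test_data            # clean state before the run
trap cleanup_test_data EXIT  # and again when the runner stops

keystone start --proxy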

Troubleshooting test issues

Common test problems

Element not found:
# Debug with local browser
keystone start --proxy --debug --headed

# Inspect element selectors
# Check for timing issues
# Verify page state

Timing issues:
# Analyze timing with debug logs
grep "timing" ~/.keystone/logs/execution.log

# Adjust wait strategies
# Optimize loading detection

Authentication failures:
# Debug auth state
# Check token validity
# Verify session handling

Debug strategies

# Step-by-step debugging
# Use headed mode for visual inspection
keystone start --proxy --debug --headed

# Network debugging
# Monitor API calls and responses
# Check authentication headers
# Verify request/response timing

Best practices

Development workflow

  1. Record locally: Use CLI tunnel to record against localhost
  2. Test locally: Debug and refine tests with full visibility
  3. Validate remotely: Run tests against staging/production
  4. Integrate CI: Automate test execution in pipeline

Test maintenance

  1. Regular updates: Keep tests current with application changes
  2. Performance monitoring: Track test execution time and reliability
  3. Error analysis: Investigate and fix failing tests promptly
  4. Documentation: Maintain clear test descriptions and purposes

Team collaboration

  1. Shared environments: Use consistent staging environments
  2. Test organization: Structure tests logically for team use
  3. Code integration: Include test development in code review process
  4. Knowledge sharing: Document test patterns and best practices

Next steps