This tutorial demonstrates how Vers transforms database development and testing workflows by enabling parallel execution from branched database states. We’ll build a realistic e-commerce database testing suite that showcases how Vers eliminates repetitive database setup through VM branching.

What You’ll Learn

  • How to set up a PostgreSQL environment with Node.js in Vers
  • The power of branching database states at critical decision points
  • Parallel testing of different database schemas and features
  • Real-world time savings compared to traditional database testing

Prerequisites

  • Vers CLI installed and authenticated
  • Basic familiarity with PostgreSQL and Node.js
  • Understanding of database schema design concepts

Project Overview

We’ll test different database features and schema changes on an e-commerce system. Instead of resetting the database and rebuilding test data for each scenario, we’ll:
  1. Set up once: Create a VM with PostgreSQL, Node.js, and base schema
  2. Branch at decision points: Capture database state before major changes
  3. Test in parallel: Run different feature implementations simultaneously
  4. Compare results: Analyze different approaches without data loss
This approach saves significant time and enables safe experimentation with database changes.

Step 1: Project Setup

Initialize the Project

mkdir postgres-database-testing
cd postgres-database-testing
vers init

Configure the Environment

Edit the generated vers.toml to allocate sufficient resources for PostgreSQL:
[machine]
mem_size_mib = 2048
vcpu_count = 1
fs_size_cluster_mib = 8000  # Increased for database storage
fs_size_vm_mib = 4000       # More space for PostgreSQL data

[rootfs]
name = "postgres-node"

[builder]
name = "docker"
dockerfile = "Dockerfile"

[kernel]
name = "default.bin"
The larger allocations are essential for PostgreSQL data files, transaction logs, and multiple concurrent connections.

Create the Dockerfile

Create a Dockerfile in your project root. This file defines a custom Ubuntu-based environment with PostgreSQL 15, Node.js 18, and all the infrastructure needed for Vers to function properly. The Dockerfile includes SSH server setup, PostgreSQL configuration, and an automated startup script:
FROM ubuntu:22.04

# Prevent interactive prompts during package installation
ENV DEBIAN_FRONTEND=noninteractive

# Install system dependencies
RUN apt-get update && apt-get install -y \
    curl \
    wget \
    gnupg \
    lsb-release \
    iproute2 \
    openssh-server \
    sudo \
    vim \
    emacs \
    ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# Configure nameserver for internet access
RUN echo "nameserver 8.8.8.8" >> /etc/resolv.conf

# Add PostgreSQL official APT repository
RUN wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add -
RUN echo "deb http://apt.postgresql.org/pub/repos/apt/ $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list

# Install PostgreSQL 15
RUN apt-get update && apt-get install -y \
    postgresql-15 \
    postgresql-client-15 \
    postgresql-contrib-15 \
    && rm -rf /var/lib/apt/lists/*

# Install Node.js 18 for application layer
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y nodejs

# SSH server setup
RUN mkdir -p /var/run/sshd /run/sshd
RUN echo 'root:password' | chpasswd
RUN sed -i 's/#PermitRootLogin prohibit-password/PermitRootLogin yes/' /etc/ssh/sshd_config
RUN sed 's@session\s*required\s*pam_loginuid.so@session optional pam_loginuid.so@g' -i /etc/pam.d/sshd
RUN ssh-keygen -A

# Configure PostgreSQL properly
RUN mkdir -p /var/lib/postgresql/data
RUN chown -R postgres:postgres /var/lib/postgresql/

# Initialize database as postgres user
USER postgres
RUN /usr/lib/postgresql/15/bin/initdb -D /var/lib/postgresql/data

# Configure PostgreSQL for container use
RUN echo "host all all 0.0.0.0/0 trust" >> /var/lib/postgresql/data/pg_hba.conf
RUN echo "local all all trust" >> /var/lib/postgresql/data/pg_hba.conf
RUN sed -i "s/#listen_addresses = 'localhost'/listen_addresses = '*'/" /var/lib/postgresql/data/postgresql.conf
RUN echo "port = 5432" >> /var/lib/postgresql/data/postgresql.conf

# Switch back to root
USER root

# Ensure proper ownership after all configuration
RUN chown -R postgres:postgres /var/lib/postgresql/
RUN chmod -R 700 /var/lib/postgresql/data/

# Set working directory
WORKDIR /app

# Copy application files
COPY . .

# Expose ports
EXPOSE 22 5432

# Create comprehensive startup script
RUN echo '#!/bin/bash' > /start.sh && \
    echo '# Fix DNS resolution at runtime' >> /start.sh && \
    echo 'echo "nameserver 8.8.8.8" >> /etc/resolv.conf' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Start SSH server' >> /start.sh && \
    echo '/usr/sbin/sshd -D &' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Ensure PostgreSQL directories have correct permissions' >> /start.sh && \
    echo 'chown -R postgres:postgres /var/lib/postgresql/' >> /start.sh && \
    echo 'chmod -R 700 /var/lib/postgresql/data/' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Start PostgreSQL as postgres user' >> /start.sh && \
    echo 'su - postgres -c "/usr/lib/postgresql/15/bin/pg_ctl -D /var/lib/postgresql/data -l /tmp/postgresql.log start"' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Wait for PostgreSQL to be ready' >> /start.sh && \
    echo 'echo "Waiting for PostgreSQL to start..."' >> /start.sh && \
    echo 'until su - postgres -c "psql -p 5432 -c \"SELECT 1\"" >/dev/null 2>&1; do' >> /start.sh && \
    echo '  sleep 1' >> /start.sh && \
    echo 'done' >> /start.sh && \
    echo 'echo "PostgreSQL is ready!"' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Set up application database and user' >> /start.sh && \
    echo 'su - postgres -c "createdb -p 5432 ecommerce_app"' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -c \"CREATE USER app_user WITH PASSWORD '"'"'app_password'"'"';\""' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -c \"GRANT ALL PRIVILEGES ON DATABASE ecommerce_app TO app_user;\""' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Grant comprehensive permissions for app_user' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -d ecommerce_app -c \"GRANT ALL PRIVILEGES ON SCHEMA public TO app_user;\""' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -d ecommerce_app -c \"GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO app_user;\""' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -d ecommerce_app -c \"GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO app_user;\""' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -d ecommerce_app -c \"ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO app_user;\""' >> /start.sh && \
    echo 'su - postgres -c "psql -p 5432 -d ecommerce_app -c \"ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO app_user;\""' >> /start.sh && \
    echo 'echo "Database setup complete with proper permissions!"' >> /start.sh && \
    echo '' >> /start.sh && \
    echo '# Keep container running' >> /start.sh && \
    echo 'tail -f /dev/null' >> /start.sh

RUN chmod +x /start.sh

CMD ["/start.sh"]
The startup script handles PostgreSQL initialization, user creation, and permission management automatically.

Step 2: Launch and Set Up the Environment

Start the Cluster

vers up
The vers up command will:
  • Build your custom rootfs with PostgreSQL and Node.js
  • Start a cluster with your new environment
  • Create the root VM with database services

Connect and Initialize Database

vers connect
Inside the VM, start the database services and create your project structure:
# Start PostgreSQL and application setup (runs in background)
/start.sh & sleep 10
You should see output confirming PostgreSQL is ready and the database is configured.

Set Up Node.js Dependencies

Create a package.json file in your project root to define your Node.js project's dependencies and metadata. The image ships with vim and emacs, so you can create and edit files directly inside the VM. This file tells npm which packages to install for database connectivity:
{
  "name": "postgres-testing",
  "version": "1.0.0",
  "description": "PostgreSQL state testing with Vers",
  "main": "index.js",
  "dependencies": {
    "pg": "^8.11.0"
  },
  "scripts": {
    "test": "node test.js"
  }
}
Install the PostgreSQL Node.js driver:
npm install
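Before moving on to the test suite, you can optionally confirm that Node.js can reach PostgreSQL. The following is a minimal sanity-check sketch (a hypothetical check-connection.js, not part of the tutorial's files) that uses the same credentials the startup script created:
const { Client } = require("pg");

async function main() {
  // Connect with the app_user credentials created by /start.sh
  const client = new Client({
    host: "127.0.0.1",
    port: 5432,
    database: "ecommerce_app",
    user: "app_user",
    password: "app_password",
  });

  try {
    await client.connect();
    // Ask the server for its version to confirm the connection end to end
    const result = await client.query("SELECT version()");
    console.log("Connected:", result.rows[0].version);
  } catch (error) {
    console.error("Connection failed:", error.message);
    process.exitCode = 1;
  } finally {
    await client.end();
  }
}

main();
Run it with node check-connection.js; if it fails, see the troubleshooting tips at the end of this tutorial.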

Step 3: Create the Base Schema and Test Infrastructure

Build the Database Schema

Create a file called schema.sql containing a comprehensive e-commerce database schema with sample data. This SQL file will set up all the tables and relationships needed for our testing scenarios:
-- E-commerce database schema for A/B testing

-- Users table
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    username VARCHAR(50) UNIQUE NOT NULL,
    email VARCHAR(100) UNIQUE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    user_type VARCHAR(20) DEFAULT 'regular'
);

-- Products table
CREATE TABLE products (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    price DECIMAL(10,2) NOT NULL,
    category VARCHAR(50),
    stock_quantity INTEGER DEFAULT 0,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Orders table
CREATE TABLE orders (
    id SERIAL PRIMARY KEY,
    user_id INTEGER REFERENCES users(id),
    total_amount DECIMAL(10,2) NOT NULL,
    status VARCHAR(20) DEFAULT 'pending',
    payment_method VARCHAR(50),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Order items table
CREATE TABLE order_items (
    id SERIAL PRIMARY KEY,
    order_id INTEGER REFERENCES orders(id),
    product_id INTEGER REFERENCES products(id),
    quantity INTEGER NOT NULL,
    price DECIMAL(10,2) NOT NULL
);

-- Insert sample data
INSERT INTO users (username, email, user_type) VALUES
    ('john_doe', 'john@example.com', 'regular'),
    ('jane_smith', 'jane@example.com', 'premium'),
    ('bob_wilson', 'bob@example.com', 'regular'),
    ('alice_brown', 'alice@example.com', 'premium');

INSERT INTO products (name, price, category, stock_quantity) VALUES
    ('Laptop Pro', 1299.99, 'electronics', 50),
    ('Wireless Mouse', 29.99, 'electronics', 200),
    ('Coffee Mug', 12.99, 'home', 100),
    ('Desk Chair', 199.99, 'furniture', 25),
    ('Notebook', 8.99, 'office', 150);

INSERT INTO orders (user_id, total_amount, status, payment_method) VALUES
    (1, 1329.98, 'completed', 'credit_card'),
    (2, 212.98, 'pending', 'paypal'),
    (3, 29.99, 'completed', 'credit_card');

INSERT INTO order_items (order_id, product_id, quantity, price) VALUES
    (1, 1, 1, 1299.99),
    (1, 2, 1, 29.99),
    (2, 4, 1, 199.99),
    (2, 3, 1, 12.99),
    (3, 2, 1, 29.99);
Load the schema into your database:
sudo -u postgres psql -p 5432 -d ecommerce_app -f schema.sql

Create the Base Test Class

Create a file called db-test.js that contains a reusable database testing class. This class handles PostgreSQL connections, displays database state, and provides common functionality that all your tests will inherit:
const { Client } = require("pg");

class DatabaseTest {
  constructor() {
    this.client = new Client({
      host: "127.0.0.1",
      port: 5432,
      database: "ecommerce_app",
      user: "app_user",
      password: "app_password",
    });
  }

  async connect() {
    await this.client.connect();
    console.log("✓ Connected to PostgreSQL");
  }

  async disconnect() {
    await this.client.end();
    console.log("✓ Disconnected from PostgreSQL");
  }

  async showDatabaseState() {
    console.log("\n=== Current Database State ===");

    // Show users
    const users = await this.client.query("SELECT * FROM users ORDER BY id");
    console.log(`Users (${users.rows.length}):`);
    users.rows.forEach((user) => {
      console.log(`  - ${user.username} (${user.email}) - ${user.user_type}`);
    });

    // Show products
    const products = await this.client.query(
      "SELECT * FROM products ORDER BY id"
    );
    console.log(`\nProducts (${products.rows.length}):`);
    products.rows.forEach((product) => {
      console.log(
        `  - ${product.name}: $${product.price} (Stock: ${product.stock_quantity})`
      );
    });

    // Show orders
    const orders = await this.client.query(`
      SELECT o.*, u.username
      FROM orders o
      JOIN users u ON o.user_id = u.id
      ORDER BY o.id
    `);
    console.log(`\nOrders (${orders.rows.length}):`);
    orders.rows.forEach((order) => {
      console.log(
        `  - Order #${order.id}: ${order.username} - $${order.total_amount} (${order.status})`
      );
    });
  }

  async getOrderStats() {
    const stats = await this.client.query(`
      SELECT
        COUNT(*) as total_orders,
        SUM(total_amount) as total_revenue,
        AVG(total_amount) as avg_order_value,
        COUNT(CASE WHEN status = 'completed' THEN 1 END) as completed_orders
      FROM orders
    `);

    console.log("\n=== Order Statistics ===");
    const row = stats.rows[0];
    console.log(`Total Orders: ${row.total_orders}`);
    console.log(
      `Total Revenue: $${parseFloat(row.total_revenue || 0).toFixed(2)}`
    );
    console.log(
      `Average Order Value: $${parseFloat(row.avg_order_value || 0).toFixed(2)}`
    );
    console.log(`Completed Orders: ${row.completed_orders}`);
  }
}

module.exports = DatabaseTest;

// Run if called directly
if (require.main === module) {
  async function main() {
    const db = new DatabaseTest();
    try {
      await db.connect();
      await db.showDatabaseState();
      await db.getOrderStats();
    } catch (error) {
      console.error("Database test failed:", error.message);
    } finally {
      await db.disconnect();
    }
  }
  main();
}
Test your base setup:
node db-test.js
You should see output listing the four users, five products, and three orders from the sample data, followed by order statistics ($1572.95 in total revenue, two completed orders).

Step 4: Save Your State

Much like a git commit, Vers lets you snapshot a VM instance so you can branch from or return to that exact state later. When you commit, Vers captures:
  • Complete filesystem state
  • Memory state
  • Running processes
  • Network configuration
# Exit the VM
exit

# Create a commit at this decision point
vers commit --tag "Base e-commerce schema with sample data loaded"

# Check our status
vers status

Step 5: Branch for Different Feature Implementations

This is where Vers shines! Instead of rebuilding database state for each feature test, we'll branch from this established baseline. Every new branch inherits exactly what we committed: the complete filesystem, memory state, running processes, and network configuration. Each time we branch, we'll use the newly created VM copy to test a different feature.

Create Premium Features Branch

# Create and switch to premium features branch
vers checkout -c premium-features

# Create a child VM for premium features testing
vers branch --name premium-features-test

# Connect to the new VM
vers connect
Use the VM ID provided by the vers branch command.

Implement Premium Features

Create a file called premium-test.js that extends the base test class to add premium user functionality. This test demonstrates how to add new database tables and modify existing data to implement premium features:
const DatabaseTest = require("./db-test");

class PremiumFeaturesTest extends DatabaseTest {
  async addPremiumFeatures() {
    console.log("🔄 Testing Premium Features Implementation...");

    // Add premium features table
    await this.client.query(`
      CREATE TABLE IF NOT EXISTS premium_features (
        id SERIAL PRIMARY KEY,
        user_id INTEGER REFERENCES users(id),
        feature_name VARCHAR(50) NOT NULL,
        enabled BOOLEAN DEFAULT true,
        expires_at TIMESTAMP
      )
    `);

    // Add premium features for premium users
    await this.client.query(`
      INSERT INTO premium_features (user_id, feature_name, expires_at)
      SELECT id, 'priority_support', CURRENT_TIMESTAMP + INTERVAL '1 year'
      FROM users WHERE user_type = 'premium'
    `);

    await this.client.query(`
      INSERT INTO premium_features (user_id, feature_name, expires_at)
      SELECT id, 'advanced_analytics', CURRENT_TIMESTAMP + INTERVAL '1 year'
      FROM users WHERE user_type = 'premium'
    `);

    console.log("✓ Premium features table created and populated");

    // Add premium pricing tier
    await this.client.query(`
      UPDATE products 
      SET price = price * 0.9
      WHERE id IN (
        SELECT DISTINCT product_id 
        FROM order_items oi
        JOIN orders o ON oi.order_id = o.id
        JOIN users u ON o.user_id = u.id
        WHERE u.user_type = 'premium'
      )
    `);

    console.log(
      "✓ Premium user discount (10%) applied to frequently ordered products"
    );
  }

  async showPremiumStats() {
    console.log("\n=== Premium Features Analysis ===");

    const premiumStats = await this.client.query(`
      SELECT 
        u.user_type,
        COUNT(pf.id) as feature_count,
        COUNT(DISTINCT pf.feature_name) as unique_features
      FROM users u
      LEFT JOIN premium_features pf ON u.id = pf.user_id
      GROUP BY u.user_type
      ORDER BY u.user_type
    `);

    premiumStats.rows.forEach((row) => {
      console.log(
        `${row.user_type} users: ${row.feature_count} features (${row.unique_features} unique)`
      );
    });

    // Show revenue impact
    const revenueImpact = await this.client.query(`
      SELECT 
        u.user_type,
        COUNT(o.id) as order_count,
        SUM(o.total_amount) as total_revenue,
        AVG(o.total_amount) as avg_order_value
      FROM users u
      LEFT JOIN orders o ON u.id = o.user_id  
      GROUP BY u.user_type
      ORDER BY total_revenue DESC
    `);

    console.log("\n=== Revenue by User Type ===");
    revenueImpact.rows.forEach((row) => {
      console.log(
        `${row.user_type}: ${row.order_count} orders, $${parseFloat(
          row.total_revenue || 0
        ).toFixed(2)} revenue`
      );
    });
  }
}

async function main() {
  const test = new PremiumFeaturesTest();
  try {
    await test.connect();
    await test.addPremiumFeatures();
    await test.showDatabaseState();
    await test.showPremiumStats();
  } catch (error) {
    console.error("Premium features test failed:", error.message);
  } finally {
    await test.disconnect();
  }
}

main();
Run the premium features test:
node premium-test.js

Create Inventory Management Branch

# Exit current VM and create inventory branch
exit

# Create inventory branch from the same base state
vers branch --name inventory-management-test

# Switch to inventory branch
vers checkout inventory-management-test

# Connect to inventory VM
vers connect

Implement Inventory Management

Create a file called inventory-test.js that extends the base test class to add comprehensive inventory tracking. This test shows how to implement inventory movements, stock alerts, and automated stock level monitoring:
const DatabaseTest = require("./db-test");

class InventoryManagementTest extends DatabaseTest {
  async addInventoryTracking() {
    console.log("🔄 Testing Inventory Management System...");

    // Add inventory tracking table
    await this.client.query(`
      CREATE TABLE IF NOT EXISTS inventory_movements (
        id SERIAL PRIMARY KEY,
        product_id INTEGER REFERENCES products(id),
        movement_type VARCHAR(20) NOT NULL, -- 'in', 'out', 'adjustment'
        quantity INTEGER NOT NULL,
        reason VARCHAR(100),
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
      )
    `);

    // Add low stock alerts table
    await this.client.query(`
      CREATE TABLE IF NOT EXISTS stock_alerts (
        id SERIAL PRIMARY KEY,
        product_id INTEGER REFERENCES products(id),
        alert_type VARCHAR(20) NOT NULL, -- 'low_stock', 'out_of_stock'
        threshold_quantity INTEGER,
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
        resolved_at TIMESTAMP
      )
    `);

    console.log("✓ Inventory tracking tables created");

    // Simulate some inventory movements
    await this.client.query(`
      INSERT INTO inventory_movements (product_id, movement_type, quantity, reason) VALUES
      (1, 'out', 5, 'Sales orders'),
      (2, 'in', 50, 'New shipment'),
      (3, 'adjustment', -10, 'Damaged goods'),
      (4, 'out', 8, 'Sales orders'),
      (5, 'in', 25, 'Restock')
    `);

    // Update stock quantities based on movements
    await this.client.query(
      `UPDATE products SET stock_quantity = stock_quantity - 5 WHERE id = 1`
    );
    await this.client.query(
      `UPDATE products SET stock_quantity = stock_quantity + 50 WHERE id = 2`
    );
    await this.client.query(
      `UPDATE products SET stock_quantity = stock_quantity - 10 WHERE id = 3`
    );
    await this.client.query(
      `UPDATE products SET stock_quantity = stock_quantity - 8 WHERE id = 4`
    );
    await this.client.query(
      `UPDATE products SET stock_quantity = stock_quantity + 25 WHERE id = 5`
    );

    console.log("✓ Inventory movements recorded and stock updated");

    // Create low stock alerts for products below threshold
    await this.client.query(`
      INSERT INTO stock_alerts (product_id, alert_type, threshold_quantity)
      SELECT id, 'low_stock', 20
      FROM products 
      WHERE stock_quantity < 20 AND stock_quantity > 0
    `);

    await this.client.query(`
      INSERT INTO stock_alerts (product_id, alert_type, threshold_quantity)
      SELECT id, 'out_of_stock', 0
      FROM products
      WHERE stock_quantity <= 0
    `);

    console.log("✓ Stock alerts generated");
  }

  async showInventoryStats() {
    console.log("\n=== Inventory Management Analysis ===");

    // Show current stock levels
    const stockLevels = await this.client.query(`
      SELECT name, stock_quantity,
        CASE 
          WHEN stock_quantity <= 0 THEN 'OUT_OF_STOCK'
          WHEN stock_quantity < 20 THEN 'LOW_STOCK'
          WHEN stock_quantity < 50 THEN 'MODERATE'
          ELSE 'GOOD'
        END as stock_status
      FROM products
      ORDER BY stock_quantity ASC
    `);

    console.log("\nCurrent Stock Levels:");
    stockLevels.rows.forEach((row) => {
      console.log(
        `  - ${row.name}: ${row.stock_quantity} units (${row.stock_status})`
      );
    });

    // Show inventory movements
    const movements = await this.client.query(`
      SELECT p.name, im.movement_type, im.quantity, im.reason, im.created_at
      FROM inventory_movements im
      JOIN products p ON im.product_id = p.id
      ORDER BY im.created_at DESC
    `);

    console.log("\nRecent Inventory Movements:");
    movements.rows.forEach((row) => {
      // 'in' adds stock, 'out' removes it; 'adjustment' rows already carry a
      // signed quantity, so print them without an extra sign
      const sign =
        row.movement_type === "in" ? "+" : row.movement_type === "out" ? "-" : "";
      console.log(`  - ${row.name}: ${sign}${row.quantity} (${row.reason})`);
    });

    // Show active alerts
    const alerts = await this.client.query(`
      SELECT p.name, sa.alert_type, sa.threshold_quantity
      FROM stock_alerts sa
      JOIN products p ON sa.product_id = p.id
      WHERE sa.resolved_at IS NULL
      ORDER BY sa.created_at DESC
    `);

    console.log("\nActive Stock Alerts:");
    alerts.rows.forEach((row) => {
      console.log(`  - ${row.name}: ${row.alert_type.toUpperCase()}`);
    });
  }
}

async function main() {
  const test = new InventoryManagementTest();
  try {
    await test.connect();
    await test.addInventoryTracking();
    await test.showDatabaseState();
    await test.showInventoryStats();
  } catch (error) {
    console.error("Inventory management test failed:", error.message);
  } finally {
    await test.disconnect();
  }
}

main();
Run the inventory management test:
node inventory-test.js

Step 6: Visualize and Execute Parallel Tests

View Your VM Tree

# Exit VM and check the structure
exit

# See the branching structure
vers tree
You’ll see output showing:
  • Root VM: Paused parent with the base database state
  • Premium Features VM: Running child with premium user features
  • Inventory Management VM: Running child with inventory tracking

Execute Tests Non-Interactively

# Run inventory test without connecting
vers execute "node inventory-test.js"

# Switch to premium features branch and test
vers checkout premium-features-test
vers execute "node premium-test.js"

Parallel Testing Workflow

Open multiple terminals to run tests simultaneously.
Terminal 1:
cd postgres-database-testing
vers checkout premium-features-test
vers execute "node premium-test.js"
Terminal 2:
cd postgres-database-testing
vers checkout inventory-management-test
vers execute "node inventory-test.js"
Both database scenarios run in parallel from the same starting state!

Commit Test Results

# Commit premium features results
vers checkout premium-features-test
vers commit --tag "Premium features implementation tested successfully"

# Commit inventory management results
vers checkout inventory-management-test
vers commit --tag "Inventory management system tested successfully"

# View commit history
vers log

Key Benefits Demonstrated

Database State Preservation

  • Complex schemas: Fully populated database states are captured and reusable
  • Data integrity: No risk of losing reference data during experimentation
  • Transaction history: Each branch maintains its own transaction log

Parallel Development

  • Schema experiments: Test different database designs simultaneously
  • Feature testing: Implement competing approaches without conflicts
  • Performance comparison: Benchmark different indexing or query strategies

Time Savings

  • Traditional approach: Reset database → rebuild schema → reload data for each test
  • Vers approach: Branch once → test multiple scenarios in parallel
  • Compound benefits: Savings grow with schema complexity and the number of scenarios tested in parallel

Risk Reduction

  • Safe experimentation: Test destructive operations without data loss
  • Easy rollback: Return to any previous database state instantly
  • Parallel validation: Compare results across different implementations

Real-World Applications

This database branching pattern excels for:

Schema Evolution Testing

  • Test different migration strategies from the same starting point
  • Compare performance impacts of different indexing approaches
  • Validate data integrity across different schema versions
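As a concrete example, an index-comparison run in one branch might look like the sketch below (a hypothetical index-migration-test.js, not part of the tutorial's files). It connects as the postgres superuser, which this image's trust authentication allows, so the DDL does not depend on app_user owning the tables:
const { Client } = require("pg");

// Time a query with a simple wall-clock measurement
async function timeQuery(client, label, sql) {
  const start = Date.now();
  await client.query(sql);
  console.log(`${label}: ${Date.now() - start} ms`);
}

async function main() {
  // pg_hba.conf in this image trusts local connections, so no password is needed
  const client = new Client({
    host: "127.0.0.1",
    port: 5432,
    database: "ecommerce_app",
    user: "postgres",
  });

  try {
    await client.connect();
    const sql = "SELECT * FROM orders WHERE status = 'completed'";

    await timeQuery(client, "Before index", sql);

    // The migration under test: add an index on orders.status
    await client.query(
      "CREATE INDEX IF NOT EXISTS idx_orders_status ON orders (status)"
    );

    await timeQuery(client, "After index", sql);
  } catch (error) {
    console.error("Index migration test failed:", error.message);
  } finally {
    await client.end();
  }
}

main();
With only a handful of sample rows the timings are trivial, but the pattern scales: apply one indexing strategy per branch, run the same measurement in each, and compare the numbers while the baseline VM stays untouched.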

Feature Development

  • Implement competing feature designs in parallel
  • Test different user permission models
  • Compare storage requirements for alternative approaches

Performance Optimization

  • Test different query optimization strategies
  • Compare indexing approaches with identical data sets
  • Benchmark competing database configurations

Data Migration Validation

  • Test migration scripts against identical production-like data
  • Validate data transformation logic across multiple scenarios
  • Compare migration performance with different approaches
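For example, a validation script can assert invariants that must hold in every branch regardless of which migration ran there. Below is a minimal sketch (a hypothetical validate-data.js, reusing the DatabaseTest class from Step 3) that checks each order's total against the sum of its line items:
const DatabaseTest = require("./db-test");

class DataValidationTest extends DatabaseTest {
  async checkOrderTotals() {
    // Every order's total_amount should equal the sum of its line items
    const mismatches = await this.client.query(`
      SELECT o.id, o.total_amount,
             SUM(oi.price * oi.quantity) AS item_total
      FROM orders o
      JOIN order_items oi ON oi.order_id = o.id
      GROUP BY o.id, o.total_amount
      HAVING o.total_amount <> SUM(oi.price * oi.quantity)
    `);

    if (mismatches.rows.length === 0) {
      console.log("✓ All order totals match their line items");
    } else {
      console.error("✗ Mismatched orders:", mismatches.rows);
      process.exitCode = 1;
    }
  }
}

async function main() {
  const test = new DataValidationTest();
  try {
    await test.connect();
    await test.checkOrderTotals();
  } catch (error) {
    console.error("Validation failed:", error.message);
  } finally {
    await test.disconnect();
  }
}

main();
Running the same validation in each branch makes it easy to spot a transformation that silently broke referential or financial consistency.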

Common Issues and Solutions

PostgreSQL Connection Issues

If you encounter connection problems:
# Check PostgreSQL status
sudo -u postgres psql -p 5432 -c "SELECT version();"

# Restart PostgreSQL if needed
sudo -u postgres /usr/lib/postgresql/15/bin/pg_ctl -D /var/lib/postgresql/data restart

# Check log files for errors
tail -f /tmp/postgresql.log

Permission Problems

For database permission issues:
# Grant additional permissions if needed
sudo -u postgres psql -p 5432 -d ecommerce_app -c "GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO app_user;"
sudo -u postgres psql -p 5432 -d ecommerce_app -c "GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO app_user;"

Network Configuration

If Node.js can’t connect to PostgreSQL:
# Verify PostgreSQL is listening (ss comes from the iproute2 package installed in the image)
ss -tlnp | grep 5432

# Test connection manually
sudo -u postgres psql -h 127.0.0.1 -p 5432 -d ecommerce_app -U app_user

Summary

Vers transforms database development by enabling:
  • Instant state capture: Preserve complex database states at any point
  • Parallel experimentation: Test multiple approaches simultaneously
  • Risk-free changes: Experiment without fear of data loss
  • Efficient workflows: Eliminate repetitive setup and data loading
Instead of treating database tests as isolated scripts that rebuild state each time, Vers lets you grow database development trees that branch from shared states, dramatically improving productivity and reducing risk.

Next Steps

  • Explore more complex branching scenarios with multiple decision points
  • Test database migration strategies using the same base data
  • Experiment with different PostgreSQL configurations in parallel branches
  • Integrate database testing workflows into your CI/CD pipeline
  • Try the approach with other databases like MySQL or MongoDB