# Scaling
## Horizontal scaling

Scale the runtime container to handle more concurrent executions:

```bash
docker compose up -d --scale runtime=3
```

The API round-robins requests across the runtime instances.
### Considerations

- Each runtime instance respects its own `MAX_CONCURRENT_EXECUTIONS` limit (see the sketch below)
- With 3 instances × 10 concurrent executions each, the deployment handles 30 concurrent executions in total
- All instances share the same storage backend
- No shared state is kept between runtime instances (stateless)
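As a rough sketch, the per-instance limit is set on the runtime service itself, so total capacity is the replica count times that value. The variable name matches the rest of this page; the image tag below is a placeholder, not the actual image name.

```yaml
# Sketch of the relevant part of docker-compose.yml (image tag is illustrative).
# Total capacity = replicas × MAX_CONCURRENT_EXECUTIONS, e.g. 3 × 10 = 30.
runtime:
  image: flowlike/runtime:latest    # placeholder image
  environment:
    - MAX_CONCURRENT_EXECUTIONS=10  # per-instance limit
```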
## Vertical scaling

Adjust resource limits in `docker-compose.yml`:

```yaml
runtime:
  deploy:
    resources:
      limits:
        cpus: '8'
        memory: 16G
      reservations:
        cpus: '2'
        memory: 4G
```

Also increase the concurrent execution limit:

```bash
MAX_CONCURRENT_EXECUTIONS=20
```
## Database scaling

For higher database throughput, choose one of the following options:
### Option 1: Tune PostgreSQL

Add these settings to your PostgreSQL service:

```yaml
postgres:
  command:
    - "postgres"
    - "-c"
    - "max_connections=200"
    - "-c"
    - "shared_buffers=256MB"
```
### Option 2: External database

Use a managed PostgreSQL service (AWS RDS, Azure Database for PostgreSQL, Cloud SQL):

```bash
# Remove the postgres service from docker-compose.yml
# Update DATABASE_URL to point to the external database
DATABASE_URL=postgresql://user:pass@external-host:5432/flowlike
```
## Load balancing

For multiple API instances, use an external load balancer:

```bash
docker compose up -d --scale api=2
```

Then configure nginx or a similar reverse proxy:
```nginx
upstream flowlike-api {
    server localhost:8080;
}

server {
    listen 80;

    location / {
        proxy_pass http://flowlike-api;
    }
}
```
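When you scale the API, each replica needs its own published host port for nginx to reach it. One common pattern, shown as a sketch here and assuming the API listens on port 8080 inside the container (host ports are illustrative), is to publish a host port range:

```yaml
# Sketch for docker-compose.yml: give each scaled api replica its own host port.
api:
  ports:
    - "8080-8081:8080"  # two replicas get host ports 8080 and 8081
```

With that in place, the upstream block above would list one `server` entry per replica, e.g. `server localhost:8080;` and `server localhost:8081;`.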
## Production recommendations

| Workload | API replicas | Runtime replicas | Runtime resources |
|---|---|---|---|
| Development | 1 | 1 | 2 CPU, 4GB |
| Small team | 1 | 2 | 4 CPU, 8GB |
| Medium | 2 | 4 | 4 CPU, 8GB |
| Large | 3+ | 6+ | 8 CPU, 16GB |
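As an illustration, the “Medium” row can be expressed directly in `docker-compose.yml`. This is a sketch that assumes your Compose version honors `deploy.replicas` (otherwise use `--scale` as shown above); the resource values simply mirror the table.

```yaml
# Sketch: "Medium" tier from the table above.
api:
  deploy:
    replicas: 2
runtime:
  deploy:
    replicas: 4
    resources:
      limits:
        cpus: '4'
        memory: 8G
```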
For large production workloads, consider deploying on Kubernetes for better orchestration, auto-scaling, and isolation.
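As a minimal sketch of what that auto-scaling could look like, the HorizontalPodAutoscaler below assumes a hypothetical `flowlike-runtime` Deployment already exists in the cluster; the name and thresholds are illustrative, not part of an official chart.

```yaml
# Sketch: scale a hypothetical flowlike-runtime Deployment on CPU usage.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: flowlike-runtime
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: flowlike-runtime   # hypothetical Deployment name
  minReplicas: 2
  maxReplicas: 6
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```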