A complete Docker Compose setup for running n8n locally with:

- PostgreSQL with the pgvector extension for vector database capabilities
- Automatic database initialization with separate databases for n8n and documents
- Cloudflare Tunnel for webhook support and external access
 
## Features

- 🚀 n8n workflow automation with PostgreSQL backend
- 🔍 Vector database support using the pgvector extension
- 🌐 External webhook access via Cloudflare Tunnel
- 📊 Dual database setup: main n8n database plus a separate documents database
- 🔧 Automated initialization with custom schema and tables
 
## Prerequisites

- Docker and Docker Compose installed
- Cloudflare account with a tunnel token (for webhook functionality)
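You can confirm the tooling prerequisites with a quick version check (a minimal sketch; if you use Compose V2, `docker compose version` is the equivalent):

```shell
# Verify Docker and Docker Compose are installed and on PATH
docker --version
docker-compose --version
```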
 
## Quick Start

1. **Configure environment variables**

   **Important:** copy the `example.env` file, rename it to `.env`, and adjust the variables:

   ```env
   # Database Configuration
   POSTGRES_USER=your_admin_user
   POSTGRES_PASSWORD=your_secure_password
   POSTGRES_DB=n8n
   POSTGRES_DOCUMENTS_DB=documents

   # n8n Configuration
   N8N_HOST=your-domain.com
   N8N_PROTOCOL=https
   N8N_WEBHOOK_URL=https://your-domain.com

   # Get the token from the Cloudflare tunnel for the N8N_HOST
   TUNNEL_TOKEN=your_cloudflare_tunnel_token
   ```

2. **Start the services**

   ```bash
   docker-compose up -d
   ```

3. **Access n8n**

   - Local: http://localhost:5678
   - External: https://your-domain.com (via Cloudflare Tunnel)

4. **Stop the services**

   ```bash
   docker-compose stop
   ```
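After starting the stack, a quick way to confirm everything came up is to list the service status (a sketch; service names assume the defaults `postgres`, `n8n`, and `cloudflared`):

```shell
# All three services should show "Up"; postgres should report "healthy"
docker-compose ps

# Follow the logs of a single service while debugging startup
docker-compose logs -f n8n
```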
 
## Database Architecture

The setup automatically creates two databases:

**Main database (`n8n`)**

- Used by n8n for storing workflows, executions, and settings
- Configured via environment variables in the n8n service

**Documents database**

- Dedicated vector database for AI/ML workflows
- Includes the pgvector extension for similarity search
- Schema defined in `postgres/sql/schema.sql`
- Well suited for RAG (Retrieval-Augmented Generation) workflows
 
## Services

**PostgreSQL (pgvector)**

- Port: 5432
- Extensions: pgvector for vector operations
- Initialization: runs `postgres/init-data.sh` on first startup
- Health check: ensures the database is ready before starting n8n
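To confirm the initialization ran and pgvector is actually enabled, you can query `pg_extension` inside the container (a sketch; `-U admin` and the `documents` database name follow the `.env` values you configured):

```shell
# Should list "vector" among the installed extensions
docker-compose exec postgres psql -U admin -d documents \
  -c "SELECT extname FROM pg_extension;"
```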
 
**n8n**

- Port: 5678
- Database: PostgreSQL backend
- Storage: persistent volume for user data
- Dependencies: waits for the PostgreSQL health check
 
**Cloudflare Tunnel (cloudflared)**

- Purpose: exposes n8n to the internet for webhook functionality
- Configuration: uses `TUNNEL_TOKEN` from the environment
- Benefits: no need to open firewall ports or configure a reverse proxy
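One way to verify the tunnel end to end is to hit n8n's health endpoint through the public hostname (assumes `your-domain.com` is the hostname routed to this tunnel in Cloudflare):

```shell
# A 200 response through Cloudflare means the tunnel is routing to n8n
curl -I https://your-domain.com/healthz
```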
 
## Vector Database Usage

The documents database is configured with the pgvector extension, making it well suited for:

- Embeddings storage for AI workflows
- Similarity search operations
- RAG implementations in n8n
- Document indexing and retrieval
 
Example n8n workflow usage:

- Store document embeddings in the `documents` database
- Perform similarity searches using pgvector functions
- Integrate with AI nodes for intelligent document retrieval
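As an illustration, a nearest-neighbour query against the documents database might look like this (the `documents` table and its `content`/`embedding` columns are hypothetical — adjust names and the vector dimension to whatever `postgres/sql/schema.sql` actually defines):

```shell
# Find the 5 rows whose embedding is closest (by Euclidean distance, "<->")
# to a query vector
docker-compose exec postgres psql -U admin -d documents -c \
  "SELECT id, content
     FROM documents
    ORDER BY embedding <-> '[0.1, 0.2, 0.3]'
    LIMIT 5;"
```

pgvector also provides `<=>` for cosine distance and `<#>` for negative inner product, if your workflow's embeddings are normalized differently.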
 
## Customization

- Modify `postgres/sql/schema.sql` to customize the documents database structure.
- Update `postgres/init-data.sh` to add custom database setup logic.
- All configuration is centralized in the `.env` file for easy customization.
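Note that the official Postgres image only runs init scripts against an empty data directory, so changes to `schema.sql` or `init-data.sh` require recreating the volume. This is destructive — it wipes both databases:

```shell
# Stop the stack and delete its volumes, then start fresh so
# postgres/init-data.sh and schema.sql run again
docker-compose down -v
docker-compose up -d
```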
## Troubleshooting

```bash
# Check PostgreSQL logs
docker-compose logs postgres

# Verify database health
docker-compose exec postgres pg_isready -U admin -d n8n

# Check tunnel logs
docker-compose logs cloudflared
# (and verify the tunnel token is valid)

# Check n8n logs
docker-compose logs n8n

# Verify n8n is healthy
curl http://localhost:5678/healthz
```

## Data Persistence

- Database data: stored in the Docker volume `db_storage`
- n8n user data: stored in the Docker volume `n8n_storage`
- Volumes persist between container restarts
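Since all state lives in Docker volumes, a simple safeguard is a logical backup with `pg_dump` (a sketch; `-U admin` assumes the user from your `.env`, and `-T` disables the TTY so output can be redirected):

```shell
# Dump each database to a local SQL file
docker-compose exec -T postgres pg_dump -U admin n8n > n8n_backup.sql
docker-compose exec -T postgres pg_dump -U admin documents > documents_backup.sql
```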