# n8n Database Integration Setup Guide
## Overview
This guide shows you how to connect your n8n instance to the Trading Bot v4 PostgreSQL database for automated analysis and insights.
## Database Connection Details
⚠️ **IMPORTANT:** n8n is on a **different Docker network** than the trading bot postgres, so the container name `trading-bot-postgres` will not resolve from n8n. You MUST connect through the host machine (its IP or `host.docker.internal`).
### ✅ CORRECT Connection (n8n is on different network)
```
Type: PostgreSQL
Host: host.docker.internal (or your machine's IP like 172.18.0.1)
Port: 5432
Database: trading_bot_v4
User: postgres
Password: postgres
SSL Mode: disable
```
### Alternative: Use the Docker network gateway
If `host.docker.internal` doesn't work, find the gateway IP of the n8n network and use it as the host:
```bash
docker inspect n8n --format '{{range .NetworkSettings.Networks}}{{.Gateway}}{{end}}'
# Result: 172.18.0.1 (use this as Host)
```
### Network Details (for reference)
- **n8n network:** `compose_files_default` (172.18.0.0/16)
- **Trading bot network:** `traderv4_trading-net` (172.28.0.0/16)
- **PostgreSQL container:** `trading-bot-postgres` on traderv4_trading-net (172.28.0.2)
- **PostgreSQL exposed port:** 5432 → localhost:5432
Since they're on different networks, use the **host machine as a bridge**.
## Setup Steps
### 1. Access n8n
Open your browser and navigate to:
```
http://localhost:8098
```
### 2. Create PostgreSQL Credential
1. Click on **Credentials** in the left sidebar
2. Click **Add Credential**
3. Search for **Postgres** and select it
4. Fill in the connection details (see above)
5. Name it: **Trading Bot Database**
6. Click **Test Connection** to verify
7. Click **Save**
### 3. Import Workflows
Four pre-built workflow templates are ready in your workspace:
#### A. Database Analytics (n8n-database-analytics.json)
**Purpose:** Query and analyze closed trades with statistical calculations
**Features:**
- Fetches last 100 closed trades
- Calculates win rate, P&L, profit factor
- Breaks down by symbol, direction, and exit reason
- Identifies best performing setups
**To import:**
1. Go to **Workflows** → **Add Workflow**
2. Click **...** menu → **Import from File**
3. Select `n8n-database-analytics.json`
4. Update PostgreSQL node to use "Trading Bot Database" credential
5. Click **Save**
6. Click **Execute Workflow** to test
#### B. Daily Trading Report (n8n-daily-report.json)
**Purpose:** Automated daily summary shortly after midnight (stored in the DailyStats table)
**Features:**
- Runs automatically at 00:05 every day
- Calculates yesterday's performance
- Breaks down by symbol
- Stores in DailyStats table for historical tracking
- Calculates win rate, profit factor, avg hold time
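For reference, the "yesterday's performance" aggregation boils down to a query along these lines (a sketch only; the actual workflow also writes the result into DailyStats, whose columns aren't reproduced in this guide):
```sql
-- Sketch: yesterday's closed, non-test trades; the INSERT into "DailyStats"
-- is omitted because that table's columns aren't listed here
SELECT
  COUNT(*) AS trades,
  SUM("realizedPnL") AS total_pnl,
  ROUND(COUNT(*) FILTER (WHERE "realizedPnL" > 0)::numeric / NULLIF(COUNT(*), 0) * 100, 2) AS win_rate,
  AVG("holdTimeSeconds") AS avg_hold_seconds
FROM "Trade"
WHERE status = 'closed'
  AND "isTestTrade" = false
  AND "exitTime" >= CURRENT_DATE - INTERVAL '1 day'
  AND "exitTime" < CURRENT_DATE;
```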
**To import:**
1. Import workflow from file
2. Update both PostgreSQL nodes with "Trading Bot Database" credential
3. **Activate** the workflow (toggle in top right)
4. Will run automatically at 00:05 each day
#### C. Pattern Analysis (n8n-pattern-analysis.json)
**Purpose:** Discover which times/conditions produce best results
**Features:**
- **Hourly Analysis:** Which hours have best win rate
- **Daily Analysis:** Which days perform best
- **Hold Time Analysis:** Optimal position duration
- Generates actionable recommendations
**Example insights:**
- "Focus trading around 14:00-16:00 (75% win rate)"
- "Trade more on Tuesday, avoid Friday"
- "Target exits in 15-30 min range"
**To import:**
1. Import workflow
2. Update all 3 PostgreSQL nodes with credential
3. Run manually to see insights
#### D. Stop Loss Analysis (n8n-stop-loss-analysis.json)
**Purpose:** Optimize stop loss distances and understand exit patterns
**Features:**
- Exit reason breakdown (stopped out vs targets hit)
- Stop distance effectiveness (tight vs wide stops)
- Symbol-specific stop performance
- Calculates profit factor (avg win / avg loss)
- Recommendations for stop optimization
**Example insights:**
- "⚠️ High stop hit rate - consider wider stops"
- "💡 Normal (1-1.5%) stops perform 45% better than tight stops"
- "✅ Risk/reward ratio is positive"
**To import:**
1. Import workflow
2. Update all 3 PostgreSQL nodes with credential
3. Run manually to analyze
## Database Schema Reference
### Trade Table (Main table)
Key fields for analysis:
```sql
id, symbol, direction, entryPrice, exitPrice, quantity,
notionalSize, realizedPnL, realizedPnLPercent,
entryTime, exitTime, holdTimeSeconds,
stopLossPrice, takeProfitPrice1, takeProfitPrice2,
exitReason, status, isTestTrade
```
### Common Queries
#### Get all closed trades (last 30 days)
```sql
SELECT * FROM "Trade"
WHERE status = 'closed'
AND "isTestTrade" = false
AND "entryTime" >= NOW() - INTERVAL '30 days'
ORDER BY "entryTime" DESC;
```
#### Calculate win rate
```sql
SELECT
  COUNT(*) as total_trades,
  COUNT(CASE WHEN "realizedPnL" > 0 THEN 1 END) as wins,
  ROUND(COUNT(CASE WHEN "realizedPnL" > 0 THEN 1 END)::numeric / COUNT(*) * 100, 2) as win_rate
FROM "Trade"
WHERE status = 'closed' AND "isTestTrade" = false;
```
#### Best performing symbols
```sql
SELECT
  symbol,
  COUNT(*) as trades,
  SUM("realizedPnL") as total_pnl,
  AVG("realizedPnL") as avg_pnl
FROM "Trade"
WHERE status = 'closed' AND "isTestTrade" = false
GROUP BY symbol
ORDER BY total_pnl DESC;
```
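#### Average win vs average loss (profit factor)
A sketch using the avg-win / avg-loss definition the workflows above use:
```sql
SELECT
  ROUND(avg_win::numeric, 2) AS avg_win,
  ROUND(avg_loss::numeric, 2) AS avg_loss,
  ROUND((avg_win / NULLIF(ABS(avg_loss), 0))::numeric, 2) AS profit_factor
FROM (
  SELECT
    AVG(CASE WHEN "realizedPnL" > 0 THEN "realizedPnL" END) AS avg_win,
    AVG(CASE WHEN "realizedPnL" < 0 THEN "realizedPnL" END) AS avg_loss
  FROM "Trade"
  WHERE status = 'closed' AND "isTestTrade" = false
) pnl;
```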
## Workflow Automation Ideas
### 1. Performance Alerts
**Trigger:** Schedule (every 6 hours)
**Query:** Check if win rate drops below 50% in last 24h
**Action:** Send Telegram notification to pause trading
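A sketch of the check the schedule could run (threshold and window are the ones named above; the notification step is whatever Telegram node you already use):
```sql
-- Sketch: 24h win rate; feed the result into an IF node (win_rate < 50) → Telegram
SELECT
  COUNT(*) AS trades_24h,
  ROUND(COUNT(*) FILTER (WHERE "realizedPnL" > 0)::numeric / NULLIF(COUNT(*), 0) * 100, 2) AS win_rate
FROM "Trade"
WHERE status = 'closed'
  AND "isTestTrade" = false
  AND "exitTime" >= NOW() - INTERVAL '24 hours';
```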
### 2. Best Setup Detector
**Trigger:** Manual or daily
**Query:** Find symbol + direction + time combinations with >70% win rate
**Action:** Save insights to config for bot to prioritize
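One way to express the >70% filter (the minimum sample size is an assumption; adjust to taste):
```sql
-- Sketch: symbol + direction + entry hour combinations with a win rate above 70%
SELECT
  symbol,
  direction,
  EXTRACT(HOUR FROM "entryTime") AS entry_hour,
  COUNT(*) AS trades,
  ROUND(COUNT(*) FILTER (WHERE "realizedPnL" > 0)::numeric / COUNT(*) * 100, 2) AS win_rate
FROM "Trade"
WHERE status = 'closed' AND "isTestTrade" = false
GROUP BY symbol, direction, entry_hour
HAVING COUNT(*) >= 5  -- assumed minimum sample size
   AND COUNT(*) FILTER (WHERE "realizedPnL" > 0)::numeric / COUNT(*) > 0.7
ORDER BY win_rate DESC;
```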
### 3. Drawdown Monitor
**Trigger:** After each trade (webhook)
**Query:** Calculate rolling 10-trade P&L
**Action:** Auto-reduce position size if in drawdown
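The rolling 10-trade P&L can be computed like this (how you then reduce position size depends on your settings API and isn't shown):
```sql
-- Sketch: sum of the last 10 closed, non-test trades
SELECT SUM("realizedPnL") AS rolling_10_pnl
FROM (
  SELECT "realizedPnL"
  FROM "Trade"
  WHERE status = 'closed' AND "isTestTrade" = false
  ORDER BY "exitTime" DESC
  LIMIT 10
) recent;
```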
### 4. Exit Optimization
**Trigger:** Weekly
**Query:** Compare TP1 vs TP2 hit rates and P&L
**Action:** Recommend adjustment to TP levels
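Since the exact `exitReason` values aren't listed in this guide, a safe starting point is to group by the column and compare the TP1/TP2 rows yourself:
```sql
-- Sketch: exit reason comparison over the last 7 days; check which rows
-- correspond to TP1 and TP2 in your data before automating anything
SELECT
  "exitReason",
  COUNT(*) AS hits,
  SUM("realizedPnL") AS total_pnl,
  AVG("realizedPnLPercent") AS avg_pnl_percent
FROM "Trade"
WHERE status = 'closed'
  AND "isTestTrade" = false
  AND "exitTime" >= NOW() - INTERVAL '7 days'
GROUP BY "exitReason"
ORDER BY total_pnl DESC;
```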
## Connecting Workflows to Trading Bot
### Webhook from Trading Bot to n8n
In your n8n workflow:
1. Add **Webhook** trigger node
2. Set HTTP Method: POST
3. Note the webhook URL: `http://localhost:8098/webhook/your-unique-id`
In the trading bot code (e.g., after a trade closes) — note that if the bot runs in Docker, replace `localhost` with the host gateway IP or `host.docker.internal`, as described above:
```typescript
// Send trade data to n8n for analysis
await fetch('http://localhost:8098/webhook/your-unique-id', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    tradeId: trade.id,
    symbol: trade.symbol,
    pnl: trade.realizedPnL,
    exitReason: trade.exitReason
  })
});
```
### Update Bot Config from n8n
In your n8n workflow (after analysis):
1. Add **HTTP Request** node
2. URL: `http://trading-bot-v4:3000/api/settings`
3. Method: POST
4. Body: Updated config based on analysis
Example - adjust stop loss based on analysis:
```json
{
"STOP_LOSS_PERCENT": 1.5,
"USE_DUAL_STOPS": true
}
```
## Advanced Use Cases
### Machine Learning Integration
Use n8n to:
1. Export trade data to CSV
2. Send to Python ML service via HTTP
3. Receive predictions
4. Update bot configuration
### Multi-Timeframe Analysis
Create workflow that:
1. Queries trades by hour/day/week
2. Identifies patterns at each timeframe
3. Generates trading schedule recommendations
### Risk Management Automation
Build workflow that:
1. Monitors account balance
2. Calculates daily/weekly profit target
3. Auto-pauses bot after hitting target
4. Resumes trading next day/week
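The balance/target check could start from today's realized P&L (the pause/resume calls depend on bot endpoints not documented here, so they're left out):
```sql
-- Sketch: realized P&L since midnight; compare against your daily target in an IF node
SELECT COALESCE(SUM("realizedPnL"), 0) AS pnl_today
FROM "Trade"
WHERE status = 'closed'
  AND "isTestTrade" = false
  AND "exitTime" >= CURRENT_DATE;
```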
## Troubleshooting
### Connection Refused
- Verify PostgreSQL container is running: `docker ps | grep postgres`
- Check port mapping: `docker port trading-bot-postgres`
- Remember that n8n and postgres are on **different** Docker networks: connect via `host.docker.internal` or the gateway IP, not the container name
### Query Timeouts
- Add indexes to frequently queried columns
- Limit result sets with `LIMIT` clause
- Use date range filters to reduce dataset
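For example, indexes like these cover the filters used throughout this guide (names are illustrative; check what already exists before adding more):
```sql
-- Illustrative indexes; rename or drop to match your conventions
CREATE INDEX IF NOT EXISTS idx_trade_status_entrytime
  ON "Trade" (status, "entryTime");
CREATE INDEX IF NOT EXISTS idx_trade_exittime
  ON "Trade" ("exitTime");
```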
### Empty Results
- Check `isTestTrade` filter (you may want to include test trades for testing)
- Verify date range in queries
- Ensure trades have been closed (`status = 'closed'`)
## Testing Your Setup
### Quick Test Query
In n8n, create a workflow with:
1. Manual Trigger
2. PostgreSQL node with query:
```sql
SELECT
  COUNT(*) as total_trades,
  MAX("entryTime") as last_trade
FROM "Trade";
```
3. Execute and verify results
### Verify Test Trade Flag
```sql
SELECT
"isTestTrade",
COUNT(*) as count
FROM "Trade"
GROUP BY "isTestTrade";
```
Expected output:
- `false`: Production trades
- `true`: Test trades from /api/trading/test-db
## Next Steps
1. **Import all 4 workflows** and test each one
2. **Activate the Daily Report** workflow for automated tracking
3. **Run Pattern Analysis** to discover your best trading times
4. **Run Stop Loss Analysis** to optimize risk management
5. **Create custom workflows** based on your specific needs
## Support
If you encounter issues:
1. Check n8n logs: `docker logs n8n`
2. Check postgres logs: `docker logs trading-bot-postgres`
3. Test database connection from host: `psql -h localhost -p 5432 -U postgres -d trading_bot_v4`
4. Verify bot is writing to database: Check `/analytics` page in web UI
---
**Pro Tip:** Start with the Pattern Analysis workflow to understand your trading patterns, then use those insights to create automated optimization workflows!