Deployment
This guide covers deploying the Email Assistant for production use, including local scheduling, Docker containers, and cloud platforms.
Deployment Options
| Local | Cloud | Container |
|---|---|---|
| macOS LaunchAgent | Google Cloud Run | Docker |
| Linux Cron | AWS Lambda | Docker Compose |
| Windows Task Scheduler | Railway / Render | |
Local Deployment
macOS LaunchAgent
Create a LaunchAgent for automatic scheduling:
<!-- ~/Library/LaunchAgents/com.emailassistant.digest.plist -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>com.emailassistant.digest</string>
<key>ProgramArguments</key>
<array>
<string>/usr/bin/python3</string>
<string>/Users/you/EmailAssistant/src/main.py</string>
</array>
<key>WorkingDirectory</key>
<string>/Users/you/EmailAssistant</string>
<key>StartCalendarInterval</key>
<dict>
<key>Hour</key>
<integer>8</integer>
<key>Minute</key>
<integer>0</integer>
</dict>
<key>StandardOutPath</key>
<string>/Users/you/EmailAssistant/logs/stdout.log</string>
<key>StandardErrorPath</key>
<string>/Users/you/EmailAssistant/logs/stderr.log</string>
<key>EnvironmentVariables</key>
<dict>
<key>GOOGLE_API_KEY</key>
<string>your-api-key</string>
</dict>
</dict>
</plist>

Load the agent:
launchctl load ~/Library/LaunchAgents/com.emailassistant.digest.plist
Linux Cron
# Edit crontab
crontab -e

# Run daily at 8 AM
0 8 * * * cd /home/user/EmailAssistant && /usr/bin/python3 src/main.py >> logs/cron.log 2>&1
Web Server (systemd)
# /etc/systemd/system/emailassistant.service
[Unit]
Description=Email Assistant Web Server
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/opt/emailassistant
ExecStart=/opt/emailassistant/venv/bin/gunicorn -w 2 -b 0.0.0.0:8001 server:app
Restart=always
Environment=GOOGLE_API_KEY=your-key

[Install]
WantedBy=multi-user.target
sudo systemctl enable emailassistant
sudo systemctl start emailassistant
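The `server:app` target passed to gunicorn refers to the application's WSGI entry point. As a point of reference, a minimal stand-in that gunicorn could serve looks like the sketch below; the real `server.py` in this project defines the full API, so the route and response here are assumptions:

```python
# server.py -- minimal WSGI stand-in (hypothetical; the real app has more routes)
import json

def app(environ, start_response):
    """A bare WSGI callable, servable as `gunicorn server:app`."""
    if environ.get("PATH_INFO") == "/api/status":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

Any plain WSGI callable works here, which is why gunicorn needs no framework-specific configuration.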
Docker Deployment
Dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application
COPY . .

# Create non-root user
RUN useradd -m appuser && chown -R appuser:appuser /app
USER appuser

# Expose port
EXPOSE 8001

# Run server
CMD ["gunicorn", "-w", "2", "-b", "0.0.0.0:8001", "server:app"]
Docker Compose
# docker-compose.yml
version: '3.8'

services:
  web:
    build: .
    ports:
      - "8001:8001"
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
    volumes:
      - ./data:/app/data
      - ./credentials:/app/credentials:ro
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8001/api/status"]
      interval: 30s
      timeout: 10s
      retries: 3

  scheduler:
    build: .
    command: python src/scheduler.py
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
    volumes:
      - ./data:/app/data
      - ./credentials:/app/credentials:ro
    restart: unless-stopped

# Start services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down
Cloud Deployment
Google Cloud Run
1. Build and Push
# Authenticate
gcloud auth configure-docker

# Build
docker build -t gcr.io/YOUR_PROJECT/email-assistant:latest .

# Push
docker push gcr.io/YOUR_PROJECT/email-assistant:latest
2. Deploy
gcloud run deploy email-assistant \
  --image gcr.io/YOUR_PROJECT/email-assistant:latest \
  --platform managed \
  --region us-central1 \
  --allow-unauthenticated \
  --set-env-vars "GOOGLE_API_KEY=xxx"
3. Cloud Scheduler
gcloud scheduler jobs create http email-digest \
  --schedule="0 8 * * *" \
  --uri="https://your-service-url.run.app/api/refresh" \
  --http-method=POST \
  --time-zone="America/New_York"
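Because the service is deployed with `--allow-unauthenticated`, it is worth guarding `/api/refresh` so only the scheduler can trigger it. One simple approach (a sketch, not the project's actual auth; the header name and secret handling are assumptions) is a shared-secret header compared in constant time:

```python
# Verify a shared-secret header on scheduler-triggered requests (sketch).
import hmac

def is_authorized(headers: dict, secret: str) -> bool:
    """Constant-time check of an X-Refresh-Token header (hypothetical name)."""
    supplied = headers.get("X-Refresh-Token", "")
    return hmac.compare_digest(supplied, secret)
```

Cloud Scheduler jobs can attach custom headers to the request; the secret itself should come from Secret Manager or an environment variable, not source code.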
Railway / Render
Both platforms support automatic deployment from GitHub:
1. Connect GitHub repository
2. Set environment variables in dashboard
3. Deploy automatically on push
| Variable | Required | Description |
|---|---|---|
| GOOGLE_API_KEY | Yes | Gemini API key |
| PORT | No | Server port (auto-set) |
Environment Configuration
Production Settings
# config/production.py
import os
class ProductionConfig:
    """Production configuration."""

    # Security
    DEBUG = False
    TESTING = False

    # API Keys (from environment)
    GOOGLE_API_KEY = os.environ["GOOGLE_API_KEY"]

    # Logging
    LOG_LEVEL = "WARNING"
    LOG_FILE = "/var/log/emailassistant/app.log"

    # Performance
    CACHE_SIZE = 2000
    REQUEST_TIMEOUT = 30

    # Rate limiting
    RATE_LIMIT = "100/hour"

Security Checklist
- API keys stored in environment variables or secret manager
- Gmail credentials secured with appropriate permissions
- HTTPS enabled for web interface
- Rate limiting configured
- Log files do not contain sensitive data
- Regular credential rotation scheduled
Monitoring
Health Check Endpoint
from datetime import datetime

from flask import jsonify

@app.route("/health")
def health():
    """Health check for container orchestrators."""
    checks = {
        "api": check_gemini_connection(),
        "gmail": check_gmail_connection(),
        "disk": check_disk_space(),
    }
    healthy = all(checks.values())
    status_code = 200 if healthy else 503
    return jsonify({
        "status": "healthy" if healthy else "unhealthy",
        "checks": checks,
        "timestamp": datetime.now().isoformat()
    }), status_code

Alerting
Set up alerts for:
- Script execution failures
- API rate limit errors
- Low cache hit rates
- High error rates
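Absent a full monitoring stack, a lightweight way to wire these up is a small helper that formats an alert and posts it to a webhook. This is a sketch; the webhook URL and message shape are assumptions:

```python
# Format and send a simple webhook alert (sketch).
import json
import urllib.request

def format_alert(source: str, message: str) -> dict:
    """Build a minimal alert payload."""
    return {"text": f"[EmailAssistant] {source}: {message}"}

def send_alert(webhook_url: str, source: str, message: str) -> None:
    """POST the alert as JSON to a webhook (e.g. a Slack-style incoming hook)."""
    payload = json.dumps(format_alert(source, message)).encode()
    req = urllib.request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)
```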
Backup and Recovery
Data Backup
#!/bin/bash
# backup.sh - Run daily
BACKUP_DIR="/backups/emailassistant"
DATE=$(date +%Y%m%d)

# Backup data directory
tar -czf "$BACKUP_DIR/data-$DATE.tar.gz" /app/data/

# Keep only last 30 days
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +30 -delete
Recovery Procedure
1. Stop the service
2. Restore data directory from backup
3. Verify Gmail credentials
4. Restart the service
5. Run manual refresh to verify
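Before restoring in step 2, it can be worth confirming the archive is readable and actually contains the data directory. A small sketch using the standard library (the expected member prefix is an assumption based on the backup script above):

```python
# Sanity-check a backup archive before restoring it (sketch).
import tarfile

def verify_backup(path: str, expected_prefix: str = "app/data") -> bool:
    """True if the tarball opens cleanly and contains the data directory."""
    try:
        with tarfile.open(path, "r:gz") as tar:
            return any(m.name.lstrip("./").startswith(expected_prefix)
                       for m in tar.getmembers())
    except (tarfile.TarError, OSError):
        return False
```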
Scaling Considerations
Horizontal Scaling
            ┌─────────────────┐
            │  Load Balancer  │
            └────────┬────────┘
                     │
     ┌───────────────┼───────────────┐
     ▼               ▼               ▼
┌───────────┐  ┌───────────┐  ┌───────────┐
│ Instance 1│  │ Instance 2│  │ Instance N│
└─────┬─────┘  └─────┬─────┘  └─────┬─────┘
      └──────────────┼──────────────┘
                     ▼
            ┌─────────────────┐
            │  Shared Cache   │
            │    (Redis)      │
            └─────────────────┘
Redis for Shared Cache
import json
import os

import redis

redis_client = redis.from_url(os.environ.get("REDIS_URL"))

def get_cached(key: str) -> dict | None:
    """Get from Redis cache."""
    data = redis_client.get(key)
    return json.loads(data) if data else None

def set_cached(key: str, value: dict, ttl: int = 3600):
    """Set in Redis cache with TTL."""
    redis_client.setex(key, ttl, json.dumps(value))
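These helpers are typically used in a cache-aside pattern: check the cache, fall back to the expensive call, then populate. The shape is independent of the backend, so the sketch below uses a plain dict standing in for Redis (an illustration only; in production the dict accesses would be the `get_cached`/`set_cached` calls above):

```python
# Cache-aside pattern with a dict standing in for Redis (illustration only).
_cache: dict = {}

def cached_fetch(key: str, fetch_fn):
    """Return the cached value for `key`, computing and storing it on a miss."""
    if key in _cache:
        return _cache[key]
    value = fetch_fn()
    _cache[key] = value
    return value
```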