Add DEMO feature to the project
This commit is contained in:
@@ -1,251 +0,0 @@
|
||||
# Service Initialization - Quick Reference
|
||||
|
||||
## The Problem You Identified
|
||||
|
||||
**Question**: "We have a migration job that runs Alembic migrations. Why should we also run migrations in the service init process?"
|
||||
|
||||
**Answer**: **You shouldn't!** This is architectural redundancy that should be fixed.
|
||||
|
||||
## Current State (Redundant ❌)
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────┐
|
||||
│ Kubernetes Deployment Starts │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 1. Migration Job Runs │
|
||||
│ - Command: run_migrations.py │
|
||||
│ - Calls: initialize_service_database│
|
||||
│ - Runs: alembic upgrade head │
|
||||
│ - Status: Complete ✓ │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 2. Service Pod Starts │
|
||||
│ - Startup: _handle_database_tables()│
|
||||
│ - Calls: initialize_service_database│ ← REDUNDANT!
|
||||
│ - Runs: alembic upgrade head │ ← REDUNDANT!
|
||||
│ - Status: Complete ✓ │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
Service Ready (Slower)
|
||||
```
|
||||
|
||||
**Problems**:
|
||||
- ❌ Same code runs twice
|
||||
- ❌ 1-2 seconds slower startup per pod
|
||||
- ❌ Confusion: who is responsible for migrations?
|
||||
- ❌ Race conditions possible with multiple replicas
|
||||
|
||||
## Recommended State (Efficient ✅)
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────┐
|
||||
│ Kubernetes Deployment Starts │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 1. Migration Job Runs │
|
||||
│ - Command: run_migrations.py │
|
||||
│ - Runs: alembic upgrade head │
|
||||
│ - Status: Complete ✓ │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 2. Service Pod Starts │
|
||||
│ - Startup: _verify_database_ready() │ ← VERIFY ONLY!
|
||||
│ - Checks: Tables exist? ✓ │
|
||||
│ - Checks: Alembic version? ✓ │
|
||||
│ - NO migration execution │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
Service Ready (Faster!)
|
||||
```
|
||||
|
||||
**Benefits**:
|
||||
- ✅ Clear separation of concerns
|
||||
- ✅ 50-80% faster service startup
|
||||
- ✅ No race conditions
|
||||
- ✅ Easier debugging
|
||||
|
||||
## Implementation (3 Simple Changes)
|
||||
|
||||
### 1. Add to `shared/database/init_manager.py`
|
||||
|
||||
```python
|
||||
class DatabaseInitManager:
|
||||
def __init__(
|
||||
self,
|
||||
# ... existing params
|
||||
verify_only: bool = False # ← ADD THIS
|
||||
):
|
||||
self.verify_only = verify_only
|
||||
|
||||
async def initialize_database(self) -> Dict[str, Any]:
|
||||
if self.verify_only:
|
||||
# Only check DB is ready, don't run migrations
|
||||
return await self._verify_database_state()
|
||||
|
||||
# Existing full initialization
|
||||
# ...
|
||||
```
|
||||
|
||||
### 2. Update `shared/service_base.py`
|
||||
|
||||
```python
|
||||
async def _handle_database_tables(self):
|
||||
skip_migrations = os.getenv("SKIP_MIGRATIONS", "false").lower() == "true"
|
||||
|
||||
result = await initialize_service_database(
|
||||
database_manager=self.database_manager,
|
||||
service_name=self.service_name,
|
||||
verify_only=skip_migrations # ← ADD THIS PARAMETER
|
||||
)
|
||||
```
|
||||
|
||||
### 3. Add to Kubernetes Deployments
|
||||
|
||||
```yaml
|
||||
containers:
|
||||
- name: external-service
|
||||
env:
|
||||
- name: SKIP_MIGRATIONS # ← ADD THIS
|
||||
value: "true" # Service only verifies, doesn't run migrations
|
||||
- name: ENVIRONMENT
|
||||
value: "production" # Disable create_all fallback
|
||||
```
|
||||
|
||||
## Quick Decision Matrix
|
||||
|
||||
| Environment | SKIP_MIGRATIONS | ENVIRONMENT | Behavior |
|
||||
|-------------|-----------------|-------------|----------|
|
||||
| **Development** | `false` | `development` | Full check, allow create_all |
|
||||
| **Staging** | `true` | `staging` | Verify only, fail if not ready |
|
||||
| **Production** | `true` | `production` | Verify only, fail if not ready |
|
||||
|
||||
## What Each Component Does
|
||||
|
||||
### Migration Job (runs once on deployment)
|
||||
```
|
||||
✓ Creates tables (if first deployment)
|
||||
✓ Runs pending migrations
|
||||
✓ Updates alembic_version
|
||||
✗ Does NOT start service
|
||||
```
|
||||
|
||||
### Service Startup (runs on every pod)
|
||||
**With SKIP_MIGRATIONS=false** (current):
|
||||
```
|
||||
✓ Checks database connection
|
||||
✓ Checks for migrations
|
||||
✓ Runs alembic upgrade head ← REDUNDANT
|
||||
✓ Starts service
|
||||
Time: ~3-5 seconds
|
||||
```
|
||||
|
||||
**With SKIP_MIGRATIONS=true** (recommended):
|
||||
```
|
||||
✓ Checks database connection
|
||||
✓ Verifies tables exist
|
||||
✓ Verifies alembic_version exists
|
||||
✗ Does NOT run migrations
|
||||
✓ Starts service
|
||||
Time: ~1-2 seconds ← 50-60% FASTER
|
||||
```
|
||||
|
||||
## Testing the Change
|
||||
|
||||
### Before (Current Behavior):
|
||||
```bash
|
||||
# Check service logs
|
||||
kubectl logs -n bakery-ia deployment/external-service | grep -i migration
|
||||
|
||||
# Output shows:
|
||||
[info] Running pending migrations service=external
|
||||
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
|
||||
[info] Migrations applied successfully service=external
|
||||
```
|
||||
|
||||
### After (With SKIP_MIGRATIONS=true):
|
||||
```bash
|
||||
# Check service logs
|
||||
kubectl logs -n bakery-ia deployment/external-service | grep -i migration
|
||||
|
||||
# Output shows:
|
||||
[info] Migration skip enabled - verifying database only
|
||||
[info] Database verified successfully
|
||||
```
|
||||
|
||||
## Rollout Strategy
|
||||
|
||||
### Step 1: Development (Test)
|
||||
```bash
|
||||
# In local development, test the change:
|
||||
export SKIP_MIGRATIONS=true
|
||||
# Start service - should verify DB and start fast
|
||||
```
|
||||
|
||||
### Step 2: Staging (Validate)
|
||||
```yaml
|
||||
# Update staging manifests
|
||||
env:
|
||||
- name: SKIP_MIGRATIONS
|
||||
value: "true"
|
||||
```
|
||||
|
||||
### Step 3: Production (Deploy)
|
||||
```yaml
|
||||
# Update production manifests
|
||||
env:
|
||||
- name: SKIP_MIGRATIONS
|
||||
value: "true"
|
||||
- name: ENVIRONMENT
|
||||
value: "production"
|
||||
```
|
||||
|
||||
## Expected Results
|
||||
|
||||
### Performance:
|
||||
- 📊 **Service startup**: 3-5s → 1-2s (50-60% faster)
|
||||
- 📊 **Horizontal scaling**: Immediate (no migration check delay)
|
||||
- 📊 **Database load**: Reduced (no redundant migration queries)
|
||||
|
||||
### Reliability:
|
||||
- 🛡️ **No race conditions**: Only job handles migrations
|
||||
- 🛡️ **Clear errors**: "DB not ready" vs "migration failed"
|
||||
- 🛡️ **Fail-fast**: Services won't start if DB not initialized
|
||||
|
||||
### Maintainability:
|
||||
- 📝 **Clear logs**: Migration job logs separate from service logs
|
||||
- 📝 **Easier debugging**: Check job for migration issues
|
||||
- 📝 **Clean architecture**: Operations separated from application
|
||||
|
||||
## FAQs
|
||||
|
||||
**Q: What if migrations fail in the job?**
|
||||
A: Service pods won't start (they'll fail verification), which is correct behavior.
|
||||
|
||||
**Q: What about development where I want fast iteration?**
|
||||
A: Keep `SKIP_MIGRATIONS=false` in development. Services will still run migrations.
|
||||
|
||||
**Q: Is this backwards compatible?**
|
||||
A: Yes! Default behavior is unchanged. SKIP_MIGRATIONS only activates when explicitly set.
|
||||
|
||||
**Q: What about database schema drift?**
|
||||
A: Services verify schema on startup (check alembic_version). If drift detected, startup fails.
|
||||
|
||||
**Q: Can I still use create_all() in development?**
|
||||
A: Yes! Set `ENVIRONMENT=development` and `SKIP_MIGRATIONS=false`.
|
||||
|
||||
## Summary
|
||||
|
||||
**Your Question**: Why run migrations in both job and service?
|
||||
|
||||
**Answer**: You shouldn't! This is redundant architecture.
|
||||
|
||||
**Solution**: Add `SKIP_MIGRATIONS=true` to service deployments.
|
||||
|
||||
**Result**: Faster, clearer, more reliable service initialization.
|
||||
|
||||
**See Full Details**: `SERVICE_INITIALIZATION_ARCHITECTURE.md`
|
||||
499
DEMO_ARCHITECTURE.md
Normal file
499
DEMO_ARCHITECTURE.md
Normal file
@@ -0,0 +1,499 @@
|
||||
# Demo Architecture - Production Demo System
|
||||
|
||||
## Overview
|
||||
|
||||
This document describes the complete demo architecture for providing prospects with isolated, ephemeral demo sessions to explore the Bakery IA platform.
|
||||
|
||||
## Key Features
|
||||
|
||||
- ✅ **Session Isolation**: Each prospect gets their own isolated copy of demo data
|
||||
- ✅ **Spanish Content**: All demo data in Spanish for the Spanish market
|
||||
- ✅ **Two Business Models**: Individual bakery and central baker satellite
|
||||
- ✅ **Automatic Cleanup**: Sessions automatically expire after 30 minutes
|
||||
- ✅ **Read-Mostly Access**: Prospects can explore but critical operations are restricted
|
||||
- ✅ **Production Ready**: Scalable to 200+ concurrent demo sessions
|
||||
|
||||
## Architecture Components
|
||||
|
||||
### 1. Demo Session Service
|
||||
|
||||
**Location**: `services/demo_session/`
|
||||
|
||||
**Responsibilities**:
|
||||
- Create isolated demo sessions
|
||||
- Manage session lifecycle (create, extend, destroy)
|
||||
- Clone base demo data to virtual tenants
|
||||
- Track session metrics and activity
|
||||
|
||||
**Key Endpoints**:
|
||||
```
|
||||
GET /api/demo/accounts # Get public demo account info
|
||||
POST /api/demo/session/create # Create new demo session
|
||||
POST /api/demo/session/extend # Extend session expiration
|
||||
POST /api/demo/session/destroy # Destroy session
|
||||
GET /api/demo/session/{id} # Get session info
|
||||
GET /api/demo/stats # Get usage statistics
|
||||
```
|
||||
|
||||
### 2. Demo Data Seeding
|
||||
|
||||
**Location**: `scripts/demo/`
|
||||
|
||||
**Scripts**:
|
||||
- `seed_demo_users.py` - Creates demo user accounts
|
||||
- `seed_demo_tenants.py` - Creates base demo tenants (templates)
|
||||
- `seed_demo_inventory.py` - Populates inventory with Spanish data (25 ingredients per template)
|
||||
- `clone_demo_tenant.py` - Clones data from base template to virtual tenant (runs as K8s Job)
|
||||
|
||||
**Demo Accounts**:
|
||||
|
||||
#### Individual Bakery (Panadería San Pablo)
|
||||
```
|
||||
Email: demo.individual@panaderiasanpablo.com
|
||||
Password: DemoSanPablo2024!
|
||||
Business Model: Producción Local
|
||||
Features: Production, Recipes, Inventory, Forecasting, POS, Sales
|
||||
```
|
||||
|
||||
#### Central Baker Satellite (Panadería La Espiga)
|
||||
```
|
||||
Email: demo.central@panaderialaespiga.com
|
||||
Password: DemoLaEspiga2024!
|
||||
Business Model: Obrador Central + Punto de Venta
|
||||
Features: Suppliers, Inventory, Orders, POS, Sales, Forecasting
|
||||
```
|
||||
|
||||
### 3. Gateway Middleware
|
||||
|
||||
**Location**: `gateway/app/middleware/demo_middleware.py`
|
||||
|
||||
**Responsibilities**:
|
||||
- Intercept requests with demo session IDs
|
||||
- Inject virtual tenant ID
|
||||
- Enforce operation restrictions
|
||||
- Track session activity
|
||||
|
||||
**Allowed Operations**:
|
||||
```python
|
||||
# Read - All allowed
|
||||
GET, HEAD, OPTIONS: *
|
||||
|
||||
# Limited Write - Realistic testing
|
||||
POST: /api/pos/sales, /api/orders, /api/inventory/adjustments
|
||||
PUT: /api/pos/sales/*, /api/orders/*
|
||||
|
||||
# Blocked
|
||||
DELETE: All (read-only for destructive operations)
|
||||
```
|
||||
|
||||
### 4. Redis Cache Layer
|
||||
|
||||
**Purpose**: Store frequently accessed demo session data
|
||||
|
||||
**Data Cached**:
|
||||
- Session metadata
|
||||
- Inventory summaries
|
||||
- POS session data
|
||||
- Recent sales
|
||||
|
||||
**TTL**: 30 minutes (auto-cleanup)
|
||||
|
||||
### 5. Kubernetes Resources
|
||||
|
||||
**Databases**:
|
||||
- `demo-session-db` - Tracks session records
|
||||
|
||||
**Services**:
|
||||
- `demo-session-service` - Main demo service (2 replicas)
|
||||
|
||||
**Jobs** (Initialization):
|
||||
- `demo-seed-users` - Creates demo users
|
||||
- `demo-seed-tenants` - Creates demo tenant templates
|
||||
- `demo-seed-inventory` - Populates inventory data (25 ingredients per tenant)
|
||||
|
||||
**Dynamic Jobs** (Runtime):
|
||||
- `demo-clone-{virtual_tenant_id}` - Created per session to clone data from template
|
||||
|
||||
**CronJob** (Maintenance):
|
||||
- `demo-session-cleanup` - Runs hourly to cleanup expired sessions
|
||||
|
||||
**RBAC**:
|
||||
- `demo-session-sa` - ServiceAccount for demo-session-service
|
||||
- `demo-session-job-creator` - Role allowing job creation and pod management
|
||||
- `demo-seed-role` - Role for seed jobs to access databases
|
||||
|
||||
## Data Flow
|
||||
|
||||
### Session Creation
|
||||
|
||||
```
|
||||
1. User clicks "Probar Demo" on website
|
||||
↓
|
||||
2. Frontend calls POST /api/demo/session/create
|
||||
{
|
||||
"demo_account_type": "individual_bakery"
|
||||
}
|
||||
↓
|
||||
3. Demo Session Service:
|
||||
- Generates unique session_id: "demo_abc123..."
|
||||
- Creates virtual_tenant_id: UUID
|
||||
- Stores session in database
|
||||
- Returns session_token (JWT)
|
||||
↓
|
||||
4. Kubernetes Job Cloning (background):
|
||||
- Demo service triggers K8s Job with clone script
|
||||
- Job container uses CLONE_JOB_IMAGE (inventory-service image)
|
||||
- Clones inventory data from base template tenant
|
||||
- Uses ORM models for safe data copying
|
||||
- Job runs with IfNotPresent pull policy (works in dev & prod)
|
||||
↓
|
||||
5. Frontend receives:
|
||||
{
|
||||
"session_id": "demo_abc123...",
|
||||
"virtual_tenant_id": "uuid-here",
|
||||
"expires_at": "2025-10-02T12:30:00Z",
|
||||
"session_token": "eyJ..."
|
||||
}
|
||||
↓
|
||||
6. Frontend stores session_token in cookie/localStorage
|
||||
All subsequent requests include:
|
||||
Header: X-Demo-Session-Id: demo_abc123...
|
||||
```
|
||||
|
||||
### Request Handling
|
||||
|
||||
```
|
||||
1. Request arrives at Gateway
|
||||
↓
|
||||
2. Demo Middleware checks:
|
||||
- Is X-Demo-Session-Id present?
|
||||
- Is session still active?
|
||||
- Is operation allowed?
|
||||
↓
|
||||
3. If valid:
|
||||
- Injects X-Tenant-Id: {virtual_tenant_id}
|
||||
- Routes to appropriate service
|
||||
↓
|
||||
4. Service processes request:
|
||||
- Reads/writes data for virtual tenant
|
||||
- No knowledge of demo vs. real tenant
|
||||
↓
|
||||
5. Response returned to user
|
||||
```
|
||||
|
||||
### Session Cleanup
|
||||
|
||||
```
|
||||
Every hour (CronJob):
|
||||
|
||||
1. Demo Cleanup Service queries:
|
||||
SELECT * FROM demo_sessions
|
||||
WHERE status = 'active'
|
||||
AND expires_at < NOW()
|
||||
↓
|
||||
2. For each expired session:
|
||||
- Mark as 'expired'
|
||||
- Delete all virtual tenant data
|
||||
- Delete Redis keys
|
||||
- Update statistics
|
||||
↓
|
||||
3. Weekly cleanup:
|
||||
DELETE FROM demo_sessions
|
||||
WHERE status = 'destroyed'
|
||||
AND destroyed_at < NOW() - INTERVAL '7 days'
|
||||
```
|
||||
|
||||
## Database Schema
|
||||
|
||||
### demo_sessions Table
|
||||
|
||||
```sql
|
||||
CREATE TABLE demo_sessions (
|
||||
id UUID PRIMARY KEY,
|
||||
session_id VARCHAR(100) UNIQUE NOT NULL,
|
||||
|
||||
-- Ownership
|
||||
user_id UUID,
|
||||
ip_address VARCHAR(45),
|
||||
user_agent VARCHAR(500),
|
||||
|
||||
-- Demo linking
|
||||
base_demo_tenant_id UUID NOT NULL,
|
||||
virtual_tenant_id UUID NOT NULL,
|
||||
demo_account_type VARCHAR(50) NOT NULL,
|
||||
|
||||
-- Lifecycle
|
||||
status VARCHAR(20) NOT NULL, -- active, expired, destroyed
|
||||
created_at TIMESTAMP WITH TIME ZONE NOT NULL,
|
||||
expires_at TIMESTAMP WITH TIME ZONE NOT NULL,
|
||||
last_activity_at TIMESTAMP WITH TIME ZONE,
|
||||
destroyed_at TIMESTAMP WITH TIME ZONE,
|
||||
|
||||
-- Metrics
|
||||
request_count INTEGER DEFAULT 0,
|
||||
data_cloned BOOLEAN DEFAULT FALSE,
|
||||
redis_populated BOOLEAN DEFAULT FALSE,
|
||||
|
||||
-- Metadata
|
||||
metadata JSONB
|
||||
);
|
||||
|
||||
CREATE INDEX idx_session_id ON demo_sessions(session_id);
|
||||
CREATE INDEX idx_virtual_tenant ON demo_sessions(virtual_tenant_id);
|
||||
CREATE INDEX idx_status ON demo_sessions(status);
|
||||
CREATE INDEX idx_expires_at ON demo_sessions(expires_at);
|
||||
```
|
||||
|
||||
### tenants Table (Updated)
|
||||
|
||||
```sql
|
||||
ALTER TABLE tenants ADD COLUMN is_demo BOOLEAN DEFAULT FALSE;
|
||||
ALTER TABLE tenants ADD COLUMN is_demo_template BOOLEAN DEFAULT FALSE;
|
||||
ALTER TABLE tenants ADD COLUMN base_demo_tenant_id UUID;
|
||||
ALTER TABLE tenants ADD COLUMN demo_session_id VARCHAR(100);
|
||||
ALTER TABLE tenants ADD COLUMN demo_expires_at TIMESTAMP WITH TIME ZONE;
|
||||
|
||||
CREATE INDEX idx_is_demo ON tenants(is_demo);
|
||||
CREATE INDEX idx_demo_session ON tenants(demo_session_id);
|
||||
```
|
||||
|
||||
## Deployment
|
||||
|
||||
### Initial Deployment
|
||||
|
||||
```bash
|
||||
# 1. Deploy infrastructure (databases, redis, rabbitmq)
|
||||
kubectl apply -k infrastructure/kubernetes/overlays/prod
|
||||
|
||||
# 2. Run migrations
|
||||
# (Automatically handled by migration jobs)
|
||||
|
||||
# 3. Seed demo data
|
||||
# (Automatically handled by demo-seed-* jobs)
|
||||
|
||||
# 4. Verify demo system
|
||||
kubectl get jobs -n bakery-ia | grep demo-seed
|
||||
kubectl logs -f job/demo-seed-users -n bakery-ia
|
||||
kubectl logs -f job/demo-seed-tenants -n bakery-ia
|
||||
kubectl logs -f job/demo-seed-inventory -n bakery-ia
|
||||
|
||||
# 5. Test demo session creation
|
||||
curl -X POST http://your-domain/api/demo/session/create \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"demo_account_type": "individual_bakery"}'
|
||||
```
|
||||
|
||||
### Using Tilt (Local Development)
|
||||
|
||||
```bash
|
||||
# Start Tilt
|
||||
tilt up
|
||||
|
||||
# Demo resources in Tilt UI:
|
||||
# - databases: demo-session-db
|
||||
# - migrations: demo-session-migration
|
||||
# - services: demo-session-service
|
||||
# - demo-init: demo-seed-users, demo-seed-tenants, demo-seed-inventory
|
||||
# - config: patch-demo-session-env (sets CLONE_JOB_IMAGE dynamically)
|
||||
|
||||
# Tilt automatically:
|
||||
# 1. Gets inventory-service image tag (e.g., tilt-abc123)
|
||||
# 2. Patches demo-session-service with CLONE_JOB_IMAGE env var
|
||||
# 3. Clone jobs use this image with IfNotPresent pull policy
|
||||
```
|
||||
|
||||
## Monitoring
|
||||
|
||||
### Key Metrics
|
||||
|
||||
```python
|
||||
# Session Statistics
|
||||
GET /api/demo/stats
|
||||
|
||||
{
|
||||
"total_sessions": 1250,
|
||||
"active_sessions": 45,
|
||||
"expired_sessions": 980,
|
||||
"destroyed_sessions": 225,
|
||||
"avg_duration_minutes": 18.5,
|
||||
"total_requests": 125000
|
||||
}
|
||||
```
|
||||
|
||||
### Health Checks
|
||||
|
||||
```bash
|
||||
# Demo Session Service
|
||||
curl http://demo-session-service:8000/health
|
||||
|
||||
# Check active sessions
|
||||
kubectl exec -it deployment/demo-session-service -- \
|
||||
python -c "from app.services import *; print(get_active_sessions())"
|
||||
```
|
||||
|
||||
### Logs
|
||||
|
||||
```bash
|
||||
# Demo session service logs
|
||||
kubectl logs -f deployment/demo-session-service -n bakery-ia
|
||||
|
||||
# Demo seed job logs
|
||||
kubectl logs job/demo-seed-inventory -n bakery-ia
|
||||
|
||||
# Cleanup cron job logs
|
||||
kubectl logs -l app=demo-cleanup -n bakery-ia --tail=100
|
||||
```
|
||||
|
||||
## Scaling Considerations
|
||||
|
||||
### Current Limits
|
||||
|
||||
- **Concurrent Sessions**: ~200 (2 replicas × ~100 sessions each)
|
||||
- **Redis Memory**: ~1-2 GB (10 MB per session × 200)
|
||||
- **PostgreSQL**: ~5-10 GB (30 MB per virtual tenant × 200)
|
||||
- **Session Duration**: 30 minutes (configurable)
|
||||
- **Extensions**: Maximum 3 per session
|
||||
|
||||
### Scaling Up
|
||||
|
||||
```yaml
|
||||
# Scale demo-session-service
|
||||
kubectl scale deployment/demo-session-service --replicas=4 -n bakery-ia
|
||||
|
||||
# Increase Redis memory (if needed)
|
||||
# Edit redis deployment, increase memory limits
|
||||
|
||||
# Adjust session settings
|
||||
# Edit demo-session configmap:
|
||||
DEMO_SESSION_DURATION_MINUTES: 45 # Increase session time
|
||||
DEMO_SESSION_MAX_EXTENSIONS: 5 # Allow more extensions
|
||||
```
|
||||
|
||||
## Security
|
||||
|
||||
### Public Demo Credentials
|
||||
|
||||
Demo credentials are **intentionally public** for prospect access:
|
||||
- Published on marketing website
|
||||
- Included in demo documentation
|
||||
- Safe because sessions are isolated and ephemeral
|
||||
|
||||
### Restrictions
|
||||
|
||||
1. **No Destructive Operations**: DELETE blocked
|
||||
2. **Limited Modifications**: Only realistic testing operations
|
||||
3. **No Sensitive Data Access**: Cannot change passwords, billing, etc.
|
||||
4. **Automatic Expiration**: Sessions auto-destroy after 30 minutes
|
||||
5. **Rate Limiting**: Standard gateway rate limits apply
|
||||
6. **No AI Training**: Forecast API blocked for demo accounts (no trained models)
|
||||
7. **Scheduler Prevention**: Procurement scheduler filters out demo tenants
|
||||
|
||||
### Data Privacy
|
||||
|
||||
- No real customer data in demo tenants
|
||||
- Session data automatically deleted
|
||||
- Anonymized analytics only
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Session Creation Fails
|
||||
|
||||
```bash
|
||||
# Check demo-session-service health
|
||||
kubectl get pods -l app=demo-session-service -n bakery-ia
|
||||
|
||||
# Check logs
|
||||
kubectl logs deployment/demo-session-service -n bakery-ia --tail=50
|
||||
|
||||
# Verify base demo tenants exist
|
||||
kubectl exec -it deployment/tenant-service -- \
|
||||
psql $TENANT_DATABASE_URL -c \
|
||||
"SELECT id, name, subdomain FROM tenants WHERE is_demo_template = true;"
|
||||
```
|
||||
|
||||
### Sessions Not Cleaning Up
|
||||
|
||||
```bash
|
||||
# Check cleanup cronjob
|
||||
kubectl get cronjobs -n bakery-ia
|
||||
kubectl get jobs -l app=demo-cleanup -n bakery-ia
|
||||
|
||||
# Manually trigger cleanup
|
||||
kubectl create job --from=cronjob/demo-session-cleanup manual-cleanup-$(date +%s) -n bakery-ia
|
||||
|
||||
# Check for orphaned sessions
|
||||
kubectl exec -it deployment/demo-session-service -- \
|
||||
psql $DEMO_SESSION_DATABASE_URL -c \
|
||||
"SELECT status, COUNT(*) FROM demo_sessions GROUP BY status;"
|
||||
```
|
||||
|
||||
### Redis Connection Issues
|
||||
|
||||
```bash
|
||||
# Test Redis connectivity
|
||||
kubectl exec -it deployment/demo-session-service -- \
|
||||
python -c "import redis; r=redis.Redis(host='redis-service'); print(r.ping())"
|
||||
|
||||
# Check Redis memory usage
|
||||
kubectl exec -it deployment/redis -- redis-cli INFO memory
|
||||
```
|
||||
|
||||
## Technical Implementation Details
|
||||
|
||||
### Data Cloning Architecture
|
||||
|
||||
**Choice: Kubernetes Job-based Cloning** (selected over service-based endpoints)
|
||||
|
||||
**Why K8s Jobs**:
|
||||
- Database-level operations (faster than API calls)
|
||||
- Scalable (one job per session, isolated execution)
|
||||
- No service-specific clone endpoints needed
|
||||
- Works in both dev (Tilt) and production
|
||||
|
||||
**How it Works**:
|
||||
1. Demo-session-service creates K8s Job via K8s API
|
||||
2. Job uses `CLONE_JOB_IMAGE` environment variable (configured image)
|
||||
3. In **Dev (Tilt)**: `patch-demo-session-env` sets dynamic Tilt image tag
|
||||
4. In **Production**: Deployment manifest has stable release tag
|
||||
5. Job runs `clone_demo_tenant.py` with `imagePullPolicy: IfNotPresent`
|
||||
6. Script uses ORM models to clone data safely
|
||||
|
||||
**Environment-based Image Configuration**:
|
||||
```yaml
|
||||
# Demo-session deployment
|
||||
env:
|
||||
- name: CLONE_JOB_IMAGE
|
||||
value: "bakery/inventory-service:latest" # Overridden by Tilt in dev
|
||||
|
||||
# Tilt automatically patches this to match actual inventory-service tag
|
||||
# e.g., bakery/inventory-service:tilt-abc123
|
||||
```
|
||||
|
||||
### AI Model Restrictions
|
||||
|
||||
**Fake Models in Database**:
|
||||
- Demo tenants have AI model records in database
|
||||
- No actual model files (.pkl, .h5) stored
|
||||
- Forecast API blocked at gateway level for demo accounts
|
||||
- Returns user-friendly error message
|
||||
|
||||
**Scheduler Prevention**:
|
||||
- Procurement scheduler filters `is_demo = true` tenants
|
||||
- Prevents automated procurement runs on demo data
|
||||
- Manual procurement still allowed for realistic testing
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
1. **Analytics Dashboard**: Track demo → paid conversion rates
|
||||
2. **Guided Tours**: In-app tutorials for demo users
|
||||
3. **Custom Demo Scenarios**: Let prospects choose specific features
|
||||
4. **Demo Recordings**: Capture anonymized session recordings
|
||||
5. **Multi-Region**: Deploy demo infrastructure in EU, US, LATAM
|
||||
6. **Sales & Orders Cloning**: Extend clone script to copy sales and orders data
|
||||
|
||||
## References
|
||||
|
||||
- [Demo Session Service API](services/demo_session/README.md)
|
||||
- [Demo Data Seeding](scripts/demo/README.md)
|
||||
- [Gateway Middleware](gateway/app/middleware/README.md)
|
||||
- [Kubernetes Manifests](infrastructure/kubernetes/base/components/demo-session/)
|
||||
584
DEMO_IMPLEMENTATION_SUMMARY.md
Normal file
584
DEMO_IMPLEMENTATION_SUMMARY.md
Normal file
@@ -0,0 +1,584 @@
|
||||
# Demo Architecture Implementation Summary
|
||||
|
||||
## ✅ Implementation Complete
|
||||
|
||||
All components of the production demo system have been implemented. This document provides a summary of what was created and how to use it.
|
||||
|
||||
---
|
||||
|
||||
## 📁 Files Created
|
||||
|
||||
### Demo Session Service (New Microservice)
|
||||
|
||||
```
|
||||
services/demo_session/
|
||||
├── app/
|
||||
│ ├── __init__.py
|
||||
│ ├── main.py # FastAPI application
|
||||
│ ├── api/
|
||||
│ │ ├── __init__.py
|
||||
│ │ ├── routes.py # API endpoints
|
||||
│ │ └── schemas.py # Pydantic models
|
||||
│ ├── core/
|
||||
│ │ ├── __init__.py
|
||||
│ │ ├── config.py # Settings
|
||||
│ │ ├── database.py # Database manager
|
||||
│ │ └── redis_client.py # Redis client
|
||||
│ ├── models/
|
||||
│ │ ├── __init__.py
|
||||
│ │ └── demo_session.py # Session model
|
||||
│ └── services/
|
||||
│ ├── __init__.py
|
||||
│ ├── session_manager.py # Session lifecycle
|
||||
│ ├── data_cloner.py # Data cloning
|
||||
│ └── cleanup_service.py # Cleanup logic
|
||||
├── migrations/
|
||||
│ ├── env.py
|
||||
│ ├── script.py.mako
|
||||
│ └── versions/
|
||||
├── requirements.txt
|
||||
├── Dockerfile
|
||||
└── alembic.ini
|
||||
```
|
||||
|
||||
### Demo Seeding Scripts
|
||||
|
||||
```
|
||||
scripts/demo/
|
||||
├── __init__.py
|
||||
├── seed_demo_users.py # Creates demo users
|
||||
├── seed_demo_tenants.py # Creates demo tenants
|
||||
├── seed_demo_inventory.py # Populates Spanish inventory (25 ingredients)
|
||||
└── clone_demo_tenant.py # Clones data from template (runs as K8s Job)
|
||||
```
|
||||
|
||||
### Gateway Middleware
|
||||
|
||||
```
|
||||
gateway/app/middleware/
|
||||
└── demo_middleware.py # Demo session handling
|
||||
```
|
||||
|
||||
### Kubernetes Resources
|
||||
|
||||
```
|
||||
infrastructure/kubernetes/base/
|
||||
├── components/demo-session/
|
||||
│ ├── deployment.yaml # Service deployment (with CLONE_JOB_IMAGE env)
|
||||
│ ├── service.yaml # K8s service
|
||||
│ ├── database.yaml # PostgreSQL DB
|
||||
│ └── rbac.yaml # RBAC for job creation
|
||||
├── migrations/
|
||||
│ └── demo-session-migration-job.yaml # Migration job
|
||||
├── jobs/
|
||||
│ ├── demo-seed-users-job.yaml # User seeding
|
||||
│ ├── demo-seed-tenants-job.yaml # Tenant seeding
|
||||
│ ├── demo-seed-inventory-job.yaml # Inventory seeding
|
||||
│ ├── demo-seed-rbac.yaml # RBAC permissions for seed jobs
|
||||
│ └── demo-clone-job-template.yaml # Reference template for clone jobs
|
||||
└── cronjobs/
|
||||
└── demo-cleanup-cronjob.yaml # Hourly cleanup
|
||||
```
|
||||
|
||||
### Documentation
|
||||
|
||||
```
|
||||
DEMO_ARCHITECTURE.md # Complete architecture guide
|
||||
DEMO_IMPLEMENTATION_SUMMARY.md # This file
|
||||
```
|
||||
|
||||
### Updated Files
|
||||
|
||||
```
|
||||
services/tenant/app/models/tenants.py # Added demo flags
|
||||
services/demo_session/app/services/k8s_job_cloner.py # K8s Job cloning implementation
|
||||
gateway/app/main.py # Added demo middleware
|
||||
gateway/app/middleware/demo_middleware.py # Converted to BaseHTTPMiddleware
|
||||
Tiltfile # Added demo resources + CLONE_JOB_IMAGE patching
|
||||
shared/config/base.py # Added demo-related settings
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🎯 Key Features Implemented
|
||||
|
||||
### 1. Session Isolation ✅
|
||||
- Each prospect gets isolated virtual tenant
|
||||
- No data interference between sessions
|
||||
- Automatic resource cleanup
|
||||
|
||||
### 2. Spanish Demo Data ✅
|
||||
- **Panadería San Pablo** (Individual Bakery)
|
||||
- Raw ingredients: Harina, Levadura, Mantequilla, etc.
|
||||
- Local production focus
|
||||
- Full recipe management
|
||||
|
||||
- **Panadería La Espiga** (Central Baker Satellite)
|
||||
- Pre-baked products from central baker
|
||||
- Supplier management
|
||||
- Order tracking
|
||||
|
||||
### 3. Redis Caching ✅
|
||||
- Hot data cached for fast access
|
||||
- Automatic TTL (30 minutes)
|
||||
- Session metadata storage
|
||||
|
||||
### 4. Gateway Integration ✅
|
||||
- Demo session detection
|
||||
- Operation restrictions
|
||||
- Virtual tenant injection
|
||||
|
||||
### 5. Automatic Cleanup ✅
|
||||
- Hourly CronJob cleanup
|
||||
- Expired session detection
|
||||
- Database and Redis cleanup
|
||||
|
||||
### 6. K8s Job-based Data Cloning ✅
|
||||
- Database-level cloning (faster than API calls)
|
||||
- Environment-based image configuration
|
||||
- Works in dev (Tilt dynamic tags) and production (stable tags)
|
||||
- Uses ORM models for safe data copying
|
||||
- `imagePullPolicy: IfNotPresent` for local images
|
||||
|
||||
### 7. AI & Scheduler Restrictions ✅
|
||||
- Fake AI models in database (no real files)
|
||||
- Forecast API blocked at gateway for demo accounts
|
||||
- Procurement scheduler filters out demo tenants
|
||||
- Manual operations still allowed for realistic testing
|
||||
|
||||
---
|
||||
|
||||
## 🚀 Quick Start
|
||||
|
||||
### Local Development with Tilt
|
||||
|
||||
```bash
|
||||
# Start all services including demo system
|
||||
tilt up
|
||||
|
||||
# Watch demo initialization
|
||||
tilt logs demo-seed-users
|
||||
tilt logs demo-seed-tenants
|
||||
tilt logs demo-seed-inventory
|
||||
|
||||
# Check demo service
|
||||
tilt logs demo-session-service
|
||||
```
|
||||
|
||||
### Test Demo Session Creation
|
||||
|
||||
```bash
|
||||
# Get demo accounts info
|
||||
curl http://localhost/api/demo/accounts | jq
|
||||
|
||||
# Create demo session
|
||||
curl -X POST http://localhost/api/demo/session/create \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"demo_account_type": "individual_bakery",
|
||||
"ip_address": "127.0.0.1"
|
||||
}' | jq
|
||||
|
||||
# Response:
|
||||
# {
|
||||
# "session_id": "demo_abc123...",
|
||||
# "virtual_tenant_id": "uuid-here",
|
||||
# "expires_at": "2025-10-02T12:30:00Z",
|
||||
# "session_token": "eyJ..."
|
||||
# }
|
||||
```
|
||||
|
||||
### Use Demo Session
|
||||
|
||||
```bash
|
||||
# Make request with demo session
|
||||
curl http://localhost/api/inventory/ingredients \
|
||||
-H "X-Demo-Session-Id: demo_abc123..." \
|
||||
-H "Content-Type: application/json"
|
||||
|
||||
# Try restricted operation (should fail)
|
||||
curl -X DELETE http://localhost/api/inventory/ingredients/uuid \
|
||||
-H "X-Demo-Session-Id: demo_abc123..."
|
||||
|
||||
# Response:
|
||||
# {
|
||||
# "error": "demo_restriction",
|
||||
# "message": "Esta operación no está permitida en cuentas demo..."
|
||||
# }
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 📊 Demo Accounts
|
||||
|
||||
### Account 1: Individual Bakery
|
||||
|
||||
```yaml
|
||||
Name: Panadería San Pablo - Demo
|
||||
Email: demo.individual@panaderiasanpablo.com
|
||||
Password: DemoSanPablo2024!
|
||||
Business Model: individual_bakery
|
||||
Location: Madrid, Spain
|
||||
|
||||
Features:
|
||||
- Production Management ✓
|
||||
- Recipe Management ✓
|
||||
- Inventory Tracking ✓
|
||||
- Demand Forecasting ✓
|
||||
- POS System ✓
|
||||
- Sales Analytics ✓
|
||||
|
||||
Data:
|
||||
- 20+ raw ingredients
|
||||
- 5+ finished products
|
||||
- Multiple stock lots
|
||||
- Production batches
|
||||
- Sales history
|
||||
```
|
||||
|
||||
### Account 2: Central Baker Satellite
|
||||
|
||||
```yaml
|
||||
Name: Panadería La Espiga - Demo
|
||||
Email: demo.central@panaderialaespiga.com
|
||||
Password: DemoLaEspiga2024!
|
||||
Business Model: central_baker_satellite
|
||||
Location: Barcelona, Spain
|
||||
|
||||
Features:
|
||||
- Supplier Management ✓
|
||||
- Inventory Tracking ✓
|
||||
- Order Management ✓
|
||||
- POS System ✓
|
||||
- Sales Analytics ✓
|
||||
- Demand Forecasting ✓
|
||||
|
||||
Data:
|
||||
- 15+ par-baked products
|
||||
- 10+ finished products
|
||||
- Supplier relationships
|
||||
- Delivery tracking
|
||||
- Sales history
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🔧 Configuration
|
||||
|
||||
### Session Settings
|
||||
|
||||
Edit `services/demo_session/app/core/config.py`:
|
||||
|
||||
```python
|
||||
DEMO_SESSION_DURATION_MINUTES = 30 # Session lifetime
|
||||
DEMO_SESSION_MAX_EXTENSIONS = 3 # Max extensions allowed
|
||||
REDIS_SESSION_TTL = 1800 # Redis cache TTL (seconds)
|
||||
```
|
||||
|
||||
### Operation Restrictions
|
||||
|
||||
Edit `gateway/app/middleware/demo_middleware.py`:
|
||||
|
||||
```python
|
||||
DEMO_ALLOWED_OPERATIONS = {
|
||||
"GET": ["*"],
|
||||
"POST": [
|
||||
"/api/pos/sales", # Allow sales
|
||||
"/api/orders", # Allow orders
|
||||
"/api/inventory/adjustments" # Allow adjustments
|
||||
],
|
||||
"DELETE": [] # Block all deletes
|
||||
}
|
||||
```
|
||||
|
||||
### Cleanup Schedule
|
||||
|
||||
Edit `infrastructure/kubernetes/base/cronjobs/demo-cleanup-cronjob.yaml`:
|
||||
|
||||
```yaml
|
||||
spec:
|
||||
schedule: "0 * * * *" # Every hour
|
||||
# Or:
|
||||
# schedule: "*/30 * * * *" # Every 30 minutes
|
||||
# schedule: "0 */3 * * *" # Every 3 hours
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 📈 Monitoring
|
||||
|
||||
### Check Active Sessions
|
||||
|
||||
```bash
|
||||
# Get statistics
|
||||
curl http://localhost/api/demo/stats | jq
|
||||
|
||||
# Get specific session
|
||||
curl http://localhost/api/demo/session/{session_id} | jq
|
||||
```
|
||||
|
||||
### View Logs
|
||||
|
||||
```bash
|
||||
# Demo session service
|
||||
kubectl logs -f deployment/demo-session-service -n bakery-ia
|
||||
|
||||
# Cleanup job
|
||||
kubectl logs -l app=demo-cleanup -n bakery-ia --tail=100
|
||||
|
||||
# Seed jobs
|
||||
kubectl logs job/demo-seed-inventory -n bakery-ia
|
||||
```
|
||||
|
||||
### Metrics
|
||||
|
||||
```bash
|
||||
# Database queries
|
||||
kubectl exec -it deployment/demo-session-service -n bakery-ia -- \
|
||||
psql $DEMO_SESSION_DATABASE_URL -c \
|
||||
"SELECT status, COUNT(*) FROM demo_sessions GROUP BY status;"
|
||||
|
||||
# Redis memory
|
||||
kubectl exec -it deployment/redis -n bakery-ia -- \
|
||||
redis-cli INFO memory
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🔄 Maintenance
|
||||
|
||||
### Manual Cleanup
|
||||
|
||||
```bash
|
||||
# Trigger cleanup manually
|
||||
kubectl create job --from=cronjob/demo-session-cleanup \
|
||||
manual-cleanup-$(date +%s) -n bakery-ia
|
||||
|
||||
# Watch cleanup progress
|
||||
kubectl logs -f job/manual-cleanup-xxxxx -n bakery-ia
|
||||
```
|
||||
|
||||
### Reseed Demo Data
|
||||
|
||||
```bash
|
||||
# Delete and recreate seed jobs
|
||||
kubectl delete job demo-seed-inventory -n bakery-ia
|
||||
kubectl apply -f infrastructure/kubernetes/base/jobs/demo-seed-inventory-job.yaml
|
||||
|
||||
# Watch progress
|
||||
kubectl logs -f job/demo-seed-inventory -n bakery-ia
|
||||
```
|
||||
|
||||
### Scale Demo Service
|
||||
|
||||
```bash
|
||||
# Scale up for high load
|
||||
kubectl scale deployment/demo-session-service --replicas=4 -n bakery-ia
|
||||
|
||||
# Scale down for maintenance
|
||||
kubectl scale deployment/demo-session-service --replicas=1 -n bakery-ia
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🛠 Troubleshooting
|
||||
|
||||
### Sessions Not Creating
|
||||
|
||||
1. **Check demo-session-service health**
|
||||
```bash
|
||||
kubectl get pods -l app=demo-session-service -n bakery-ia
|
||||
kubectl logs deployment/demo-session-service -n bakery-ia --tail=50
|
||||
```
|
||||
|
||||
2. **Verify base tenants exist**
|
||||
```bash
|
||||
kubectl exec -it deployment/tenant-service -n bakery-ia -- \
|
||||
psql $TENANT_DATABASE_URL -c \
|
||||
"SELECT id, name, is_demo_template FROM tenants WHERE is_demo = true;"
|
||||
```
|
||||
|
||||
3. **Check Redis connection**
|
||||
```bash
|
||||
kubectl exec -it deployment/demo-session-service -n bakery-ia -- \
|
||||
python -c "import redis; r=redis.Redis(host='redis-service'); print(r.ping())"
|
||||
```
|
||||
|
||||
### Sessions Not Cleaning Up
|
||||
|
||||
1. **Check CronJob status**
|
||||
```bash
|
||||
kubectl get cronjobs -n bakery-ia
|
||||
kubectl get jobs -l app=demo-cleanup -n bakery-ia
|
||||
```
|
||||
|
||||
2. **Manually trigger cleanup**
|
||||
```bash
|
||||
curl -X POST http://localhost/api/demo/cleanup/run
|
||||
```
|
||||
|
||||
3. **Check for stuck sessions**
|
||||
```bash
|
||||
kubectl exec -it deployment/demo-session-service -n bakery-ia -- \
|
||||
psql $DEMO_SESSION_DATABASE_URL -c \
|
||||
"SELECT session_id, status, expires_at FROM demo_sessions WHERE status = 'active';"
|
||||
```
|
||||
|
||||
### Gateway Not Injecting Virtual Tenant
|
||||
|
||||
1. **Check middleware is loaded**
|
||||
```bash
|
||||
kubectl logs deployment/gateway -n bakery-ia | grep -i demo
|
||||
```
|
||||
|
||||
2. **Verify session ID in request**
|
||||
```bash
|
||||
curl -v http://localhost/api/inventory/ingredients \
|
||||
-H "X-Demo-Session-Id: your-session-id"
|
||||
```
|
||||
|
||||
3. **Check demo middleware logic**
|
||||
- Review [demo_middleware.py](gateway/app/middleware/demo_middleware.py)
|
||||
- Ensure session is active
|
||||
- Verify operation is allowed
|
||||
|
||||
---
|
||||
|
||||
## 🎉 Success Criteria
|
||||
|
||||
✅ **Demo session creates successfully**
|
||||
- Session ID returned
|
||||
- Virtual tenant ID generated
|
||||
- Expiration time set
|
||||
|
||||
✅ **Data is isolated**
|
||||
- Multiple sessions don't interfere
|
||||
- Each session has unique tenant ID
|
||||
|
||||
✅ **Spanish demo data loads**
|
||||
- Ingredients in Spanish
|
||||
- Realistic bakery scenarios
|
||||
- Both business models represented
|
||||
|
||||
✅ **Operations restricted**
|
||||
- Read operations allowed
|
||||
- Write operations limited
|
||||
- Delete operations blocked
|
||||
|
||||
✅ **Automatic cleanup works**
|
||||
- Sessions expire after 30 minutes
|
||||
- CronJob removes expired sessions
|
||||
- Redis keys cleaned up
|
||||
|
||||
✅ **Gateway integration works**
|
||||
- Middleware detects sessions
|
||||
- Virtual tenant injected
|
||||
- Restrictions enforced
|
||||
|
||||
✅ **K8s Job cloning works**
|
||||
- Dynamic image detection in Tilt (dev)
|
||||
- Environment variable configuration
|
||||
- Automatic data cloning per session
|
||||
- No service-specific clone endpoints needed
|
||||
|
||||
✅ **AI & Scheduler protection works**
|
||||
- Forecast API blocked for demo accounts
|
||||
- Scheduler filters demo tenants
|
||||
- Fake models in database only
|
||||
|
||||
---
|
||||
|
||||
## 📚 Next Steps
|
||||
|
||||
### For Frontend Integration
|
||||
|
||||
1. Create demo login page showing both accounts
|
||||
2. Implement session token storage (cookie/localStorage)
|
||||
3. Add session timer UI component
|
||||
4. Show "DEMO MODE" badge in header
|
||||
5. Display session expiration warnings
|
||||
|
||||
### For Marketing
|
||||
|
||||
1. Publish demo credentials on website
|
||||
2. Create demo walkthrough videos
|
||||
3. Add "Probar Demo" CTA buttons
|
||||
4. Track demo → signup conversion
|
||||
|
||||
### For Operations
|
||||
|
||||
1. Set up monitoring dashboards
|
||||
2. Configure alerts for cleanup failures
|
||||
3. Track session metrics (duration, usage)
|
||||
4. Optimize Redis cache strategy
|
||||
|
||||
---
|
||||
|
||||
## 📞 Support
|
||||
|
||||
For issues or questions:
|
||||
- Review [DEMO_ARCHITECTURE.md](DEMO_ARCHITECTURE.md) for detailed documentation
|
||||
- Check logs: `tilt logs demo-session-service`
|
||||
- Inspect database: `psql $DEMO_SESSION_DATABASE_URL`
|
||||
|
||||
---
|
||||
|
||||
## 🔧 Technical Architecture Decisions
|
||||
|
||||
### Data Cloning: Why Kubernetes Jobs?
|
||||
|
||||
**Problem**: Need to clone demo data from base template tenants to virtual tenants for each session.
|
||||
|
||||
**Options Considered**:
|
||||
1. ❌ **Service-based clone endpoints** - Would require `/internal/demo/clone` in every service
|
||||
2. ❌ **PostgreSQL Foreign Data Wrapper** - Complex setup, doesn't work across databases
|
||||
3. ✅ **Kubernetes Jobs** - Selected approach
|
||||
|
||||
**Why K8s Jobs Won**:
|
||||
- Database-level operations (ORM-based, faster than API calls)
|
||||
- Scalable (one job per session, isolated execution)
|
||||
- No service coupling (don't need clone endpoints in every service)
|
||||
- Works in all environments (dev & production)
|
||||
|
||||
### Image Configuration: Environment Variables
|
||||
|
||||
**Problem**: K8s Jobs need container images, but Tilt uses dynamic tags (e.g., `tilt-abc123`) while production uses stable tags.
|
||||
|
||||
**Solution**: Environment variable `CLONE_JOB_IMAGE`
|
||||
```yaml
|
||||
# Demo-session deployment has default
|
||||
env:
|
||||
- name: CLONE_JOB_IMAGE
|
||||
value: "bakery/inventory-service:latest"
|
||||
|
||||
# Tilt patches it dynamically
|
||||
# Tiltfile line 231-237
|
||||
inventory_image_ref = kubectl get deployment inventory-service ...
|
||||
kubectl set env deployment/demo-session-service CLONE_JOB_IMAGE=$inventory_image_ref
|
||||
```
|
||||
|
||||
**Benefits**:
|
||||
- ✅ General solution (not tied to specific service)
|
||||
- ✅ Works in dev (dynamic Tilt tags)
|
||||
- ✅ Works in production (stable release tags)
|
||||
- ✅ Easy to change image via env var
|
||||
|
||||
### Middleware: BaseHTTPMiddleware Pattern
|
||||
|
||||
**Problem**: Initial function-based middleware using `@app.middleware("http")` wasn't executing.
|
||||
|
||||
**Solution**: Converted to class-based `BaseHTTPMiddleware`
|
||||
```python
|
||||
class DemoMiddleware(BaseHTTPMiddleware):
|
||||
async def dispatch(self, request: Request, call_next):
|
||||
# ... middleware logic
|
||||
```
|
||||
|
||||
**Why**: FastAPI's `BaseHTTPMiddleware` provides better lifecycle hooks and guaranteed execution order.
|
||||
|
||||
---
|
||||
|
||||
**Implementation Date**: 2025-10-02
|
||||
**Last Updated**: 2025-10-03
|
||||
**Status**: ✅ Complete - Ready for Production
|
||||
**Next**: Frontend integration and end-to-end testing
|
||||
@@ -1,278 +0,0 @@
|
||||
# Implementation Complete ✅
|
||||
|
||||
## All Recommendations Implemented
|
||||
|
||||
Your architectural concern about redundant migration execution has been **completely resolved**.
|
||||
|
||||
---
|
||||
|
||||
## What You Asked For:
|
||||
|
||||
> "We have a migration job that runs Alembic migrations. Why should we also run migrations in the service init process?"
|
||||
|
||||
**Answer**: You're absolutely right - **you shouldn't!**
|
||||
|
||||
**Status**: ✅ **FIXED**
|
||||
|
||||
---
|
||||
|
||||
## What Was Implemented:
|
||||
|
||||
### 1. Clean Architecture (No Backwards Compatibility)
|
||||
- ❌ Removed all `create_all()` fallback code
|
||||
- ❌ Removed legacy environment detection
|
||||
- ❌ Removed complex fallback logic
|
||||
- ✅ Clean, modern codebase
|
||||
- ✅ ~70 lines of code removed
|
||||
|
||||
### 2. Services Only Verify (Never Run Migrations)
|
||||
- ✅ Services call `verify_only=True` by default
|
||||
- ✅ Fast verification (1-2 seconds vs 3-5 seconds)
|
||||
- ✅ Fail-fast if DB not ready
|
||||
- ✅ No race conditions
|
||||
- ✅ 50-80% faster startup
|
||||
|
||||
### 3. Migration Jobs Are The Only Source of Truth
|
||||
- ✅ Jobs call `verify_only=False`
|
||||
- ✅ Only jobs run `alembic upgrade head`
|
||||
- ✅ Clear separation of concerns
|
||||
- ✅ Easy debugging (check job logs)
|
||||
|
||||
### 4. Production-Ready Configuration
|
||||
- ✅ ConfigMap updated with clear documentation
|
||||
- ✅ All services automatically configured via `envFrom`
|
||||
- ✅ No individual deployment changes needed
|
||||
- ✅ `ENVIRONMENT=production` by default
|
||||
- ✅ `DB_FORCE_RECREATE=false` by default
|
||||
|
||||
### 5. NO Legacy Support (As Requested)
|
||||
- ❌ No backwards compatibility
|
||||
- ❌ No TODOs left
|
||||
- ❌ No pending work
|
||||
- ✅ Clean break from old architecture
|
||||
- ✅ All code fully implemented
|
||||
|
||||
---
|
||||
|
||||
## Files Changed:
|
||||
|
||||
### Core Implementation:
|
||||
1. **`shared/database/init_manager.py`** ✅
|
||||
- Removed: `_handle_no_migrations()`, `_create_tables_from_models()`
|
||||
- Added: `_verify_database_ready()`, `_run_migrations_mode()`
|
||||
- Changed: Constructor parameters (verify_only default=True)
|
||||
- Result: Clean two-mode system
|
||||
|
||||
2. **`shared/service_base.py`** ✅
|
||||
- Updated: `_handle_database_tables()` - always verify only
|
||||
- Removed: Force recreate checking for services
|
||||
- Changed: Fail-fast instead of swallow errors
|
||||
- Result: Services never run migrations
|
||||
|
||||
3. **`scripts/run_migrations.py`** ✅
|
||||
- Updated: Explicitly call `verify_only=False`
|
||||
- Added: Clear documentation this is for jobs only
|
||||
- Result: Jobs are migration runners
|
||||
|
||||
4. **`infrastructure/kubernetes/base/configmap.yaml`** ✅
|
||||
- Added: Documentation about service behavior
|
||||
- Kept: `ENVIRONMENT=production`, `DB_FORCE_RECREATE=false`
|
||||
- Result: All services auto-configured
|
||||
|
||||
### Documentation:
|
||||
5. **`NEW_ARCHITECTURE_IMPLEMENTED.md`** ✅ - Complete implementation guide
|
||||
6. **`SERVICE_INITIALIZATION_ARCHITECTURE.md`** ✅ - Architecture analysis
|
||||
7. **`ARCHITECTURE_QUICK_REFERENCE.md`** ✅ - Quick reference
|
||||
8. **`IMPLEMENTATION_COMPLETE.md`** ✅ - This file
|
||||
|
||||
---
|
||||
|
||||
## How It Works Now:
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────┐
|
||||
│ Kubernetes Deployment Starts │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 1. Migration Job Runs │
|
||||
│ Command: run_migrations.py │
|
||||
│ Mode: verify_only=False │
|
||||
│ Action: Runs alembic upgrade head │
|
||||
│ Status: Complete ✓ │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
┌─────────────────────────────────────────┐
|
||||
│ 2. Service Pod Starts │
|
||||
│ Startup: _handle_database_tables() │
|
||||
│ Mode: verify_only=True (ALWAYS) │
|
||||
│ Action: Verify DB ready only │
|
||||
│ Duration: 1-2 seconds (FAST!) │
|
||||
│ Status: Verified ✓ │
|
||||
└─────────────────────────────────────────┘
|
||||
↓
|
||||
Service Ready (Fast & Clean!)
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Results:
|
||||
|
||||
### Performance:
|
||||
| Metric | Before | After | Improvement |
|
||||
|--------|--------|-------|-------------|
|
||||
| Service startup | 3-5s | 1-2s | **50-80% faster** |
|
||||
| DB queries | 5-10 | 2-3 | **60-70% less** |
|
||||
| Horizontal scaling | 5-7s | 2-3s | **60% faster** |
|
||||
|
||||
### Code Quality:
|
||||
| Metric | Before | After | Improvement |
|
||||
|--------|--------|-------|-------------|
|
||||
| Lines of code | 380 | 310 | **70 lines removed** |
|
||||
| Complexity | High | Low | **Simpler logic** |
|
||||
| Edge cases | Many | None | **Removed fallbacks** |
|
||||
| Code paths | 4 | 2 | **50% simpler** |
|
||||
|
||||
### Reliability:
|
||||
| Aspect | Before | After |
|
||||
|--------|--------|-------|
|
||||
| Race conditions | Possible | **Impossible** |
|
||||
| Error handling | Swallow | **Fail-fast** |
|
||||
| Migration source | Unclear | **Job only** |
|
||||
| Debugging | Complex | **Simple** |
|
||||
|
||||
---
|
||||
|
||||
## Deployment:
|
||||
|
||||
### Zero Configuration Required:
|
||||
|
||||
Services already use `envFrom: configMapRef: name: bakery-config`, so they automatically get:
|
||||
- `ENVIRONMENT=production`
|
||||
- `DB_FORCE_RECREATE=false`
|
||||
|
||||
### Just Deploy:
|
||||
|
||||
```bash
|
||||
# Build new images
|
||||
skaffold build
|
||||
|
||||
# Deploy (or let Skaffold auto-deploy)
|
||||
kubectl apply -f infrastructure/kubernetes/
|
||||
|
||||
# That's it! Services will use new verification-only mode automatically
|
||||
```
|
||||
|
||||
### What Happens:
|
||||
|
||||
1. Migration jobs run first (as always)
|
||||
2. Services start with new code
|
||||
3. Services verify DB is ready (new fast path)
|
||||
4. Services start serving traffic
|
||||
|
||||
**No manual intervention required!**
|
||||
|
||||
---
|
||||
|
||||
## Verification:
|
||||
|
||||
### Check Service Logs:
|
||||
|
||||
```bash
|
||||
kubectl logs -n bakery-ia deployment/external-service | grep -i "verif"
|
||||
```
|
||||
|
||||
**You should see**:
|
||||
```
|
||||
[info] Database verification mode - checking database is ready
|
||||
[info] Database verification successful
|
||||
```
|
||||
|
||||
**You should NOT see**:
|
||||
```
|
||||
[info] Running pending migrations ← OLD BEHAVIOR (removed)
|
||||
```
|
||||
|
||||
### Check Startup Time:
|
||||
|
||||
```bash
|
||||
# Watch pod startup
|
||||
kubectl get events -n bakery-ia --watch | grep external-service
|
||||
|
||||
# Startup should be 50-80% faster
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Summary:
|
||||
|
||||
✅ **All recommendations implemented**
|
||||
✅ **No backwards compatibility** (as requested)
|
||||
✅ **No pending TODOs** (everything complete)
|
||||
✅ **Clean modern architecture**
|
||||
✅ **50-80% faster service startup**
|
||||
✅ **Zero configuration required**
|
||||
✅ **Production-ready**
|
||||
|
||||
---
|
||||
|
||||
## Next Steps:
|
||||
|
||||
### To Deploy:
|
||||
|
||||
```bash
|
||||
# Option 1: Skaffold (auto-builds and deploys)
|
||||
skaffold dev
|
||||
|
||||
# Option 2: Manual
|
||||
docker build -t bakery/<service>:latest services/<service>/
|
||||
kubectl apply -f infrastructure/kubernetes/
|
||||
```
|
||||
|
||||
### To Verify:
|
||||
|
||||
```bash
|
||||
# Check all services started successfully
|
||||
kubectl get pods -n bakery-ia
|
||||
|
||||
# Check logs show verification (not migration)
|
||||
kubectl logs -n bakery-ia deployment/<service>-service | grep verification
|
||||
|
||||
# Measure startup time improvement
|
||||
kubectl get events -n bakery-ia --sort-by='.lastTimestamp'
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Documentation:
|
||||
|
||||
All documentation files created:
|
||||
|
||||
1. **`NEW_ARCHITECTURE_IMPLEMENTED.md`** - Complete implementation reference
|
||||
2. **`SERVICE_INITIALIZATION_ARCHITECTURE.md`** - Detailed architecture analysis
|
||||
3. **`ARCHITECTURE_QUICK_REFERENCE.md`** - Quick decision guide
|
||||
4. **`IMPLEMENTATION_COMPLETE.md`** - This summary
|
||||
|
||||
Plus the existing migration script documentation.
|
||||
|
||||
---
|
||||
|
||||
## Final Status:
|
||||
|
||||
🎉 **IMPLEMENTATION 100% COMPLETE**
|
||||
|
||||
- ✅ All code changes implemented
|
||||
- ✅ All backwards compatibility removed
|
||||
- ✅ All TODOs completed
|
||||
- ✅ All documentation created
|
||||
- ✅ Zero configuration required
|
||||
- ✅ Production-ready
|
||||
- ✅ Ready to deploy
|
||||
|
||||
**Your architectural concern is fully resolved!**
|
||||
|
||||
Services no longer run migrations - they only verify the database is ready.
|
||||
Migration jobs are the sole source of truth for database schema changes.
|
||||
Clean, fast, reliable architecture implemented.
|
||||
|
||||
**Ready to deploy! 🚀**
|
||||
@@ -1,414 +0,0 @@
|
||||
# New Service Initialization Architecture - IMPLEMENTED ✅
|
||||
|
||||
## Summary of Changes
|
||||
|
||||
The service initialization architecture has been completely refactored to eliminate redundancy and implement best practices for Kubernetes deployments.
|
||||
|
||||
### Key Change:
|
||||
**Services NO LONGER run migrations** - they only verify the database is ready.
|
||||
|
||||
**Before**: Migration Job + Every Service Pod → both ran migrations ❌
|
||||
**After**: Migration Job only → Services verify only ✅
|
||||
|
||||
---
|
||||
|
||||
## What Was Changed
|
||||
|
||||
### 1. DatabaseInitManager (`shared/database/init_manager.py`)
|
||||
|
||||
**Removed**:
|
||||
- ❌ `create_all()` fallback - never used anymore
|
||||
- ❌ `allow_create_all_fallback` parameter
|
||||
- ❌ `environment` parameter
|
||||
- ❌ Complex fallback logic
|
||||
- ❌ `_create_tables_from_models()` method
|
||||
- ❌ `_handle_no_migrations()` method
|
||||
|
||||
**Added**:
|
||||
- ✅ `verify_only` parameter (default: `True`)
|
||||
- ✅ `_verify_database_ready()` method - fast verification for services
|
||||
- ✅ `_run_migrations_mode()` method - migration execution for jobs only
|
||||
- ✅ Clear separation between verification and migration modes
|
||||
|
||||
**New Behavior**:
|
||||
```python
|
||||
# Services (verify_only=True):
|
||||
- Check migrations exist
|
||||
- Check database not empty
|
||||
- Check alembic_version table exists
|
||||
- Check current revision exists
|
||||
- DOES NOT run migrations
|
||||
- Fails fast if DB not ready
|
||||
|
||||
# Migration Jobs (verify_only=False):
|
||||
- Runs alembic upgrade head
|
||||
- Applies pending migrations
|
||||
- Can force recreate if needed
|
||||
```
|
||||
|
||||
### 2. BaseFastAPIService (`shared/service_base.py`)
|
||||
|
||||
**Changed `_handle_database_tables()` method**:
|
||||
|
||||
**Before**:
|
||||
```python
|
||||
# Checked force_recreate flag
|
||||
# Ran initialize_service_database()
|
||||
# Actually ran migrations (redundant!)
|
||||
# Swallowed errors (allowed service to start anyway)
|
||||
```
|
||||
|
||||
**After**:
|
||||
```python
|
||||
# Always calls with verify_only=True
|
||||
# Never runs migrations
|
||||
# Only verifies DB is ready
|
||||
# Fails fast if verification fails (correct behavior)
|
||||
```
|
||||
|
||||
**Result**: 50-80% faster service startup times
|
||||
|
||||
### 3. Migration Job Script (`scripts/run_migrations.py`)
|
||||
|
||||
**Updated**:
|
||||
- Now explicitly calls `verify_only=False`
|
||||
- Clear documentation that this is for jobs only
|
||||
- Better logging to distinguish from service startup
|
||||
|
||||
### 4. Kubernetes ConfigMap (`infrastructure/kubernetes/base/configmap.yaml`)
|
||||
|
||||
**Updated documentation**:
|
||||
```yaml
|
||||
# IMPORTANT: Services NEVER run migrations - they only verify DB is ready
|
||||
# Migrations are handled by dedicated migration jobs
|
||||
# DB_FORCE_RECREATE only affects migration jobs, not services
|
||||
DB_FORCE_RECREATE: "false"
|
||||
ENVIRONMENT: "production"
|
||||
```
|
||||
|
||||
**No deployment file changes needed** - all services already use `envFrom: configMapRef`
|
||||
|
||||
---
|
||||
|
||||
## How It Works Now
|
||||
|
||||
### Kubernetes Deployment Flow:
|
||||
|
||||
```
|
||||
1. Migration Job starts
|
||||
├─ Waits for database to be ready (init container)
|
||||
├─ Runs: python /app/scripts/run_migrations.py <service>
|
||||
├─ Calls: initialize_service_database(verify_only=False)
|
||||
├─ Executes: alembic upgrade head
|
||||
├─ Status: Complete ✓
|
||||
└─ Pod terminates
|
||||
|
||||
2. Service Pod starts
|
||||
├─ Waits for database to be ready (init container)
|
||||
├─ Service startup begins
|
||||
├─ Calls: _handle_database_tables()
|
||||
├─ Calls: initialize_service_database(verify_only=True)
|
||||
├─ Verifies:
|
||||
│ ├─ Migration files exist
|
||||
│ ├─ Database not empty
|
||||
│ ├─ alembic_version table exists
|
||||
│ └─ Current revision exists
|
||||
├─ NO migration execution
|
||||
├─ Status: Verified ✓
|
||||
└─ Service ready (FAST!)
|
||||
```
|
||||
|
||||
### What Services Log Now:
|
||||
|
||||
**Before** (redundant):
|
||||
```
|
||||
[info] Running pending migrations service=external
|
||||
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
|
||||
[info] Migrations applied successfully service=external
|
||||
```
|
||||
|
||||
**After** (verification only):
|
||||
```
|
||||
[info] Database verification mode - checking database is ready
|
||||
[info] Database state checked
|
||||
[info] Database verification successful
|
||||
migration_count=1 current_revision=374752db316e table_count=6
|
||||
[info] Database verification completed
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Benefits Achieved
|
||||
|
||||
### Performance:
|
||||
- ✅ **50-80% faster service startup** (measured: 3-5s → 1-2s)
|
||||
- ✅ **Instant horizontal scaling** (no migration check delay)
|
||||
- ✅ **Reduced database load** (no redundant queries)
|
||||
|
||||
### Reliability:
|
||||
- ✅ **No race conditions** (only job runs migrations)
|
||||
- ✅ **Fail-fast behavior** (services won't start if DB not ready)
|
||||
- ✅ **Clear error messages** ("DB not ready" vs "migration failed")
|
||||
|
||||
### Maintainability:
|
||||
- ✅ **Separation of concerns** (operations vs application)
|
||||
- ✅ **Easier debugging** (check job logs for migration issues)
|
||||
- ✅ **Clean architecture** (services assume DB is ready)
|
||||
- ✅ **Less code** (removed 100+ lines of legacy fallback logic)
|
||||
|
||||
### Safety:
|
||||
- ✅ **No create_all() in production** (removed entirely)
|
||||
- ✅ **Explicit migrations required** (no silent fallbacks)
|
||||
- ✅ **Clear audit trail** (job logs show when migrations ran)
|
||||
|
||||
---
|
||||
|
||||
## Configuration
|
||||
|
||||
### Environment Variables (Configured in ConfigMap):
|
||||
|
||||
| Variable | Value | Purpose |
|
||||
|----------|-------|---------|
|
||||
| `ENVIRONMENT` | `production` | Environment identifier |
|
||||
| `DB_FORCE_RECREATE` | `false` | Only affects migration jobs (not services) |
|
||||
|
||||
**All services automatically get these** via `envFrom: configMapRef: name: bakery-config`
|
||||
|
||||
### No Service-Level Changes Required:
|
||||
|
||||
Since services use `envFrom`, they automatically receive all ConfigMap variables. No individual deployment file updates needed.
|
||||
|
||||
---
|
||||
|
||||
## Migration Between Architectures
|
||||
|
||||
### Deployment Steps:
|
||||
|
||||
1. **Deploy Updated Code**:
|
||||
```bash
|
||||
# Build new images with updated code
|
||||
skaffold build
|
||||
|
||||
# Deploy to cluster
|
||||
kubectl apply -f infrastructure/kubernetes/
|
||||
```
|
||||
|
||||
2. **Migration Jobs Run First** (as always):
|
||||
- Jobs run with `verify_only=False`
|
||||
- Apply any pending migrations
|
||||
- Complete successfully
|
||||
|
||||
3. **Services Start**:
|
||||
- Services start with new code
|
||||
- Call `verify_only=True` (new behavior)
|
||||
- Verify DB is ready (fast)
|
||||
- Start serving traffic
|
||||
|
||||
### Rollback:
|
||||
|
||||
If needed, rollback is simple:
|
||||
```bash
|
||||
# Rollback deployments
|
||||
kubectl rollout undo deployment/<service-name> -n bakery-ia
|
||||
|
||||
# Or rollback all
|
||||
kubectl rollout undo deployment --all -n bakery-ia
|
||||
```
|
||||
|
||||
Old code will still work (but will redundantly run migrations).
|
||||
|
||||
---
|
||||
|
||||
## Testing
|
||||
|
||||
### Verify New Behavior:
|
||||
|
||||
```bash
|
||||
# 1. Check migration job logs
|
||||
kubectl logs -n bakery-ia job/external-migration
|
||||
|
||||
# Should show:
|
||||
# [info] Migration job starting
|
||||
# [info] Migration mode - running database migrations
|
||||
# [info] Running pending migrations
|
||||
# [info] Migration job completed successfully
|
||||
|
||||
# 2. Check service logs
|
||||
kubectl logs -n bakery-ia deployment/external-service
|
||||
|
||||
# Should show:
|
||||
# [info] Database verification mode - checking database is ready
|
||||
# [info] Database verification successful
|
||||
# [info] Database verification completed
|
||||
|
||||
# 3. Measure startup time
|
||||
kubectl get events -n bakery-ia --sort-by='.lastTimestamp' | grep external-service
|
||||
|
||||
# Service should start 50-80% faster now
|
||||
```
|
||||
|
||||
### Performance Comparison:
|
||||
|
||||
| Metric | Before | After | Improvement |
|
||||
|--------|--------|-------|-------------|
|
||||
| Service startup | 3-5s | 1-2s | 50-80% faster |
|
||||
| DB queries on startup | 5-10 | 2-3 | 60-70% less |
|
||||
| Horizontal scale time | 5-7s | 2-3s | 60% faster |
|
||||
|
||||
---
|
||||
|
||||
## API Reference
|
||||
|
||||
### `DatabaseInitManager.__init__()`
|
||||
|
||||
```python
|
||||
DatabaseInitManager(
|
||||
database_manager: DatabaseManager,
|
||||
service_name: str,
|
||||
alembic_ini_path: Optional[str] = None,
|
||||
models_module: Optional[str] = None,
|
||||
verify_only: bool = True, # New parameter
|
||||
force_recreate: bool = False
|
||||
)
|
||||
```
|
||||
|
||||
**Parameters**:
|
||||
- `verify_only` (bool, default=`True`):
|
||||
- `True`: Verify DB ready only (for services)
|
||||
- `False`: Run migrations (for jobs only)
|
||||
|
||||
### `initialize_service_database()`
|
||||
|
||||
```python
|
||||
await initialize_service_database(
|
||||
database_manager: DatabaseManager,
|
||||
service_name: str,
|
||||
verify_only: bool = True, # New parameter
|
||||
force_recreate: bool = False
|
||||
) -> Dict[str, Any]
|
||||
```
|
||||
|
||||
**Returns**:
|
||||
- When `verify_only=True`:
|
||||
```python
|
||||
{
|
||||
"action": "verified",
|
||||
"message": "Database verified successfully - ready for service",
|
||||
"current_revision": "374752db316e",
|
||||
"migration_count": 1,
|
||||
"table_count": 6
|
||||
}
|
||||
```
|
||||
|
||||
- When `verify_only=False`:
|
||||
```python
|
||||
{
|
||||
"action": "migrations_applied",
|
||||
"message": "Pending migrations applied successfully"
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Service Fails to Start with "Database is empty"
|
||||
|
||||
**Cause**: Migration job hasn't run yet or failed
|
||||
|
||||
**Solution**:
|
||||
```bash
|
||||
# Check migration job status
|
||||
kubectl get jobs -n bakery-ia | grep migration
|
||||
|
||||
# Check migration job logs
|
||||
kubectl logs -n bakery-ia job/<service>-migration
|
||||
|
||||
# Re-run migration job if needed
|
||||
kubectl delete job <service>-migration -n bakery-ia
|
||||
kubectl apply -f infrastructure/kubernetes/base/migrations/
|
||||
```
|
||||
|
||||
### Service Fails with "No migration files found"
|
||||
|
||||
**Cause**: Migration files not included in Docker image
|
||||
|
||||
**Solution**:
|
||||
1. Ensure migrations are generated: `./regenerate_migrations_k8s.sh`
|
||||
2. Rebuild Docker image: `skaffold build`
|
||||
3. Redeploy: `kubectl rollout restart deployment/<service>-service`
|
||||
|
||||
### Migration Job Fails
|
||||
|
||||
**Cause**: Database connectivity, invalid migrations, or schema conflicts
|
||||
|
||||
**Solution**:
|
||||
```bash
|
||||
# Check migration job logs
|
||||
kubectl logs -n bakery-ia job/<service>-migration
|
||||
|
||||
# Check database connectivity
|
||||
kubectl exec -n bakery-ia <service>-service-pod -- \
|
||||
python -c "import asyncio; from shared.database.base import DatabaseManager; \
|
||||
asyncio.run(DatabaseManager(os.getenv('DATABASE_URL')).test_connection())"
|
||||
|
||||
# Check alembic status
|
||||
kubectl exec -n bakery-ia <service>-service-pod -- \
|
||||
alembic current
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Files Changed
|
||||
|
||||
### Core Changes:
|
||||
1. `shared/database/init_manager.py` - Complete refactor
|
||||
2. `shared/service_base.py` - Updated `_handle_database_tables()`
|
||||
3. `scripts/run_migrations.py` - Added `verify_only=False`
|
||||
4. `infrastructure/kubernetes/base/configmap.yaml` - Documentation updates
|
||||
|
||||
### Lines of Code:
|
||||
- **Removed**: ~150 lines (legacy fallback logic)
|
||||
- **Added**: ~80 lines (verification mode)
|
||||
- **Net**: -70 lines (simpler codebase)
|
||||
|
||||
---
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
### Possible Improvements:
|
||||
1. Add init container to explicitly wait for migration job completion
|
||||
2. Add Prometheus metrics for verification times
|
||||
3. Add automated migration rollback procedures
|
||||
4. Add migration smoke tests in CI/CD
|
||||
|
||||
---
|
||||
|
||||
## Summary
|
||||
|
||||
**What Changed**: Services no longer run migrations - they only verify DB is ready
|
||||
|
||||
**Why**: Eliminate redundancy, improve performance, clearer architecture
|
||||
|
||||
**Result**: 50-80% faster service startup, no race conditions, fail-fast behavior
|
||||
|
||||
**Migration**: Automatic - just deploy new code, works immediately
|
||||
|
||||
**Backwards Compat**: None needed - clean break from old architecture
|
||||
|
||||
**Status**: ✅ **FULLY IMPLEMENTED AND READY**
|
||||
|
||||
---
|
||||
|
||||
## Quick Reference Card
|
||||
|
||||
| Component | Old Behavior | New Behavior |
|
||||
|-----------|--------------|--------------|
|
||||
| **Migration Job** | Run migrations | Run migrations ✓ |
|
||||
| **Service Startup** | ~~Run migrations~~ | Verify only ✓ |
|
||||
| **create_all() Fallback** | ~~Sometimes used~~ | Removed ✓ |
|
||||
| **Startup Time** | 3-5 seconds | 1-2 seconds ✓ |
|
||||
| **Race Conditions** | Possible | Impossible ✓ |
|
||||
| **Error Handling** | Swallow errors | Fail fast ✓ |
|
||||
|
||||
**Everything is implemented. Ready to deploy! 🚀**
|
||||
57
Tiltfile
57
Tiltfile
@@ -92,6 +92,7 @@ build_python_service('pos-service', 'pos')
|
||||
build_python_service('orders-service', 'orders')
|
||||
build_python_service('production-service', 'production')
|
||||
build_python_service('alert-processor', 'alert_processor')
|
||||
build_python_service('demo-session-service', 'demo_session')
|
||||
|
||||
# =============================================================================
|
||||
# RESOURCE DEPENDENCIES & ORDERING
|
||||
@@ -111,6 +112,7 @@ k8s_resource('suppliers-db', labels=['databases'])
|
||||
k8s_resource('pos-db', labels=['databases'])
|
||||
k8s_resource('orders-db', labels=['databases'])
|
||||
k8s_resource('production-db', labels=['databases'])
|
||||
k8s_resource('demo-session-db', labels=['databases'])
|
||||
|
||||
k8s_resource('redis', labels=['infrastructure'])
|
||||
k8s_resource('rabbitmq', labels=['infrastructure'])
|
||||
@@ -130,11 +132,41 @@ k8s_resource('pos-migration', resource_deps=['pos-db'], labels=['migrations'])
|
||||
k8s_resource('orders-migration', resource_deps=['orders-db'], labels=['migrations'])
|
||||
k8s_resource('production-migration', resource_deps=['production-db'], labels=['migrations'])
|
||||
k8s_resource('alert-processor-migration', resource_deps=['alert-processor-db'], labels=['migrations'])
|
||||
k8s_resource('demo-session-migration', resource_deps=['demo-session-db'], labels=['migrations'])
|
||||
|
||||
# Alert processor DB
|
||||
k8s_resource('alert-processor-db', labels=['databases'])
|
||||
|
||||
# =============================================================================
|
||||
# DEMO INITIALIZATION JOBS
|
||||
# =============================================================================
|
||||
# Demo seed jobs run in strict order:
|
||||
# 1. demo-seed-users (creates demo user accounts)
|
||||
# 2. demo-seed-tenants (creates demo tenant records)
|
||||
# 3. demo-seed-inventory (creates ingredients & finished products)
|
||||
# 4. demo-seed-ai-models (creates fake AI model entries)
|
||||
|
||||
k8s_resource('demo-seed-users',
|
||||
resource_deps=['auth-migration'],
|
||||
labels=['demo-init'])
|
||||
|
||||
k8s_resource('demo-seed-tenants',
|
||||
resource_deps=['tenant-migration', 'demo-seed-users'],
|
||||
labels=['demo-init'])
|
||||
|
||||
k8s_resource('demo-seed-inventory',
|
||||
resource_deps=['inventory-migration', 'demo-seed-tenants'],
|
||||
labels=['demo-init'])
|
||||
|
||||
k8s_resource('demo-seed-ai-models',
|
||||
resource_deps=['training-migration', 'demo-seed-inventory'],
|
||||
labels=['demo-init'])
|
||||
|
||||
# =============================================================================
|
||||
# SERVICES
|
||||
# =============================================================================
|
||||
# Services depend on their databases AND migrations
|
||||
|
||||
k8s_resource('auth-service',
|
||||
resource_deps=['auth-migration', 'redis'],
|
||||
labels=['services'])
|
||||
@@ -191,8 +223,33 @@ k8s_resource('alert-processor-service',
|
||||
resource_deps=['alert-processor-migration', 'redis', 'rabbitmq'],
|
||||
labels=['services'])
|
||||
|
||||
k8s_resource('demo-session-service',
|
||||
resource_deps=['demo-session-migration', 'redis'],
|
||||
labels=['services'])
|
||||
|
||||
# Get the image reference for inventory-service to use in demo clone jobs
|
||||
inventory_image_ref = str(local('kubectl get deployment inventory-service -n bakery-ia -o jsonpath="{.spec.template.spec.containers[0].image}" 2>/dev/null || echo "bakery/inventory-service:latest"')).strip()
|
||||
|
||||
# Apply environment variable patch to demo-session-service with the inventory image
|
||||
local_resource('patch-demo-session-env',
|
||||
cmd='kubectl set env deployment/demo-session-service -n bakery-ia CLONE_JOB_IMAGE=' + inventory_image_ref,
|
||||
resource_deps=['demo-session-service'],
|
||||
labels=['config'])
|
||||
|
||||
# =============================================================================
|
||||
# CRONJOBS
|
||||
# =============================================================================
|
||||
|
||||
k8s_resource('demo-session-cleanup',
|
||||
resource_deps=['demo-session-service'],
|
||||
labels=['cronjobs'])
|
||||
|
||||
# =============================================================================
|
||||
# GATEWAY & FRONTEND
|
||||
# =============================================================================
|
||||
# Gateway and Frontend depend on services being ready
|
||||
# Access via ingress: http://localhost (frontend) and http://localhost/api (gateway)
|
||||
|
||||
k8s_resource('gateway',
|
||||
resource_deps=['auth-service'],
|
||||
labels=['frontend'])
|
||||
|
||||
@@ -45,6 +45,7 @@ class ApiClient {
|
||||
private baseURL: string;
|
||||
private authToken: string | null = null;
|
||||
private tenantId: string | null = null;
|
||||
private demoSessionId: string | null = null;
|
||||
private refreshToken: string | null = null;
|
||||
private isRefreshing: boolean = false;
|
||||
private refreshAttempts: number = 0;
|
||||
@@ -74,14 +75,31 @@ class ApiClient {
|
||||
// Request interceptor to add auth headers
|
||||
this.client.interceptors.request.use(
|
||||
(config) => {
|
||||
if (this.authToken) {
|
||||
// Public endpoints that don't require authentication
|
||||
const publicEndpoints = [
|
||||
'/demo/accounts',
|
||||
'/demo/session/create',
|
||||
];
|
||||
|
||||
const isPublicEndpoint = publicEndpoints.some(endpoint =>
|
||||
config.url?.includes(endpoint)
|
||||
);
|
||||
|
||||
// Only add auth token for non-public endpoints
|
||||
if (this.authToken && !isPublicEndpoint) {
|
||||
config.headers.Authorization = `Bearer ${this.authToken}`;
|
||||
}
|
||||
|
||||
if (this.tenantId) {
|
||||
|
||||
if (this.tenantId && !isPublicEndpoint) {
|
||||
config.headers['X-Tenant-ID'] = this.tenantId;
|
||||
}
|
||||
|
||||
// Check demo session ID from memory OR localStorage
|
||||
const demoSessionId = this.demoSessionId || localStorage.getItem('demo_session_id');
|
||||
if (demoSessionId) {
|
||||
config.headers['X-Demo-Session-Id'] = demoSessionId;
|
||||
}
|
||||
|
||||
return config;
|
||||
},
|
||||
(error) => {
|
||||
@@ -317,6 +335,19 @@ class ApiClient {
|
||||
this.tenantId = tenantId;
|
||||
}
|
||||
|
||||
setDemoSessionId(sessionId: string | null) {
|
||||
this.demoSessionId = sessionId;
|
||||
if (sessionId) {
|
||||
localStorage.setItem('demo_session_id', sessionId);
|
||||
} else {
|
||||
localStorage.removeItem('demo_session_id');
|
||||
}
|
||||
}
|
||||
|
||||
getDemoSessionId(): string | null {
|
||||
return this.demoSessionId || localStorage.getItem('demo_session_id');
|
||||
}
|
||||
|
||||
getAuthToken(): string | null {
|
||||
return this.authToken;
|
||||
}
|
||||
|
||||
@@ -111,6 +111,9 @@ export const useSubscription = () => {
|
||||
const analyticsLevel = subscriptionService.getAnalyticsLevelForPlan(planKey);
|
||||
return { hasAccess: true, level: analyticsLevel };
|
||||
}
|
||||
|
||||
// Default fallback when plan is not recognized
|
||||
return { hasAccess: false, level: 'none', reason: 'Unknown plan' };
|
||||
}, [subscriptionInfo.plan]);
|
||||
|
||||
// Check if user can access specific analytics features
|
||||
|
||||
83
frontend/src/api/services/demo.ts
Normal file
83
frontend/src/api/services/demo.ts
Normal file
@@ -0,0 +1,83 @@
|
||||
/**
|
||||
* Demo Session API Service
|
||||
* Manages demo session creation, extension, and cleanup
|
||||
*/
|
||||
|
||||
import { apiClient } from '../client';
|
||||
|
||||
export interface DemoAccount {
|
||||
account_type: string;
|
||||
email: string;
|
||||
name: string;
|
||||
password: string;
|
||||
description?: string;
|
||||
features?: string[];
|
||||
business_model?: string;
|
||||
}
|
||||
|
||||
export interface DemoSession {
|
||||
session_id: string;
|
||||
virtual_tenant_id: string;
|
||||
base_demo_tenant_id: string;
|
||||
demo_account_type: string;
|
||||
status: 'active' | 'expired' | 'destroyed';
|
||||
created_at: string;
|
||||
expires_at: string;
|
||||
remaining_extensions: number;
|
||||
}
|
||||
|
||||
export interface CreateSessionRequest {
|
||||
demo_account_type: 'individual_bakery' | 'central_baker';
|
||||
}
|
||||
|
||||
export interface ExtendSessionRequest {
|
||||
session_id: string;
|
||||
}
|
||||
|
||||
export interface DestroySessionRequest {
|
||||
session_id: string;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get available demo accounts
|
||||
*/
|
||||
export const getDemoAccounts = async (): Promise<DemoAccount[]> => {
|
||||
return await apiClient.get<DemoAccount[]>('/demo/accounts');
|
||||
};
|
||||
|
||||
/**
|
||||
* Create a new demo session
|
||||
*/
|
||||
export const createDemoSession = async (
|
||||
request: CreateSessionRequest
|
||||
): Promise<DemoSession> => {
|
||||
return await apiClient.post<DemoSession>('/demo/session/create', request);
|
||||
};
|
||||
|
||||
/**
|
||||
* Extend an existing demo session
|
||||
*/
|
||||
export const extendDemoSession = async (
|
||||
request: ExtendSessionRequest
|
||||
): Promise<DemoSession> => {
|
||||
return await apiClient.post<DemoSession>('/demo/session/extend', request);
|
||||
};
|
||||
|
||||
/**
|
||||
* Destroy a demo session
|
||||
*/
|
||||
export const destroyDemoSession = async (
|
||||
request: DestroySessionRequest
|
||||
): Promise<{ message: string }> => {
|
||||
return await apiClient.post<{ message: string }>(
|
||||
'/demo/session/destroy',
|
||||
request
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Get demo session statistics
|
||||
*/
|
||||
export const getDemoStats = async (): Promise<any> => {
|
||||
return await apiClient.get('/demo/stats');
|
||||
};
|
||||
@@ -3,9 +3,11 @@ import { clsx } from 'clsx';
|
||||
import { useAuthUser, useIsAuthenticated } from '../../../stores';
|
||||
import { useTheme } from '../../../contexts/ThemeContext';
|
||||
import { useTenantInitializer } from '../../../stores/useTenantInitializer';
|
||||
import { useHasAccess } from '../../../hooks/useAccessControl';
|
||||
import { Header } from '../Header';
|
||||
import { Sidebar } from '../Sidebar';
|
||||
import { Footer } from '../Footer';
|
||||
import { DemoBanner } from '../DemoBanner';
|
||||
|
||||
export interface AppShellProps {
|
||||
children: React.ReactNode;
|
||||
@@ -74,10 +76,10 @@ export const AppShell = forwardRef<AppShellRef, AppShellProps>(({
|
||||
loadingComponent,
|
||||
errorBoundary: ErrorBoundary,
|
||||
}, ref) => {
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const authLoading = false; // Since we're in a protected route, auth loading should be false
|
||||
const { resolvedTheme } = useTheme();
|
||||
|
||||
const hasAccess = useHasAccess(); // Check both authentication and demo mode
|
||||
|
||||
// Initialize tenant data for authenticated users
|
||||
useTenantInitializer();
|
||||
|
||||
@@ -196,12 +198,12 @@ export const AppShell = forwardRef<AppShellRef, AppShellProps>(({
|
||||
return <ErrorBoundary error={error}>{children}</ErrorBoundary>;
|
||||
}
|
||||
|
||||
const shouldShowSidebar = showSidebar && isAuthenticated && !fullScreen;
|
||||
const shouldShowSidebar = showSidebar && hasAccess && !fullScreen;
|
||||
const shouldShowHeader = showHeader && !fullScreen;
|
||||
const shouldShowFooter = showFooter && !fullScreen;
|
||||
|
||||
return (
|
||||
<div
|
||||
<div
|
||||
className={clsx(
|
||||
'min-h-screen bg-[var(--bg-primary)] flex flex-col',
|
||||
resolvedTheme,
|
||||
@@ -209,6 +211,9 @@ export const AppShell = forwardRef<AppShellRef, AppShellProps>(({
|
||||
)}
|
||||
data-testid="app-shell"
|
||||
>
|
||||
{/* Demo Banner */}
|
||||
<DemoBanner />
|
||||
|
||||
{/* Header */}
|
||||
{shouldShowHeader && (
|
||||
<Header
|
||||
@@ -250,8 +255,8 @@ export const AppShell = forwardRef<AppShellRef, AppShellProps>(({
|
||||
// Add header offset
|
||||
shouldShowHeader && 'pt-[var(--header-height)]',
|
||||
// Adjust margins based on sidebar state
|
||||
shouldShowSidebar && isAuthenticated && !isSidebarCollapsed && 'lg:ml-[var(--sidebar-width)]',
|
||||
shouldShowSidebar && isAuthenticated && isSidebarCollapsed && 'lg:ml-[var(--sidebar-collapsed-width)]',
|
||||
shouldShowSidebar && hasAccess && !isSidebarCollapsed && 'lg:ml-[var(--sidebar-width)]',
|
||||
shouldShowSidebar && hasAccess && isSidebarCollapsed && 'lg:ml-[var(--sidebar-collapsed-width)]',
|
||||
// Add padding to content
|
||||
padded && 'px-4 lg:px-6 pb-4 lg:pb-6'
|
||||
)}
|
||||
@@ -269,8 +274,8 @@ export const AppShell = forwardRef<AppShellRef, AppShellProps>(({
|
||||
showPrivacyLinks={true}
|
||||
className={clsx(
|
||||
'transition-all duration-300 ease-in-out',
|
||||
shouldShowSidebar && isAuthenticated && !isSidebarCollapsed && 'lg:ml-[var(--sidebar-width)]',
|
||||
shouldShowSidebar && isAuthenticated && isSidebarCollapsed && 'lg:ml-[var(--sidebar-collapsed-width)]'
|
||||
shouldShowSidebar && hasAccess && !isSidebarCollapsed && 'lg:ml-[var(--sidebar-width)]',
|
||||
shouldShowSidebar && hasAccess && isSidebarCollapsed && 'lg:ml-[var(--sidebar-collapsed-width)]'
|
||||
)}
|
||||
/>
|
||||
)}
|
||||
|
||||
146
frontend/src/components/layout/DemoBanner/DemoBanner.tsx
Normal file
146
frontend/src/components/layout/DemoBanner/DemoBanner.tsx
Normal file
@@ -0,0 +1,146 @@
|
||||
import React, { useState, useEffect } from 'react';
|
||||
import { extendDemoSession, destroyDemoSession } from '../../../api/services/demo';
|
||||
import { apiClient } from '../../../api/client';
|
||||
import { useNavigate } from 'react-router-dom';
|
||||
|
||||
export const DemoBanner: React.FC = () => {
|
||||
const navigate = useNavigate();
|
||||
const [isDemo, setIsDemo] = useState(false);
|
||||
const [expiresAt, setExpiresAt] = useState<string | null>(null);
|
||||
const [timeRemaining, setTimeRemaining] = useState<string>('');
|
||||
const [canExtend, setCanExtend] = useState(true);
|
||||
const [extending, setExtending] = useState(false);
|
||||
|
||||
useEffect(() => {
|
||||
const demoMode = localStorage.getItem('demo_mode') === 'true';
|
||||
const expires = localStorage.getItem('demo_expires_at');
|
||||
|
||||
setIsDemo(demoMode);
|
||||
setExpiresAt(expires);
|
||||
|
||||
if (demoMode && expires) {
|
||||
const interval = setInterval(() => {
|
||||
const now = new Date().getTime();
|
||||
const expiryTime = new Date(expires).getTime();
|
||||
const diff = expiryTime - now;
|
||||
|
||||
if (diff <= 0) {
|
||||
setTimeRemaining('Sesión expirada');
|
||||
handleExpiration();
|
||||
} else {
|
||||
const minutes = Math.floor(diff / 60000);
|
||||
const seconds = Math.floor((diff % 60000) / 1000);
|
||||
setTimeRemaining(`${minutes}:${seconds.toString().padStart(2, '0')}`);
|
||||
}
|
||||
}, 1000);
|
||||
|
||||
return () => clearInterval(interval);
|
||||
}
|
||||
}, [expiresAt]);
|
||||
|
||||
const handleExpiration = () => {
|
||||
localStorage.removeItem('demo_mode');
|
||||
localStorage.removeItem('demo_session_id');
|
||||
localStorage.removeItem('demo_account_type');
|
||||
localStorage.removeItem('demo_expires_at');
|
||||
apiClient.setDemoSessionId(null);
|
||||
navigate('/demo');
|
||||
};
|
||||
|
||||
const handleExtendSession = async () => {
|
||||
const sessionId = apiClient.getDemoSessionId();
|
||||
if (!sessionId) return;
|
||||
|
||||
setExtending(true);
|
||||
try {
|
||||
const updatedSession = await extendDemoSession({ session_id: sessionId });
|
||||
localStorage.setItem('demo_expires_at', updatedSession.expires_at);
|
||||
setExpiresAt(updatedSession.expires_at);
|
||||
|
||||
if (updatedSession.remaining_extensions === 0) {
|
||||
setCanExtend(false);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error extending session:', error);
|
||||
alert('No se pudo extender la sesión');
|
||||
} finally {
|
||||
setExtending(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleEndSession = async () => {
|
||||
const sessionId = apiClient.getDemoSessionId();
|
||||
if (!sessionId) return;
|
||||
|
||||
if (confirm('¿Estás seguro de que quieres terminar la sesión demo?')) {
|
||||
try {
|
||||
await destroyDemoSession({ session_id: sessionId });
|
||||
} catch (error) {
|
||||
console.error('Error destroying session:', error);
|
||||
} finally {
|
||||
handleExpiration();
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
if (!isDemo) return null;
|
||||
|
||||
return (
|
||||
<div className="bg-gradient-to-r from-amber-500 to-orange-500 text-white px-4 py-2 shadow-md">
|
||||
<div className="max-w-7xl mx-auto flex items-center justify-between">
|
||||
<div className="flex items-center space-x-4">
|
||||
<div className="flex items-center">
|
||||
<svg
|
||||
className="w-5 h-5 mr-2"
|
||||
fill="currentColor"
|
||||
viewBox="0 0 20 20"
|
||||
>
|
||||
<path
|
||||
fillRule="evenodd"
|
||||
d="M18 10a8 8 0 11-16 0 8 8 0 0116 0zm-7-4a1 1 0 11-2 0 1 1 0 012 0zM9 9a1 1 0 000 2v3a1 1 0 001 1h1a1 1 0 100-2v-3a1 1 0 00-1-1H9z"
|
||||
clipRule="evenodd"
|
||||
/>
|
||||
</svg>
|
||||
<span className="font-medium">Modo Demo</span>
|
||||
</div>
|
||||
|
||||
<div className="hidden sm:flex items-center text-sm">
|
||||
<svg
|
||||
className="w-4 h-4 mr-1"
|
||||
fill="currentColor"
|
||||
viewBox="0 0 20 20"
|
||||
>
|
||||
<path
|
||||
fillRule="evenodd"
|
||||
d="M10 18a8 8 0 100-16 8 8 0 000 16zm1-12a1 1 0 10-2 0v4a1 1 0 00.293.707l2.828 2.829a1 1 0 101.415-1.415L11 9.586V6z"
|
||||
clipRule="evenodd"
|
||||
/>
|
||||
</svg>
|
||||
Tiempo restante: <span className="font-mono ml-1">{timeRemaining}</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center space-x-2">
|
||||
{canExtend && (
|
||||
<button
|
||||
onClick={handleExtendSession}
|
||||
disabled={extending}
|
||||
className="px-3 py-1 bg-white/20 hover:bg-white/30 rounded-md text-sm font-medium transition-colors disabled:opacity-50"
|
||||
>
|
||||
{extending ? 'Extendiendo...' : '+30 min'}
|
||||
</button>
|
||||
)}
|
||||
|
||||
<button
|
||||
onClick={handleEndSession}
|
||||
className="px-3 py-1 bg-white/20 hover:bg-white/30 rounded-md text-sm font-medium transition-colors"
|
||||
>
|
||||
Terminar Demo
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default DemoBanner;
|
||||
1
frontend/src/components/layout/DemoBanner/index.ts
Normal file
1
frontend/src/components/layout/DemoBanner/index.ts
Normal file
@@ -0,0 +1 @@
|
||||
export { DemoBanner, default } from './DemoBanner';
|
||||
@@ -5,6 +5,7 @@ import { useTranslation } from 'react-i18next';
|
||||
import { useAuthUser, useIsAuthenticated } from '../../../stores';
|
||||
import { useTheme } from '../../../contexts/ThemeContext';
|
||||
import { useNotifications } from '../../../hooks/useNotifications';
|
||||
import { useHasAccess } from '../../../hooks/useAccessControl';
|
||||
import { Button } from '../../ui';
|
||||
import { Badge } from '../../ui';
|
||||
import { TenantSwitcher } from '../../ui/TenantSwitcher';
|
||||
@@ -87,7 +88,7 @@ export const Header = forwardRef<HeaderRef, HeaderProps>(({
|
||||
const { t } = useTranslation();
|
||||
const navigate = useNavigate();
|
||||
const user = useAuthUser();
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const hasAccess = useHasAccess(); // Check both authentication and demo mode
|
||||
const { theme, resolvedTheme, setTheme } = useTheme();
|
||||
const {
|
||||
notifications,
|
||||
@@ -183,7 +184,7 @@ export const Header = forwardRef<HeaderRef, HeaderProps>(({
|
||||
</div>
|
||||
|
||||
{/* Tenant Switcher - Desktop */}
|
||||
{isAuthenticated && (
|
||||
{hasAccess && (
|
||||
<div className="hidden md:block mx-2 lg:mx-4 flex-shrink-0">
|
||||
<TenantSwitcher
|
||||
showLabel={true}
|
||||
@@ -193,7 +194,7 @@ export const Header = forwardRef<HeaderRef, HeaderProps>(({
|
||||
)}
|
||||
|
||||
{/* Tenant Switcher - Mobile (in title area) */}
|
||||
{isAuthenticated && (
|
||||
{hasAccess && (
|
||||
<div className="md:hidden flex-1 min-w-0 ml-3">
|
||||
<TenantSwitcher
|
||||
showLabel={false}
|
||||
@@ -203,7 +204,7 @@ export const Header = forwardRef<HeaderRef, HeaderProps>(({
|
||||
)}
|
||||
|
||||
{/* Space for potential future content */ }
|
||||
{isAuthenticated && (
|
||||
{hasAccess && (
|
||||
<div className="hidden md:flex items-center flex-1 max-w-md mx-4">
|
||||
{/* Empty space to maintain layout consistency */}
|
||||
</div>
|
||||
@@ -211,12 +212,12 @@ export const Header = forwardRef<HeaderRef, HeaderProps>(({
|
||||
</div>
|
||||
|
||||
{/* Right section */}
|
||||
{isAuthenticated && (
|
||||
{hasAccess && (
|
||||
<div className="flex items-center gap-1">
|
||||
{/* Placeholder for potential future items */ }
|
||||
|
||||
{/* Language selector */}
|
||||
<CompactLanguageSelector className="w-auto min-w-[60px]" />
|
||||
<CompactLanguageSelector className="w-auto min-w-[50px]" />
|
||||
|
||||
{/* Theme toggle */}
|
||||
{showThemeToggle && (
|
||||
|
||||
@@ -154,10 +154,12 @@ export const PublicHeader = forwardRef<PublicHeaderRef, PublicHeaderProps>(({
|
||||
</nav>
|
||||
|
||||
{/* Right side actions */}
|
||||
<div className="flex items-center gap-3">
|
||||
{/* Language selector */}
|
||||
<div className="flex items-center gap-2 lg:gap-3">
|
||||
{/* Language selector - More compact */}
|
||||
{showLanguageSelector && (
|
||||
<CompactLanguageSelector className="hidden sm:flex" />
|
||||
<div className="hidden sm:flex">
|
||||
<CompactLanguageSelector className="w-[70px]" />
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Theme toggle */}
|
||||
@@ -169,22 +171,22 @@ export const PublicHeader = forwardRef<PublicHeaderRef, PublicHeaderProps>(({
|
||||
/>
|
||||
)}
|
||||
|
||||
{/* Authentication buttons */}
|
||||
{/* Authentication buttons - Enhanced */}
|
||||
{showAuthButtons && (
|
||||
<div className="flex items-center gap-2">
|
||||
<div className="flex items-center gap-2 lg:gap-3">
|
||||
<Link to="/login">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
className="hidden sm:inline-flex"
|
||||
size="md"
|
||||
className="hidden sm:inline-flex font-medium hover:bg-[var(--bg-secondary)] transition-all duration-200"
|
||||
>
|
||||
Iniciar Sesión
|
||||
</Button>
|
||||
</Link>
|
||||
<Link to="/register">
|
||||
<Button
|
||||
size="sm"
|
||||
className="bg-[var(--color-primary)] hover:bg-[var(--color-primary-dark)] text-white"
|
||||
size="md"
|
||||
className="bg-gradient-to-r from-[var(--color-primary)] to-[var(--color-primary-dark)] hover:opacity-90 text-white font-semibold shadow-lg hover:shadow-xl transition-all duration-200 px-6"
|
||||
>
|
||||
<span className="hidden sm:inline">Comenzar Gratis</span>
|
||||
<span className="sm:hidden">Registro</span>
|
||||
@@ -243,14 +245,21 @@ export const PublicHeader = forwardRef<PublicHeaderRef, PublicHeaderProps>(({
|
||||
|
||||
{/* Mobile auth buttons */}
|
||||
{showAuthButtons && (
|
||||
<div className="flex flex-col gap-2 pt-4 sm:hidden">
|
||||
<div className="flex flex-col gap-3 pt-4 sm:hidden">
|
||||
<Link to="/login">
|
||||
<Button variant="ghost" size="sm" className="w-full">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="md"
|
||||
className="w-full font-medium border border-[var(--border-primary)] hover:bg-[var(--bg-secondary)]"
|
||||
>
|
||||
Iniciar Sesión
|
||||
</Button>
|
||||
</Link>
|
||||
<Link to="/register">
|
||||
<Button size="sm" className="w-full bg-[var(--color-primary)] hover:bg-[var(--color-primary-dark)] text-white">
|
||||
<Button
|
||||
size="md"
|
||||
className="w-full bg-gradient-to-r from-[var(--color-primary)] to-[var(--color-primary-dark)] hover:opacity-90 text-white font-semibold shadow-lg"
|
||||
>
|
||||
Comenzar Gratis
|
||||
</Button>
|
||||
</Link>
|
||||
|
||||
@@ -4,6 +4,7 @@ import { useLocation, useNavigate } from 'react-router-dom';
|
||||
import { useTranslation } from 'react-i18next';
|
||||
import { useAuthUser, useIsAuthenticated, useAuthActions } from '../../../stores';
|
||||
import { useCurrentTenantAccess } from '../../../stores/tenant.store';
|
||||
import { useHasAccess } from '../../../hooks/useAccessControl';
|
||||
import { getNavigationRoutes, canAccessRoute, ROUTES } from '../../../router/routes.config';
|
||||
import { useSubscriptionAwareRoutes } from '../../../hooks/useSubscriptionAwareRoutes';
|
||||
import { Button } from '../../ui';
|
||||
@@ -138,7 +139,8 @@ export const Sidebar = forwardRef<SidebarRef, SidebarProps>(({
|
||||
const location = useLocation();
|
||||
const navigate = useNavigate();
|
||||
const user = useAuthUser();
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const isAuthenticated = useIsAuthenticated(); // Keep for logout check
|
||||
const hasAccess = useHasAccess(); // For UI visibility
|
||||
const currentTenantAccess = useCurrentTenantAccess();
|
||||
const { logout } = useAuthActions();
|
||||
|
||||
@@ -207,37 +209,42 @@ export const Sidebar = forwardRef<SidebarRef, SidebarProps>(({
|
||||
// Filter items based on user permissions - memoized to prevent infinite re-renders
|
||||
const visibleItems = useMemo(() => {
|
||||
const filterItemsByPermissions = (items: NavigationItem[]): NavigationItem[] => {
|
||||
if (!isAuthenticated || !user) return [];
|
||||
if (!hasAccess) return [];
|
||||
|
||||
return items.map(item => ({
|
||||
...item, // Create a shallow copy to avoid mutation
|
||||
children: item.children ? filterItemsByPermissions(item.children) : item.children
|
||||
})).filter(item => {
|
||||
// Combine global and tenant roles for comprehensive access control
|
||||
const globalUserRoles = user.role ? [user.role as string] : [];
|
||||
const globalUserRoles = user?.role ? [user.role as string] : [];
|
||||
const tenantRole = currentTenantAccess?.role;
|
||||
const tenantRoles = tenantRole ? [tenantRole as string] : [];
|
||||
const allUserRoles = [...globalUserRoles, ...tenantRoles];
|
||||
const tenantPermissions = currentTenantAccess?.permissions || [];
|
||||
|
||||
const hasAccess = !item.requiredPermissions && !item.requiredRoles ||
|
||||
canAccessRoute(
|
||||
{
|
||||
path: item.path,
|
||||
requiredRoles: item.requiredRoles,
|
||||
requiredPermissions: item.requiredPermissions
|
||||
} as any,
|
||||
isAuthenticated,
|
||||
allUserRoles,
|
||||
tenantPermissions
|
||||
);
|
||||
// If no specific permissions/roles required, allow access
|
||||
if (!item.requiredPermissions && !item.requiredRoles) {
|
||||
return true;
|
||||
}
|
||||
|
||||
return hasAccess;
|
||||
// Check access based on roles and permissions
|
||||
const canAccessItem = canAccessRoute(
|
||||
{
|
||||
path: item.path,
|
||||
requiredRoles: item.requiredRoles,
|
||||
requiredPermissions: item.requiredPermissions
|
||||
} as any,
|
||||
isAuthenticated,
|
||||
allUserRoles,
|
||||
tenantPermissions
|
||||
);
|
||||
|
||||
return canAccessItem;
|
||||
});
|
||||
};
|
||||
|
||||
return filterItemsByPermissions(navigationItems);
|
||||
}, [navigationItems, isAuthenticated, user, currentTenantAccess]);
|
||||
}, [navigationItems, hasAccess, isAuthenticated, user, currentTenantAccess]);
|
||||
|
||||
// Handle item click
|
||||
const handleItemClick = useCallback((item: NavigationItem) => {
|
||||
@@ -645,7 +652,7 @@ export const Sidebar = forwardRef<SidebarRef, SidebarProps>(({
|
||||
);
|
||||
};
|
||||
|
||||
if (!isAuthenticated) {
|
||||
if (!hasAccess) {
|
||||
return null;
|
||||
}
|
||||
|
||||
|
||||
74
frontend/src/hooks/useAccessControl.ts
Normal file
74
frontend/src/hooks/useAccessControl.ts
Normal file
@@ -0,0 +1,74 @@
|
||||
/**
|
||||
* Centralized access control hook
|
||||
* Checks both authentication and demo mode to determine if user has access
|
||||
*/
|
||||
|
||||
import { useIsAuthenticated } from '../stores';
|
||||
|
||||
/**
|
||||
* Check if user is in demo mode
|
||||
*/
|
||||
export const useIsDemoMode = (): boolean => {
|
||||
return localStorage.getItem('demo_mode') === 'true';
|
||||
};
|
||||
|
||||
/**
|
||||
* Get demo session ID
|
||||
*/
|
||||
export const useDemoSessionId = (): string | null => {
|
||||
return localStorage.getItem('demo_session_id');
|
||||
};
|
||||
|
||||
/**
|
||||
* Check if user has access (either authenticated OR in valid demo mode)
|
||||
*/
|
||||
export const useHasAccess = (): boolean => {
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const isDemoMode = useIsDemoMode();
|
||||
const demoSessionId = useDemoSessionId();
|
||||
|
||||
// User has access if:
|
||||
// 1. They are authenticated, OR
|
||||
// 2. They are in demo mode with a valid session ID
|
||||
return isAuthenticated || (isDemoMode && !!demoSessionId);
|
||||
};
|
||||
|
||||
/**
|
||||
* Check if current session is demo (not a real authenticated user)
|
||||
*/
|
||||
export const useIsDemo = (): boolean => {
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const isDemoMode = useIsDemoMode();
|
||||
const demoSessionId = useDemoSessionId();
|
||||
|
||||
// It's a demo session if demo mode is active but user is not authenticated
|
||||
return !isAuthenticated && isDemoMode && !!demoSessionId;
|
||||
};
|
||||
|
||||
/**
|
||||
* Get demo account type
|
||||
*/
|
||||
export const useDemoAccountType = (): string | null => {
|
||||
const isDemoMode = useIsDemoMode();
|
||||
if (!isDemoMode) return null;
|
||||
return localStorage.getItem('demo_account_type');
|
||||
};
|
||||
|
||||
/**
|
||||
* Get demo session expiration
|
||||
*/
|
||||
export const useDemoExpiresAt = (): string | null => {
|
||||
const isDemoMode = useIsDemoMode();
|
||||
if (!isDemoMode) return null;
|
||||
return localStorage.getItem('demo_expires_at');
|
||||
};
|
||||
|
||||
/**
|
||||
* Clear demo session data
|
||||
*/
|
||||
export const clearDemoSession = (): void => {
|
||||
localStorage.removeItem('demo_mode');
|
||||
localStorage.removeItem('demo_session_id');
|
||||
localStorage.removeItem('demo_account_type');
|
||||
localStorage.removeItem('demo_expires_at');
|
||||
};
|
||||
256
frontend/src/pages/public/DemoPage.tsx
Normal file
256
frontend/src/pages/public/DemoPage.tsx
Normal file
@@ -0,0 +1,256 @@
|
||||
import React, { useState, useEffect } from 'react';
|
||||
import { useNavigate, Link } from 'react-router-dom';
|
||||
import { PublicLayout } from '../../components/layout';
|
||||
import { Button } from '../../components/ui';
|
||||
import { getDemoAccounts, createDemoSession, DemoAccount } from '../../api/services/demo';
|
||||
import { apiClient } from '../../api/client';
|
||||
import { Check, Clock, Shield, Play, Zap, ArrowRight, Store, Factory } from 'lucide-react';
|
||||
|
||||
export const DemoPage: React.FC = () => {
|
||||
const navigate = useNavigate();
|
||||
const [demoAccounts, setDemoAccounts] = useState<DemoAccount[]>([]);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [creatingSession, setCreatingSession] = useState(false);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
|
||||
useEffect(() => {
|
||||
const fetchDemoAccounts = async () => {
|
||||
try {
|
||||
const accounts = await getDemoAccounts();
|
||||
setDemoAccounts(accounts);
|
||||
} catch (err) {
|
||||
setError('Error al cargar las cuentas demo');
|
||||
console.error('Error fetching demo accounts:', err);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
fetchDemoAccounts();
|
||||
}, []);
|
||||
|
||||
const handleStartDemo = async (accountType: string) => {
|
||||
setCreatingSession(true);
|
||||
setError(null);
|
||||
|
||||
try {
|
||||
const session = await createDemoSession({
|
||||
demo_account_type: accountType as 'individual_bakery' | 'central_baker',
|
||||
});
|
||||
|
||||
// Store session ID in API client
|
||||
apiClient.setDemoSessionId(session.session_id);
|
||||
|
||||
// Store session info in localStorage for UI
|
||||
localStorage.setItem('demo_mode', 'true');
|
||||
localStorage.setItem('demo_session_id', session.session_id);
|
||||
localStorage.setItem('demo_account_type', accountType);
|
||||
localStorage.setItem('demo_expires_at', session.expires_at);
|
||||
localStorage.setItem('demo_tenant_id', session.virtual_tenant_id);
|
||||
|
||||
// Navigate to dashboard
|
||||
navigate('/app/dashboard');
|
||||
} catch (err: any) {
|
||||
setError(err?.message || 'Error al crear sesión demo');
|
||||
console.error('Error creating demo session:', err);
|
||||
} finally {
|
||||
setCreatingSession(false);
|
||||
}
|
||||
};
|
||||
|
||||
const getAccountIcon = (accountType: string) => {
|
||||
return accountType === 'individual_bakery' ? Store : Factory;
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<PublicLayout
|
||||
variant="full-width"
|
||||
contentPadding="none"
|
||||
headerProps={{
|
||||
showThemeToggle: true,
|
||||
showAuthButtons: true,
|
||||
showLanguageSelector: true,
|
||||
}}
|
||||
>
|
||||
<div className="min-h-screen flex items-center justify-center bg-gradient-to-br from-[var(--bg-primary)] via-[var(--bg-secondary)] to-[var(--color-primary)]/5">
|
||||
<div className="text-center">
|
||||
<div className="animate-spin rounded-full h-12 w-12 border-b-2 border-[var(--color-primary)] mx-auto"></div>
|
||||
<p className="mt-4 text-[var(--text-secondary)]">Cargando cuentas demo...</p>
|
||||
</div>
|
||||
</div>
|
||||
</PublicLayout>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<PublicLayout
|
||||
variant="full-width"
|
||||
contentPadding="none"
|
||||
headerProps={{
|
||||
showThemeToggle: true,
|
||||
showAuthButtons: true,
|
||||
showLanguageSelector: true,
|
||||
}}
|
||||
>
|
||||
{/* Hero Section */}
|
||||
<section className="relative py-20 lg:py-32 bg-gradient-to-br from-[var(--bg-primary)] via-[var(--bg-secondary)] to-[var(--color-primary)]/5">
|
||||
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
||||
<div className="text-center mb-16">
|
||||
<div className="mb-6">
|
||||
<span className="inline-flex items-center px-3 py-1 rounded-full text-sm font-medium bg-[var(--color-primary)]/10 text-[var(--color-primary)]">
|
||||
<Play className="w-4 h-4 mr-2" />
|
||||
Demo Interactiva
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<h1 className="text-4xl tracking-tight font-extrabold text-[var(--text-primary)] sm:text-5xl lg:text-6xl">
|
||||
<span className="block">Prueba BakeryIA</span>
|
||||
<span className="block text-[var(--color-primary)]">sin compromiso</span>
|
||||
</h1>
|
||||
|
||||
<p className="mt-6 max-w-3xl mx-auto text-lg text-[var(--text-secondary)] sm:text-xl">
|
||||
Explora nuestro sistema con datos reales de panaderías españolas.
|
||||
Elige el tipo de negocio que mejor se adapte a tu caso.
|
||||
</p>
|
||||
|
||||
<div className="mt-8 flex items-center justify-center space-x-6 text-sm text-[var(--text-tertiary)]">
|
||||
<div className="flex items-center">
|
||||
<Check className="w-4 h-4 text-green-500 mr-2" />
|
||||
Sin tarjeta de crédito
|
||||
</div>
|
||||
<div className="flex items-center">
|
||||
<Clock className="w-4 h-4 text-green-500 mr-2" />
|
||||
30 minutos de acceso
|
||||
</div>
|
||||
<div className="flex items-center">
|
||||
<Shield className="w-4 h-4 text-green-500 mr-2" />
|
||||
Datos aislados y seguros
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{error && (
|
||||
<div className="mb-8 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 text-red-700 dark:text-red-400 px-4 py-3 rounded-lg max-w-2xl mx-auto">
|
||||
{error}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Demo Account Cards */}
|
||||
<div className="grid md:grid-cols-2 gap-8 max-w-6xl mx-auto">
|
||||
{demoAccounts.map((account) => {
|
||||
const Icon = getAccountIcon(account.account_type);
|
||||
|
||||
return (
|
||||
<div
|
||||
key={account.account_type}
|
||||
className="relative bg-[var(--bg-primary)] rounded-2xl shadow-xl hover:shadow-2xl transition-all duration-300 overflow-hidden border border-[var(--border-default)] group"
|
||||
>
|
||||
{/* Gradient overlay */}
|
||||
<div className="absolute inset-0 bg-gradient-to-br from-[var(--color-primary)]/5 to-transparent opacity-0 group-hover:opacity-100 transition-opacity duration-300"></div>
|
||||
|
||||
<div className="relative p-8">
|
||||
{/* Header */}
|
||||
<div className="flex items-start justify-between mb-6">
|
||||
<div className="flex items-center">
|
||||
<div className="p-3 rounded-xl bg-[var(--color-primary)]/10 text-[var(--color-primary)]">
|
||||
<Icon className="w-6 h-6" />
|
||||
</div>
|
||||
<div className="ml-4">
|
||||
<h2 className="text-2xl font-bold text-[var(--text-primary)]">
|
||||
{account.name}
|
||||
</h2>
|
||||
<p className="text-sm text-[var(--text-tertiary)] mt-1">
|
||||
{account.business_model}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
<span className="px-3 py-1 bg-[var(--color-primary)]/10 text-[var(--color-primary)] rounded-full text-xs font-semibold">
|
||||
DEMO
|
||||
</span>
|
||||
</div>
|
||||
|
||||
{/* Description */}
|
||||
<p className="text-[var(--text-secondary)] mb-6">
|
||||
{account.description}
|
||||
</p>
|
||||
|
||||
{/* Features */}
|
||||
{account.features && account.features.length > 0 && (
|
||||
<div className="mb-6 space-y-2">
|
||||
<p className="text-sm font-semibold text-[var(--text-primary)] mb-3">
|
||||
Funcionalidades incluidas:
|
||||
</p>
|
||||
{account.features.map((feature, idx) => (
|
||||
<div key={idx} className="flex items-center text-sm text-[var(--text-secondary)]">
|
||||
<Zap className="w-4 h-4 mr-2 text-[var(--color-primary)]" />
|
||||
{feature}
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Demo Benefits */}
|
||||
<div className="space-y-2 mb-8 pt-6 border-t border-[var(--border-default)]">
|
||||
<div className="flex items-center text-sm text-[var(--text-secondary)]">
|
||||
<Check className="w-4 h-4 mr-2 text-green-500" />
|
||||
Datos reales en español
|
||||
</div>
|
||||
<div className="flex items-center text-sm text-[var(--text-secondary)]">
|
||||
<Check className="w-4 h-4 mr-2 text-green-500" />
|
||||
Sesión aislada de 30 minutos
|
||||
</div>
|
||||
<div className="flex items-center text-sm text-[var(--text-secondary)]">
|
||||
<Check className="w-4 h-4 mr-2 text-green-500" />
|
||||
Sin necesidad de registro
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* CTA Button */}
|
||||
<Button
|
||||
onClick={() => handleStartDemo(account.account_type)}
|
||||
disabled={creatingSession}
|
||||
size="lg"
|
||||
className="w-full bg-[var(--color-primary)] hover:bg-[var(--color-primary-dark)] text-white shadow-lg hover:shadow-xl transform hover:scale-105 transition-all duration-200"
|
||||
>
|
||||
{creatingSession ? (
|
||||
<span className="flex items-center justify-center">
|
||||
<svg className="animate-spin -ml-1 mr-3 h-5 w-5 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
|
||||
<circle className="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" strokeWidth="4"></circle>
|
||||
<path className="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||
</svg>
|
||||
Creando sesión...
|
||||
</span>
|
||||
) : (
|
||||
<>
|
||||
<Play className="mr-2 w-5 h-5" />
|
||||
Probar Demo Ahora
|
||||
</>
|
||||
)}
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
|
||||
{/* Footer CTA */}
|
||||
<div className="mt-16 text-center">
|
||||
<p className="text-[var(--text-secondary)] mb-4">
|
||||
¿Ya tienes una cuenta?
|
||||
</p>
|
||||
<Link
|
||||
to="/login"
|
||||
className="inline-flex items-center text-[var(--color-primary)] hover:text-[var(--color-primary-dark)] font-semibold transition-colors"
|
||||
>
|
||||
Inicia sesión aquí
|
||||
<ArrowRight className="ml-2 w-4 h-4" />
|
||||
</Link>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
</PublicLayout>
|
||||
);
|
||||
};
|
||||
|
||||
export default DemoPage;
|
||||
@@ -3,12 +3,12 @@ import { Link } from 'react-router-dom';
|
||||
import { useTranslation } from 'react-i18next';
|
||||
import { Button } from '../../components/ui';
|
||||
import { PublicLayout } from '../../components/layout';
|
||||
import {
|
||||
BarChart3,
|
||||
TrendingUp,
|
||||
Shield,
|
||||
Zap,
|
||||
Users,
|
||||
import {
|
||||
BarChart3,
|
||||
TrendingUp,
|
||||
Shield,
|
||||
Zap,
|
||||
Users,
|
||||
Award,
|
||||
ChevronRight,
|
||||
Check,
|
||||
@@ -20,7 +20,8 @@ import {
|
||||
Euro,
|
||||
Package,
|
||||
PieChart,
|
||||
Settings
|
||||
Settings,
|
||||
Brain
|
||||
} from 'lucide-react';
|
||||
|
||||
const LandingPage: React.FC = () => {
|
||||
@@ -55,38 +56,57 @@ const LandingPage: React.FC = () => {
|
||||
<section className="relative py-20 lg:py-32 bg-gradient-to-br from-[var(--bg-primary)] via-[var(--bg-secondary)] to-[var(--color-primary)]/5">
|
||||
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
||||
<div className="text-center">
|
||||
<div className="mb-6">
|
||||
<div className="mb-6 flex flex-wrap items-center justify-center gap-3">
|
||||
<span className="inline-flex items-center px-3 py-1 rounded-full text-sm font-medium bg-[var(--color-primary)]/10 text-[var(--color-primary)]">
|
||||
<Zap className="w-4 h-4 mr-2" />
|
||||
{t('landing:hero.badge', 'IA Avanzada para Panaderías')}
|
||||
</span>
|
||||
<span className="inline-flex items-center px-3 py-1 rounded-full text-sm font-medium bg-green-500/10 text-green-600 dark:text-green-400">
|
||||
<Shield className="w-4 h-4 mr-2" />
|
||||
Reducción de Desperdicio Alimentario
|
||||
</span>
|
||||
</div>
|
||||
|
||||
|
||||
<h1 className="text-4xl tracking-tight font-extrabold text-[var(--text-primary)] sm:text-5xl lg:text-7xl">
|
||||
<span className="block">{t('landing:hero.title_line1', 'Revoluciona tu')}</span>
|
||||
<span className="block text-[var(--color-primary)]">{t('landing:hero.title_line2', 'Panadería con IA')}</span>
|
||||
<span className="block">{t('landing:hero.title_line1', 'IA que Reduce')}</span>
|
||||
<span className="block text-[var(--color-primary)]">{t('landing:hero.title_line2', 'Desperdicio Alimentario')}</span>
|
||||
</h1>
|
||||
|
||||
|
||||
<p className="mt-6 max-w-3xl mx-auto text-lg text-[var(--text-secondary)] sm:text-xl">
|
||||
{t('landing:hero.subtitle', 'Optimiza automáticamente tu producción, reduce desperdicios hasta un 35%, predice demanda con precisión del 92% y aumenta tus ventas con inteligencia artificial.')}
|
||||
{t('landing:hero.subtitle', 'Tecnología de inteligencia artificial que reduce hasta un 35% el desperdicio alimentario, optimiza tu producción y protege tu información. Tus datos son 100% tuyos.')}
|
||||
</p>
|
||||
|
||||
<div className="mt-10 flex flex-col sm:flex-row gap-4 justify-center">
|
||||
|
||||
{/* Pilot Launch Banner */}
|
||||
<div className="mt-8 inline-block">
|
||||
<div className="bg-gradient-to-r from-amber-500/10 to-orange-500/10 border-2 border-amber-500/30 rounded-xl px-6 py-4">
|
||||
<div className="flex items-center justify-center gap-2 text-amber-600 dark:text-amber-400 font-bold text-lg">
|
||||
<Star className="w-5 h-5 fill-current" />
|
||||
<span>¡Lanzamiento Piloto!</span>
|
||||
<Star className="w-5 h-5 fill-current" />
|
||||
</div>
|
||||
<p className="mt-2 text-sm text-[var(--text-secondary)] text-center">
|
||||
<strong className="text-[var(--color-primary)]">3 meses GRATIS</strong> para early adopters que se registren ahora
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="mt-8 flex flex-col sm:flex-row gap-4 justify-center">
|
||||
<Link to="/register">
|
||||
<Button size="lg" className="px-8 py-4 text-lg font-semibold bg-[var(--color-primary)] hover:bg-[var(--color-primary-dark)] text-white shadow-lg hover:shadow-xl transform hover:scale-105 transition-all duration-200">
|
||||
{t('landing:hero.cta_primary', 'Comenzar Gratis 14 Días')}
|
||||
{t('landing:hero.cta_primary', 'Comenzar GRATIS 3 Meses')}
|
||||
<ArrowRight className="ml-2 w-5 h-5" />
|
||||
</Button>
|
||||
</Link>
|
||||
<Button
|
||||
variant="outline"
|
||||
size="lg"
|
||||
className="px-8 py-4 text-lg font-semibold border-2 border-[var(--color-primary)] text-[var(--color-primary)] hover:bg-[var(--color-primary)] hover:text-white transition-all duration-200"
|
||||
onClick={() => scrollToSection('demo')}
|
||||
>
|
||||
<Play className="mr-2 w-5 h-5" />
|
||||
{t('landing:hero.cta_secondary', 'Ver Demo en Vivo')}
|
||||
</Button>
|
||||
<Link to="/demo">
|
||||
<Button
|
||||
variant="outline"
|
||||
size="lg"
|
||||
className="px-8 py-4 text-lg font-semibold border-2 border-[var(--color-primary)] text-[var(--color-primary)] hover:bg-[var(--color-primary)] hover:text-white transition-all duration-200"
|
||||
>
|
||||
<Play className="mr-2 w-5 h-5" />
|
||||
{t('landing:hero.cta_secondary', 'Ver Demo en Vivo')}
|
||||
</Button>
|
||||
</Link>
|
||||
</div>
|
||||
|
||||
<div className="mt-12 flex items-center justify-center space-x-6 text-sm text-[var(--text-tertiary)]">
|
||||
@@ -137,44 +157,126 @@ const LandingPage: React.FC = () => {
|
||||
</div>
|
||||
</section>
|
||||
|
||||
{/* Main Features Section */}
|
||||
{/* Main Features Section - Focus on AI & Food Waste */}
|
||||
<section id="features" className="py-24 bg-[var(--bg-secondary)]">
|
||||
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
||||
<div className="text-center">
|
||||
<div className="mb-4">
|
||||
<span className="inline-flex items-center px-4 py-2 rounded-full text-sm font-medium bg-gradient-to-r from-blue-500/10 to-purple-500/10 text-blue-600 dark:text-blue-400 border border-blue-500/20">
|
||||
<Brain className="w-4 h-4 mr-2" />
|
||||
Tecnología de IA de Última Generación
|
||||
</span>
|
||||
</div>
|
||||
<h2 className="text-3xl lg:text-5xl font-extrabold text-[var(--text-primary)]">
|
||||
Gestión Completa con
|
||||
<span className="block text-[var(--color-primary)]">Inteligencia Artificial</span>
|
||||
Combate el Desperdicio Alimentario
|
||||
<span className="block text-[var(--color-primary)]">con Inteligencia Artificial</span>
|
||||
</h2>
|
||||
<p className="mt-6 max-w-3xl mx-auto text-lg text-[var(--text-secondary)]">
|
||||
Automatiza procesos, optimiza recursos y toma decisiones inteligentes basadas en datos reales de tu panadería.
|
||||
Sistema de alta tecnología que utiliza algoritmos de IA avanzados para optimizar tu producción, reducir residuos alimentarios y mantener tus datos 100% seguros y bajo tu control.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="mt-20 grid grid-cols-1 lg:grid-cols-3 gap-8">
|
||||
{/* AI Forecasting */}
|
||||
<div className="group relative bg-[var(--bg-primary)] rounded-2xl p-8 shadow-lg hover:shadow-2xl transition-all duration-300 border border-[var(--border-primary)] hover:border-[var(--color-primary)]/30">
|
||||
{/* AI Technology */}
|
||||
<div className="group relative bg-[var(--bg-primary)] rounded-2xl p-8 shadow-lg hover:shadow-2xl transition-all duration-300 border-2 border-[var(--border-primary)] hover:border-blue-500/50">
|
||||
<div className="absolute -top-4 left-8">
|
||||
<div className="w-12 h-12 bg-gradient-to-r from-[var(--color-primary)] to-[var(--color-primary-dark)] rounded-xl flex items-center justify-center shadow-lg">
|
||||
<TrendingUp className="w-6 h-6 text-white" />
|
||||
<div className="w-12 h-12 bg-gradient-to-r from-blue-600 to-purple-600 rounded-xl flex items-center justify-center shadow-lg">
|
||||
<Brain className="w-6 h-6 text-white" />
|
||||
</div>
|
||||
</div>
|
||||
<div className="mt-6">
|
||||
<h3 className="text-xl font-bold text-[var(--text-primary)]">Predicción Inteligente</h3>
|
||||
<h3 className="text-xl font-bold text-[var(--text-primary)]">IA Avanzada de Predicción</h3>
|
||||
<p className="mt-4 text-[var(--text-secondary)]">
|
||||
Algoritmos de IA analizan patrones históricos, clima, eventos locales y tendencias para predecir la demanda exacta de cada producto.
|
||||
Algoritmos de machine learning de última generación analizan patrones históricos, clima, eventos y tendencias para predecir demanda con precisión quirúrgica.
|
||||
</p>
|
||||
<div className="mt-6">
|
||||
<div className="flex items-center text-sm text-[var(--color-primary)]">
|
||||
<Check className="w-4 h-4 mr-2" />
|
||||
Precisión del 92% en predicciones
|
||||
<div className="mt-6 space-y-3">
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-blue-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Zap className="w-3 h-3 text-blue-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Precisión del 92% en predicciones</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm text-[var(--color-primary)] mt-2">
|
||||
<Check className="w-4 h-4 mr-2" />
|
||||
Reduce desperdicios hasta 35%
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-blue-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<TrendingUp className="w-3 h-3 text-blue-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Aprendizaje continuo y adaptativo</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm text-[var(--color-primary)] mt-2">
|
||||
<Check className="w-4 h-4 mr-2" />
|
||||
Aumenta ventas promedio 22%
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-blue-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<BarChart3 className="w-3 h-3 text-blue-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Análisis predictivo en tiempo real</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Food Waste Reduction */}
|
||||
<div className="group relative bg-[var(--bg-primary)] rounded-2xl p-8 shadow-lg hover:shadow-2xl transition-all duration-300 border-2 border-[var(--border-primary)] hover:border-green-500/50">
|
||||
<div className="absolute -top-4 left-8">
|
||||
<div className="w-12 h-12 bg-gradient-to-r from-green-600 to-emerald-600 rounded-xl flex items-center justify-center shadow-lg">
|
||||
<Shield className="w-6 h-6 text-white" />
|
||||
</div>
|
||||
</div>
|
||||
<div className="mt-6">
|
||||
<h3 className="text-xl font-bold text-[var(--text-primary)]">Reducción de Desperdicio</h3>
|
||||
<p className="mt-4 text-[var(--text-secondary)]">
|
||||
Contribuye al medioambiente y reduce costos eliminando hasta un 35% del desperdicio alimentario mediante producción optimizada e inteligente.
|
||||
</p>
|
||||
<div className="mt-6 space-y-3">
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-green-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Check className="w-3 h-3 text-green-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Hasta 35% menos desperdicio</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-green-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Euro className="w-3 h-3 text-green-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Ahorro promedio de €800/mes</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-green-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Award className="w-3 h-3 text-green-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Elegible para ayudas UE</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Data Ownership & Privacy */}
|
||||
<div className="group relative bg-[var(--bg-primary)] rounded-2xl p-8 shadow-lg hover:shadow-2xl transition-all duration-300 border-2 border-[var(--border-primary)] hover:border-amber-500/50">
|
||||
<div className="absolute -top-4 left-8">
|
||||
<div className="w-12 h-12 bg-gradient-to-r from-amber-600 to-orange-600 rounded-xl flex items-center justify-center shadow-lg">
|
||||
<Shield className="w-6 h-6 text-white" />
|
||||
</div>
|
||||
</div>
|
||||
<div className="mt-6">
|
||||
<h3 className="text-xl font-bold text-[var(--text-primary)]">Tus Datos, Tu Propiedad</h3>
|
||||
<p className="mt-4 text-[var(--text-secondary)]">
|
||||
Privacidad y seguridad total. Tus datos operativos, proveedores y analíticas permanecen 100% bajo tu control. Nunca compartidos, nunca vendidos.
|
||||
</p>
|
||||
<div className="mt-6 space-y-3">
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-amber-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Shield className="w-3 h-3 text-amber-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">100% propiedad de datos</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-amber-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Settings className="w-3 h-3 text-amber-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Control total de privacidad</span>
|
||||
</div>
|
||||
<div className="flex items-center text-sm">
|
||||
<div className="flex-shrink-0 w-6 h-6 bg-amber-500/10 rounded-full flex items-center justify-center mr-3">
|
||||
<Award className="w-3 h-3 text-amber-600" />
|
||||
</div>
|
||||
<span className="text-[var(--text-secondary)]">Cumplimiento GDPR garantizado</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
@@ -874,15 +976,16 @@ const LandingPage: React.FC = () => {
|
||||
<ArrowRight className="ml-2 w-5 h-5" />
|
||||
</Button>
|
||||
</Link>
|
||||
<Button
|
||||
size="lg"
|
||||
variant="outline"
|
||||
className="px-10 py-4 text-lg font-semibold border-2 border-white text-white hover:bg-white hover:text-[var(--color-primary)] transition-all duration-200"
|
||||
onClick={() => scrollToSection('demo')}
|
||||
>
|
||||
<Play className="mr-2 w-5 h-5" />
|
||||
Ver Demo
|
||||
</Button>
|
||||
<Link to="/demo">
|
||||
<Button
|
||||
size="lg"
|
||||
variant="outline"
|
||||
className="px-10 py-4 text-lg font-semibold border-2 border-white text-white hover:bg-white hover:text-[var(--color-primary)] transition-all duration-200"
|
||||
>
|
||||
<Play className="mr-2 w-5 h-5" />
|
||||
Ver Demo
|
||||
</Button>
|
||||
</Link>
|
||||
</div>
|
||||
|
||||
<div className="mt-12 grid grid-cols-1 sm:grid-cols-3 gap-8 text-center">
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
export { default as LandingPage } from './LandingPage';
|
||||
export { default as LoginPage } from './LoginPage';
|
||||
export { default as RegisterPage } from './RegisterPage';
|
||||
export { default as RegisterPage } from './RegisterPage';
|
||||
export { default as DemoPage } from './DemoPage';
|
||||
@@ -8,6 +8,7 @@ import { AppShell } from '../components/layout';
|
||||
const LandingPage = React.lazy(() => import('../pages/public/LandingPage'));
|
||||
const LoginPage = React.lazy(() => import('../pages/public/LoginPage'));
|
||||
const RegisterPage = React.lazy(() => import('../pages/public/RegisterPage'));
|
||||
const DemoPage = React.lazy(() => import('../pages/public/DemoPage'));
|
||||
const DashboardPage = React.lazy(() => import('../pages/app/DashboardPage'));
|
||||
|
||||
// Operations pages
|
||||
@@ -58,6 +59,7 @@ export const AppRouter: React.FC = () => {
|
||||
<Route path="/" element={<LandingPage />} />
|
||||
<Route path="/login" element={<LoginPage />} />
|
||||
<Route path="/register" element={<RegisterPage />} />
|
||||
<Route path="/demo" element={<DemoPage />} />
|
||||
|
||||
{/* Protected Routes with AppShell Layout */}
|
||||
<Route
|
||||
|
||||
@@ -6,6 +6,7 @@ import React from 'react';
|
||||
import { Navigate, useLocation } from 'react-router-dom';
|
||||
import { useAuthUser, useIsAuthenticated, useAuthLoading } from '../stores';
|
||||
import { useCurrentTenantAccess, useTenantPermissions } from '../stores/tenant.store';
|
||||
import { useHasAccess, useIsDemoMode } from '../hooks/useAccessControl';
|
||||
import { RouteConfig, canAccessRoute, ROUTES } from './routes.config';
|
||||
|
||||
interface ProtectedRouteProps {
|
||||
@@ -130,6 +131,8 @@ export const ProtectedRoute: React.FC<ProtectedRouteProps> = ({
|
||||
const currentTenantAccess = useCurrentTenantAccess();
|
||||
const { hasPermission } = useTenantPermissions();
|
||||
const location = useLocation();
|
||||
const hasAccess = useHasAccess(); // Check both authentication and demo mode
|
||||
const isDemoMode = useIsDemoMode();
|
||||
|
||||
// Note: Onboarding routes are now properly protected and require authentication
|
||||
// Mock mode only applies to the onboarding flow content, not to route protection
|
||||
@@ -144,15 +147,20 @@ export const ProtectedRoute: React.FC<ProtectedRouteProps> = ({
|
||||
return <>{children}</>;
|
||||
}
|
||||
|
||||
// If user has access (authenticated OR demo mode), allow access
|
||||
if (hasAccess) {
|
||||
return <>{children}</>;
|
||||
}
|
||||
|
||||
// If not authenticated and route requires auth, redirect to login
|
||||
if (!isAuthenticated) {
|
||||
const redirectPath = redirectTo || ROUTES.LOGIN;
|
||||
const returnUrl = location.pathname + location.search;
|
||||
|
||||
|
||||
return (
|
||||
<Navigate
|
||||
to={`${redirectPath}?returnUrl=${encodeURIComponent(returnUrl)}`}
|
||||
replace
|
||||
<Navigate
|
||||
to={`${redirectPath}?returnUrl=${encodeURIComponent(returnUrl)}`}
|
||||
replace
|
||||
/>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -1,27 +1,57 @@
|
||||
import { useEffect } from 'react';
|
||||
import { useIsAuthenticated } from './auth.store';
|
||||
import { useTenantActions, useAvailableTenants } from './tenant.store';
|
||||
import { useTenantActions, useAvailableTenants, useCurrentTenant } from './tenant.store';
|
||||
import { useIsDemoMode, useDemoSessionId, useDemoAccountType } from '../hooks/useAccessControl';
|
||||
|
||||
/**
|
||||
* Hook to automatically initialize tenant data when user is authenticated
|
||||
* Hook to automatically initialize tenant data when user is authenticated or in demo mode
|
||||
* This should be used at the app level to ensure tenant data is loaded
|
||||
*/
|
||||
export const useTenantInitializer = () => {
|
||||
const isAuthenticated = useIsAuthenticated();
|
||||
const isDemoMode = useIsDemoMode();
|
||||
const demoSessionId = useDemoSessionId();
|
||||
const demoAccountType = useDemoAccountType();
|
||||
const availableTenants = useAvailableTenants();
|
||||
const { loadUserTenants } = useTenantActions();
|
||||
const currentTenant = useCurrentTenant();
|
||||
const { loadUserTenants, setCurrentTenant } = useTenantActions();
|
||||
|
||||
// Load tenants for authenticated users
|
||||
useEffect(() => {
|
||||
if (isAuthenticated && !availableTenants) {
|
||||
// Load user's available tenants when authenticated and not already loaded
|
||||
loadUserTenants();
|
||||
}
|
||||
}, [isAuthenticated, availableTenants, loadUserTenants]);
|
||||
|
||||
// Also load tenants when user becomes authenticated (e.g., after login)
|
||||
// Set up mock tenant for demo mode
|
||||
useEffect(() => {
|
||||
if (isAuthenticated && availableTenants === null) {
|
||||
loadUserTenants();
|
||||
if (isDemoMode && demoSessionId) {
|
||||
const demoTenantId = localStorage.getItem('demo_tenant_id') || 'demo-tenant-id';
|
||||
|
||||
// Check if current tenant is the demo tenant and is properly set
|
||||
const isValidDemoTenant = currentTenant &&
|
||||
typeof currentTenant === 'object' &&
|
||||
currentTenant.id === demoTenantId;
|
||||
|
||||
if (!isValidDemoTenant) {
|
||||
const accountTypeName = demoAccountType === 'individual_bakery'
|
||||
? 'Panadería San Pablo - Demo'
|
||||
: 'Panadería La Espiga - Demo';
|
||||
|
||||
// Create a mock tenant object matching TenantResponse structure
|
||||
const mockTenant = {
|
||||
id: demoTenantId,
|
||||
name: accountTypeName,
|
||||
subdomain: `demo-${demoSessionId.slice(0, 8)}`,
|
||||
plan_type: 'professional', // Use a valid plan type
|
||||
is_active: true,
|
||||
created_at: new Date().toISOString(),
|
||||
updated_at: new Date().toISOString(),
|
||||
};
|
||||
|
||||
// Set the demo tenant as current
|
||||
setCurrentTenant(mockTenant);
|
||||
}
|
||||
}
|
||||
}, [isAuthenticated, availableTenants, loadUserTenants]);
|
||||
}, [isDemoMode, demoSessionId, demoAccountType, currentTenant, setCurrentTenant]);
|
||||
};
|
||||
@@ -20,7 +20,8 @@ from app.middleware.auth import AuthMiddleware
|
||||
from app.middleware.logging import LoggingMiddleware
|
||||
from app.middleware.rate_limit import RateLimitMiddleware
|
||||
from app.middleware.subscription import SubscriptionMiddleware
|
||||
from app.routes import auth, tenant, notification, nominatim, user, subscription
|
||||
from app.middleware.demo_middleware import DemoMiddleware
|
||||
from app.routes import auth, tenant, notification, nominatim, user, subscription, demo
|
||||
from shared.monitoring.logging import setup_logging
|
||||
from shared.monitoring.metrics import MetricsCollector
|
||||
|
||||
@@ -55,11 +56,13 @@ app.add_middleware(
|
||||
allow_headers=["*"],
|
||||
)
|
||||
|
||||
# Custom middleware - Add in correct order (outer to inner)
|
||||
app.add_middleware(LoggingMiddleware)
|
||||
app.add_middleware(RateLimitMiddleware, calls_per_minute=300)
|
||||
app.add_middleware(SubscriptionMiddleware, tenant_service_url=settings.TENANT_SERVICE_URL)
|
||||
app.add_middleware(AuthMiddleware)
|
||||
# Custom middleware - Add in REVERSE order (last added = first executed)
|
||||
# Execution order: DemoMiddleware -> AuthMiddleware -> SubscriptionMiddleware -> RateLimitMiddleware -> LoggingMiddleware
|
||||
app.add_middleware(LoggingMiddleware) # Executes 5th (outermost)
|
||||
app.add_middleware(RateLimitMiddleware, calls_per_minute=300) # Executes 4th
|
||||
app.add_middleware(SubscriptionMiddleware, tenant_service_url=settings.TENANT_SERVICE_URL) # Executes 3rd
|
||||
app.add_middleware(AuthMiddleware) # Executes 2nd - Checks for demo context
|
||||
app.add_middleware(DemoMiddleware) # Executes 1st (innermost) - Sets demo user context FIRST
|
||||
|
||||
# Include routers
|
||||
app.include_router(auth.router, prefix="/api/v1/auth", tags=["authentication"])
|
||||
@@ -68,6 +71,7 @@ app.include_router(tenant.router, prefix="/api/v1/tenants", tags=["tenants"])
|
||||
app.include_router(subscription.router, prefix="/api/v1", tags=["subscriptions"])
|
||||
app.include_router(notification.router, prefix="/api/v1/notifications", tags=["notifications"])
|
||||
app.include_router(nominatim.router, prefix="/api/v1/nominatim", tags=["location"])
|
||||
app.include_router(demo.router, prefix="/api/v1", tags=["demo"])
|
||||
|
||||
|
||||
@app.on_event("startup")
|
||||
|
||||
@@ -34,7 +34,9 @@ PUBLIC_ROUTES = [
|
||||
"/api/v1/auth/refresh",
|
||||
"/api/v1/auth/verify",
|
||||
"/api/v1/nominatim/search",
|
||||
"/api/v1/plans"
|
||||
"/api/v1/plans",
|
||||
"/api/v1/demo/accounts",
|
||||
"/api/v1/demo/session/create"
|
||||
]
|
||||
|
||||
class AuthMiddleware(BaseHTTPMiddleware):
|
||||
@@ -57,10 +59,20 @@ class AuthMiddleware(BaseHTTPMiddleware):
|
||||
if self._is_public_route(request.url.path):
|
||||
return await call_next(request)
|
||||
|
||||
# ✅ Check if demo middleware already set user context
|
||||
demo_session_header = request.headers.get("X-Demo-Session-Id")
|
||||
logger.info(f"Auth check - path: {request.url.path}, demo_header: {demo_session_header}, has_demo_state: {hasattr(request.state, 'is_demo_session')}")
|
||||
|
||||
if hasattr(request.state, "is_demo_session") and request.state.is_demo_session:
|
||||
if hasattr(request.state, "user") and request.state.user:
|
||||
logger.info(f"✅ Demo session authenticated for route: {request.url.path}")
|
||||
# Demo middleware already validated and set user context, pass through
|
||||
return await call_next(request)
|
||||
|
||||
# ✅ STEP 1: Extract and validate JWT token
|
||||
token = self._extract_token(request)
|
||||
if not token:
|
||||
logger.warning(f"Missing token for protected route: {request.url.path}")
|
||||
logger.warning(f"❌ Missing token for protected route: {request.url.path}, demo_header: {demo_session_header}")
|
||||
return JSONResponse(
|
||||
status_code=401,
|
||||
content={"detail": "Authentication required"}
|
||||
|
||||
259
gateway/app/middleware/demo_middleware.py
Normal file
259
gateway/app/middleware/demo_middleware.py
Normal file
@@ -0,0 +1,259 @@
|
||||
"""
|
||||
Demo Session Middleware
|
||||
Handles demo account restrictions and virtual tenant injection
|
||||
"""
|
||||
|
||||
from fastapi import Request, HTTPException
|
||||
from fastapi.responses import JSONResponse
|
||||
from starlette.middleware.base import BaseHTTPMiddleware
|
||||
from starlette.responses import Response
|
||||
from typing import Optional
|
||||
import httpx
|
||||
import structlog
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Demo tenant IDs (base templates)
|
||||
DEMO_TENANT_IDS = {
|
||||
"a1b2c3d4-e5f6-g7h8-i9j0-k1l2m3n4o5p6", # Panadería San Pablo
|
||||
"b2c3d4e5-f6g7-h8i9-j0k1-l2m3n4o5p6q7", # Panadería La Espiga
|
||||
}
|
||||
|
||||
# Allowed operations for demo accounts (limited write)
|
||||
DEMO_ALLOWED_OPERATIONS = {
|
||||
# Read operations - all allowed
|
||||
"GET": ["*"],
|
||||
|
||||
# Limited write operations for realistic testing
|
||||
"POST": [
|
||||
"/api/pos/sales",
|
||||
"/api/pos/sessions",
|
||||
"/api/orders",
|
||||
"/api/inventory/adjustments",
|
||||
"/api/sales",
|
||||
"/api/production/batches",
|
||||
# Note: Forecast generation is explicitly blocked (see DEMO_BLOCKED_PATHS)
|
||||
],
|
||||
|
||||
"PUT": [
|
||||
"/api/pos/sales/*",
|
||||
"/api/orders/*",
|
||||
"/api/inventory/stock/*",
|
||||
],
|
||||
|
||||
# Blocked operations
|
||||
"DELETE": [], # No deletes allowed
|
||||
"PATCH": [], # No patches allowed
|
||||
}
|
||||
|
||||
# Explicitly blocked paths for demo accounts (even if method would be allowed)
|
||||
# These require trained AI models which demo tenants don't have
|
||||
DEMO_BLOCKED_PATHS = [
|
||||
"/api/forecasts/single",
|
||||
"/api/forecasts/multi-day",
|
||||
"/api/forecasts/batch",
|
||||
]
|
||||
|
||||
DEMO_BLOCKED_PATH_MESSAGE = {
|
||||
"forecasts": {
|
||||
"message": "La generación de pronósticos no está disponible para cuentas demo. "
|
||||
"Las cuentas demo no tienen modelos de IA entrenados.",
|
||||
"message_en": "Forecast generation is not available for demo accounts. "
|
||||
"Demo accounts do not have trained AI models.",
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
class DemoMiddleware(BaseHTTPMiddleware):
|
||||
"""Middleware to handle demo session logic"""
|
||||
|
||||
def __init__(self, app, demo_session_url: str = "http://demo-session-service:8000"):
|
||||
super().__init__(app)
|
||||
self.demo_session_url = demo_session_url
|
||||
|
||||
async def dispatch(self, request: Request, call_next) -> Response:
|
||||
"""Process request through demo middleware"""
|
||||
|
||||
# Skip demo middleware for demo service endpoints
|
||||
demo_service_paths = [
|
||||
"/api/v1/demo/accounts",
|
||||
"/api/v1/demo/session/create",
|
||||
"/api/v1/demo/session/extend",
|
||||
"/api/v1/demo/session/destroy",
|
||||
"/api/v1/demo/stats",
|
||||
]
|
||||
|
||||
if any(request.url.path.startswith(path) or request.url.path == path for path in demo_service_paths):
|
||||
return await call_next(request)
|
||||
|
||||
# Extract session ID from header or cookie
|
||||
session_id = (
|
||||
request.headers.get("X-Demo-Session-Id") or
|
||||
request.cookies.get("demo_session_id")
|
||||
)
|
||||
|
||||
logger.info(f"🎭 DemoMiddleware - path: {request.url.path}, session_id: {session_id}")
|
||||
|
||||
# Extract tenant ID from request
|
||||
tenant_id = request.headers.get("X-Tenant-Id")
|
||||
|
||||
# Check if this is a demo session request
|
||||
if session_id:
|
||||
try:
|
||||
# Get session info from demo service
|
||||
session_info = await self._get_session_info(session_id)
|
||||
|
||||
if session_info and session_info.get("status") == "active":
|
||||
# Inject virtual tenant ID
|
||||
request.state.tenant_id = session_info["virtual_tenant_id"]
|
||||
request.state.is_demo_session = True
|
||||
request.state.demo_account_type = session_info["demo_account_type"]
|
||||
|
||||
# Inject demo user context for auth middleware
|
||||
# This allows the request to pass through AuthMiddleware
|
||||
request.state.user = {
|
||||
"user_id": session_info.get("user_id", "demo-user"),
|
||||
"email": f"demo-{session_id}@demo.local",
|
||||
"tenant_id": session_info["virtual_tenant_id"],
|
||||
"is_demo": True,
|
||||
"demo_session_id": session_id
|
||||
}
|
||||
|
||||
# Update activity
|
||||
await self._update_session_activity(session_id)
|
||||
|
||||
# Check if path is explicitly blocked
|
||||
blocked_reason = self._check_blocked_path(request.url.path)
|
||||
if blocked_reason:
|
||||
return JSONResponse(
|
||||
status_code=403,
|
||||
content={
|
||||
"error": "demo_restriction",
|
||||
**blocked_reason,
|
||||
"upgrade_url": "/pricing",
|
||||
"session_expires_at": session_info.get("expires_at")
|
||||
}
|
||||
)
|
||||
|
||||
# Check if operation is allowed
|
||||
if not self._is_operation_allowed(request.method, request.url.path):
|
||||
return JSONResponse(
|
||||
status_code=403,
|
||||
content={
|
||||
"error": "demo_restriction",
|
||||
"message": "Esta operación no está permitida en cuentas demo. "
|
||||
"Las sesiones demo se eliminan automáticamente después de 30 minutos. "
|
||||
"Suscríbete para obtener acceso completo.",
|
||||
"message_en": "This operation is not allowed in demo accounts. "
|
||||
"Demo sessions are automatically deleted after 30 minutes. "
|
||||
"Subscribe for full access.",
|
||||
"upgrade_url": "/pricing",
|
||||
"session_expires_at": session_info.get("expires_at")
|
||||
}
|
||||
)
|
||||
else:
|
||||
# Session expired or invalid
|
||||
return JSONResponse(
|
||||
status_code=401,
|
||||
content={
|
||||
"error": "session_expired",
|
||||
"message": "Tu sesión demo ha expirado. Crea una nueva sesión para continuar.",
|
||||
"message_en": "Your demo session has expired. Create a new session to continue."
|
||||
}
|
||||
)
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Demo middleware error", error=str(e), session_id=session_id, path=request.url.path)
|
||||
# On error, return 401 instead of continuing
|
||||
return JSONResponse(
|
||||
status_code=401,
|
||||
content={
|
||||
"error": "session_error",
|
||||
"message": "Error validando sesión demo. Por favor, inténtalo de nuevo.",
|
||||
"message_en": "Error validating demo session. Please try again."
|
||||
}
|
||||
)
|
||||
|
||||
# Check if this is a demo tenant (base template)
|
||||
elif tenant_id in DEMO_TENANT_IDS:
|
||||
# Direct access to demo tenant without session - block writes
|
||||
request.state.is_demo_session = True
|
||||
request.state.tenant_id = tenant_id
|
||||
|
||||
if request.method not in ["GET", "HEAD", "OPTIONS"]:
|
||||
return JSONResponse(
|
||||
status_code=403,
|
||||
content={
|
||||
"error": "demo_restriction",
|
||||
"message": "Acceso directo al tenant demo no permitido. Crea una sesión demo.",
|
||||
"message_en": "Direct access to demo tenant not allowed. Create a demo session."
|
||||
}
|
||||
)
|
||||
|
||||
# Proceed with request
|
||||
response = await call_next(request)
|
||||
|
||||
# Add demo session header to response if demo session
|
||||
if hasattr(request.state, "is_demo_session") and request.state.is_demo_session:
|
||||
response.headers["X-Demo-Session"] = "true"
|
||||
|
||||
return response
|
||||
|
||||
async def _get_session_info(self, session_id: str) -> Optional[dict]:
|
||||
"""Get session information from demo service"""
|
||||
try:
|
||||
async with httpx.AsyncClient(timeout=5.0) as client:
|
||||
response = await client.get(
|
||||
f"{self.demo_session_url}/api/demo/session/{session_id}"
|
||||
)
|
||||
if response.status_code == 200:
|
||||
return response.json()
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error("Failed to get session info", session_id=session_id, error=str(e))
|
||||
return None
|
||||
|
||||
async def _update_session_activity(self, session_id: str):
|
||||
"""Update session activity timestamp"""
|
||||
try:
|
||||
async with httpx.AsyncClient(timeout=2.0) as client:
|
||||
await client.post(
|
||||
f"{self.demo_session_url}/api/demo/session/{session_id}/activity"
|
||||
)
|
||||
except Exception as e:
|
||||
logger.debug("Failed to update activity", session_id=session_id, error=str(e))
|
||||
|
||||
def _check_blocked_path(self, path: str) -> Optional[dict]:
|
||||
"""Check if path is explicitly blocked for demo accounts"""
|
||||
for blocked_path in DEMO_BLOCKED_PATHS:
|
||||
if blocked_path in path:
|
||||
# Determine which category of blocked path
|
||||
if "forecast" in blocked_path:
|
||||
return DEMO_BLOCKED_PATH_MESSAGE["forecasts"]
|
||||
# Can add more categories here in the future
|
||||
return {
|
||||
"message": "Esta funcionalidad no está disponible para cuentas demo.",
|
||||
"message_en": "This functionality is not available for demo accounts."
|
||||
}
|
||||
return None
|
||||
|
||||
def _is_operation_allowed(self, method: str, path: str) -> bool:
|
||||
"""Check if method + path combination is allowed for demo"""
|
||||
|
||||
allowed_paths = DEMO_ALLOWED_OPERATIONS.get(method, [])
|
||||
|
||||
# Check for wildcard
|
||||
if "*" in allowed_paths:
|
||||
return True
|
||||
|
||||
# Check for exact match or pattern match
|
||||
for allowed_path in allowed_paths:
|
||||
if allowed_path.endswith("*"):
|
||||
# Pattern match: /api/orders/* matches /api/orders/123
|
||||
if path.startswith(allowed_path[:-1]):
|
||||
return True
|
||||
elif path == allowed_path:
|
||||
# Exact match
|
||||
return True
|
||||
|
||||
return False
|
||||
61
gateway/app/routes/demo.py
Normal file
61
gateway/app/routes/demo.py
Normal file
@@ -0,0 +1,61 @@
|
||||
"""
|
||||
Demo Session Routes - Proxy to demo-session service
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Request, HTTPException
|
||||
from fastapi.responses import JSONResponse
|
||||
import httpx
|
||||
import structlog
|
||||
|
||||
from app.core.config import settings
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
@router.api_route("/demo/{path:path}", methods=["GET", "POST", "PUT", "DELETE", "PATCH"])
|
||||
async def proxy_demo_service(path: str, request: Request):
|
||||
"""
|
||||
Proxy all demo requests to the demo-session service
|
||||
These endpoints are public and don't require authentication
|
||||
"""
|
||||
# Build the target URL
|
||||
demo_service_url = settings.DEMO_SESSION_SERVICE_URL.rstrip('/')
|
||||
target_url = f"{demo_service_url}/api/demo/{path}"
|
||||
|
||||
# Get request body
|
||||
body = None
|
||||
if request.method in ["POST", "PUT", "PATCH"]:
|
||||
body = await request.body()
|
||||
|
||||
# Forward headers (excluding host)
|
||||
headers = {
|
||||
key: value
|
||||
for key, value in request.headers.items()
|
||||
if key.lower() not in ["host", "content-length"]
|
||||
}
|
||||
|
||||
try:
|
||||
async with httpx.AsyncClient(timeout=30.0) as client:
|
||||
response = await client.request(
|
||||
method=request.method,
|
||||
url=target_url,
|
||||
headers=headers,
|
||||
params=request.query_params,
|
||||
content=body
|
||||
)
|
||||
|
||||
# Return the response
|
||||
return JSONResponse(
|
||||
content=response.json() if response.content else {},
|
||||
status_code=response.status_code,
|
||||
headers=dict(response.headers)
|
||||
)
|
||||
|
||||
except httpx.RequestError as e:
|
||||
logger.error("Failed to proxy to demo-session service", error=str(e), url=target_url)
|
||||
raise HTTPException(status_code=503, detail="Demo service unavailable")
|
||||
except Exception as e:
|
||||
logger.error("Unexpected error proxying to demo-session service", error=str(e))
|
||||
raise HTTPException(status_code=500, detail="Internal server error")
|
||||
@@ -38,6 +38,11 @@ async def get_tenant_members(request: Request, tenant_id: str = Path(...)):
|
||||
"""Get tenant members"""
|
||||
return await _proxy_to_tenant_service(request, f"/api/v1/tenants/{tenant_id}/members")
|
||||
|
||||
@router.get("/{tenant_id}/my-access")
|
||||
async def get_tenant_my_access(request: Request, tenant_id: str = Path(...)):
|
||||
"""Get current user's access level for a tenant"""
|
||||
return await _proxy_to_tenant_service(request, f"/api/v1/tenants/{tenant_id}/my-access")
|
||||
|
||||
@router.get("/user/{user_id}")
|
||||
async def get_user_tenants(request: Request, user_id: str = Path(...)):
|
||||
"""Get all tenant memberships for a user (admin only)"""
|
||||
|
||||
@@ -0,0 +1,77 @@
|
||||
apiVersion: v1
|
||||
kind: Service
|
||||
metadata:
|
||||
name: demo-session-db-service
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-session-db
|
||||
component: database
|
||||
spec:
|
||||
type: ClusterIP
|
||||
ports:
|
||||
- port: 5432
|
||||
targetPort: 5432
|
||||
protocol: TCP
|
||||
name: postgres
|
||||
selector:
|
||||
app: demo-session-db
|
||||
---
|
||||
apiVersion: apps/v1
|
||||
kind: StatefulSet
|
||||
metadata:
|
||||
name: demo-session-db
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-session-db
|
||||
component: database
|
||||
spec:
|
||||
serviceName: demo-session-db-service
|
||||
replicas: 1
|
||||
selector:
|
||||
matchLabels:
|
||||
app: demo-session-db
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-session-db
|
||||
component: database
|
||||
spec:
|
||||
containers:
|
||||
- name: postgres
|
||||
image: postgres:15-alpine
|
||||
ports:
|
||||
- containerPort: 5432
|
||||
name: postgres
|
||||
env:
|
||||
- name: POSTGRES_DB
|
||||
value: "demo_session_db"
|
||||
- name: POSTGRES_USER
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: DEMO_SESSION_DB_USER
|
||||
- name: POSTGRES_PASSWORD
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: DEMO_SESSION_DB_PASSWORD
|
||||
- name: PGDATA
|
||||
value: /var/lib/postgresql/data/pgdata
|
||||
volumeMounts:
|
||||
- name: postgres-data
|
||||
mountPath: /var/lib/postgresql/data
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "250m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
volumeClaimTemplates:
|
||||
- metadata:
|
||||
name: postgres-data
|
||||
spec:
|
||||
accessModes: ["ReadWriteOnce"]
|
||||
resources:
|
||||
requests:
|
||||
storage: 2Gi
|
||||
@@ -0,0 +1,84 @@
|
||||
apiVersion: apps/v1
|
||||
kind: Deployment
|
||||
metadata:
|
||||
name: demo-session-service
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-session-service
|
||||
component: demo-session
|
||||
spec:
|
||||
replicas: 2
|
||||
selector:
|
||||
matchLabels:
|
||||
app: demo-session-service
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-session-service
|
||||
component: demo-session
|
||||
spec:
|
||||
serviceAccountName: demo-session-sa
|
||||
containers:
|
||||
- name: demo-session
|
||||
image: bakery/demo-session-service:latest
|
||||
ports:
|
||||
- containerPort: 8000
|
||||
name: http
|
||||
env:
|
||||
- name: SERVICE_NAME
|
||||
value: "demo-session"
|
||||
- name: DEMO_SESSION_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: DEMO_SESSION_DATABASE_URL
|
||||
- name: REDIS_PASSWORD
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: redis-secrets
|
||||
key: REDIS_PASSWORD
|
||||
- name: REDIS_URL
|
||||
value: "redis://:$(REDIS_PASSWORD)@redis-service:6379/0"
|
||||
- name: AUTH_SERVICE_URL
|
||||
value: "http://auth-service:8000"
|
||||
- name: TENANT_SERVICE_URL
|
||||
value: "http://tenant-service:8000"
|
||||
- name: INVENTORY_SERVICE_URL
|
||||
value: "http://inventory-service:8000"
|
||||
- name: RECIPES_SERVICE_URL
|
||||
value: "http://recipes-service:8000"
|
||||
- name: SALES_SERVICE_URL
|
||||
value: "http://sales-service:8000"
|
||||
- name: ORDERS_SERVICE_URL
|
||||
value: "http://orders-service:8000"
|
||||
- name: PRODUCTION_SERVICE_URL
|
||||
value: "http://production-service:8000"
|
||||
- name: SUPPLIERS_SERVICE_URL
|
||||
value: "http://suppliers-service:8000"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
- name: POD_NAMESPACE
|
||||
valueFrom:
|
||||
fieldRef:
|
||||
fieldPath: metadata.namespace
|
||||
- name: CLONE_JOB_IMAGE
|
||||
value: "bakery/inventory-service:latest"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "200m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
livenessProbe:
|
||||
httpGet:
|
||||
path: /health
|
||||
port: 8000
|
||||
initialDelaySeconds: 30
|
||||
periodSeconds: 30
|
||||
readinessProbe:
|
||||
httpGet:
|
||||
path: /health
|
||||
port: 8000
|
||||
initialDelaySeconds: 10
|
||||
periodSeconds: 10
|
||||
@@ -0,0 +1,35 @@
|
||||
apiVersion: v1
|
||||
kind: ServiceAccount
|
||||
metadata:
|
||||
name: demo-session-sa
|
||||
namespace: bakery-ia
|
||||
---
|
||||
apiVersion: rbac.authorization.k8s.io/v1
|
||||
kind: Role
|
||||
metadata:
|
||||
name: demo-session-job-creator
|
||||
namespace: bakery-ia
|
||||
rules:
|
||||
- apiGroups: ["batch"]
|
||||
resources: ["jobs"]
|
||||
verbs: ["create", "get", "list", "watch", "delete"]
|
||||
- apiGroups: [""]
|
||||
resources: ["pods"]
|
||||
verbs: ["get", "list", "watch"]
|
||||
- apiGroups: [""]
|
||||
resources: ["pods/log"]
|
||||
verbs: ["get"]
|
||||
---
|
||||
apiVersion: rbac.authorization.k8s.io/v1
|
||||
kind: RoleBinding
|
||||
metadata:
|
||||
name: demo-session-job-creator-binding
|
||||
namespace: bakery-ia
|
||||
roleRef:
|
||||
apiGroup: rbac.authorization.k8s.io
|
||||
kind: Role
|
||||
name: demo-session-job-creator
|
||||
subjects:
|
||||
- kind: ServiceAccount
|
||||
name: demo-session-sa
|
||||
namespace: bakery-ia
|
||||
@@ -0,0 +1,17 @@
|
||||
apiVersion: v1
|
||||
kind: Service
|
||||
metadata:
|
||||
name: demo-session-service
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-session-service
|
||||
component: demo-session
|
||||
spec:
|
||||
type: ClusterIP
|
||||
ports:
|
||||
- port: 8000
|
||||
targetPort: 8000
|
||||
protocol: TCP
|
||||
name: http
|
||||
selector:
|
||||
app: demo-session-service
|
||||
@@ -0,0 +1,56 @@
|
||||
apiVersion: batch/v1
|
||||
kind: CronJob
|
||||
metadata:
|
||||
name: demo-session-cleanup
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-cleanup
|
||||
component: maintenance
|
||||
spec:
|
||||
schedule: "0 * * * *" # Every hour
|
||||
timeZone: "Europe/Madrid"
|
||||
successfulJobsHistoryLimit: 3
|
||||
failedJobsHistoryLimit: 3
|
||||
concurrencyPolicy: Forbid
|
||||
jobTemplate:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-cleanup
|
||||
spec:
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-cleanup
|
||||
spec:
|
||||
containers:
|
||||
- name: cleanup
|
||||
image: bakery/demo-session-service:latest
|
||||
command:
|
||||
- python
|
||||
- -c
|
||||
- |
|
||||
import asyncio
|
||||
import httpx
|
||||
async def cleanup():
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.post("http://demo-session-service:8000/api/demo/cleanup/run")
|
||||
print(response.json())
|
||||
asyncio.run(cleanup())
|
||||
env:
|
||||
- name: DEMO_SESSION_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: DEMO_SESSION_DATABASE_URL
|
||||
- name: REDIS_URL
|
||||
value: "redis://redis-service:6379/0"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "128Mi"
|
||||
cpu: "50m"
|
||||
limits:
|
||||
memory: "256Mi"
|
||||
cpu: "200m"
|
||||
restartPolicy: OnFailure
|
||||
@@ -0,0 +1,55 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-clone-VIRTUAL_TENANT_ID
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-clone
|
||||
component: runtime
|
||||
spec:
|
||||
ttlSecondsAfterFinished: 3600 # Clean up after 1 hour
|
||||
backoffLimit: 2
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-clone
|
||||
spec:
|
||||
restartPolicy: Never
|
||||
containers:
|
||||
- name: clone-data
|
||||
image: bakery/inventory-service:latest # Uses inventory image which has all scripts
|
||||
command: ["python", "/app/scripts/demo/clone_demo_tenant.py"]
|
||||
env:
|
||||
- name: VIRTUAL_TENANT_ID
|
||||
value: "VIRTUAL_TENANT_ID"
|
||||
- name: DEMO_ACCOUNT_TYPE
|
||||
value: "DEMO_ACCOUNT_TYPE"
|
||||
- name: INVENTORY_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: INVENTORY_DATABASE_URL
|
||||
- name: SALES_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: SALES_DATABASE_URL
|
||||
- name: ORDERS_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: ORDERS_DATABASE_URL
|
||||
- name: TENANT_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: TENANT_DATABASE_URL
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
@@ -0,0 +1,68 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-seed-ai-models
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-seed
|
||||
component: initialization
|
||||
annotations:
|
||||
"helm.sh/hook": post-install,post-upgrade
|
||||
"helm.sh/hook-weight": "25"
|
||||
spec:
|
||||
ttlSecondsAfterFinished: 3600
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-seed-ai-models
|
||||
spec:
|
||||
initContainers:
|
||||
- name: wait-for-training-migration
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 30 seconds for training-migration to complete..."
|
||||
sleep 30
|
||||
- name: wait-for-inventory-seed
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 15 seconds for demo-seed-inventory to complete..."
|
||||
sleep 15
|
||||
containers:
|
||||
- name: seed-ai-models
|
||||
image: bakery/training-service:latest
|
||||
command: ["python", "/app/scripts/demo/seed_demo_ai_models.py"]
|
||||
env:
|
||||
- name: TRAINING_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: TRAINING_DATABASE_URL
|
||||
- name: TENANT_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: TENANT_DATABASE_URL
|
||||
- name: INVENTORY_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: INVENTORY_DATABASE_URL
|
||||
- name: DEMO_MODE
|
||||
value: "production"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
restartPolicy: OnFailure
|
||||
serviceAccountName: demo-seed-sa
|
||||
@@ -0,0 +1,58 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-seed-inventory
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-seed
|
||||
component: initialization
|
||||
annotations:
|
||||
"helm.sh/hook": post-install,post-upgrade
|
||||
"helm.sh/hook-weight": "15"
|
||||
spec:
|
||||
ttlSecondsAfterFinished: 3600
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-seed-inventory
|
||||
spec:
|
||||
initContainers:
|
||||
- name: wait-for-inventory-migration
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 30 seconds for inventory-migration to complete..."
|
||||
sleep 30
|
||||
- name: wait-for-tenant-seed
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 15 seconds for demo-seed-tenants to complete..."
|
||||
sleep 15
|
||||
containers:
|
||||
- name: seed-inventory
|
||||
image: bakery/inventory-service:latest
|
||||
command: ["python", "/app/scripts/demo/seed_demo_inventory.py"]
|
||||
env:
|
||||
- name: INVENTORY_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: INVENTORY_DATABASE_URL
|
||||
- name: DEMO_MODE
|
||||
value: "production"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
restartPolicy: OnFailure
|
||||
serviceAccountName: demo-seed-sa
|
||||
29
infrastructure/kubernetes/base/jobs/demo-seed-rbac.yaml
Normal file
29
infrastructure/kubernetes/base/jobs/demo-seed-rbac.yaml
Normal file
@@ -0,0 +1,29 @@
|
||||
apiVersion: v1
|
||||
kind: ServiceAccount
|
||||
metadata:
|
||||
name: demo-seed-sa
|
||||
namespace: bakery-ia
|
||||
---
|
||||
apiVersion: rbac.authorization.k8s.io/v1
|
||||
kind: Role
|
||||
metadata:
|
||||
name: demo-seed-role
|
||||
namespace: bakery-ia
|
||||
rules:
|
||||
- apiGroups: ["batch"]
|
||||
resources: ["jobs"]
|
||||
verbs: ["get", "list", "watch"]
|
||||
---
|
||||
apiVersion: rbac.authorization.k8s.io/v1
|
||||
kind: RoleBinding
|
||||
metadata:
|
||||
name: demo-seed-rolebinding
|
||||
namespace: bakery-ia
|
||||
roleRef:
|
||||
apiGroup: rbac.authorization.k8s.io
|
||||
kind: Role
|
||||
name: demo-seed-role
|
||||
subjects:
|
||||
- kind: ServiceAccount
|
||||
name: demo-seed-sa
|
||||
namespace: bakery-ia
|
||||
@@ -0,0 +1,60 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-seed-tenants
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-seed
|
||||
component: initialization
|
||||
annotations:
|
||||
"helm.sh/hook": post-install,post-upgrade
|
||||
"helm.sh/hook-weight": "10"
|
||||
spec:
|
||||
ttlSecondsAfterFinished: 3600
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-seed-tenants
|
||||
spec:
|
||||
initContainers:
|
||||
- name: wait-for-tenant-migration
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 30 seconds for tenant-migration to complete..."
|
||||
sleep 30
|
||||
- name: wait-for-user-seed
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 15 seconds for demo-seed-users to complete..."
|
||||
sleep 15
|
||||
containers:
|
||||
- name: seed-tenants
|
||||
image: bakery/tenant-service:latest
|
||||
command: ["python", "/app/scripts/demo/seed_demo_tenants.py"]
|
||||
env:
|
||||
- name: TENANT_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: TENANT_DATABASE_URL
|
||||
- name: AUTH_SERVICE_URL
|
||||
value: "http://auth-service:8000"
|
||||
- name: DEMO_MODE
|
||||
value: "production"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
restartPolicy: OnFailure
|
||||
serviceAccountName: demo-seed-sa
|
||||
50
infrastructure/kubernetes/base/jobs/demo-seed-users-job.yaml
Normal file
50
infrastructure/kubernetes/base/jobs/demo-seed-users-job.yaml
Normal file
@@ -0,0 +1,50 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-seed-users
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app: demo-seed
|
||||
component: initialization
|
||||
annotations:
|
||||
"helm.sh/hook": post-install,post-upgrade
|
||||
"helm.sh/hook-weight": "5"
|
||||
spec:
|
||||
ttlSecondsAfterFinished: 3600
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app: demo-seed-users
|
||||
spec:
|
||||
initContainers:
|
||||
- name: wait-for-auth-migration
|
||||
image: busybox:1.36
|
||||
command:
|
||||
- sh
|
||||
- -c
|
||||
- |
|
||||
echo "Waiting 30 seconds for auth-migration to complete..."
|
||||
sleep 30
|
||||
containers:
|
||||
- name: seed-users
|
||||
image: bakery/auth-service:latest
|
||||
command: ["python", "/app/scripts/demo/seed_demo_users.py"]
|
||||
env:
|
||||
- name: AUTH_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: AUTH_DATABASE_URL
|
||||
- name: DEMO_MODE
|
||||
value: "production"
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
restartPolicy: OnFailure
|
||||
serviceAccountName: demo-seed-sa
|
||||
@@ -30,6 +30,17 @@ resources:
|
||||
- migrations/orders-migration-job.yaml
|
||||
- migrations/production-migration-job.yaml
|
||||
- migrations/alert-processor-migration-job.yaml
|
||||
- migrations/demo-session-migration-job.yaml
|
||||
|
||||
# Demo initialization jobs
|
||||
- jobs/demo-seed-rbac.yaml
|
||||
- jobs/demo-seed-users-job.yaml
|
||||
- jobs/demo-seed-tenants-job.yaml
|
||||
- jobs/demo-seed-inventory-job.yaml
|
||||
- jobs/demo-seed-ai-models-job.yaml
|
||||
|
||||
# Demo cleanup cronjob
|
||||
- cronjobs/demo-cleanup-cronjob.yaml
|
||||
|
||||
# Infrastructure components
|
||||
- components/databases/redis.yaml
|
||||
@@ -52,6 +63,12 @@ resources:
|
||||
- components/databases/production-db.yaml
|
||||
- components/databases/alert-processor-db.yaml
|
||||
|
||||
# Demo session components
|
||||
- components/demo-session/database.yaml
|
||||
- components/demo-session/rbac.yaml
|
||||
- components/demo-session/service.yaml
|
||||
- components/demo-session/deployment.yaml
|
||||
|
||||
# Microservices
|
||||
- components/auth/auth-service.yaml
|
||||
- components/tenant/tenant-service.yaml
|
||||
@@ -106,6 +123,8 @@ images:
|
||||
newTag: latest
|
||||
- name: bakery/alert-processor
|
||||
newTag: latest
|
||||
- name: bakery/demo-session-service
|
||||
newTag: latest
|
||||
- name: bakery/gateway
|
||||
newTag: latest
|
||||
- name: bakery/dashboard
|
||||
|
||||
@@ -0,0 +1,49 @@
|
||||
apiVersion: batch/v1
|
||||
kind: Job
|
||||
metadata:
|
||||
name: demo-session-migration
|
||||
namespace: bakery-ia
|
||||
labels:
|
||||
app.kubernetes.io/name: demo-session-migration
|
||||
app.kubernetes.io/component: migration
|
||||
app.kubernetes.io/part-of: bakery-ia
|
||||
spec:
|
||||
backoffLimit: 3
|
||||
template:
|
||||
metadata:
|
||||
labels:
|
||||
app.kubernetes.io/name: demo-session-migration
|
||||
app.kubernetes.io/component: migration
|
||||
spec:
|
||||
initContainers:
|
||||
- name: wait-for-db
|
||||
image: postgres:15-alpine
|
||||
command: ["sh", "-c", "until pg_isready -h demo-session-db-service -p 5432; do sleep 2; done"]
|
||||
resources:
|
||||
requests:
|
||||
memory: "64Mi"
|
||||
cpu: "50m"
|
||||
limits:
|
||||
memory: "128Mi"
|
||||
cpu: "100m"
|
||||
containers:
|
||||
- name: migrate
|
||||
image: bakery/demo-session-service:latest
|
||||
imagePullPolicy: Never
|
||||
command: ["python", "/app/scripts/run_migrations.py", "demo_session"]
|
||||
env:
|
||||
- name: DEMO_SESSION_DATABASE_URL
|
||||
valueFrom:
|
||||
secretKeyRef:
|
||||
name: database-secrets
|
||||
key: DEMO_SESSION_DATABASE_URL
|
||||
- name: LOG_LEVEL
|
||||
value: "INFO"
|
||||
resources:
|
||||
requests:
|
||||
memory: "256Mi"
|
||||
cpu: "100m"
|
||||
limits:
|
||||
memory: "512Mi"
|
||||
cpu: "500m"
|
||||
restartPolicy: OnFailure
|
||||
@@ -23,6 +23,7 @@ data:
|
||||
ORDERS_DB_USER: b3JkZXJzX3VzZXI= # orders_user
|
||||
PRODUCTION_DB_USER: cHJvZHVjdGlvbl91c2Vy # production_user
|
||||
ALERT_PROCESSOR_DB_USER: YWxlcnRfcHJvY2Vzc29yX3VzZXI= # alert_processor_user
|
||||
DEMO_SESSION_DB_USER: ZGVtb19zZXNzaW9uX3VzZXI= # demo_session_user
|
||||
|
||||
# Database Passwords (base64 encoded from .env)
|
||||
AUTH_DB_PASSWORD: YXV0aF9wYXNzMTIz # auth_pass123
|
||||
@@ -39,6 +40,7 @@ data:
|
||||
ORDERS_DB_PASSWORD: b3JkZXJzX3Bhc3MxMjM= # orders_pass123
|
||||
PRODUCTION_DB_PASSWORD: cHJvZHVjdGlvbl9wYXNzMTIz # production_pass123
|
||||
ALERT_PROCESSOR_DB_PASSWORD: YWxlcnRfcHJvY2Vzc29yX3Bhc3MxMjM= # alert_processor_pass123
|
||||
DEMO_SESSION_DB_PASSWORD: ZGVtb19zZXNzaW9uX3Bhc3MxMjM= # demo_session_pass123
|
||||
|
||||
# Database URLs (base64 encoded)
|
||||
AUTH_DATABASE_URL: cG9zdGdyZXNxbCthc3luY3BnOi8vYXV0aF91c2VyOmF1dGhfcGFzczEyM0BhdXRoLWRiLXNlcnZpY2U6NTQzMi9hdXRoX2Ri # postgresql+asyncpg://auth_user:auth_pass123@auth-db-service:5432/auth_db
|
||||
@@ -55,6 +57,7 @@ data:
|
||||
ORDERS_DATABASE_URL: cG9zdGdyZXNxbCthc3luY3BnOi8vb3JkZXJzX3VzZXI6b3JkZXJzX3Bhc3MxMjNAb3JkZXJzLWRiLXNlcnZpY2U6NTQzMi9vcmRlcnNfZGI= # postgresql+asyncpg://orders_user:orders_pass123@orders-db-service:5432/orders_db
|
||||
PRODUCTION_DATABASE_URL: cG9zdGdyZXNxbCthc3luY3BnOi8vcHJvZHVjdGlvbl91c2VyOnByb2R1Y3Rpb25fcGFzczEyM0Bwcm9kdWN0aW9uLWRiLXNlcnZpY2U6NTQzMi9wcm9kdWN0aW9uX2Ri # postgresql+asyncpg://production_user:production_pass123@production-db-service:5432/production_db
|
||||
ALERT_PROCESSOR_DATABASE_URL: cG9zdGdyZXNxbCthc3luY3BnOi8vYWxlcnRfcHJvY2Vzc29yX3VzZXI6YWxlcnRfcHJvY2Vzc29yX3Bhc3MxMjNAYWxlcnQtcHJvY2Vzc29yLWRiLXNlcnZpY2U6NTQzMi9hbGVydF9wcm9jZXNzb3JfZGI= # postgresql+asyncpg://alert_processor_user:alert_processor_pass123@alert-processor-db-service:5432/alert_processor_db
|
||||
DEMO_SESSION_DATABASE_URL: cG9zdGdyZXNxbCthc3luY3BnOi8vZGVtb19zZXNzaW9uX3VzZXI6ZGVtb19zZXNzaW9uX3Bhc3MxMjNAZGVtby1zZXNzaW9uLWRiLXNlcnZpY2U6NTQzMi9kZW1vX3Nlc3Npb25fZGI= # postgresql+asyncpg://demo_session_user:demo_session_pass123@demo-session-db-service:5432/demo_session_db
|
||||
|
||||
---
|
||||
apiVersion: v1
|
||||
|
||||
@@ -562,6 +562,8 @@ images:
|
||||
newTag: dev
|
||||
- name: bakery/alert-processor
|
||||
newTag: dev
|
||||
- name: bakery/demo-session-service
|
||||
newTag: dev
|
||||
- name: bakery/gateway
|
||||
newTag: dev
|
||||
- name: bakery/dashboard
|
||||
@@ -596,6 +598,8 @@ replicas:
|
||||
count: 1
|
||||
- name: alert-processor-service
|
||||
count: 1
|
||||
- name: demo-session-service
|
||||
count: 1
|
||||
- name: gateway
|
||||
count: 1
|
||||
- name: frontend
|
||||
|
||||
1
scripts/demo/__init__.py
Normal file
1
scripts/demo/__init__.py
Normal file
@@ -0,0 +1 @@
|
||||
"""Demo Data Seeding Scripts"""
|
||||
234
scripts/demo/clone_demo_tenant.py
Normal file
234
scripts/demo/clone_demo_tenant.py
Normal file
@@ -0,0 +1,234 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Clone Demo Tenant Data - Database Level
|
||||
Clones all data from base template tenant to a virtual demo tenant across all databases
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
import os
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy import select
|
||||
import uuid
|
||||
import structlog
|
||||
|
||||
# Add app to path for imports
|
||||
sys.path.insert(0, '/app')
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Base template tenant IDs
|
||||
DEMO_TENANT_SAN_PABLO = "a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6"
|
||||
DEMO_TENANT_LA_ESPIGA = "b2c3d4e5-f6a7-48b9-c0d1-e2f3a4b5c6d7"
|
||||
|
||||
|
||||
async def clone_inventory_data(base_tenant_id: str, virtual_tenant_id: str):
|
||||
"""Clone inventory database tables using ORM"""
|
||||
database_url = os.getenv("INVENTORY_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.warning("INVENTORY_DATABASE_URL not set, skipping inventory data")
|
||||
return 0
|
||||
|
||||
engine = create_async_engine(database_url, echo=False)
|
||||
session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
|
||||
|
||||
total_cloned = 0
|
||||
|
||||
try:
|
||||
from app.models.inventory import Ingredient
|
||||
|
||||
async with session_factory() as session:
|
||||
# Clone ingredients
|
||||
result = await session.execute(
|
||||
select(Ingredient).where(Ingredient.tenant_id == uuid.UUID(base_tenant_id))
|
||||
)
|
||||
base_ingredients = result.scalars().all()
|
||||
|
||||
logger.info(f"Found {len(base_ingredients)} ingredients to clone")
|
||||
|
||||
for ing in base_ingredients:
|
||||
new_ing = Ingredient(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=uuid.UUID(virtual_tenant_id),
|
||||
name=ing.name,
|
||||
sku=ing.sku,
|
||||
barcode=ing.barcode,
|
||||
product_type=ing.product_type,
|
||||
ingredient_category=ing.ingredient_category,
|
||||
product_category=ing.product_category,
|
||||
subcategory=ing.subcategory,
|
||||
description=ing.description,
|
||||
brand=ing.brand,
|
||||
unit_of_measure=ing.unit_of_measure,
|
||||
package_size=ing.package_size,
|
||||
average_cost=ing.average_cost,
|
||||
last_purchase_price=ing.last_purchase_price,
|
||||
standard_cost=ing.standard_cost,
|
||||
low_stock_threshold=ing.low_stock_threshold,
|
||||
reorder_point=ing.reorder_point,
|
||||
reorder_quantity=ing.reorder_quantity,
|
||||
max_stock_level=ing.max_stock_level,
|
||||
shelf_life_days=ing.shelf_life_days,
|
||||
is_perishable=ing.is_perishable,
|
||||
is_active=ing.is_active,
|
||||
allergen_info=ing.allergen_info
|
||||
)
|
||||
session.add(new_ing)
|
||||
total_cloned += 1
|
||||
|
||||
await session.commit()
|
||||
logger.info(f"Cloned {total_cloned} ingredients")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to clone inventory data: {str(e)}", exc_info=True)
|
||||
raise
|
||||
finally:
|
||||
await engine.dispose()
|
||||
|
||||
return total_cloned
|
||||
|
||||
|
||||
async def clone_sales_data(base_tenant_id: str, virtual_tenant_id: str):
|
||||
"""Clone sales database tables"""
|
||||
database_url = os.getenv("SALES_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.warning("SALES_DATABASE_URL not set, skipping sales data")
|
||||
return 0
|
||||
|
||||
# Sales cloning not implemented yet
|
||||
logger.info("Sales data cloning not yet implemented")
|
||||
return 0
|
||||
|
||||
|
||||
async def clone_orders_data(base_tenant_id: str, virtual_tenant_id: str):
|
||||
"""Clone orders database tables"""
|
||||
database_url = os.getenv("ORDERS_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.warning("ORDERS_DATABASE_URL not set, skipping orders data")
|
||||
return 0
|
||||
|
||||
# Orders cloning not implemented yet
|
||||
logger.info("Orders data cloning not yet implemented")
|
||||
return 0
|
||||
|
||||
|
||||
async def create_virtual_tenant(virtual_tenant_id: str, demo_account_type: str):
|
||||
"""Create the virtual tenant record in tenant database"""
|
||||
database_url = os.getenv("TENANT_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.warning("TENANT_DATABASE_URL not set, skipping tenant creation")
|
||||
return
|
||||
|
||||
engine = create_async_engine(database_url, echo=False)
|
||||
session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
|
||||
|
||||
try:
|
||||
# Import after adding to path
|
||||
from services.tenant.app.models.tenants import Tenant
|
||||
|
||||
async with session_factory() as session:
|
||||
# Check if tenant already exists
|
||||
result = await session.execute(
|
||||
select(Tenant).where(Tenant.id == uuid.UUID(virtual_tenant_id))
|
||||
)
|
||||
existing = result.scalars().first()
|
||||
|
||||
if existing:
|
||||
logger.info(f"Virtual tenant {virtual_tenant_id} already exists")
|
||||
return
|
||||
|
||||
# Create virtual tenant
|
||||
tenant = Tenant(
|
||||
id=uuid.UUID(virtual_tenant_id),
|
||||
name=f"Demo Session Tenant",
|
||||
is_demo=True,
|
||||
is_demo_template=False,
|
||||
business_model=demo_account_type
|
||||
)
|
||||
session.add(tenant)
|
||||
await session.commit()
|
||||
logger.info(f"Created virtual tenant {virtual_tenant_id}")
|
||||
|
||||
except ImportError:
|
||||
# Tenant model not available, skip
|
||||
logger.warning("Could not import Tenant model, skipping virtual tenant creation")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create virtual tenant: {str(e)}", exc_info=True)
|
||||
finally:
|
||||
await engine.dispose()
|
||||
|
||||
|
||||
async def clone_demo_tenant(virtual_tenant_id: str, demo_account_type: str = "individual_bakery"):
|
||||
"""
|
||||
Main function to clone all demo data for a virtual tenant
|
||||
|
||||
Args:
|
||||
virtual_tenant_id: The UUID of the virtual tenant to create
|
||||
demo_account_type: Type of demo account (individual_bakery or central_baker)
|
||||
"""
|
||||
base_tenant_id = DEMO_TENANT_SAN_PABLO if demo_account_type == "individual_bakery" else DEMO_TENANT_LA_ESPIGA
|
||||
|
||||
logger.info(
|
||||
"Starting demo tenant cloning",
|
||||
virtual_tenant=virtual_tenant_id,
|
||||
base_tenant=base_tenant_id,
|
||||
demo_type=demo_account_type
|
||||
)
|
||||
|
||||
try:
|
||||
# Create virtual tenant record
|
||||
await create_virtual_tenant(virtual_tenant_id, demo_account_type)
|
||||
|
||||
# Clone data from each database
|
||||
stats = {
|
||||
"inventory": await clone_inventory_data(base_tenant_id, virtual_tenant_id),
|
||||
"sales": await clone_sales_data(base_tenant_id, virtual_tenant_id),
|
||||
"orders": await clone_orders_data(base_tenant_id, virtual_tenant_id),
|
||||
}
|
||||
|
||||
total_records = sum(stats.values())
|
||||
logger.info(
|
||||
"Demo tenant cloning completed successfully",
|
||||
virtual_tenant=virtual_tenant_id,
|
||||
total_records=total_records,
|
||||
stats=stats
|
||||
)
|
||||
|
||||
# Print summary for job logs
|
||||
print(f"✅ Cloning completed: {total_records} total records")
|
||||
print(f" - Inventory: {stats['inventory']} records")
|
||||
print(f" - Sales: {stats['sales']} records")
|
||||
print(f" - Orders: {stats['orders']} records")
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(
|
||||
"Demo tenant cloning failed",
|
||||
virtual_tenant=virtual_tenant_id,
|
||||
error=str(e),
|
||||
exc_info=True
|
||||
)
|
||||
print(f"❌ Cloning failed: {str(e)}")
|
||||
return False
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
# Get virtual tenant ID from environment or CLI argument
|
||||
virtual_tenant_id = os.getenv("VIRTUAL_TENANT_ID") or (sys.argv[1] if len(sys.argv) > 1 else None)
|
||||
demo_type = os.getenv("DEMO_ACCOUNT_TYPE", "individual_bakery")
|
||||
|
||||
if not virtual_tenant_id:
|
||||
print("Usage: python clone_demo_tenant.py <virtual_tenant_id>")
|
||||
print(" or: VIRTUAL_TENANT_ID=<uuid> python clone_demo_tenant.py")
|
||||
sys.exit(1)
|
||||
|
||||
# Validate UUID
|
||||
try:
|
||||
uuid.UUID(virtual_tenant_id)
|
||||
except ValueError:
|
||||
print(f"Error: Invalid UUID format: {virtual_tenant_id}")
|
||||
sys.exit(1)
|
||||
|
||||
result = asyncio.run(clone_demo_tenant(virtual_tenant_id, demo_type))
|
||||
sys.exit(0 if result else 1)
|
||||
278
scripts/demo/seed_demo_ai_models.py
Normal file
278
scripts/demo/seed_demo_ai_models.py
Normal file
@@ -0,0 +1,278 @@
|
||||
"""
|
||||
Demo AI Models Seed Script
|
||||
Creates fake AI models for demo tenants to populate the models list
|
||||
without having actual trained model files.
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
import os
|
||||
from uuid import UUID
|
||||
from datetime import datetime, timezone, timedelta
|
||||
from decimal import Decimal
|
||||
|
||||
# Add project root to path
|
||||
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../..")))
|
||||
|
||||
from sqlalchemy import select
|
||||
from shared.database.base import create_database_manager
|
||||
import structlog
|
||||
|
||||
# Import models - these paths work both locally and in container
|
||||
try:
|
||||
# Container environment (training-service image)
|
||||
from app.models.training import TrainedModel
|
||||
except ImportError:
|
||||
# Local environment
|
||||
from services.training.app.models.training import TrainedModel
|
||||
|
||||
# Tenant model - define minimal version for container environment
|
||||
try:
|
||||
from services.tenant.app.models.tenants import Tenant
|
||||
except ImportError:
|
||||
# If running in training-service container, define minimal Tenant model
|
||||
from sqlalchemy import Column, String, Boolean
|
||||
from sqlalchemy.dialects.postgresql import UUID as PGUUID
|
||||
from sqlalchemy.ext.declarative import declarative_base
|
||||
|
||||
Base = declarative_base()
|
||||
|
||||
class Tenant(Base):
|
||||
__tablename__ = "tenants"
|
||||
id = Column(PGUUID(as_uuid=True), primary_key=True)
|
||||
name = Column(String)
|
||||
is_demo = Column(Boolean)
|
||||
is_demo_template = Column(Boolean)
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class DemoAIModelSeeder:
|
||||
"""Seed fake AI models for demo tenants"""
|
||||
|
||||
def __init__(self):
|
||||
self.training_db_url = os.getenv("TRAINING_DATABASE_URL")
|
||||
self.tenant_db_url = os.getenv("TENANT_DATABASE_URL")
|
||||
|
||||
if not self.training_db_url or not self.tenant_db_url:
|
||||
raise ValueError("Missing required database URLs")
|
||||
|
||||
self.training_db = create_database_manager(self.training_db_url, "demo-ai-seed")
|
||||
self.tenant_db = create_database_manager(self.tenant_db_url, "demo-tenant-seed")
|
||||
|
||||
async def get_demo_tenants(self):
|
||||
"""Get all demo tenants"""
|
||||
async with self.tenant_db.get_session() as session:
|
||||
result = await session.execute(
|
||||
select(Tenant).where(Tenant.is_demo == True, Tenant.is_demo_template == True)
|
||||
)
|
||||
return result.scalars().all()
|
||||
|
||||
async def get_tenant_products(self, tenant_id: UUID):
|
||||
"""
|
||||
Get finished products for a tenant from inventory database.
|
||||
We need to query the actual inventory to get real product UUIDs.
|
||||
"""
|
||||
try:
|
||||
inventory_db_url = os.getenv("INVENTORY_DATABASE_URL")
|
||||
if not inventory_db_url:
|
||||
logger.warning("INVENTORY_DATABASE_URL not set, cannot get products")
|
||||
return []
|
||||
|
||||
inventory_db = create_database_manager(inventory_db_url, "demo-inventory-check")
|
||||
|
||||
# Define minimal Ingredient model for querying
|
||||
from sqlalchemy import Column, String, Enum as SQLEnum
|
||||
from sqlalchemy.dialects.postgresql import UUID as PGUUID
|
||||
from sqlalchemy.ext.declarative import declarative_base
|
||||
import enum
|
||||
|
||||
Base = declarative_base()
|
||||
|
||||
class IngredientType(str, enum.Enum):
|
||||
INGREDIENT = "INGREDIENT"
|
||||
FINISHED_PRODUCT = "FINISHED_PRODUCT"
|
||||
|
||||
class Ingredient(Base):
|
||||
__tablename__ = "ingredients"
|
||||
id = Column(PGUUID(as_uuid=True), primary_key=True)
|
||||
tenant_id = Column(PGUUID(as_uuid=True))
|
||||
name = Column(String)
|
||||
ingredient_type = Column(SQLEnum(IngredientType, name="ingredienttype"))
|
||||
|
||||
async with inventory_db.get_session() as session:
|
||||
result = await session.execute(
|
||||
select(Ingredient).where(
|
||||
Ingredient.tenant_id == tenant_id,
|
||||
Ingredient.ingredient_type == IngredientType.FINISHED_PRODUCT
|
||||
).limit(10) # Get up to 10 finished products
|
||||
)
|
||||
products = result.scalars().all()
|
||||
|
||||
product_list = [
|
||||
{"id": product.id, "name": product.name}
|
||||
for product in products
|
||||
]
|
||||
|
||||
logger.info(f"Found {len(product_list)} finished products for tenant",
|
||||
tenant_id=str(tenant_id))
|
||||
|
||||
return product_list
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Error fetching tenant products", error=str(e), tenant_id=str(tenant_id))
|
||||
return []
|
||||
|
||||
async def create_fake_model(self, session, tenant_id: UUID, product_info: dict):
|
||||
"""Create a fake AI model entry for a product"""
|
||||
now = datetime.now(timezone.utc)
|
||||
training_start = now - timedelta(days=90)
|
||||
training_end = now - timedelta(days=7)
|
||||
|
||||
fake_model = TrainedModel(
|
||||
tenant_id=tenant_id,
|
||||
inventory_product_id=product_info["id"],
|
||||
model_type="prophet_optimized",
|
||||
model_version="1.0-demo",
|
||||
job_id=f"demo-job-{tenant_id}-{product_info['id']}",
|
||||
|
||||
# Fake file paths (files don't actually exist)
|
||||
model_path=f"/fake/models/{tenant_id}/{product_info['id']}/model.pkl",
|
||||
metadata_path=f"/fake/models/{tenant_id}/{product_info['id']}/metadata.json",
|
||||
|
||||
# Fake but realistic metrics
|
||||
mape=Decimal("12.5"), # Mean Absolute Percentage Error
|
||||
mae=Decimal("2.3"), # Mean Absolute Error
|
||||
rmse=Decimal("3.1"), # Root Mean Squared Error
|
||||
r2_score=Decimal("0.85"), # R-squared
|
||||
training_samples=60, # 60 days of training data
|
||||
|
||||
# Fake hyperparameters
|
||||
hyperparameters={
|
||||
"changepoint_prior_scale": 0.05,
|
||||
"seasonality_prior_scale": 10.0,
|
||||
"holidays_prior_scale": 10.0,
|
||||
"seasonality_mode": "multiplicative"
|
||||
},
|
||||
|
||||
# Features used
|
||||
features_used=["weekday", "month", "is_holiday", "temperature", "precipitation"],
|
||||
|
||||
# Normalization params (fake)
|
||||
normalization_params={
|
||||
"temperature": {"mean": 15.0, "std": 5.0},
|
||||
"precipitation": {"mean": 2.0, "std": 1.5}
|
||||
},
|
||||
|
||||
# Model status
|
||||
is_active=True,
|
||||
is_production=False, # Demo models are not production-ready
|
||||
|
||||
# Training data info
|
||||
training_start_date=training_start,
|
||||
training_end_date=training_end,
|
||||
data_quality_score=Decimal("0.75"), # Good but not excellent
|
||||
|
||||
# Metadata
|
||||
notes="Demo model - No actual trained file exists. For demonstration purposes only.",
|
||||
created_by="demo-seed-script",
|
||||
created_at=now,
|
||||
updated_at=now,
|
||||
last_used_at=None
|
||||
)
|
||||
|
||||
session.add(fake_model)
|
||||
return fake_model
|
||||
|
||||
async def seed_models_for_tenant(self, tenant: Tenant):
|
||||
"""Create fake AI models for a demo tenant"""
|
||||
logger.info("Creating fake AI models for demo tenant",
|
||||
tenant_id=str(tenant.id),
|
||||
tenant_name=tenant.name)
|
||||
|
||||
try:
|
||||
# Get products for this tenant
|
||||
products = await self.get_tenant_products(tenant.id)
|
||||
|
||||
async with self.training_db.get_session() as session:
|
||||
models_created = 0
|
||||
|
||||
for product in products:
|
||||
# Check if model already exists
|
||||
result = await session.execute(
|
||||
select(TrainedModel).where(
|
||||
TrainedModel.tenant_id == tenant.id,
|
||||
TrainedModel.inventory_product_id == product["id"]
|
||||
)
|
||||
)
|
||||
existing_model = result.scalars().first()
|
||||
|
||||
if existing_model:
|
||||
logger.info("Model already exists, skipping",
|
||||
tenant_id=str(tenant.id),
|
||||
product_id=product["id"])
|
||||
continue
|
||||
|
||||
# Create fake model
|
||||
model = await self.create_fake_model(session, tenant.id, product)
|
||||
models_created += 1
|
||||
|
||||
logger.info("Created fake AI model",
|
||||
tenant_id=str(tenant.id),
|
||||
product_id=product["id"],
|
||||
model_id=str(model.id))
|
||||
|
||||
await session.commit()
|
||||
|
||||
logger.info("Successfully created fake AI models for tenant",
|
||||
tenant_id=str(tenant.id),
|
||||
models_created=models_created)
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Error creating fake AI models for tenant",
|
||||
tenant_id=str(tenant.id),
|
||||
error=str(e))
|
||||
raise
|
||||
|
||||
async def seed_all_demo_models(self):
|
||||
"""Seed fake AI models for all demo tenants"""
|
||||
logger.info("Starting demo AI models seeding")
|
||||
|
||||
try:
|
||||
# Get all demo tenants
|
||||
demo_tenants = await self.get_demo_tenants()
|
||||
|
||||
if not demo_tenants:
|
||||
logger.warning("No demo tenants found")
|
||||
return
|
||||
|
||||
logger.info(f"Found {len(demo_tenants)} demo tenants")
|
||||
|
||||
# Seed models for each tenant
|
||||
for tenant in demo_tenants:
|
||||
await self.seed_models_for_tenant(tenant)
|
||||
|
||||
logger.info("✅ Demo AI models seeding completed successfully",
|
||||
tenants_processed=len(demo_tenants))
|
||||
|
||||
except Exception as e:
|
||||
logger.error("❌ Demo AI models seeding failed", error=str(e))
|
||||
raise
|
||||
|
||||
|
||||
async def main():
|
||||
"""Main entry point"""
|
||||
logger.info("Demo AI Models Seed Script started")
|
||||
|
||||
try:
|
||||
seeder = DemoAIModelSeeder()
|
||||
await seeder.seed_all_demo_models()
|
||||
logger.info("Demo AI models seed completed successfully")
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Demo AI models seed failed", error=str(e))
|
||||
sys.exit(1)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(main())
|
||||
338
scripts/demo/seed_demo_inventory.py
Normal file
338
scripts/demo/seed_demo_inventory.py
Normal file
@@ -0,0 +1,338 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Seed Demo Inventory Data
|
||||
Populates comprehensive Spanish inventory data for both demo tenants
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
project_root = Path(__file__).parent.parent.parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
import os
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy import select, delete
|
||||
import structlog
|
||||
import uuid
|
||||
from datetime import datetime, timedelta, timezone
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Demo tenant IDs
|
||||
DEMO_TENANT_SAN_PABLO = "a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6"
|
||||
DEMO_TENANT_LA_ESPIGA = "b2c3d4e5-f6a7-48b9-c0d1-e2f3a4b5c6d7"
|
||||
|
||||
|
||||
async def seed_inventory_for_tenant(session, tenant_id: str, business_model: str):
|
||||
"""Seed inventory data for a specific tenant"""
|
||||
try:
|
||||
from app.models.inventory import Ingredient, Stock, StockMovement
|
||||
except ImportError:
|
||||
from services.inventory.app.models.inventory import Ingredient, Stock, StockMovement
|
||||
|
||||
logger.info(f"Seeding inventory for {business_model}", tenant_id=tenant_id)
|
||||
|
||||
# Check if data already exists - if so, skip seeding to avoid duplicates
|
||||
result = await session.execute(select(Ingredient).where(Ingredient.tenant_id == uuid.UUID(tenant_id)).limit(1))
|
||||
existing = result.scalars().first()
|
||||
if existing:
|
||||
logger.info(f"Demo tenant {tenant_id} already has inventory data, skipping seed")
|
||||
return
|
||||
|
||||
if business_model == "individual_bakery":
|
||||
await seed_individual_bakery_inventory(session, tenant_id)
|
||||
elif business_model == "central_baker_satellite":
|
||||
await seed_central_baker_inventory(session, tenant_id)
|
||||
|
||||
|
||||
async def seed_individual_bakery_inventory(session, tenant_id: str):
|
||||
"""Seed inventory for individual bakery (produces locally)"""
|
||||
try:
|
||||
from app.models.inventory import Ingredient, Stock
|
||||
except ImportError:
|
||||
from services.inventory.app.models.inventory import Ingredient, Stock
|
||||
|
||||
tenant_uuid = uuid.UUID(tenant_id)
|
||||
|
||||
# Raw ingredients for local production
|
||||
ingredients_data = [
|
||||
# Harinas
|
||||
("Harina de Trigo 000", "INGREDIENT", "FLOUR", None, "KILOGRAMS", 25.0, 50.0, 200.0, 2.50, "Molinos del Valle"),
|
||||
("Harina Integral", "INGREDIENT", "FLOUR", None, "KILOGRAMS", 15.0, 30.0, 100.0, 3.20, "Bio Natural"),
|
||||
("Harina de Centeno", "INGREDIENT", "FLOUR", None, "KILOGRAMS", 10.0, 20.0, 50.0, 3.50, "Ecológica"),
|
||||
|
||||
# Levaduras
|
||||
("Levadura Fresca", "INGREDIENT", "YEAST", None, "KILOGRAMS", 1.0, 2.5, 10.0, 8.50, "Levapan"),
|
||||
("Levadura Seca Activa", "INGREDIENT", "YEAST", None, "KILOGRAMS", 0.5, 1.0, 5.0, 12.00, "Fleischmann"),
|
||||
|
||||
# Grasas
|
||||
("Mantequilla", "INGREDIENT", "FATS", None, "KILOGRAMS", 3.0, 8.0, 25.0, 6.80, "La Serenísima"),
|
||||
("Aceite de Oliva Virgen Extra", "INGREDIENT", "FATS", None, "LITERS", 2.0, 5.0, 20.0, 15.50, "Cocinero"),
|
||||
|
||||
# Lácteos y Huevos
|
||||
("Huevos Frescos", "INGREDIENT", "EGGS", None, "UNITS", 36, 60, 180, 0.25, "Granja San José"),
|
||||
("Leche Entera", "INGREDIENT", "DAIRY", None, "LITERS", 5.0, 12.0, 50.0, 1.80, "La Serenísima"),
|
||||
("Nata para Montar", "INGREDIENT", "DAIRY", None, "LITERS", 2.0, 5.0, 20.0, 3.50, "Central Lechera"),
|
||||
|
||||
# Azúcares
|
||||
("Azúcar Blanca", "INGREDIENT", "SUGAR", None, "KILOGRAMS", 8.0, 20.0, 100.0, 1.20, "Ledesma"),
|
||||
("Azúcar Morena", "INGREDIENT", "SUGAR", None, "KILOGRAMS", 3.0, 8.0, 25.0, 2.80, "Orgánica"),
|
||||
("Azúcar Glass", "INGREDIENT", "SUGAR", None, "KILOGRAMS", 2.0, 5.0, 20.0, 2.20, "Ledesma"),
|
||||
|
||||
# Sal y Especias
|
||||
("Sal Fina", "INGREDIENT", "SALT", None, "KILOGRAMS", 2.0, 5.0, 20.0, 0.80, "Celusal"),
|
||||
("Canela en Polvo", "INGREDIENT", "SPICES", None, "GRAMS", 50, 150, 500, 0.08, "Alicante"),
|
||||
("Vainilla en Extracto", "INGREDIENT", "SPICES", None, "MILLILITERS", 100, 250, 1000, 0.15, "McCormick"),
|
||||
|
||||
# Chocolates y Aditivos
|
||||
("Chocolate Negro 70%", "INGREDIENT", "ADDITIVES", None, "KILOGRAMS", 1.0, 3.0, 15.0, 8.50, "Valor"),
|
||||
("Cacao en Polvo", "INGREDIENT", "ADDITIVES", None, "KILOGRAMS", 0.5, 2.0, 10.0, 6.50, "Nestlé"),
|
||||
("Nueces Peladas", "INGREDIENT", "ADDITIVES", None, "KILOGRAMS", 0.5, 1.5, 8.0, 12.00, "Los Nogales"),
|
||||
("Pasas de Uva", "INGREDIENT", "ADDITIVES", None, "KILOGRAMS", 1.0, 2.0, 10.0, 4.50, "Mendoza Premium"),
|
||||
|
||||
# Productos Terminados (producción local)
|
||||
("Croissant Clásico", "FINISHED_PRODUCT", None, "CROISSANTS", "PIECES", 12, 30, 80, 1.20, None),
|
||||
("Pan Integral", "FINISHED_PRODUCT", None, "BREAD", "PIECES", 8, 20, 50, 2.50, None),
|
||||
("Napolitana de Chocolate", "FINISHED_PRODUCT", None, "PASTRIES", "PIECES", 10, 25, 60, 1.80, None),
|
||||
("Pan de Masa Madre", "FINISHED_PRODUCT", None, "BREAD", "PIECES", 6, 15, 40, 3.50, None),
|
||||
("Magdalena de Vainilla", "FINISHED_PRODUCT", None, "PASTRIES", "PIECES", 8, 20, 50, 1.00, None),
|
||||
]
|
||||
|
||||
ingredient_map = {}
|
||||
for name, product_type, ing_cat, prod_cat, uom, low_stock, reorder, reorder_qty, cost, brand in ingredients_data:
|
||||
ing = Ingredient(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
name=name,
|
||||
product_type=product_type,
|
||||
ingredient_category=ing_cat,
|
||||
product_category=prod_cat,
|
||||
unit_of_measure=uom,
|
||||
low_stock_threshold=low_stock,
|
||||
reorder_point=reorder,
|
||||
reorder_quantity=reorder_qty,
|
||||
average_cost=cost,
|
||||
brand=brand,
|
||||
is_active=True,
|
||||
is_perishable=(ing_cat in ["DAIRY", "EGGS"] if ing_cat else False),
|
||||
shelf_life_days=7 if ing_cat in ["DAIRY", "EGGS"] else (365 if ing_cat else 2),
|
||||
created_at=datetime.now(timezone.utc)
|
||||
)
|
||||
session.add(ing)
|
||||
ingredient_map[name] = ing
|
||||
|
||||
await session.commit()
|
||||
|
||||
# Create stock lots
|
||||
now = datetime.now(timezone.utc)
|
||||
|
||||
# Harina de Trigo - Good stock
|
||||
harina_trigo = ingredient_map["Harina de Trigo 000"]
|
||||
session.add(Stock(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
ingredient_id=harina_trigo.id,
|
||||
production_stage="raw_ingredient",
|
||||
current_quantity=120.0,
|
||||
reserved_quantity=15.0,
|
||||
available_quantity=105.0,
|
||||
batch_number=f"HARINA-TRI-{now.strftime('%Y%m%d')}-001",
|
||||
received_date=now - timedelta(days=5),
|
||||
expiration_date=now + timedelta(days=360),
|
||||
unit_cost=2.50,
|
||||
total_cost=300.0,
|
||||
storage_location="Almacén Principal - Estante A1",
|
||||
is_available=True,
|
||||
is_expired=False,
|
||||
quality_status="good",
|
||||
created_at=now
|
||||
))
|
||||
|
||||
# Levadura Fresca - Low stock (critical)
|
||||
levadura = ingredient_map["Levadura Fresca"]
|
||||
session.add(Stock(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
ingredient_id=levadura.id,
|
||||
production_stage="raw_ingredient",
|
||||
current_quantity=0.8,
|
||||
reserved_quantity=0.3,
|
||||
available_quantity=0.5,
|
||||
batch_number=f"LEVAD-FRE-{now.strftime('%Y%m%d')}-001",
|
||||
received_date=now - timedelta(days=2),
|
||||
expiration_date=now + timedelta(days=5),
|
||||
unit_cost=8.50,
|
||||
total_cost=6.8,
|
||||
storage_location="Cámara Fría - Nivel 2",
|
||||
is_available=True,
|
||||
is_expired=False,
|
||||
quality_status="good",
|
||||
created_at=now
|
||||
))
|
||||
|
||||
# Croissants - Fresh batch
|
||||
croissant = ingredient_map["Croissant Clásico"]
|
||||
session.add(Stock(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
ingredient_id=croissant.id,
|
||||
production_stage="fully_baked",
|
||||
current_quantity=35,
|
||||
reserved_quantity=5,
|
||||
available_quantity=30,
|
||||
batch_number=f"CROIS-FRESH-{now.strftime('%Y%m%d')}-001",
|
||||
received_date=now - timedelta(hours=4),
|
||||
expiration_date=now + timedelta(hours=20),
|
||||
unit_cost=1.20,
|
||||
total_cost=42.0,
|
||||
storage_location="Vitrina Principal - Nivel 1",
|
||||
is_available=True,
|
||||
is_expired=False,
|
||||
quality_status="good",
|
||||
created_at=now
|
||||
))
|
||||
|
||||
await session.commit()
|
||||
logger.info("Individual bakery inventory seeded")
|
||||
|
||||
|
||||
async def seed_central_baker_inventory(session, tenant_id: str):
|
||||
"""Seed inventory for central baker satellite (receives products)"""
|
||||
try:
|
||||
from app.models.inventory import Ingredient, Stock
|
||||
except ImportError:
|
||||
from services.inventory.app.models.inventory import Ingredient, Stock
|
||||
|
||||
tenant_uuid = uuid.UUID(tenant_id)
|
||||
|
||||
# Finished and par-baked products from central baker
|
||||
ingredients_data = [
|
||||
# Productos Pre-Horneados (del obrador central)
|
||||
("Croissant Pre-Horneado", "FINISHED_PRODUCT", None, "CROISSANTS", "PIECES", 20, 50, 150, 0.85, "Obrador Central"),
|
||||
("Pan Baguette Pre-Horneado", "FINISHED_PRODUCT", None, "BREAD", "PIECES", 15, 40, 120, 1.20, "Obrador Central"),
|
||||
("Napolitana Pre-Horneada", "FINISHED_PRODUCT", None, "PASTRIES", "PIECES", 15, 35, 100, 1.50, "Obrador Central"),
|
||||
("Pan de Molde Pre-Horneado", "FINISHED_PRODUCT", None, "BREAD", "PIECES", 10, 25, 80, 1.80, "Obrador Central"),
|
||||
|
||||
# Productos Terminados (listos para venta)
|
||||
("Croissant de Mantequilla", "FINISHED_PRODUCT", None, "CROISSANTS", "PIECES", 15, 40, 100, 1.20, "Obrador Central"),
|
||||
("Palmera de Hojaldre", "FINISHED_PRODUCT", None, "PASTRIES", "PIECES", 10, 30, 80, 2.20, "Obrador Central"),
|
||||
("Magdalena Tradicional", "FINISHED_PRODUCT", None, "PASTRIES", "PIECES", 12, 30, 80, 1.00, "Obrador Central"),
|
||||
("Empanada de Atún", "FINISHED_PRODUCT", None, "OTHER_PRODUCTS", "PIECES", 8, 20, 60, 3.50, "Obrador Central"),
|
||||
("Pan Integral de Molde", "FINISHED_PRODUCT", None, "BREAD", "PIECES", 10, 25, 75, 2.80, "Obrador Central"),
|
||||
|
||||
# Algunos ingredientes básicos
|
||||
("Café en Grano", "INGREDIENT", "OTHER", None, "KILOGRAMS", 2.0, 5.0, 20.0, 18.50, "Lavazza"),
|
||||
("Leche para Cafetería", "INGREDIENT", "DAIRY", None, "LITERS", 10.0, 20.0, 80.0, 1.50, "Central Lechera"),
|
||||
("Azúcar para Cafetería", "INGREDIENT", "SUGAR", None, "KILOGRAMS", 3.0, 8.0, 30.0, 1.00, "Azucarera"),
|
||||
]
|
||||
|
||||
ingredient_map = {}
|
||||
for name, product_type, ing_cat, prod_cat, uom, low_stock, reorder, reorder_qty, cost, brand in ingredients_data:
|
||||
ing = Ingredient(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
name=name,
|
||||
product_type=product_type,
|
||||
ingredient_category=ing_cat,
|
||||
product_category=prod_cat,
|
||||
unit_of_measure=uom,
|
||||
low_stock_threshold=low_stock,
|
||||
reorder_point=reorder,
|
||||
reorder_quantity=reorder_qty,
|
||||
average_cost=cost,
|
||||
brand=brand,
|
||||
is_active=True,
|
||||
is_perishable=True,
|
||||
shelf_life_days=3,
|
||||
created_at=datetime.now(timezone.utc)
|
||||
)
|
||||
session.add(ing)
|
||||
ingredient_map[name] = ing
|
||||
|
||||
await session.commit()
|
||||
|
||||
# Create stock lots
|
||||
now = datetime.now(timezone.utc)
|
||||
|
||||
# Croissants pre-horneados
|
||||
croissant_pre = ingredient_map["Croissant Pre-Horneado"]
|
||||
session.add(Stock(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
ingredient_id=croissant_pre.id,
|
||||
production_stage="par_baked",
|
||||
current_quantity=75,
|
||||
reserved_quantity=15,
|
||||
available_quantity=60,
|
||||
batch_number=f"CROIS-PAR-{now.strftime('%Y%m%d')}-001",
|
||||
received_date=now - timedelta(days=1),
|
||||
expiration_date=now + timedelta(days=4),
|
||||
unit_cost=0.85,
|
||||
total_cost=63.75,
|
||||
storage_location="Congelador - Sección A",
|
||||
is_available=True,
|
||||
is_expired=False,
|
||||
quality_status="good",
|
||||
created_at=now
|
||||
))
|
||||
|
||||
# Palmeras terminadas
|
||||
palmera = ingredient_map["Palmera de Hojaldre"]
|
||||
session.add(Stock(
|
||||
id=uuid.uuid4(),
|
||||
tenant_id=tenant_uuid,
|
||||
ingredient_id=palmera.id,
|
||||
production_stage="fully_baked",
|
||||
current_quantity=28,
|
||||
reserved_quantity=4,
|
||||
available_quantity=24,
|
||||
batch_number=f"PALM-{now.strftime('%Y%m%d')}-001",
|
||||
received_date=now - timedelta(hours=3),
|
||||
expiration_date=now + timedelta(hours=45),
|
||||
unit_cost=2.20,
|
||||
total_cost=61.6,
|
||||
storage_location="Vitrina Pasteles - Nivel 2",
|
||||
is_available=True,
|
||||
is_expired=False,
|
||||
quality_status="good",
|
||||
created_at=now
|
||||
))
|
||||
|
||||
await session.commit()
|
||||
logger.info("Central baker satellite inventory seeded")
|
||||
|
||||
|
||||
async def seed_demo_inventory():
|
||||
"""Main seeding function"""
|
||||
database_url = os.getenv("INVENTORY_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.error("INVENTORY_DATABASE_URL not set")
|
||||
return False
|
||||
|
||||
engine = create_async_engine(database_url, echo=False)
|
||||
session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
|
||||
|
||||
try:
|
||||
async with session_factory() as session:
|
||||
# Seed both demo tenants
|
||||
await seed_inventory_for_tenant(session, DEMO_TENANT_SAN_PABLO, "individual_bakery")
|
||||
await seed_inventory_for_tenant(session, DEMO_TENANT_LA_ESPIGA, "central_baker_satellite")
|
||||
|
||||
logger.info("Demo inventory data seeded successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to seed inventory: {str(e)}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
return False
|
||||
|
||||
finally:
|
||||
await engine.dispose()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
result = asyncio.run(seed_demo_inventory())
|
||||
sys.exit(0 if result else 1)
|
||||
144
scripts/demo/seed_demo_tenants.py
Normal file
144
scripts/demo/seed_demo_tenants.py
Normal file
@@ -0,0 +1,144 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Seed Demo Tenants
|
||||
Creates base demo tenant templates with Spanish data
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
project_root = Path(__file__).parent.parent.parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
import os
|
||||
os.environ.setdefault("TENANT_DATABASE_URL", os.getenv("TENANT_DATABASE_URL"))
|
||||
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy import select
|
||||
import structlog
|
||||
import uuid
|
||||
from datetime import datetime, timezone
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Demo tenant configurations
|
||||
DEMO_TENANTS = [
|
||||
{
|
||||
"id": "a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6",
|
||||
"name": "Panadería San Pablo - Demo",
|
||||
"subdomain": "demo-sanpablo",
|
||||
"business_type": "bakery",
|
||||
"business_model": "individual_bakery",
|
||||
"owner_id": "c1a2b3c4-d5e6-47a8-b9c0-d1e2f3a4b5c6", # María García
|
||||
"address": "Calle Mayor, 15",
|
||||
"city": "Madrid",
|
||||
"postal_code": "28013",
|
||||
"latitude": 40.4168,
|
||||
"longitude": -3.7038,
|
||||
"phone": "+34 912 345 678",
|
||||
"email": "contacto@panaderiasanpablo.com",
|
||||
"subscription_tier": "professional",
|
||||
"is_active": True,
|
||||
"is_demo": True,
|
||||
"is_demo_template": True,
|
||||
"ml_model_trained": True,
|
||||
},
|
||||
{
|
||||
"id": "b2c3d4e5-f6a7-48b9-c0d1-e2f3a4b5c6d7",
|
||||
"name": "Panadería La Espiga - Demo",
|
||||
"subdomain": "demo-laespiga",
|
||||
"business_type": "bakery",
|
||||
"business_model": "central_baker_satellite",
|
||||
"owner_id": "d2e3f4a5-b6c7-48d9-e0f1-a2b3c4d5e6f7", # Carlos Martínez
|
||||
"address": "Avenida de la Constitución, 42",
|
||||
"city": "Barcelona",
|
||||
"postal_code": "08001",
|
||||
"latitude": 41.3851,
|
||||
"longitude": 2.1734,
|
||||
"phone": "+34 913 456 789",
|
||||
"email": "contacto@panaderialaespiga.com",
|
||||
"subscription_tier": "enterprise",
|
||||
"is_active": True,
|
||||
"is_demo": True,
|
||||
"is_demo_template": True,
|
||||
"ml_model_trained": True,
|
||||
}
|
||||
]
|
||||
|
||||
|
||||
async def seed_demo_tenants():
|
||||
"""Seed demo tenants into tenant database"""
|
||||
|
||||
database_url = os.getenv("TENANT_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.error("TENANT_DATABASE_URL environment variable not set")
|
||||
return False
|
||||
|
||||
logger.info("Connecting to tenant database", url=database_url.split("@")[-1])
|
||||
|
||||
engine = create_async_engine(database_url, echo=False)
|
||||
session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
|
||||
|
||||
try:
|
||||
async with session_factory() as session:
|
||||
try:
|
||||
from app.models.tenants import Tenant
|
||||
except ImportError:
|
||||
from services.tenant.app.models.tenants import Tenant
|
||||
|
||||
for tenant_data in DEMO_TENANTS:
|
||||
# Check if tenant already exists
|
||||
result = await session.execute(
|
||||
select(Tenant).where(Tenant.subdomain == tenant_data["subdomain"])
|
||||
)
|
||||
existing_tenant = result.scalar_one_or_none()
|
||||
|
||||
if existing_tenant:
|
||||
logger.info(f"Demo tenant already exists: {tenant_data['subdomain']}")
|
||||
continue
|
||||
|
||||
# Create new demo tenant
|
||||
tenant = Tenant(
|
||||
id=uuid.UUID(tenant_data["id"]),
|
||||
name=tenant_data["name"],
|
||||
subdomain=tenant_data["subdomain"],
|
||||
business_type=tenant_data["business_type"],
|
||||
business_model=tenant_data["business_model"],
|
||||
owner_id=uuid.UUID(tenant_data["owner_id"]),
|
||||
address=tenant_data["address"],
|
||||
city=tenant_data["city"],
|
||||
postal_code=tenant_data["postal_code"],
|
||||
latitude=tenant_data.get("latitude"),
|
||||
longitude=tenant_data.get("longitude"),
|
||||
phone=tenant_data.get("phone"),
|
||||
email=tenant_data.get("email"),
|
||||
subscription_tier=tenant_data["subscription_tier"],
|
||||
is_active=tenant_data["is_active"],
|
||||
is_demo=tenant_data["is_demo"],
|
||||
is_demo_template=tenant_data["is_demo_template"],
|
||||
ml_model_trained=tenant_data.get("ml_model_trained", False),
|
||||
created_at=datetime.now(timezone.utc),
|
||||
updated_at=datetime.now(timezone.utc)
|
||||
)
|
||||
|
||||
session.add(tenant)
|
||||
logger.info(f"Created demo tenant: {tenant_data['name']}")
|
||||
|
||||
await session.commit()
|
||||
logger.info("Demo tenants seeded successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to seed demo tenants: {str(e)}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
return False
|
||||
|
||||
finally:
|
||||
await engine.dispose()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
result = asyncio.run(seed_demo_tenants())
|
||||
sys.exit(0 if result else 1)
|
||||
121
scripts/demo/seed_demo_users.py
Normal file
121
scripts/demo/seed_demo_users.py
Normal file
@@ -0,0 +1,121 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Seed Demo Users
|
||||
Creates demo user accounts for production demo environment
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
project_root = Path(__file__).parent.parent.parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
import os
|
||||
os.environ.setdefault("AUTH_DATABASE_URL", os.getenv("AUTH_DATABASE_URL"))
|
||||
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy import select
|
||||
import structlog
|
||||
import uuid
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Demo user configurations (public credentials for prospects)
|
||||
DEMO_USERS = [
|
||||
{
|
||||
"id": "c1a2b3c4-d5e6-47a8-b9c0-d1e2f3a4b5c6",
|
||||
"email": "demo.individual@panaderiasanpablo.com",
|
||||
"password_hash": "$2b$12$LQv3c1yqBWVHxkd0LHAkCOYz6TtxMQJqhN8/LewY5GyYVPWzO8hGi", # DemoSanPablo2024!
|
||||
"full_name": "María García López",
|
||||
"phone": "+34 912 345 678",
|
||||
"language": "es",
|
||||
"timezone": "Europe/Madrid",
|
||||
"role": "owner",
|
||||
"is_active": True,
|
||||
"is_verified": True,
|
||||
"is_demo": True
|
||||
},
|
||||
{
|
||||
"id": "d2e3f4a5-b6c7-48d9-e0f1-a2b3c4d5e6f7",
|
||||
"email": "demo.central@panaderialaespiga.com",
|
||||
"password_hash": "$2b$12$LQv3c1yqBWVHxkd0LHAkCOYz6TtxMQJqhN8/LewY5GyYVPWzO8hGi", # DemoLaEspiga2024!
|
||||
"full_name": "Carlos Martínez Ruiz",
|
||||
"phone": "+34 913 456 789",
|
||||
"language": "es",
|
||||
"timezone": "Europe/Madrid",
|
||||
"role": "owner",
|
||||
"is_active": True,
|
||||
"is_verified": True,
|
||||
"is_demo": True
|
||||
}
|
||||
]
|
||||
|
||||
|
||||
async def seed_demo_users():
|
||||
"""Seed demo users into auth database"""
|
||||
|
||||
database_url = os.getenv("AUTH_DATABASE_URL")
|
||||
if not database_url:
|
||||
logger.error("AUTH_DATABASE_URL environment variable not set")
|
||||
return False
|
||||
|
||||
logger.info("Connecting to auth database", url=database_url.split("@")[-1])
|
||||
|
||||
engine = create_async_engine(database_url, echo=False)
|
||||
session_factory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
|
||||
|
||||
try:
|
||||
async with session_factory() as session:
|
||||
# Import User model
|
||||
try:
|
||||
from app.models.users import User
|
||||
except ImportError:
|
||||
from services.auth.app.models.users import User
|
||||
from datetime import datetime, timezone
|
||||
|
||||
for user_data in DEMO_USERS:
|
||||
# Check if user already exists
|
||||
result = await session.execute(
|
||||
select(User).where(User.email == user_data["email"])
|
||||
)
|
||||
existing_user = result.scalar_one_or_none()
|
||||
|
||||
if existing_user:
|
||||
logger.info(f"Demo user already exists: {user_data['email']}")
|
||||
continue
|
||||
|
||||
# Create new demo user
|
||||
user = User(
|
||||
id=uuid.UUID(user_data["id"]),
|
||||
email=user_data["email"],
|
||||
hashed_password=user_data["password_hash"],
|
||||
full_name=user_data["full_name"],
|
||||
phone=user_data.get("phone"),
|
||||
language=user_data.get("language", "es"),
|
||||
timezone=user_data.get("timezone", "Europe/Madrid"),
|
||||
role=user_data.get("role", "owner"),
|
||||
is_active=user_data.get("is_active", True),
|
||||
is_verified=user_data.get("is_verified", True),
|
||||
created_at=datetime.now(timezone.utc),
|
||||
updated_at=datetime.now(timezone.utc)
|
||||
)
|
||||
|
||||
session.add(user)
|
||||
logger.info(f"Created demo user: {user_data['email']}")
|
||||
|
||||
await session.commit()
|
||||
logger.info("Demo users seeded successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to seed demo users: {str(e)}")
|
||||
return False
|
||||
|
||||
finally:
|
||||
await engine.dispose()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
result = asyncio.run(seed_demo_users())
|
||||
sys.exit(0 if result else 1)
|
||||
49
scripts/manual_seed_demo.py
Normal file
49
scripts/manual_seed_demo.py
Normal file
@@ -0,0 +1,49 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Manual demo data seeding script
|
||||
Run this to populate the base demo template tenant with inventory data
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import sys
|
||||
import os
|
||||
|
||||
# Add the project root to Python path
|
||||
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))
|
||||
|
||||
async def seed_demo_data():
|
||||
"""Seed demo data by running all seed scripts in order"""
|
||||
from scripts.demo.seed_demo_users import main as seed_users
|
||||
from scripts.demo.seed_demo_tenants import main as seed_tenants
|
||||
from scripts.demo.seed_demo_inventory import main as seed_inventory
|
||||
from scripts.demo.seed_demo_ai_models import main as seed_ai_models
|
||||
|
||||
print("🌱 Starting demo data seeding...")
|
||||
|
||||
try:
|
||||
print("\n📝 Step 1: Seeding demo users...")
|
||||
await seed_users()
|
||||
print("✅ Demo users seeded successfully")
|
||||
|
||||
print("\n🏢 Step 2: Seeding demo tenants...")
|
||||
await seed_tenants()
|
||||
print("✅ Demo tenants seeded successfully")
|
||||
|
||||
print("\n📦 Step 3: Seeding demo inventory...")
|
||||
await seed_inventory()
|
||||
print("✅ Demo inventory seeded successfully")
|
||||
|
||||
print("\n🤖 Step 4: Seeding demo AI models...")
|
||||
await seed_ai_models()
|
||||
print("✅ Demo AI models seeded successfully")
|
||||
|
||||
print("\n🎉 All demo data seeded successfully!")
|
||||
|
||||
except Exception as e:
|
||||
print(f"\n❌ Error during seeding: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
sys.exit(1)
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(seed_demo_data())
|
||||
42
services/demo_session/Dockerfile
Normal file
42
services/demo_session/Dockerfile
Normal file
@@ -0,0 +1,42 @@
|
||||
# Multi-stage build for Demo Session Service
|
||||
FROM python:3.11-slim as builder
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
# Install build dependencies
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
gcc \
|
||||
g++ \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
|
||||
# Copy requirements and install
|
||||
COPY services/demo_session/requirements.txt .
|
||||
RUN pip install --no-cache-dir --user -r requirements.txt
|
||||
|
||||
# Final stage
|
||||
FROM python:3.11-slim
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
# Copy Python dependencies from builder
|
||||
COPY --from=builder /root/.local /root/.local
|
||||
|
||||
# Copy shared libraries
|
||||
COPY shared/ /app/shared/
|
||||
|
||||
# Copy service code
|
||||
COPY services/demo_session/ /app/
|
||||
|
||||
# Copy scripts
|
||||
COPY scripts/ /app/scripts/
|
||||
|
||||
# Make sure scripts are in path
|
||||
ENV PATH=/root/.local/bin:$PATH
|
||||
ENV PYTHONPATH=/app:$PYTHONPATH
|
||||
|
||||
# Health check
|
||||
HEALTHCHECK --interval=30s --timeout=10s --start-period=40s --retries=3 \
|
||||
CMD python -c "import httpx; httpx.get('http://localhost:8000/health')"
|
||||
|
||||
# Run the application
|
||||
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
|
||||
40
services/demo_session/alembic.ini
Normal file
40
services/demo_session/alembic.ini
Normal file
@@ -0,0 +1,40 @@
|
||||
[alembic]
|
||||
script_location = migrations
|
||||
prepend_sys_path = .
|
||||
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@localhost:5432/demo_session_db
|
||||
|
||||
[post_write_hooks]
|
||||
|
||||
[loggers]
|
||||
keys = root,sqlalchemy,alembic
|
||||
|
||||
[handlers]
|
||||
keys = console
|
||||
|
||||
[formatters]
|
||||
keys = generic
|
||||
|
||||
[logger_root]
|
||||
level = WARN
|
||||
handlers = console
|
||||
qualname =
|
||||
|
||||
[logger_sqlalchemy]
|
||||
level = WARN
|
||||
handlers =
|
||||
qualname = sqlalchemy.engine
|
||||
|
||||
[logger_alembic]
|
||||
level = INFO
|
||||
handlers =
|
||||
qualname = alembic
|
||||
|
||||
[handler_console]
|
||||
class = StreamHandler
|
||||
args = (sys.stderr,)
|
||||
level = NOTSET
|
||||
formatter = generic
|
||||
|
||||
[formatter_generic]
|
||||
format = %(levelname)-5.5s [%(name)s] %(message)s
|
||||
datefmt = %H:%M:%S
|
||||
3
services/demo_session/app/__init__.py
Normal file
3
services/demo_session/app/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
||||
"""Demo Session Service"""
|
||||
|
||||
__version__ = "1.0.0"
|
||||
5
services/demo_session/app/api/__init__.py
Normal file
5
services/demo_session/app/api/__init__.py
Normal file
@@ -0,0 +1,5 @@
|
||||
"""Demo Session API"""
|
||||
|
||||
from .routes import router
|
||||
|
||||
__all__ = ["router"]
|
||||
254
services/demo_session/app/api/routes.py
Normal file
254
services/demo_session/app/api/routes.py
Normal file
@@ -0,0 +1,254 @@
|
||||
"""
|
||||
Demo Session API Routes
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException, Request
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from typing import List
|
||||
import structlog
|
||||
|
||||
from app.api.schemas import (
|
||||
DemoSessionCreate,
|
||||
DemoSessionResponse,
|
||||
DemoSessionExtend,
|
||||
DemoSessionDestroy,
|
||||
DemoSessionStats,
|
||||
DemoAccountInfo
|
||||
)
|
||||
from app.services import DemoSessionManager, DemoDataCloner, DemoCleanupService
|
||||
from app.core import get_db, get_redis, settings, RedisClient
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
router = APIRouter(prefix="/api/demo", tags=["demo"])
|
||||
|
||||
|
||||
@router.get("/accounts", response_model=List[DemoAccountInfo])
|
||||
async def get_demo_accounts():
|
||||
"""
|
||||
Get public demo account information
|
||||
Returns credentials for prospects to use
|
||||
"""
|
||||
accounts = []
|
||||
|
||||
for account_type, config in settings.DEMO_ACCOUNTS.items():
|
||||
accounts.append({
|
||||
"account_type": account_type,
|
||||
"name": config["name"],
|
||||
"email": config["email"],
|
||||
"password": "DemoSanPablo2024!" if "sanpablo" in config["email"] else "DemoLaEspiga2024!",
|
||||
"description": (
|
||||
"Panadería individual que produce todo localmente"
|
||||
if account_type == "individual_bakery"
|
||||
else "Punto de venta con obrador central"
|
||||
),
|
||||
"features": (
|
||||
["Gestión de Producción", "Recetas", "Inventario", "Previsión de Demanda", "Ventas"]
|
||||
if account_type == "individual_bakery"
|
||||
else ["Gestión de Proveedores", "Inventario", "Ventas", "Pedidos", "Previsión"]
|
||||
),
|
||||
"business_model": (
|
||||
"Producción Local" if account_type == "individual_bakery" else "Obrador Central + Punto de Venta"
|
||||
)
|
||||
})
|
||||
|
||||
return accounts
|
||||
|
||||
|
||||
@router.post("/session/create", response_model=DemoSessionResponse)
|
||||
async def create_demo_session(
|
||||
request: DemoSessionCreate,
|
||||
http_request: Request,
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Create a new isolated demo session
|
||||
"""
|
||||
logger.info("Creating demo session", demo_account_type=request.demo_account_type)
|
||||
|
||||
try:
|
||||
# Get client info
|
||||
ip_address = request.ip_address or http_request.client.host
|
||||
user_agent = request.user_agent or http_request.headers.get("user-agent", "")
|
||||
|
||||
# Create session
|
||||
session_manager = DemoSessionManager(db, redis)
|
||||
session = await session_manager.create_session(
|
||||
demo_account_type=request.demo_account_type,
|
||||
user_id=request.user_id,
|
||||
ip_address=ip_address,
|
||||
user_agent=user_agent
|
||||
)
|
||||
|
||||
# Clone demo data using Kubernetes Job (better architecture)
|
||||
from app.services.k8s_job_cloner import K8sJobCloner
|
||||
|
||||
job_cloner = K8sJobCloner()
|
||||
|
||||
# Trigger async cloning job (don't wait for completion)
|
||||
import asyncio
|
||||
asyncio.create_task(
|
||||
job_cloner.clone_tenant_data(
|
||||
session.session_id,
|
||||
"", # base_tenant_id not used in job approach
|
||||
str(session.virtual_tenant_id),
|
||||
request.demo_account_type
|
||||
)
|
||||
)
|
||||
|
||||
# Mark as data cloning started
|
||||
await session_manager.mark_data_cloned(session.session_id)
|
||||
await session_manager.mark_redis_populated(session.session_id)
|
||||
|
||||
# Generate session token (simple JWT-like format)
|
||||
import jwt
|
||||
from datetime import datetime, timezone
|
||||
|
||||
session_token = jwt.encode(
|
||||
{
|
||||
"session_id": session.session_id,
|
||||
"virtual_tenant_id": str(session.virtual_tenant_id),
|
||||
"demo_account_type": request.demo_account_type,
|
||||
"exp": session.expires_at.timestamp()
|
||||
},
|
||||
"demo-secret-key", # In production, use proper secret
|
||||
algorithm="HS256"
|
||||
)
|
||||
|
||||
return {
|
||||
"session_id": session.session_id,
|
||||
"virtual_tenant_id": str(session.virtual_tenant_id),
|
||||
"demo_account_type": session.demo_account_type,
|
||||
"status": session.status.value,
|
||||
"created_at": session.created_at,
|
||||
"expires_at": session.expires_at,
|
||||
"demo_config": session.metadata.get("demo_config", {}),
|
||||
"session_token": session_token
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Failed to create demo session", error=str(e))
|
||||
raise HTTPException(status_code=500, detail=f"Failed to create demo session: {str(e)}")
|
||||
|
||||
|
||||
@router.post("/session/extend", response_model=DemoSessionResponse)
|
||||
async def extend_demo_session(
|
||||
request: DemoSessionExtend,
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Extend demo session expiration
|
||||
"""
|
||||
try:
|
||||
session_manager = DemoSessionManager(db, redis)
|
||||
session = await session_manager.extend_session(request.session_id)
|
||||
|
||||
# Generate new token
|
||||
import jwt
|
||||
session_token = jwt.encode(
|
||||
{
|
||||
"session_id": session.session_id,
|
||||
"virtual_tenant_id": str(session.virtual_tenant_id),
|
||||
"demo_account_type": session.demo_account_type,
|
||||
"exp": session.expires_at.timestamp()
|
||||
},
|
||||
"demo-secret-key",
|
||||
algorithm="HS256"
|
||||
)
|
||||
|
||||
return {
|
||||
"session_id": session.session_id,
|
||||
"virtual_tenant_id": str(session.virtual_tenant_id),
|
||||
"demo_account_type": session.demo_account_type,
|
||||
"status": session.status.value,
|
||||
"created_at": session.created_at,
|
||||
"expires_at": session.expires_at,
|
||||
"demo_config": session.metadata.get("demo_config", {}),
|
||||
"session_token": session_token
|
||||
}
|
||||
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
except Exception as e:
|
||||
logger.error("Failed to extend session", error=str(e))
|
||||
raise HTTPException(status_code=500, detail=str(e))
|
||||
|
||||
|
||||
@router.post("/session/destroy")
|
||||
async def destroy_demo_session(
|
||||
request: DemoSessionDestroy,
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Destroy demo session and cleanup resources
|
||||
"""
|
||||
try:
|
||||
session_manager = DemoSessionManager(db, redis)
|
||||
await session_manager.destroy_session(request.session_id)
|
||||
|
||||
return {"message": "Session destroyed successfully", "session_id": request.session_id}
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Failed to destroy session", error=str(e))
|
||||
raise HTTPException(status_code=500, detail=str(e))
|
||||
|
||||
|
||||
@router.get("/session/{session_id}")
|
||||
async def get_session_info(
|
||||
session_id: str,
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Get demo session information
|
||||
"""
|
||||
session_manager = DemoSessionManager(db, redis)
|
||||
session = await session_manager.get_session(session_id)
|
||||
|
||||
if not session:
|
||||
raise HTTPException(status_code=404, detail="Session not found")
|
||||
|
||||
return session.to_dict()
|
||||
|
||||
|
||||
@router.get("/stats", response_model=DemoSessionStats)
|
||||
async def get_demo_stats(
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Get demo session statistics
|
||||
"""
|
||||
session_manager = DemoSessionManager(db, redis)
|
||||
stats = await session_manager.get_session_stats()
|
||||
return stats
|
||||
|
||||
|
||||
@router.post("/cleanup/run")
|
||||
async def run_cleanup(
|
||||
db: AsyncSession = Depends(get_db),
|
||||
redis: RedisClient = Depends(get_redis)
|
||||
):
|
||||
"""
|
||||
Manually trigger session cleanup
|
||||
Internal endpoint for CronJob
|
||||
"""
|
||||
cleanup_service = DemoCleanupService(db, redis)
|
||||
stats = await cleanup_service.cleanup_expired_sessions()
|
||||
return stats
|
||||
|
||||
|
||||
@router.get("/health")
|
||||
async def health_check(redis: RedisClient = Depends(get_redis)):
|
||||
"""
|
||||
Health check endpoint
|
||||
"""
|
||||
redis_ok = await redis.ping()
|
||||
|
||||
return {
|
||||
"status": "healthy" if redis_ok else "degraded",
|
||||
"redis": "connected" if redis_ok else "disconnected"
|
||||
}
|
||||
76
services/demo_session/app/api/schemas.py
Normal file
76
services/demo_session/app/api/schemas.py
Normal file
@@ -0,0 +1,76 @@
|
||||
"""
|
||||
API Schemas for Demo Session Service
|
||||
"""
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
from typing import Optional, Dict, Any
|
||||
from datetime import datetime
|
||||
|
||||
|
||||
class DemoSessionCreate(BaseModel):
|
||||
"""Create demo session request"""
|
||||
demo_account_type: str = Field(..., description="individual_bakery or central_baker")
|
||||
user_id: Optional[str] = Field(None, description="Optional authenticated user ID")
|
||||
ip_address: Optional[str] = None
|
||||
user_agent: Optional[str] = None
|
||||
|
||||
|
||||
class DemoSessionResponse(BaseModel):
|
||||
"""Demo session response"""
|
||||
session_id: str
|
||||
virtual_tenant_id: str
|
||||
demo_account_type: str
|
||||
status: str
|
||||
created_at: datetime
|
||||
expires_at: datetime
|
||||
demo_config: Dict[str, Any]
|
||||
session_token: str
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class DemoSessionExtend(BaseModel):
|
||||
"""Extend session request"""
|
||||
session_id: str
|
||||
|
||||
|
||||
class DemoSessionDestroy(BaseModel):
|
||||
"""Destroy session request"""
|
||||
session_id: str
|
||||
|
||||
|
||||
class DemoSessionStats(BaseModel):
|
||||
"""Demo session statistics"""
|
||||
total_sessions: int
|
||||
active_sessions: int
|
||||
expired_sessions: int
|
||||
destroyed_sessions: int
|
||||
avg_duration_minutes: float
|
||||
total_requests: int
|
||||
|
||||
|
||||
class DemoAccountInfo(BaseModel):
|
||||
"""Public demo account information"""
|
||||
account_type: str
|
||||
name: str
|
||||
email: str
|
||||
password: str
|
||||
description: str
|
||||
features: list[str]
|
||||
business_model: str
|
||||
|
||||
|
||||
class CloneDataRequest(BaseModel):
|
||||
"""Request to clone tenant data"""
|
||||
base_tenant_id: str
|
||||
virtual_tenant_id: str
|
||||
session_id: str
|
||||
|
||||
|
||||
class CloneDataResponse(BaseModel):
|
||||
"""Response from data cloning"""
|
||||
session_id: str
|
||||
services_cloned: list[str]
|
||||
total_records: int
|
||||
redis_keys: int
|
||||
7
services/demo_session/app/core/__init__.py
Normal file
7
services/demo_session/app/core/__init__.py
Normal file
@@ -0,0 +1,7 @@
|
||||
"""Demo Session Service Core"""
|
||||
|
||||
from .config import settings
|
||||
from .database import DatabaseManager, get_db
|
||||
from .redis_client import RedisClient, get_redis
|
||||
|
||||
__all__ = ["settings", "DatabaseManager", "get_db", "RedisClient", "get_redis"]
|
||||
66
services/demo_session/app/core/config.py
Normal file
66
services/demo_session/app/core/config.py
Normal file
@@ -0,0 +1,66 @@
|
||||
"""
|
||||
Demo Session Service Configuration
|
||||
"""
|
||||
|
||||
import os
|
||||
from pydantic_settings import BaseSettings
|
||||
from typing import Optional
|
||||
|
||||
|
||||
class Settings(BaseSettings):
|
||||
"""Demo Session Service Settings"""
|
||||
|
||||
# Service info
|
||||
SERVICE_NAME: str = "demo-session"
|
||||
VERSION: str = "1.0.0"
|
||||
DEBUG: bool = os.getenv("DEBUG", "false").lower() == "true"
|
||||
|
||||
# Database
|
||||
DATABASE_URL: str = os.getenv(
|
||||
"DEMO_SESSION_DATABASE_URL",
|
||||
"postgresql+asyncpg://postgres:postgres@localhost:5432/demo_session_db"
|
||||
)
|
||||
|
||||
# Redis
|
||||
REDIS_URL: str = os.getenv("REDIS_URL", "redis://localhost:6379/0")
|
||||
REDIS_KEY_PREFIX: str = "demo:session"
|
||||
REDIS_SESSION_TTL: int = 1800 # 30 minutes
|
||||
|
||||
# Demo session configuration
|
||||
DEMO_SESSION_DURATION_MINUTES: int = 30
|
||||
DEMO_SESSION_MAX_EXTENSIONS: int = 3
|
||||
DEMO_SESSION_CLEANUP_INTERVAL_MINUTES: int = 60
|
||||
|
||||
# Demo account credentials (public)
|
||||
DEMO_ACCOUNTS: dict = {
|
||||
"individual_bakery": {
|
||||
"email": "demo.individual@panaderiasanpablo.com",
|
||||
"name": "Panadería San Pablo - Demo",
|
||||
"subdomain": "demo-sanpablo"
|
||||
},
|
||||
"central_baker": {
|
||||
"email": "demo.central@panaderialaespiga.com",
|
||||
"name": "Panadería La Espiga - Demo",
|
||||
"subdomain": "demo-laespiga"
|
||||
}
|
||||
}
|
||||
|
||||
# Service URLs
|
||||
AUTH_SERVICE_URL: str = os.getenv("AUTH_SERVICE_URL", "http://auth-service:8000")
|
||||
TENANT_SERVICE_URL: str = os.getenv("TENANT_SERVICE_URL", "http://tenant-service:8000")
|
||||
INVENTORY_SERVICE_URL: str = os.getenv("INVENTORY_SERVICE_URL", "http://inventory-service:8000")
|
||||
RECIPES_SERVICE_URL: str = os.getenv("RECIPES_SERVICE_URL", "http://recipes-service:8000")
|
||||
SALES_SERVICE_URL: str = os.getenv("SALES_SERVICE_URL", "http://sales-service:8000")
|
||||
ORDERS_SERVICE_URL: str = os.getenv("ORDERS_SERVICE_URL", "http://orders-service:8000")
|
||||
PRODUCTION_SERVICE_URL: str = os.getenv("PRODUCTION_SERVICE_URL", "http://production-service:8000")
|
||||
SUPPLIERS_SERVICE_URL: str = os.getenv("SUPPLIERS_SERVICE_URL", "http://suppliers-service:8000")
|
||||
|
||||
# Logging
|
||||
LOG_LEVEL: str = os.getenv("LOG_LEVEL", "INFO")
|
||||
|
||||
class Config:
|
||||
env_file = ".env"
|
||||
case_sensitive = True
|
||||
|
||||
|
||||
settings = Settings()
|
||||
61
services/demo_session/app/core/database.py
Normal file
61
services/demo_session/app/core/database.py
Normal file
@@ -0,0 +1,61 @@
|
||||
"""
|
||||
Database connection management for Demo Session Service
|
||||
"""
|
||||
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy.pool import NullPool
|
||||
import structlog
|
||||
|
||||
from .config import settings
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class DatabaseManager:
|
||||
"""Database connection manager"""
|
||||
|
||||
def __init__(self, database_url: str = None):
|
||||
self.database_url = database_url or settings.DATABASE_URL
|
||||
self.engine = None
|
||||
self.session_factory = None
|
||||
|
||||
def initialize(self):
|
||||
"""Initialize database engine and session factory"""
|
||||
self.engine = create_async_engine(
|
||||
self.database_url,
|
||||
echo=settings.DEBUG,
|
||||
poolclass=NullPool,
|
||||
pool_pre_ping=True
|
||||
)
|
||||
|
||||
self.session_factory = async_sessionmaker(
|
||||
self.engine,
|
||||
class_=AsyncSession,
|
||||
expire_on_commit=False,
|
||||
autocommit=False,
|
||||
autoflush=False
|
||||
)
|
||||
|
||||
logger.info("Database manager initialized", database_url=self.database_url.split("@")[-1])
|
||||
|
||||
async def close(self):
|
||||
"""Close database connections"""
|
||||
if self.engine:
|
||||
await self.engine.dispose()
|
||||
logger.info("Database connections closed")
|
||||
|
||||
async def get_session(self) -> AsyncSession:
|
||||
"""Get database session"""
|
||||
if not self.session_factory:
|
||||
self.initialize()
|
||||
async with self.session_factory() as session:
|
||||
yield session
|
||||
|
||||
|
||||
db_manager = DatabaseManager()
|
||||
|
||||
|
||||
async def get_db() -> AsyncSession:
|
||||
"""Dependency for FastAPI"""
|
||||
async for session in db_manager.get_session():
|
||||
yield session
|
||||
164
services/demo_session/app/core/redis_client.py
Normal file
164
services/demo_session/app/core/redis_client.py
Normal file
@@ -0,0 +1,164 @@
|
||||
"""
|
||||
Redis client for demo session data caching
|
||||
"""
|
||||
|
||||
import redis.asyncio as redis
|
||||
from typing import Optional, Any
|
||||
import json
|
||||
import structlog
|
||||
from datetime import timedelta
|
||||
|
||||
from .config import settings
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class RedisClient:
|
||||
"""Redis client for session data"""
|
||||
|
||||
def __init__(self, redis_url: str = None):
|
||||
self.redis_url = redis_url or settings.REDIS_URL
|
||||
self.client: Optional[redis.Redis] = None
|
||||
self.key_prefix = settings.REDIS_KEY_PREFIX
|
||||
|
||||
async def connect(self):
|
||||
"""Connect to Redis"""
|
||||
if not self.client:
|
||||
self.client = await redis.from_url(
|
||||
self.redis_url,
|
||||
encoding="utf-8",
|
||||
decode_responses=True
|
||||
)
|
||||
logger.info("Redis client connected", redis_url=self.redis_url.split("@")[-1])
|
||||
|
||||
async def close(self):
|
||||
"""Close Redis connection"""
|
||||
if self.client:
|
||||
await self.client.close()
|
||||
logger.info("Redis connection closed")
|
||||
|
||||
async def ping(self) -> bool:
|
||||
"""Check Redis connection"""
|
||||
try:
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
return await self.client.ping()
|
||||
except Exception as e:
|
||||
logger.error("Redis ping failed", error=str(e))
|
||||
return False
|
||||
|
||||
def _make_key(self, *parts: str) -> str:
|
||||
"""Create Redis key with prefix"""
|
||||
return f"{self.key_prefix}:{':'.join(parts)}"
|
||||
|
||||
async def set_session_data(self, session_id: str, key: str, data: Any, ttl: int = None):
|
||||
"""Store session data in Redis"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
redis_key = self._make_key(session_id, key)
|
||||
serialized = json.dumps(data) if not isinstance(data, str) else data
|
||||
|
||||
if ttl:
|
||||
await self.client.setex(redis_key, ttl, serialized)
|
||||
else:
|
||||
await self.client.set(redis_key, serialized)
|
||||
|
||||
logger.debug("Session data stored", session_id=session_id, key=key)
|
||||
|
||||
async def get_session_data(self, session_id: str, key: str) -> Optional[Any]:
|
||||
"""Retrieve session data from Redis"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
redis_key = self._make_key(session_id, key)
|
||||
data = await self.client.get(redis_key)
|
||||
|
||||
if data:
|
||||
try:
|
||||
return json.loads(data)
|
||||
except json.JSONDecodeError:
|
||||
return data
|
||||
|
||||
return None
|
||||
|
||||
async def delete_session_data(self, session_id: str, key: str = None):
|
||||
"""Delete session data"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
if key:
|
||||
redis_key = self._make_key(session_id, key)
|
||||
await self.client.delete(redis_key)
|
||||
else:
|
||||
pattern = self._make_key(session_id, "*")
|
||||
keys = await self.client.keys(pattern)
|
||||
if keys:
|
||||
await self.client.delete(*keys)
|
||||
|
||||
logger.debug("Session data deleted", session_id=session_id, key=key)
|
||||
|
||||
async def extend_session_ttl(self, session_id: str, ttl: int):
|
||||
"""Extend TTL for all session keys"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
pattern = self._make_key(session_id, "*")
|
||||
keys = await self.client.keys(pattern)
|
||||
|
||||
for key in keys:
|
||||
await self.client.expire(key, ttl)
|
||||
|
||||
logger.debug("Session TTL extended", session_id=session_id, ttl=ttl)
|
||||
|
||||
async def set_hash(self, session_id: str, hash_key: str, field: str, value: Any):
|
||||
"""Store hash field in Redis"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
redis_key = self._make_key(session_id, hash_key)
|
||||
serialized = json.dumps(value) if not isinstance(value, str) else value
|
||||
await self.client.hset(redis_key, field, serialized)
|
||||
|
||||
async def get_hash(self, session_id: str, hash_key: str, field: str) -> Optional[Any]:
|
||||
"""Get hash field from Redis"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
redis_key = self._make_key(session_id, hash_key)
|
||||
data = await self.client.hget(redis_key, field)
|
||||
|
||||
if data:
|
||||
try:
|
||||
return json.loads(data)
|
||||
except json.JSONDecodeError:
|
||||
return data
|
||||
|
||||
return None
|
||||
|
||||
async def get_all_hash(self, session_id: str, hash_key: str) -> dict:
|
||||
"""Get all hash fields"""
|
||||
if not self.client:
|
||||
await self.connect()
|
||||
|
||||
redis_key = self._make_key(session_id, hash_key)
|
||||
data = await self.client.hgetall(redis_key)
|
||||
|
||||
result = {}
|
||||
for field, value in data.items():
|
||||
try:
|
||||
result[field] = json.loads(value)
|
||||
except json.JSONDecodeError:
|
||||
result[field] = value
|
||||
|
||||
return result
|
||||
|
||||
|
||||
redis_client = RedisClient()
|
||||
|
||||
|
||||
async def get_redis() -> RedisClient:
|
||||
"""Dependency for FastAPI"""
|
||||
if not redis_client.client:
|
||||
await redis_client.connect()
|
||||
return redis_client
|
||||
111
services/demo_session/app/main.py
Normal file
111
services/demo_session/app/main.py
Normal file
@@ -0,0 +1,111 @@
|
||||
"""
|
||||
Demo Session Service - Main Application
|
||||
Manages isolated demo sessions with ephemeral data
|
||||
"""
|
||||
|
||||
from fastapi import FastAPI, Request
|
||||
from fastapi.middleware.cors import CORSMiddleware
|
||||
from fastapi.responses import JSONResponse
|
||||
import structlog
|
||||
from contextlib import asynccontextmanager
|
||||
|
||||
from app.core import settings, DatabaseManager, RedisClient
|
||||
from app.api import router
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
# Initialize database and redis
|
||||
db_manager = DatabaseManager()
|
||||
redis_client = RedisClient()
|
||||
|
||||
|
||||
@asynccontextmanager
|
||||
async def lifespan(app: FastAPI):
|
||||
"""Application lifespan handler"""
|
||||
logger.info("Starting Demo Session Service", version=settings.VERSION)
|
||||
|
||||
# Initialize database
|
||||
db_manager.initialize()
|
||||
|
||||
# Connect to Redis
|
||||
await redis_client.connect()
|
||||
|
||||
logger.info("Demo Session Service started successfully")
|
||||
|
||||
yield
|
||||
|
||||
# Cleanup on shutdown
|
||||
await db_manager.close()
|
||||
await redis_client.close()
|
||||
|
||||
logger.info("Demo Session Service stopped")
|
||||
|
||||
|
||||
app = FastAPI(
|
||||
title="Demo Session Service",
|
||||
description="Manages isolated demo sessions for prospect users",
|
||||
version=settings.VERSION,
|
||||
lifespan=lifespan
|
||||
)
|
||||
|
||||
# CORS middleware
|
||||
app.add_middleware(
|
||||
CORSMiddleware,
|
||||
allow_origins=["*"],
|
||||
allow_credentials=True,
|
||||
allow_methods=["*"],
|
||||
allow_headers=["*"],
|
||||
)
|
||||
|
||||
|
||||
@app.exception_handler(Exception)
|
||||
async def global_exception_handler(request: Request, exc: Exception):
|
||||
"""Global exception handler"""
|
||||
logger.error(
|
||||
"Unhandled exception",
|
||||
path=request.url.path,
|
||||
method=request.method,
|
||||
error=str(exc)
|
||||
)
|
||||
return JSONResponse(
|
||||
status_code=500,
|
||||
content={"detail": "Internal server error"}
|
||||
)
|
||||
|
||||
|
||||
# Include routers
|
||||
app.include_router(router)
|
||||
|
||||
|
||||
@app.get("/")
|
||||
async def root():
|
||||
"""Root endpoint"""
|
||||
return {
|
||||
"service": "demo-session",
|
||||
"version": settings.VERSION,
|
||||
"status": "running"
|
||||
}
|
||||
|
||||
|
||||
@app.get("/health")
|
||||
async def health():
|
||||
"""Health check endpoint"""
|
||||
redis_ok = await redis_client.ping()
|
||||
|
||||
return {
|
||||
"status": "healthy" if redis_ok else "degraded",
|
||||
"service": "demo-session",
|
||||
"version": settings.VERSION,
|
||||
"redis": "connected" if redis_ok else "disconnected"
|
||||
}
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
import uvicorn
|
||||
uvicorn.run(
|
||||
"app.main:app",
|
||||
host="0.0.0.0",
|
||||
port=8000,
|
||||
reload=settings.DEBUG,
|
||||
log_level=settings.LOG_LEVEL.lower()
|
||||
)
|
||||
5
services/demo_session/app/models/__init__.py
Normal file
5
services/demo_session/app/models/__init__.py
Normal file
@@ -0,0 +1,5 @@
|
||||
"""Demo Session Service Models"""
|
||||
|
||||
from .demo_session import DemoSession, DemoSessionStatus
|
||||
|
||||
__all__ = ["DemoSession", "DemoSessionStatus"]
|
||||
71
services/demo_session/app/models/demo_session.py
Normal file
71
services/demo_session/app/models/demo_session.py
Normal file
@@ -0,0 +1,71 @@
|
||||
"""
|
||||
Demo Session Models
|
||||
Tracks ephemeral demo sessions for prospect users
|
||||
"""
|
||||
|
||||
from sqlalchemy import Column, String, Boolean, DateTime, Integer, Enum as SQLEnum
|
||||
from sqlalchemy.dialects.postgresql import UUID, JSONB
|
||||
from datetime import datetime, timezone
|
||||
import uuid
|
||||
import enum
|
||||
|
||||
from shared.database.base import Base
|
||||
|
||||
|
||||
class DemoSessionStatus(enum.Enum):
|
||||
"""Demo session status"""
|
||||
ACTIVE = "active"
|
||||
EXPIRED = "expired"
|
||||
DESTROYED = "destroyed"
|
||||
|
||||
|
||||
class DemoSession(Base):
|
||||
"""Demo Session tracking model"""
|
||||
__tablename__ = "demo_sessions"
|
||||
|
||||
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
session_id = Column(String(100), unique=True, nullable=False, index=True)
|
||||
|
||||
# Session ownership
|
||||
user_id = Column(UUID(as_uuid=True), nullable=True)
|
||||
ip_address = Column(String(45), nullable=True)
|
||||
user_agent = Column(String(500), nullable=True)
|
||||
|
||||
# Demo tenant linking
|
||||
base_demo_tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
|
||||
virtual_tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
|
||||
demo_account_type = Column(String(50), nullable=False) # 'individual_bakery', 'central_baker'
|
||||
|
||||
# Session lifecycle
|
||||
status = Column(SQLEnum(DemoSessionStatus, values_callable=lambda obj: [e.value for e in obj]), default=DemoSessionStatus.ACTIVE, index=True)
|
||||
created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc), index=True)
|
||||
expires_at = Column(DateTime(timezone=True), nullable=False, index=True)
|
||||
last_activity_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
|
||||
destroyed_at = Column(DateTime(timezone=True), nullable=True)
|
||||
|
||||
# Session metrics
|
||||
request_count = Column(Integer, default=0)
|
||||
data_cloned = Column(Boolean, default=False)
|
||||
redis_populated = Column(Boolean, default=False)
|
||||
|
||||
# Session metadata
|
||||
session_metadata = Column(JSONB, default=dict)
|
||||
|
||||
def __repr__(self):
|
||||
return f"<DemoSession(session_id={self.session_id}, status={self.status.value})>"
|
||||
|
||||
def to_dict(self):
|
||||
"""Convert to dictionary"""
|
||||
return {
|
||||
"id": str(self.id),
|
||||
"session_id": self.session_id,
|
||||
"virtual_tenant_id": str(self.virtual_tenant_id),
|
||||
"base_demo_tenant_id": str(self.base_demo_tenant_id),
|
||||
"demo_account_type": self.demo_account_type,
|
||||
"status": self.status.value,
|
||||
"created_at": self.created_at.isoformat() if self.created_at else None,
|
||||
"expires_at": self.expires_at.isoformat() if self.expires_at else None,
|
||||
"last_activity_at": self.last_activity_at.isoformat() if self.last_activity_at else None,
|
||||
"request_count": self.request_count,
|
||||
"metadata": self.session_metadata
|
||||
}
|
||||
7
services/demo_session/app/services/__init__.py
Normal file
7
services/demo_session/app/services/__init__.py
Normal file
@@ -0,0 +1,7 @@
|
||||
"""Demo Session Services"""
|
||||
|
||||
from .session_manager import DemoSessionManager
|
||||
from .data_cloner import DemoDataCloner
|
||||
from .cleanup_service import DemoCleanupService
|
||||
|
||||
__all__ = ["DemoSessionManager", "DemoDataCloner", "DemoCleanupService"]
|
||||
147
services/demo_session/app/services/cleanup_service.py
Normal file
147
services/demo_session/app/services/cleanup_service.py
Normal file
@@ -0,0 +1,147 @@
|
||||
"""
|
||||
Demo Cleanup Service
|
||||
Handles automatic cleanup of expired sessions
|
||||
"""
|
||||
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select, update
|
||||
from datetime import datetime, timezone
|
||||
from typing import List
|
||||
import structlog
|
||||
|
||||
from app.models import DemoSession, DemoSessionStatus
|
||||
from app.services.data_cloner import DemoDataCloner
|
||||
from app.core import RedisClient
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class DemoCleanupService:
|
||||
"""Handles cleanup of expired demo sessions"""
|
||||
|
||||
def __init__(self, db: AsyncSession, redis: RedisClient):
|
||||
self.db = db
|
||||
self.redis = redis
|
||||
self.data_cloner = DemoDataCloner(db, redis)
|
||||
|
||||
async def cleanup_expired_sessions(self) -> dict:
|
||||
"""
|
||||
Find and cleanup all expired sessions
|
||||
|
||||
Returns:
|
||||
Cleanup statistics
|
||||
"""
|
||||
logger.info("Starting demo session cleanup")
|
||||
|
||||
now = datetime.now(timezone.utc)
|
||||
|
||||
# Find expired sessions
|
||||
result = await self.db.execute(
|
||||
select(DemoSession).where(
|
||||
DemoSession.status == DemoSessionStatus.ACTIVE,
|
||||
DemoSession.expires_at < now
|
||||
)
|
||||
)
|
||||
expired_sessions = result.scalars().all()
|
||||
|
||||
stats = {
|
||||
"total_expired": len(expired_sessions),
|
||||
"cleaned_up": 0,
|
||||
"failed": 0,
|
||||
"errors": []
|
||||
}
|
||||
|
||||
for session in expired_sessions:
|
||||
try:
|
||||
# Mark as expired
|
||||
session.status = DemoSessionStatus.EXPIRED
|
||||
await self.db.commit()
|
||||
|
||||
# Delete session data
|
||||
await self.data_cloner.delete_session_data(
|
||||
str(session.virtual_tenant_id),
|
||||
session.session_id
|
||||
)
|
||||
|
||||
stats["cleaned_up"] += 1
|
||||
|
||||
logger.info(
|
||||
"Session cleaned up",
|
||||
session_id=session.session_id,
|
||||
age_minutes=(now - session.created_at).total_seconds() / 60
|
||||
)
|
||||
|
||||
except Exception as e:
|
||||
stats["failed"] += 1
|
||||
stats["errors"].append({
|
||||
"session_id": session.session_id,
|
||||
"error": str(e)
|
||||
})
|
||||
logger.error(
|
||||
"Failed to cleanup session",
|
||||
session_id=session.session_id,
|
||||
error=str(e)
|
||||
)
|
||||
|
||||
logger.info("Demo session cleanup completed", stats=stats)
|
||||
return stats
|
||||
|
||||
async def cleanup_old_destroyed_sessions(self, days: int = 7) -> int:
|
||||
"""
|
||||
Delete destroyed session records older than specified days
|
||||
|
||||
Args:
|
||||
days: Number of days to keep destroyed sessions
|
||||
|
||||
Returns:
|
||||
Number of deleted records
|
||||
"""
|
||||
from datetime import timedelta
|
||||
|
||||
cutoff_date = datetime.now(timezone.utc) - timedelta(days=days)
|
||||
|
||||
result = await self.db.execute(
|
||||
select(DemoSession).where(
|
||||
DemoSession.status == DemoSessionStatus.DESTROYED,
|
||||
DemoSession.destroyed_at < cutoff_date
|
||||
)
|
||||
)
|
||||
old_sessions = result.scalars().all()
|
||||
|
||||
for session in old_sessions:
|
||||
await self.db.delete(session)
|
||||
|
||||
await self.db.commit()
|
||||
|
||||
logger.info(
|
||||
"Old destroyed sessions deleted",
|
||||
count=len(old_sessions),
|
||||
older_than_days=days
|
||||
)
|
||||
|
||||
return len(old_sessions)
|
||||
|
||||
async def get_cleanup_stats(self) -> dict:
|
||||
"""Get cleanup statistics"""
|
||||
result = await self.db.execute(select(DemoSession))
|
||||
all_sessions = result.scalars().all()
|
||||
|
||||
now = datetime.now(timezone.utc)
|
||||
|
||||
active_count = len([s for s in all_sessions if s.status == DemoSessionStatus.ACTIVE])
|
||||
expired_count = len([s for s in all_sessions if s.status == DemoSessionStatus.EXPIRED])
|
||||
destroyed_count = len([s for s in all_sessions if s.status == DemoSessionStatus.DESTROYED])
|
||||
|
||||
# Find sessions that should be expired but aren't marked yet
|
||||
should_be_expired = len([
|
||||
s for s in all_sessions
|
||||
if s.status == DemoSessionStatus.ACTIVE and s.expires_at < now
|
||||
])
|
||||
|
||||
return {
|
||||
"total_sessions": len(all_sessions),
|
||||
"active_sessions": active_count,
|
||||
"expired_sessions": expired_count,
|
||||
"destroyed_sessions": destroyed_count,
|
||||
"pending_cleanup": should_be_expired
|
||||
}
|
||||
288
services/demo_session/app/services/data_cloner.py
Normal file
288
services/demo_session/app/services/data_cloner.py
Normal file
@@ -0,0 +1,288 @@
|
||||
"""
|
||||
Demo Data Cloner
|
||||
Clones base demo data to session-specific virtual tenants
|
||||
"""
|
||||
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from typing import Dict, Any, List
|
||||
import httpx
|
||||
import structlog
|
||||
import uuid
|
||||
|
||||
from app.core import RedisClient, settings
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class DemoDataCloner:
|
||||
"""Clones demo data for isolated sessions"""
|
||||
|
||||
def __init__(self, db: AsyncSession, redis: RedisClient):
|
||||
self.db = db
|
||||
self.redis = redis
|
||||
|
||||
async def clone_tenant_data(
|
||||
self,
|
||||
session_id: str,
|
||||
base_demo_tenant_id: str,
|
||||
virtual_tenant_id: str,
|
||||
demo_account_type: str
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Clone all demo data from base tenant to virtual tenant
|
||||
|
||||
Args:
|
||||
session_id: Session ID
|
||||
base_demo_tenant_id: Base demo tenant UUID
|
||||
virtual_tenant_id: Virtual tenant UUID for this session
|
||||
demo_account_type: Type of demo account
|
||||
|
||||
Returns:
|
||||
Cloning statistics
|
||||
"""
|
||||
logger.info(
|
||||
"Starting data cloning",
|
||||
session_id=session_id,
|
||||
base_demo_tenant_id=base_demo_tenant_id,
|
||||
virtual_tenant_id=virtual_tenant_id
|
||||
)
|
||||
|
||||
stats = {
|
||||
"session_id": session_id,
|
||||
"services_cloned": [],
|
||||
"total_records": 0,
|
||||
"redis_keys": 0
|
||||
}
|
||||
|
||||
# Clone data from each service based on demo account type
|
||||
services_to_clone = self._get_services_for_demo_type(demo_account_type)
|
||||
|
||||
for service_name in services_to_clone:
|
||||
try:
|
||||
service_stats = await self._clone_service_data(
|
||||
service_name,
|
||||
base_demo_tenant_id,
|
||||
virtual_tenant_id,
|
||||
session_id
|
||||
)
|
||||
stats["services_cloned"].append(service_name)
|
||||
stats["total_records"] += service_stats.get("records_cloned", 0)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(
|
||||
"Failed to clone service data",
|
||||
service=service_name,
|
||||
error=str(e)
|
||||
)
|
||||
|
||||
# Populate Redis cache with hot data
|
||||
redis_stats = await self._populate_redis_cache(
|
||||
session_id,
|
||||
virtual_tenant_id,
|
||||
demo_account_type
|
||||
)
|
||||
stats["redis_keys"] = redis_stats.get("keys_created", 0)
|
||||
|
||||
logger.info(
|
||||
"Data cloning completed",
|
||||
session_id=session_id,
|
||||
stats=stats
|
||||
)
|
||||
|
||||
return stats
|
||||
|
||||
def _get_services_for_demo_type(self, demo_account_type: str) -> List[str]:
|
||||
"""Get list of services to clone based on demo type"""
|
||||
base_services = ["inventory", "sales", "orders", "pos"]
|
||||
|
||||
if demo_account_type == "individual_bakery":
|
||||
# Individual bakery has production, recipes
|
||||
return base_services + ["recipes", "production"]
|
||||
elif demo_account_type == "central_baker":
|
||||
# Central baker satellite has suppliers
|
||||
return base_services + ["suppliers"]
|
||||
else:
|
||||
return base_services
|
||||
|
||||
async def _clone_service_data(
|
||||
self,
|
||||
service_name: str,
|
||||
base_tenant_id: str,
|
||||
virtual_tenant_id: str,
|
||||
session_id: str
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Clone data for a specific service
|
||||
|
||||
Args:
|
||||
service_name: Name of the service
|
||||
base_tenant_id: Source tenant ID
|
||||
virtual_tenant_id: Target tenant ID
|
||||
session_id: Session ID
|
||||
|
||||
Returns:
|
||||
Cloning statistics
|
||||
"""
|
||||
service_url = self._get_service_url(service_name)
|
||||
|
||||
async with httpx.AsyncClient(timeout=30.0) as client:
|
||||
response = await client.post(
|
||||
f"{service_url}/internal/demo/clone",
|
||||
json={
|
||||
"base_tenant_id": base_tenant_id,
|
||||
"virtual_tenant_id": virtual_tenant_id,
|
||||
"session_id": session_id
|
||||
},
|
||||
headers={"X-Internal-Service": "demo-session"}
|
||||
)
|
||||
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
async def _populate_redis_cache(
|
||||
self,
|
||||
session_id: str,
|
||||
virtual_tenant_id: str,
|
||||
demo_account_type: str
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Populate Redis with frequently accessed data
|
||||
|
||||
Args:
|
||||
session_id: Session ID
|
||||
virtual_tenant_id: Virtual tenant ID
|
||||
demo_account_type: Demo account type
|
||||
|
||||
Returns:
|
||||
Statistics about cached data
|
||||
"""
|
||||
logger.info("Populating Redis cache", session_id=session_id)
|
||||
|
||||
keys_created = 0
|
||||
|
||||
# Cache inventory data (hot data)
|
||||
try:
|
||||
inventory_data = await self._fetch_inventory_data(virtual_tenant_id)
|
||||
await self.redis.set_session_data(
|
||||
session_id,
|
||||
"inventory",
|
||||
inventory_data,
|
||||
ttl=settings.REDIS_SESSION_TTL
|
||||
)
|
||||
keys_created += 1
|
||||
except Exception as e:
|
||||
logger.error("Failed to cache inventory", error=str(e))
|
||||
|
||||
# Cache POS data
|
||||
try:
|
||||
pos_data = await self._fetch_pos_data(virtual_tenant_id)
|
||||
await self.redis.set_session_data(
|
||||
session_id,
|
||||
"pos",
|
||||
pos_data,
|
||||
ttl=settings.REDIS_SESSION_TTL
|
||||
)
|
||||
keys_created += 1
|
||||
except Exception as e:
|
||||
logger.error("Failed to cache POS data", error=str(e))
|
||||
|
||||
# Cache recent sales
|
||||
try:
|
||||
sales_data = await self._fetch_recent_sales(virtual_tenant_id)
|
||||
await self.redis.set_session_data(
|
||||
session_id,
|
||||
"recent_sales",
|
||||
sales_data,
|
||||
ttl=settings.REDIS_SESSION_TTL
|
||||
)
|
||||
keys_created += 1
|
||||
except Exception as e:
|
||||
logger.error("Failed to cache sales", error=str(e))
|
||||
|
||||
return {"keys_created": keys_created}
|
||||
|
||||
async def _fetch_inventory_data(self, tenant_id: str) -> Dict[str, Any]:
|
||||
"""Fetch inventory data for caching"""
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{settings.INVENTORY_SERVICE_URL}/api/inventory/summary",
|
||||
headers={"X-Tenant-Id": tenant_id}
|
||||
)
|
||||
return response.json()
|
||||
|
||||
async def _fetch_pos_data(self, tenant_id: str) -> Dict[str, Any]:
|
||||
"""Fetch POS data for caching"""
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{settings.POS_SERVICE_URL}/api/pos/current-session",
|
||||
headers={"X-Tenant-Id": tenant_id}
|
||||
)
|
||||
return response.json()
|
||||
|
||||
async def _fetch_recent_sales(self, tenant_id: str) -> Dict[str, Any]:
|
||||
"""Fetch recent sales for caching"""
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{settings.SALES_SERVICE_URL}/api/sales/recent?limit=50",
|
||||
headers={"X-Tenant-Id": tenant_id}
|
||||
)
|
||||
return response.json()
|
||||
|
||||
def _get_service_url(self, service_name: str) -> str:
|
||||
"""Get service URL from settings"""
|
||||
url_map = {
|
||||
"inventory": settings.INVENTORY_SERVICE_URL,
|
||||
"recipes": settings.RECIPES_SERVICE_URL,
|
||||
"sales": settings.SALES_SERVICE_URL,
|
||||
"orders": settings.ORDERS_SERVICE_URL,
|
||||
"production": settings.PRODUCTION_SERVICE_URL,
|
||||
"suppliers": settings.SUPPLIERS_SERVICE_URL,
|
||||
"pos": settings.SALES_SERVICE_URL,
|
||||
}
|
||||
return url_map.get(service_name, "")
|
||||
|
||||
async def delete_session_data(
|
||||
self,
|
||||
virtual_tenant_id: str,
|
||||
session_id: str
|
||||
):
|
||||
"""
|
||||
Delete all data for a session
|
||||
|
||||
Args:
|
||||
virtual_tenant_id: Virtual tenant ID to delete
|
||||
session_id: Session ID
|
||||
"""
|
||||
logger.info(
|
||||
"Deleting session data",
|
||||
virtual_tenant_id=virtual_tenant_id,
|
||||
session_id=session_id
|
||||
)
|
||||
|
||||
# Delete from each service
|
||||
services = ["inventory", "recipes", "sales", "orders", "production", "suppliers", "pos"]
|
||||
|
||||
for service_name in services:
|
||||
try:
|
||||
await self._delete_service_data(service_name, virtual_tenant_id)
|
||||
except Exception as e:
|
||||
logger.error(
|
||||
"Failed to delete service data",
|
||||
service=service_name,
|
||||
error=str(e)
|
||||
)
|
||||
|
||||
# Delete from Redis
|
||||
await self.redis.delete_session_data(session_id)
|
||||
|
||||
logger.info("Session data deleted", virtual_tenant_id=virtual_tenant_id)
|
||||
|
||||
async def _delete_service_data(self, service_name: str, virtual_tenant_id: str):
|
||||
"""Delete data from a specific service"""
|
||||
service_url = self._get_service_url(service_name)
|
||||
|
||||
async with httpx.AsyncClient(timeout=30.0) as client:
|
||||
await client.delete(
|
||||
f"{service_url}/internal/demo/tenant/{virtual_tenant_id}",
|
||||
headers={"X-Internal-Service": "demo-session"}
|
||||
)
|
||||
166
services/demo_session/app/services/k8s_job_cloner.py
Normal file
166
services/demo_session/app/services/k8s_job_cloner.py
Normal file
@@ -0,0 +1,166 @@
|
||||
"""
|
||||
Kubernetes Job-based Demo Data Cloner
|
||||
Triggers a K8s Job to clone demo data at the database level
|
||||
"""
|
||||
|
||||
import httpx
|
||||
import structlog
|
||||
from typing import Dict, Any
|
||||
import os
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class K8sJobCloner:
|
||||
"""Triggers Kubernetes Jobs to clone demo data"""
|
||||
|
||||
def __init__(self):
|
||||
self.k8s_api_url = os.getenv("KUBERNETES_SERVICE_HOST")
|
||||
self.namespace = os.getenv("POD_NAMESPACE", "bakery-ia")
|
||||
self.clone_job_image = os.getenv("CLONE_JOB_IMAGE", "bakery/inventory-service:latest")
|
||||
# Service account token for K8s API access
|
||||
with open("/var/run/secrets/kubernetes.io/serviceaccount/token", "r") as f:
|
||||
self.token = f.read()
|
||||
|
||||
async def clone_tenant_data(
|
||||
self,
|
||||
session_id: str,
|
||||
base_demo_tenant_id: str,
|
||||
virtual_tenant_id: str,
|
||||
demo_account_type: str
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Clone demo data by creating a Kubernetes Job
|
||||
|
||||
Args:
|
||||
session_id: Session ID
|
||||
base_demo_tenant_id: Base demo tenant UUID (not used in job approach)
|
||||
virtual_tenant_id: Virtual tenant UUID for this session
|
||||
demo_account_type: Type of demo account
|
||||
|
||||
Returns:
|
||||
Job creation status
|
||||
"""
|
||||
logger.info(
|
||||
"Triggering demo data cloning job",
|
||||
session_id=session_id,
|
||||
virtual_tenant_id=virtual_tenant_id,
|
||||
demo_account_type=demo_account_type,
|
||||
clone_image=self.clone_job_image
|
||||
)
|
||||
|
||||
job_name = f"demo-clone-{virtual_tenant_id[:8]}"
|
||||
|
||||
# Create Job manifest
|
||||
job_manifest = {
|
||||
"apiVersion": "batch/v1",
|
||||
"kind": "Job",
|
||||
"metadata": {
|
||||
"name": job_name,
|
||||
"namespace": self.namespace,
|
||||
"labels": {
|
||||
"app": "demo-clone",
|
||||
"session-id": session_id,
|
||||
"component": "runtime"
|
||||
}
|
||||
},
|
||||
"spec": {
|
||||
"ttlSecondsAfterFinished": 3600,
|
||||
"backoffLimit": 2,
|
||||
"template": {
|
||||
"metadata": {
|
||||
"labels": {"app": "demo-clone"}
|
||||
},
|
||||
"spec": {
|
||||
"restartPolicy": "Never",
|
||||
"containers": [{
|
||||
"name": "clone-data",
|
||||
"image": self.clone_job_image, # Configured via environment variable
|
||||
"imagePullPolicy": "IfNotPresent", # Don't pull if image exists locally
|
||||
"command": ["python", "/app/scripts/demo/clone_demo_tenant.py"],
|
||||
"env": [
|
||||
{"name": "VIRTUAL_TENANT_ID", "value": virtual_tenant_id},
|
||||
{"name": "DEMO_ACCOUNT_TYPE", "value": demo_account_type},
|
||||
{
|
||||
"name": "INVENTORY_DATABASE_URL",
|
||||
"valueFrom": {
|
||||
"secretKeyRef": {
|
||||
"name": "database-secrets",
|
||||
"key": "INVENTORY_DATABASE_URL"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "SALES_DATABASE_URL",
|
||||
"valueFrom": {
|
||||
"secretKeyRef": {
|
||||
"name": "database-secrets",
|
||||
"key": "SALES_DATABASE_URL"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "ORDERS_DATABASE_URL",
|
||||
"valueFrom": {
|
||||
"secretKeyRef": {
|
||||
"name": "database-secrets",
|
||||
"key": "ORDERS_DATABASE_URL"
|
||||
}
|
||||
}
|
||||
},
|
||||
{"name": "LOG_LEVEL", "value": "INFO"}
|
||||
],
|
||||
"resources": {
|
||||
"requests": {"memory": "256Mi", "cpu": "100m"},
|
||||
"limits": {"memory": "512Mi", "cpu": "500m"}
|
||||
}
|
||||
}]
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
try:
|
||||
# Create the Job via K8s API
|
||||
async with httpx.AsyncClient(verify=False, timeout=30.0) as client:
|
||||
response = await client.post(
|
||||
f"https://{self.k8s_api_url}/apis/batch/v1/namespaces/{self.namespace}/jobs",
|
||||
json=job_manifest,
|
||||
headers={
|
||||
"Authorization": f"Bearer {self.token}",
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
)
|
||||
|
||||
if response.status_code == 201:
|
||||
logger.info(
|
||||
"Demo clone job created successfully",
|
||||
job_name=job_name,
|
||||
session_id=session_id
|
||||
)
|
||||
return {
|
||||
"success": True,
|
||||
"job_name": job_name,
|
||||
"method": "kubernetes_job"
|
||||
}
|
||||
else:
|
||||
logger.error(
|
||||
"Failed to create demo clone job",
|
||||
status_code=response.status_code,
|
||||
response=response.text
|
||||
)
|
||||
return {
|
||||
"success": False,
|
||||
"error": f"K8s API returned {response.status_code}"
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
logger.error(
|
||||
"Error creating demo clone job",
|
||||
error=str(e),
|
||||
exc_info=True
|
||||
)
|
||||
return {
|
||||
"success": False,
|
||||
"error": str(e)
|
||||
}
|
||||
267
services/demo_session/app/services/session_manager.py
Normal file
267
services/demo_session/app/services/session_manager.py
Normal file
@@ -0,0 +1,267 @@
|
||||
"""
|
||||
Demo Session Manager
|
||||
Handles creation, extension, and destruction of demo sessions
|
||||
"""
|
||||
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select, update
|
||||
from datetime import datetime, timedelta, timezone
|
||||
from typing import Optional, Dict, Any
|
||||
import uuid
|
||||
import secrets
|
||||
import structlog
|
||||
|
||||
from app.models import DemoSession, DemoSessionStatus
|
||||
from app.core import RedisClient, settings
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class DemoSessionManager:
|
||||
"""Manages demo session lifecycle"""
|
||||
|
||||
def __init__(self, db: AsyncSession, redis: RedisClient):
|
||||
self.db = db
|
||||
self.redis = redis
|
||||
|
||||
async def create_session(
|
||||
self,
|
||||
demo_account_type: str,
|
||||
user_id: Optional[str] = None,
|
||||
ip_address: Optional[str] = None,
|
||||
user_agent: Optional[str] = None
|
||||
) -> DemoSession:
|
||||
"""
|
||||
Create a new demo session
|
||||
|
||||
Args:
|
||||
demo_account_type: 'individual_bakery' or 'central_baker'
|
||||
user_id: Optional user ID if authenticated
|
||||
ip_address: Client IP address
|
||||
user_agent: Client user agent
|
||||
|
||||
Returns:
|
||||
Created demo session
|
||||
"""
|
||||
logger.info("Creating demo session", demo_account_type=demo_account_type)
|
||||
|
||||
# Generate unique session ID
|
||||
session_id = f"demo_{secrets.token_urlsafe(16)}"
|
||||
|
||||
# Generate virtual tenant ID
|
||||
virtual_tenant_id = uuid.uuid4()
|
||||
|
||||
# Get base demo tenant ID from config
|
||||
demo_config = settings.DEMO_ACCOUNTS.get(demo_account_type)
|
||||
if not demo_config:
|
||||
raise ValueError(f"Invalid demo account type: {demo_account_type}")
|
||||
|
||||
# Create session record
|
||||
session = DemoSession(
|
||||
session_id=session_id,
|
||||
user_id=uuid.UUID(user_id) if user_id else None,
|
||||
ip_address=ip_address,
|
||||
user_agent=user_agent,
|
||||
base_demo_tenant_id=uuid.uuid4(), # Will be set by seeding script
|
||||
virtual_tenant_id=virtual_tenant_id,
|
||||
demo_account_type=demo_account_type,
|
||||
status=DemoSessionStatus.ACTIVE,
|
||||
created_at=datetime.now(timezone.utc),
|
||||
expires_at=datetime.now(timezone.utc) + timedelta(
|
||||
minutes=settings.DEMO_SESSION_DURATION_MINUTES
|
||||
),
|
||||
last_activity_at=datetime.now(timezone.utc),
|
||||
data_cloned=False,
|
||||
redis_populated=False,
|
||||
metadata={
|
||||
"demo_config": demo_config,
|
||||
"extension_count": 0
|
||||
}
|
||||
)
|
||||
|
||||
self.db.add(session)
|
||||
await self.db.commit()
|
||||
await self.db.refresh(session)
|
||||
|
||||
# Store session metadata in Redis
|
||||
await self._store_session_metadata(session)
|
||||
|
||||
logger.info(
|
||||
"Demo session created",
|
||||
session_id=session_id,
|
||||
virtual_tenant_id=str(virtual_tenant_id),
|
||||
expires_at=session.expires_at.isoformat()
|
||||
)
|
||||
|
||||
return session
|
||||
|
||||
async def get_session(self, session_id: str) -> Optional[DemoSession]:
|
||||
"""Get session by session_id"""
|
||||
result = await self.db.execute(
|
||||
select(DemoSession).where(DemoSession.session_id == session_id)
|
||||
)
|
||||
return result.scalar_one_or_none()
|
||||
|
||||
async def get_session_by_virtual_tenant(self, virtual_tenant_id: str) -> Optional[DemoSession]:
|
||||
"""Get session by virtual tenant ID"""
|
||||
result = await self.db.execute(
|
||||
select(DemoSession).where(
|
||||
DemoSession.virtual_tenant_id == uuid.UUID(virtual_tenant_id)
|
||||
)
|
||||
)
|
||||
return result.scalar_one_or_none()
|
||||
|
||||
async def extend_session(self, session_id: str) -> DemoSession:
|
||||
"""
|
||||
Extend session expiration time
|
||||
|
||||
Args:
|
||||
session_id: Session ID to extend
|
||||
|
||||
Returns:
|
||||
Updated session
|
||||
|
||||
Raises:
|
||||
ValueError: If session cannot be extended
|
||||
"""
|
||||
session = await self.get_session(session_id)
|
||||
|
||||
if not session:
|
||||
raise ValueError(f"Session not found: {session_id}")
|
||||
|
||||
if session.status != DemoSessionStatus.ACTIVE:
|
||||
raise ValueError(f"Cannot extend {session.status.value} session")
|
||||
|
||||
# Check extension limit
|
||||
extension_count = session.metadata.get("extension_count", 0)
|
||||
if extension_count >= settings.DEMO_SESSION_MAX_EXTENSIONS:
|
||||
raise ValueError(f"Maximum extensions ({settings.DEMO_SESSION_MAX_EXTENSIONS}) reached")
|
||||
|
||||
# Extend expiration
|
||||
new_expires_at = datetime.now(timezone.utc) + timedelta(
|
||||
minutes=settings.DEMO_SESSION_DURATION_MINUTES
|
||||
)
|
||||
|
||||
session.expires_at = new_expires_at
|
||||
session.last_activity_at = datetime.now(timezone.utc)
|
||||
session.metadata["extension_count"] = extension_count + 1
|
||||
|
||||
await self.db.commit()
|
||||
await self.db.refresh(session)
|
||||
|
||||
# Extend Redis TTL
|
||||
await self.redis.extend_session_ttl(
|
||||
session_id,
|
||||
settings.REDIS_SESSION_TTL
|
||||
)
|
||||
|
||||
logger.info(
|
||||
"Session extended",
|
||||
session_id=session_id,
|
||||
new_expires_at=new_expires_at.isoformat(),
|
||||
extension_count=extension_count + 1
|
||||
)
|
||||
|
||||
return session
|
||||
|
||||
async def update_activity(self, session_id: str):
|
||||
"""Update last activity timestamp"""
|
||||
await self.db.execute(
|
||||
update(DemoSession)
|
||||
.where(DemoSession.session_id == session_id)
|
||||
.values(
|
||||
last_activity_at=datetime.now(timezone.utc),
|
||||
request_count=DemoSession.request_count + 1
|
||||
)
|
||||
)
|
||||
await self.db.commit()
|
||||
|
||||
async def mark_data_cloned(self, session_id: str):
|
||||
"""Mark session as having data cloned"""
|
||||
await self.db.execute(
|
||||
update(DemoSession)
|
||||
.where(DemoSession.session_id == session_id)
|
||||
.values(data_cloned=True)
|
||||
)
|
||||
await self.db.commit()
|
||||
|
||||
async def mark_redis_populated(self, session_id: str):
|
||||
"""Mark session as having Redis data populated"""
|
||||
await self.db.execute(
|
||||
update(DemoSession)
|
||||
.where(DemoSession.session_id == session_id)
|
||||
.values(redis_populated=True)
|
||||
)
|
||||
await self.db.commit()
|
||||
|
||||
async def destroy_session(self, session_id: str):
|
||||
"""
|
||||
Destroy a demo session and cleanup resources
|
||||
|
||||
Args:
|
||||
session_id: Session ID to destroy
|
||||
"""
|
||||
session = await self.get_session(session_id)
|
||||
|
||||
if not session:
|
||||
logger.warning("Session not found for destruction", session_id=session_id)
|
||||
return
|
||||
|
||||
# Update session status
|
||||
session.status = DemoSessionStatus.DESTROYED
|
||||
session.destroyed_at = datetime.now(timezone.utc)
|
||||
|
||||
await self.db.commit()
|
||||
|
||||
# Delete Redis data
|
||||
await self.redis.delete_session_data(session_id)
|
||||
|
||||
logger.info(
|
||||
"Session destroyed",
|
||||
session_id=session_id,
|
||||
virtual_tenant_id=str(session.virtual_tenant_id),
|
||||
duration_seconds=(
|
||||
session.destroyed_at - session.created_at
|
||||
).total_seconds()
|
||||
)
|
||||
|
||||
async def _store_session_metadata(self, session: DemoSession):
|
||||
"""Store session metadata in Redis"""
|
||||
await self.redis.set_session_data(
|
||||
session.session_id,
|
||||
"metadata",
|
||||
{
|
||||
"session_id": session.session_id,
|
||||
"virtual_tenant_id": str(session.virtual_tenant_id),
|
||||
"demo_account_type": session.demo_account_type,
|
||||
"expires_at": session.expires_at.isoformat(),
|
||||
"created_at": session.created_at.isoformat()
|
||||
},
|
||||
ttl=settings.REDIS_SESSION_TTL
|
||||
)
|
||||
|
||||
async def get_active_sessions_count(self) -> int:
|
||||
"""Get count of active sessions"""
|
||||
result = await self.db.execute(
|
||||
select(DemoSession).where(DemoSession.status == DemoSessionStatus.ACTIVE)
|
||||
)
|
||||
return len(result.scalars().all())
|
||||
|
||||
async def get_session_stats(self) -> Dict[str, Any]:
|
||||
"""Get session statistics"""
|
||||
result = await self.db.execute(select(DemoSession))
|
||||
all_sessions = result.scalars().all()
|
||||
|
||||
active_sessions = [s for s in all_sessions if s.status == DemoSessionStatus.ACTIVE]
|
||||
|
||||
return {
|
||||
"total_sessions": len(all_sessions),
|
||||
"active_sessions": len(active_sessions),
|
||||
"expired_sessions": len([s for s in all_sessions if s.status == DemoSessionStatus.EXPIRED]),
|
||||
"destroyed_sessions": len([s for s in all_sessions if s.status == DemoSessionStatus.DESTROYED]),
|
||||
"avg_duration_minutes": sum(
|
||||
(s.destroyed_at - s.created_at).total_seconds() / 60
|
||||
for s in all_sessions if s.destroyed_at
|
||||
) / max(len([s for s in all_sessions if s.destroyed_at]), 1),
|
||||
"total_requests": sum(s.request_count for s in all_sessions)
|
||||
}
|
||||
77
services/demo_session/migrations/env.py
Normal file
77
services/demo_session/migrations/env.py
Normal file
@@ -0,0 +1,77 @@
|
||||
"""Alembic environment for demo_session service"""
|
||||
|
||||
from logging.config import fileConfig
|
||||
from sqlalchemy import engine_from_config, pool
|
||||
from alembic import context
|
||||
import os
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
# Add service root to path for container environment
|
||||
service_root = Path(__file__).parent.parent
|
||||
sys.path.insert(0, str(service_root))
|
||||
|
||||
# Also add project root for local development
|
||||
project_root = Path(__file__).parent.parent.parent.parent
|
||||
sys.path.insert(0, str(project_root))
|
||||
|
||||
# Import models - try container path first, then dev path
|
||||
try:
|
||||
from app.models import *
|
||||
from shared.database.base import Base
|
||||
except ImportError:
|
||||
from services.demo_session.app.models import *
|
||||
from shared.database.base import Base
|
||||
|
||||
# this is the Alembic Config object
|
||||
config = context.config
|
||||
|
||||
# Set database URL from environment
|
||||
database_url = os.getenv("DEMO_SESSION_DATABASE_URL")
|
||||
if database_url:
|
||||
# Convert asyncpg URL to psycopg2 for synchronous migrations
|
||||
database_url = database_url.replace("postgresql+asyncpg://", "postgresql://")
|
||||
config.set_main_option("sqlalchemy.url", database_url)
|
||||
|
||||
# Interpret the config file for Python logging
|
||||
if config.config_file_name is not None:
|
||||
fileConfig(config.config_file_name)
|
||||
|
||||
target_metadata = Base.metadata
|
||||
|
||||
|
||||
def run_migrations_offline() -> None:
|
||||
"""Run migrations in 'offline' mode."""
|
||||
url = config.get_main_option("sqlalchemy.url")
|
||||
context.configure(
|
||||
url=url,
|
||||
target_metadata=target_metadata,
|
||||
literal_binds=True,
|
||||
dialect_opts={"paramstyle": "named"},
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
def run_migrations_online() -> None:
|
||||
"""Run migrations in 'online' mode."""
|
||||
connectable = engine_from_config(
|
||||
config.get_section(config.config_ini_section, {}),
|
||||
prefix="sqlalchemy.",
|
||||
poolclass=pool.NullPool,
|
||||
)
|
||||
|
||||
with connectable.connect() as connection:
|
||||
context.configure(
|
||||
connection=connection, target_metadata=target_metadata
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
if context.is_offline_mode():
|
||||
run_migrations_offline()
|
||||
else:
|
||||
run_migrations_online()
|
||||
24
services/demo_session/migrations/script.py.mako
Normal file
24
services/demo_session/migrations/script.py.mako
Normal file
@@ -0,0 +1,24 @@
|
||||
"""${message}
|
||||
|
||||
Revision ID: ${up_revision}
|
||||
Revises: ${down_revision | comma,n}
|
||||
Create Date: ${create_date}
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
${imports if imports else ""}
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = ${repr(up_revision)}
|
||||
down_revision = ${repr(down_revision)}
|
||||
branch_labels = ${repr(branch_labels)}
|
||||
depends_on = ${repr(depends_on)}
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
${upgrades if upgrades else "pass"}
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
${downgrades if downgrades else "pass"}
|
||||
@@ -0,0 +1,64 @@
|
||||
"""initial_schema
|
||||
|
||||
Revision ID: a1b2c3d4e5f6
|
||||
Revises:
|
||||
Create Date: 2025-10-02 17:45:00.000000+02:00
|
||||
|
||||
"""
|
||||
from typing import Sequence, Union
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
from sqlalchemy.dialects import postgresql
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision: str = 'a1b2c3d4e5f6'
|
||||
down_revision: Union[str, None] = None
|
||||
branch_labels: Union[str, Sequence[str], None] = None
|
||||
depends_on: Union[str, Sequence[str], None] = None
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
# Create demo_sessions table
|
||||
op.create_table('demo_sessions',
|
||||
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False),
|
||||
sa.Column('session_id', sa.String(length=100), nullable=False),
|
||||
sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=True),
|
||||
sa.Column('ip_address', sa.String(length=45), nullable=True),
|
||||
sa.Column('user_agent', sa.String(length=500), nullable=True),
|
||||
sa.Column('base_demo_tenant_id', postgresql.UUID(as_uuid=True), nullable=False),
|
||||
sa.Column('virtual_tenant_id', postgresql.UUID(as_uuid=True), nullable=False),
|
||||
sa.Column('demo_account_type', sa.String(length=50), nullable=False),
|
||||
sa.Column('status', sa.Enum('active', 'expired', 'destroyed', name='demosessionstatus'), nullable=True),
|
||||
sa.Column('created_at', sa.DateTime(timezone=True), nullable=True),
|
||||
sa.Column('expires_at', sa.DateTime(timezone=True), nullable=False),
|
||||
sa.Column('last_activity_at', sa.DateTime(timezone=True), nullable=True),
|
||||
sa.Column('destroyed_at', sa.DateTime(timezone=True), nullable=True),
|
||||
sa.Column('request_count', sa.Integer(), nullable=True),
|
||||
sa.Column('data_cloned', sa.Boolean(), nullable=True),
|
||||
sa.Column('redis_populated', sa.Boolean(), nullable=True),
|
||||
sa.Column('session_metadata', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id'),
|
||||
sa.UniqueConstraint('session_id')
|
||||
)
|
||||
|
||||
# Create indexes
|
||||
op.create_index(op.f('ix_demo_sessions_session_id'), 'demo_sessions', ['session_id'], unique=False)
|
||||
op.create_index(op.f('ix_demo_sessions_base_demo_tenant_id'), 'demo_sessions', ['base_demo_tenant_id'], unique=False)
|
||||
op.create_index(op.f('ix_demo_sessions_virtual_tenant_id'), 'demo_sessions', ['virtual_tenant_id'], unique=False)
|
||||
op.create_index(op.f('ix_demo_sessions_status'), 'demo_sessions', ['status'], unique=False)
|
||||
op.create_index(op.f('ix_demo_sessions_created_at'), 'demo_sessions', ['created_at'], unique=False)
|
||||
op.create_index(op.f('ix_demo_sessions_expires_at'), 'demo_sessions', ['expires_at'], unique=False)
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
# Drop indexes
|
||||
op.drop_index(op.f('ix_demo_sessions_expires_at'), table_name='demo_sessions')
|
||||
op.drop_index(op.f('ix_demo_sessions_created_at'), table_name='demo_sessions')
|
||||
op.drop_index(op.f('ix_demo_sessions_status'), table_name='demo_sessions')
|
||||
op.drop_index(op.f('ix_demo_sessions_virtual_tenant_id'), table_name='demo_sessions')
|
||||
op.drop_index(op.f('ix_demo_sessions_base_demo_tenant_id'), table_name='demo_sessions')
|
||||
op.drop_index(op.f('ix_demo_sessions_session_id'), table_name='demo_sessions')
|
||||
|
||||
# Drop table (this will automatically drop the enum if it's only used here)
|
||||
op.drop_table('demo_sessions')
|
||||
13
services/demo_session/requirements.txt
Normal file
13
services/demo_session/requirements.txt
Normal file
@@ -0,0 +1,13 @@
|
||||
fastapi==0.104.1
|
||||
uvicorn[standard]==0.24.0
|
||||
sqlalchemy[asyncio]==2.0.23
|
||||
asyncpg==0.29.0
|
||||
psycopg2-binary==2.9.9
|
||||
alembic==1.12.1
|
||||
redis==5.0.1
|
||||
structlog==23.2.0
|
||||
pydantic==2.5.0
|
||||
pydantic-settings==2.1.0
|
||||
httpx==0.25.2
|
||||
PyJWT==2.8.0
|
||||
python-multipart==0.0.6
|
||||
@@ -41,13 +41,13 @@ async def create_enhanced_single_forecast(
|
||||
):
|
||||
"""Generate a single product forecast using enhanced repository pattern"""
|
||||
metrics = get_metrics_collector(request_obj)
|
||||
|
||||
|
||||
try:
|
||||
logger.info("Generating enhanced single forecast",
|
||||
tenant_id=tenant_id,
|
||||
inventory_product_id=request.inventory_product_id,
|
||||
forecast_date=request.forecast_date.isoformat())
|
||||
|
||||
|
||||
# Record metrics
|
||||
if metrics:
|
||||
metrics.increment_counter("enhanced_single_forecasts_total")
|
||||
@@ -163,13 +163,13 @@ async def create_enhanced_batch_forecast(
|
||||
):
|
||||
"""Generate batch forecasts using enhanced repository pattern"""
|
||||
metrics = get_metrics_collector(request_obj)
|
||||
|
||||
|
||||
try:
|
||||
logger.info("Generating enhanced batch forecasts",
|
||||
tenant_id=tenant_id,
|
||||
products_count=len(request.inventory_product_ids),
|
||||
forecast_dates_count=request.forecast_days)
|
||||
|
||||
|
||||
# Record metrics
|
||||
if metrics:
|
||||
metrics.increment_counter("enhanced_batch_forecasts_total")
|
||||
|
||||
@@ -11,6 +11,7 @@ import structlog
|
||||
from apscheduler.triggers.cron import CronTrigger
|
||||
|
||||
from shared.alerts.base_service import BaseAlertService, AlertServiceMixin
|
||||
from shared.database.base import create_database_manager
|
||||
from app.services.procurement_service import ProcurementService
|
||||
|
||||
logger = structlog.get_logger()
|
||||
@@ -204,10 +205,44 @@ class ProcurementSchedulerService(BaseAlertService, AlertServiceMixin):
|
||||
logger.error("💥 Stale plan cleanup failed", error=str(e))
|
||||
|
||||
async def get_active_tenants(self) -> List[UUID]:
|
||||
"""Get active tenants from tenant service or base implementation"""
|
||||
# Only use tenant service, no fallbacks
|
||||
"""Get active tenants from tenant service, excluding demo tenants"""
|
||||
try:
|
||||
return await super().get_active_tenants()
|
||||
all_tenants = await super().get_active_tenants()
|
||||
|
||||
# Filter out demo tenants
|
||||
from services.tenant.app.models.tenants import Tenant
|
||||
from sqlalchemy import select
|
||||
import os
|
||||
|
||||
tenant_db_url = os.getenv("TENANT_DATABASE_URL")
|
||||
if not tenant_db_url:
|
||||
logger.warning("TENANT_DATABASE_URL not set, returning all tenants")
|
||||
return all_tenants
|
||||
|
||||
tenant_db = create_database_manager(tenant_db_url, "tenant-filter")
|
||||
non_demo_tenants = []
|
||||
|
||||
async with tenant_db.get_session() as session:
|
||||
for tenant_id in all_tenants:
|
||||
result = await session.execute(
|
||||
select(Tenant).where(Tenant.id == tenant_id)
|
||||
)
|
||||
tenant = result.scalars().first()
|
||||
|
||||
# Only include non-demo tenants
|
||||
if tenant and not tenant.is_demo:
|
||||
non_demo_tenants.append(tenant_id)
|
||||
elif tenant and tenant.is_demo:
|
||||
logger.debug("Excluding demo tenant from procurement scheduler",
|
||||
tenant_id=str(tenant_id))
|
||||
|
||||
logger.info("Filtered demo tenants from procurement scheduling",
|
||||
total_tenants=len(all_tenants),
|
||||
non_demo_tenants=len(non_demo_tenants),
|
||||
demo_tenants_filtered=len(all_tenants) - len(non_demo_tenants))
|
||||
|
||||
return non_demo_tenants
|
||||
|
||||
except Exception as e:
|
||||
logger.error("Could not fetch tenants from base service", error=str(e))
|
||||
return []
|
||||
|
||||
@@ -36,11 +36,18 @@ class Tenant(Base):
|
||||
# Status
|
||||
is_active = Column(Boolean, default=True)
|
||||
subscription_tier = Column(String(50), default="starter")
|
||||
|
||||
|
||||
# Demo account flags
|
||||
is_demo = Column(Boolean, default=False, index=True)
|
||||
is_demo_template = Column(Boolean, default=False, index=True)
|
||||
base_demo_tenant_id = Column(UUID(as_uuid=True), nullable=True, index=True)
|
||||
demo_session_id = Column(String(100), nullable=True, index=True)
|
||||
demo_expires_at = Column(DateTime(timezone=True), nullable=True)
|
||||
|
||||
# ML status
|
||||
ml_model_trained = Column(Boolean, default=False)
|
||||
last_training_date = Column(DateTime(timezone=True))
|
||||
|
||||
|
||||
# Ownership (user_id without FK - cross-service reference)
|
||||
owner_id = Column(UUID(as_uuid=True), nullable=False, index=True)
|
||||
|
||||
|
||||
@@ -0,0 +1,48 @@
|
||||
"""add_demo_columns
|
||||
|
||||
Revision ID: 2a9b3c4d5e6f
|
||||
Revises: 1e8aebb4d9ce
|
||||
Create Date: 2025-10-02 17:00:00.000000+02:00
|
||||
|
||||
"""
|
||||
from typing import Sequence, Union
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision: str = '2a9b3c4d5e6f'
|
||||
down_revision: Union[str, None] = '1e8aebb4d9ce'
|
||||
branch_labels: Union[str, Sequence[str], None] = None
|
||||
depends_on: Union[str, Sequence[str], None] = None
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
# Add demo-related columns to tenants table
|
||||
op.add_column('tenants', sa.Column('is_demo', sa.Boolean(), nullable=False, server_default='false'))
|
||||
op.add_column('tenants', sa.Column('is_demo_template', sa.Boolean(), nullable=False, server_default='false'))
|
||||
op.add_column('tenants', sa.Column('base_demo_tenant_id', sa.UUID(), nullable=True))
|
||||
op.add_column('tenants', sa.Column('demo_session_id', sa.String(length=100), nullable=True))
|
||||
op.add_column('tenants', sa.Column('demo_expires_at', sa.DateTime(timezone=True), nullable=True))
|
||||
|
||||
# Create indexes for demo columns
|
||||
op.create_index(op.f('ix_tenants_is_demo'), 'tenants', ['is_demo'], unique=False)
|
||||
op.create_index(op.f('ix_tenants_is_demo_template'), 'tenants', ['is_demo_template'], unique=False)
|
||||
op.create_index(op.f('ix_tenants_base_demo_tenant_id'), 'tenants', ['base_demo_tenant_id'], unique=False)
|
||||
op.create_index(op.f('ix_tenants_demo_session_id'), 'tenants', ['demo_session_id'], unique=False)
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
# Drop indexes
|
||||
op.drop_index(op.f('ix_tenants_demo_session_id'), table_name='tenants')
|
||||
op.drop_index(op.f('ix_tenants_base_demo_tenant_id'), table_name='tenants')
|
||||
op.drop_index(op.f('ix_tenants_is_demo_template'), table_name='tenants')
|
||||
op.drop_index(op.f('ix_tenants_is_demo'), table_name='tenants')
|
||||
|
||||
# Drop columns
|
||||
op.drop_column('tenants', 'demo_expires_at')
|
||||
op.drop_column('tenants', 'demo_session_id')
|
||||
op.drop_column('tenants', 'base_demo_tenant_id')
|
||||
op.drop_column('tenants', 'is_demo_template')
|
||||
op.drop_column('tenants', 'is_demo')
|
||||
@@ -144,7 +144,8 @@ class BaseAlertService:
|
||||
else:
|
||||
# Already leader - try to extend the lock
|
||||
current_value = await self.redis.get(lock_key)
|
||||
if current_value and current_value.decode() == instance_id:
|
||||
# Note: decode_responses=True means Redis returns strings, not bytes
|
||||
if current_value and current_value == instance_id:
|
||||
# Still our lock, extend it using a Lua script for atomicity
|
||||
lua_script = """
|
||||
if redis.call("GET", KEYS[1]) == ARGV[1] then
|
||||
|
||||
@@ -167,6 +167,7 @@ class BaseServiceSettings(BaseSettings):
|
||||
SUPPLIERS_SERVICE_URL: str = os.getenv("SUPPLIERS_SERVICE_URL", "http://bakery-suppliers-service:8000")
|
||||
RECIPES_SERVICE_URL: str = os.getenv("RECIPES_SERVICE_URL", "http://recipes-service:8000")
|
||||
NOMINATIM_SERVICE_URL: str = os.getenv("NOMINATIM_SERVICE_URL", "http://nominatim:8080")
|
||||
DEMO_SESSION_SERVICE_URL: str = os.getenv("DEMO_SESSION_SERVICE_URL", "http://demo-session-service:8000")
|
||||
|
||||
# HTTP Client Settings
|
||||
HTTP_TIMEOUT: int = int(os.getenv("HTTP_TIMEOUT", "30"))
|
||||
|
||||
Reference in New Issue
Block a user