Initial commit - production deployment

2026-01-21 17:17:16 +01:00
commit c23d00dd92
2289 changed files with 638440 additions and 0 deletions


@@ -0,0 +1,59 @@
# =============================================================================
# Orders Service Dockerfile - Environment-Configurable Base Images
# =============================================================================
# Build arguments for registry configuration:
# - BASE_REGISTRY: Registry URL (default: docker.io for Docker Hub)
# - PYTHON_IMAGE: Python image name and tag (default: python:3.11-slim)
# =============================================================================
ARG BASE_REGISTRY=docker.io
ARG PYTHON_IMAGE=python:3.11-slim
FROM ${BASE_REGISTRY}/${PYTHON_IMAGE} AS shared
WORKDIR /shared
COPY shared/ /shared/
ARG BASE_REGISTRY=docker.io
ARG PYTHON_IMAGE=python:3.11-slim
FROM ${BASE_REGISTRY}/${PYTHON_IMAGE}
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
gcc \
curl \
&& rm -rf /var/lib/apt/lists/*
# Copy requirements
COPY shared/requirements-tracing.txt /tmp/
COPY services/orders/requirements.txt .
# Install Python dependencies
RUN pip install --no-cache-dir -r /tmp/requirements-tracing.txt
RUN pip install --no-cache-dir -r requirements.txt
# Copy shared libraries from the shared stage
COPY --from=shared /shared /app/shared
# Copy application code
COPY services/orders/ .
# Add shared libraries to Python path
ENV PYTHONPATH="/app:/app/shared:${PYTHONPATH:-}"
ENV PYTHONUNBUFFERED=1
# Expose port
EXPOSE 8000
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
CMD curl -f http://localhost:8000/health || exit 1
# Run application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

services/orders/README.md

@@ -0,0 +1,959 @@
# Orders Service
## Overview
The **Orders Service** manages the complete customer order lifecycle from creation to fulfillment, tracking custom orders, wholesale orders, and direct sales. It maintains a comprehensive customer database with purchase history, enables order scheduling for pickup/delivery, and provides analytics on customer behavior and order patterns. This service is essential for B2B relationships with restaurants and hotels, as well as managing special orders for events and celebrations.
## Key Features
### Order Management
- **Multi-Channel Orders** - In-store, phone, online, wholesale
- **Order Lifecycle Tracking** - From pending to completed/cancelled
- **Custom Orders** - Special requests for events and celebrations
- **Recurring Orders** - Automated weekly/monthly orders for B2B
- **Order Scheduling** - Pickup/delivery date and time management
- **Order Priority** - Rush orders vs. standard processing
- **Order Status Updates** - Real-time status with customer notifications
### Customer Database
- **Customer Profiles** - Complete contact and preference information
- **Purchase History** - Track all orders per customer
- **Customer Segmentation** - B2B vs. B2C, loyalty tiers
- **Customer Preferences** - Favorite products, allergen notes
- **Credit Terms** - Payment terms for wholesale customers
- **Customer Analytics** - RFM analysis (Recency, Frequency, Monetary)
- **Customer Lifetime Value** - Total value per customer
### B2B Wholesale Management
- **Wholesale Pricing** - Custom pricing per B2B customer
- **Volume Discounts** - Automatic tier-based discounts
- **Delivery Routes** - Optimize delivery scheduling
- **Invoice Generation** - Automated invoicing with payment terms
- **Standing Orders** - Repeat orders without manual entry
- **Account Management** - Credit limits and payment tracking
### Order Fulfillment
- **Production Integration** - Orders trigger production planning
- **Inventory Reservation** - Reserve stock for confirmed orders
- **Fulfillment Status** - Track preparation and delivery
- **Delivery Management** - Route planning and tracking
- **Order Picking Lists** - Generate lists for warehouse staff
- **Quality Control** - Pre-delivery quality checks
### Payment Tracking
- **Payment Methods** - Cash, card, transfer, credit terms
- **Payment Status** - Paid, pending, overdue
- **Partial Payments** - Split payments over time
- **Invoice History** - Complete payment records
- **Overdue Alerts** - Automatic reminders for B2B accounts
- **Revenue Recognition** - Track revenue per order
### Analytics & Reporting
- **Order Dashboard** - Real-time order metrics
- **Customer Analytics** - Top customers, retention rates
- **Product Analytics** - Most ordered products
- **Revenue Analytics** - Daily/weekly/monthly revenue
- **Order Source Analysis** - Channel performance
- **Delivery Performance** - On-time delivery rates
## Business Value
### For Bakery Owners
- **Revenue Growth** - Better customer relationships drive repeat business
- **B2B Efficiency** - Automate wholesale order management
- **Cash Flow** - Track outstanding payments and credit terms
- **Customer Retention** - Purchase history enables personalized service
- **Order Accuracy** - Digital orders reduce errors vs. phone/paper
- **Analytics** - Understand customer behavior for marketing
### Quantifiable Impact
- **Revenue Growth**: 10-20% through improved B2B relationships
- **Time Savings**: 5-8 hours/week on order management
- **Order Accuracy**: 99%+ vs. 85-90% manual (phone/paper)
- **Payment Collection**: 30% faster with automated reminders
- **Customer Retention**: 15-25% improvement with history tracking
- **B2B Efficiency**: 50-70% time reduction on wholesale orders
### For Sales Staff
- **Quick Order Entry** - Fast order creation with customer lookup
- **Customer History** - See previous orders for upselling
- **Pricing Accuracy** - Automatic wholesale pricing application
- **Order Tracking** - Know exactly when orders will be ready
- **Customer Notes** - Allergen info and preferences visible
### For Customers
- **Order Confirmation** - Immediate confirmation with details
- **Order Tracking** - Real-time status updates
- **Order History** - View and repeat previous orders
- **Flexible Scheduling** - Choose pickup/delivery times
- **Payment Options** - Multiple payment methods
## Technology Stack
- **Framework**: FastAPI (Python 3.11+) - Async web framework
- **Database**: PostgreSQL 17 - Order and customer data
- **Caching**: Redis 7.4 - Customer and order cache
- **Messaging**: RabbitMQ 4.1 - Order event publishing
- **ORM**: SQLAlchemy 2.0 (async) - Database abstraction
- **Validation**: Pydantic 2.0 - Schema validation
- **Logging**: Structlog - Structured JSON logging
- **Metrics**: Prometheus Client - Order metrics
## API Endpoints (Key Routes)
### Order Management
- `GET /api/v1/orders` - List orders with filters
- `POST /api/v1/orders` - Create new order
- `GET /api/v1/orders/{order_id}` - Get order details
- `PUT /api/v1/orders/{order_id}` - Update order
- `DELETE /api/v1/orders/{order_id}` - Cancel order
- `PUT /api/v1/orders/{order_id}/status` - Update order status
- `POST /api/v1/orders/{order_id}/complete` - Mark order complete
### Order Items
- `GET /api/v1/orders/{order_id}/items` - List order items
- `POST /api/v1/orders/{order_id}/items` - Add item to order
- `PUT /api/v1/orders/{order_id}/items/{item_id}` - Update order item
- `DELETE /api/v1/orders/{order_id}/items/{item_id}` - Remove item
### Customer Management
- `GET /api/v1/customers` - List customers with filters
- `POST /api/v1/customers` - Create new customer
- `GET /api/v1/customers/{customer_id}` - Get customer details
- `PUT /api/v1/customers/{customer_id}` - Update customer
- `GET /api/v1/customers/{customer_id}/orders` - Get customer order history
- `GET /api/v1/customers/{customer_id}/analytics` - Customer analytics
### Wholesale Management
- `GET /api/v1/orders/wholesale` - List wholesale orders
- `POST /api/v1/orders/wholesale/recurring` - Create recurring order
- `GET /api/v1/orders/wholesale/invoices` - List invoices
- `POST /api/v1/orders/wholesale/invoices/{invoice_id}/send` - Send invoice
- `GET /api/v1/orders/wholesale/overdue` - List overdue payments
### Fulfillment
- `GET /api/v1/orders/fulfillment/pending` - Orders pending fulfillment
- `POST /api/v1/orders/{order_id}/prepare` - Start order preparation
- `POST /api/v1/orders/{order_id}/ready` - Mark order ready
- `POST /api/v1/orders/{order_id}/deliver` - Mark order delivered
- `GET /api/v1/orders/fulfillment/picking-list` - Generate picking list
### Analytics
- `GET /api/v1/orders/analytics/dashboard` - Order dashboard KPIs
- `GET /api/v1/orders/analytics/revenue` - Revenue analytics
- `GET /api/v1/orders/analytics/customers/top` - Top customers
- `GET /api/v1/orders/analytics/products/popular` - Most ordered products
- `GET /api/v1/orders/analytics/channels` - Order channel breakdown
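For example, a client could call the dashboard endpoint as sketched below; the bearer-token header and port 8010 are assumptions based on the Auth Service dependency and the configuration section, not confirmed request details.
```python
import asyncio
import httpx

async def fetch_dashboard(base_url: str, token: str) -> dict:
    """Illustrative client call to the dashboard KPI endpoint listed above."""
    async with httpx.AsyncClient(base_url=base_url) as client:
        resp = await client.get(
            "/api/v1/orders/analytics/dashboard",
            headers={"Authorization": f"Bearer {token}"},  # assumed JWT auth via the Auth Service
        )
        resp.raise_for_status()
        return resp.json()

if __name__ == "__main__":
    print(asyncio.run(fetch_dashboard("http://localhost:8010", token="<jwt>")))
```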
## Database Schema
### Main Tables
**customers**
```sql
CREATE TABLE customers (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
customer_type VARCHAR(50) NOT NULL, -- retail, wholesale, restaurant, hotel
business_name VARCHAR(255), -- For B2B customers
contact_name VARCHAR(255) NOT NULL,
email VARCHAR(255),
phone VARCHAR(50) NOT NULL,
secondary_phone VARCHAR(50),
address_line1 VARCHAR(255),
address_line2 VARCHAR(255),
city VARCHAR(100),
postal_code VARCHAR(20),
country VARCHAR(100) DEFAULT 'España',
tax_id VARCHAR(50), -- CIF/NIF for businesses
credit_limit DECIMAL(10, 2), -- For B2B customers
credit_term_days INTEGER DEFAULT 0, -- Payment terms (e.g., Net 30)
payment_status VARCHAR(50) DEFAULT 'good_standing', -- good_standing, overdue, suspended
customer_notes TEXT,
allergen_notes TEXT,
preferred_contact_method VARCHAR(50), -- email, phone, whatsapp
loyalty_tier VARCHAR(50) DEFAULT 'standard', -- standard, silver, gold, platinum
total_lifetime_value DECIMAL(12, 2) DEFAULT 0.00,
total_orders INTEGER DEFAULT 0,
last_order_date DATE,
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW(),
UNIQUE(tenant_id, email),
UNIQUE(tenant_id, phone)
);
```
**orders**
```sql
CREATE TABLE orders (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
order_number VARCHAR(100) NOT NULL, -- Human-readable order number
customer_id UUID REFERENCES customers(id),
order_type VARCHAR(50) NOT NULL, -- retail, wholesale, custom, standing
order_source VARCHAR(50), -- in_store, phone, online, email
status VARCHAR(50) DEFAULT 'pending', -- pending, confirmed, preparing, ready, completed, cancelled
priority VARCHAR(50) DEFAULT 'standard', -- rush, standard, scheduled
order_date DATE NOT NULL DEFAULT CURRENT_DATE,
requested_date DATE, -- Pickup/delivery date
requested_time TIME, -- Pickup/delivery time
fulfilled_date DATE,
subtotal DECIMAL(10, 2) NOT NULL DEFAULT 0.00,
discount_amount DECIMAL(10, 2) DEFAULT 0.00,
tax_amount DECIMAL(10, 2) DEFAULT 0.00,
total_amount DECIMAL(10, 2) NOT NULL DEFAULT 0.00,
payment_method VARCHAR(50), -- cash, card, transfer, credit
payment_status VARCHAR(50) DEFAULT 'unpaid', -- unpaid, paid, partial, overdue
payment_due_date DATE,
delivery_method VARCHAR(50), -- pickup, delivery, shipping
delivery_address TEXT,
delivery_notes TEXT,
internal_notes TEXT,
created_by UUID NOT NULL,
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW(),
UNIQUE(tenant_id, order_number)
);
```
**order_items**
```sql
CREATE TABLE order_items (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
order_id UUID REFERENCES orders(id) ON DELETE CASCADE,
product_id UUID NOT NULL,
product_name VARCHAR(255) NOT NULL, -- Cached for performance
quantity DECIMAL(10, 2) NOT NULL,
unit VARCHAR(50) NOT NULL,
unit_price DECIMAL(10, 2) NOT NULL,
discount_percentage DECIMAL(5, 2) DEFAULT 0.00,
line_total DECIMAL(10, 2) NOT NULL,
custom_instructions TEXT,
recipe_id UUID, -- Link to recipe if applicable
production_batch_id UUID, -- Link to production batch
fulfilled_quantity DECIMAL(10, 2) DEFAULT 0.00,
fulfillment_status VARCHAR(50) DEFAULT 'pending', -- pending, reserved, prepared, fulfilled
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW()
);
```
**customer_pricing**
```sql
CREATE TABLE customer_pricing (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
customer_id UUID REFERENCES customers(id) ON DELETE CASCADE,
product_id UUID NOT NULL,
custom_price DECIMAL(10, 2) NOT NULL,
discount_percentage DECIMAL(5, 2),
min_quantity DECIMAL(10, 2), -- Minimum order quantity for price
valid_from DATE DEFAULT CURRENT_DATE,
valid_until DATE,
is_active BOOLEAN DEFAULT TRUE,
notes TEXT,
created_at TIMESTAMP DEFAULT NOW(),
UNIQUE(tenant_id, customer_id, product_id)
);
```
**recurring_orders**
```sql
CREATE TABLE recurring_orders (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
customer_id UUID REFERENCES customers(id) ON DELETE CASCADE,
recurring_name VARCHAR(255) NOT NULL,
frequency VARCHAR(50) NOT NULL, -- daily, weekly, biweekly, monthly
delivery_day VARCHAR(50), -- Monday, Tuesday, etc.
delivery_time TIME,
order_items JSONB NOT NULL, -- Array of {product_id, quantity, unit}
is_active BOOLEAN DEFAULT TRUE,
next_order_date DATE,
last_generated_order_id UUID,
created_at TIMESTAMP DEFAULT NOW(),
updated_at TIMESTAMP DEFAULT NOW()
);
```
**order_status_history**
```sql
CREATE TABLE order_status_history (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
order_id UUID REFERENCES orders(id) ON DELETE CASCADE,
from_status VARCHAR(50),
to_status VARCHAR(50) NOT NULL,
changed_by UUID NOT NULL,
notes TEXT,
changed_at TIMESTAMP DEFAULT NOW()
);
```
**invoices**
```sql
CREATE TABLE invoices (
id UUID PRIMARY KEY,
tenant_id UUID NOT NULL,
invoice_number VARCHAR(100) NOT NULL,
order_id UUID REFERENCES orders(id),
customer_id UUID REFERENCES customers(id),
invoice_date DATE NOT NULL DEFAULT CURRENT_DATE,
due_date DATE NOT NULL,
subtotal DECIMAL(10, 2) NOT NULL,
tax_amount DECIMAL(10, 2) NOT NULL,
total_amount DECIMAL(10, 2) NOT NULL,
amount_paid DECIMAL(10, 2) DEFAULT 0.00,
amount_due DECIMAL(10, 2) NOT NULL,
status VARCHAR(50) DEFAULT 'sent', -- draft, sent, paid, overdue, cancelled
payment_terms VARCHAR(255),
notes TEXT,
sent_at TIMESTAMP,
paid_at TIMESTAMP,
created_at TIMESTAMP DEFAULT NOW(),
UNIQUE(tenant_id, invoice_number)
);
```
### Indexes for Performance
```sql
CREATE INDEX idx_orders_tenant_status ON orders(tenant_id, status);
CREATE INDEX idx_orders_customer ON orders(customer_id);
CREATE INDEX idx_orders_date ON orders(tenant_id, order_date DESC);
CREATE INDEX idx_orders_requested_date ON orders(tenant_id, requested_date);
CREATE INDEX idx_customers_tenant_type ON customers(tenant_id, customer_type);
CREATE INDEX idx_order_items_order ON order_items(order_id);
CREATE INDEX idx_order_items_product ON order_items(tenant_id, product_id);
CREATE INDEX idx_invoices_status ON invoices(tenant_id, status);
CREATE INDEX idx_invoices_due_date ON invoices(tenant_id, due_date) WHERE status != 'paid';
```
## Business Logic Examples
### Order Creation with Pricing
```python
async def create_order(order_data: OrderCreate, current_user: User) -> Order:
"""
Create new order with automatic pricing and customer detection.
"""
# Get or create customer
customer = await get_or_create_customer(
order_data.customer_phone,
order_data.customer_name,
order_data.customer_email
)
# Generate order number
order_number = await generate_order_number(current_user.tenant_id)
# Create order
order = Order(
tenant_id=current_user.tenant_id,
order_number=order_number,
customer_id=customer.id,
order_type=order_data.order_type,
order_source=order_data.order_source,
status='pending',
order_date=date.today(),
requested_date=order_data.requested_date,
created_by=current_user.id
)
db.add(order)
await db.flush() # Get order.id
# Add order items with pricing
subtotal = Decimal('0.00')
for item_data in order_data.items:
# Get product price
base_price = await get_product_price(item_data.product_id)
# Check for customer-specific pricing
custom_price = await get_customer_price(
customer.id,
item_data.product_id,
item_data.quantity
)
unit_price = custom_price if custom_price else base_price
# Apply wholesale discount if applicable
if customer.customer_type == 'wholesale':
discount_pct = await calculate_volume_discount(
item_data.product_id,
item_data.quantity
)
else:
discount_pct = Decimal('0.00')
# Calculate line total
line_total = (unit_price * item_data.quantity) * (1 - discount_pct / 100)
# Create order item
order_item = OrderItem(
tenant_id=current_user.tenant_id,
order_id=order.id,
product_id=item_data.product_id,
product_name=item_data.product_name,
quantity=item_data.quantity,
unit=item_data.unit,
unit_price=unit_price,
discount_percentage=discount_pct,
line_total=line_total
)
db.add(order_item)
subtotal += line_total
# Calculate tax (e.g., Spanish IVA 10% for food)
tax_rate = Decimal('0.10')
tax_amount = subtotal * tax_rate
total_amount = subtotal + tax_amount
# Update order totals
order.subtotal = subtotal
order.tax_amount = tax_amount
order.total_amount = total_amount
# Set payment terms for B2B
if customer.customer_type == 'wholesale':
order.payment_due_date = date.today() + timedelta(days=customer.credit_term_days)
order.payment_status = 'unpaid'
else:
order.payment_status = 'paid' # Retail assumes immediate payment
await db.commit()
await db.refresh(order)
# Publish order created event
await publish_event('orders', 'order.created', {
'order_id': str(order.id),
'customer_id': str(customer.id),
'total_amount': float(order.total_amount),
'requested_date': order.requested_date.isoformat() if order.requested_date else None
})
return order
```
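The helper functions referenced above (`get_or_create_customer`, `get_product_price`, `get_customer_price`, `calculate_volume_discount`, `publish_event`, `generate_order_number`) are not shown in this README. As one illustration, a minimal sketch of `generate_order_number`, assuming a per-tenant daily counter in Redis and the `ORD-YYYY-MMDD-NNN` format used in the event examples below:
```python
import os
from datetime import date
from uuid import UUID

import redis.asyncio as aioredis

redis = aioredis.from_url(os.environ["REDIS_URL"], decode_responses=True)  # assumed shared client

async def generate_order_number(tenant_id: UUID) -> str:
    """Hypothetical sketch: per-tenant, per-day sequence backed by Redis."""
    today = date.today()
    key = f"order_seq:{tenant_id}:{today.isoformat()}"
    seq = await redis.incr(key)        # atomic per-day counter
    await redis.expire(key, 172_800)   # keep the counter around for ~2 days
    return f"ORD-{today:%Y}-{today:%m%d}-{seq:03d}"
```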
### Recurring Order Generation
```python
async def generate_recurring_orders(tenant_id: UUID):
"""
Generate orders from recurring order templates.
Run daily via orchestrator.
"""
# Get active recurring orders due today
today = date.today()
    result = await db.execute(
        select(RecurringOrder).where(
            RecurringOrder.tenant_id == tenant_id,
            RecurringOrder.is_active == True,
            RecurringOrder.next_order_date <= today
        )
    )
    recurring_orders = result.scalars().all()
generated_count = 0
for recurring in recurring_orders:
try:
# Create order from template
order = Order(
tenant_id=tenant_id,
order_number=await generate_order_number(tenant_id),
customer_id=recurring.customer_id,
order_type='standing',
order_source='auto_recurring',
status='confirmed',
order_date=today,
requested_date=recurring.next_order_date,
requested_time=recurring.delivery_time
)
db.add(order)
await db.flush()
# Add items from template
subtotal = Decimal('0.00')
for item_template in recurring.order_items:
product_price = await get_product_price(item_template['product_id'])
line_total = product_price * Decimal(str(item_template['quantity']))
order_item = OrderItem(
tenant_id=tenant_id,
order_id=order.id,
product_id=UUID(item_template['product_id']),
product_name=item_template['product_name'],
quantity=Decimal(str(item_template['quantity'])),
unit=item_template['unit'],
unit_price=product_price,
line_total=line_total
)
db.add(order_item)
subtotal += line_total
# Calculate totals
tax_amount = subtotal * Decimal('0.10')
order.subtotal = subtotal
order.tax_amount = tax_amount
order.total_amount = subtotal + tax_amount
# Update recurring order
recurring.last_generated_order_id = order.id
recurring.next_order_date = calculate_next_order_date(
recurring.next_order_date,
recurring.frequency
)
await db.commit()
generated_count += 1
# Publish event
await publish_event('orders', 'recurring_order.generated', {
'order_id': str(order.id),
'recurring_order_id': str(recurring.id),
'customer_id': str(recurring.customer_id)
})
except Exception as e:
logger.error("Failed to generate recurring order",
recurring_id=str(recurring.id),
error=str(e))
continue
logger.info("Generated recurring orders",
tenant_id=str(tenant_id),
count=generated_count)
return generated_count
```
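`calculate_next_order_date` is referenced above but not defined in this README. A minimal sketch, assuming the four frequencies allowed by `recurring_orders.frequency`:
```python
from datetime import date, timedelta

def calculate_next_order_date(current: date, frequency: str) -> date:
    """Hypothetical sketch of the scheduling helper used above."""
    if frequency == "daily":
        return current + timedelta(days=1)
    if frequency == "weekly":
        return current + timedelta(weeks=1)
    if frequency == "biweekly":
        return current + timedelta(weeks=2)
    if frequency == "monthly":
        # naive month increment; clamp the day to avoid invalid dates (e.g. Jan 31 -> Feb 28)
        year = current.year + (current.month // 12)
        month = current.month % 12 + 1
        day = min(current.day, 28)
        return date(year, month, day)
    raise ValueError(f"Unknown frequency: {frequency}")
```
A fuller implementation would also honour the template's `delivery_day` so generated orders always land on the configured weekday.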
### Customer RFM Analysis
```python
async def calculate_customer_rfm(customer_id: UUID) -> dict:
"""
Calculate RFM (Recency, Frequency, Monetary) metrics for customer.
"""
# Get customer orders
    result = await db.execute(
        select(Order)
        .where(
            Order.customer_id == customer_id,
            Order.status.in_(['completed'])
        )
        .order_by(Order.order_date.desc())
    )
    orders = result.scalars().all()
if not orders:
return {"rfm_score": 0, "segment": "inactive"}
# Recency: Days since last order
last_order_date = orders[0].order_date
recency_days = (date.today() - last_order_date).days
# Frequency: Number of orders in last 365 days
one_year_ago = date.today() - timedelta(days=365)
recent_orders = [o for o in orders if o.order_date >= one_year_ago]
frequency = len(recent_orders)
# Monetary: Total spend in last 365 days
monetary = sum(o.total_amount for o in recent_orders)
# Score each dimension (1-5 scale)
recency_score = 5 if recency_days <= 30 else \
4 if recency_days <= 60 else \
3 if recency_days <= 90 else \
2 if recency_days <= 180 else 1
frequency_score = 5 if frequency >= 12 else \
4 if frequency >= 6 else \
3 if frequency >= 3 else \
2 if frequency >= 1 else 1
monetary_score = 5 if monetary >= 5000 else \
4 if monetary >= 2000 else \
3 if monetary >= 500 else \
2 if monetary >= 100 else 1
# Overall RFM score
rfm_score = (recency_score + frequency_score + monetary_score) / 3
# Customer segment
if rfm_score >= 4.5:
segment = "champion"
elif rfm_score >= 3.5:
segment = "loyal"
elif rfm_score >= 2.5:
segment = "potential"
elif rfm_score >= 1.5:
segment = "at_risk"
else:
segment = "inactive"
return {
"rfm_score": round(rfm_score, 2),
"recency_days": recency_days,
"recency_score": recency_score,
"frequency": frequency,
"frequency_score": frequency_score,
"monetary": float(monetary),
"monetary_score": monetary_score,
"segment": segment
}
```
## Events & Messaging
### Published Events (RabbitMQ)
**Exchange**: `orders`
**Routing Keys**: `orders.created`, `orders.completed`, `orders.cancelled`, `orders.overdue`
**Order Created Event**
```json
{
"event_type": "order_created",
"tenant_id": "uuid",
"order_id": "uuid",
"order_number": "ORD-2025-1106-001",
"customer_id": "uuid",
"customer_name": "Restaurante El Prado",
"order_type": "wholesale",
"total_amount": 450.00,
"requested_date": "2025-11-07",
"requested_time": "06:00:00",
"item_count": 12,
"timestamp": "2025-11-06T10:30:00Z"
}
```
**Order Completed Event**
```json
{
"event_type": "order_completed",
"tenant_id": "uuid",
"order_id": "uuid",
"order_number": "ORD-2025-1106-001",
"customer_id": "uuid",
"total_amount": 450.00,
"payment_status": "paid",
"completed_at": "2025-11-07T06:15:00Z",
"timestamp": "2025-11-07T06:15:00Z"
}
```
**Payment Overdue Alert**
```json
{
"event_type": "payment_overdue",
"tenant_id": "uuid",
"invoice_id": "uuid",
"invoice_number": "INV-2025-1106-001",
"customer_id": "uuid",
"customer_name": "Hotel Gran Vía",
"amount_due": 850.00,
"days_overdue": 15,
"due_date": "2025-10-22",
"timestamp": "2025-11-06T09:00:00Z"
}
```
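The `publish_event(exchange, routing_key, payload)` calls in the examples above belong to the shared messaging layer, which is not shown here. A minimal sketch of what such a publisher could look like with aio-pika (connection handling simplified; the real shared client may differ):
```python
import json
import os

import aio_pika

_connection: aio_pika.RobustConnection | None = None

async def publish_event(exchange_name: str, routing_key: str, payload: dict) -> None:
    """Illustrative publisher: JSON message to a durable topic exchange."""
    global _connection
    if _connection is None or _connection.is_closed:
        _connection = await aio_pika.connect_robust(os.environ["RABBITMQ_URL"])
    channel = await _connection.channel()  # a real client would reuse a channel
    exchange = await channel.declare_exchange(
        exchange_name, aio_pika.ExchangeType.TOPIC, durable=True
    )
    await exchange.publish(
        aio_pika.Message(
            body=json.dumps(payload, default=str).encode(),
            content_type="application/json",
            delivery_mode=aio_pika.DeliveryMode.PERSISTENT,
        ),
        routing_key=routing_key,
    )
```
Declaring a durable topic exchange matches the dotted routing-key scheme listed above.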
### Alert Events
The Orders service also publishes procurement-related alerts through the alert processor.
**Exchange**: `events.exchange`
**Domain**: `procurement`
#### 1. POs Pending Approval Alert
**Event Type**: `procurement.pos_pending_approval`
**Severity**: urgent (>€10,000), high (>€5,000 or critical POs), medium (otherwise)
**Trigger**: New purchase orders created and awaiting approval
```json
{
"event_type": "procurement.pos_pending_approval",
"severity": "high",
"metadata": {
"tenant_id": "uuid",
"pos_count": 3,
"total_amount": 6500.00,
"critical_count": 1,
"pos": [
{
"po_id": "uuid",
"po_number": "PO-001",
"supplier_id": "uuid",
"total_amount": 3000.00,
"auto_approved": false
}
],
"action_required": true,
"action_url": "/app/comprar"
}
}
```
#### 2. Approval Reminder Alert
**Event Type**: `procurement.approval_reminder`
**Severity**: high (>36 hours pending), medium (otherwise)
**Trigger**: PO not approved within threshold time
```json
{
"event_type": "procurement.approval_reminder",
"severity": "high",
"metadata": {
"tenant_id": "uuid",
"po_id": "uuid",
"po_number": "PO-001",
"supplier_name": "Supplier ABC",
"total_amount": 3000.00,
"hours_pending": 40,
"created_at": "2025-12-18T10:00:00Z",
"action_required": true,
"action_url": "/app/comprar?po=uuid"
}
}
```
#### 3. Critical PO Escalation Alert
**Event Type**: `procurement.critical_po_escalation`
**Severity**: urgent
**Trigger**: Critical/urgent PO not approved in time
```json
{
"event_type": "procurement.critical_po_escalation",
"severity": "urgent",
"metadata": {
"tenant_id": "uuid",
"po_id": "uuid",
"po_number": "PO-001",
"supplier_name": "Supplier ABC",
"total_amount": 5000.00,
"priority": "urgent",
"required_delivery_date": "2025-12-22",
"hours_pending": 48,
"escalated": true,
"action_required": true,
"action_url": "/app/comprar?po=uuid"
}
}
```
#### 4. Auto-Approval Summary (Notification)
**Event Type**: `procurement.auto_approval_summary`
**Type**: Notification (not alert)
**Trigger**: Daily summary of auto-approved POs
```json
{
"event_type": "procurement.auto_approval_summary",
"metadata": {
"tenant_id": "uuid",
"auto_approved_count": 5,
"total_auto_approved_amount": 8500.00,
"manual_approval_count": 2,
"summary_date": "2025-12-19",
"auto_approved_pos": [...],
"pending_approval_pos": [...],
"action_url": "/app/comprar"
}
}
```
#### 5. PO Approved Confirmation (Notification)
**Event Type**: `procurement.po_approved_confirmation`
**Type**: Notification (not alert)
**Trigger**: Purchase order approved
```json
{
"event_type": "procurement.po_approved_confirmation",
"metadata": {
"tenant_id": "uuid",
"po_id": "uuid",
"po_number": "PO-001",
"supplier_name": "Supplier ABC",
"total_amount": 3000.00,
"approved_by": "user@example.com",
"auto_approved": false,
"approved_at": "2025-12-19T14:30:00Z",
"action_url": "/app/comprar?po=uuid"
}
}
```
### Consumed Events
- **From Production**: Batch completion updates order fulfillment status
- **From Inventory**: Stock availability affects order confirmation
- **From Forecasting**: Demand forecasts inform production for pending orders
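On the consuming side, a sketch of how batch-completion events from the Production service might be handled (the exchange name, queue name, routing key, and `update_fulfillment_for_batch` handler are illustrative assumptions):
```python
import json

import aio_pika

async def consume_production_events(rabbitmq_url: str) -> None:
    """Illustrative consumer: update order fulfillment when a batch completes."""
    connection = await aio_pika.connect_robust(rabbitmq_url)
    channel = await connection.channel()
    exchange = await channel.declare_exchange(
        "production", aio_pika.ExchangeType.TOPIC, durable=True
    )
    queue = await channel.declare_queue("orders.production-events", durable=True)
    await queue.bind(exchange, routing_key="production.batch.completed")
    async with queue.iterator() as messages:
        async for message in messages:
            async with message.process():  # acks on success, rejects on unhandled exception
                event = json.loads(message.body)
                # e.g. mark order items linked to event["batch_id"] as prepared
                await update_fulfillment_for_batch(event)  # hypothetical handler
```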
## Custom Metrics (Prometheus)
```python
# Order metrics
orders_total = Counter(
'orders_total',
'Total orders created',
['tenant_id', 'order_type', 'order_source', 'status']
)
order_value_euros = Histogram(
'order_value_euros',
'Order value distribution',
['tenant_id', 'order_type'],
buckets=[10, 25, 50, 100, 200, 500, 1000, 2000, 5000]
)
# Customer metrics
customers_total = Gauge(
'customers_total',
'Total customers',
['tenant_id', 'customer_type']
)
customer_lifetime_value_euros = Histogram(
'customer_lifetime_value_euros',
'Customer lifetime value distribution',
['tenant_id', 'customer_type'],
buckets=[100, 500, 1000, 2000, 5000, 10000, 20000, 50000]
)
# Fulfillment metrics
order_fulfillment_time_hours = Histogram(
'order_fulfillment_time_hours',
'Time from order to fulfillment',
['tenant_id', 'order_type'],
buckets=[1, 6, 12, 24, 48, 72]
)
# Payment metrics
invoice_payment_time_days = Histogram(
'invoice_payment_time_days',
'Days from invoice to payment',
['tenant_id'],
buckets=[0, 7, 14, 21, 30, 45, 60, 90]
)
overdue_invoices_total = Gauge(
'overdue_invoices_total',
'Total overdue invoices',
['tenant_id']
)
```
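A hedged example of how these metrics might be recorded in the order-creation path (the label values are assumptions based on the definitions above):
```python
# Inside create_order(), once totals are known (illustrative only)
orders_total.labels(
    tenant_id=str(order.tenant_id),
    order_type=order.order_type,
    order_source=order.order_source or "unknown",
    status=order.status,
).inc()
order_value_euros.labels(
    tenant_id=str(order.tenant_id),
    order_type=order.order_type,
).observe(float(order.total_amount))
```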
## Configuration
### Environment Variables
**Service Configuration:**
- `PORT` - Service port (default: 8010)
- `DATABASE_URL` - PostgreSQL connection string
- `REDIS_URL` - Redis connection string
- `RABBITMQ_URL` - RabbitMQ connection string
**Order Configuration:**
- `AUTO_CONFIRM_RETAIL_ORDERS` - Auto-confirm retail orders (default: true)
- `ORDER_NUMBER_PREFIX` - Order number prefix (default: "ORD")
- `DEFAULT_TAX_RATE` - Default tax rate (default: 0.10 for Spain's 10% IVA)
- `ENABLE_RECURRING_ORDERS` - Enable recurring order generation (default: true)
**Payment Configuration:**
- `DEFAULT_CREDIT_TERMS_DAYS` - Default payment terms (default: 30)
- `OVERDUE_ALERT_THRESHOLD_DAYS` - Days before overdue alert (default: 7)
- `MAX_CREDIT_LIMIT` - Maximum credit limit per customer (default: 10000.00)
**Notification:**
- `SEND_ORDER_CONFIRMATION` - Send order confirmation to customer (default: true)
- `SEND_READY_NOTIFICATION` - Notify when order ready (default: true)
- `SEND_OVERDUE_REMINDERS` - Send overdue payment reminders (default: true)
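A minimal sketch of how these variables could be loaded with `pydantic-settings` (field names mirror the lists above; the real `app.core.config` module may differ):
```python
from decimal import Decimal

from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    # Service configuration
    PORT: int = 8010
    DATABASE_URL: str
    REDIS_URL: str
    RABBITMQ_URL: str

    # Order configuration
    AUTO_CONFIRM_RETAIL_ORDERS: bool = True
    ORDER_NUMBER_PREFIX: str = "ORD"
    DEFAULT_TAX_RATE: Decimal = Decimal("0.10")
    ENABLE_RECURRING_ORDERS: bool = True

    # Payment configuration
    DEFAULT_CREDIT_TERMS_DAYS: int = 30
    OVERDUE_ALERT_THRESHOLD_DAYS: int = 7
    MAX_CREDIT_LIMIT: Decimal = Decimal("10000.00")

    # Notifications
    SEND_ORDER_CONFIRMATION: bool = True
    SEND_READY_NOTIFICATION: bool = True
    SEND_OVERDUE_REMINDERS: bool = True

settings = Settings()  # each field can be overridden by an environment variable of the same name
```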
## Development Setup
### Prerequisites
- Python 3.11+
- PostgreSQL 17
- Redis 7.4
- RabbitMQ 4.1
### Local Development
```bash
cd services/orders
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
export DATABASE_URL=postgresql://user:pass@localhost:5432/orders
export REDIS_URL=redis://localhost:6379/0
export RABBITMQ_URL=amqp://guest:guest@localhost:5672/
alembic upgrade head
python main.py
```
## Integration Points
### Dependencies
- **Customers Service** - Customer data (if separate)
- **Products Service** - Product catalog and pricing
- **Inventory Service** - Stock availability checks
- **Production Service** - Production planning for orders
- **Auth Service** - User authentication
- **PostgreSQL** - Order and customer data
- **Redis** - Caching
- **RabbitMQ** - Event publishing
### Dependents
- **Production Service** - Orders trigger production planning
- **Inventory Service** - Orders reserve stock
- **Invoicing/Accounting** - Financial reporting
- **Notification Service** - Order confirmations and alerts
- **AI Insights Service** - Customer behavior analysis
- **Frontend Dashboard** - Order management UI
## Business Value for VUE Madrid
### Problem Statement
Spanish bakeries struggle with:
- Manual order tracking on paper or spreadsheets
- Lost orders and miscommunication (especially phone orders)
- No customer purchase history for relationship management
- Complex wholesale order management with multiple B2B clients
- Overdue payment tracking for credit accounts
- No analytics on customer behavior or product popularity
### Solution
Bakery-IA Orders Service provides:
- **Digital Order Management**: Capture all orders across channels
- **Customer Database**: Complete purchase history and preferences
- **B2B Automation**: Recurring orders and automated invoicing
- **Payment Tracking**: Monitor outstanding payments with alerts
- **Analytics**: Customer segmentation and product performance
### Quantifiable Impact
**Revenue Growth:**
- 10-20% revenue increase through improved B2B relationships
- 5-10% from reduced lost orders (99% order accuracy)
- 15-25% customer retention improvement with history tracking
- **Total: €300-600/month additional revenue per bakery**
**Time Savings:**
- 5-8 hours/week on order management and tracking
- 2-3 hours/week on invoicing and payment follow-up
- 1-2 hours/week on customer lookup and history
- **Total: 8-13 hours/week saved**
**Financial Performance:**
- 30% faster payment collection (overdue alerts)
- 50-70% time reduction on wholesale order processing
- 99%+ order accuracy vs. 85-90% manual
### Target Market Fit (Spanish Bakeries)
- **B2B Focus**: Many Spanish bakeries supply restaurants, hotels, cafés
- **Payment Terms**: Spanish B2B typically uses Net 30-60 payment terms
- **Relationship-Driven**: Customer history is critical in Spanish business culture
- **Regulatory**: Spanish tax law requires proper invoicing and records
### ROI Calculation
**Investment**: €0 additional (included in platform subscription)
**Monthly Value**: €300-600 additional revenue + cost savings
**Annual ROI**: €3,600-7,200 value per bakery
**Payback**: Immediate (included in subscription)
---
**Copyright © 2025 Bakery-IA. All rights reserved.**


@@ -0,0 +1,84 @@
# ================================================================
# services/orders/alembic.ini - Alembic Configuration
# ================================================================
[alembic]
# path to migration scripts
script_location = migrations
# template used to generate migration file names
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d_%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
timezone = Europe/Madrid
# max length of characters to apply to the
# "slug" field
truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
sourceless = false
# version of a migration file's filename format
version_num_format = %%s
# version path separator
version_path_separator = os
# set to 'true' to search source files recursively
# in each "version_locations" directory
recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
output_encoding = utf-8
# Database URL - will be overridden by environment variable or settings
sqlalchemy.url = postgresql+asyncpg://orders_user:password@orders-db-service:5432/orders_db
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts.
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S


@@ -0,0 +1,237 @@
# services/orders/app/api/audit.py
"""
Audit Logs API - Retrieve audit trail for orders service
"""
from fastapi import APIRouter, Depends, HTTPException, Query, Path, status
from typing import Optional, Dict, Any
from uuid import UUID
from datetime import datetime
import structlog
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from app.models import AuditLog
from shared.auth.decorators import get_current_user_dep
from shared.auth.access_control import require_user_role
from shared.routing import RouteBuilder
from shared.models.audit_log_schemas import (
AuditLogResponse,
AuditLogListResponse,
AuditLogStatsResponse
)
from app.core.database import database_manager
route_builder = RouteBuilder('orders')
router = APIRouter(tags=["audit-logs"])
logger = structlog.get_logger()
async def get_db():
"""Database session dependency"""
async with database_manager.get_session() as session:
yield session
@router.get(
route_builder.build_base_route("audit-logs"),
response_model=AuditLogListResponse
)
@require_user_role(['admin', 'owner'])
async def get_audit_logs(
tenant_id: UUID = Path(..., description="Tenant ID"),
start_date: Optional[datetime] = Query(None, description="Filter logs from this date"),
end_date: Optional[datetime] = Query(None, description="Filter logs until this date"),
user_id: Optional[UUID] = Query(None, description="Filter by user ID"),
action: Optional[str] = Query(None, description="Filter by action type"),
resource_type: Optional[str] = Query(None, description="Filter by resource type"),
severity: Optional[str] = Query(None, description="Filter by severity level"),
search: Optional[str] = Query(None, description="Search in description field"),
limit: int = Query(100, ge=1, le=1000, description="Number of records to return"),
offset: int = Query(0, ge=0, description="Number of records to skip"),
current_user: Dict[str, Any] = Depends(get_current_user_dep),
db: AsyncSession = Depends(get_db)
):
"""
Get audit logs for orders service.
Requires admin or owner role.
"""
try:
logger.info(
"Retrieving audit logs",
tenant_id=tenant_id,
user_id=current_user.get("user_id"),
filters={
"start_date": start_date,
"end_date": end_date,
"action": action,
"resource_type": resource_type,
"severity": severity
}
)
# Build query filters
filters = [AuditLog.tenant_id == tenant_id]
if start_date:
filters.append(AuditLog.created_at >= start_date)
if end_date:
filters.append(AuditLog.created_at <= end_date)
if user_id:
filters.append(AuditLog.user_id == user_id)
if action:
filters.append(AuditLog.action == action)
if resource_type:
filters.append(AuditLog.resource_type == resource_type)
if severity:
filters.append(AuditLog.severity == severity)
if search:
filters.append(AuditLog.description.ilike(f"%{search}%"))
# Count total matching records
count_query = select(func.count()).select_from(AuditLog).where(and_(*filters))
total_result = await db.execute(count_query)
total = total_result.scalar() or 0
# Fetch paginated results
query = (
select(AuditLog)
.where(and_(*filters))
.order_by(AuditLog.created_at.desc())
.limit(limit)
.offset(offset)
)
result = await db.execute(query)
audit_logs = result.scalars().all()
# Convert to response models
items = [AuditLogResponse.from_orm(log) for log in audit_logs]
logger.info(
"Successfully retrieved audit logs",
tenant_id=tenant_id,
total=total,
returned=len(items)
)
return AuditLogListResponse(
items=items,
total=total,
limit=limit,
offset=offset,
has_more=(offset + len(items)) < total
)
except Exception as e:
logger.error(
"Failed to retrieve audit logs",
error=str(e),
tenant_id=tenant_id
)
raise HTTPException(
status_code=500,
detail=f"Failed to retrieve audit logs: {str(e)}"
)
@router.get(
route_builder.build_base_route("audit-logs/stats"),
response_model=AuditLogStatsResponse
)
@require_user_role(['admin', 'owner'])
async def get_audit_log_stats(
tenant_id: UUID = Path(..., description="Tenant ID"),
start_date: Optional[datetime] = Query(None, description="Filter logs from this date"),
end_date: Optional[datetime] = Query(None, description="Filter logs until this date"),
current_user: Dict[str, Any] = Depends(get_current_user_dep),
db: AsyncSession = Depends(get_db)
):
"""
Get audit log statistics for orders service.
Requires admin or owner role.
"""
try:
logger.info(
"Retrieving audit log statistics",
tenant_id=tenant_id,
user_id=current_user.get("user_id")
)
# Build base filters
filters = [AuditLog.tenant_id == tenant_id]
if start_date:
filters.append(AuditLog.created_at >= start_date)
if end_date:
filters.append(AuditLog.created_at <= end_date)
# Total events
count_query = select(func.count()).select_from(AuditLog).where(and_(*filters))
total_result = await db.execute(count_query)
total_events = total_result.scalar() or 0
# Events by action
action_query = (
select(AuditLog.action, func.count().label('count'))
.where(and_(*filters))
.group_by(AuditLog.action)
)
action_result = await db.execute(action_query)
events_by_action = {row.action: row.count for row in action_result}
# Events by severity
severity_query = (
select(AuditLog.severity, func.count().label('count'))
.where(and_(*filters))
.group_by(AuditLog.severity)
)
severity_result = await db.execute(severity_query)
events_by_severity = {row.severity: row.count for row in severity_result}
# Events by resource type
resource_query = (
select(AuditLog.resource_type, func.count().label('count'))
.where(and_(*filters))
.group_by(AuditLog.resource_type)
)
resource_result = await db.execute(resource_query)
events_by_resource_type = {row.resource_type: row.count for row in resource_result}
# Date range
date_range_query = (
select(
func.min(AuditLog.created_at).label('min_date'),
func.max(AuditLog.created_at).label('max_date')
)
.where(and_(*filters))
)
date_result = await db.execute(date_range_query)
date_row = date_result.one()
logger.info(
"Successfully retrieved audit log statistics",
tenant_id=tenant_id,
total_events=total_events
)
return AuditLogStatsResponse(
total_events=total_events,
events_by_action=events_by_action,
events_by_severity=events_by_severity,
events_by_resource_type=events_by_resource_type,
date_range={
"min": date_row.min_date,
"max": date_row.max_date
}
)
except Exception as e:
logger.error(
"Failed to retrieve audit log statistics",
error=str(e),
tenant_id=tenant_id
)
raise HTTPException(
status_code=500,
detail=f"Failed to retrieve audit log statistics: {str(e)}"
)


@@ -0,0 +1,322 @@
# ================================================================
# services/orders/app/api/customers.py
# ================================================================
"""
Customers API endpoints - ATOMIC CRUD operations
"""
from typing import List
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Path, Query, status
import structlog
from shared.auth.decorators import get_current_user_dep
from shared.auth.access_control import require_user_role
from shared.routing import RouteBuilder
from shared.security import create_audit_logger, AuditSeverity, AuditAction
from app.core.database import get_db
from app.services.orders_service import OrdersService
from app.models import AuditLog
from app.schemas.order_schemas import (
CustomerCreate,
CustomerUpdate,
CustomerResponse
)
logger = structlog.get_logger()
audit_logger = create_audit_logger("orders-service", AuditLog)
# Create route builder for consistent URL structure
route_builder = RouteBuilder('orders')
router = APIRouter()
# ===== Dependency Injection =====
async def get_orders_service(db = Depends(get_db)) -> OrdersService:
"""Get orders service with dependencies"""
from app.repositories.order_repository import (
OrderRepository,
CustomerRepository,
OrderItemRepository,
OrderStatusHistoryRepository
)
from shared.clients import (
get_inventory_client,
get_production_client,
get_sales_client
)
return OrdersService(
order_repo=OrderRepository(),
customer_repo=CustomerRepository(),
order_item_repo=OrderItemRepository(),
status_history_repo=OrderStatusHistoryRepository(),
inventory_client=get_inventory_client(),
production_client=get_production_client(),
sales_client=get_sales_client()
)
# ===== Customer CRUD Endpoints =====
@router.post(
route_builder.build_base_route("customers"),
response_model=CustomerResponse,
status_code=status.HTTP_201_CREATED
)
@require_user_role(['admin', 'owner', 'member'])
async def create_customer(
customer_data: CustomerCreate,
tenant_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Create a new customer"""
try:
# Check if customer code already exists
existing_customer = await orders_service.customer_repo.get_by_customer_code(
db, customer_data.customer_code, tenant_id
)
if existing_customer:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Customer code already exists"
)
# Extract user ID safely
user_id = current_user.get("user_id")
if not user_id:
logger.error("User ID not found in current_user context", current_user_keys=list(current_user.keys()))
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="User authentication error"
)
customer = await orders_service.customer_repo.create(
db,
obj_in=customer_data,
created_by=UUID(user_id),
tenant_id=tenant_id
)
# Commit the transaction to persist changes
await db.commit()
logger.info("Customer created successfully",
customer_id=str(customer.id),
customer_code=customer.customer_code)
return CustomerResponse.from_orm(customer)
except HTTPException:
raise
except Exception as e:
logger.error("Error creating customer", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to create customer"
)
@router.get(
route_builder.build_base_route("customers"),
response_model=List[CustomerResponse]
)
async def get_customers(
tenant_id: UUID = Path(...),
active_only: bool = Query(True, description="Filter for active customers only"),
skip: int = Query(0, ge=0, description="Number of customers to skip"),
limit: int = Query(100, ge=1, le=1000, description="Number of customers to return"),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get customers with filtering and pagination"""
try:
if active_only:
customers = await orders_service.customer_repo.get_active_customers(
db, tenant_id, skip, limit
)
else:
customers = await orders_service.customer_repo.get_multi(
db, tenant_id, skip, limit, order_by="name"
)
return [CustomerResponse.from_orm(customer) for customer in customers]
except Exception as e:
logger.error("Error getting customers", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to retrieve customers"
)
@router.get(
route_builder.build_resource_detail_route("customers", "customer_id"),
response_model=CustomerResponse
)
async def get_customer(
tenant_id: UUID = Path(...),
customer_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get customer details by ID"""
try:
customer = await orders_service.customer_repo.get(db, customer_id, tenant_id)
if not customer:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Customer not found"
)
return CustomerResponse.from_orm(customer)
except HTTPException:
raise
except Exception as e:
logger.error("Error getting customer",
customer_id=str(customer_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to retrieve customer"
)
@router.put(
route_builder.build_resource_detail_route("customers", "customer_id"),
response_model=CustomerResponse
)
@require_user_role(['admin', 'owner', 'member'])
async def update_customer(
customer_data: CustomerUpdate,
tenant_id: UUID = Path(...),
customer_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Update customer information"""
try:
# Get existing customer
customer = await orders_service.customer_repo.get(db, customer_id, tenant_id)
if not customer:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Customer not found"
)
# Update customer
# Extract user ID safely for update
user_id = current_user.get("user_id")
if not user_id:
logger.error("User ID not found in current_user context for update", current_user_keys=list(current_user.keys()))
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="User authentication error"
)
updated_customer = await orders_service.customer_repo.update(
db,
db_obj=customer,
obj_in=customer_data.dict(exclude_unset=True),
updated_by=UUID(user_id)
)
# Commit the transaction to persist changes
await db.commit()
logger.info("Customer updated successfully",
customer_id=str(customer_id))
return CustomerResponse.from_orm(updated_customer)
except HTTPException:
raise
except Exception as e:
logger.error("Error updating customer",
customer_id=str(customer_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to update customer"
)
@router.delete(
route_builder.build_resource_detail_route("customers", "customer_id"),
status_code=status.HTTP_204_NO_CONTENT
)
@require_user_role(['admin', 'owner'])
async def delete_customer(
tenant_id: UUID = Path(...),
customer_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""
Delete a customer (Admin+ only, GDPR-compliant soft delete)
Removes PII while maintaining referential integrity
"""
try:
customer = await orders_service.customer_repo.get(db, customer_id, tenant_id)
if not customer:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Customer not found"
)
# Capture customer data before deletion (for audit trail)
# Note: This is anonymized after retention period in compliance with GDPR
customer_data = {
"customer_code": customer.customer_code,
"customer_name": customer.customer_name,
"email": customer.email,
"phone": customer.phone,
"business_type": customer.business_type if hasattr(customer, 'business_type') else None
}
await orders_service.customer_repo.delete(db, customer_id, tenant_id)
# Commit the transaction to persist deletion
await db.commit()
# Log HIGH severity audit event for customer deletion (GDPR compliance)
try:
await audit_logger.log_deletion(
db_session=db,
tenant_id=str(tenant_id),
user_id=current_user["user_id"],
resource_type="customer",
resource_id=str(customer_id),
resource_data=customer_data,
description=f"Admin {current_user.get('email', 'unknown')} deleted customer {customer_data['customer_code']} (GDPR-compliant soft delete)",
endpoint=f"/customers/{customer_id}",
method="DELETE",
severity=AuditSeverity.HIGH.value
)
except Exception as audit_error:
logger.warning("Failed to log audit event", error=str(audit_error))
logger.info("Customer deleted successfully (GDPR-compliant)",
customer_id=str(customer_id),
tenant_id=str(tenant_id),
user_id=current_user["user_id"])
except HTTPException:
raise
except Exception as e:
logger.error("Error deleting customer",
customer_id=str(customer_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to delete customer"
)


@@ -0,0 +1,455 @@
"""
Internal Demo Cloning API for Orders Service
Service-to-service endpoint for cloning order and customer data
"""
from fastapi import APIRouter, Depends, HTTPException, Header
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, delete, func
import structlog
import uuid
from datetime import datetime, timezone, timedelta, date
from typing import Optional
import os
from decimal import Decimal
import json
from pathlib import Path
from app.core.database import get_db
from app.models.order import CustomerOrder, OrderItem
from app.models.customer import Customer
from shared.utils.demo_dates import adjust_date_for_demo, resolve_time_marker, get_next_workday
from app.core.config import settings
logger = structlog.get_logger()
router = APIRouter(prefix="/internal/demo", tags=["internal"])
# Base demo tenant IDs
DEMO_TENANT_PROFESSIONAL = "a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6"
def parse_date_field(date_value, session_time: datetime, field_name: str = "date") -> Optional[datetime]:
"""
Parse date field, handling both ISO strings and BASE_TS markers.
Supports:
- BASE_TS markers: "BASE_TS + 1h30m", "BASE_TS - 2d"
- ISO 8601 strings: "2025-01-15T06:00:00Z"
- None values (returns None)
Returns timezone-aware datetime or None.
"""
if not date_value:
return None
# Check if it's a BASE_TS marker
if isinstance(date_value, str) and date_value.startswith("BASE_TS"):
try:
return resolve_time_marker(date_value, session_time)
except ValueError as e:
logger.warning(
f"Invalid BASE_TS marker in {field_name}",
marker=date_value,
error=str(e)
)
return None
# Handle regular ISO date strings
try:
if isinstance(date_value, str):
original_date = datetime.fromisoformat(date_value.replace('Z', '+00:00'))
elif hasattr(date_value, 'isoformat'):
original_date = date_value
else:
logger.warning(f"Unsupported date format in {field_name}", date_value=date_value)
return None
return adjust_date_for_demo(original_date, session_time)
except (ValueError, AttributeError) as e:
logger.warning(
f"Invalid date format in {field_name}",
date_value=date_value,
error=str(e)
)
return None
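# Note (illustrative, based on the docstring above): resolve_time_marker() is assumed to turn
# "BASE_TS + 1h30m" into session_time + timedelta(hours=1, minutes=30) and "BASE_TS - 2d"
# into session_time - timedelta(days=2), so seeded timestamps stay anchored to the moment
# the demo session was created.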
def ensure_workday(target_date: datetime) -> datetime:
"""Ensure delivery date falls on a workday (Monday-Friday)"""
if target_date and target_date.weekday() >= 5: # Saturday or Sunday
return get_next_workday(target_date)
return target_date
@router.post("/clone")
async def clone_demo_data(
base_tenant_id: str,
virtual_tenant_id: str,
demo_account_type: str,
session_id: Optional[str] = None,
session_created_at: Optional[str] = None,
db: AsyncSession = Depends(get_db)
):
"""
Clone orders service data for a virtual demo tenant
Clones:
- Customers
- Customer orders with line items
- Adjusts dates to recent timeframe
Args:
base_tenant_id: Template tenant UUID to clone from
virtual_tenant_id: Target virtual tenant UUID
demo_account_type: Type of demo account
session_id: Originating session ID for tracing
Returns:
Cloning status and record counts
"""
start_time = datetime.now(timezone.utc)
# Parse session creation time for date adjustment
if session_created_at:
try:
session_time = datetime.fromisoformat(session_created_at.replace('Z', '+00:00'))
except (ValueError, AttributeError):
session_time = start_time
else:
session_time = start_time
logger.info(
"Starting orders data cloning",
base_tenant_id=base_tenant_id,
virtual_tenant_id=virtual_tenant_id,
demo_account_type=demo_account_type,
session_id=session_id,
session_created_at=session_created_at
)
try:
# Validate UUIDs
base_uuid = uuid.UUID(base_tenant_id)
virtual_uuid = uuid.UUID(virtual_tenant_id)
# Track cloning statistics
stats = {
"customers": 0,
"customer_orders": 0,
"order_line_items": 0,
"alerts_generated": 0
}
# Customer ID mapping (old -> new)
customer_id_map = {}
# Load Customers from seed data
try:
from shared.utils.seed_data_paths import get_seed_data_path
if demo_account_type == "professional":
json_file = get_seed_data_path("professional", "08-orders.json")
elif demo_account_type == "enterprise":
json_file = get_seed_data_path("enterprise", "08-orders.json")
elif demo_account_type == "enterprise_child":
json_file = get_seed_data_path("enterprise", "08-orders.json", child_id=base_tenant_id)
else:
raise ValueError(f"Invalid demo account type: {demo_account_type}")
except ImportError:
# Fallback to original path
seed_data_dir = Path(__file__).parent.parent.parent.parent / "infrastructure" / "seed-data"
if demo_account_type == "professional":
json_file = seed_data_dir / "professional" / "08-orders.json"
elif demo_account_type == "enterprise":
json_file = seed_data_dir / "enterprise" / "parent" / "08-orders.json"
elif demo_account_type == "enterprise_child":
json_file = seed_data_dir / "enterprise" / "children" / base_tenant_id / "08-orders.json"
else:
raise ValueError(f"Invalid demo account type: {demo_account_type}")
if not json_file.exists():
raise HTTPException(
status_code=404,
detail=f"Seed data file not found: {json_file}"
)
# Load JSON data
with open(json_file, 'r', encoding='utf-8') as f:
seed_data = json.load(f)
logger.info(
"Loaded orders seed data",
customers=len(seed_data.get('customers', [])),
orders=len(seed_data.get('customer_orders', []))
)
# Load Customers from seed data
for customer_data in seed_data.get('customers', []):
# Transform IDs using XOR
from shared.utils.demo_id_transformer import transform_id
try:
customer_uuid = uuid.UUID(customer_data['id'])
transformed_id = transform_id(customer_data['id'], virtual_uuid)
except ValueError as e:
logger.error("Failed to parse customer UUID",
customer_id=customer_data['id'],
error=str(e))
continue
customer_id_map[uuid.UUID(customer_data['id'])] = transformed_id
new_customer = Customer(
id=transformed_id,
tenant_id=virtual_uuid,
customer_code=customer_data.get('customer_code'),
name=customer_data.get('name'),
business_name=customer_data.get('business_name'),
customer_type=customer_data.get('customer_type'),
tax_id=customer_data.get('tax_id'),
email=customer_data.get('email'),
phone=customer_data.get('phone'),
address_line1=customer_data.get('address_line1'),
address_line2=customer_data.get('address_line2'),
city=customer_data.get('city'),
state=customer_data.get('state'),
postal_code=customer_data.get('postal_code'),
country=customer_data.get('country'),
business_license=customer_data.get('business_license'),
is_active=customer_data.get('is_active', True),
preferred_delivery_method=customer_data.get('preferred_delivery_method'),
payment_terms=customer_data.get('payment_terms'),
credit_limit=customer_data.get('credit_limit', 0.0),
discount_percentage=customer_data.get('discount_percentage', 0.0),
customer_segment=customer_data.get('customer_segment'),
priority_level=customer_data.get('priority_level'),
special_instructions=customer_data.get('special_instructions'),
delivery_preferences=customer_data.get('delivery_preferences'),
product_preferences=customer_data.get('product_preferences'),
total_orders=customer_data.get('total_orders', 0),
total_spent=customer_data.get('total_spent', 0.0),
average_order_value=customer_data.get('average_order_value', 0.0),
last_order_date=parse_date_field(
customer_data.get('last_order_date'),
session_time,
"last_order_date"
),
created_at=session_time,
updated_at=session_time
)
db.add(new_customer)
stats["customers"] += 1
# Load Customer Orders from seed data
order_id_map = {}
for order_data in seed_data.get('customer_orders', []):
# Transform IDs using XOR
from shared.utils.demo_id_transformer import transform_id
try:
order_uuid = uuid.UUID(order_data['id'])
transformed_id = transform_id(order_data['id'], virtual_uuid)
except ValueError as e:
logger.error("Failed to parse order UUID",
order_id=order_data['id'],
error=str(e))
continue
order_id_map[uuid.UUID(order_data['id'])] = transformed_id
# Map customer_id if it exists in our map
customer_id_value = order_data.get('customer_id')
if customer_id_value:
customer_id_value = customer_id_map.get(uuid.UUID(customer_id_value), uuid.UUID(customer_id_value))
# Parse date fields (supports BASE_TS markers and ISO timestamps)
adjusted_order_date = parse_date_field(
order_data.get('order_date'),
session_time,
"order_date"
) or session_time
# Handle delivery date - JSON uses 'delivery_date', database uses 'requested_delivery_date'
# Fallback to order_date + 2 hours if no delivery date is provided
delivery_date_value = order_data.get('delivery_date') or order_data.get('requested_delivery_date')
adjusted_requested_delivery = parse_date_field(
delivery_date_value,
session_time,
"requested_delivery_date"
)
# Ensure requested_delivery_date is never None - fallback to order_date + 2 hours
if adjusted_requested_delivery is None:
adjusted_requested_delivery = adjusted_order_date + timedelta(hours=2)
# Create new order from seed data
# Generate unique order number by appending tenant-specific suffix
base_order_number = order_data.get('order_number', f"ORD-{uuid.uuid4().hex[:8].upper()}")
# Add tenant-specific suffix to ensure global uniqueness
tenant_suffix = virtual_uuid.hex[:4].upper() # Use first 4 chars of tenant ID
unique_order_number = f"{base_order_number}-{tenant_suffix}"
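# e.g. a seed order number "ORD-2024-0001" for a tenant whose UUID starts with 1a2b...
# becomes "ORD-2024-0001-1A2B" (illustrative values)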
new_order = CustomerOrder(
id=str(transformed_id),
tenant_id=virtual_uuid,
order_number=unique_order_number,
customer_id=str(customer_id_value) if customer_id_value else None,
status=order_data.get('status', 'pending'),
order_type=order_data.get('order_type', 'standard'),
priority=order_data.get('priority', 'normal'),
order_date=adjusted_order_date,
requested_delivery_date=adjusted_requested_delivery,
delivery_method=order_data.get('delivery_method'),
delivery_address=order_data.get('delivery_address'),
delivery_instructions=order_data.get('delivery_instructions'),
subtotal=order_data.get('subtotal', 0.0),
tax_amount=order_data.get('tax_amount', 0.0),
discount_amount=order_data.get('discount_amount', 0.0),
discount_percentage=order_data.get('discount_percentage', 0.0),
delivery_fee=order_data.get('delivery_fee', 0.0),
total_amount=order_data.get('total_amount', 0.0),
payment_status=order_data.get('payment_status', 'pending'),
payment_method=order_data.get('payment_method'),
payment_terms=order_data.get('payment_terms'),
special_instructions=order_data.get('special_instructions'),
order_source=order_data.get('order_source', 'manual'),
sales_channel=order_data.get('sales_channel', 'direct'),
created_at=session_time,
updated_at=session_time
)
db.add(new_order)
stats["customer_orders"] += 1
# Clone Order Items
for old_order_id, new_order_id in order_id_map.items():
result = await db.execute(
select(OrderItem).where(OrderItem.order_id == old_order_id)
)
order_items = result.scalars().all()
for item in order_items:
new_item = OrderItem(
id=uuid.uuid4(),
order_id=new_order_id,
product_id=item.product_id,
product_name=item.product_name,
product_sku=item.product_sku,
quantity=item.quantity,
unit_of_measure=item.unit_of_measure,
unit_price=item.unit_price,
line_discount=item.line_discount,
line_total=item.line_total,
status=item.status
)
db.add(new_item)
stats["order_line_items"] += 1
# Commit cloned data
await db.commit()
# NOTE: Alert generation removed - alerts are now generated automatically by the
# respective alert services which run scheduled checks at appropriate intervals.
# This eliminates duplicate alerts and provides a more realistic demo experience.
stats["alerts_generated"] = 0
total_records = stats["customers"] + stats["customer_orders"] + stats["order_line_items"]
duration_ms = int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000)
logger.info(
"Orders data cloning completed",
virtual_tenant_id=virtual_tenant_id,
total_records=total_records,
stats=stats,
duration_ms=duration_ms
)
return {
"service": "orders",
"status": "completed",
"records_cloned": total_records,
"duration_ms": duration_ms,
"details": stats
}
except ValueError as e:
logger.error("Invalid UUID format", error=str(e))
raise HTTPException(status_code=400, detail=f"Invalid UUID: {str(e)}")
except Exception as e:
logger.error(
"Failed to clone orders data",
error=str(e),
virtual_tenant_id=virtual_tenant_id,
exc_info=True
)
# Rollback on error
await db.rollback()
return {
"service": "orders",
"status": "failed",
"records_cloned": 0,
"duration_ms": int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000),
"error": str(e)
}
@router.get("/clone/health")
async def clone_health_check():
"""
Health check for the internal cloning endpoint
Used by the orchestrator to verify service availability

"""
return {
"service": "orders",
"clone_endpoint": "available",
"version": "2.0.0"
}
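# Illustrative orchestration flow (assumed; the demo orchestrator drives these endpoints):
#   1. GET    /clone/health                  - confirm the service can accept clone requests
#   2. POST   clone endpoint (handler above) - seed customers, orders and order items for the virtual tenant
#   3. DELETE /tenant/{virtual_tenant_id}    - tear the demo data down again (handler below)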
@router.delete("/tenant/{virtual_tenant_id}")
async def delete_demo_data(
virtual_tenant_id: str,
db: AsyncSession = Depends(get_db)
):
"""Delete all order data for a virtual demo tenant"""
logger.info("Deleting order data for virtual tenant", virtual_tenant_id=virtual_tenant_id)
start_time = datetime.now(timezone.utc)
try:
virtual_uuid = uuid.UUID(virtual_tenant_id)
# Count records
order_count = await db.scalar(select(func.count(CustomerOrder.id)).where(CustomerOrder.tenant_id == virtual_uuid))
item_count = await db.scalar(select(func.count(OrderItem.id)).join(CustomerOrder).where(CustomerOrder.tenant_id == virtual_uuid))
customer_count = await db.scalar(select(func.count(Customer.id)).where(Customer.tenant_id == virtual_uuid))
# Delete in order
await db.execute(delete(OrderItem).where(OrderItem.order_id.in_(
select(CustomerOrder.id).where(CustomerOrder.tenant_id == virtual_uuid)
)))
await db.execute(delete(CustomerOrder).where(CustomerOrder.tenant_id == virtual_uuid))
await db.execute(delete(Customer).where(Customer.tenant_id == virtual_uuid))
await db.commit()
duration_ms = int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000)
logger.info("Order data deleted successfully", virtual_tenant_id=virtual_tenant_id, duration_ms=duration_ms)
return {
"service": "orders",
"status": "deleted",
"virtual_tenant_id": virtual_tenant_id,
"records_deleted": {
"orders": order_count,
"items": item_count,
"customers": customer_count,
"total": order_count + item_count + customer_count
},
"duration_ms": duration_ms
}
except Exception as e:
logger.error("Failed to delete order data", error=str(e), exc_info=True)
await db.rollback()
raise HTTPException(status_code=500, detail=str(e))


@@ -0,0 +1,237 @@
# ================================================================
# services/orders/app/api/order_operations.py
# ================================================================
"""
Order Operations API endpoints - BUSINESS logic operations
Includes status updates, demand calculation, dashboard, and business intelligence
"""
from datetime import date, datetime
from typing import List, Optional
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Path, Query, status
import structlog
from shared.auth.decorators import get_current_user_dep
from shared.routing import RouteBuilder
from app.core.database import get_db
from app.services.orders_service import OrdersService
from app.schemas.order_schemas import (
OrderResponse,
OrdersDashboardSummary,
DemandRequirements
)
logger = structlog.get_logger()
# Create route builder for consistent URL structure
route_builder = RouteBuilder('orders')
router = APIRouter()
# ===== Dependency Injection =====
async def get_orders_service(db = Depends(get_db)) -> OrdersService:
"""Get orders service with dependencies"""
from app.repositories.order_repository import (
OrderRepository,
CustomerRepository,
OrderItemRepository,
OrderStatusHistoryRepository
)
from shared.clients import (
get_inventory_client,
get_production_client,
get_sales_client
)
return OrdersService(
order_repo=OrderRepository(),
customer_repo=CustomerRepository(),
order_item_repo=OrderItemRepository(),
status_history_repo=OrderStatusHistoryRepository(),
inventory_client=get_inventory_client(),
production_client=get_production_client(),
sales_client=get_sales_client()
)
# ===== Dashboard and Analytics Endpoints =====
@router.get(
route_builder.build_operations_route("dashboard-summary"),
response_model=OrdersDashboardSummary
)
async def get_dashboard_summary(
tenant_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get comprehensive dashboard summary for orders"""
try:
summary = await orders_service.get_dashboard_summary(db, tenant_id)
logger.info("Dashboard summary retrieved",
tenant_id=str(tenant_id),
total_orders=summary.total_orders_today)
return summary
except Exception as e:
logger.error("Error getting dashboard summary",
tenant_id=str(tenant_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to retrieve dashboard summary"
)
@router.get(
route_builder.build_base_route("demand-requirements"),
response_model=DemandRequirements
)
async def get_demand_requirements(
tenant_id: UUID = Path(...),
target_date: date = Query(..., description="Date for demand analysis"),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get demand requirements for production planning"""
try:
requirements = await orders_service.get_demand_requirements(db, tenant_id, target_date)
logger.info("Demand requirements calculated",
tenant_id=str(tenant_id),
target_date=str(target_date),
total_orders=requirements.total_orders)
return requirements
except Exception as e:
logger.error("Error getting demand requirements",
tenant_id=str(tenant_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to calculate demand requirements"
)
# ===== Order Status Management =====
@router.put(
route_builder.build_base_route("{order_id}/status"),
response_model=OrderResponse
)
async def update_order_status(
new_status: str,
tenant_id: UUID = Path(...),
order_id: UUID = Path(...),
reason: Optional[str] = Query(None, description="Reason for status change"),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Update order status with validation and history tracking"""
try:
# Validate status
valid_statuses = [
"pending", "confirmed", "in_production", "ready",
"out_for_delivery", "delivered", "cancelled", "failed"
]
if new_status not in valid_statuses:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=f"Invalid status. Must be one of: {', '.join(valid_statuses)}"
)
order = await orders_service.update_order_status(
db,
order_id,
tenant_id,
new_status,
user_id=UUID(current_user["user_id"]),
reason=reason
)
if not order:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Order not found"
)
logger.info("Order status updated",
order_id=str(order_id),
new_status=new_status)
return order
except HTTPException:
raise
except Exception as e:
logger.error("Error updating order status",
order_id=str(order_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to update order status"
)
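# Illustrative call (tenant-scoped URL prefix produced by RouteBuilder omitted):
#   PUT .../orders/{order_id}/status?new_status=confirmed&reason=Payment%20received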
# ===== Business Intelligence Endpoints =====
@router.get(
route_builder.build_base_route("business-model")
)
async def detect_business_model(
tenant_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Detect business model based on order patterns"""
try:
business_model = await orders_service.detect_business_model(db, tenant_id)
return {
"business_model": business_model,
"confidence": "high" if business_model else "unknown",
"detected_at": datetime.now().isoformat()
}
except Exception as e:
logger.error("Error detecting business model", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to detect business model"
)
# ===== Health and Status Endpoints =====
@router.get(
route_builder.build_base_route("status")
)
async def get_service_status(
tenant_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep)
):
"""Get orders service status"""
try:
return {
"service": "orders-service",
"status": "healthy",
"timestamp": datetime.now().isoformat(),
"tenant_id": str(tenant_id)
}
except Exception as e:
logger.error("Error getting service status", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to get service status"
)


@@ -0,0 +1,405 @@
# ================================================================
# services/orders/app/api/orders.py
# ================================================================
"""
Orders API endpoints - ATOMIC CRUD operations only
"""
from datetime import date
from typing import List, Optional
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Path, Query, status
from sqlalchemy.ext.asyncio import AsyncSession
import structlog
from shared.auth.decorators import get_current_user_dep
from shared.auth.access_control import require_user_role
from shared.routing import RouteBuilder
from shared.security import create_audit_logger, AuditSeverity, AuditAction
from app.core.database import get_db
from app.services.orders_service import OrdersService
from app.models import AuditLog
from app.schemas.order_schemas import (
OrderCreate,
OrderUpdate,
OrderResponse
)
logger = structlog.get_logger()
audit_logger = create_audit_logger("orders-service", AuditLog)
# Create route builder for consistent URL structure
route_builder = RouteBuilder('orders')
router = APIRouter()
# ===== Dependency Injection =====
async def get_orders_service(db = Depends(get_db)) -> OrdersService:
"""Get orders service with dependencies"""
from app.repositories.order_repository import (
OrderRepository,
CustomerRepository,
OrderItemRepository,
OrderStatusHistoryRepository
)
from shared.clients import (
get_inventory_client,
get_production_client,
get_sales_client
)
return OrdersService(
order_repo=OrderRepository(),
customer_repo=CustomerRepository(),
order_item_repo=OrderItemRepository(),
status_history_repo=OrderStatusHistoryRepository(),
inventory_client=get_inventory_client(),
production_client=get_production_client(),
sales_client=get_sales_client()
)
# ===== Order CRUD Endpoints =====
@router.post(
route_builder.build_base_route("").rstrip("/"),
response_model=OrderResponse,
status_code=status.HTTP_201_CREATED
)
@require_user_role(['admin', 'owner', 'member'])
async def create_order(
order_data: OrderCreate,
tenant_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Create a new customer order"""
try:
# Extract user ID safely
user_id = current_user.get("user_id")
if not user_id:
logger.error("User ID not found in current_user context", current_user_keys=list(current_user.keys()))
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="User authentication error"
)
order = await orders_service.create_order(
db,
order_data,
user_id=UUID(user_id)
)
# Commit the transaction to persist changes
await db.commit()
logger.info("Order created successfully",
order_id=str(order.id),
order_number=order.order_number)
return order
except ValueError as e:
logger.warning("Invalid order data", error=str(e))
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=str(e)
)
except Exception as e:
logger.error("Error creating order", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to create order"
)
@router.get(
route_builder.build_base_route("").rstrip("/"),
response_model=List[OrderResponse]
)
async def get_orders(
tenant_id: UUID = Path(...),
status_filter: Optional[str] = Query(None, description="Filter by order status"),
start_date: Optional[date] = Query(None, description="Start date for date range filter"),
end_date: Optional[date] = Query(None, description="End date for date range filter"),
skip: int = Query(0, ge=0, description="Number of orders to skip"),
limit: int = Query(100, ge=1, le=1000, description="Number of orders to return"),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get orders with filtering and pagination"""
try:
# Determine which repository method to use based on filters
if status_filter:
orders = await orders_service.order_repo.get_orders_by_status(
db, tenant_id, status_filter, skip, limit
)
elif start_date and end_date:
orders = await orders_service.order_repo.get_orders_by_date_range(
db, tenant_id, start_date, end_date, skip, limit
)
else:
orders = await orders_service.order_repo.get_multi(
db, tenant_id, skip, limit, order_by="order_date", order_desc=True
)
return [OrderResponse.from_orm(order) for order in orders]
except Exception as e:
logger.error("Error getting orders", error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to retrieve orders"
)
@router.get(
route_builder.build_base_route("{order_id}"),
response_model=OrderResponse
)
async def get_order(
tenant_id: UUID = Path(...),
order_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Get order details with items"""
try:
order = await orders_service.get_order_with_items(db, order_id, tenant_id)
if not order:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Order not found"
)
return order
except HTTPException:
raise
except Exception as e:
logger.error("Error getting order",
order_id=str(order_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to retrieve order"
)
@router.put(
route_builder.build_base_route("{order_id}"),
response_model=OrderResponse
)
@require_user_role(['admin', 'owner', 'member'])
async def update_order(
order_data: OrderUpdate,
tenant_id: UUID = Path(...),
order_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Update order information"""
try:
# Get existing order
order = await orders_service.order_repo.get(db, order_id, tenant_id)
if not order:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Order not found"
)
# Update order
updated_order = await orders_service.order_repo.update(
db,
db_obj=order,
obj_in=order_data.dict(exclude_unset=True),
updated_by=UUID(current_user["user_id"])
)
# Commit the transaction to persist changes
await db.commit()
logger.info("Order updated successfully",
order_id=str(order_id))
return OrderResponse.from_orm(updated_order)
except HTTPException:
raise
except Exception as e:
logger.error("Error updating order",
order_id=str(order_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to update order"
)
@router.delete(
route_builder.build_base_route("{order_id}"),
status_code=status.HTTP_204_NO_CONTENT
)
@require_user_role(['admin', 'owner'])
async def delete_order(
tenant_id: UUID = Path(...),
order_id: UUID = Path(...),
current_user: dict = Depends(get_current_user_dep),
orders_service: OrdersService = Depends(get_orders_service),
db = Depends(get_db)
):
"""Delete an order (Admin+ only, soft delete)"""
try:
order = await orders_service.order_repo.get(db, order_id, tenant_id)
if not order:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Order not found"
)
# Capture order data before deletion
order_data = {
"order_number": order.order_number,
"customer_id": str(order.customer_id) if order.customer_id else None,
"order_status": order.order_status,
"total_amount": float(order.total_amount) if order.total_amount else 0.0,
"order_date": order.order_date.isoformat() if order.order_date else None
}
await orders_service.order_repo.delete(db, order_id, tenant_id)
# Commit the transaction to persist deletion
await db.commit()
# Log audit event for order deletion
try:
await audit_logger.log_deletion(
db_session=db,
tenant_id=str(tenant_id),
user_id=current_user["user_id"],
resource_type="order",
resource_id=str(order_id),
resource_data=order_data,
description=f"Admin {current_user.get('email', 'unknown')} deleted order {order_data['order_number']}",
endpoint=f"/orders/{order_id}",
method="DELETE"
)
except Exception as audit_error:
logger.warning("Failed to log audit event", error=str(audit_error))
logger.info("Order deleted successfully",
order_id=str(order_id),
tenant_id=str(tenant_id),
user_id=current_user["user_id"])
except HTTPException:
raise
except Exception as e:
logger.error("Error deleting order",
order_id=str(order_id),
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to delete order"
)
# ===== Tenant Data Deletion Endpoint =====
@router.delete(
route_builder.build_base_route("tenant/{tenant_id}", include_tenant_prefix=False),
status_code=status.HTTP_200_OK
)
async def delete_tenant_data(
tenant_id: str = Path(..., description="Tenant ID"),
current_user: dict = Depends(get_current_user_dep),
db: AsyncSession = Depends(get_db)
):
"""
Delete all order-related data for a tenant
Only accessible by internal services (called during tenant deletion)
"""
logger.info("Tenant data deletion request received",
tenant_id=tenant_id,
requesting_service=current_user.get("service", "unknown"))
# Only allow internal service calls
if current_user.get("type") != "service":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="This endpoint is only accessible to internal services"
)
try:
from app.services.tenant_deletion_service import OrdersTenantDeletionService
deletion_service = OrdersTenantDeletionService(db)
result = await deletion_service.safe_delete_tenant_data(tenant_id)
return {
"message": "Tenant data deletion completed in orders-service",
"summary": result.to_dict()
}
except Exception as e:
logger.error("Tenant data deletion failed",
tenant_id=tenant_id,
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to delete tenant data: {str(e)}"
)
@router.get(
route_builder.build_base_route("tenant/{tenant_id}/deletion-preview", include_tenant_prefix=False),
status_code=status.HTTP_200_OK
)
async def preview_tenant_data_deletion(
tenant_id: str = Path(..., description="Tenant ID"),
current_user: dict = Depends(get_current_user_dep),
db: AsyncSession = Depends(get_db)
):
"""
Preview what data would be deleted for a tenant (dry-run)
Accessible by internal services and tenant admins
"""
# Allow internal services and admins
is_service = current_user.get("type") == "service"
is_admin = current_user.get("role") in ["owner", "admin"]
if not (is_service or is_admin):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="Insufficient permissions"
)
try:
from app.services.tenant_deletion_service import OrdersTenantDeletionService
deletion_service = OrdersTenantDeletionService(db)
preview = await deletion_service.get_tenant_data_preview(tenant_id)
return {
"tenant_id": tenant_id,
"service": "orders-service",
"data_counts": preview,
"total_items": sum(preview.values())
}
except Exception as e:
logger.error("Deletion preview failed",
tenant_id=tenant_id,
error=str(e))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to get deletion preview: {str(e)}"
)


@@ -0,0 +1,87 @@
# ================================================================
# services/orders/app/core/config.py
# ================================================================
"""
Orders Service Configuration
"""
import os
from shared.config.base import BaseServiceSettings
class OrdersSettings(BaseServiceSettings):
"""Orders service specific settings"""
# Service Identity
APP_NAME: str = "Orders Service"
SERVICE_NAME: str = "orders-service"
VERSION: str = "1.0.0"
DESCRIPTION: str = "Customer orders and procurement planning"
# Database configuration (secure approach - build from components)
@property
def DATABASE_URL(self) -> str:
"""Build database URL from secure components"""
# Try complete URL first (for backward compatibility)
complete_url = os.getenv("ORDERS_DATABASE_URL")
if complete_url:
return complete_url
# Build from components (secure approach)
user = os.getenv("ORDERS_DB_USER", "orders_user")
password = os.getenv("ORDERS_DB_PASSWORD", "orders_pass123")
host = os.getenv("ORDERS_DB_HOST", "localhost")
port = os.getenv("ORDERS_DB_PORT", "5432")
name = os.getenv("ORDERS_DB_NAME", "orders_db")
return f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{name}"
# Order Processing
ORDER_PROCESSING_ENABLED: bool = os.getenv("ORDER_PROCESSING_ENABLED", "true").lower() == "true"
AUTO_APPROVE_ORDERS: bool = os.getenv("AUTO_APPROVE_ORDERS", "false").lower() == "true"
MAX_ORDER_ITEMS: int = int(os.getenv("MAX_ORDER_ITEMS", "50"))
# Procurement Planning
PROCUREMENT_PLANNING_ENABLED: bool = os.getenv("PROCUREMENT_PLANNING_ENABLED", "true").lower() == "true"
PROCUREMENT_LEAD_TIME_DAYS: int = int(os.getenv("PROCUREMENT_LEAD_TIME_DAYS", "3"))
DEMAND_FORECAST_DAYS: int = int(os.getenv("DEMAND_FORECAST_DAYS", "14"))
SAFETY_STOCK_PERCENTAGE: float = float(os.getenv("SAFETY_STOCK_PERCENTAGE", "20.0"))
# Business Model Detection
ENABLE_BUSINESS_MODEL_DETECTION: bool = os.getenv("ENABLE_BUSINESS_MODEL_DETECTION", "true").lower() == "true"
CENTRAL_BAKERY_ORDER_THRESHOLD: int = int(os.getenv("CENTRAL_BAKERY_ORDER_THRESHOLD", "20"))
INDIVIDUAL_BAKERY_ORDER_THRESHOLD: int = int(os.getenv("INDIVIDUAL_BAKERY_ORDER_THRESHOLD", "5"))
# Customer Management
CUSTOMER_VALIDATION_ENABLED: bool = os.getenv("CUSTOMER_VALIDATION_ENABLED", "true").lower() == "true"
MAX_CUSTOMERS_PER_TENANT: int = int(os.getenv("MAX_CUSTOMERS_PER_TENANT", "10000"))
CUSTOMER_CREDIT_CHECK_ENABLED: bool = os.getenv("CUSTOMER_CREDIT_CHECK_ENABLED", "false").lower() == "true"
# Order Validation
MIN_ORDER_VALUE: float = float(os.getenv("MIN_ORDER_VALUE", "0.0"))
MAX_ORDER_VALUE: float = float(os.getenv("MAX_ORDER_VALUE", "100000.0"))
VALIDATE_PRODUCT_AVAILABILITY: bool = os.getenv("VALIDATE_PRODUCT_AVAILABILITY", "true").lower() == "true"
# Payment and Pricing
PAYMENT_VALIDATION_ENABLED: bool = os.getenv("PAYMENT_VALIDATION_ENABLED", "true").lower() == "true"
DYNAMIC_PRICING_ENABLED: bool = os.getenv("DYNAMIC_PRICING_ENABLED", "false").lower() == "true"
DISCOUNT_ENABLED: bool = os.getenv("DISCOUNT_ENABLED", "true").lower() == "true"
MAX_DISCOUNT_PERCENTAGE: float = float(os.getenv("MAX_DISCOUNT_PERCENTAGE", "50.0"))
# Delivery and Fulfillment
DELIVERY_TRACKING_ENABLED: bool = os.getenv("DELIVERY_TRACKING_ENABLED", "true").lower() == "true"
DEFAULT_DELIVERY_WINDOW_HOURS: int = int(os.getenv("DEFAULT_DELIVERY_WINDOW_HOURS", "48"))
PICKUP_ENABLED: bool = os.getenv("PICKUP_ENABLED", "true").lower() == "true"
DELIVERY_ENABLED: bool = os.getenv("DELIVERY_ENABLED", "true").lower() == "true"
# Integration Settings
PRODUCTION_SERVICE_URL: str = os.getenv("PRODUCTION_SERVICE_URL", "http://production-service:8000")
INVENTORY_SERVICE_URL: str = os.getenv("INVENTORY_SERVICE_URL", "http://inventory-service:8000")
SUPPLIERS_SERVICE_URL: str = os.getenv("SUPPLIERS_SERVICE_URL", "http://suppliers-service:8000")
SALES_SERVICE_URL: str = os.getenv("SALES_SERVICE_URL", "http://sales-service:8000")
# Global settings instance
settings = OrdersSettings()
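# Illustrative database URL composition (assumed environment values):
#   ORDERS_DB_USER=orders_user ORDERS_DB_HOST=orders-db ORDERS_DB_PORT=5432 ORDERS_DB_NAME=orders_db
#   -> postgresql+asyncpg://orders_user:<password>@orders-db:5432/orders_db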


@@ -0,0 +1,90 @@
# ================================================================
# services/orders/app/core/database.py
# ================================================================
"""
Orders Service Database Configuration
"""
from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker, AsyncSession
import structlog
from typing import AsyncGenerator
from app.core.config import settings
logger = structlog.get_logger()
# Create async engine
async_engine = create_async_engine(
settings.DATABASE_URL,
echo=settings.DEBUG,
pool_size=10,
max_overflow=20,
pool_pre_ping=True,
pool_recycle=3600
)
# Create async session factory
AsyncSessionLocal = async_sessionmaker(
bind=async_engine,
class_=AsyncSession,
expire_on_commit=False
)
# Base class for models - re-export the shared declarative base so that the models
# defined in app.models register on the same metadata used by init_database() below
from shared.database.base import Base
async def get_db() -> AsyncGenerator[AsyncSession, None]:
"""Get database session"""
async with AsyncSessionLocal() as session:
try:
yield session
except Exception as e:
await session.rollback()
logger.error("Database session error", error=str(e))
raise
finally:
await session.close()
async def init_database():
"""Initialize database tables"""
try:
async with async_engine.begin() as conn:
# Import all models to ensure they are registered
from app.models.order import CustomerOrder, OrderItem, OrderStatusHistory
from app.models.customer import Customer, CustomerContact
# Create all tables
await conn.run_sync(Base.metadata.create_all)
logger.info("Orders database initialized successfully")
except Exception as e:
logger.error("Failed to initialize orders database", error=str(e))
raise
async def get_db_health() -> bool:
"""Check database health"""
try:
async with async_engine.begin() as conn:
await conn.execute(text("SELECT 1"))
return True
except Exception as e:
logger.error("Database health check failed", error=str(e))
return False
# Database manager instance for service_base compatibility
from shared.database.base import DatabaseManager
database_manager = DatabaseManager(
database_url=settings.DATABASE_URL,
service_name="orders-service",
pool_size=10,
max_overflow=20,
echo=settings.DEBUG
)

services/orders/app/main.py

@@ -0,0 +1,139 @@
# ================================================================
# services/orders/app/main.py
# ================================================================
"""
Orders Service - FastAPI Application
Customer orders management service
"""
from fastapi import FastAPI, Request
from sqlalchemy import text
from app.core.config import settings
from app.core.database import database_manager
from app.api.orders import router as orders_router
from app.api.customers import router as customers_router
from app.api.order_operations import router as order_operations_router
from app.api import audit, internal_demo
from shared.service_base import StandardFastAPIService
class OrdersService(StandardFastAPIService):
"""Orders Service with standardized setup"""
expected_migration_version = "00001"
async def verify_migrations(self):
"""Verify database schema matches the latest migrations."""
try:
async with self.database_manager.get_session() as session:
result = await session.execute(text("SELECT version_num FROM alembic_version"))
version = result.scalar()
if version != self.expected_migration_version:
self.logger.error(f"Migration version mismatch: expected {self.expected_migration_version}, got {version}")
raise RuntimeError(f"Migration version mismatch: expected {self.expected_migration_version}, got {version}")
self.logger.info(f"Migration verification successful: {version}")
except Exception as e:
self.logger.error(f"Migration verification failed: {e}")
raise
def __init__(self):
# Define expected database tables for health checks
orders_expected_tables = [
'customers', 'customer_contacts', 'customer_orders', 'order_items',
'order_status_history', 'audit_logs'
]
super().__init__(
service_name="orders-service",
app_name=settings.APP_NAME,
description=settings.DESCRIPTION,
version=settings.VERSION,
api_prefix="", # Empty because RouteBuilder already includes /api/v1
database_manager=database_manager,
expected_tables=orders_expected_tables
)
async def on_startup(self, app: FastAPI):
"""Custom startup logic for orders service, including migration verification"""
await self.verify_migrations()
# REMOVED: Procurement scheduler service initialization
# Procurement scheduling is now handled by the Orchestrator Service
# which calls the Procurement Service's /auto-generate endpoint
await super().on_startup(app)
async def on_shutdown(self, app: FastAPI):
"""Custom shutdown logic for orders service"""
# REMOVED: Scheduler service shutdown
pass
def get_service_features(self):
"""Return orders-specific features"""
return [
"customer_management",
"order_processing",
"order_tracking"
]
# Create service instance
service = OrdersService()
# Create FastAPI app with standardized setup
app = service.create_app()
# Setup standard endpoints
service.setup_standard_endpoints()
# Include routers - organized by ATOMIC and BUSINESS operations
# IMPORTANT: Register specific routes (audit, customers) BEFORE parameterized routes (orders)
# to avoid route matching conflicts where {order_id} would match literal paths like "audit-logs"
# AUDIT: Audit log retrieval endpoints - Must be registered FIRST
service.add_router(audit.router)
# ATOMIC: Direct CRUD operations
# NOTE: Register customers_router BEFORE orders_router to ensure /customers
# matches before the parameterized /{order_id} route
service.add_router(customers_router)
service.add_router(orders_router)
# BUSINESS: Complex operations and workflows
service.add_router(order_operations_router)
# INTERNAL: Service-to-service endpoints - DEPRECATED: Replaced by script-based seed data loading
service.add_router(internal_demo.router, tags=["internal-demo"])
# REMOVED: test_procurement_scheduler endpoint
# Procurement scheduling is now triggered by the Orchestrator Service
@app.middleware("http")
async def logging_middleware(request: Request, call_next):
"""Add request logging middleware"""
import time
start_time = time.time()
response = await call_next(request)
process_time = time.time() - start_time
service.logger.info("HTTP request processed",
method=request.method,
url=str(request.url),
status_code=response.status_code,
process_time=round(process_time, 4))
return response
if __name__ == "__main__":
import uvicorn
uvicorn.run(
"main:app",
host="0.0.0.0",
port=8000,
reload=settings.DEBUG
)
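# Illustrative local run (assumed invocation, development only; containers use their own entrypoint):
#   uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload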


@@ -0,0 +1,68 @@
"""
Orders Service Models Package
Import all models to ensure they are registered with SQLAlchemy Base.
"""
# Import AuditLog model for this service
from shared.security import create_audit_log_model
from shared.database.base import Base
# Create audit log model for this service
AuditLog = create_audit_log_model(Base)
# Import all models to register them with the Base metadata
from .customer import Customer, CustomerContact
from .order import CustomerOrder, OrderItem, OrderStatusHistory
# Import enums
from .enums import (
CustomerType,
DeliveryMethod,
PaymentTerms,
PaymentMethod,
PaymentStatus,
CustomerSegment,
SalesChannel,
BusinessModel,
OrderType,
OrderSource,
OrderStatus,
DeliveryStatus,
ProcurementPlanType,
ProcurementStrategy,
PlanStatus,
PriorityLevel,
RequirementStatus,
RiskLevel,
)
# List all models for easier access
__all__ = [
# Models
"Customer",
"CustomerContact",
"CustomerOrder",
"OrderItem",
"OrderStatusHistory",
# Enums
"CustomerType",
"DeliveryMethod",
"PaymentTerms",
"PaymentMethod",
"PaymentStatus",
"CustomerSegment",
"SalesChannel",
"BusinessModel",
"OrderType",
"OrderSource",
"OrderStatus",
"DeliveryStatus",
"ProcurementPlanType",
"ProcurementStrategy",
"PlanStatus",
"PriorityLevel",
"RequirementStatus",
"RiskLevel",
"AuditLog",
]


@@ -0,0 +1,123 @@
# ================================================================
# services/orders/app/models/customer.py
# ================================================================
"""
Customer-related database models for Orders Service
"""
import uuid
from datetime import datetime
from decimal import Decimal
from typing import Optional, List
from sqlalchemy import Column, String, Boolean, DateTime, Numeric, Text, ForeignKey, Integer
from sqlalchemy.dialects.postgresql import UUID, JSONB
from sqlalchemy.orm import relationship
from sqlalchemy.sql import func
from shared.database.base import Base
class Customer(Base):
"""Customer model for managing customer information"""
__tablename__ = "customers"
# Primary identification
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
customer_code = Column(String(50), nullable=False, index=True) # Human-readable code
# Basic information
name = Column(String(200), nullable=False)
business_name = Column(String(200), nullable=True)
customer_type = Column(String(50), nullable=False, default="individual") # individual, business, central_bakery
# Contact information
email = Column(String(255), nullable=True)
phone = Column(String(50), nullable=True)
# Address information
address_line1 = Column(String(255), nullable=True)
address_line2 = Column(String(255), nullable=True)
city = Column(String(100), nullable=True)
state = Column(String(100), nullable=True)
postal_code = Column(String(20), nullable=True)
country = Column(String(100), nullable=False, default="US")
# Business information
tax_id = Column(String(50), nullable=True)
business_license = Column(String(100), nullable=True)
# Customer status and preferences
is_active = Column(Boolean, nullable=False, default=True)
preferred_delivery_method = Column(String(50), nullable=False, default="delivery") # delivery, pickup
payment_terms = Column(String(50), nullable=False, default="immediate") # immediate, net_30, net_60
credit_limit = Column(Numeric(10, 2), nullable=True)
discount_percentage = Column(Numeric(5, 2), nullable=False, default=Decimal("0.00"))
# Customer categorization
customer_segment = Column(String(50), nullable=False, default="regular") # vip, regular, wholesale
priority_level = Column(String(20), nullable=False, default="normal") # high, normal, low
# Preferences and special requirements
special_instructions = Column(Text, nullable=True)
delivery_preferences = Column(JSONB, nullable=True) # Time windows, special requirements
product_preferences = Column(JSONB, nullable=True) # Favorite products, allergies
# Customer metrics
total_orders = Column(Integer, nullable=False, default=0)
total_spent = Column(Numeric(12, 2), nullable=False, default=Decimal("0.00"))
average_order_value = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
last_order_date = Column(DateTime(timezone=True), nullable=True)
# Audit fields
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
created_by = Column(UUID(as_uuid=True), nullable=True)
updated_by = Column(UUID(as_uuid=True), nullable=True)
# Relationships
contacts = relationship("CustomerContact", back_populates="customer", cascade="all, delete-orphan")
orders = relationship("CustomerOrder", back_populates="customer")
class CustomerContact(Base):
"""Additional contact persons for business customers"""
__tablename__ = "customer_contacts"
# Primary identification
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
customer_id = Column(UUID(as_uuid=True), ForeignKey("customers.id", ondelete="CASCADE"), nullable=False)
# Contact information
name = Column(String(200), nullable=False)
title = Column(String(100), nullable=True)
department = Column(String(100), nullable=True)
# Contact details
email = Column(String(255), nullable=True)
phone = Column(String(50), nullable=True)
mobile = Column(String(50), nullable=True)
# Contact preferences
is_primary = Column(Boolean, nullable=False, default=False)
contact_for_orders = Column(Boolean, nullable=False, default=True)
contact_for_delivery = Column(Boolean, nullable=False, default=False)
contact_for_billing = Column(Boolean, nullable=False, default=False)
contact_for_support = Column(Boolean, nullable=False, default=False)
# Preferred contact methods
preferred_contact_method = Column(String(50), nullable=False, default="email") # email, phone, sms
contact_time_preferences = Column(JSONB, nullable=True) # Time windows for contact
# Notes and special instructions
notes = Column(Text, nullable=True)
# Status
is_active = Column(Boolean, nullable=False, default=True)
# Audit fields
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
# Relationships
customer = relationship("Customer", back_populates="contacts")
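# Illustrative usage (field values assumed, not taken from seed data):
#   customer = Customer(tenant_id=tenant_id, customer_code="CUST-0001", name="Cafe Central",
#                       customer_type="business", payment_terms="net_30",
#                       delivery_preferences={"window": "06:00-08:00"})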


@@ -0,0 +1,162 @@
# services/orders/app/models/enums.py
"""
Enum definitions for Orders Service
Following the pattern used in the Inventory Service for better type safety and maintainability
"""
import enum
class CustomerType(enum.Enum):
"""Customer type classifications"""
INDIVIDUAL = "individual"
BUSINESS = "business"
CENTRAL_BAKERY = "central_bakery"
RETAIL = "RETAIL"
WHOLESALE = "WHOLESALE"
RESTAURANT = "RESTAURANT"
HOTEL = "HOTEL"
ENTERPRISE = "ENTERPRISE"
class DeliveryMethod(enum.Enum):
"""Order delivery methods"""
DELIVERY = "delivery"
PICKUP = "pickup"
STANDARD = "standard" # Standard delivery method
class PaymentTerms(enum.Enum):
"""Payment terms for customers and orders"""
IMMEDIATE = "immediate"
NET_15 = "net_15"
NET_30 = "net_30"
NET_60 = "net_60"
class PaymentMethod(enum.Enum):
"""Payment methods for orders"""
CASH = "cash"
CARD = "card"
CREDIT_CARD = "credit_card" # Credit card payment
CHECK = "check" # Bank check/cheque payment
BANK_TRANSFER = "bank_transfer"
ACCOUNT = "account"
class PaymentStatus(enum.Enum):
"""Payment status for orders"""
PENDING = "pending"
PARTIAL = "partial"
PAID = "paid"
FAILED = "failed"
REFUNDED = "refunded"
class CustomerSegment(enum.Enum):
"""Customer segmentation categories"""
VIP = "vip"
REGULAR = "regular"
WHOLESALE = "wholesale"
class PriorityLevel(enum.Enum):
"""Priority levels for orders and customers"""
URGENT = "urgent"
HIGH = "high"
NORMAL = "normal"
LOW = "low"
class OrderType(enum.Enum):
"""Order type classifications"""
STANDARD = "standard"
RUSH = "rush"
RECURRING = "recurring"
SPECIAL = "special"
class OrderStatus(enum.Enum):
"""Order status workflow"""
PENDING = "pending"
CONFIRMED = "confirmed"
IN_PRODUCTION = "in_production"
READY = "ready"
OUT_FOR_DELIVERY = "out_for_delivery"
DELIVERED = "delivered"
CANCELLED = "cancelled"
FAILED = "failed"
class OrderSource(enum.Enum):
"""Source of order creation"""
MANUAL = "manual"
ONLINE = "online"
PHONE = "phone"
APP = "app"
API = "api"
class SalesChannel(enum.Enum):
"""Sales channel classification"""
DIRECT = "direct"
WHOLESALE = "wholesale"
RETAIL = "retail"
class BusinessModel(enum.Enum):
"""Business model types"""
INDIVIDUAL_BAKERY = "individual_bakery"
CENTRAL_BAKERY = "central_bakery"
# Procurement-related enums
class ProcurementPlanType(enum.Enum):
"""Procurement plan types"""
REGULAR = "regular"
EMERGENCY = "emergency"
SEASONAL = "seasonal"
class ProcurementStrategy(enum.Enum):
"""Procurement strategies"""
JUST_IN_TIME = "just_in_time"
BULK = "bulk"
MIXED = "mixed"
class RiskLevel(enum.Enum):
"""Risk level classifications"""
LOW = "low"
MEDIUM = "medium"
HIGH = "high"
CRITICAL = "critical"
class RequirementStatus(enum.Enum):
"""Procurement requirement status"""
PENDING = "pending"
APPROVED = "approved"
ORDERED = "ordered"
PARTIALLY_RECEIVED = "partially_received"
RECEIVED = "received"
CANCELLED = "cancelled"
class PlanStatus(enum.Enum):
"""Procurement plan status"""
DRAFT = "draft"
PENDING_APPROVAL = "pending_approval"
APPROVED = "approved"
IN_EXECUTION = "in_execution"
COMPLETED = "completed"
CANCELLED = "cancelled"
class DeliveryStatus(enum.Enum):
"""Delivery status for procurement"""
PENDING = "pending"
IN_TRANSIT = "in_transit"
DELIVERED = "delivered"
DELAYED = "delayed"
CANCELLED = "cancelled"
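# NOTE: the ORM models (app/models/order.py and app/models/customer.py) persist these values as
# plain String columns; the enums above act as the canonical value sets, e.g. the valid_statuses
# check in app/api/order_operations.py mirrors OrderStatus.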


@@ -0,0 +1,218 @@
# ================================================================
# services/orders/app/models/order.py
# ================================================================
"""
Order-related database models for Orders Service
"""
import uuid
from datetime import datetime
from decimal import Decimal
from typing import Optional, List
from sqlalchemy import Column, String, Boolean, DateTime, Numeric, Text, ForeignKey, Integer
from sqlalchemy.dialects.postgresql import UUID, JSONB
from sqlalchemy.orm import relationship
from sqlalchemy.sql import func
from shared.database.base import Base
class CustomerOrder(Base):
"""Customer order model for tracking orders throughout their lifecycle"""
__tablename__ = "customer_orders"
# Primary identification
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
order_number = Column(String(50), nullable=False, unique=True, index=True)
# Customer information
customer_id = Column(UUID(as_uuid=True), ForeignKey("customers.id"), nullable=False, index=True)
# Order status and lifecycle
status = Column(String(50), nullable=False, default="pending", index=True)
# Status values: pending, confirmed, in_production, ready, out_for_delivery, delivered, cancelled, failed
order_type = Column(String(50), nullable=False, default="standard") # standard, rush, recurring, special
priority = Column(String(20), nullable=False, default="normal") # high, normal, low
# Order timing
order_date = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
requested_delivery_date = Column(DateTime(timezone=True), nullable=False)
confirmed_delivery_date = Column(DateTime(timezone=True), nullable=True)
actual_delivery_date = Column(DateTime(timezone=True), nullable=True)
# Delivery information
delivery_method = Column(String(50), nullable=False, default="delivery") # delivery, pickup
delivery_address = Column(JSONB, nullable=True) # Complete delivery address
delivery_instructions = Column(Text, nullable=True)
delivery_window_start = Column(DateTime(timezone=True), nullable=True)
delivery_window_end = Column(DateTime(timezone=True), nullable=True)
# Financial information
subtotal = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
discount_amount = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
discount_percentage = Column(Numeric(5, 2), nullable=False, default=Decimal("0.00"))
tax_amount = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
delivery_fee = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
total_amount = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
# Payment information
payment_status = Column(String(50), nullable=False, default="pending") # pending, partial, paid, failed, refunded
payment_method = Column(String(50), nullable=True) # cash, card, bank_transfer, account
payment_terms = Column(String(50), nullable=False, default="immediate")
payment_due_date = Column(DateTime(timezone=True), nullable=True)
# Special requirements and customizations
special_instructions = Column(Text, nullable=True)
custom_requirements = Column(JSONB, nullable=True) # Special dietary requirements, decorations
allergen_warnings = Column(JSONB, nullable=True) # Allergen information
# Business model detection
business_model = Column(String(50), nullable=True) # individual_bakery, central_bakery (auto-detected)
estimated_business_model = Column(String(50), nullable=True) # Based on order patterns
# Order source and channel
order_source = Column(String(50), nullable=False, default="manual") # manual, online, phone, app, api
sales_channel = Column(String(50), nullable=False, default="direct") # direct, wholesale, retail
order_origin = Column(String(100), nullable=True) # Website, app, store location
# Fulfillment tracking
production_batch_id = Column(UUID(as_uuid=True), nullable=True) # Link to production batch
fulfillment_location = Column(String(100), nullable=True) # Which location fulfills this order
estimated_preparation_time = Column(Integer, nullable=True) # Minutes
actual_preparation_time = Column(Integer, nullable=True) # Minutes
# Customer communication
customer_notified_confirmed = Column(Boolean, nullable=False, default=False)
customer_notified_ready = Column(Boolean, nullable=False, default=False)
customer_notified_delivered = Column(Boolean, nullable=False, default=False)
communication_preferences = Column(JSONB, nullable=True)
# Quality and feedback
quality_score = Column(Numeric(3, 1), nullable=True) # 1.0 to 10.0
customer_rating = Column(Integer, nullable=True) # 1-5 stars
customer_feedback = Column(Text, nullable=True)
# Cancellation and refunds
cancellation_reason = Column(String(200), nullable=True)
cancelled_at = Column(DateTime(timezone=True), nullable=True)
cancelled_by = Column(UUID(as_uuid=True), nullable=True)
refund_amount = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
refund_processed_at = Column(DateTime(timezone=True), nullable=True)
# Audit fields
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
created_by = Column(UUID(as_uuid=True), nullable=True)
updated_by = Column(UUID(as_uuid=True), nullable=True)
# Additional metadata
order_metadata = Column(JSONB, nullable=True) # Flexible field for additional data
# Relationships
customer = relationship("Customer", back_populates="orders")
items = relationship("OrderItem", back_populates="order", cascade="all, delete-orphan")
status_history = relationship("OrderStatusHistory", back_populates="order", cascade="all, delete-orphan")
class OrderItem(Base):
"""Individual items within a customer order"""
__tablename__ = "order_items"
# Primary identification
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
order_id = Column(UUID(as_uuid=True), ForeignKey("customer_orders.id", ondelete="CASCADE"), nullable=False)
# Product information
product_id = Column(UUID(as_uuid=True), nullable=False, index=True) # Reference to products service
product_name = Column(String(200), nullable=False)
product_sku = Column(String(100), nullable=True)
product_category = Column(String(100), nullable=True)
# Quantity and units
quantity = Column(Numeric(10, 3), nullable=False)
unit_of_measure = Column(String(50), nullable=False, default="each")
weight = Column(Numeric(10, 3), nullable=True) # For weight-based products
# Pricing information
unit_price = Column(Numeric(10, 2), nullable=False)
line_discount = Column(Numeric(10, 2), nullable=False, default=Decimal("0.00"))
line_total = Column(Numeric(10, 2), nullable=False)
# Product specifications and customizations
product_specifications = Column(JSONB, nullable=True) # Size, flavor, decorations
customization_details = Column(Text, nullable=True)
special_instructions = Column(Text, nullable=True)
# Production requirements
recipe_id = Column(UUID(as_uuid=True), nullable=True) # Reference to recipes service
production_requirements = Column(JSONB, nullable=True) # Ingredients, equipment needed
estimated_production_time = Column(Integer, nullable=True) # Minutes
# Fulfillment tracking
status = Column(String(50), nullable=False, default="pending") # pending, in_production, ready, delivered
production_started_at = Column(DateTime(timezone=True), nullable=True)
production_completed_at = Column(DateTime(timezone=True), nullable=True)
quality_checked = Column(Boolean, nullable=False, default=False)
quality_score = Column(Numeric(3, 1), nullable=True)
# Cost tracking
ingredient_cost = Column(Numeric(10, 2), nullable=True)
labor_cost = Column(Numeric(10, 2), nullable=True)
overhead_cost = Column(Numeric(10, 2), nullable=True)
total_cost = Column(Numeric(10, 2), nullable=True)
margin = Column(Numeric(10, 2), nullable=True)
# Inventory impact
reserved_inventory = Column(Boolean, nullable=False, default=False)
inventory_allocated_at = Column(DateTime(timezone=True), nullable=True)
# Audit fields
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
# Additional metadata
customer_metadata = Column(JSONB, nullable=True)
# Relationships
order = relationship("CustomerOrder", back_populates="items")
class OrderStatusHistory(Base):
"""Track status changes and important events in order lifecycle"""
__tablename__ = "order_status_history"
# Primary identification
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
order_id = Column(UUID(as_uuid=True), ForeignKey("customer_orders.id", ondelete="CASCADE"), nullable=False)
# Status change information
from_status = Column(String(50), nullable=True)
to_status = Column(String(50), nullable=False)
change_reason = Column(String(200), nullable=True)
# Event details
event_type = Column(String(50), nullable=False, default="status_change")
# Event types: status_change, payment_received, production_started, delivery_scheduled, etc.
event_description = Column(Text, nullable=True)
event_data = Column(JSONB, nullable=True) # Additional event-specific data
# Who made the change
changed_by = Column(UUID(as_uuid=True), nullable=True)
change_source = Column(String(50), nullable=False, default="manual") # manual, automatic, system, api
# Timing
changed_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
# Customer communication
customer_notified = Column(Boolean, nullable=False, default=False)
notification_method = Column(String(50), nullable=True) # email, sms, phone, app
notification_sent_at = Column(DateTime(timezone=True), nullable=True)
# Additional notes
notes = Column(Text, nullable=True)
# Relationships
order = relationship("CustomerOrder", back_populates="status_history")
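# Illustrative usage (values assumed):
#   history = OrderStatusHistory(order_id=order.id, from_status="pending", to_status="confirmed",
#                                change_reason="Payment received", change_source="api")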


@@ -0,0 +1,289 @@
# ================================================================
# services/orders/app/repositories/base_repository.py
# ================================================================
"""
Base repository class for Orders Service
"""
from typing import Any, Dict, Generic, List, Optional, Type, TypeVar, Union
from uuid import UUID
from sqlalchemy import select, update, delete, func, and_, or_
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload, joinedload
import structlog
from app.core.database import Base
logger = structlog.get_logger()
ModelType = TypeVar("ModelType", bound=Base)
CreateSchemaType = TypeVar("CreateSchemaType")
UpdateSchemaType = TypeVar("UpdateSchemaType")
class BaseRepository(Generic[ModelType, CreateSchemaType, UpdateSchemaType]):
"""Base repository with common CRUD operations"""
def __init__(self, model: Type[ModelType]):
self.model = model
async def get(
self,
db: AsyncSession,
id: UUID,
tenant_id: Optional[UUID] = None
) -> Optional[ModelType]:
"""Get a single record by ID with optional tenant filtering"""
try:
query = select(self.model).where(self.model.id == id)
# Add tenant filtering if tenant_id is provided and model has tenant_id field
if tenant_id and hasattr(self.model, 'tenant_id'):
query = query.where(self.model.tenant_id == tenant_id)
result = await db.execute(query)
return result.scalar_one_or_none()
except Exception as e:
logger.error("Error getting record", model=self.model.__name__, id=str(id), error=str(e))
raise
async def get_by_field(
self,
db: AsyncSession,
field_name: str,
field_value: Any,
tenant_id: Optional[UUID] = None
) -> Optional[ModelType]:
"""Get a single record by field value"""
try:
field = getattr(self.model, field_name)
query = select(self.model).where(field == field_value)
if tenant_id and hasattr(self.model, 'tenant_id'):
query = query.where(self.model.tenant_id == tenant_id)
result = await db.execute(query)
return result.scalar_one_or_none()
except Exception as e:
logger.error("Error getting record by field",
model=self.model.__name__,
field_name=field_name,
field_value=str(field_value),
error=str(e))
raise
async def get_multi(
self,
db: AsyncSession,
tenant_id: Optional[UUID] = None,
skip: int = 0,
limit: int = 100,
filters: Optional[Dict[str, Any]] = None,
order_by: Optional[str] = None,
order_desc: bool = False
) -> List[ModelType]:
"""Get multiple records with filtering, pagination, and sorting"""
try:
query = select(self.model)
# Add tenant filtering
if tenant_id and hasattr(self.model, 'tenant_id'):
query = query.where(self.model.tenant_id == tenant_id)
# Add additional filters
if filters:
for field_name, field_value in filters.items():
if hasattr(self.model, field_name):
field = getattr(self.model, field_name)
if isinstance(field_value, list):
query = query.where(field.in_(field_value))
else:
query = query.where(field == field_value)
# Add ordering
if order_by and hasattr(self.model, order_by):
order_field = getattr(self.model, order_by)
if order_desc:
query = query.order_by(order_field.desc())
else:
query = query.order_by(order_field)
# Add pagination
query = query.offset(skip).limit(limit)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting multiple records",
model=self.model.__name__,
error=str(e))
raise
async def count(
self,
db: AsyncSession,
tenant_id: Optional[UUID] = None,
filters: Optional[Dict[str, Any]] = None
) -> int:
"""Count records with optional filtering"""
try:
query = select(func.count()).select_from(self.model)
# Add tenant filtering
if tenant_id and hasattr(self.model, 'tenant_id'):
query = query.where(self.model.tenant_id == tenant_id)
# Add additional filters
if filters:
for field_name, field_value in filters.items():
if hasattr(self.model, field_name):
field = getattr(self.model, field_name)
if isinstance(field_value, list):
query = query.where(field.in_(field_value))
else:
query = query.where(field == field_value)
result = await db.execute(query)
return result.scalar()
except Exception as e:
logger.error("Error counting records",
model=self.model.__name__,
error=str(e))
raise
async def create(
self,
db: AsyncSession,
*,
obj_in: CreateSchemaType,
created_by: Optional[UUID] = None,
tenant_id: Optional[UUID] = None
) -> ModelType:
"""Create a new record"""
try:
# Convert schema to dict
if hasattr(obj_in, 'dict'):
obj_data = obj_in.dict()
else:
obj_data = obj_in
# Add tenant_id if the model supports it and it's provided
if tenant_id and hasattr(self.model, 'tenant_id'):
obj_data['tenant_id'] = tenant_id
# Add created_by if the model supports it
if created_by and hasattr(self.model, 'created_by'):
obj_data['created_by'] = created_by
# Create model instance
db_obj = self.model(**obj_data)
# Add to session and flush to get ID
db.add(db_obj)
await db.flush()
await db.refresh(db_obj)
logger.info("Record created",
model=self.model.__name__,
id=str(db_obj.id))
return db_obj
except Exception as e:
logger.error("Error creating record",
model=self.model.__name__,
error=str(e))
raise
async def update(
self,
db: AsyncSession,
*,
db_obj: ModelType,
obj_in: Union[UpdateSchemaType, Dict[str, Any]],
updated_by: Optional[UUID] = None
) -> ModelType:
"""Update an existing record"""
try:
# Convert schema to dict
if hasattr(obj_in, 'dict'):
update_data = obj_in.dict(exclude_unset=True)
else:
update_data = obj_in
# Add updated_by if the model supports it
if updated_by and hasattr(self.model, 'updated_by'):
update_data['updated_by'] = updated_by
# Update fields
for field, value in update_data.items():
if hasattr(db_obj, field):
setattr(db_obj, field, value)
# Flush changes
await db.flush()
await db.refresh(db_obj)
logger.info("Record updated",
model=self.model.__name__,
id=str(db_obj.id))
return db_obj
except Exception as e:
logger.error("Error updating record",
model=self.model.__name__,
id=str(db_obj.id),
error=str(e))
raise
async def delete(
self,
db: AsyncSession,
*,
id: UUID,
tenant_id: Optional[UUID] = None
) -> Optional[ModelType]:
"""Delete a record by ID"""
try:
# First get the record
db_obj = await self.get(db, id=id, tenant_id=tenant_id)
if not db_obj:
return None
# Delete the record
await db.delete(db_obj)
await db.flush()
logger.info("Record deleted",
model=self.model.__name__,
id=str(id))
return db_obj
except Exception as e:
logger.error("Error deleting record",
model=self.model.__name__,
id=str(id),
error=str(e))
raise
async def exists(
self,
db: AsyncSession,
id: UUID,
tenant_id: Optional[UUID] = None
) -> bool:
"""Check if a record exists"""
try:
query = select(func.count()).select_from(self.model).where(self.model.id == id)
if tenant_id and hasattr(self.model, 'tenant_id'):
query = query.where(self.model.tenant_id == tenant_id)
result = await db.execute(query)
count = result.scalar()
return count > 0
except Exception as e:
logger.error("Error checking record existence",
model=self.model.__name__,
id=str(id),
error=str(e))
raise
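# ----------------------------------------------------------------
# Usage sketch (illustrative only): a concrete repository binds the generic
# base to one model, and the service layer calls it with an AsyncSession.
# This mirrors CustomerRepository in app/repositories/order_repository.py.
#
#   class CustomerRepository(BaseRepository[Customer, dict, dict]):
#       def __init__(self):
#           super().__init__(Customer)
#
#   repo = CustomerRepository()
#   active_page = await repo.get_multi(
#       db,
#       tenant_id=tenant_id,
#       filters={"is_active": True},
#       order_by="name",
#       limit=25,
#   )
#   total = await repo.count(db, tenant_id=tenant_id, filters={"is_active": True})
# ----------------------------------------------------------------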


@@ -0,0 +1,628 @@
# ================================================================
# services/orders/app/repositories/order_repository.py
# ================================================================
"""
Order-related repositories for Orders Service
"""
from datetime import datetime, date, timedelta
from decimal import Decimal
from typing import List, Optional, Dict, Any
from uuid import UUID
from sqlalchemy import select, func, and_, or_, case, extract
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload, joinedload
import structlog
from app.models.customer import Customer
from app.models.order import CustomerOrder, OrderItem, OrderStatusHistory
from app.schemas.order_schemas import OrderCreate, OrderUpdate, OrderItemCreate, OrderItemUpdate
from app.repositories.base_repository import BaseRepository
logger = structlog.get_logger()
class CustomerRepository(BaseRepository[Customer, dict, dict]):
"""Repository for customer operations"""
def __init__(self):
super().__init__(Customer)
async def get_by_customer_code(
self,
db: AsyncSession,
customer_code: str,
tenant_id: UUID
) -> Optional[Customer]:
"""Get customer by customer code within tenant"""
try:
query = select(Customer).where(
and_(
Customer.customer_code == customer_code,
Customer.tenant_id == tenant_id
)
)
result = await db.execute(query)
return result.scalar_one_or_none()
except Exception as e:
logger.error("Error getting customer by code",
customer_code=customer_code,
error=str(e))
raise
async def get_active_customers(
self,
db: AsyncSession,
tenant_id: UUID,
skip: int = 0,
limit: int = 100
) -> List[Customer]:
"""Get active customers for a tenant"""
try:
query = select(Customer).where(
and_(
Customer.tenant_id == tenant_id,
Customer.is_active == True
)
).order_by(Customer.name).offset(skip).limit(limit)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting active customers", error=str(e))
raise
async def update_customer_metrics(
self,
db: AsyncSession,
customer_id: UUID,
order_value: Decimal,
order_date: datetime
):
"""Update customer metrics after order creation"""
try:
customer = await self.get(db, customer_id)
if customer:
customer.total_orders += 1
customer.total_spent += order_value
customer.average_order_value = customer.total_spent / customer.total_orders
customer.last_order_date = order_date
await db.flush()
logger.info("Customer metrics updated",
customer_id=str(customer_id),
new_total_spent=str(customer.total_spent))
except Exception as e:
logger.error("Error updating customer metrics",
customer_id=str(customer_id),
error=str(e))
raise
async def count_created_since(
self,
db: AsyncSession,
tenant_id: UUID,
since_date: datetime
) -> int:
"""Count customers created since a specific date"""
try:
query = select(func.count()).select_from(Customer).where(
and_(
Customer.tenant_id == tenant_id,
Customer.created_at >= since_date
)
)
result = await db.execute(query)
return result.scalar()
except Exception as e:
logger.error("Error counting customers created since date",
tenant_id=str(tenant_id),
since_date=str(since_date),
error=str(e))
raise
class OrderRepository(BaseRepository[CustomerOrder, OrderCreate, OrderUpdate]):
"""Repository for customer order operations"""
def __init__(self):
super().__init__(CustomerOrder)
async def get_multi(
self,
db: AsyncSession,
tenant_id: Optional[UUID] = None,
skip: int = 0,
limit: int = 100,
filters: Optional[Dict[str, Any]] = None,
order_by: Optional[str] = None,
order_desc: bool = False
) -> List[CustomerOrder]:
"""Get multiple orders with eager loading of items and customer"""
try:
query = select(self.model).options(
selectinload(CustomerOrder.items),
selectinload(CustomerOrder.customer)
)
# Apply tenant filter
if tenant_id:
query = query.where(self.model.tenant_id == tenant_id)
# Apply additional filters
if filters:
for key, value in filters.items():
if hasattr(self.model, key) and value is not None:
field = getattr(self.model, key)
if isinstance(value, list):
query = query.where(field.in_(value))
else:
query = query.where(field == value)
# Apply ordering
if order_by and hasattr(self.model, order_by):
order_column = getattr(self.model, order_by)
if order_desc:
query = query.order_by(order_column.desc())
else:
query = query.order_by(order_column)
else:
# Default ordering by order_date desc
query = query.order_by(CustomerOrder.order_date.desc())
# Apply pagination
query = query.offset(skip).limit(limit)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting multiple orders", error=str(e))
raise
async def get_with_items(
self,
db: AsyncSession,
order_id: UUID,
tenant_id: UUID
) -> Optional[CustomerOrder]:
"""Get order with all its items and customer info"""
try:
query = select(CustomerOrder).options(
selectinload(CustomerOrder.items),
selectinload(CustomerOrder.customer),
selectinload(CustomerOrder.status_history)
).where(
and_(
CustomerOrder.id == order_id,
CustomerOrder.tenant_id == tenant_id
)
)
result = await db.execute(query)
return result.scalar_one_or_none()
except Exception as e:
logger.error("Error getting order with items",
order_id=str(order_id),
error=str(e))
raise
async def get_by_order_number(
self,
db: AsyncSession,
order_number: str,
tenant_id: UUID
) -> Optional[CustomerOrder]:
"""Get order by order number within tenant"""
try:
query = select(CustomerOrder).where(
and_(
CustomerOrder.order_number == order_number,
CustomerOrder.tenant_id == tenant_id
)
)
result = await db.execute(query)
return result.scalar_one_or_none()
except Exception as e:
logger.error("Error getting order by number",
order_number=order_number,
error=str(e))
raise
async def get_orders_by_status(
self,
db: AsyncSession,
tenant_id: UUID,
status: str,
skip: int = 0,
limit: int = 100
) -> List[CustomerOrder]:
"""Get orders by status"""
try:
query = select(CustomerOrder).options(
selectinload(CustomerOrder.customer)
).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status == status
)
).order_by(CustomerOrder.order_date.desc()).offset(skip).limit(limit)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting orders by status",
status=status,
error=str(e))
raise
async def get_orders_by_date_range(
self,
db: AsyncSession,
tenant_id: UUID,
start_date: date,
end_date: date,
skip: int = 0,
limit: int = 100
) -> List[CustomerOrder]:
"""Get orders within date range"""
try:
query = select(CustomerOrder).options(
selectinload(CustomerOrder.customer),
selectinload(CustomerOrder.items)
).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= start_date,
func.date(CustomerOrder.order_date) <= end_date
)
).order_by(CustomerOrder.order_date.desc()).offset(skip).limit(limit)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting orders by date range",
start_date=str(start_date),
end_date=str(end_date),
error=str(e))
raise
async def get_pending_orders_by_delivery_date(
self,
db: AsyncSession,
tenant_id: UUID,
delivery_date: date
) -> List[CustomerOrder]:
"""Get pending orders for a specific delivery date"""
try:
query = select(CustomerOrder).options(
selectinload(CustomerOrder.items),
selectinload(CustomerOrder.customer)
).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status.in_(["pending", "confirmed", "in_production"]),
func.date(CustomerOrder.requested_delivery_date) == delivery_date
)
).order_by(CustomerOrder.priority.desc(), CustomerOrder.order_date)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting pending orders by delivery date",
delivery_date=str(delivery_date),
error=str(e))
raise
async def get_dashboard_metrics(
self,
db: AsyncSession,
tenant_id: UUID
) -> Dict[str, Any]:
"""Get dashboard metrics for orders"""
try:
# Today's metrics
today = datetime.now().date()
week_start = today - timedelta(days=today.weekday())
month_start = today.replace(day=1)
# Order counts by period
orders_today = await db.execute(
select(func.count()).select_from(CustomerOrder).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) == today
)
)
)
orders_week = await db.execute(
select(func.count()).select_from(CustomerOrder).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= week_start
)
)
)
orders_month = await db.execute(
select(func.count()).select_from(CustomerOrder).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= month_start
)
)
)
# Revenue by period
revenue_today = await db.execute(
select(func.coalesce(func.sum(CustomerOrder.total_amount), 0)).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) == today,
CustomerOrder.status != "cancelled"
)
)
)
revenue_week = await db.execute(
select(func.coalesce(func.sum(CustomerOrder.total_amount), 0)).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= week_start,
CustomerOrder.status != "cancelled"
)
)
)
revenue_month = await db.execute(
select(func.coalesce(func.sum(CustomerOrder.total_amount), 0)).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= month_start,
CustomerOrder.status != "cancelled"
)
)
)
# Status breakdown
status_counts = await db.execute(
select(CustomerOrder.status, func.count()).select_from(CustomerOrder).where(
CustomerOrder.tenant_id == tenant_id
).group_by(CustomerOrder.status)
)
status_breakdown = {status: count for status, count in status_counts.fetchall()}
# Average order value
avg_order_value = await db.execute(
select(func.coalesce(func.avg(CustomerOrder.total_amount), 0)).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status != "cancelled"
)
)
)
# Calculate repeat customers rate
# Count customers who have made more than one order
repeat_customers_query = await db.execute(
select(func.count()).select_from(
select(CustomerOrder.customer_id)
.where(CustomerOrder.tenant_id == tenant_id)
.group_by(CustomerOrder.customer_id)
.having(func.count(CustomerOrder.id) > 1)
.subquery()
)
)
total_customers_query = await db.execute(
select(func.count(func.distinct(CustomerOrder.customer_id))).where(
CustomerOrder.tenant_id == tenant_id
)
)
repeat_customers_count = repeat_customers_query.scalar() or 0
total_customers_count = total_customers_query.scalar() or 0
repeat_customers_rate = Decimal("0.0")
if total_customers_count > 0:
repeat_customers_rate = Decimal(str(repeat_customers_count)) / Decimal(str(total_customers_count))
repeat_customers_rate = repeat_customers_rate * Decimal("100.0") # Convert to percentage
# Calculate order fulfillment rate
total_orders_query = await db.execute(
select(func.count()).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status != "cancelled"
)
)
)
fulfilled_orders_query = await db.execute(
select(func.count()).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status.in_(["delivered", "completed"])
)
)
)
total_orders_count = total_orders_query.scalar() or 0
fulfilled_orders_count = fulfilled_orders_query.scalar() or 0
fulfillment_rate = Decimal("0.0")
if total_orders_count > 0:
fulfillment_rate = Decimal(str(fulfilled_orders_count)) / Decimal(str(total_orders_count))
fulfillment_rate = fulfillment_rate * Decimal("100.0") # Convert to percentage
# Calculate on-time delivery rate
on_time_delivered_query = await db.execute(
select(func.count()).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status == "delivered",
CustomerOrder.actual_delivery_date <= CustomerOrder.requested_delivery_date
)
)
)
total_delivered_query = await db.execute(
select(func.count()).where(
and_(
CustomerOrder.tenant_id == tenant_id,
CustomerOrder.status == "delivered"
)
)
)
on_time_delivered_count = on_time_delivered_query.scalar() or 0
total_delivered_count = total_delivered_query.scalar() or 0
on_time_delivery_rate = Decimal("0.0")
if total_delivered_count > 0:
on_time_delivery_rate = Decimal(str(on_time_delivered_count)) / Decimal(str(total_delivered_count))
on_time_delivery_rate = on_time_delivery_rate * Decimal("100.0") # Convert to percentage
return {
"total_orders_today": orders_today.scalar(),
"total_orders_this_week": orders_week.scalar(),
"total_orders_this_month": orders_month.scalar(),
"revenue_today": revenue_today.scalar(),
"revenue_this_week": revenue_week.scalar(),
"revenue_this_month": revenue_month.scalar(),
"status_breakdown": status_breakdown,
"average_order_value": avg_order_value.scalar(),
"repeat_customers_rate": repeat_customers_rate,
"fulfillment_rate": fulfillment_rate,
"on_time_delivery_rate": on_time_delivery_rate,
"repeat_customers_count": repeat_customers_count,
"total_customers_count": total_customers_count,
"total_orders_count": total_orders_count,
"fulfilled_orders_count": fulfilled_orders_count,
"on_time_delivered_count": on_time_delivered_count,
"total_delivered_count": total_delivered_count
}
except Exception as e:
logger.error("Error getting dashboard metrics", error=str(e))
raise
async def detect_business_model(
self,
db: AsyncSession,
tenant_id: UUID,
lookback_days: int = 30
) -> Optional[str]:
"""Detect business model based on order patterns"""
try:
cutoff_date = datetime.now().date() - timedelta(days=lookback_days)
# Analyze order patterns
query = select(
func.count().label("total_orders"),
func.avg(CustomerOrder.total_amount).label("avg_order_value"),
func.count(func.distinct(CustomerOrder.customer_id)).label("unique_customers"),
func.sum(
case(
(CustomerOrder.order_type == "rush", 1),
else_=0
)
).label("rush_orders"),
func.sum(
case(
(CustomerOrder.sales_channel == "wholesale", 1),
else_=0
)
).label("wholesale_orders")
).where(
and_(
CustomerOrder.tenant_id == tenant_id,
func.date(CustomerOrder.order_date) >= cutoff_date
)
)
result = await db.execute(query)
metrics = result.fetchone()
if not metrics or metrics.total_orders == 0:
return None
# Business model detection logic
orders_per_customer = metrics.total_orders / metrics.unique_customers
wholesale_ratio = metrics.wholesale_orders / metrics.total_orders
rush_ratio = metrics.rush_orders / metrics.total_orders
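            # NOTE: rush_ratio is computed here but not used by the classification
            # below; only the wholesale share and orders-per-customer drive the result.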
if wholesale_ratio > 0.6 or orders_per_customer > 20:
return "central_bakery"
else:
return "individual_bakery"
except Exception as e:
logger.error("Error detecting business model", error=str(e))
return None
class OrderItemRepository(BaseRepository[OrderItem, OrderItemCreate, OrderItemUpdate]):
"""Repository for order item operations"""
def __init__(self):
super().__init__(OrderItem)
async def get_items_by_order(
self,
db: AsyncSession,
order_id: UUID
) -> List[OrderItem]:
"""Get all items for an order"""
try:
query = select(OrderItem).where(OrderItem.order_id == order_id)
result = await db.execute(query)
return result.scalars().all()
except Exception as e:
logger.error("Error getting order items",
order_id=str(order_id),
error=str(e))
raise
class OrderStatusHistoryRepository(BaseRepository[OrderStatusHistory, dict, dict]):
"""Repository for order status history operations"""
def __init__(self):
super().__init__(OrderStatusHistory)
async def create_status_change(
self,
db: AsyncSession,
order_id: UUID,
from_status: Optional[str],
to_status: str,
change_reason: Optional[str] = None,
changed_by: Optional[UUID] = None,
event_data: Optional[Dict[str, Any]] = None
) -> OrderStatusHistory:
"""Create a status change record"""
try:
status_history = OrderStatusHistory(
order_id=order_id,
from_status=from_status,
to_status=to_status,
change_reason=change_reason,
changed_by=changed_by,
event_data=event_data
)
db.add(status_history)
await db.flush()
await db.refresh(status_history)
logger.info("Status change recorded",
order_id=str(order_id),
from_status=from_status,
to_status=to_status)
return status_history
except Exception as e:
logger.error("Error creating status change",
order_id=str(order_id),
error=str(e))
raise


@@ -0,0 +1,283 @@
# ================================================================
# services/orders/app/schemas/order_schemas.py
# ================================================================
"""
Order-related Pydantic schemas for Orders Service
"""
from datetime import datetime, date
from decimal import Decimal
from typing import Optional, List, Dict, Any
from uuid import UUID
from pydantic import BaseModel, Field, validator
from app.models.enums import (
CustomerType, DeliveryMethod, PaymentTerms, PaymentMethod, PaymentStatus,
CustomerSegment, PriorityLevel, OrderType, OrderStatus, OrderSource,
SalesChannel, BusinessModel, DeliveryStatus
)
# ===== Customer Schemas =====
class CustomerBase(BaseModel):
name: str = Field(..., min_length=1, max_length=200)
business_name: Optional[str] = Field(None, max_length=200)
customer_type: CustomerType = Field(default=CustomerType.INDIVIDUAL)
email: Optional[str] = Field(None, max_length=255)
phone: Optional[str] = Field(None, max_length=50)
address_line1: Optional[str] = Field(None, max_length=255)
address_line2: Optional[str] = Field(None, max_length=255)
city: Optional[str] = Field(None, max_length=100)
state: Optional[str] = Field(None, max_length=100)
postal_code: Optional[str] = Field(None, max_length=20)
country: str = Field(default="US", max_length=100)
is_active: bool = Field(default=True)
preferred_delivery_method: DeliveryMethod = Field(default=DeliveryMethod.DELIVERY)
payment_terms: PaymentTerms = Field(default=PaymentTerms.IMMEDIATE)
credit_limit: Optional[Decimal] = Field(None, ge=0)
discount_percentage: Decimal = Field(default=Decimal("0.00"), ge=0, le=100)
customer_segment: CustomerSegment = Field(default=CustomerSegment.REGULAR)
priority_level: PriorityLevel = Field(default=PriorityLevel.NORMAL)
special_instructions: Optional[str] = None
delivery_preferences: Optional[Dict[str, Any]] = None
product_preferences: Optional[Dict[str, Any]] = None
class Config:
from_attributes = True
use_enum_values = True
class CustomerCreate(CustomerBase):
customer_code: str = Field(..., min_length=1, max_length=50)
class CustomerUpdate(BaseModel):
name: Optional[str] = Field(None, min_length=1, max_length=200)
business_name: Optional[str] = Field(None, max_length=200)
customer_type: Optional[CustomerType] = None
email: Optional[str] = Field(None, max_length=255)
phone: Optional[str] = Field(None, max_length=50)
address_line1: Optional[str] = Field(None, max_length=255)
address_line2: Optional[str] = Field(None, max_length=255)
city: Optional[str] = Field(None, max_length=100)
state: Optional[str] = Field(None, max_length=100)
postal_code: Optional[str] = Field(None, max_length=20)
country: Optional[str] = Field(None, max_length=100)
is_active: Optional[bool] = None
preferred_delivery_method: Optional[DeliveryMethod] = None
payment_terms: Optional[PaymentTerms] = None
credit_limit: Optional[Decimal] = Field(None, ge=0)
discount_percentage: Optional[Decimal] = Field(None, ge=0, le=100)
customer_segment: Optional[CustomerSegment] = None
priority_level: Optional[PriorityLevel] = None
special_instructions: Optional[str] = None
delivery_preferences: Optional[Dict[str, Any]] = None
product_preferences: Optional[Dict[str, Any]] = None
class Config:
from_attributes = True
use_enum_values = True
class CustomerResponse(CustomerBase):
id: UUID
tenant_id: UUID
customer_code: str
total_orders: int
total_spent: Decimal
average_order_value: Decimal
last_order_date: Optional[datetime]
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True
# ===== Order Item Schemas =====
class OrderItemBase(BaseModel):
product_id: UUID
product_name: str = Field(..., min_length=1, max_length=200)
product_sku: Optional[str] = Field(None, max_length=100)
product_category: Optional[str] = Field(None, max_length=100)
quantity: Decimal = Field(..., gt=0)
unit_of_measure: str = Field(default="each", max_length=50)
weight: Optional[Decimal] = Field(None, ge=0)
unit_price: Decimal = Field(..., ge=0)
line_discount: Decimal = Field(default=Decimal("0.00"), ge=0)
product_specifications: Optional[Dict[str, Any]] = None
customization_details: Optional[str] = None
special_instructions: Optional[str] = None
recipe_id: Optional[UUID] = None
class OrderItemCreate(OrderItemBase):
pass
class OrderItemUpdate(BaseModel):
quantity: Optional[Decimal] = Field(None, gt=0)
unit_price: Optional[Decimal] = Field(None, ge=0)
line_discount: Optional[Decimal] = Field(None, ge=0)
product_specifications: Optional[Dict[str, Any]] = None
customization_details: Optional[str] = None
special_instructions: Optional[str] = None
class OrderItemResponse(OrderItemBase):
id: UUID
order_id: UUID
line_total: Decimal
status: str
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True
# ===== Order Schemas =====
class OrderBase(BaseModel):
customer_id: UUID
order_type: OrderType = Field(default=OrderType.STANDARD)
priority: PriorityLevel = Field(default=PriorityLevel.NORMAL)
requested_delivery_date: datetime
delivery_method: DeliveryMethod = Field(default=DeliveryMethod.DELIVERY)
delivery_address: Optional[Dict[str, Any]] = None
delivery_instructions: Optional[str] = None
delivery_window_start: Optional[datetime] = None
delivery_window_end: Optional[datetime] = None
discount_percentage: Decimal = Field(default=Decimal("0.00"), ge=0, le=100)
delivery_fee: Decimal = Field(default=Decimal("0.00"), ge=0)
payment_method: Optional[PaymentMethod] = None
payment_terms: PaymentTerms = Field(default=PaymentTerms.IMMEDIATE)
special_instructions: Optional[str] = None
custom_requirements: Optional[Dict[str, Any]] = None
allergen_warnings: Optional[Dict[str, Any]] = None
order_source: OrderSource = Field(default=OrderSource.MANUAL)
sales_channel: SalesChannel = Field(default=SalesChannel.DIRECT)
order_origin: Optional[str] = Field(None, max_length=100)
communication_preferences: Optional[Dict[str, Any]] = None
class Config:
from_attributes = True
use_enum_values = True
class OrderCreate(OrderBase):
tenant_id: UUID
items: List[OrderItemCreate] = Field(..., min_items=1)
class OrderUpdate(BaseModel):
status: Optional[OrderStatus] = None
priority: Optional[PriorityLevel] = None
requested_delivery_date: Optional[datetime] = None
confirmed_delivery_date: Optional[datetime] = None
delivery_method: Optional[DeliveryMethod] = None
delivery_address: Optional[Dict[str, Any]] = None
delivery_instructions: Optional[str] = None
delivery_window_start: Optional[datetime] = None
delivery_window_end: Optional[datetime] = None
payment_method: Optional[PaymentMethod] = None
payment_status: Optional[PaymentStatus] = None
special_instructions: Optional[str] = None
custom_requirements: Optional[Dict[str, Any]] = None
allergen_warnings: Optional[Dict[str, Any]] = None
class Config:
from_attributes = True
use_enum_values = True
class OrderResponse(OrderBase):
id: UUID
tenant_id: UUID
order_number: str
status: str
order_date: datetime
confirmed_delivery_date: Optional[datetime]
actual_delivery_date: Optional[datetime]
subtotal: Decimal
discount_amount: Decimal
tax_amount: Decimal
total_amount: Decimal
payment_status: str
business_model: Optional[str]
estimated_business_model: Optional[str]
production_batch_id: Optional[UUID]
quality_score: Optional[Decimal]
customer_rating: Optional[int]
created_at: datetime
updated_at: datetime
items: List[OrderItemResponse] = []
class Config:
from_attributes = True
# ===== Dashboard and Analytics Schemas =====
class OrdersDashboardSummary(BaseModel):
"""Summary data for orders dashboard"""
# Current period metrics
total_orders_today: int
total_orders_this_week: int
total_orders_this_month: int
# Revenue metrics
revenue_today: Decimal
revenue_this_week: Decimal
revenue_this_month: Decimal
# Order status breakdown
pending_orders: int
confirmed_orders: int
in_production_orders: int
ready_orders: int
delivered_orders: int
# Customer metrics
total_customers: int
new_customers_this_month: int
repeat_customers_rate: Decimal
# Performance metrics
average_order_value: Decimal
order_fulfillment_rate: Decimal
on_time_delivery_rate: Decimal
# Business model detection
business_model: Optional[str]
business_model_confidence: Optional[Decimal]
# Recent activity
recent_orders: List[OrderResponse]
high_priority_orders: List[OrderResponse]
class DemandRequirements(BaseModel):
"""Demand requirements for production planning"""
date: date
tenant_id: UUID
# Product demand breakdown
product_demands: List[Dict[str, Any]]
# Aggregate metrics
total_orders: int
total_quantity: Decimal
total_value: Decimal
# Business context
business_model: Optional[str]
rush_orders_count: int
special_requirements: List[str]
# Timing requirements
earliest_delivery: datetime
latest_delivery: datetime
average_lead_time_hours: int
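# ----------------------------------------------------------------
# Construction sketch (illustrative only, assuming this module imports cleanly):
# how a caller might build an OrderCreate payload. The UUIDs, product, prices,
# and dates are made-up sample values, not fixtures from this repository.
if __name__ == "__main__":
    from uuid import uuid4

    sample_order = OrderCreate(
        tenant_id=uuid4(),
        customer_id=uuid4(),
        requested_delivery_date=datetime(2026, 2, 1, 8, 0),
        items=[
            OrderItemCreate(
                product_id=uuid4(),
                product_name="Sourdough loaf",
                quantity=Decimal("12"),
                unit_price=Decimal("3.50"),
            )
        ],
    )
    print(sample_order)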


@@ -0,0 +1,232 @@
# services/orders/app/services/approval_rules_service.py
"""
Approval Rules Service - Smart auto-approval logic for purchase orders
Evaluates POs against configurable business rules to determine if auto-approval is appropriate
"""
from typing import Dict, List, Any, Optional, Tuple
from decimal import Decimal
from uuid import UUID
import structlog
from shared.config.base import BaseServiceSettings
from shared.utils.tenant_settings_client import TenantSettingsClient
logger = structlog.get_logger()
class ApprovalRulesService:
"""
Service for evaluating purchase orders against approval rules
Implements smart auto-approval logic based on multiple criteria
Uses tenant-specific settings from the database instead of system-level config
"""
def __init__(self, config: BaseServiceSettings, tenant_id: UUID):
self.config = config
self.tenant_id = tenant_id
# Initialize tenant settings client
tenant_service_url = getattr(config, 'TENANT_SERVICE_URL', 'http://tenant-service:8000')
self.tenant_settings_client = TenantSettingsClient(tenant_service_url=tenant_service_url)
async def evaluate_po_for_auto_approval(
self,
po_data: Dict[str, Any],
supplier_data: Optional[Dict[str, Any]] = None,
requirements_data: Optional[List[Dict[str, Any]]] = None
) -> Tuple[bool, List[str]]:
"""
Evaluate if a PO should be auto-approved using tenant-specific settings
Returns:
Tuple of (should_auto_approve, reasons)
"""
# Fetch tenant-specific procurement settings
try:
tenant_settings = await self.tenant_settings_client.get_procurement_settings(self.tenant_id)
except Exception as e:
logger.error("Failed to fetch tenant settings, using safe defaults",
tenant_id=str(self.tenant_id),
error=str(e))
# Use safe defaults if settings unavailable
tenant_settings = {
'auto_approve_enabled': False,
'auto_approve_threshold_eur': 500.0,
'auto_approve_min_supplier_score': 0.80,
'require_approval_new_suppliers': True,
'require_approval_critical_items': True
}
        # Check if auto-approval is enabled for this tenant (treat a missing flag as disabled)
        if not tenant_settings.get('auto_approve_enabled', False):
return False, ["Auto-approval is disabled in tenant settings"]
reasons = []
should_approve = True
# Rule 1: Amount threshold check
total_amount = self._calculate_po_total(po_data)
threshold = Decimal(str(tenant_settings.get('auto_approve_threshold_eur', 500.0)))
if total_amount > threshold:
should_approve = False
reasons.append(
f"PO amount €{total_amount:.2f} exceeds threshold €{threshold:.2f}"
)
else:
reasons.append(
f"PO amount €{total_amount:.2f} within threshold €{threshold:.2f}"
)
# Rule 2: Supplier trust score check
min_supplier_score = tenant_settings.get('auto_approve_min_supplier_score', 0.80)
if supplier_data:
supplier_score = supplier_data.get('trust_score', 0.0)
is_preferred = supplier_data.get('is_preferred_supplier', False)
auto_approve_enabled = supplier_data.get('auto_approve_enabled', False)
if supplier_score < min_supplier_score:
should_approve = False
reasons.append(
f"Supplier trust score {supplier_score:.2f} below minimum {min_supplier_score:.2f}"
)
else:
reasons.append(f"Supplier trust score {supplier_score:.2f} meets minimum requirements")
if not is_preferred:
should_approve = False
reasons.append("Supplier is not marked as preferred")
else:
reasons.append("Supplier is a preferred supplier")
if not auto_approve_enabled:
should_approve = False
reasons.append("Auto-approve is disabled for this supplier")
else:
reasons.append("Auto-approve is enabled for this supplier")
elif supplier_data is None:
should_approve = False
reasons.append("No supplier data available")
# Rule 3: New supplier check
require_approval_new_suppliers = tenant_settings.get('require_approval_new_suppliers', True)
if supplier_data and require_approval_new_suppliers:
total_pos = supplier_data.get('total_pos_count', 0)
if total_pos < 5:
should_approve = False
reasons.append(f"New supplier with only {total_pos} previous orders (minimum 5 required)")
else:
reasons.append(f"Established supplier with {total_pos} previous orders")
# Rule 4: Critical/urgent items check
require_approval_critical_items = tenant_settings.get('require_approval_critical_items', True)
if requirements_data and require_approval_critical_items:
critical_count = sum(
1 for req in requirements_data
if req.get('priority') in ['critical', 'urgent', 'CRITICAL', 'URGENT']
)
if critical_count > 0:
should_approve = False
reasons.append(f"Contains {critical_count} critical/urgent items requiring manual review")
else:
reasons.append("No critical/urgent items detected")
# Rule 5: Historical approval rate check
if supplier_data:
total_pos = supplier_data.get('total_pos_count', 0)
approved_pos = supplier_data.get('approved_pos_count', 0)
if total_pos > 0:
approval_rate = approved_pos / total_pos
if approval_rate < 0.95:
should_approve = False
reasons.append(
f"Historical approval rate {approval_rate:.1%} below 95% threshold"
)
else:
reasons.append(f"High historical approval rate {approval_rate:.1%}")
# Rule 6: PO priority check
priority = po_data.get('priority', 'normal')
if priority in ['urgent', 'critical', 'URGENT', 'CRITICAL']:
should_approve = False
reasons.append(f"PO priority is '{priority}' - requires manual review")
logger.info(
"PO auto-approval evaluation completed",
should_auto_approve=should_approve,
total_amount=float(total_amount),
supplier_id=supplier_data.get('id') if supplier_data else None,
reasons_count=len(reasons),
            po_id=po_data.get('id') if isinstance(po_data, dict) else str(po_data)
)
return should_approve, reasons
def _calculate_po_total(self, po_data: Dict[str, Any]) -> Decimal:
"""Calculate total PO amount including tax and shipping"""
subtotal = Decimal(str(po_data.get('subtotal', 0)))
tax = Decimal(str(po_data.get('tax_amount', 0)))
shipping = Decimal(str(po_data.get('shipping_cost', 0)))
discount = Decimal(str(po_data.get('discount_amount', 0)))
total = subtotal + tax + shipping - discount
return total
def get_approval_summary(
self,
should_approve: bool,
reasons: List[str]
) -> Dict[str, Any]:
"""
Generate a human-readable approval summary
Returns:
Dict with summary data for UI display
"""
return {
"auto_approved": should_approve,
"decision": "APPROVED" if should_approve else "REQUIRES_MANUAL_APPROVAL",
"reasons": reasons,
"reason_count": len(reasons),
"summary": self._format_summary(should_approve, reasons)
}
def _format_summary(self, should_approve: bool, reasons: List[str]) -> str:
"""Format approval decision summary"""
if should_approve:
return f"Auto-approved: {', '.join(reasons[:2])}"
else:
failing_reasons = [r for r in reasons if any(
keyword in r.lower()
for keyword in ['exceeds', 'below', 'not', 'disabled', 'new', 'critical']
)]
if failing_reasons:
return f"Manual approval required: {failing_reasons[0]}"
return "Manual approval required"
def validate_approval_override(
self,
override_reason: str,
user_role: str
) -> Tuple[bool, Optional[str]]:
"""
Validate if a user can override auto-approval decision
Returns:
Tuple of (is_valid, error_message)
"""
# Only admin/owner can override
if user_role not in ['admin', 'owner']:
return False, "Insufficient permissions to override approval rules"
# Require a reason
if not override_reason or len(override_reason.strip()) < 10:
return False, "Override reason must be at least 10 characters"
return True, None
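# ----------------------------------------------------------------
# Usage sketch (illustrative only): the procurement workflow would call the
# evaluator with the PO payload plus supplier statistics, then surface the
# summary to the UI. The config, IDs, and amounts here are placeholders.
#
#   rules = ApprovalRulesService(config=settings, tenant_id=tenant_id)
#   should_approve, reasons = await rules.evaluate_po_for_auto_approval(
#       po_data={"id": po_id, "subtotal": 420.00, "tax_amount": 33.60, "priority": "normal"},
#       supplier_data={"trust_score": 0.92, "is_preferred_supplier": True,
#                      "auto_approve_enabled": True, "total_pos_count": 18,
#                      "approved_pos_count": 18},
#       requirements_data=[{"priority": "normal"}],
#   )
#   summary = rules.get_approval_summary(should_approve, reasons)
# ----------------------------------------------------------------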


@@ -0,0 +1,490 @@
# ================================================================
# services/orders/app/services/orders_service.py
# ================================================================
"""
Orders Service - Main business logic service
"""
import uuid
from datetime import datetime, date, timedelta
from decimal import Decimal
from typing import List, Optional, Dict, Any
from uuid import UUID
import structlog
from shared.clients import (
InventoryServiceClient,
ProductionServiceClient,
SalesServiceClient
)
from shared.database.transactions import transactional
from app.core.config import settings
from app.repositories.order_repository import (
OrderRepository,
CustomerRepository,
OrderItemRepository,
OrderStatusHistoryRepository
)
from app.schemas.order_schemas import (
OrderCreate,
OrderUpdate,
OrderResponse,
CustomerCreate,
CustomerUpdate,
DemandRequirements,
OrdersDashboardSummary
)
logger = structlog.get_logger()
class OrdersService:
"""Main service for orders operations"""
def __init__(
self,
order_repo: OrderRepository,
customer_repo: CustomerRepository,
order_item_repo: OrderItemRepository,
status_history_repo: OrderStatusHistoryRepository,
inventory_client: InventoryServiceClient,
production_client: ProductionServiceClient,
sales_client: SalesServiceClient,
):
self.order_repo = order_repo
self.customer_repo = customer_repo
self.order_item_repo = order_item_repo
self.status_history_repo = status_history_repo
self.inventory_client = inventory_client
self.production_client = production_client
self.sales_client = sales_client
async def create_order(
self,
db,
order_data: OrderCreate,
user_id: Optional[UUID] = None
) -> OrderResponse:
"""Create a new customer order with comprehensive processing"""
try:
logger.info("Creating new order",
customer_id=str(order_data.customer_id),
tenant_id=str(order_data.tenant_id))
# 1. Validate customer exists
customer = await self.customer_repo.get(
db,
order_data.customer_id,
order_data.tenant_id
)
if not customer:
raise ValueError(f"Customer {order_data.customer_id} not found")
# 2. Generate order number
order_number = await self._generate_order_number(db, order_data.tenant_id)
# 3. Calculate order totals
subtotal = sum(item.quantity * item.unit_price - item.line_discount
for item in order_data.items)
discount_amount = subtotal * (order_data.discount_percentage / 100)
            tax_amount = (subtotal - discount_amount) * Decimal("0.08")  # Fixed 8% tax rate; not yet tenant-configurable
total_amount = subtotal - discount_amount + tax_amount + order_data.delivery_fee
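            # Worked example (with the fixed 8% rate above): items totalling 100.00 with a
            # 10% order discount -> subtotal 100.00, discount 10.00, tax 7.20,
            # plus a 5.00 delivery fee -> total 102.20.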
# 4. Create order record
order_dict = order_data.dict(exclude={"items"})
order_dict.update({
"order_number": order_number,
"subtotal": subtotal,
"discount_amount": discount_amount,
"tax_amount": tax_amount,
"total_amount": total_amount,
"status": "pending"
})
order = await self.order_repo.create(db, obj_in=order_dict, created_by=user_id)
# 5. Create order items
for item_data in order_data.items:
item_dict = item_data.dict()
item_dict.update({
"order_id": order.id,
"line_total": item_data.quantity * item_data.unit_price - item_data.line_discount
})
await self.order_item_repo.create(db, obj_in=item_dict)
# 6. Create initial status history
await self.status_history_repo.create_status_change(
db=db,
order_id=order.id,
from_status=None,
to_status="pending",
change_reason="Order created",
changed_by=user_id
)
# 7. Update customer metrics
await self.customer_repo.update_customer_metrics(
db, order.customer_id, total_amount, order.order_date
)
# 8. Business model detection
business_model = await self.detect_business_model(db, order_data.tenant_id)
if business_model:
order.business_model = business_model
            # 9. Integrate with production service if auto-processing is enabled
if settings.ORDER_PROCESSING_ENABLED:
await self._notify_production_service(order)
logger.info("Order created successfully",
order_id=str(order.id),
order_number=order_number,
total_amount=str(total_amount))
# Return order with items loaded
return await self.get_order_with_items(db, order.id, order_data.tenant_id)
except Exception as e:
logger.error("Error creating order", error=str(e))
raise
async def get_order_with_items(
self,
db,
order_id: UUID,
tenant_id: UUID
) -> Optional[OrderResponse]:
"""Get order with all related data"""
try:
order = await self.order_repo.get_with_items(db, order_id, tenant_id)
if not order:
return None
return OrderResponse.from_orm(order)
except Exception as e:
logger.error("Error getting order with items",
order_id=str(order_id),
error=str(e))
raise
async def update_order_status(
self,
db,
order_id: UUID,
tenant_id: UUID,
new_status: str,
user_id: Optional[UUID] = None,
reason: Optional[str] = None
) -> Optional[OrderResponse]:
"""Update order status with proper tracking"""
try:
order = await self.order_repo.get(db, order_id, tenant_id)
if not order:
return None
old_status = order.status
# Update order status
order.status = new_status
if new_status == "confirmed":
order.confirmed_delivery_date = order.requested_delivery_date
elif new_status == "delivered":
order.actual_delivery_date = datetime.now()
# Record status change
await self.status_history_repo.create_status_change(
db=db,
order_id=order_id,
from_status=old_status,
to_status=new_status,
change_reason=reason,
changed_by=user_id
)
# Customer notifications
await self._send_status_notification(order, old_status, new_status)
logger.info("Order status updated",
order_id=str(order_id),
old_status=old_status,
new_status=new_status)
return await self.get_order_with_items(db, order_id, tenant_id)
except Exception as e:
logger.error("Error updating order status",
order_id=str(order_id),
error=str(e))
raise
async def get_demand_requirements(
self,
db,
tenant_id: UUID,
target_date: date
) -> DemandRequirements:
"""Get demand requirements for production planning"""
try:
logger.info("Calculating demand requirements",
tenant_id=str(tenant_id),
target_date=str(target_date))
# Get orders for target date
orders = await self.order_repo.get_pending_orders_by_delivery_date(
db, tenant_id, target_date
)
# Aggregate product demands
product_demands = {}
total_orders = len(orders)
total_quantity = Decimal("0")
total_value = Decimal("0")
rush_orders_count = 0
special_requirements = []
earliest_delivery = None
latest_delivery = None
for order in orders:
total_value += order.total_amount
if order.order_type == "rush":
rush_orders_count += 1
if order.special_instructions:
special_requirements.append(order.special_instructions)
# Track delivery timing
if not earliest_delivery or order.requested_delivery_date < earliest_delivery:
earliest_delivery = order.requested_delivery_date
if not latest_delivery or order.requested_delivery_date > latest_delivery:
latest_delivery = order.requested_delivery_date
# Aggregate product demands
for item in order.items:
product_id = str(item.product_id)
if product_id not in product_demands:
product_demands[product_id] = {
"product_id": product_id,
"product_name": item.product_name,
"total_quantity": Decimal("0"),
"unit_of_measure": item.unit_of_measure,
"orders_count": 0,
"rush_quantity": Decimal("0"),
"special_requirements": []
}
product_demands[product_id]["total_quantity"] += item.quantity
product_demands[product_id]["orders_count"] += 1
total_quantity += item.quantity
if order.order_type == "rush":
product_demands[product_id]["rush_quantity"] += item.quantity
if item.special_instructions:
product_demands[product_id]["special_requirements"].append(
item.special_instructions
)
# Calculate average lead time
average_lead_time_hours = 24 # Default
if earliest_delivery and latest_delivery:
time_diff = latest_delivery - earliest_delivery
average_lead_time_hours = max(24, int(time_diff.total_seconds() / 3600))
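            # NOTE: this "lead time" is really the span between the earliest and latest
            # requested delivery on the target date (minimum 24h), used as a rough
            # planning window rather than a true per-order lead time.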
# Detect business model
business_model = await self.detect_business_model(db, tenant_id)
return DemandRequirements(
date=target_date,
tenant_id=tenant_id,
product_demands=list(product_demands.values()),
total_orders=total_orders,
total_quantity=total_quantity,
total_value=total_value,
business_model=business_model,
rush_orders_count=rush_orders_count,
special_requirements=list(set(special_requirements)),
earliest_delivery=earliest_delivery or datetime.combine(target_date, datetime.min.time()),
latest_delivery=latest_delivery or datetime.combine(target_date, datetime.max.time()),
average_lead_time_hours=average_lead_time_hours
)
except Exception as e:
logger.error("Error calculating demand requirements",
tenant_id=str(tenant_id),
error=str(e))
raise
async def get_dashboard_summary(
self,
db,
tenant_id: UUID
) -> OrdersDashboardSummary:
"""Get dashboard summary data"""
try:
# Get basic metrics
metrics = await self.order_repo.get_dashboard_metrics(db, tenant_id)
# Get customer counts
total_customers = await self.customer_repo.count(
db, tenant_id, filters={"is_active": True}
)
# Get new customers this month
month_start = datetime.now().replace(day=1, hour=0, minute=0, second=0, microsecond=0)
new_customers_this_month = await self.customer_repo.count_created_since(
db,
tenant_id,
month_start
)
# Get recent orders
recent_orders = await self.order_repo.get_multi(
db, tenant_id, limit=5, order_by="order_date", order_desc=True
)
# Get high priority orders
high_priority_orders = await self.order_repo.get_multi(
db,
tenant_id,
filters={"priority": "high", "status": ["pending", "confirmed", "in_production"]},
limit=10
)
# Detect business model
business_model = await self.detect_business_model(db, tenant_id)
            # Performance metrics are calculated in the repository layer
            order_fulfillment_rate = metrics.get("fulfillment_rate", Decimal("0.0"))
            on_time_delivery_rate = metrics.get("on_time_delivery_rate", Decimal("0.0"))
            repeat_customers_rate = metrics.get("repeat_customers_rate", Decimal("0.0"))
return OrdersDashboardSummary(
total_orders_today=metrics["total_orders_today"],
total_orders_this_week=metrics["total_orders_this_week"],
total_orders_this_month=metrics["total_orders_this_month"],
revenue_today=metrics["revenue_today"],
revenue_this_week=metrics["revenue_this_week"],
revenue_this_month=metrics["revenue_this_month"],
pending_orders=metrics["status_breakdown"].get("pending", 0),
confirmed_orders=metrics["status_breakdown"].get("confirmed", 0),
in_production_orders=metrics["status_breakdown"].get("in_production", 0),
ready_orders=metrics["status_breakdown"].get("ready", 0),
delivered_orders=metrics["status_breakdown"].get("delivered", 0),
total_customers=total_customers,
new_customers_this_month=new_customers_this_month,
                repeat_customers_rate=repeat_customers_rate,
                average_order_value=metrics["average_order_value"],
                order_fulfillment_rate=order_fulfillment_rate,
                on_time_delivery_rate=on_time_delivery_rate,
business_model=business_model,
business_model_confidence=Decimal("85.0") if business_model else None,
recent_orders=[OrderResponse.from_orm(order) for order in recent_orders],
high_priority_orders=[OrderResponse.from_orm(order) for order in high_priority_orders]
)
except Exception as e:
logger.error("Error getting dashboard summary", error=str(e))
raise
async def detect_business_model(
self,
db,
tenant_id: UUID
) -> Optional[str]:
"""Detect business model based on order patterns"""
try:
if not settings.ENABLE_BUSINESS_MODEL_DETECTION:
return None
return await self.order_repo.detect_business_model(db, tenant_id)
except Exception as e:
logger.error("Error detecting business model", error=str(e))
return None
# ===== Private Helper Methods =====
async def _generate_order_number(self, db, tenant_id: UUID) -> str:
"""Generate unique order number"""
try:
# Simple format: ORD-YYYYMMDD-XXXX
today = datetime.now()
date_part = today.strftime("%Y%m%d")
# Get count of orders today for this tenant
today_start = today.replace(hour=0, minute=0, second=0, microsecond=0)
today_end = today.replace(hour=23, minute=59, second=59, microsecond=999999)
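            # NOTE: BaseRepository.count applies equality (or IN) filters only, so the
            # range-style dict below is not interpreted as a date range; if the query
            # fails, the UUID fallback in the except block still yields a unique number.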
count = await self.order_repo.count(
db,
tenant_id,
filters={
"order_date": {"gte": today_start, "lte": today_end}
}
)
sequence = count + 1
return f"ORD-{date_part}-{sequence:04d}"
except Exception as e:
logger.error("Error generating order number", error=str(e))
# Fallback to UUID
return f"ORD-{uuid.uuid4().hex[:8].upper()}"
async def _notify_production_service(self, order):
"""Notify production service of new order"""
try:
if self.production_client:
await self.production_client.notify_new_order(
str(order.tenant_id),
{
"order_id": str(order.id),
"order_number": order.order_number,
"delivery_date": order.requested_delivery_date.isoformat(),
"priority": order.priority,
"items": [
{
"product_id": str(item.product_id),
"quantity": float(item.quantity),
"unit_of_measure": item.unit_of_measure
}
for item in order.items
]
}
)
except Exception as e:
logger.warning("Failed to notify production service",
order_id=str(order.id),
error=str(e))
async def _send_status_notification(self, order, old_status: str, new_status: str):
"""Send customer notification for status change"""
try:
            # A notification client is not injected in __init__, so resolve it
            # defensively; if none is configured, skip the customer notification.
            notification_client = getattr(self, "notification_client", None)
            if notification_client and order.customer:
                message = f"Order {order.order_number} status changed from {old_status} to {new_status}"
                await notification_client.send_notification(
tenant_id=str(order.tenant_id),
notification_type="email",
message=message,
recipient_email=order.customer.email,
subject=f"Order {order.order_number} Status Update",
priority="normal",
metadata={
"order_id": str(order.id),
"order_number": order.order_number,
"old_status": old_status,
"new_status": new_status
}
)
except Exception as e:
logger.warning("Failed to send status notification",
order_id=str(order.id),
error=str(e))
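# ----------------------------------------------------------------
# Wiring sketch (illustrative only): the API layer is expected to build the
# service from its repositories and shared clients, then drive it inside a
# database session. Names such as `get_db_session` are placeholders.
#
#   service = OrdersService(
#       order_repo=OrderRepository(),
#       customer_repo=CustomerRepository(),
#       order_item_repo=OrderItemRepository(),
#       status_history_repo=OrderStatusHistoryRepository(),
#       inventory_client=inventory_client,
#       production_client=production_client,
#       sales_client=sales_client,
#   )
#   async with get_db_session() as db:
#       order = await service.create_order(db, order_data, user_id=current_user.id)
#       await service.update_order_status(db, order.id, order.tenant_id, "confirmed")
# ----------------------------------------------------------------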


@@ -0,0 +1,251 @@
# services/orders/app/services/procurement_notification_service.py
"""
Procurement Notification Service - Send alerts and notifications for procurement events using EventPublisher
Handles PO approval notifications, reminders, escalations, and summaries
"""
from typing import Dict, List, Any, Optional
from uuid import UUID
from datetime import datetime, timezone
import structlog
from shared.config.base import BaseServiceSettings
from shared.messaging import UnifiedEventPublisher
logger = structlog.get_logger()
class ProcurementNotificationService:
"""Service for sending procurement-related notifications and alerts using EventPublisher"""
def __init__(self, event_publisher: UnifiedEventPublisher):
self.publisher = event_publisher
async def send_pos_pending_approval_alert(
self,
tenant_id: UUID,
pos_data: List[Dict[str, Any]]
):
"""
Send alert when new POs are created and need approval
Groups POs and sends a summary notification
"""
try:
if not pos_data:
return
# Calculate totals
total_amount = sum(float(po.get('total_amount', 0)) for po in pos_data)
critical_count = sum(1 for po in pos_data if po.get('priority') in ['high', 'critical', 'urgent'])
            # Determine severity based on amount and urgency
            # (check the highest threshold first so "urgent" is actually reachable)
            severity = "medium"
            if total_amount > 10000:
                severity = "urgent"
            elif critical_count > 0 or total_amount > 5000:
                severity = "high"
metadata = {
"tenant_id": str(tenant_id),
"pos_count": len(pos_data),
"total_amount": total_amount,
"critical_count": critical_count,
"pos": [
{
"po_id": po.get("po_id"),
"po_number": po.get("po_number"),
"supplier_id": po.get("supplier_id"),
"total_amount": po.get("total_amount"),
"auto_approved": po.get("auto_approved", False)
}
for po in pos_data
],
"action_required": True,
"action_url": "/app/comprar"
}
await self.publisher.publish_alert(
event_type="procurement.pos_pending_approval",
tenant_id=tenant_id,
severity=severity,
data=metadata
)
logger.info("POs pending approval alert sent",
tenant_id=str(tenant_id),
pos_count=len(pos_data),
total_amount=total_amount)
except Exception as e:
logger.error("Error sending POs pending approval alert",
tenant_id=str(tenant_id),
error=str(e))
async def send_approval_reminder(
self,
tenant_id: UUID,
po_data: Dict[str, Any],
hours_pending: int
):
"""
Send reminder for POs that haven't been approved within threshold
"""
try:
# Determine severity based on pending hours
severity = "medium" if hours_pending < 36 else "high"
metadata = {
"tenant_id": str(tenant_id),
"po_id": po_data.get("po_id"),
"po_number": po_data.get("po_number"),
"supplier_name": po_data.get("supplier_name"),
"total_amount": po_data.get("total_amount"),
"hours_pending": hours_pending,
"created_at": po_data.get("created_at"),
"action_required": True,
"action_url": f"/app/comprar?po={po_data.get('po_id')}"
}
await self.publisher.publish_alert(
event_type="procurement.approval_reminder",
tenant_id=tenant_id,
severity=severity,
data=metadata
)
logger.info("Approval reminder sent",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
hours_pending=hours_pending)
except Exception as e:
logger.error("Error sending approval reminder",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
error=str(e))
async def send_critical_po_escalation(
self,
tenant_id: UUID,
po_data: Dict[str, Any],
hours_pending: int
):
"""
Send escalation alert for critical/urgent POs not approved in time
"""
try:
metadata = {
"tenant_id": str(tenant_id),
"po_id": po_data.get("po_id"),
"po_number": po_data.get("po_number"),
"supplier_name": po_data.get("supplier_name"),
"total_amount": po_data.get("total_amount"),
"priority": po_data.get("priority"),
"required_delivery_date": po_data.get("required_delivery_date"),
"hours_pending": hours_pending,
"escalated": True,
"action_required": True,
"action_url": f"/app/comprar?po={po_data.get('po_id')}"
}
await self.publisher.publish_alert(
event_type="procurement.critical_po_escalation",
tenant_id=tenant_id,
severity="urgent",
data=metadata
)
logger.warning("Critical PO escalation sent",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
hours_pending=hours_pending)
except Exception as e:
logger.error("Error sending critical PO escalation",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
error=str(e))
async def send_auto_approval_summary(
self,
tenant_id: UUID,
summary_data: Dict[str, Any]
):
"""
Send daily summary of auto-approved POs
"""
try:
auto_approved_count = summary_data.get("auto_approved_count", 0)
total_amount = summary_data.get("total_auto_approved_amount", 0)
manual_approval_count = summary_data.get("manual_approval_count", 0)
if auto_approved_count == 0 and manual_approval_count == 0:
# No activity, skip notification
return
metadata = {
"tenant_id": str(tenant_id),
"auto_approved_count": auto_approved_count,
"total_auto_approved_amount": total_amount,
"manual_approval_count": manual_approval_count,
"summary_date": summary_data.get("date"),
"auto_approved_pos": summary_data.get("auto_approved_pos", []),
"pending_approval_pos": summary_data.get("pending_approval_pos", []),
"action_url": "/app/comprar"
}
await self.publisher.publish_notification(
event_type="procurement.auto_approval_summary",
tenant_id=tenant_id,
data=metadata
)
logger.info("Auto-approval summary sent",
tenant_id=str(tenant_id),
auto_approved_count=auto_approved_count,
manual_count=manual_approval_count)
except Exception as e:
logger.error("Error sending auto-approval summary",
tenant_id=str(tenant_id),
error=str(e))
async def send_po_approved_confirmation(
self,
tenant_id: UUID,
po_data: Dict[str, Any],
approved_by: str,
auto_approved: bool = False
):
"""
Send confirmation when a PO is approved
"""
try:
metadata = {
"tenant_id": str(tenant_id),
"po_id": po_data.get("po_id"),
"po_number": po_data.get("po_number"),
"supplier_name": po_data.get("supplier_name"),
"total_amount": po_data.get("total_amount"),
"approved_by": approved_by,
"auto_approved": auto_approved,
"approved_at": datetime.now(timezone.utc).isoformat(),
"action_url": f"/app/comprar?po={po_data.get('po_id')}"
}
await self.publisher.publish_notification(
event_type="procurement.po_approved_confirmation",
tenant_id=tenant_id,
data=metadata
)
logger.info("PO approved confirmation sent",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
auto_approved=auto_approved)
except Exception as e:
logger.error("Error sending PO approved confirmation",
tenant_id=str(tenant_id),
po_id=po_data.get("po_id"),
error=str(e))
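# ----------------------------------------------------------------
# Usage sketch (illustrative only): callers are expected to inject an already
# configured UnifiedEventPublisher; the payload fields below are sample values.
#
#   notifier = ProcurementNotificationService(event_publisher=publisher)
#   await notifier.send_pos_pending_approval_alert(
#       tenant_id=tenant_id,
#       pos_data=[{"po_id": "...", "po_number": "PO-2026-0042",
#                  "supplier_id": "...", "total_amount": 1250.00,
#                  "priority": "normal", "auto_approved": False}],
#   )
# ----------------------------------------------------------------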


@@ -0,0 +1,339 @@
# services/orders/app/services/smart_procurement_calculator.py
"""
Smart Procurement Calculator
Implements multi-constraint procurement quantity optimization combining:
- AI demand forecasting
- Ingredient reorder rules (reorder_point, reorder_quantity)
- Supplier constraints (minimum_order_quantity, minimum_order_amount)
- Storage limits (max_stock_level)
- Price tier optimization
"""
import math
from decimal import Decimal
from typing import Dict, Any, List, Tuple, Optional
import structlog
logger = structlog.get_logger()
class SmartProcurementCalculator:
"""
Smart procurement quantity calculator with multi-tier constraint optimization
"""
def __init__(self, procurement_settings: Dict[str, Any]):
"""
Initialize calculator with tenant procurement settings
Args:
procurement_settings: Tenant settings dict with flags:
- use_reorder_rules: bool
- economic_rounding: bool
- respect_storage_limits: bool
- use_supplier_minimums: bool
- optimize_price_tiers: bool
"""
self.use_reorder_rules = procurement_settings.get('use_reorder_rules', True)
self.economic_rounding = procurement_settings.get('economic_rounding', True)
self.respect_storage_limits = procurement_settings.get('respect_storage_limits', True)
self.use_supplier_minimums = procurement_settings.get('use_supplier_minimums', True)
self.optimize_price_tiers = procurement_settings.get('optimize_price_tiers', True)
def calculate_procurement_quantity(
self,
ingredient: Dict[str, Any],
supplier: Optional[Dict[str, Any]],
price_list_entry: Optional[Dict[str, Any]],
ai_forecast_quantity: Decimal,
current_stock: Decimal,
safety_stock_percentage: Decimal = Decimal('20.0')
) -> Dict[str, Any]:
"""
Calculate optimal procurement quantity using smart hybrid approach
Args:
ingredient: Ingredient data with reorder_point, reorder_quantity, max_stock_level
supplier: Supplier data with minimum_order_amount
price_list_entry: Price list with minimum_order_quantity, tier_pricing
ai_forecast_quantity: AI-predicted demand quantity
current_stock: Current stock level
safety_stock_percentage: Safety stock buffer percentage
Returns:
Dict with:
- order_quantity: Final calculated quantity to order
- calculation_method: Method used (e.g., 'REORDER_POINT_TRIGGERED')
- ai_suggested_quantity: Original AI forecast
- adjusted_quantity: Final quantity after constraints
- adjustment_reason: Human-readable explanation
- warnings: List of warnings/notes
- supplier_minimum_applied: bool
- storage_limit_applied: bool
- reorder_rule_applied: bool
- price_tier_applied: Dict or None
"""
warnings = []
result = {
'ai_suggested_quantity': ai_forecast_quantity,
'supplier_minimum_applied': False,
'storage_limit_applied': False,
'reorder_rule_applied': False,
'price_tier_applied': None
}
# Extract ingredient parameters
reorder_point = Decimal(str(ingredient.get('reorder_point', 0)))
reorder_quantity = Decimal(str(ingredient.get('reorder_quantity', 0)))
low_stock_threshold = Decimal(str(ingredient.get('low_stock_threshold', 0)))
max_stock_level = Decimal(str(ingredient.get('max_stock_level') or 'Infinity'))
# Extract supplier/price list parameters
supplier_min_qty = Decimal('0')
supplier_min_amount = Decimal('0')
tier_pricing = []
if price_list_entry:
supplier_min_qty = Decimal(str(price_list_entry.get('minimum_order_quantity', 0)))
tier_pricing = price_list_entry.get('tier_pricing') or []
if supplier:
supplier_min_amount = Decimal(str(supplier.get('minimum_order_amount', 0)))
# Calculate AI-based net requirement with safety stock
safety_stock = ai_forecast_quantity * (safety_stock_percentage / Decimal('100'))
total_needed = ai_forecast_quantity + safety_stock
ai_net_requirement = max(Decimal('0'), total_needed - current_stock)
# TIER 1: Critical Safety Check (Emergency Override)
if self.use_reorder_rules and current_stock <= low_stock_threshold:
base_order = max(reorder_quantity, ai_net_requirement)
result['calculation_method'] = 'CRITICAL_STOCK_EMERGENCY'
result['reorder_rule_applied'] = True
warnings.append(f"CRITICAL: Stock ({current_stock}) below threshold ({low_stock_threshold})")
order_qty = base_order
# TIER 2: Reorder Point Triggered
elif self.use_reorder_rules and current_stock <= reorder_point:
base_order = max(reorder_quantity, ai_net_requirement)
result['calculation_method'] = 'REORDER_POINT_TRIGGERED'
result['reorder_rule_applied'] = True
warnings.append(f"Reorder point triggered: stock ({current_stock}) ≤ reorder point ({reorder_point})")
order_qty = base_order
# TIER 3: Forecast-Driven (Above reorder point, no immediate need)
elif ai_net_requirement > 0:
order_qty = ai_net_requirement
result['calculation_method'] = 'FORECAST_DRIVEN_PROACTIVE'
warnings.append(f"AI forecast suggests ordering {ai_net_requirement} units")
# TIER 4: No Order Needed
else:
result['order_quantity'] = Decimal('0')
result['adjusted_quantity'] = Decimal('0')
result['calculation_method'] = 'SUFFICIENT_STOCK'
result['adjustment_reason'] = f"Current stock ({current_stock}) is sufficient. No order needed."
result['warnings'] = warnings
return result
# Apply Economic Rounding (reorder_quantity multiples)
if self.economic_rounding and reorder_quantity > 0:
multiples = math.ceil(float(order_qty / reorder_quantity))
rounded_qty = Decimal(multiples) * reorder_quantity
if rounded_qty > order_qty:
warnings.append(f"Rounded to {multiples}× reorder quantity ({reorder_quantity}) = {rounded_qty}")
order_qty = rounded_qty
# Apply Supplier Minimum Quantity Constraint
if self.use_supplier_minimums and supplier_min_qty > 0:
if order_qty < supplier_min_qty:
warnings.append(f"Increased from {order_qty} to supplier minimum ({supplier_min_qty})")
order_qty = supplier_min_qty
result['supplier_minimum_applied'] = True
else:
# Round to multiples of minimum_order_quantity (packaging constraint)
multiples = math.ceil(float(order_qty / supplier_min_qty))
rounded_qty = Decimal(multiples) * supplier_min_qty
if rounded_qty > order_qty:
warnings.append(f"Rounded to {multiples}× supplier packaging ({supplier_min_qty}) = {rounded_qty}")
result['supplier_minimum_applied'] = True
order_qty = rounded_qty
# Apply Price Tier Optimization
if self.optimize_price_tiers and tier_pricing and price_list_entry:
unit_price = Decimal(str(price_list_entry.get('unit_price', 0)))
tier_result = self._optimize_price_tier(
order_qty,
unit_price,
tier_pricing,
current_stock,
max_stock_level
)
if tier_result['tier_applied']:
order_qty = tier_result['optimized_quantity']
result['price_tier_applied'] = tier_result['tier_info']
warnings.append(tier_result['message'])
# Apply Storage Capacity Constraint
if self.respect_storage_limits and max_stock_level != Decimal('Infinity'):
if (current_stock + order_qty) > max_stock_level:
capped_qty = max(Decimal('0'), max_stock_level - current_stock)
warnings.append(f"Capped from {order_qty} to {capped_qty} due to storage limit ({max_stock_level})")
order_qty = capped_qty
result['storage_limit_applied'] = True
result['calculation_method'] += '_STORAGE_LIMITED'
# Check supplier minimum_order_amount (total order value constraint)
if self.use_supplier_minimums and supplier_min_amount > 0 and price_list_entry:
unit_price = Decimal(str(price_list_entry.get('unit_price', 0)))
order_value = order_qty * unit_price
if order_value < supplier_min_amount:
warnings.append(
f"⚠️ Order value €{order_value:.2f} < supplier minimum €{supplier_min_amount:.2f}. "
"This item needs to be combined with other products in the same PO."
)
result['calculation_method'] += '_NEEDS_CONSOLIDATION'
# Build final result
result['order_quantity'] = order_qty
result['adjusted_quantity'] = order_qty
result['adjustment_reason'] = self._build_adjustment_reason(
ai_forecast_quantity,
ai_net_requirement,
order_qty,
warnings,
result
)
result['warnings'] = warnings
return result
def _optimize_price_tier(
self,
current_qty: Decimal,
base_unit_price: Decimal,
tier_pricing: List[Dict[str, Any]],
current_stock: Decimal,
max_stock_level: Decimal
) -> Dict[str, Any]:
"""
Optimize order quantity to capture volume discount tiers if beneficial
Args:
current_qty: Current calculated order quantity
base_unit_price: Base unit price without tiers
tier_pricing: List of tier dicts with 'quantity' and 'price'
current_stock: Current stock level
max_stock_level: Maximum storage capacity
Returns:
Dict with tier_applied (bool), optimized_quantity, tier_info, message
"""
if not tier_pricing:
return {'tier_applied': False, 'optimized_quantity': current_qty}
# Sort tiers by quantity
sorted_tiers = sorted(tier_pricing, key=lambda x: x['quantity'])
best_tier = None
best_savings = Decimal('0')
for tier in sorted_tiers:
tier_qty = Decimal(str(tier['quantity']))
tier_price = Decimal(str(tier['price']))
# Skip if tier quantity is below current quantity (already captured)
if tier_qty <= current_qty:
continue
# Skip if tier would exceed storage capacity
if self.respect_storage_limits and (current_stock + tier_qty) > max_stock_level:
continue
# Skip if tier is more than 50% above current quantity (too much excess)
if tier_qty > current_qty * Decimal('1.5'):
continue
# Calculate savings
current_cost = current_qty * base_unit_price
tier_cost = tier_qty * tier_price
savings = current_cost - tier_cost
if savings > best_savings:
best_savings = savings
best_tier = {
'quantity': tier_qty,
'price': tier_price,
'savings': savings
}
if best_tier:
return {
'tier_applied': True,
'optimized_quantity': best_tier['quantity'],
'tier_info': best_tier,
'message': (
f"Upgraded to {best_tier['quantity']} units "
f"@ €{best_tier['price']}/unit "
f"(saves €{best_tier['savings']:.2f})"
)
}
return {'tier_applied': False, 'optimized_quantity': current_qty}
def _build_adjustment_reason(
self,
ai_forecast: Decimal,
ai_net_requirement: Decimal,
final_quantity: Decimal,
warnings: List[str],
result: Dict[str, Any]
) -> str:
"""
Build human-readable explanation of quantity adjustments
Args:
ai_forecast: Original AI forecast
ai_net_requirement: AI forecast + safety stock - current stock
final_quantity: Final order quantity after all adjustments
warnings: List of warning messages
result: Calculation result dict
Returns:
Human-readable adjustment explanation
"""
parts = []
# Start with calculation method
method = result.get('calculation_method', 'UNKNOWN')
parts.append(f"Method: {method.replace('_', ' ').title()}")
# AI forecast base
parts.append(f"AI Forecast: {ai_forecast} units, Net Requirement: {ai_net_requirement} units")
# Adjustments applied
adjustments = []
if result.get('reorder_rule_applied'):
adjustments.append("reorder rules")
if result.get('supplier_minimum_applied'):
adjustments.append("supplier minimums")
if result.get('storage_limit_applied'):
adjustments.append("storage limits")
if result.get('price_tier_applied'):
adjustments.append("price tier optimization")
if adjustments:
parts.append(f"Adjustments: {', '.join(adjustments)}")
# Final quantity
parts.append(f"Final Quantity: {final_quantity} units")
# Key warnings
if warnings:
key_warnings = [w for w in warnings if '⚠️' in w or 'CRITICAL' in w or 'saves €' in w]
if key_warnings:
parts.append(f"Notes: {'; '.join(key_warnings)}")
return " | ".join(parts)

View File

@@ -0,0 +1,140 @@
"""
Orders Service - Tenant Data Deletion
Handles deletion of all order-related data for a tenant
"""
from typing import Dict
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, delete, func
import structlog
from shared.services.tenant_deletion import BaseTenantDataDeletionService, TenantDataDeletionResult
from app.models.order import CustomerOrder, OrderItem, OrderStatusHistory
from app.models.customer import Customer, CustomerContact
logger = structlog.get_logger()
class OrdersTenantDeletionService(BaseTenantDataDeletionService):
"""Service for deleting all orders-related data for a tenant"""
def __init__(self, db_session: AsyncSession):
super().__init__("orders-service")
self.db = db_session
async def get_tenant_data_preview(self, tenant_id: str) -> Dict[str, int]:
"""Get counts of what would be deleted"""
try:
preview = {}
# Count orders
order_count = await self.db.scalar(
select(func.count(CustomerOrder.id)).where(CustomerOrder.tenant_id == tenant_id)
)
preview["orders"] = order_count or 0
# Count order items (will be deleted via CASCADE)
order_item_count = await self.db.scalar(
select(func.count(OrderItem.id))
.join(CustomerOrder)
.where(CustomerOrder.tenant_id == tenant_id)
)
preview["order_items"] = order_item_count or 0
# Count order status history (will be deleted via CASCADE)
status_history_count = await self.db.scalar(
select(func.count(OrderStatusHistory.id))
.join(CustomerOrder)
.where(CustomerOrder.tenant_id == tenant_id)
)
preview["order_status_history"] = status_history_count or 0
# Count customers
customer_count = await self.db.scalar(
select(func.count(Customer.id)).where(Customer.tenant_id == tenant_id)
)
preview["customers"] = customer_count or 0
# Count customer contacts (will be deleted via CASCADE)
contact_count = await self.db.scalar(
select(func.count(CustomerContact.id))
.join(Customer)
.where(Customer.tenant_id == tenant_id)
)
preview["customer_contacts"] = contact_count or 0
return preview
except Exception as e:
logger.error("Error getting deletion preview",
tenant_id=tenant_id,
error=str(e))
return {}
async def delete_tenant_data(self, tenant_id: str) -> TenantDataDeletionResult:
"""Delete all data for a tenant"""
result = TenantDataDeletionResult(tenant_id, self.service_name)
try:
# Get preview before deletion for reporting
preview = await self.get_tenant_data_preview(tenant_id)
# Delete customers (CASCADE will delete customer_contacts)
try:
customer_delete = await self.db.execute(
delete(Customer).where(Customer.tenant_id == tenant_id)
)
deleted_customers = customer_delete.rowcount
result.add_deleted_items("customers", deleted_customers)
# Customer contacts are deleted via CASCADE
result.add_deleted_items("customer_contacts", preview.get("customer_contacts", 0))
logger.info("Deleted customers for tenant",
tenant_id=tenant_id,
count=deleted_customers)
except Exception as e:
logger.error("Error deleting customers",
tenant_id=tenant_id,
error=str(e))
result.add_error(f"Customer deletion: {str(e)}")
# Delete orders (CASCADE will delete order_items and order_status_history)
try:
order_delete = await self.db.execute(
delete(CustomerOrder).where(CustomerOrder.tenant_id == tenant_id)
)
deleted_orders = order_delete.rowcount
result.add_deleted_items("orders", deleted_orders)
# Order items and status history are deleted via CASCADE
result.add_deleted_items("order_items", preview.get("order_items", 0))
result.add_deleted_items("order_status_history", preview.get("order_status_history", 0))
logger.info("Deleted orders for tenant",
tenant_id=tenant_id,
count=deleted_orders)
except Exception as e:
logger.error("Error deleting orders",
tenant_id=tenant_id,
error=str(e))
result.add_error(f"Order deletion: {str(e)}")
# Commit all deletions
await self.db.commit()
logger.info("Tenant data deletion completed",
tenant_id=tenant_id,
deleted_counts=result.deleted_counts)
except Exception as e:
logger.error("Fatal error during tenant data deletion",
tenant_id=tenant_id,
error=str(e))
await self.db.rollback()
result.add_error(f"Fatal error: {str(e)}")
return result
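
A hedged wiring sketch for the deletion service above: preview first, then delete, inside a single `AsyncSession`. The engine and session-factory setup shown here is an assumption; the real service presumably provides its own session factory.

```python
# Sketch only: the connection URL and session factory are illustrative assumptions.
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine


async def purge_tenant(database_url: str, tenant_id: str) -> None:
    engine = create_async_engine(database_url)
    session_factory = async_sessionmaker(engine, class_=AsyncSession)
    async with session_factory() as session:
        service = OrdersTenantDeletionService(session)
        preview = await service.get_tenant_data_preview(tenant_id)
        print("Would delete:", preview)
        result = await service.delete_tenant_data(tenant_id)  # commits or rolls back internally
        print("Deleted:", result.deleted_counts)
    await engine.dispose()
```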

View File

@@ -0,0 +1,141 @@
"""Alembic environment configuration for orders service"""
import asyncio
import os
import sys
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from alembic import context
# Add the service directory to the Python path
service_path = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
if service_path not in sys.path:
sys.path.insert(0, service_path)
# Add shared modules to path
shared_path = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "shared"))
if shared_path not in sys.path:
sys.path.insert(0, shared_path)
try:
from app.core.config import settings
from shared.database.base import Base
# Import all models to ensure they are registered with Base.metadata
from app.models import * # noqa: F401, F403
except ImportError as e:
print(f"Import error in migrations env.py: {e}")
print(f"Current Python path: {sys.path}")
raise
# this is the Alembic Config object
config = context.config
# Determine service name from file path
service_name = os.path.basename(os.path.dirname(os.path.dirname(__file__)))
service_name_upper = service_name.upper().replace('-', '_')
# Set database URL from environment variables with multiple fallback strategies
database_url = (
os.getenv(f'{service_name_upper}_DATABASE_URL') or # Service-specific
os.getenv('DATABASE_URL') # Generic fallback
)
# If DATABASE_URL is not set, construct from individual components
if not database_url:
# Try generic PostgreSQL environment variables first
postgres_host = os.getenv('POSTGRES_HOST')
postgres_port = os.getenv('POSTGRES_PORT', '5432')
postgres_db = os.getenv('POSTGRES_DB')
postgres_user = os.getenv('POSTGRES_USER')
postgres_password = os.getenv('POSTGRES_PASSWORD')
if all([postgres_host, postgres_db, postgres_user, postgres_password]):
database_url = f"postgresql+asyncpg://{postgres_user}:{postgres_password}@{postgres_host}:{postgres_port}/{postgres_db}"
else:
# Try service-specific environment variables
db_host = os.getenv(f'{service_name_upper}_DB_HOST', f'{service_name}-db-service')
db_port = os.getenv(f'{service_name_upper}_DB_PORT', '5432')
db_name = os.getenv(f'{service_name_upper}_DB_NAME', f'{service_name.replace("-", "_")}_db')
db_user = os.getenv(f'{service_name_upper}_DB_USER', f'{service_name.replace("-", "_")}_user')
db_password = os.getenv(f'{service_name_upper}_DB_PASSWORD')
if db_password:
database_url = f"postgresql+asyncpg://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}"
else:
# Final fallback: try to get from settings object
try:
database_url = getattr(settings, 'DATABASE_URL', None)
except Exception:
pass
if not database_url:
error_msg = f"ERROR: No database URL configured for {service_name} service"
print(error_msg)
raise Exception(error_msg)
config.set_main_option("sqlalchemy.url", database_url)
# Interpret the config file for Python logging
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# Set target metadata
target_metadata = Base.metadata
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode."""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
compare_type=True,
compare_server_default=True,
)
with context.begin_transaction():
context.run_migrations()
def do_run_migrations(connection: Connection) -> None:
"""Execute migrations with the given connection."""
context.configure(
connection=connection,
target_metadata=target_metadata,
compare_type=True,
compare_server_default=True,
)
with context.begin_transaction():
context.run_migrations()
async def run_async_migrations() -> None:
"""Run migrations in 'online' mode with async support."""
connectable = async_engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
async with connectable.connect() as connection:
await connection.run_sync(do_run_migrations)
await connectable.dispose()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode."""
asyncio.run(run_async_migrations())
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
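
Because the URL is resolved at import time, the service-specific `ORDERS_DATABASE_URL` wins over `DATABASE_URL` and the `POSTGRES_*` fallbacks. A minimal sketch of driving the migration programmatically under that assumption; the `alembic.ini` location and the credentials are illustrative.

```python
# Sketch only: assumes the standard alembic.ini sits alongside this migrations/ directory.
import os

from alembic import command
from alembic.config import Config

os.environ["ORDERS_DATABASE_URL"] = (
    "postgresql+asyncpg://orders_user:secret@orders-db-service:5432/orders_db"
)

cfg = Config("alembic.ini")
command.upgrade(cfg, "head")  # loads this env.py, which picks up the variable above
```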

View File

@@ -0,0 +1,26 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}

View File

@@ -0,0 +1,268 @@
"""initial_schema_20251015_1229
Revision ID: 7f882c2ca25c
Revises:
Create Date: 2025-10-15 12:29:27.201743+02:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision: str = '7f882c2ca25c'
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('audit_logs',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('tenant_id', sa.UUID(), nullable=False),
sa.Column('user_id', sa.UUID(), nullable=False),
sa.Column('action', sa.String(length=100), nullable=False),
sa.Column('resource_type', sa.String(length=100), nullable=False),
sa.Column('resource_id', sa.String(length=255), nullable=True),
sa.Column('severity', sa.String(length=20), nullable=False),
sa.Column('service_name', sa.String(length=100), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('changes', postgresql.JSON(astext_type=sa.Text()), nullable=True),
sa.Column('audit_metadata', postgresql.JSON(astext_type=sa.Text()), nullable=True),
sa.Column('ip_address', sa.String(length=45), nullable=True),
sa.Column('user_agent', sa.Text(), nullable=True),
sa.Column('endpoint', sa.String(length=255), nullable=True),
sa.Column('method', sa.String(length=10), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_index('idx_audit_resource_type_action', 'audit_logs', ['resource_type', 'action'], unique=False)
op.create_index('idx_audit_service_created', 'audit_logs', ['service_name', 'created_at'], unique=False)
op.create_index('idx_audit_severity_created', 'audit_logs', ['severity', 'created_at'], unique=False)
op.create_index('idx_audit_tenant_created', 'audit_logs', ['tenant_id', 'created_at'], unique=False)
op.create_index('idx_audit_user_created', 'audit_logs', ['user_id', 'created_at'], unique=False)
op.create_index(op.f('ix_audit_logs_action'), 'audit_logs', ['action'], unique=False)
op.create_index(op.f('ix_audit_logs_created_at'), 'audit_logs', ['created_at'], unique=False)
op.create_index(op.f('ix_audit_logs_resource_id'), 'audit_logs', ['resource_id'], unique=False)
op.create_index(op.f('ix_audit_logs_resource_type'), 'audit_logs', ['resource_type'], unique=False)
op.create_index(op.f('ix_audit_logs_service_name'), 'audit_logs', ['service_name'], unique=False)
op.create_index(op.f('ix_audit_logs_severity'), 'audit_logs', ['severity'], unique=False)
op.create_index(op.f('ix_audit_logs_tenant_id'), 'audit_logs', ['tenant_id'], unique=False)
op.create_index(op.f('ix_audit_logs_user_id'), 'audit_logs', ['user_id'], unique=False)
op.create_table('customers',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('tenant_id', sa.UUID(), nullable=False),
sa.Column('customer_code', sa.String(length=50), nullable=False),
sa.Column('name', sa.String(length=200), nullable=False),
sa.Column('business_name', sa.String(length=200), nullable=True),
sa.Column('customer_type', sa.String(length=50), nullable=False),
sa.Column('email', sa.String(length=255), nullable=True),
sa.Column('phone', sa.String(length=50), nullable=True),
sa.Column('address_line1', sa.String(length=255), nullable=True),
sa.Column('address_line2', sa.String(length=255), nullable=True),
sa.Column('city', sa.String(length=100), nullable=True),
sa.Column('state', sa.String(length=100), nullable=True),
sa.Column('postal_code', sa.String(length=20), nullable=True),
sa.Column('country', sa.String(length=100), nullable=False),
sa.Column('tax_id', sa.String(length=50), nullable=True),
sa.Column('business_license', sa.String(length=100), nullable=True),
sa.Column('is_active', sa.Boolean(), nullable=False),
sa.Column('preferred_delivery_method', sa.String(length=50), nullable=False),
sa.Column('payment_terms', sa.String(length=50), nullable=False),
sa.Column('credit_limit', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('discount_percentage', sa.Numeric(precision=5, scale=2), nullable=False),
sa.Column('customer_segment', sa.String(length=50), nullable=False),
sa.Column('priority_level', sa.String(length=20), nullable=False),
sa.Column('special_instructions', sa.Text(), nullable=True),
sa.Column('delivery_preferences', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('product_preferences', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('total_orders', sa.Integer(), nullable=False),
sa.Column('total_spent', sa.Numeric(precision=12, scale=2), nullable=False),
sa.Column('average_order_value', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('last_order_date', sa.DateTime(timezone=True), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('created_by', sa.UUID(), nullable=True),
sa.Column('updated_by', sa.UUID(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_customers_customer_code'), 'customers', ['customer_code'], unique=False)
op.create_index(op.f('ix_customers_tenant_id'), 'customers', ['tenant_id'], unique=False)
op.create_table('customer_contacts',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('customer_id', sa.UUID(), nullable=False),
sa.Column('name', sa.String(length=200), nullable=False),
sa.Column('title', sa.String(length=100), nullable=True),
sa.Column('department', sa.String(length=100), nullable=True),
sa.Column('email', sa.String(length=255), nullable=True),
sa.Column('phone', sa.String(length=50), nullable=True),
sa.Column('mobile', sa.String(length=50), nullable=True),
sa.Column('is_primary', sa.Boolean(), nullable=False),
sa.Column('contact_for_orders', sa.Boolean(), nullable=False),
sa.Column('contact_for_delivery', sa.Boolean(), nullable=False),
sa.Column('contact_for_billing', sa.Boolean(), nullable=False),
sa.Column('contact_for_support', sa.Boolean(), nullable=False),
sa.Column('preferred_contact_method', sa.String(length=50), nullable=False),
sa.Column('contact_time_preferences', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('notes', sa.Text(), nullable=True),
sa.Column('is_active', sa.Boolean(), nullable=False),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.ForeignKeyConstraint(['customer_id'], ['customers.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
)
op.create_table('customer_orders',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('tenant_id', sa.UUID(), nullable=False),
sa.Column('order_number', sa.String(length=50), nullable=False),
sa.Column('customer_id', sa.UUID(), nullable=False),
sa.Column('status', sa.String(length=50), nullable=False),
sa.Column('order_type', sa.String(length=50), nullable=False),
sa.Column('priority', sa.String(length=20), nullable=False),
sa.Column('order_date', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('requested_delivery_date', sa.DateTime(timezone=True), nullable=False),
sa.Column('confirmed_delivery_date', sa.DateTime(timezone=True), nullable=True),
sa.Column('actual_delivery_date', sa.DateTime(timezone=True), nullable=True),
sa.Column('delivery_method', sa.String(length=50), nullable=False),
sa.Column('delivery_address', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('delivery_instructions', sa.Text(), nullable=True),
sa.Column('delivery_window_start', sa.DateTime(timezone=True), nullable=True),
sa.Column('delivery_window_end', sa.DateTime(timezone=True), nullable=True),
sa.Column('subtotal', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('discount_amount', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('discount_percentage', sa.Numeric(precision=5, scale=2), nullable=False),
sa.Column('tax_amount', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('delivery_fee', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('total_amount', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('payment_status', sa.String(length=50), nullable=False),
sa.Column('payment_method', sa.String(length=50), nullable=True),
sa.Column('payment_terms', sa.String(length=50), nullable=False),
sa.Column('payment_due_date', sa.DateTime(timezone=True), nullable=True),
sa.Column('special_instructions', sa.Text(), nullable=True),
sa.Column('custom_requirements', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('allergen_warnings', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('business_model', sa.String(length=50), nullable=True),
sa.Column('estimated_business_model', sa.String(length=50), nullable=True),
sa.Column('order_source', sa.String(length=50), nullable=False),
sa.Column('sales_channel', sa.String(length=50), nullable=False),
sa.Column('order_origin', sa.String(length=100), nullable=True),
sa.Column('production_batch_id', sa.UUID(), nullable=True),
sa.Column('fulfillment_location', sa.String(length=100), nullable=True),
sa.Column('estimated_preparation_time', sa.Integer(), nullable=True),
sa.Column('actual_preparation_time', sa.Integer(), nullable=True),
sa.Column('customer_notified_confirmed', sa.Boolean(), nullable=False),
sa.Column('customer_notified_ready', sa.Boolean(), nullable=False),
sa.Column('customer_notified_delivered', sa.Boolean(), nullable=False),
sa.Column('communication_preferences', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('quality_score', sa.Numeric(precision=3, scale=1), nullable=True),
sa.Column('customer_rating', sa.Integer(), nullable=True),
sa.Column('customer_feedback', sa.Text(), nullable=True),
sa.Column('cancellation_reason', sa.String(length=200), nullable=True),
sa.Column('cancelled_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('cancelled_by', sa.UUID(), nullable=True),
sa.Column('refund_amount', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('refund_processed_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('created_by', sa.UUID(), nullable=True),
sa.Column('updated_by', sa.UUID(), nullable=True),
sa.Column('order_metadata', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.ForeignKeyConstraint(['customer_id'], ['customers.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_customer_orders_customer_id'), 'customer_orders', ['customer_id'], unique=False)
op.create_index(op.f('ix_customer_orders_order_number'), 'customer_orders', ['order_number'], unique=True)
op.create_index(op.f('ix_customer_orders_status'), 'customer_orders', ['status'], unique=False)
op.create_index(op.f('ix_customer_orders_tenant_id'), 'customer_orders', ['tenant_id'], unique=False)
op.create_table('order_items',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('order_id', sa.UUID(), nullable=False),
sa.Column('product_id', sa.UUID(), nullable=False),
sa.Column('product_name', sa.String(length=200), nullable=False),
sa.Column('product_sku', sa.String(length=100), nullable=True),
sa.Column('product_category', sa.String(length=100), nullable=True),
sa.Column('quantity', sa.Numeric(precision=10, scale=3), nullable=False),
sa.Column('unit_of_measure', sa.String(length=50), nullable=False),
sa.Column('weight', sa.Numeric(precision=10, scale=3), nullable=True),
sa.Column('unit_price', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('line_discount', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('line_total', sa.Numeric(precision=10, scale=2), nullable=False),
sa.Column('product_specifications', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('customization_details', sa.Text(), nullable=True),
sa.Column('special_instructions', sa.Text(), nullable=True),
sa.Column('recipe_id', sa.UUID(), nullable=True),
sa.Column('production_requirements', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('estimated_production_time', sa.Integer(), nullable=True),
sa.Column('status', sa.String(length=50), nullable=False),
sa.Column('production_started_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('production_completed_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('quality_checked', sa.Boolean(), nullable=False),
sa.Column('quality_score', sa.Numeric(precision=3, scale=1), nullable=True),
sa.Column('ingredient_cost', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('labor_cost', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('overhead_cost', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('total_cost', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('margin', sa.Numeric(precision=10, scale=2), nullable=True),
sa.Column('reserved_inventory', sa.Boolean(), nullable=False),
sa.Column('inventory_allocated_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('customer_metadata', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.ForeignKeyConstraint(['order_id'], ['customer_orders.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_order_items_product_id'), 'order_items', ['product_id'], unique=False)
op.create_table('order_status_history',
sa.Column('id', sa.UUID(), nullable=False),
sa.Column('order_id', sa.UUID(), nullable=False),
sa.Column('from_status', sa.String(length=50), nullable=True),
sa.Column('to_status', sa.String(length=50), nullable=False),
sa.Column('change_reason', sa.String(length=200), nullable=True),
sa.Column('event_type', sa.String(length=50), nullable=False),
sa.Column('event_description', sa.Text(), nullable=True),
sa.Column('event_data', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('changed_by', sa.UUID(), nullable=True),
sa.Column('change_source', sa.String(length=50), nullable=False),
sa.Column('changed_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('customer_notified', sa.Boolean(), nullable=False),
sa.Column('notification_method', sa.String(length=50), nullable=True),
sa.Column('notification_sent_at', sa.DateTime(timezone=True), nullable=True),
sa.Column('notes', sa.Text(), nullable=True),
sa.ForeignKeyConstraint(['order_id'], ['customer_orders.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('order_status_history')
op.drop_index(op.f('ix_order_items_product_id'), table_name='order_items')
op.drop_table('order_items')
op.drop_index(op.f('ix_customer_orders_tenant_id'), table_name='customer_orders')
op.drop_index(op.f('ix_customer_orders_status'), table_name='customer_orders')
op.drop_index(op.f('ix_customer_orders_order_number'), table_name='customer_orders')
op.drop_index(op.f('ix_customer_orders_customer_id'), table_name='customer_orders')
op.drop_table('customer_orders')
op.drop_table('customer_contacts')
op.drop_index(op.f('ix_customers_tenant_id'), table_name='customers')
op.drop_index(op.f('ix_customers_customer_code'), table_name='customers')
op.drop_table('customers')
op.drop_index(op.f('ix_audit_logs_user_id'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_tenant_id'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_severity'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_service_name'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_resource_type'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_resource_id'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_created_at'), table_name='audit_logs')
op.drop_index(op.f('ix_audit_logs_action'), table_name='audit_logs')
op.drop_index('idx_audit_user_created', table_name='audit_logs')
op.drop_index('idx_audit_tenant_created', table_name='audit_logs')
op.drop_index('idx_audit_severity_created', table_name='audit_logs')
op.drop_index('idx_audit_service_created', table_name='audit_logs')
op.drop_index('idx_audit_resource_type_action', table_name='audit_logs')
op.drop_table('audit_logs')
# ### end Alembic commands ###

View File

@@ -0,0 +1,52 @@
# Orders Service Dependencies
# FastAPI and web framework
fastapi==0.119.0
uvicorn[standard]==0.32.1
pydantic==2.12.3
pydantic-settings==2.7.1
# Database
sqlalchemy==2.0.44
asyncpg==0.30.0
alembic==1.17.0
psycopg2-binary==2.9.10
# HTTP clients
httpx==0.28.1
# Redis for caching
redis==6.4.0
# Message queuing
aio-pika==9.4.3
# Scheduling
APScheduler==3.10.4
# Logging and monitoring
structlog==25.4.0
psutil==5.9.8
opentelemetry-api==1.39.1
opentelemetry-sdk==1.39.1
opentelemetry-instrumentation-fastapi==0.60b1
opentelemetry-exporter-otlp-proto-grpc==1.39.1
opentelemetry-exporter-otlp-proto-http==1.39.1
opentelemetry-instrumentation-httpx==0.60b1
opentelemetry-instrumentation-redis==0.60b1
opentelemetry-instrumentation-sqlalchemy==0.60b1
# Date and time utilities
python-dateutil==2.9.0.post0
pytz==2024.2
# Validation and utilities
email-validator==2.2.0
# Authentication
python-jose[cryptography]==3.3.0
cryptography==44.0.0
# Development dependencies
python-multipart==0.0.6
pytest==8.3.4
pytest-asyncio==0.25.2

View File

@@ -0,0 +1,281 @@
{
"clientes": [
{
"id": "20000000-0000-0000-0000-000000000001",
"customer_code": "CLI-001",
"name": "Hotel Plaza Mayor",
"business_name": "Hotel Plaza Mayor S.L.",
"customer_type": "business",
"email": "compras@hotelplazamayor.es",
"phone": "+34 91 234 5601",
"address_line1": "Plaza Mayor 15",
"city": "Madrid",
"postal_code": "28012",
"country": "España",
"customer_segment": "wholesale",
"priority_level": "high",
"payment_terms": "net_30",
"credit_limit": 5000.00,
"discount_percentage": 10.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Entrega antes de las 6:00 AM. Llamar al llegar."
},
{
"id": "20000000-0000-0000-0000-000000000002",
"customer_code": "CLI-002",
"name": "Restaurante El Mesón",
"business_name": "Restaurante El Mesón S.L.",
"customer_type": "business",
"email": "pedidos@elmeson.es",
"phone": "+34 91 345 6702",
"address_line1": "Calle Mayor 45",
"city": "Madrid",
"postal_code": "28013",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "net_15",
"credit_limit": 2000.00,
"discount_percentage": 5.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Dejar pedido en la puerta de servicio."
},
{
"id": "20000000-0000-0000-0000-000000000003",
"customer_code": "CLI-003",
"name": "Cafetería La Esquina",
"business_name": "Cafetería La Esquina S.L.",
"customer_type": "business",
"email": "info@laesquina.es",
"phone": "+34 91 456 7803",
"address_line1": "Calle Toledo 23",
"city": "Madrid",
"postal_code": "28005",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "immediate",
"credit_limit": 1000.00,
"discount_percentage": 0.00,
"preferred_delivery_method": "delivery"
},
{
"id": "20000000-0000-0000-0000-000000000004",
"customer_code": "CLI-004",
"name": "María García Ruiz",
"customer_type": "individual",
"email": "maria.garcia@email.com",
"phone": "+34 612 345 678",
"address_line1": "Calle Alcalá 100, 3º B",
"city": "Madrid",
"postal_code": "28009",
"country": "España",
"customer_segment": "vip",
"priority_level": "high",
"payment_terms": "immediate",
"preferred_delivery_method": "delivery",
"special_instructions": "Cliente VIP - Tartas de cumpleaños personalizadas"
},
{
"id": "20000000-0000-0000-0000-000000000005",
"customer_code": "CLI-005",
"name": "Carlos Martínez López",
"customer_type": "individual",
"email": "carlos.m@email.com",
"phone": "+34 623 456 789",
"address_line1": "Gran Vía 75, 5º A",
"city": "Madrid",
"postal_code": "28013",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "immediate",
"preferred_delivery_method": "pickup"
},
{
"id": "20000000-0000-0000-0000-000000000006",
"customer_code": "CLI-006",
"name": "Panadería Central Distribución",
"business_name": "Panadería Central S.A.",
"customer_type": "central_bakery",
"email": "produccion@panaderiacentral.es",
"phone": "+34 91 567 8904",
"address_line1": "Polígono Industrial Norte, Nave 12",
"city": "Madrid",
"postal_code": "28050",
"country": "España",
"customer_segment": "wholesale",
"priority_level": "high",
"payment_terms": "net_15",
"credit_limit": 10000.00,
"discount_percentage": 15.00,
"preferred_delivery_method": "pickup",
"special_instructions": "Pedidos grandes - Coordinación con almacén necesaria"
},
{
"id": "20000000-0000-0000-0000-000000000007",
"customer_code": "CLI-007",
"name": "Supermercado El Ahorro",
"business_name": "Supermercado El Ahorro S.L.",
"customer_type": "business",
"email": "compras@elahorro.es",
"phone": "+34 91 678 9015",
"address_line1": "Avenida de América 200",
"city": "Madrid",
"postal_code": "28028",
"country": "España",
"customer_segment": "wholesale",
"priority_level": "high",
"payment_terms": "net_30",
"credit_limit": 8000.00,
"discount_percentage": 12.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Entrega en muelle de carga. Horario: 7:00-9:00 AM"
},
{
"id": "20000000-0000-0000-0000-000000000008",
"customer_code": "CLI-008",
"name": "Ana Rodríguez Fernández",
"customer_type": "individual",
"email": "ana.rodriguez@email.com",
"phone": "+34 634 567 890",
"address_line1": "Calle Serrano 50, 2º D",
"city": "Madrid",
"postal_code": "28001",
"country": "España",
"customer_segment": "vip",
"priority_level": "high",
"payment_terms": "immediate",
"preferred_delivery_method": "delivery",
"special_instructions": "Prefiere croissants de mantequilla y pan integral"
},
{
"id": "20000000-0000-0000-0000-000000000009",
"customer_code": "CLI-009",
"name": "Colegio San José",
"business_name": "Colegio San José - Comedor Escolar",
"customer_type": "business",
"email": "administracion@colegiosanjose.es",
"phone": "+34 91 789 0126",
"address_line1": "Calle Bravo Murillo 150",
"city": "Madrid",
"postal_code": "28020",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "net_30",
"credit_limit": 3000.00,
"discount_percentage": 8.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Entrega diaria a las 7:30 AM. 500 alumnos."
},
{
"id": "20000000-0000-0000-0000-000000000010",
"customer_code": "CLI-010",
"name": "Javier López Sánchez",
"customer_type": "individual",
"email": "javier.lopez@email.com",
"phone": "+34 645 678 901",
"address_line1": "Calle Atocha 25, 1º C",
"city": "Madrid",
"postal_code": "28012",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "immediate",
"preferred_delivery_method": "pickup"
},
{
"id": "20000000-0000-0000-0000-000000000011",
"customer_code": "CLI-011",
"name": "Cafetería Central Station",
"business_name": "Central Station Coffee S.L.",
"customer_type": "business",
"email": "pedidos@centralstation.es",
"phone": "+34 91 890 1237",
"address_line1": "Estación de Atocha, Local 23",
"city": "Madrid",
"postal_code": "28045",
"country": "España",
"customer_segment": "wholesale",
"priority_level": "high",
"payment_terms": "net_15",
"credit_limit": 4000.00,
"discount_percentage": 10.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Dos entregas diarias: 5:30 AM y 12:00 PM"
},
{
"id": "20000000-0000-0000-0000-000000000012",
"customer_code": "CLI-012",
"name": "Isabel Torres Muñoz",
"customer_type": "individual",
"email": "isabel.torres@email.com",
"phone": "+34 656 789 012",
"address_line1": "Calle Goya 88, 4º A",
"city": "Madrid",
"postal_code": "28001",
"country": "España",
"customer_segment": "vip",
"priority_level": "high",
"payment_terms": "immediate",
"preferred_delivery_method": "delivery",
"special_instructions": "Pedidos semanales de tartas especiales"
},
{
"id": "20000000-0000-0000-0000-000000000013",
"customer_code": "CLI-013",
"name": "Bar Tapas La Latina",
"business_name": "Bar La Latina S.L.",
"customer_type": "business",
"email": "info@barlalatina.es",
"phone": "+34 91 901 2348",
"address_line1": "Plaza de la Paja 8",
"city": "Madrid",
"postal_code": "28005",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "net_15",
"credit_limit": 1500.00,
"discount_percentage": 5.00,
"preferred_delivery_method": "pickup"
},
{
"id": "20000000-0000-0000-0000-000000000014",
"customer_code": "CLI-014",
"name": "Francisco Gómez Rivera",
"customer_type": "individual",
"email": "francisco.gomez@email.com",
"phone": "+34 667 890 123",
"address_line1": "Calle Velázquez 120, 6º B",
"city": "Madrid",
"postal_code": "28006",
"country": "España",
"customer_segment": "regular",
"priority_level": "normal",
"payment_terms": "immediate",
"preferred_delivery_method": "pickup"
},
{
"id": "20000000-0000-0000-0000-000000000015",
"customer_code": "CLI-015",
"name": "Residencia Tercera Edad Los Olivos",
"business_name": "Residencia Los Olivos S.L.",
"customer_type": "business",
"email": "cocina@residenciaolivos.es",
"phone": "+34 91 012 3459",
"address_line1": "Calle Arturo Soria 345",
"city": "Madrid",
"postal_code": "28033",
"country": "España",
"customer_segment": "wholesale",
"priority_level": "high",
"payment_terms": "net_30",
"credit_limit": 6000.00,
"discount_percentage": 10.00,
"preferred_delivery_method": "delivery",
"special_instructions": "Pan de molde sin corteza para 120 residentes. Entrega 6:00 AM."
}
]
}
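
A small sketch that loads the demo customers above and groups them by `customer_segment`, matching the column created in the initial migration; the file path is an assumption about where the fixture lives.

```python
# Sketch only: the path to clientes_es.json is assumed, not taken from the repo layout.
import json
from collections import Counter
from pathlib import Path

data = json.loads(Path("clientes_es.json").read_text(encoding="utf-8"))
segments = Counter(c["customer_segment"] for c in data["clientes"])
print(segments)  # Counter({'regular': 7, 'wholesale': 5, 'vip': 3})
```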

View File

@@ -0,0 +1,266 @@
{
"configuracion_compras": {
"planes_por_tenant": 8,
"requisitos_por_plan": {
"min": 5,
"max": 12
},
"distribucion_temporal": {
"completados": {
"porcentaje": 0.25,
"offset_dias_min": -45,
"offset_dias_max": -8,
"estados": ["completed"]
},
"en_ejecucion": {
"porcentaje": 0.375,
"offset_dias_min": -7,
"offset_dias_max": -1,
"estados": ["in_execution", "approved"]
},
"pendiente_aprobacion": {
"porcentaje": 0.25,
"offset_dias_min": 0,
"offset_dias_max": 0,
"estados": ["pending_approval"]
},
"borrador": {
"porcentaje": 0.125,
"offset_dias_min": 1,
"offset_dias_max": 3,
"estados": ["draft"]
}
},
"distribucion_estados": {
"draft": 0.125,
"pending_approval": 0.25,
"approved": 0.25,
"in_execution": 0.25,
"completed": 0.125
},
"tipos_plan": [
{"tipo": "regular", "peso": 0.75},
{"tipo": "emergency", "peso": 0.15},
{"tipo": "seasonal", "peso": 0.10}
],
"prioridades": {
"low": 0.20,
"normal": 0.55,
"high": 0.20,
"critical": 0.05
},
"estrategias_compra": [
{"estrategia": "just_in_time", "peso": 0.50},
{"estrategia": "bulk", "peso": 0.30},
{"estrategia": "mixed", "peso": 0.20}
],
"niveles_riesgo": {
"low": 0.50,
"medium": 0.30,
"high": 0.15,
"critical": 0.05
},
"ingredientes_demo": [
{
"id": "10000000-0000-0000-0000-000000000001",
"nombre": "Harina de Trigo Panadera T-55",
"sku": "ING-HAR-001",
"categoria": "harinas",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 0.65,
"lead_time_dias": 3,
"cantidad_minima": 500.0,
"vida_util_dias": 180
},
{
"id": "10000000-0000-0000-0000-000000000002",
"nombre": "Harina de Trigo Integral",
"sku": "ING-HAR-002",
"categoria": "harinas",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 0.85,
"lead_time_dias": 3,
"cantidad_minima": 300.0,
"vida_util_dias": 120
},
{
"id": "10000000-0000-0000-0000-000000000003",
"nombre": "Levadura Fresca Prensada",
"sku": "ING-LEV-001",
"categoria": "levaduras",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 3.50,
"lead_time_dias": 2,
"cantidad_minima": 25.0,
"vida_util_dias": 21
},
{
"id": "10000000-0000-0000-0000-000000000004",
"nombre": "Sal Marina Refinada",
"sku": "ING-SAL-001",
"categoria": "ingredientes_basicos",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 0.40,
"lead_time_dias": 7,
"cantidad_minima": 200.0,
"vida_util_dias": 730
},
{
"id": "10000000-0000-0000-0000-000000000005",
"nombre": "Mantequilla 82% MG",
"sku": "ING-MAN-001",
"categoria": "lacteos",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 5.80,
"lead_time_dias": 2,
"cantidad_minima": 50.0,
"vida_util_dias": 90
},
{
"id": "10000000-0000-0000-0000-000000000006",
"nombre": "Azúcar Blanco Refinado",
"sku": "ING-AZU-001",
"categoria": "azucares",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 0.75,
"lead_time_dias": 5,
"cantidad_minima": 300.0,
"vida_util_dias": 365
},
{
"id": "10000000-0000-0000-0000-000000000007",
"nombre": "Huevos Categoría A",
"sku": "ING-HUE-001",
"categoria": "lacteos",
"tipo": "ingredient",
"unidad": "unidad",
"costo_unitario": 0.18,
"lead_time_dias": 2,
"cantidad_minima": 360.0,
"vida_util_dias": 28
},
{
"id": "10000000-0000-0000-0000-000000000008",
"nombre": "Leche Entera UHT",
"sku": "ING-LEC-001",
"categoria": "lacteos",
"tipo": "ingredient",
"unidad": "litro",
"costo_unitario": 0.85,
"lead_time_dias": 3,
"cantidad_minima": 100.0,
"vida_util_dias": 90
},
{
"id": "10000000-0000-0000-0000-000000000009",
"nombre": "Chocolate Cobertura 70%",
"sku": "ING-CHO-001",
"categoria": "chocolates",
"tipo": "ingredient",
"unidad": "kg",
"costo_unitario": 12.50,
"lead_time_dias": 5,
"cantidad_minima": 25.0,
"vida_util_dias": 365
},
{
"id": "10000000-0000-0000-0000-000000000010",
"nombre": "Aceite de Oliva Virgen Extra",
"sku": "ING-ACE-001",
"categoria": "aceites",
"tipo": "ingredient",
"unidad": "litro",
"costo_unitario": 4.20,
"lead_time_dias": 4,
"cantidad_minima": 50.0,
"vida_util_dias": 540
},
{
"id": "10000000-0000-0000-0000-000000000011",
"nombre": "Bolsas de Papel Kraft",
"sku": "PAC-BOL-001",
"categoria": "embalaje",
"tipo": "packaging",
"unidad": "unidad",
"costo_unitario": 0.08,
"lead_time_dias": 10,
"cantidad_minima": 5000.0,
"vida_util_dias": 730
},
{
"id": "10000000-0000-0000-0000-000000000012",
"nombre": "Cajas de Cartón Grande",
"sku": "PAC-CAJ-001",
"categoria": "embalaje",
"tipo": "packaging",
"unidad": "unidad",
"costo_unitario": 0.45,
"lead_time_dias": 7,
"cantidad_minima": 500.0,
"vida_util_dias": 730
}
],
"rangos_cantidad": {
"harinas": {"min": 500.0, "max": 2000.0},
"levaduras": {"min": 20.0, "max": 100.0},
"ingredientes_basicos": {"min": 100.0, "max": 500.0},
"lacteos": {"min": 50.0, "max": 300.0},
"azucares": {"min": 200.0, "max": 800.0},
"chocolates": {"min": 10.0, "max": 50.0},
"aceites": {"min": 30.0, "max": 150.0},
"embalaje": {"min": 1000.0, "max": 10000.0}
},
"buffer_seguridad_porcentaje": {
"min": 10.0,
"max": 30.0,
"tipico": 20.0
},
"horizonte_planificacion_dias": {
"individual_bakery": 14,
"central_bakery": 21
},
"metricas_rendimiento": {
"tasa_cumplimiento": {"min": 85.0, "max": 98.0},
"entrega_puntual": {"min": 80.0, "max": 95.0},
"precision_costo": {"min": 90.0, "max": 99.0},
"puntuacion_calidad": {"min": 7.0, "max": 10.0}
}
},
"alertas_compras": {
"plan_urgente": {
"condicion": "plan_type = emergency AND status IN (draft, pending_approval)",
"mensaje": "Plan de compras de emergencia requiere aprobación urgente: {plan_number}",
"severidad": "high"
},
"requisito_critico": {
"condicion": "priority = critical AND required_by_date < NOW() + INTERVAL '3 days'",
"mensaje": "Requisito crítico con fecha límite próxima: {product_name} para {required_by_date}",
"severidad": "high"
},
"riesgo_suministro": {
"condicion": "supply_risk_level IN (high, critical)",
"mensaje": "Alto riesgo de suministro detectado en plan {plan_number}",
"severidad": "medium"
},
"fecha_pedido_proxima": {
"condicion": "suggested_order_date BETWEEN NOW() AND NOW() + INTERVAL '2 days'",
"mensaje": "Fecha sugerida de pedido próxima: {product_name}",
"severidad": "medium"
}
},
"notas": {
"descripcion": "Configuración para generación de planes de compras demo",
"planes_totales": 8,
"ingredientes_disponibles": 12,
"proveedores": "Usar proveedores de proveedores_es.json",
"fechas": "Usar offsets relativos a BASE_REFERENCE_DATE",
"moneda": "EUR",
"idioma": "español"
}
}
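
The weights above are meant to be sampled when generating demo procurement plans. An illustrative sketch of that sampling; the generator script itself is not part of this excerpt.

```python
# Sketch only: mirrors the tipos_plan / prioridades / buffer_seguridad_porcentaje values above.
import random

plan_types = {"regular": 0.75, "emergency": 0.15, "seasonal": 0.10}
priorities = {"low": 0.20, "normal": 0.55, "high": 0.20, "critical": 0.05}

plan_type = random.choices(list(plan_types), weights=list(plan_types.values()), k=1)[0]
priority = random.choices(list(priorities), weights=list(priorities.values()), k=1)[0]
buffer_pct = random.uniform(10.0, 30.0)

print(plan_type, priority, round(buffer_pct, 1))
```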

View File

@@ -0,0 +1,220 @@
{
"configuracion_pedidos": {
"total_pedidos_por_tenant": 30,
"distribucion_temporal": {
"completados_antiguos": {
"porcentaje": 0.30,
"offset_dias_min": -60,
"offset_dias_max": -15,
"estados": ["delivered", "completed"]
},
"completados_recientes": {
"porcentaje": 0.25,
"offset_dias_min": -14,
"offset_dias_max": -1,
"estados": ["delivered", "completed"]
},
"en_proceso": {
"porcentaje": 0.25,
"offset_dias_min": 0,
"offset_dias_max": 0,
"estados": ["confirmed", "in_production", "ready"]
},
"futuros": {
"porcentaje": 0.20,
"offset_dias_min": 1,
"offset_dias_max": 7,
"estados": ["pending", "confirmed"]
}
},
"distribucion_estados": {
"pending": 0.10,
"confirmed": 0.15,
"in_production": 0.10,
"ready": 0.10,
"in_delivery": 0.05,
"delivered": 0.35,
"completed": 0.10,
"cancelled": 0.05
},
"distribucion_prioridad": {
"low": 0.30,
"normal": 0.50,
"high": 0.15,
"urgent": 0.05
},
"lineas_por_pedido": {
"min": 2,
"max": 8
},
"cantidad_por_linea": {
"min": 5,
"max": 100
},
"precio_unitario": {
"min": 1.50,
"max": 15.00
},
"descuento_porcentaje": {
"sin_descuento": 0.70,
"con_descuento_5": 0.15,
"con_descuento_10": 0.10,
"con_descuento_15": 0.05
},
"metodos_pago": [
{"metodo": "bank_transfer", "peso": 0.40},
{"metodo": "credit_card", "peso": 0.25},
{"metodo": "cash", "peso": 0.20},
{"metodo": "check", "peso": 0.10},
{"metodo": "account", "peso": 0.05}
],
"tipos_entrega": [
{"tipo": "standard", "peso": 0.60},
{"tipo": "delivery", "peso": 0.25},
{"tipo": "pickup", "peso": 0.15}
],
"notas_pedido": [
"Entrega en horario de mañana, antes de las 8:00 AM",
"Llamar 15 minutos antes de llegar",
"Dejar en la entrada de servicio",
"Contactar con el encargado al llegar",
"Pedido urgente para evento especial",
"Embalaje especial para transporte",
"Verificar cantidad antes de descargar",
"Entrega programada según calendario acordado",
"Incluir factura con el pedido",
"Pedido recurrente semanal"
],
"productos_demo": [
{
"nombre": "Pan de Barra Tradicional",
"codigo": "PROD-001",
"precio_base": 1.80,
"unidad": "unidad"
},
{
"nombre": "Baguette",
"codigo": "PROD-002",
"precio_base": 2.00,
"unidad": "unidad"
},
{
"nombre": "Pan Integral",
"codigo": "PROD-003",
"precio_base": 2.50,
"unidad": "unidad"
},
{
"nombre": "Pan de Centeno",
"codigo": "PROD-004",
"precio_base": 2.80,
"unidad": "unidad"
},
{
"nombre": "Croissant",
"codigo": "PROD-005",
"precio_base": 1.50,
"unidad": "unidad"
},
{
"nombre": "Napolitana de Chocolate",
"codigo": "PROD-006",
"precio_base": 1.80,
"unidad": "unidad"
},
{
"nombre": "Palmera",
"codigo": "PROD-007",
"precio_base": 1.60,
"unidad": "unidad"
},
{
"nombre": "Ensaimada",
"codigo": "PROD-008",
"precio_base": 3.50,
"unidad": "unidad"
},
{
"nombre": "Magdalena",
"codigo": "PROD-009",
"precio_base": 1.20,
"unidad": "unidad"
},
{
"nombre": "Bollo de Leche",
"codigo": "PROD-010",
"precio_base": 1.00,
"unidad": "unidad"
},
{
"nombre": "Pan de Molde Blanco",
"codigo": "PROD-011",
"precio_base": 2.20,
"unidad": "unidad"
},
{
"nombre": "Pan de Molde Integral",
"codigo": "PROD-012",
"precio_base": 2.50,
"unidad": "unidad"
},
{
"nombre": "Panecillo",
"codigo": "PROD-013",
"precio_base": 0.80,
"unidad": "unidad"
},
{
"nombre": "Rosca de Anís",
"codigo": "PROD-014",
"precio_base": 3.00,
"unidad": "unidad"
},
{
"nombre": "Empanada de Atún",
"codigo": "PROD-015",
"precio_base": 4.50,
"unidad": "unidad"
}
],
"horarios_entrega": [
"06:00-08:00",
"08:00-10:00",
"10:00-12:00",
"12:00-14:00",
"14:00-16:00",
"16:00-18:00"
]
},
"alertas_pedidos": {
"pedidos_urgentes": {
"condicion": "priority = urgent AND status IN (pending, confirmed)",
"mensaje": "Pedido urgente requiere atención inmediata: {order_number}",
"severidad": "high"
},
"pedidos_retrasados": {
"condicion": "delivery_date < NOW() AND status NOT IN (delivered, completed, cancelled)",
"mensaje": "Pedido retrasado: {order_number} para cliente {customer_name}",
"severidad": "high"
},
"pedidos_proximos": {
"condicion": "delivery_date BETWEEN NOW() AND NOW() + INTERVAL '24 hours'",
"mensaje": "Entrega programada en las próximas 24 horas: {order_number}",
"severidad": "medium"
},
"pedidos_grandes": {
"condicion": "total_amount > 500",
"mensaje": "Pedido de alto valor requiere verificación: {order_number} ({total_amount}¬)",
"severidad": "medium"
}
},
"notas": {
"descripcion": "Configuración para generación automática de pedidos demo",
"total_pedidos": 30,
"productos_disponibles": 15,
"clientes_requeridos": "Usar clientes de clientes_es.json",
"fechas": "Usar offsets relativos a BASE_REFERENCE_DATE",
"moneda": "EUR",
"idioma": "español"
}
}
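
The temporal buckets above are defined as day offsets relative to `BASE_REFERENCE_DATE` (see the `notas` block). An illustrative sketch of turning one bucket into an order date and status; the reference date value here is assumed purely for the example.

```python
# Sketch only: BASE_REFERENCE_DATE is assumed; the real generator defines it elsewhere.
import random
from datetime import datetime, timedelta, timezone

BASE_REFERENCE_DATE = datetime(2025, 10, 15, tzinfo=timezone.utc)

buckets = [
    # (weight, min_offset_days, max_offset_days, allowed states)
    (0.30, -60, -15, ["delivered", "completed"]),
    (0.25, -14, -1, ["delivered", "completed"]),
    (0.25, 0, 0, ["confirmed", "in_production", "ready"]),
    (0.20, 1, 7, ["pending", "confirmed"]),
]

weight, lo, hi, states = random.choices(buckets, weights=[b[0] for b in buckets], k=1)[0]
order_date = BASE_REFERENCE_DATE + timedelta(days=random.randint(lo, hi))
print(order_date.date(), random.choice(states))
```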