Initial commit - production deployment
58
services/recipes/Dockerfile
Normal file
@@ -0,0 +1,58 @@
# =============================================================================
# Recipes Service Dockerfile - Environment-Configurable Base Images
# =============================================================================
# Build arguments for registry configuration:
# - BASE_REGISTRY: Registry URL (default: docker.io for Docker Hub)
# - PYTHON_IMAGE: Python image name and tag (default: python:3.11-slim)
# =============================================================================

ARG BASE_REGISTRY=docker.io
ARG PYTHON_IMAGE=python:3.11-slim

FROM ${BASE_REGISTRY}/${PYTHON_IMAGE} AS shared
WORKDIR /shared
COPY shared/ /shared/

ARG BASE_REGISTRY=docker.io
ARG PYTHON_IMAGE=python:3.11-slim
FROM ${BASE_REGISTRY}/${PYTHON_IMAGE}

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements
COPY shared/requirements-tracing.txt /tmp/
COPY services/recipes/requirements.txt .

# Install Python dependencies
RUN pip install --no-cache-dir -r /tmp/requirements-tracing.txt
RUN pip install --no-cache-dir -r requirements.txt

# Copy shared libraries from the shared stage
COPY --from=shared /shared /app/shared

# Copy application code
COPY services/recipes/ .

# Add shared libraries to Python path
ENV PYTHONPATH="/app:/app/shared:${PYTHONPATH:-}"

# Expose port
EXPOSE 8000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:8000/health || exit 1

# Run application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
712
services/recipes/README.md
Normal file
@@ -0,0 +1,712 @@
# Recipes Service

## Overview

The **Recipes Service** is the central knowledge base for all bakery products, managing detailed recipes with precise ingredient quantities, preparation instructions, and cost calculations. It enables standardized production, accurate batch scaling, nutritional tracking, and cost management across all bakery operations. This service ensures consistent product quality and provides the foundation for production planning and inventory management.

## Key Features

### Recipe Management
- **Complete Recipe Database** - Store all product recipes with full details
- **Ingredient Specifications** - Precise quantities, units, and preparation notes
- **Multi-Step Instructions** - Detailed preparation steps with timing
- **Recipe Versioning** - Track recipe changes over time
- **Recipe Categories** - Organize by product type (bread, pastries, cakes, etc.)
- **Recipe Status** - Active, draft, archived recipe states

### Batch Scaling
- **Automatic Scaling** - Calculate ingredients for any batch size
- **Unit Conversion** - Convert between kg, g, L, mL, units
- **Yield Calculation** - Expected output per recipe
- **Scaling Validation** - Ensure scaled quantities are practical
- **Multi-Batch Planning** - Scale for multiple simultaneous batches
- **Equipment Consideration** - Validate against equipment capacity
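
Behind automatic scaling is simple proportion: every ingredient quantity is multiplied by the ratio of target to original yield. A minimal sketch (function and variable names are illustrative, not the service's actual API):

```python
# Minimal sketch of proportional batch scaling
# (illustrative names, not the service's real functions).
def scaling_factor(original_yield: float, target_yield: float) -> float:
    """Factor by which every ingredient quantity is multiplied."""
    return target_yield / original_yield

# A 50-unit recipe scaled to 200 units quadruples each ingredient.
factor = scaling_factor(50, 200)
flour_kg = 2.0 * factor      # 2 kg of flour becomes 8 kg
print(factor, flour_kg)      # 4.0 8.0
```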

### Cost Calculation
- **Real-Time Costing** - Current ingredient prices from Inventory
- **Cost Per Unit** - Calculate cost per individual product
- **Profit Margin Analysis** - Compare cost vs. selling price
- **Cost Breakdown** - Ingredient-level cost contribution
- **Historical Cost Tracking** - Monitor cost changes over time
- **Target Price Alerts** - Notify when costs exceed thresholds
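
The margin analysis uses the standard margin-on-price formula; a worked example with illustrative numbers:

```python
# Worked example of the profit margin formula used by the service
# (values are illustrative).
cost_per_unit = 0.52      # euros, from the ingredient cost breakdown
selling_price = 1.20      # euros

profit_margin = (selling_price - cost_per_unit) / selling_price * 100
print(round(profit_margin, 2))  # 56.67
```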

### Nutritional Information
- **Nutritional Facts** - Calories, protein, carbs, fats per serving
- **Allergen Tracking** - Common allergens (gluten, nuts, dairy, eggs)
- **Dietary Labels** - Vegan, vegetarian, gluten-free indicators
- **Regulatory Compliance** - EU food labeling requirements
- **Serving Size** - Standard serving definitions
- **Label Generation** - Auto-generate compliant food labels

### Recipe Intelligence
- **Popular Recipes** - Track most-produced recipes
- **Cost Optimization Suggestions** - Identify expensive recipes
- **Ingredient Substitutions** - Alternative ingredient recommendations
- **Seasonal Recipes** - Highlight seasonal products
- **Recipe Performance** - Track yield accuracy and quality
- **Cross-Service Integration** - Used by Production, Inventory, Procurement

## Business Value

### For Bakery Owners
- **Standardized Production** - Consistent product quality every time
- **Cost Control** - Know exact cost and profit margin per product
- **Pricing Optimization** - Data-driven pricing decisions
- **Regulatory Compliance** - Meet EU food labeling requirements
- **Waste Reduction** - Accurate scaling prevents over-production
- **Knowledge Preservation** - Recipes survive staff turnover

### Quantifiable Impact
- **Time Savings**: 3-5 hours/week on recipe calculations
- **Cost Accuracy**: 99%+ vs. manual estimation (±20-30%)
- **Waste Reduction**: 10-15% through accurate batch scaling
- **Quality Consistency**: 95%+ batch consistency vs. 70-80% manual
- **Compliance**: Avoid €500-5,000 fines for labeling violations
- **Pricing Optimization**: 5-10% profit margin improvement

### For Production Staff
- **Clear Instructions** - Step-by-step production guidance
- **Exact Quantities** - No guesswork on ingredient amounts
- **Scaling Confidence** - Reliably produce any batch size
- **Quality Standards** - Know expected yield and appearance
- **Allergen Awareness** - Critical safety information visible

## Technology Stack

- **Framework**: FastAPI (Python 3.11+) - Async web framework
- **Database**: PostgreSQL 17 - Recipe data storage
- **Caching**: Redis 7.4 - Recipe and cost cache
- **ORM**: SQLAlchemy 2.0 (async) - Database abstraction
- **Validation**: Pydantic 2.0 - Schema validation
- **Logging**: Structlog - Structured JSON logging
- **Metrics**: Prometheus Client - Custom metrics

## API Endpoints (Key Routes)

### Recipe Management
- `GET /api/v1/recipes` - List all recipes with filters
- `POST /api/v1/recipes` - Create new recipe
- `GET /api/v1/recipes/{recipe_id}` - Get recipe details
- `PUT /api/v1/recipes/{recipe_id}` - Update recipe
- `DELETE /api/v1/recipes/{recipe_id}` - Delete recipe (soft delete)
- `GET /api/v1/recipes/{recipe_id}/versions` - Get recipe version history

### Ingredient Management
- `GET /api/v1/recipes/{recipe_id}/ingredients` - List recipe ingredients
- `POST /api/v1/recipes/{recipe_id}/ingredients` - Add ingredient to recipe
- `PUT /api/v1/recipes/{recipe_id}/ingredients/{ingredient_id}` - Update ingredient quantity
- `DELETE /api/v1/recipes/{recipe_id}/ingredients/{ingredient_id}` - Remove ingredient

### Batch Scaling
- `POST /api/v1/recipes/{recipe_id}/scale` - Scale recipe to batch size
- `POST /api/v1/recipes/{recipe_id}/scale/multiple` - Scale for multiple batches
- `GET /api/v1/recipes/{recipe_id}/scale/validate` - Validate scaling parameters

### Cost Calculation
- `GET /api/v1/recipes/{recipe_id}/cost` - Get current recipe cost
- `GET /api/v1/recipes/{recipe_id}/cost/history` - Historical cost data
- `GET /api/v1/recipes/cost/analysis` - Cost analysis dashboard
- `POST /api/v1/recipes/{recipe_id}/cost/target` - Set target cost threshold

### Nutritional Information
- `GET /api/v1/recipes/{recipe_id}/nutrition` - Get nutritional facts
- `PUT /api/v1/recipes/{recipe_id}/nutrition` - Update nutritional data
- `GET /api/v1/recipes/{recipe_id}/allergens` - Get allergen information
- `GET /api/v1/recipes/{recipe_id}/label` - Generate food label

### Analytics
- `GET /api/v1/recipes/analytics/popular` - Most used recipes
- `GET /api/v1/recipes/analytics/costly` - Most expensive recipes
- `GET /api/v1/recipes/analytics/profitable` - Most profitable recipes
- `GET /api/v1/recipes/analytics/categories` - Recipe category breakdown

## Database Schema

### Main Tables

**recipes**
```sql
CREATE TABLE recipes (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_name VARCHAR(255) NOT NULL,
    product_id UUID,                        -- Link to product catalog
    category VARCHAR(100),                  -- bread, pastry, cake, etc.
    description TEXT,
    preparation_time_minutes INTEGER,
    baking_time_minutes INTEGER,
    total_time_minutes INTEGER,
    difficulty VARCHAR(50),                 -- easy, medium, hard
    servings INTEGER,                       -- Standard serving count
    yield_quantity DECIMAL(10, 2),          -- Expected output quantity
    yield_unit VARCHAR(50),                 -- kg, units, etc.
    status VARCHAR(50) DEFAULT 'active',    -- active, draft, archived
    version INTEGER DEFAULT 1,
    parent_recipe_id UUID,                  -- For versioning
    created_by UUID NOT NULL,
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),
    UNIQUE(tenant_id, recipe_name, version)
);
```

**recipe_ingredients**
```sql
CREATE TABLE recipe_ingredients (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_id UUID REFERENCES recipes(id) ON DELETE CASCADE,
    ingredient_id UUID NOT NULL,            -- Link to inventory items
    ingredient_name VARCHAR(255) NOT NULL,  -- Cached for performance
    quantity DECIMAL(10, 3) NOT NULL,
    unit VARCHAR(50) NOT NULL,              -- kg, g, L, mL, units
    preparation_notes TEXT,                 -- e.g., "sifted", "room temperature"
    is_optional BOOLEAN DEFAULT FALSE,
    substitutes JSONB,                      -- Alternative ingredients
    cost_per_unit DECIMAL(10, 2),           -- Cached from inventory
    display_order INTEGER DEFAULT 0,        -- Order in recipe
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);
```

**recipe_instructions**
```sql
CREATE TABLE recipe_instructions (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_id UUID REFERENCES recipes(id) ON DELETE CASCADE,
    step_number INTEGER NOT NULL,
    instruction_text TEXT NOT NULL,
    duration_minutes INTEGER,               -- Time for this step
    temperature_celsius INTEGER,            -- Oven temperature if applicable
    equipment_needed VARCHAR(255),
    tips TEXT,
    created_at TIMESTAMP DEFAULT NOW(),
    UNIQUE(recipe_id, step_number)
);
```

**recipe_nutrition**
```sql
CREATE TABLE recipe_nutrition (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_id UUID REFERENCES recipes(id) ON DELETE CASCADE,
    serving_size DECIMAL(10, 2),
    serving_unit VARCHAR(50),
    calories DECIMAL(10, 2),
    protein_g DECIMAL(10, 2),
    carbohydrates_g DECIMAL(10, 2),
    fat_g DECIMAL(10, 2),
    fiber_g DECIMAL(10, 2),
    sugar_g DECIMAL(10, 2),
    sodium_mg DECIMAL(10, 2),
    allergens JSONB,                        -- Array of allergen codes
    dietary_labels JSONB,                   -- vegan, vegetarian, gluten-free, etc.
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),
    UNIQUE(recipe_id)
);
```

**recipe_costs**
```sql
CREATE TABLE recipe_costs (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_id UUID REFERENCES recipes(id),
    calculated_at TIMESTAMP DEFAULT NOW(),
    total_ingredient_cost DECIMAL(10, 2) NOT NULL,
    cost_per_unit DECIMAL(10, 2) NOT NULL,
    cost_breakdown JSONB,                   -- Per-ingredient costs
    selling_price DECIMAL(10, 2),
    profit_margin_percentage DECIMAL(5, 2),
    is_current BOOLEAN DEFAULT TRUE         -- Most recent calculation
);
```

**recipe_scaling_history**
```sql
CREATE TABLE recipe_scaling_history (
    id UUID PRIMARY KEY,
    tenant_id UUID NOT NULL,
    recipe_id UUID REFERENCES recipes(id),
    scaled_by UUID NOT NULL,                -- User who scaled
    original_yield DECIMAL(10, 2),
    target_yield DECIMAL(10, 2),
    scaling_factor DECIMAL(10, 4),
    scaled_ingredients JSONB,               -- Calculated quantities
    used_in_batch_id UUID,                  -- Link to production batch
    created_at TIMESTAMP DEFAULT NOW()
);
```

### Indexes for Performance
```sql
CREATE INDEX idx_recipes_tenant_status ON recipes(tenant_id, status);
CREATE INDEX idx_recipes_category ON recipes(tenant_id, category);
CREATE INDEX idx_recipe_ingredients_recipe ON recipe_ingredients(recipe_id);
CREATE INDEX idx_recipe_ingredients_ingredient ON recipe_ingredients(tenant_id, ingredient_id);
CREATE INDEX idx_recipe_costs_current ON recipe_costs(recipe_id, is_current) WHERE is_current = TRUE;
```

## Business Logic Examples

### Batch Scaling Algorithm
```python
async def scale_recipe(recipe_id: UUID, target_yield: float, target_unit: str) -> ScaledRecipe:
    """
    Scale recipe ingredients to produce target yield.
    """
    # Get recipe and ingredients
    recipe = await get_recipe(recipe_id)
    ingredients = await get_recipe_ingredients(recipe_id)

    # Calculate scaling factor
    scaling_factor = target_yield / recipe.yield_quantity

    # Scale each ingredient
    scaled_ingredients = []
    for ingredient in ingredients:
        scaled_quantity = ingredient.quantity * scaling_factor

        # Round to practical values (e.g., 0.5g increments)
        scaled_quantity = round_to_practical_value(scaled_quantity, ingredient.unit)

        scaled_ingredients.append({
            "ingredient_id": ingredient.ingredient_id,
            "ingredient_name": ingredient.ingredient_name,
            "original_quantity": ingredient.quantity,
            "scaled_quantity": scaled_quantity,
            "unit": ingredient.unit,
            "preparation_notes": ingredient.preparation_notes
        })

    # Store scaling history
    await store_scaling_history(recipe_id, recipe.yield_quantity, target_yield, scaling_factor)

    return ScaledRecipe(
        recipe_id=recipe_id,
        recipe_name=recipe.recipe_name,
        original_yield=recipe.yield_quantity,
        target_yield=target_yield,
        scaling_factor=scaling_factor,
        scaled_ingredients=scaled_ingredients
    )
```

### Real-Time Cost Calculation
```python
async def calculate_recipe_cost(recipe_id: UUID) -> RecipeCost:
    """
    Calculate current recipe cost based on live ingredient prices.
    """
    # Get recipe ingredients
    ingredients = await get_recipe_ingredients(recipe_id)

    total_cost = 0.0
    cost_breakdown = []

    for ingredient in ingredients:
        # Get current price from inventory service
        current_price = await get_ingredient_current_price(
            ingredient.ingredient_id
        )

        # Calculate cost for this ingredient
        ingredient_cost = ingredient.quantity * current_price
        total_cost += ingredient_cost

        cost_breakdown.append({
            "ingredient_name": ingredient.ingredient_name,
            "quantity": ingredient.quantity,
            "unit": ingredient.unit,
            "price_per_unit": current_price,
            "total_cost": ingredient_cost,
            "percentage_of_total": 0  # Calculated after loop
        })

    # Calculate percentages
    for item in cost_breakdown:
        item["percentage_of_total"] = (item["total_cost"] / total_cost) * 100

    # Get recipe yield
    recipe = await get_recipe(recipe_id)
    cost_per_unit = total_cost / recipe.yield_quantity

    # Get selling price if available
    selling_price = await get_product_selling_price(recipe.product_id)
    profit_margin = None
    if selling_price:
        profit_margin = ((selling_price - cost_per_unit) / selling_price) * 100

    # Store cost calculation
    cost_record = await store_recipe_cost(
        recipe_id=recipe_id,
        total_cost=total_cost,
        cost_per_unit=cost_per_unit,
        cost_breakdown=cost_breakdown,
        selling_price=selling_price,
        profit_margin=profit_margin
    )

    return cost_record
```

### Unit Conversion System
```python
class UnitConverter:
    """
    Convert between different measurement units.
    """

    WEIGHT_CONVERSIONS = {
        'kg': 1000,       # Base unit: grams
        'g': 1,
        'mg': 0.001,
        'lb': 453.592,
        'oz': 28.3495
    }

    VOLUME_CONVERSIONS = {
        'L': 1000,        # Base unit: milliliters
        'mL': 1,
        'cup': 236.588,
        'tbsp': 14.7868,
        'tsp': 4.92892
    }

    @classmethod
    def convert(cls, quantity: float, from_unit: str, to_unit: str) -> float:
        """
        Convert quantity from one unit to another.
        """
        # Check if units are in same category
        if from_unit in cls.WEIGHT_CONVERSIONS and to_unit in cls.WEIGHT_CONVERSIONS:
            # Convert to base unit (grams) then to target
            base_quantity = quantity * cls.WEIGHT_CONVERSIONS[from_unit]
            return base_quantity / cls.WEIGHT_CONVERSIONS[to_unit]

        elif from_unit in cls.VOLUME_CONVERSIONS and to_unit in cls.VOLUME_CONVERSIONS:
            # Convert to base unit (mL) then to target
            base_quantity = quantity * cls.VOLUME_CONVERSIONS[from_unit]
            return base_quantity / cls.VOLUME_CONVERSIONS[to_unit]

        else:
            raise ValueError(f"Cannot convert {from_unit} to {to_unit}")

    @staticmethod
    def round_to_practical_value(quantity: float, unit: str) -> float:
        """
        Round to practical measurement values.
        """
        if unit in ['kg']:
            return round(quantity, 2)   # 10g precision
        elif unit in ['g', 'mL']:
            return round(quantity, 1)   # 0.1 precision
        elif unit in ['mg']:
            return round(quantity, 0)   # 1mg precision
        else:
            return round(quantity, 2)   # Default 2 decimals
```
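
A quick self-contained check of the two-step conversion (to base unit, then to target), reimplemented inline so it runs without the service code and using the same factor table as the weight conversions above:

```python
# Inline check of weight conversion via a base unit (grams),
# mirroring the UnitConverter factor table shown above.
WEIGHT = {'kg': 1000, 'g': 1, 'mg': 0.001, 'lb': 453.592, 'oz': 28.3495}

def convert_weight(quantity: float, from_unit: str, to_unit: str) -> float:
    return quantity * WEIGHT[from_unit] / WEIGHT[to_unit]

print(convert_weight(2.5, 'kg', 'g'))           # 2500.0
print(round(convert_weight(1, 'lb', 'oz'), 2))  # 16.0
```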

## Events & Messaging

### Published Events (RabbitMQ)

**Exchange**: `recipes`
**Routing Keys**: `recipes.created`, `recipes.updated`, `recipes.cost_changed`

**Recipe Created Event**
```json
{
  "event_type": "recipe_created",
  "tenant_id": "uuid",
  "recipe_id": "uuid",
  "recipe_name": "Baguette Tradicional",
  "category": "bread",
  "yield_quantity": 100,
  "yield_unit": "units",
  "ingredient_count": 5,
  "created_by": "uuid",
  "timestamp": "2025-11-06T10:30:00Z"
}
```

**Recipe Cost Changed Event**
```json
{
  "event_type": "recipe_cost_changed",
  "tenant_id": "uuid",
  "recipe_id": "uuid",
  "recipe_name": "Croissant",
  "old_cost_per_unit": 0.45,
  "new_cost_per_unit": 0.52,
  "change_percentage": 15.56,
  "reason": "flour_price_increase",
  "timestamp": "2025-11-06T14:00:00Z"
}
```

**Recipe Scaled Event**
```json
{
  "event_type": "recipe_scaled",
  "tenant_id": "uuid",
  "recipe_id": "uuid",
  "recipe_name": "Whole Wheat Bread",
  "original_yield": 50,
  "target_yield": 200,
  "scaling_factor": 4.0,
  "scaled_by": "uuid",
  "used_in_batch_id": "uuid",
  "timestamp": "2025-11-06T08:00:00Z"
}
```

### Consumed Events
- **From Inventory**: Ingredient price updates trigger cost recalculation
- **From Production**: Batch completion with actual yields updates recipe accuracy
- **From Procurement**: New ingredient purchases may affect costs
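
A hedged sketch of how a publisher might assemble the cost-changed payload shown above before handing it to the message broker (the helper name is illustrative; the actual publishing code is not part of this README):

```python
import json
from datetime import datetime, timezone

# Illustrative helper: builds the recipe_cost_changed payload shown above.
def build_cost_changed_event(tenant_id, recipe_id, recipe_name,
                             old_cost, new_cost, reason):
    change_pct = round((new_cost - old_cost) / old_cost * 100, 2)
    return {
        "event_type": "recipe_cost_changed",
        "tenant_id": tenant_id,
        "recipe_id": recipe_id,
        "recipe_name": recipe_name,
        "old_cost_per_unit": old_cost,
        "new_cost_per_unit": new_cost,
        "change_percentage": change_pct,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    }

event = build_cost_changed_event("t-1", "r-1", "Croissant", 0.45, 0.52,
                                 "flour_price_increase")
body = json.dumps(event)  # serialized body for the "recipes" exchange
```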

## Custom Metrics (Prometheus)

```python
from prometheus_client import Counter, Histogram

# Recipe metrics
recipes_total = Counter(
    'recipes_total',
    'Total recipes in system',
    ['tenant_id', 'category', 'status']
)

recipe_cost_per_unit = Histogram(
    'recipe_cost_per_unit_euros',
    'Recipe cost per unit distribution',
    ['tenant_id', 'category'],
    buckets=[0.10, 0.25, 0.50, 0.75, 1.00, 1.50, 2.00, 3.00, 5.00]
)

# Scaling metrics
recipe_scaling_total = Counter(
    'recipe_scaling_operations_total',
    'Total recipe scaling operations',
    ['tenant_id', 'recipe_id']
)

scaling_factor_distribution = Histogram(
    'recipe_scaling_factor',
    'Recipe scaling factor distribution',
    ['tenant_id'],
    buckets=[0.5, 0.75, 1.0, 1.5, 2.0, 3.0, 5.0, 10.0]
)

# Cost calculation metrics
cost_calculations_total = Counter(
    'recipe_cost_calculations_total',
    'Total cost calculations performed',
    ['tenant_id']
)

profit_margin_percentage = Histogram(
    'recipe_profit_margin_percentage',
    'Recipe profit margin distribution',
    ['tenant_id', 'category'],
    buckets=[0, 10, 20, 30, 40, 50, 60, 70, 80]
)
```

## Configuration

### Environment Variables

**Service Configuration:**
- `PORT` - Service port (default: 8009)
- `DATABASE_URL` - PostgreSQL connection string
- `REDIS_URL` - Redis connection string
- `RABBITMQ_URL` - RabbitMQ connection string

**Recipe Configuration:**
- `ENABLE_AUTO_COST_UPDATE` - Auto-recalculate costs on price changes (default: true)
- `COST_CACHE_TTL_SECONDS` - Cost cache duration (default: 3600)
- `MIN_PROFIT_MARGIN_PERCENTAGE` - Minimum acceptable margin (default: 30)
- `ALERT_ON_LOW_MARGIN` - Alert when margin drops below threshold (default: true)

**Scaling Configuration:**
- `MAX_SCALING_FACTOR` - Maximum scaling multiplier (default: 10.0)
- `MIN_SCALING_FACTOR` - Minimum scaling multiplier (default: 0.1)
- `ENABLE_PRACTICAL_ROUNDING` - Round to practical values (default: true)

**Validation:**
- `REQUIRE_NUTRITION_INFO` - Require nutritional data (default: false)
- `REQUIRE_ALLERGEN_INFO` - Require allergen declaration (default: true)
- `VALIDATE_INGREDIENT_AVAILABILITY` - Check inventory before saving (default: true)
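
How the scaling variables might be read at startup, as a minimal sketch (the service's real settings class likely uses Pydantic and may differ; the class and field names here are illustrative):

```python
import os
from dataclasses import dataclass

# Illustrative settings loader for the scaling variables listed above,
# with the documented defaults applied when the variables are unset.
@dataclass
class ScalingSettings:
    max_factor: float = float(os.getenv("MAX_SCALING_FACTOR", "10.0"))
    min_factor: float = float(os.getenv("MIN_SCALING_FACTOR", "0.1"))
    practical_rounding: bool = (
        os.getenv("ENABLE_PRACTICAL_ROUNDING", "true").lower() == "true"
    )

settings = ScalingSettings()
print(settings.max_factor, settings.min_factor, settings.practical_rounding)
```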

## Development Setup

### Prerequisites
- Python 3.11+
- PostgreSQL 17
- Redis 7.4
- RabbitMQ 4.1

### Local Development
```bash
cd services/recipes
python -m venv venv
source venv/bin/activate

pip install -r requirements.txt

export DATABASE_URL=postgresql://user:pass@localhost:5432/recipes
export REDIS_URL=redis://localhost:6379/0
export RABBITMQ_URL=amqp://guest:guest@localhost:5672/

alembic upgrade head
python main.py
```

### Running Tests
```bash
pytest tests/ -v --cov=app
```

### API Documentation
Access Swagger UI: `http://localhost:8009/docs`

## Integration Points

### Dependencies
- **Inventory Service** - Real-time ingredient prices and availability
- **Production Service** - Recipe scaling for production batches
- **Procurement Service** - Ingredient specifications for ordering
- **Auth Service** - User authentication for recipe creation
- **PostgreSQL** - Recipe data storage
- **Redis** - Cost caching
- **RabbitMQ** - Event publishing

### Dependents
- **Production Service** - Uses recipes for batch planning
- **Inventory Service** - Knows required ingredients
- **Procurement Service** - Plans purchases based on recipes
- **Forecasting Service** - Recipe yield data for demand planning
- **AI Insights Service** - Cost optimization recommendations
- **Frontend Dashboard** - Recipe management UI

## Security Measures

### Authentication & Authorization
- JWT token validation on all endpoints
- Tenant isolation at database level
- Role-based access control (admin, manager, staff)
- Recipe ownership verification

### Data Protection
- Tenant-scoped queries (prevent data leaks)
- Input validation with Pydantic schemas
- SQL injection prevention (parameterized queries)
- XSS protection on recipe instructions
- Recipe versioning (prevent accidental overwrites)

### Audit Logging
```python
# Log all recipe modifications
logger.info(
    "recipe_updated",
    recipe_id=recipe.id,
    tenant_id=recipe.tenant_id,
    updated_by=current_user.id,
    changes=changes_dict,
    timestamp=datetime.utcnow()
)
```

## Competitive Advantages

### 1. Real-Time Cost Tracking
Unlike static recipe books, costs update automatically when ingredient prices change, enabling immediate pricing decisions.

### 2. Intelligent Scaling
An advanced scaling algorithm with practical rounding ensures recipes work in real-world production scenarios, not just mathematically.

### 3. Cross-Service Intelligence
Recipe data flows seamlessly to production, inventory, and procurement, with no manual data entry or synchronization.

### 4. EU Compliance Built-In
Nutritional facts and allergen tracking meet EU food labeling regulations (EU FIC 1169/2011), avoiding costly fines.

### 5. Cost Breakdown Analysis
See exactly which ingredients drive costs, enabling targeted negotiations with suppliers or ingredient substitutions.

### 6. Recipe Versioning
Track recipe changes over time, enabling quality control and the ability to revert to previous versions.

## Business Value for VUE Madrid

### Problem Statement
Spanish bakeries struggle with:
- Inconsistent product quality due to informal recipes
- Unknown production costs leading to poor pricing
- Manual batch scaling errors causing waste
- EU labeling compliance complexity
- Recipe knowledge lost when staff leave

### Solution
Bakery-IA Recipes Service provides:
- **Standardized Production**: Digital recipes ensure consistency
- **Cost Transparency**: Real-time cost calculation for informed pricing
- **Batch Scaling**: Automatic ingredient calculation for any volume
- **Compliance**: Built-in EU food labeling support
- **Knowledge Base**: Recipes preserved digitally forever

### Quantifiable Impact

**Cost Savings:**
- €50-150/month from improved pricing decisions
- €100-300/month from waste reduction (accurate scaling)
- €500-5,000 in avoided fines (compliance, one-off)
- **Total: €150-450/month in recurring savings**

**Time Savings:**
- 3-5 hours/week on manual recipe calculations
- 2-3 hours/week on cost analysis
- 1-2 hours/week on batch planning
- **Total: 6-10 hours/week saved**

**Quality Improvements:**
- 95%+ batch consistency vs. 70-80% manual
- 99%+ cost accuracy vs. ±20-30% estimation
- 100% EU labeling compliance
- Zero recipe knowledge loss

### Target Market Fit (Spanish Bakeries)
- **Regulatory**: EU food labeling laws (FIC 1169/2011) require detailed allergen and nutritional information
- **Market Size**: 10,000+ bakeries in Spain need recipe management
- **Pain Point**: Most bakeries use paper recipes or personal knowledge
- **Differentiation**: First Spanish bakery platform with integrated recipe costing

### ROI Calculation
**Investment**: €0 additional (included in platform subscription)
**Monthly Savings**: €150-450
**Annual ROI**: €1,800-5,400 value per bakery
**Payback**: Immediate (included in subscription)

---

## Technical Innovation

### Intelligent Scaling Algorithm
Scales recipes while maintaining practical measurements (e.g., rounds to 0.5g increments for precision scales).

### Real-Time Cost Engine
Recalculates recipe costs in <100ms when ingredient prices change, using Redis caching for performance.
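
The caching idea behind the cost engine can be sketched with a get-or-compute helper; this is an in-memory stand-in for the Redis cache, with illustrative names, not the service's actual implementation:

```python
import time

# In-memory stand-in for the Redis cost cache (illustrative only).
_cache: dict = {}

def get_or_compute(key, compute, ttl_seconds=3600):
    """Return a cached value if still fresh, otherwise recompute and store it."""
    now = time.monotonic()
    hit = _cache.get(key)
    if hit is not None and now - hit[0] < ttl_seconds:
        return hit[1]
    value = compute()
    _cache[key] = (now, value)
    return value

# First call computes; the second is served from cache.
calls = []
def compute_cost():
    calls.append(1)        # record that a real computation happened
    return 0.52

print(get_or_compute("recipe:r-1:cost", compute_cost), len(calls))  # 0.52 1
print(get_or_compute("recipe:r-1:cost", compute_cost), len(calls))  # 0.52 1
```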

### EU Compliance Automation
Automatically generates EU-compliant food labels with nutritional facts and allergen declarations.

### Cross-Service Integration
Recipe data flows to 5+ other services (Production, Inventory, Procurement, Forecasting, AI Insights), enabling platform-wide intelligence.

---

**Copyright © 2025 Bakery-IA. All rights reserved.**

84
services/recipes/alembic.ini
Normal file
@@ -0,0 +1,84 @@
# ================================================================
# services/recipes/alembic.ini - Alembic Configuration
# ================================================================
[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration file names
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d_%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
timezone = Europe/Madrid

# max length of characters to apply to the "slug" field
truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
sourceless = false

# version of a migration file's filename format
version_num_format = %%s

# version path separator
version_path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
output_encoding = utf-8

# Database URL - will be overridden by environment variable or settings
sqlalchemy.url = postgresql+asyncpg://recipes_user:password@recipes-db-service:5432/recipes_db

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts.

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
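The `sqlalchemy.url` above is only a placeholder; the comment says it is overridden at runtime. The usual Alembic pattern is to do that in `migrations/env.py` via `config.set_main_option("sqlalchemy.url", ...)`. A minimal sketch of the resolution logic, assuming an environment variable named `DATABASE_URL` (this repo may use a different name or pull it from its settings object):

```python
def resolve_db_url(ini_url: str, env: dict) -> str:
    """Prefer an environment-provided URL; fall back to the alembic.ini value.

    In migrations/env.py one would call:
        config.set_main_option("sqlalchemy.url", resolve_db_url(ini_url, os.environ))
    The DATABASE_URL name is an assumption, not taken from this commit.
    """
    return env.get("DATABASE_URL") or ini_url

print(resolve_db_url("postgresql+asyncpg://u:p@h/db", {"DATABASE_URL": "postgresql://override"}))
```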
1
services/recipes/app/__init__.py
Normal file
@@ -0,0 +1 @@
# services/recipes/app/__init__.py
1
services/recipes/app/api/__init__.py
Normal file
@@ -0,0 +1 @@
# services/recipes/app/api/__init__.py
237
services/recipes/app/api/audit.py
Normal file
@@ -0,0 +1,237 @@
# services/recipes/app/api/audit.py
"""
Audit Logs API - Retrieve audit trail for recipes service
"""

from fastapi import APIRouter, Depends, HTTPException, Query, Path, status
from typing import Optional, Dict, Any
from uuid import UUID
from datetime import datetime
import structlog
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession

from app.models import AuditLog
from shared.auth.decorators import get_current_user_dep
from shared.auth.access_control import require_user_role
from shared.routing import RouteBuilder
from shared.models.audit_log_schemas import (
    AuditLogResponse,
    AuditLogListResponse,
    AuditLogStatsResponse
)
from app.core.database import db_manager

route_builder = RouteBuilder('recipes')
router = APIRouter(tags=["audit-logs"])
logger = structlog.get_logger()


async def get_db():
    """Database session dependency"""
    async with db_manager.get_session() as session:
        yield session


@router.get(
    route_builder.build_base_route("audit-logs"),
    response_model=AuditLogListResponse
)
@require_user_role(['admin', 'owner'])
async def get_audit_logs(
    tenant_id: UUID = Path(..., description="Tenant ID"),
    start_date: Optional[datetime] = Query(None, description="Filter logs from this date"),
    end_date: Optional[datetime] = Query(None, description="Filter logs until this date"),
    user_id: Optional[UUID] = Query(None, description="Filter by user ID"),
    action: Optional[str] = Query(None, description="Filter by action type"),
    resource_type: Optional[str] = Query(None, description="Filter by resource type"),
    severity: Optional[str] = Query(None, description="Filter by severity level"),
    search: Optional[str] = Query(None, description="Search in description field"),
    limit: int = Query(100, ge=1, le=1000, description="Number of records to return"),
    offset: int = Query(0, ge=0, description="Number of records to skip"),
    current_user: Dict[str, Any] = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Get audit logs for recipes service.
    Requires admin or owner role.
    """
    try:
        logger.info(
            "Retrieving audit logs",
            tenant_id=tenant_id,
            user_id=current_user.get("user_id"),
            filters={
                "start_date": start_date,
                "end_date": end_date,
                "action": action,
                "resource_type": resource_type,
                "severity": severity
            }
        )

        # Build query filters
        filters = [AuditLog.tenant_id == tenant_id]

        if start_date:
            filters.append(AuditLog.created_at >= start_date)
        if end_date:
            filters.append(AuditLog.created_at <= end_date)
        if user_id:
            filters.append(AuditLog.user_id == user_id)
        if action:
            filters.append(AuditLog.action == action)
        if resource_type:
            filters.append(AuditLog.resource_type == resource_type)
        if severity:
            filters.append(AuditLog.severity == severity)
        if search:
            filters.append(AuditLog.description.ilike(f"%{search}%"))

        # Count total matching records
        count_query = select(func.count()).select_from(AuditLog).where(and_(*filters))
        total_result = await db.execute(count_query)
        total = total_result.scalar() or 0

        # Fetch paginated results
        query = (
            select(AuditLog)
            .where(and_(*filters))
            .order_by(AuditLog.created_at.desc())
            .limit(limit)
            .offset(offset)
        )

        result = await db.execute(query)
        audit_logs = result.scalars().all()

        # Convert to response models
        items = [AuditLogResponse.from_orm(log) for log in audit_logs]

        logger.info(
            "Successfully retrieved audit logs",
            tenant_id=tenant_id,
            total=total,
            returned=len(items)
        )

        return AuditLogListResponse(
            items=items,
            total=total,
            limit=limit,
            offset=offset,
            has_more=(offset + len(items)) < total
        )

    except Exception as e:
        logger.error(
            "Failed to retrieve audit logs",
            error=str(e),
            tenant_id=tenant_id
        )
        raise HTTPException(
            status_code=500,
            detail=f"Failed to retrieve audit logs: {str(e)}"
        )


@router.get(
    route_builder.build_base_route("audit-logs/stats"),
    response_model=AuditLogStatsResponse
)
@require_user_role(['admin', 'owner'])
async def get_audit_log_stats(
    tenant_id: UUID = Path(..., description="Tenant ID"),
    start_date: Optional[datetime] = Query(None, description="Filter logs from this date"),
    end_date: Optional[datetime] = Query(None, description="Filter logs until this date"),
    current_user: Dict[str, Any] = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Get audit log statistics for recipes service.
    Requires admin or owner role.
    """
    try:
        logger.info(
            "Retrieving audit log statistics",
            tenant_id=tenant_id,
            user_id=current_user.get("user_id")
        )

        # Build base filters
        filters = [AuditLog.tenant_id == tenant_id]
        if start_date:
            filters.append(AuditLog.created_at >= start_date)
        if end_date:
            filters.append(AuditLog.created_at <= end_date)

        # Total events
        count_query = select(func.count()).select_from(AuditLog).where(and_(*filters))
        total_result = await db.execute(count_query)
        total_events = total_result.scalar() or 0

        # Events by action
        action_query = (
            select(AuditLog.action, func.count().label('count'))
            .where(and_(*filters))
            .group_by(AuditLog.action)
        )
        action_result = await db.execute(action_query)
        events_by_action = {row.action: row.count for row in action_result}

        # Events by severity
        severity_query = (
            select(AuditLog.severity, func.count().label('count'))
            .where(and_(*filters))
            .group_by(AuditLog.severity)
        )
        severity_result = await db.execute(severity_query)
        events_by_severity = {row.severity: row.count for row in severity_result}

        # Events by resource type
        resource_query = (
            select(AuditLog.resource_type, func.count().label('count'))
            .where(and_(*filters))
            .group_by(AuditLog.resource_type)
        )
        resource_result = await db.execute(resource_query)
        events_by_resource_type = {row.resource_type: row.count for row in resource_result}

        # Date range
        date_range_query = (
            select(
                func.min(AuditLog.created_at).label('min_date'),
                func.max(AuditLog.created_at).label('max_date')
            )
            .where(and_(*filters))
        )
        date_result = await db.execute(date_range_query)
        date_row = date_result.one()

        logger.info(
            "Successfully retrieved audit log statistics",
            tenant_id=tenant_id,
            total_events=total_events
        )

        return AuditLogStatsResponse(
            total_events=total_events,
            events_by_action=events_by_action,
            events_by_severity=events_by_severity,
            events_by_resource_type=events_by_resource_type,
            date_range={
                "min": date_row.min_date,
                "max": date_row.max_date
            }
        )

    except Exception as e:
        logger.error(
            "Failed to retrieve audit log statistics",
            error=str(e),
            tenant_id=tenant_id
        )
        raise HTTPException(
            status_code=500,
            detail=f"Failed to retrieve audit log statistics: {str(e)}"
        )
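The list endpoint computes `has_more=(offset + len(items)) < total` for offset pagination. A minimal sketch of that invariant in isolation (function name is illustrative, not from the service):

```python
def has_more(total: int, offset: int, returned: int) -> bool:
    # More pages exist when the records consumed so far
    # (skipped offset plus this page) fall short of the match count.
    return (offset + returned) < total

# e.g. 250 matches, pages of 100: the page at offset=200 is the last one
print(has_more(total=250, offset=0, returned=100))   # True
print(has_more(total=250, offset=200, returned=50))  # False
```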
47
services/recipes/app/api/internal.py
Normal file
@@ -0,0 +1,47 @@
"""
Internal API for Recipes Service
Handles internal service-to-service operations
"""

from fastapi import APIRouter, Depends, HTTPException, Header
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func
from uuid import UUID
import structlog

from app.core.database import get_db
from app.core.config import settings
from app.models.recipes import Recipe, RecipeStatus

logger = structlog.get_logger()
router = APIRouter(prefix="/internal", tags=["internal"])


@router.get("/count")
async def get_recipe_count(
    tenant_id: str,
    db: AsyncSession = Depends(get_db)
):
    """
    Get count of recipes for onboarding status check.
    Counts DRAFT, TESTING, and ACTIVE recipes (excludes ARCHIVED/DISCONTINUED).
    Internal endpoint for the tenant service.
    """
    try:
        count = await db.scalar(
            select(func.count()).select_from(Recipe)
            .where(
                Recipe.tenant_id == UUID(tenant_id),
                Recipe.status.in_([RecipeStatus.DRAFT, RecipeStatus.ACTIVE, RecipeStatus.TESTING])
            )
        )

        return {
            "count": count or 0,
            "tenant_id": tenant_id
        }

    except Exception as e:
        logger.error("Failed to get recipe count", tenant_id=tenant_id, error=str(e))
        raise HTTPException(status_code=500, detail=f"Failed to get recipe count: {str(e)}")
426
services/recipes/app/api/internal_demo.py
Normal file
@@ -0,0 +1,426 @@
"""
Internal Demo Cloning API for Recipes Service
Service-to-service endpoint for cloning recipe and production data
"""

from fastapi import APIRouter, Depends, HTTPException, Header
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, delete, func
import structlog
import uuid
from uuid import UUID
from datetime import datetime, timezone, timedelta
from typing import Optional, Any
import os
import sys
import json
from pathlib import Path

sys.path.insert(0, str(Path(__file__).parent.parent.parent.parent))
from shared.utils.demo_dates import adjust_date_for_demo, resolve_time_marker
from shared.utils.demo_id_transformer import transform_id

from app.core.database import get_db
from app.models.recipes import (
    Recipe, RecipeIngredient, ProductionBatch, ProductionIngredientConsumption,
    RecipeStatus, ProductionStatus, MeasurementUnit, ProductionPriority
)

from app.core.config import settings

logger = structlog.get_logger()
router = APIRouter(prefix="/internal/demo", tags=["internal"])

# Base demo tenant IDs
DEMO_TENANT_PROFESSIONAL = "a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6"


def parse_date_field(
    field_value: Any,
    session_time: datetime,
    field_name: str = "date"
) -> Optional[datetime]:
    """
    Parse a date field from JSON, supporting BASE_TS markers and ISO timestamps.

    Args:
        field_value: The date field value (a BASE_TS marker, ISO string, or None)
        session_time: Session creation time (timezone-aware UTC)
        field_name: Name of the field (for logging)

    Returns:
        Timezone-aware UTC datetime or None
    """
    if field_value is None:
        return None

    # Handle BASE_TS markers
    if isinstance(field_value, str) and field_value.startswith("BASE_TS"):
        try:
            return resolve_time_marker(field_value, session_time)
        except (ValueError, AttributeError) as e:
            logger.warning(
                "Failed to resolve BASE_TS marker",
                field_name=field_name,
                marker=field_value,
                error=str(e)
            )
            return None

    # Handle ISO timestamps (legacy format - convert to absolute datetime)
    if isinstance(field_value, str) and ('T' in field_value or 'Z' in field_value):
        try:
            parsed_date = datetime.fromisoformat(field_value.replace('Z', '+00:00'))
            # Adjust relative to session time
            return adjust_date_for_demo(parsed_date, session_time)
        except (ValueError, AttributeError) as e:
            logger.warning(
                "Failed to parse ISO timestamp",
                field_name=field_name,
                value=field_value,
                error=str(e)
            )
            return None

    logger.warning(
        "Unknown date format",
        field_name=field_name,
        value=field_value,
        value_type=type(field_value).__name__
    )
    return None


@router.post("/clone")
async def clone_demo_data(
    base_tenant_id: str,
    virtual_tenant_id: str,
    demo_account_type: str,
    session_id: Optional[str] = None,
    session_created_at: Optional[str] = None,
    db: AsyncSession = Depends(get_db)
):
    """
    Clone recipes service data for a virtual demo tenant.

    This endpoint creates fresh demo data by:
    1. Loading seed data from JSON files
    2. Applying XOR-based ID transformation
    3. Adjusting dates relative to session creation time
    4. Creating records in the virtual tenant

    Args:
        base_tenant_id: Template tenant UUID (for reference)
        virtual_tenant_id: Target virtual tenant UUID
        demo_account_type: Type of demo account
        session_id: Originating session ID for tracing
        session_created_at: Session creation timestamp for date adjustment

    Returns:
        Cloning status and record counts
    """
    start_time = datetime.now(timezone.utc)

    try:
        # Validate UUIDs
        virtual_uuid = uuid.UUID(virtual_tenant_id)

        # Parse session creation time for date adjustment
        if session_created_at:
            try:
                session_time = datetime.fromisoformat(session_created_at.replace('Z', '+00:00'))
            except (ValueError, AttributeError):
                session_time = start_time
        else:
            session_time = start_time

        logger.info(
            "Starting recipes data cloning",
            base_tenant_id=base_tenant_id,
            virtual_tenant_id=virtual_tenant_id,
            demo_account_type=demo_account_type,
            session_id=session_id,
            session_created_at=session_created_at
        )

        # Load seed data from JSON files
        from shared.utils.seed_data_paths import get_seed_data_path

        if demo_account_type == "professional":
            json_file = get_seed_data_path("professional", "04-recipes.json")
        elif demo_account_type == "enterprise":
            json_file = get_seed_data_path("enterprise", "04-recipes.json")
        elif demo_account_type == "enterprise_child":
            json_file = get_seed_data_path("enterprise", "04-recipes.json", child_id=base_tenant_id)
        else:
            raise ValueError(f"Invalid demo account type: {demo_account_type}")

        # Load JSON data
        with open(json_file, 'r', encoding='utf-8') as f:
            seed_data = json.load(f)

        # Track cloning statistics
        stats = {
            "recipes": 0,
            "recipe_ingredients": 0
        }

        # First, build recipe ID map by processing all recipes
        recipe_id_map = {}

        # Create Recipes
        for recipe_data in seed_data.get('recipes', []):
            # Transform recipe ID using XOR
            try:
                recipe_uuid = uuid.UUID(recipe_data['id'])
                transformed_id = transform_id(recipe_uuid, virtual_uuid)
            except ValueError as e:
                logger.error("Failed to parse recipe UUID",
                             recipe_id=recipe_data['id'],
                             error=str(e))
                raise HTTPException(
                    status_code=400,
                    detail=f"Invalid UUID format in recipe data: {str(e)}"
                )

            # Parse date fields (supports BASE_TS markers and ISO timestamps)
            adjusted_created_at = parse_date_field(
                recipe_data.get('created_at'),
                session_time,
                "created_at"
            )
            adjusted_updated_at = parse_date_field(
                recipe_data.get('updated_at'),
                session_time,
                "updated_at"
            )

            # Map field names from seed data to model fields
            # Handle yield_quantity/yield_unit (may be named finished_product_quantity/unit in seed data)
            yield_quantity = recipe_data.get('yield_quantity') or recipe_data.get('finished_product_quantity', 1.0)
            yield_unit_str = recipe_data.get('yield_unit') or recipe_data.get('finished_product_unit', 'UNITS')

            # Convert yield_unit string to enum if needed
            if isinstance(yield_unit_str, str):
                try:
                    yield_unit = MeasurementUnit[yield_unit_str.upper()]
                except KeyError:
                    yield_unit = MeasurementUnit.UNITS
            else:
                yield_unit = yield_unit_str

            # Convert status string to enum if needed
            status = recipe_data.get('status', 'ACTIVE')
            if isinstance(status, str):
                try:
                    status = RecipeStatus[status.upper()]
                except KeyError:
                    status = RecipeStatus.ACTIVE

            new_recipe = Recipe(
                id=str(transformed_id),
                tenant_id=virtual_uuid,
                name=recipe_data['name'],
                description=recipe_data.get('description'),
                recipe_code=recipe_data.get('recipe_code'),
                version=recipe_data.get('version', '1.0'),
                status=status,
                finished_product_id=recipe_data['finished_product_id'],
                yield_quantity=yield_quantity,
                yield_unit=yield_unit,
                category=recipe_data.get('category'),
                difficulty_level=recipe_data.get('difficulty_level', 1),
                prep_time_minutes=recipe_data.get('prep_time_minutes') or recipe_data.get('preparation_time_minutes'),
                cook_time_minutes=recipe_data.get('cook_time_minutes') or recipe_data.get('baking_time_minutes'),
                total_time_minutes=recipe_data.get('total_time_minutes'),
                rest_time_minutes=recipe_data.get('rest_time_minutes') or recipe_data.get('cooling_time_minutes'),
                instructions=recipe_data.get('instructions'),
                preparation_notes=recipe_data.get('notes') or recipe_data.get('preparation_notes'),
                created_at=adjusted_created_at,
                updated_at=adjusted_updated_at
            )
            db.add(new_recipe)
            stats["recipes"] += 1

            # Add recipe ID to map for ingredients
            recipe_id_map[recipe_data['id']] = str(transformed_id)

        # Create Recipe Ingredients
        for recipe_ingredient_data in seed_data.get('recipe_ingredients', []):
            # Transform ingredient ID using XOR
            try:
                ingredient_uuid = uuid.UUID(recipe_ingredient_data['id'])
                transformed_id = transform_id(ingredient_uuid, virtual_uuid)
            except ValueError as e:
                logger.error("Failed to parse recipe ingredient UUID",
                             ingredient_id=recipe_ingredient_data['id'],
                             error=str(e))
                raise HTTPException(
                    status_code=400,
                    detail=f"Invalid UUID format in recipe ingredient data: {str(e)}"
                )

            # Get the transformed recipe ID
            recipe_id = recipe_id_map.get(recipe_ingredient_data['recipe_id'])
            if not recipe_id:
                logger.error("Recipe not found for ingredient",
                             recipe_id=recipe_ingredient_data['recipe_id'])
                continue

            # Convert unit string to enum if needed
            unit_str = recipe_ingredient_data.get('unit', 'KILOGRAMS')
            if isinstance(unit_str, str):
                try:
                    unit = MeasurementUnit[unit_str.upper()]
                except KeyError:
                    # Try without 'S' for singular forms
                    try:
                        unit = MeasurementUnit[unit_str.upper().rstrip('S')]
                    except KeyError:
                        unit = MeasurementUnit.KILOGRAMS
            else:
                unit = unit_str

            new_recipe_ingredient = RecipeIngredient(
                id=str(transformed_id),
                tenant_id=virtual_uuid,
                recipe_id=recipe_id,
                ingredient_id=recipe_ingredient_data['ingredient_id'],
                quantity=recipe_ingredient_data['quantity'],
                unit=unit,
                unit_cost=recipe_ingredient_data.get('cost_per_unit') or recipe_ingredient_data.get('unit_cost', 0.0),
                total_cost=recipe_ingredient_data.get('total_cost'),
                ingredient_order=recipe_ingredient_data.get('sequence') or recipe_ingredient_data.get('ingredient_order', 1),
                is_optional=recipe_ingredient_data.get('is_optional', False),
                ingredient_notes=recipe_ingredient_data.get('notes') or recipe_ingredient_data.get('ingredient_notes')
            )
            db.add(new_recipe_ingredient)
            stats["recipe_ingredients"] += 1

        await db.commit()

        duration_ms = int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000)

        logger.info(
            "Recipes data cloned successfully",
            virtual_tenant_id=virtual_tenant_id,
            records_cloned=stats,
            duration_ms=duration_ms
        )

        return {
            "service": "recipes",
            "status": "completed",
            "records_cloned": sum(stats.values()),
            "duration_ms": duration_ms,
            "details": {
                "recipes": stats["recipes"],
                "recipe_ingredients": stats["recipe_ingredients"],
                "virtual_tenant_id": str(virtual_tenant_id)
            }
        }

    except ValueError as e:
        logger.error("Invalid UUID format", error=str(e), virtual_tenant_id=virtual_tenant_id)
        raise HTTPException(status_code=400, detail=f"Invalid UUID: {str(e)}")

    except Exception as e:
        logger.error(
            "Failed to clone recipes data",
            error=str(e),
            virtual_tenant_id=virtual_tenant_id,
            exc_info=True
        )

        # Rollback on error
        await db.rollback()

        return {
            "service": "recipes",
            "status": "failed",
            "records_cloned": 0,
            "duration_ms": int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000),
            "error": str(e)
        }


@router.get("/clone/health")
async def clone_health_check():
    """
    Health check for internal cloning endpoint.
    Used by orchestrator to verify service availability.
    """
    return {
        "service": "recipes",
        "clone_endpoint": "available",
        "version": "2.0.0"
    }


@router.delete("/tenant/{virtual_tenant_id}")
async def delete_demo_tenant_data(
    virtual_tenant_id: UUID,
    db: AsyncSession = Depends(get_db)
):
    """
    Delete all demo data for a virtual tenant.
    This endpoint is idempotent - safe to call multiple times.
    """
    start_time = datetime.now(timezone.utc)

    records_deleted = {
        "recipes": 0,
        "recipe_ingredients": 0,
        "total": 0
    }

    try:
        # Delete in reverse dependency order

        # 1. Delete recipe ingredients (depend on recipes)
        result = await db.execute(
            delete(RecipeIngredient)
            .where(RecipeIngredient.tenant_id == virtual_tenant_id)
        )
        records_deleted["recipe_ingredients"] = result.rowcount

        # 2. Delete recipes
        result = await db.execute(
            delete(Recipe)
            .where(Recipe.tenant_id == virtual_tenant_id)
        )
        records_deleted["recipes"] = result.rowcount

        records_deleted["total"] = sum(records_deleted.values())

        await db.commit()

        logger.info(
            "demo_data_deleted",
            service="recipes",
            virtual_tenant_id=str(virtual_tenant_id),
            records_deleted=records_deleted
        )

        return {
            "service": "recipes",
            "status": "deleted",
            "virtual_tenant_id": str(virtual_tenant_id),
            "records_deleted": records_deleted,
            "duration_ms": int((datetime.now(timezone.utc) - start_time).total_seconds() * 1000)
        }

    except Exception as e:
        await db.rollback()
        logger.error(
            "demo_data_deletion_failed",
            service="recipes",
            virtual_tenant_id=str(virtual_tenant_id),
            error=str(e)
        )
        raise HTTPException(
            status_code=500,
            detail=f"Failed to delete demo data: {str(e)}"
        )
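The `transform_id` helper from `shared.utils.demo_id_transformer` is not part of this commit; given that the clone endpoint's docstring calls it an "XOR-based ID transformation", a plausible sketch is XOR-ing the 128-bit integer forms of the seed ID and the virtual-tenant key. The implementation below is an assumption for illustration, not the repo's actual code:

```python
import uuid

def transform_id(original: uuid.UUID, tenant_key: uuid.UUID) -> uuid.UUID:
    """Illustrative XOR transform: deterministically maps a seed UUID
    into a per-tenant UUID. Not the repo's actual implementation."""
    return uuid.UUID(int=original.int ^ tenant_key.int)

base = uuid.UUID("a1b2c3d4-e5f6-47a8-b9c0-d1e2f3a4b5c6")
virtual = uuid.uuid4()
cloned = transform_id(base, virtual)

# XOR is its own inverse: re-applying the same tenant key recovers the
# base ID, so every seed row maps to a stable ID within one virtual tenant
# and foreign keys (e.g. recipe_id on ingredients) stay consistent.
assert transform_id(cloned, virtual) == base
```

This determinism is why the clone code can build `recipe_id_map` in one pass and trust that ingredient rows referencing the same seed recipe land on the same transformed ID.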
306
services/recipes/app/api/recipe_operations.py
Normal file
@@ -0,0 +1,306 @@
|
||||
# services/recipes/app/api/recipe_operations.py
|
||||
"""
|
||||
Recipe Operations API - Business operations and complex workflows
|
||||
"""
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException, Header, Query, Path
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from uuid import UUID
|
||||
import logging
|
||||
|
||||
from ..core.database import get_db
|
||||
from ..services.recipe_service import RecipeService
|
||||
from ..schemas.recipes import (
|
||||
RecipeResponse,
|
||||
RecipeDuplicateRequest,
|
||||
RecipeFeasibilityResponse,
|
||||
RecipeStatisticsResponse,
|
||||
)
|
||||
from shared.routing import RouteBuilder, RouteCategory
|
||||
from shared.auth.access_control import require_user_role, analytics_tier_required
|
||||
from shared.auth.decorators import get_current_user_dep
|
||||
|
||||
route_builder = RouteBuilder('recipes')
|
||||
logger = logging.getLogger(__name__)
|
||||
router = APIRouter(tags=["recipe-operations"])
|
||||
|
||||
|
||||
def get_user_id(x_user_id: str = Header(...)) -> UUID:
|
||||
"""Extract user ID from header"""
|
||||
try:
|
||||
return UUID(x_user_id)
|
||||
except ValueError:
|
||||
raise HTTPException(status_code=400, detail="Invalid user ID format")
|
||||
|
||||
|
||||
@router.post(
|
||||
route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "duplicate"]),
|
||||
response_model=RecipeResponse
|
||||
)
|
||||
@require_user_role(['admin', 'owner', 'member'])
|
||||
async def duplicate_recipe(
|
||||
tenant_id: UUID,
|
||||
recipe_id: UUID,
|
||||
duplicate_data: RecipeDuplicateRequest,
|
||||
user_id: UUID = Depends(get_user_id),
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Create a duplicate of an existing recipe"""
|
||||
try:
|
||||
recipe_service = RecipeService(db)
|
||||
|
||||
existing_recipe = recipe_service.get_recipe_with_ingredients(recipe_id)
|
||||
if not existing_recipe:
|
||||
raise HTTPException(status_code=404, detail="Recipe not found")
|
||||
|
||||
if existing_recipe["tenant_id"] != str(tenant_id):
|
||||
raise HTTPException(status_code=403, detail="Access denied")
|
||||
|
||||
result = await recipe_service.duplicate_recipe(
|
||||
recipe_id,
|
||||
duplicate_data.new_name,
|
||||
user_id
|
||||
)
|
||||
|
||||
if not result["success"]:
|
||||
raise HTTPException(status_code=400, detail=result["error"])
|
||||
|
||||
return RecipeResponse(**result["data"])
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as e:
|
||||
logger.error(f"Error duplicating recipe {recipe_id}: {e}")
|
||||
raise HTTPException(status_code=500, detail="Internal server error")
|
||||
|
||||
|
||||
@router.post(
|
||||
route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "activate"]),
|
||||
response_model=RecipeResponse
|
||||
)
|
||||
@require_user_role(['admin', 'owner', 'member'])
|
||||
async def activate_recipe(
|
||||
tenant_id: UUID,
|
||||
recipe_id: UUID,
|
||||
user_id: UUID = Depends(get_user_id),
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Activate a recipe for production"""
|
||||
try:
|
||||
recipe_service = RecipeService(db)
|
||||
|
||||
existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
|
||||
if not existing_recipe:
|
||||
raise HTTPException(status_code=404, detail="Recipe not found")
|
||||
|
||||
if existing_recipe["tenant_id"] != str(tenant_id):
|
||||
raise HTTPException(status_code=403, detail="Access denied")
|
||||
|
||||
result = await recipe_service.activate_recipe(recipe_id, user_id)
|
||||
|
||||
if not result["success"]:
|
||||
raise HTTPException(status_code=400, detail=result["error"])
|
||||
|
||||
return RecipeResponse(**result["data"])
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as e:
|
||||
logger.error(f"Error activating recipe {recipe_id}: {e}")
|
||||
raise HTTPException(status_code=500, detail="Internal server error")
|
||||
|
||||
|
||||
@router.get(
|
||||
route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "feasibility"]),
|
||||
response_model=RecipeFeasibilityResponse
|
||||
)
|
||||
@analytics_tier_required
|
||||
async def check_recipe_feasibility(
|
||||
tenant_id: UUID,
|
||||
recipe_id: UUID,
|
||||
batch_multiplier: float = Query(1.0, gt=0),
|
||||
current_user: dict = Depends(get_current_user_dep),
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""
|
||||
Check if recipe can be produced with current inventory (Professional+ tier)
|
||||
Supports batch scaling for production planning
|
||||
"""
|
||||
try:
|
||||
recipe_service = RecipeService(db)
|
||||
|
||||
existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
|
||||
if not existing_recipe:
|
||||
raise HTTPException(status_code=404, detail="Recipe not found")
|
||||
|
||||
if existing_recipe["tenant_id"] != str(tenant_id):
|
||||
raise HTTPException(status_code=403, detail="Access denied")
|
||||
|
||||
result = await recipe_service.check_recipe_feasibility(recipe_id, batch_multiplier)
|
||||
|
||||
if not result["success"]:
|
||||
raise HTTPException(status_code=400, detail=result["error"])
|
||||
|
||||
return RecipeFeasibilityResponse(**result["data"])
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as e:
|
||||
logger.error(f"Error checking recipe feasibility {recipe_id}: {e}")
|
||||
raise HTTPException(status_code=500, detail="Internal server error")
|
||||
|
||||
|
||||
@router.get(
    route_builder.build_dashboard_route("statistics"),
    response_model=RecipeStatisticsResponse
)
@require_user_role(['viewer', 'member', 'admin', 'owner'])
async def get_recipe_statistics(
    tenant_id: UUID,
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Get recipe statistics for dashboard"""
    try:
        recipe_service = RecipeService(db)
        stats = await recipe_service.get_recipe_statistics(tenant_id)

        return RecipeStatisticsResponse(**stats)

    except Exception as e:
        logger.error(f"Error getting recipe statistics: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, ["categories", "list"])
)
async def get_recipe_categories(
    tenant_id: UUID,
    db: AsyncSession = Depends(get_db)
):
    """Get list of recipe categories used by tenant"""
    try:
        recipe_service = RecipeService(db)

        recipes = await recipe_service.search_recipes(tenant_id, limit=1000)
        categories = sorted({recipe["category"] for recipe in recipes if recipe["category"]})

        return {"categories": categories}

    except Exception as e:
        logger.error(f"Error getting recipe categories: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


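The categories endpoint above deduplicates non-empty categories with a set and returns them sorted. The same transformation in isolation (the sample rows are illustrative):

```python
# Deduplicate non-empty categories and sort them, as the endpoint above does.
recipes = [
    {"category": "bread"},
    {"category": "pastry"},
    {"category": None},   # empty categories are skipped
    {"category": "bread"},  # duplicates collapse in the set
]
categories = sorted({r["category"] for r in recipes if r["category"]})
print(categories)  # → ['bread', 'pastry']
```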
@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, ["count"])
)
async def get_recipe_count(
    tenant_id: UUID,
    x_internal_request: str = Header(None),
    db: AsyncSession = Depends(get_db)
):
    """
    Get total count of recipes for a tenant
    Internal endpoint for subscription usage tracking
    """
    if x_internal_request != "true":
        raise HTTPException(status_code=403, detail="Internal endpoint only")

    try:
        recipe_service = RecipeService(db)
        recipes = await recipe_service.search_recipes(tenant_id, limit=10000)
        count = len(recipes)

        return {"count": count}

    except Exception as e:
        logger.error(f"Error getting recipe count: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")

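The internal-only guard above is a plain header comparison: only requests that explicitly send `x-internal-request: true` pass. Sketched as a standalone predicate (the function name is ours, not the service's):

```python
# Hedged sketch of the internal-only guard used by the count endpoint above.
def is_internal_request(x_internal_request):
    """Only requests explicitly marked internal pass; anything else is rejected."""
    return x_internal_request == "true"

print(is_internal_request("true"))  # → True
```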
# ============================================================================
# Tenant Data Deletion Operations (Internal Service Only)
# ============================================================================

from shared.auth.access_control import service_only_access
from shared.services.tenant_deletion import TenantDataDeletionResult
from app.services.tenant_deletion_service import RecipesTenantDeletionService


@router.delete(
    route_builder.build_base_route("tenant/{tenant_id}", include_tenant_prefix=False),
    response_model=dict
)
@service_only_access
async def delete_tenant_data(
    tenant_id: str = Path(..., description="Tenant ID to delete data for"),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Delete all recipes data for a tenant (Internal service only)
    """
    try:
        # Standard-library loggers do not accept structured keyword arguments,
        # so the tenant ID goes into the message itself.
        logger.info(f"recipes.tenant_deletion.api_called - tenant_id: {tenant_id}")

        deletion_service = RecipesTenantDeletionService(db)
        result = await deletion_service.safe_delete_tenant_data(tenant_id)

        if not result.success:
            raise HTTPException(
                status_code=500,
                detail=f"Tenant data deletion failed: {', '.join(result.errors)}"
            )

        return {
            "message": "Tenant data deletion completed successfully",
            "summary": result.to_dict()
        }
    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"recipes.tenant_deletion.api_error - tenant_id: {tenant_id}, error: {str(e)}", exc_info=True)
        raise HTTPException(status_code=500, detail=f"Failed to delete tenant data: {str(e)}")


@router.get(
    route_builder.build_base_route("tenant/{tenant_id}/deletion-preview", include_tenant_prefix=False),
    response_model=dict
)
@service_only_access
async def preview_tenant_data_deletion(
    tenant_id: str = Path(..., description="Tenant ID to preview deletion for"),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Preview what data would be deleted for a tenant (dry-run)
    """
    try:
        logger.info(f"recipes.tenant_deletion.preview_called - tenant_id: {tenant_id}")

        deletion_service = RecipesTenantDeletionService(db)
        preview_data = await deletion_service.get_tenant_data_preview(tenant_id)
        result = TenantDataDeletionResult(tenant_id=tenant_id, service_name=deletion_service.service_name)
        result.deleted_counts = preview_data
        result.success = True

        return {
            "tenant_id": tenant_id,
            "service": "recipes-service",
            "data_counts": result.deleted_counts,
            "total_items": sum(result.deleted_counts.values())
        }
    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"recipes.tenant_deletion.preview_error - tenant_id: {tenant_id}, error: {str(e)}", exc_info=True)
        raise HTTPException(status_code=500, detail=f"Failed to preview tenant data deletion: {str(e)}")
166
services/recipes/app/api/recipe_quality_configs.py
Normal file
@@ -0,0 +1,166 @@
# services/recipes/app/api/recipe_quality_configs.py
"""
Recipe Quality Configuration API - Atomic CRUD operations on RecipeQualityConfiguration
"""

from fastapi import APIRouter, Depends, HTTPException, Header
from sqlalchemy.ext.asyncio import AsyncSession
from typing import List
from uuid import UUID
import logging

from ..core.database import get_db
from ..services.recipe_service import RecipeService
from ..schemas.recipes import (
    RecipeQualityConfiguration,
    RecipeQualityConfigurationUpdate
)
from shared.routing import RouteBuilder, RouteCategory
from shared.auth.access_control import require_user_role

route_builder = RouteBuilder('recipes')
logger = logging.getLogger(__name__)
router = APIRouter(tags=["recipe-quality-configs"])


def get_user_id(x_user_id: str = Header(...)) -> UUID:
    """Extract user ID from header"""
    try:
        return UUID(x_user_id)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid user ID format")


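`get_user_id` above relies on `UUID()` rejecting malformed input. The same parse-or-reject pattern in isolation, returning `None` instead of raising an HTTP error (the `HTTPException` half needs FastAPI at runtime; `parse_user_id` is our name, not the service's):

```python
from uuid import UUID

# Parse a user ID the way get_user_id above does, but return None on failure
# instead of raising a FastAPI HTTPException.
def parse_user_id(raw):
    try:
        return UUID(raw)
    except (ValueError, TypeError):  # TypeError covers a missing header (None)
        return None

print(parse_user_id("not-a-uuid"))  # → None
```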
@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "quality-configuration"]),
    response_model=RecipeQualityConfiguration
)
async def get_recipe_quality_configuration(
    tenant_id: UUID,
    recipe_id: UUID,
    db: AsyncSession = Depends(get_db)
):
    """Get quality configuration for a specific recipe"""
    try:
        recipe_service = RecipeService(db)

        recipe = await recipe_service.get_recipe(tenant_id, recipe_id)
        if not recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        quality_config = recipe.get("quality_check_configuration")
        if not quality_config:
            quality_config = {
                "stages": {},
                "overall_quality_threshold": 7.0,
                "critical_stage_blocking": True,
                "auto_create_quality_checks": True,
                "quality_manager_approval_required": False
            }

        return quality_config

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error getting recipe quality configuration: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


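The endpoint above substitutes a default configuration when a recipe has none. The same dict-or-default pattern in isolation, with the defaults copied from the endpoint (the `7.0` threshold value comes from the source; its exact semantics are assumed):

```python
# Defaults mirror the endpoint above; each read gets a fresh copy so callers
# cannot mutate the shared defaults.
DEFAULT_QUALITY_CONFIG = {
    "stages": {},
    "overall_quality_threshold": 7.0,
    "critical_stage_blocking": True,
    "auto_create_quality_checks": True,
    "quality_manager_approval_required": False,
}

def effective_quality_config(recipe):
    return recipe.get("quality_check_configuration") or dict(DEFAULT_QUALITY_CONFIG)

print(effective_quality_config({})["overall_quality_threshold"])  # → 7.0
```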
@router.put(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "quality-configuration"]),
    response_model=RecipeQualityConfiguration
)
@require_user_role(['admin', 'owner', 'member'])
async def update_recipe_quality_configuration(
    tenant_id: UUID,
    recipe_id: UUID,
    quality_config: RecipeQualityConfigurationUpdate,
    user_id: UUID = Depends(get_user_id),
    db: AsyncSession = Depends(get_db)
):
    """Update quality configuration for a specific recipe"""
    try:
        recipe_service = RecipeService(db)

        recipe = await recipe_service.get_recipe(tenant_id, recipe_id)
        if not recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        updated_recipe = await recipe_service.update_recipe_quality_configuration(
            tenant_id, recipe_id, quality_config.dict(exclude_unset=True), user_id
        )

        return updated_recipe["quality_check_configuration"]

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error updating recipe quality configuration: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.post(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "quality-configuration", "stages", "{stage}", "templates"])
)
@require_user_role(['admin', 'owner', 'member'])
async def add_quality_templates_to_stage(
    tenant_id: UUID,
    recipe_id: UUID,
    stage: str,
    template_ids: List[UUID],
    user_id: UUID = Depends(get_user_id),
    db: AsyncSession = Depends(get_db)
):
    """Add quality templates to a specific recipe stage"""
    try:
        recipe_service = RecipeService(db)

        recipe = await recipe_service.get_recipe(tenant_id, recipe_id)
        if not recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        await recipe_service.add_quality_templates_to_stage(
            tenant_id, recipe_id, stage, template_ids, user_id
        )

        return {"message": f"Added {len(template_ids)} templates to {stage} stage"}

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error adding quality templates to recipe stage: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


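A plausible core of the stage-template update is appending IDs to a stage's template list without creating duplicates. A hedged sketch under that assumption (the nested `stages` layout and field names are ours, not the real `RecipeService` storage):

```python
# Assumed layout: config["stages"][stage]["template_ids"] is a list of IDs.
def add_templates(config, stage, template_ids):
    stage_cfg = config.setdefault("stages", {}).setdefault(stage, {})
    existing = stage_cfg.setdefault("template_ids", [])
    for tid in template_ids:
        if tid not in existing:  # keep insertion order, skip duplicates
            existing.append(tid)
    return config

cfg = add_templates({}, "baking", ["t1", "t2", "t1"])
print(cfg["stages"]["baking"]["template_ids"])  # → ['t1', 't2']
```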
@router.delete(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}", "quality-configuration", "stages", "{stage}", "templates", "{template_id}"])
)
@require_user_role(['admin', 'owner'])
async def remove_quality_template_from_stage(
    tenant_id: UUID,
    recipe_id: UUID,
    stage: str,
    template_id: UUID,
    user_id: UUID = Depends(get_user_id),
    db: AsyncSession = Depends(get_db)
):
    """Remove a quality template from a specific recipe stage"""
    try:
        recipe_service = RecipeService(db)

        recipe = await recipe_service.get_recipe(tenant_id, recipe_id)
        if not recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        await recipe_service.remove_quality_template_from_stage(
            tenant_id, recipe_id, stage, template_id, user_id
        )

        return {"message": f"Removed template from {stage} stage"}

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error removing quality template from recipe stage: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")
504
services/recipes/app/api/recipes.py
Normal file
@@ -0,0 +1,504 @@
# services/recipes/app/api/recipes.py
"""
Recipes API - Atomic CRUD operations on Recipe model
"""

from fastapi import APIRouter, Depends, HTTPException, Header, Query, Request
from sqlalchemy.ext.asyncio import AsyncSession
from typing import List, Optional
from uuid import UUID
import logging
import httpx

from ..core.database import get_db
from ..services.recipe_service import RecipeService
from ..schemas.recipes import (
    RecipeCreate,
    RecipeUpdate,
    RecipeResponse,
)
from ..models import AuditLog
from shared.routing import RouteBuilder, RouteCategory
from shared.auth.access_control import require_user_role, service_only_access
from shared.auth.decorators import get_current_user_dep
from shared.security import create_audit_logger, AuditSeverity, AuditAction
from shared.services.tenant_deletion import TenantDataDeletionResult

route_builder = RouteBuilder('recipes')
logger = logging.getLogger(__name__)
audit_logger = create_audit_logger("recipes-service", AuditLog)
router = APIRouter(tags=["recipes"])


def get_user_id(x_user_id: str = Header(...)) -> UUID:
    """Extract user ID from header"""
    try:
        return UUID(x_user_id)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid user ID format")


@router.post(
    route_builder.build_custom_route(RouteCategory.BASE, []),
    response_model=RecipeResponse
)
@require_user_role(['admin', 'owner', 'member'])
async def create_recipe(
    tenant_id: UUID,
    recipe_data: RecipeCreate,
    user_id: UUID = Depends(get_user_id),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Create a new recipe"""
    try:
        # CRITICAL: Check subscription limit before creating
        from ..core.config import settings

        async with httpx.AsyncClient(timeout=5.0) as client:
            try:
                # Check recipe limit (not product limit)
                limit_check_response = await client.get(
                    f"{settings.TENANT_SERVICE_URL}/api/v1/tenants/{tenant_id}/recipes/can-add",
                    headers={
                        "x-user-id": str(current_user.get('user_id')),
                        "x-tenant-id": str(tenant_id)
                    }
                )

                if limit_check_response.status_code == 200:
                    limit_check = limit_check_response.json()

                    if not limit_check.get('can_add', False):
                        logger.warning(
                            f"Recipe limit exceeded for tenant {tenant_id}: {limit_check.get('reason')}"
                        )
                        raise HTTPException(
                            status_code=402,
                            detail={
                                "error": "recipe_limit_exceeded",
                                "message": limit_check.get('reason', 'Recipe limit exceeded'),
                                "current_count": limit_check.get('current_count'),
                                "max_allowed": limit_check.get('max_allowed'),
                                "upgrade_required": True
                            }
                        )
                else:
                    logger.warning(
                        f"Failed to check recipe limit for tenant {tenant_id}, allowing creation"
                    )
            except httpx.TimeoutException:
                logger.warning(f"Timeout checking recipe limit for tenant {tenant_id}, allowing creation")
            except httpx.RequestError as e:
                logger.warning(f"Error checking recipe limit for tenant {tenant_id}: {e}, allowing creation")

        recipe_service = RecipeService(db)

        recipe_dict = recipe_data.dict(exclude={"ingredients"})
        recipe_dict["tenant_id"] = tenant_id

        ingredients_list = [ing.dict() for ing in recipe_data.ingredients]

        result = await recipe_service.create_recipe(
            recipe_dict,
            ingredients_list,
            user_id
        )

        if not result["success"]:
            raise HTTPException(status_code=400, detail=result["error"])

        logger.info(f"Recipe created successfully for tenant {tenant_id}: {result['data'].get('name')}")

        return RecipeResponse(**result["data"])

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error creating recipe: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


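The limit check above fails open: a timeout, a transport error, or a non-200 response allows creation rather than blocking it, and only an explicit `can_add: false` denies. That decision rule as a standalone function (illustrative sketch, not the service's actual code):

```python
# Fail-open sketch: only an explicit can_add=False from a successful limit
# check blocks creation; unavailability or missing data allows it.
def allow_creation(status_code, payload):
    if status_code != 200:
        return True  # limit service unavailable or errored: fail open
    if not payload:
        return True
    return bool(payload.get("can_add", False))

print(allow_creation(200, {"can_add": False}))  # → False
```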
@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, []),
    response_model=List[RecipeResponse]
)
async def search_recipes(
    tenant_id: UUID,
    search_term: Optional[str] = Query(None),
    status: Optional[str] = Query(None),
    category: Optional[str] = Query(None),
    is_seasonal: Optional[bool] = Query(None),
    is_signature: Optional[bool] = Query(None),
    difficulty_level: Optional[int] = Query(None, ge=1, le=5),
    limit: int = Query(100, ge=1, le=1000),
    offset: int = Query(0, ge=0),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Search recipes with filters"""
    try:
        recipe_service = RecipeService(db)

        recipes = await recipe_service.search_recipes(
            tenant_id=tenant_id,
            search_term=search_term,
            status=status,
            category=category,
            is_seasonal=is_seasonal,
            is_signature=is_signature,
            difficulty_level=difficulty_level,
            limit=limit,
            offset=offset
        )

        return [RecipeResponse(**recipe) for recipe in recipes]

    except Exception as e:
        logger.error(f"Error searching recipes: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, ["count"]),
    response_model=dict
)
async def count_recipes(
    tenant_id: UUID,
    db: AsyncSession = Depends(get_db)
):
    """Get count of recipes for a tenant"""
    try:
        recipe_service = RecipeService(db)

        # Count by fetching with a high limit; a dedicated COUNT query in the
        # service layer would be cheaper.
        recipes = await recipe_service.search_recipes(
            tenant_id=tenant_id,
            limit=10000  # high limit to fetch all rows
        )

        count = len(recipes)
        logger.info(f"Retrieved recipe count for tenant {tenant_id}: {count}")

        return {"count": count}

    except Exception as e:
        logger.error(f"Error counting recipes for tenant: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.get(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}"]),
    response_model=RecipeResponse
)
async def get_recipe(
    tenant_id: UUID,
    recipe_id: UUID,
    db: AsyncSession = Depends(get_db)
):
    """Get recipe by ID with ingredients"""
    try:
        recipe_service = RecipeService(db)

        recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)

        if not recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        if recipe["tenant_id"] != str(tenant_id):
            raise HTTPException(status_code=403, detail="Access denied")

        return RecipeResponse(**recipe)

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error getting recipe {recipe_id}: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.put(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}"]),
    response_model=RecipeResponse
)
@require_user_role(['admin', 'owner', 'member'])
async def update_recipe(
    tenant_id: UUID,
    recipe_id: UUID,
    recipe_data: RecipeUpdate,
    user_id: UUID = Depends(get_user_id),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Update an existing recipe"""
    try:
        recipe_service = RecipeService(db)

        existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
        if not existing_recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        if existing_recipe["tenant_id"] != str(tenant_id):
            raise HTTPException(status_code=403, detail="Access denied")

        recipe_dict = recipe_data.dict(exclude={"ingredients"}, exclude_unset=True)

        ingredients_list = None
        if recipe_data.ingredients is not None:
            ingredients_list = [ing.dict() for ing in recipe_data.ingredients]

        result = await recipe_service.update_recipe(
            recipe_id,
            recipe_dict,
            ingredients_list,
            user_id
        )

        if not result["success"]:
            raise HTTPException(status_code=400, detail=result["error"])

        return RecipeResponse(**result["data"])

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error updating recipe {recipe_id}: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.delete(
    route_builder.build_custom_route(RouteCategory.BASE, ["{recipe_id}"])
)
@require_user_role(['admin', 'owner'])
async def delete_recipe(
    tenant_id: UUID,
    recipe_id: UUID,
    user_id: UUID = Depends(get_user_id),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Delete a recipe (Admin+ only)"""
    try:
        recipe_service = RecipeService(db)

        existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
        if not existing_recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        if existing_recipe["tenant_id"] != str(tenant_id):
            raise HTTPException(status_code=403, detail="Access denied")

        # Check if deletion is safe
        summary = await recipe_service.get_deletion_summary(recipe_id)
        if not summary["success"]:
            raise HTTPException(status_code=500, detail=summary["error"])

        if not summary["data"]["can_delete"]:
            raise HTTPException(
                status_code=400,
                detail={
                    "message": "Cannot delete recipe with active dependencies",
                    "warnings": summary["data"]["warnings"]
                }
            )

        # Capture recipe data before deletion
        recipe_data = {
            "recipe_name": existing_recipe.get("name"),
            "category": existing_recipe.get("category"),
            "difficulty_level": existing_recipe.get("difficulty_level"),
            "ingredient_count": len(existing_recipe.get("ingredients", []))
        }

        success = await recipe_service.delete_recipe(recipe_id)
        if not success:
            raise HTTPException(status_code=404, detail="Recipe not found")

        # Log audit event for recipe deletion; failures here must not undo the
        # delete, so the whole block is wrapped and only logged on error.
        try:
            # Get a sync db session for audit logging.
            # NOTE: assumes ..core.database exports a sync SessionLocal; the
            # async database module shown in this commit does not define one.
            from ..core.database import SessionLocal
            sync_db = SessionLocal()
            try:
                await audit_logger.log_deletion(
                    db_session=sync_db,
                    tenant_id=str(tenant_id),
                    user_id=str(user_id),
                    resource_type="recipe",
                    resource_id=str(recipe_id),
                    resource_data=recipe_data,
                    description=f"Admin deleted recipe {recipe_data['recipe_name']}",
                    endpoint=f"/recipes/{recipe_id}",
                    method="DELETE"
                )
                sync_db.commit()
            finally:
                sync_db.close()
        except Exception as audit_error:
            logger.warning(f"Failed to log audit event: {audit_error}")

        logger.info(f"Deleted recipe {recipe_id} by user {user_id}")

        return {"message": "Recipe deleted successfully"}

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error deleting recipe {recipe_id}: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


@router.patch(
    route_builder.build_custom_route(RouteCategory.OPERATIONS, ["{recipe_id}", "archive"])
)
@require_user_role(['admin', 'owner'])
async def archive_recipe(
    tenant_id: UUID,
    recipe_id: UUID,
    user_id: UUID = Depends(get_user_id),
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Archive (soft delete) a recipe by setting status to ARCHIVED"""
    try:
        recipe_service = RecipeService(db)

        existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
        if not existing_recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        if existing_recipe["tenant_id"] != str(tenant_id):
            raise HTTPException(status_code=403, detail="Not authorized")

        # Check status transitions (business rule)
        current_status = existing_recipe.get("status")
        if current_status == "DISCONTINUED":
            raise HTTPException(
                status_code=400,
                detail="Cannot archive a discontinued recipe. Use hard delete instead."
            )

        # Update status to ARCHIVED
        from ..schemas.recipes import RecipeUpdate, RecipeStatus
        update_data = RecipeUpdate(status=RecipeStatus.ARCHIVED)

        updated_recipe = await recipe_service.update_recipe(
            recipe_id,
            update_data.dict(exclude_unset=True),
            None,  # no ingredient changes; matches update_recipe's (recipe, ingredients, user) call shape
            user_id
        )

        if not updated_recipe["success"]:
            raise HTTPException(status_code=400, detail=updated_recipe["error"])

        logger.info(f"Archived recipe {recipe_id} by user {user_id}")
        return RecipeResponse(**updated_recipe["data"])

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error archiving recipe: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


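The archive endpoint enforces a single transition rule: DISCONTINUED recipes cannot be archived and must be hard-deleted instead. Sketched as a standalone predicate (status values beyond the two the endpoint mentions are assumptions):

```python
# Hedged sketch of the archive transition rule from the endpoint above.
def can_archive(status):
    """DISCONTINUED recipes must be hard-deleted instead of archived."""
    return status != "DISCONTINUED"

print(can_archive("ACTIVE"))  # → True
```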
@router.get(
    route_builder.build_custom_route(RouteCategory.OPERATIONS, ["{recipe_id}", "deletion-summary"])
)
@require_user_role(['admin', 'owner'])
async def get_recipe_deletion_summary(
    tenant_id: UUID,
    recipe_id: UUID,
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """Get summary of what will be affected by deleting this recipe"""
    try:
        recipe_service = RecipeService(db)

        existing_recipe = await recipe_service.get_recipe_with_ingredients(recipe_id)
        if not existing_recipe:
            raise HTTPException(status_code=404, detail="Recipe not found")

        if existing_recipe["tenant_id"] != str(tenant_id):
            raise HTTPException(status_code=403, detail="Not authorized")

        summary = await recipe_service.get_deletion_summary(recipe_id)

        if not summary["success"]:
            raise HTTPException(status_code=500, detail=summary["error"])

        return summary["data"]

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error getting deletion summary: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")


# ===== Tenant Data Deletion Endpoints =====

@router.delete("/tenant/{tenant_id}")
@service_only_access
async def delete_tenant_data(
    tenant_id: str,
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Delete all recipe-related data for a tenant
    Only accessible by internal services (called during tenant deletion)
    """
    logger.info(f"Tenant data deletion request received for tenant: {tenant_id}")

    try:
        from app.services.tenant_deletion_service import RecipesTenantDeletionService

        deletion_service = RecipesTenantDeletionService(db)
        result = await deletion_service.safe_delete_tenant_data(tenant_id)

        return {
            "message": "Tenant data deletion completed in recipes-service",
            "summary": result.to_dict()
        }

    except Exception as e:
        logger.error(f"Tenant data deletion failed for {tenant_id}: {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to delete tenant data: {str(e)}"
        )


@router.get("/tenant/{tenant_id}/deletion-preview")
@service_only_access
async def preview_tenant_data_deletion(
    tenant_id: str,
    current_user: dict = Depends(get_current_user_dep),
    db: AsyncSession = Depends(get_db)
):
    """
    Preview what data would be deleted for a tenant (dry-run)
    Accessible by internal services and tenant admins
    """
    try:
        from app.services.tenant_deletion_service import RecipesTenantDeletionService

        deletion_service = RecipesTenantDeletionService(db)
        preview = await deletion_service.get_tenant_data_preview(tenant_id)

        return {
            "tenant_id": tenant_id,
            "service": "recipes-service",
            "data_counts": preview,
            "total_items": sum(preview.values())
        }

    except Exception as e:
        logger.error(f"Deletion preview failed for {tenant_id}: {e}")
        raise HTTPException(
            status_code=500,
            detail=f"Failed to get deletion preview: {str(e)}"
        )
1
services/recipes/app/core/__init__.py
Normal file
@@ -0,0 +1 @@
# services/recipes/app/core/__init__.py
77
services/recipes/app/core/config.py
Normal file
@@ -0,0 +1,77 @@
|
||||
# services/recipes/app/core/config.py
|
||||
"""
|
||||
Configuration management for Recipe Service
|
||||
"""
|
||||
|
||||
import os
|
||||
from typing import Optional
|
||||
from shared.config.base import BaseServiceSettings
|
class Settings(BaseServiceSettings):
    """Recipe service configuration extending base configuration"""

    # Override service-specific settings
    SERVICE_NAME: str = "recipes-service"
    VERSION: str = "1.0.0"
    APP_NAME: str = "Recipe Service"
    DESCRIPTION: str = "Recipe management and planning service"

    # API Configuration
    API_V1_STR: str = "/api/v1"

    # Database configuration (secure approach - build from components)
    @property
    def DATABASE_URL(self) -> str:
        """Build database URL from secure components"""
        # Try complete URL first (for backward compatibility)
        complete_url = os.getenv("RECIPES_DATABASE_URL")
        if complete_url:
            return complete_url

        # Build from components (secure approach)
        user = os.getenv("RECIPES_DB_USER", "recipes_user")
        password = os.getenv("RECIPES_DB_PASSWORD", "recipes_pass123")
        host = os.getenv("RECIPES_DB_HOST", "localhost")
        port = os.getenv("RECIPES_DB_PORT", "5432")
        name = os.getenv("RECIPES_DB_NAME", "recipes_db")

        return f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{name}"

    # Redis configuration - use a specific database number
    REDIS_DB: int = 2

    # Recipe-specific settings
    MAX_RECIPE_INGREDIENTS: int = int(os.getenv("MAX_RECIPE_INGREDIENTS", "50"))
    MAX_BATCH_SIZE_MULTIPLIER: float = float(os.getenv("MAX_BATCH_SIZE_MULTIPLIER", "10.0"))
    DEFAULT_RECIPE_VERSION: str = "1.0"

    # Production settings (integration with production service)
    MAX_PRODUCTION_BATCHES_PER_DAY: int = int(os.getenv("MAX_PRODUCTION_BATCHES_PER_DAY", "100"))
    PRODUCTION_SCHEDULE_DAYS_AHEAD: int = int(os.getenv("PRODUCTION_SCHEDULE_DAYS_AHEAD", "7"))

    # Cost calculation settings
    OVERHEAD_PERCENTAGE: float = float(os.getenv("OVERHEAD_PERCENTAGE", "15.0"))  # Default 15% overhead
    LABOR_COST_PER_HOUR: float = float(os.getenv("LABOR_COST_PER_HOUR", "25.0"))  # Default €25/hour

    # Quality control
    MIN_QUALITY_SCORE: float = float(os.getenv("MIN_QUALITY_SCORE", "6.0"))  # Minimum acceptable quality score
    MAX_DEFECT_RATE: float = float(os.getenv("MAX_DEFECT_RATE", "5.0"))  # Maximum 5% defect rate

    # External service URLs (specific to recipes service)
    PRODUCTION_SERVICE_URL: str = os.getenv(
        "PRODUCTION_SERVICE_URL",
        "http://production-service:8000"
    )
    INVENTORY_SERVICE_URL: str = os.getenv(
        "INVENTORY_SERVICE_URL",
        "http://inventory-service:8000"
    )
    SALES_SERVICE_URL: str = os.getenv(
        "SALES_SERVICE_URL",
        "http://sales-service:8000"
    )


# Global settings instance
settings = Settings()
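The `DATABASE_URL` property can be exercised without the service's `BaseServiceSettings` base class; below is a standalone sketch of the same assembly logic, with identical environment-variable names and fallbacks (`build_database_url` is an illustrative name, not part of the service):

```python
import os

# Sketch of the DATABASE_URL assembly above: prefer a complete URL if set,
# otherwise build postgresql+asyncpg://user:pass@host:port/name from parts.
def build_database_url() -> str:
    complete_url = os.getenv("RECIPES_DATABASE_URL")
    if complete_url:
        return complete_url
    user = os.getenv("RECIPES_DB_USER", "recipes_user")
    password = os.getenv("RECIPES_DB_PASSWORD", "recipes_pass123")
    host = os.getenv("RECIPES_DB_HOST", "localhost")
    port = os.getenv("RECIPES_DB_PORT", "5432")
    name = os.getenv("RECIPES_DB_NAME", "recipes_db")
    return f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{name}"

# Clear the variables so the component path (and its defaults) is exercised
for var in ("RECIPES_DATABASE_URL", "RECIPES_DB_USER", "RECIPES_DB_PASSWORD",
            "RECIPES_DB_PORT", "RECIPES_DB_NAME"):
    os.environ.pop(var, None)
os.environ["RECIPES_DB_HOST"] = "db.internal"
print(build_database_url())
# prints postgresql+asyncpg://recipes_user:recipes_pass123@db.internal:5432/recipes_db
```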
25
services/recipes/app/core/database.py
Normal file
@@ -0,0 +1,25 @@
# services/recipes/app/core/database.py
"""
Database configuration and session management for Recipe Service
"""

from shared.database.base import DatabaseManager, create_database_manager
from .config import settings

# Create database manager using shared async infrastructure
db_manager = create_database_manager(
    database_url=settings.DATABASE_URL,
    service_name="recipes-service",
    echo=settings.DEBUG
)

# Dependency for FastAPI routes
async def get_db():
    """FastAPI dependency to get a database session"""
    async for session in db_manager.get_db():
        yield session

# Initialize database
async def init_database():
    """Initialize database tables"""
    await db_manager.create_tables()
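The `get_db` wrapper re-yields sessions from the shared manager's async generator so FastAPI can inject one session per request and run cleanup when the request ends. A runnable sketch of that pattern, with `StubManager` as a hypothetical stand-in for the shared `DatabaseManager`:

```python
import asyncio

class StubManager:
    """Hypothetical stand-in for the shared DatabaseManager."""
    async def get_db(self):
        session = {"open": True}      # stands in for an AsyncSession
        try:
            yield session
        finally:
            session["open"] = False   # cleanup once the request is done

db_manager = StubManager()

async def get_db():
    """Same wrapper shape as the service's FastAPI dependency."""
    async for session in db_manager.get_db():
        yield session

async def handle_request():
    captured = None
    async for session in get_db():
        captured = session
        in_flight = session["open"]   # True while the request is handled
    # the generator is exhausted here, so the finally-block cleanup has run
    return in_flight, captured["open"]

print(asyncio.run(handle_request()))  # prints (True, False)
```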
136
services/recipes/app/main.py
Normal file
@@ -0,0 +1,136 @@
# services/recipes/app/main.py
"""
Recipe Service - FastAPI application
Handles recipe management, production planning, and inventory consumption tracking
"""

import time
from fastapi import FastAPI, Request
from sqlalchemy import text
from fastapi.middleware.gzip import GZipMiddleware

from .core.config import settings
from .core.database import db_manager
from shared.service_base import StandardFastAPIService

# Import API routers
from .api import recipes, recipe_quality_configs, recipe_operations, audit, internal_demo, internal

# Import models to register them with SQLAlchemy metadata
from .models import recipes as recipe_models


class RecipesService(StandardFastAPIService):
    """Recipes Service with standardized setup"""

    expected_migration_version = "00001"

    def __init__(self):
        # Define expected database tables for health checks
        recipes_expected_tables = [
            'recipes', 'recipe_ingredients', 'production_batches',
            'production_ingredient_consumption', 'production_schedules'
        ]

        super().__init__(
            service_name="recipes-service",
            app_name="Recipe Management Service",
            description="Comprehensive recipe management, production planning, and inventory consumption tracking for bakery operations",
            version=settings.VERSION,
            log_level=settings.LOG_LEVEL,
            cors_origins=settings.CORS_ORIGINS,
            api_prefix="",  # Empty because RouteBuilder already includes /api/v1
            database_manager=db_manager,
            expected_tables=recipes_expected_tables
        )

    async def on_startup(self, app: FastAPI):
        """Custom startup logic including migration verification"""
        await self.verify_migrations()
        await super().on_startup(app)

    async def verify_migrations(self):
        """Verify the database schema matches the latest migration."""
        try:
            async with self.database_manager.get_session() as session:
                result = await session.execute(text("SELECT version_num FROM alembic_version"))
                version = result.scalar()
                if version != self.expected_migration_version:
                    message = f"Migration version mismatch: expected {self.expected_migration_version}, got {version}"
                    self.logger.error(message)
                    raise RuntimeError(message)
                self.logger.info(f"Migration verification successful: {version}")
        except Exception as e:
            self.logger.error(f"Migration verification failed: {e}")
            raise

    async def on_shutdown(self, app: FastAPI):
        """Custom shutdown logic for recipes service"""
        # Database cleanup is handled by the base class
        pass

    def get_service_features(self):
        """Return recipes-specific features"""
        return [
            "recipe_management",
            "production_planning",
            "inventory_consumption_tracking",
            "batch_production",
            "tenant_scoped_operations"
        ]

    def setup_custom_middleware(self):
        """Setup custom middleware for recipes service"""
        # Add GZip middleware
        self.app.add_middleware(GZipMiddleware, minimum_size=1000)

        # Request timing middleware
        @self.app.middleware("http")
        async def add_process_time_header(request: Request, call_next):
            """Add processing time header to responses"""
            start_time = time.time()
            response = await call_next(request)
            process_time = time.time() - start_time
            response.headers["X-Process-Time"] = str(process_time)
            return response


# Create service instance
service = RecipesService()

# Create FastAPI app with standardized setup
app = service.create_app(
    docs_url="/docs" if settings.DEBUG else None,
    redoc_url="/redoc" if settings.DEBUG else None
)

# Setup standard endpoints
service.setup_standard_endpoints()

# Setup custom middleware
service.setup_custom_middleware()

# Include routers
# IMPORTANT: Register audit router FIRST to avoid route matching conflicts
# where {recipe_id} would match literal paths like "audit-logs"
service.add_router(audit.router)
service.add_router(recipes.router)
service.add_router(recipe_quality_configs.router)
service.add_router(recipe_operations.router)
service.add_router(internal_demo.router, tags=["internal-demo"])
service.add_router(internal.router)


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(
        "main:app",
        host="0.0.0.0",
        port=8000,
        reload=settings.DEBUG,
        log_level=settings.LOG_LEVEL.lower()
    )
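The migration check reads the single-row `alembic_version` table and compares it against the version the code expects. A self-contained sketch of that query, using an in-memory SQLite database in place of the service's PostgreSQL instance:

```python
import sqlite3

EXPECTED = "00001"  # mirrors expected_migration_version above

# Alembic maintains a one-row table holding the current revision id;
# simulate it here and run the same SELECT the service issues.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alembic_version (version_num TEXT NOT NULL)")
conn.execute("INSERT INTO alembic_version VALUES (?)", (EXPECTED,))

version = conn.execute("SELECT version_num FROM alembic_version").fetchone()[0]
if version != EXPECTED:
    raise RuntimeError(f"Migration version mismatch: expected {EXPECTED}, got {version}")
print(f"Migration verification successful: {version}")
```

Failing fast on a mismatch keeps a container from serving traffic against a schema its ORM models do not match.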
33
services/recipes/app/models/__init__.py
Normal file
@@ -0,0 +1,33 @@
# services/recipes/app/models/__init__.py

from shared.security import create_audit_log_model
from shared.database.base import Base

# Create audit log model for this service
AuditLog = create_audit_log_model(Base)

from .recipes import (
    Recipe,
    RecipeIngredient,
    ProductionBatch,
    ProductionIngredientConsumption,
    ProductionSchedule,
    RecipeStatus,
    ProductionStatus,
    MeasurementUnit,
    ProductionPriority
)

__all__ = [
    "Recipe",
    "RecipeIngredient",
    "ProductionBatch",
    "ProductionIngredientConsumption",
    "ProductionSchedule",
    "RecipeStatus",
    "ProductionStatus",
    "MeasurementUnit",
    "ProductionPriority",
    "AuditLog"
]
531
services/recipes/app/models/recipes.py
Normal file
@@ -0,0 +1,531 @@
# services/recipes/app/models/recipes.py
"""
Recipe and Production Management models for Recipe Service
Comprehensive recipe management, production tracking, and inventory consumption
"""

from sqlalchemy import Column, String, DateTime, Float, Integer, Text, Index, Boolean, Numeric, ForeignKey, Enum as SQLEnum
from sqlalchemy.dialects.postgresql import UUID, JSONB
from sqlalchemy.orm import relationship
import uuid
import enum
from datetime import datetime, timezone
from typing import Dict, Any, Optional, List

from shared.database.base import Base


class RecipeStatus(enum.Enum):
    """Recipe lifecycle status"""
    DRAFT = "DRAFT"
    ACTIVE = "ACTIVE"
    TESTING = "TESTING"
    ARCHIVED = "ARCHIVED"
    DISCONTINUED = "DISCONTINUED"


class ProductionStatus(enum.Enum):
    """Production batch status"""
    PLANNED = "PLANNED"
    IN_PROGRESS = "IN_PROGRESS"
    COMPLETED = "COMPLETED"
    FAILED = "FAILED"
    CANCELLED = "CANCELLED"


class MeasurementUnit(enum.Enum):
    """Units for recipe measurements"""
    GRAMS = "g"
    KILOGRAMS = "kg"
    MILLILITERS = "ml"
    LITERS = "l"
    CUPS = "cups"
    TABLESPOONS = "tbsp"
    TEASPOONS = "tsp"
    UNITS = "units"
    PIECES = "pieces"
    PERCENTAGE = "%"


class ProductionPriority(enum.Enum):
    """Production batch priority levels"""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"
    URGENT = "urgent"
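Note that `MeasurementUnit` member names differ from their values; SQLAlchemy's `Enum` type persists member *names* (e.g. `GRAMS`) by default, while the `to_dict` methods below expose the short *values* (e.g. `"g"`). A quick round-trip illustrating the name/value split, with the enum redeclared standalone:

```python
import enum

class MeasurementUnit(enum.Enum):
    GRAMS = "g"
    KILOGRAMS = "kg"

unit = MeasurementUnit.KILOGRAMS
print(unit.name, unit.value)          # prints KILOGRAMS kg
print(MeasurementUnit("kg") is unit)  # value lookup; prints True
```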
class Recipe(Base):
    """Master recipe definitions"""
    __tablename__ = "recipes"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)

    # Recipe identification
    name = Column(String(255), nullable=False, index=True)
    recipe_code = Column(String(100), nullable=True, index=True)
    version = Column(String(20), nullable=False, default="1.0")

    # Product association
    finished_product_id = Column(UUID(as_uuid=True), nullable=False, index=True)  # Links to inventory ingredient with product_type=finished_product

    # Recipe details
    description = Column(Text, nullable=True)
    category = Column(String(100), nullable=True, index=True)  # bread, pastries, cakes, etc.
    cuisine_type = Column(String(100), nullable=True)
    difficulty_level = Column(Integer, nullable=False, default=1)  # 1-5 scale

    # Production metrics
    yield_quantity = Column(Float, nullable=False)  # How many units this recipe produces
    yield_unit = Column(SQLEnum(MeasurementUnit), nullable=False)
    prep_time_minutes = Column(Integer, nullable=True)
    cook_time_minutes = Column(Integer, nullable=True)
    total_time_minutes = Column(Integer, nullable=True)
    rest_time_minutes = Column(Integer, nullable=True)  # Rising time, cooling time, etc.

    # Cost and pricing
    estimated_cost_per_unit = Column(Numeric(10, 2), nullable=True)
    last_calculated_cost = Column(Numeric(10, 2), nullable=True)
    cost_calculation_date = Column(DateTime(timezone=True), nullable=True)
    target_margin_percentage = Column(Float, nullable=True)
    suggested_selling_price = Column(Numeric(10, 2), nullable=True)

    # Instructions and notes
    instructions = Column(JSONB, nullable=True)  # Structured step-by-step instructions
    preparation_notes = Column(Text, nullable=True)
    storage_instructions = Column(Text, nullable=True)

    # Recipe metadata
    serves_count = Column(Integer, nullable=True)  # How many people/portions
    nutritional_info = Column(JSONB, nullable=True)  # Calories, protein, etc.
    allergen_info = Column(JSONB, nullable=True)  # List of allergens
    dietary_tags = Column(JSONB, nullable=True)  # vegan, gluten-free, etc.

    # Production settings
    batch_size_multiplier = Column(Float, nullable=False, default=1.0)  # Standard batch multiplier
    minimum_batch_size = Column(Float, nullable=True)
    maximum_batch_size = Column(Float, nullable=True)
    optimal_production_temperature = Column(Float, nullable=True)  # Celsius
    optimal_humidity = Column(Float, nullable=True)  # Percentage

    # Quality control
    quality_check_configuration = Column(JSONB, nullable=True)  # Stage-based quality check config

    # Status and lifecycle
    status = Column(SQLEnum(RecipeStatus), nullable=False, default=RecipeStatus.DRAFT, index=True)
    is_seasonal = Column(Boolean, default=False)
    season_start_month = Column(Integer, nullable=True)  # 1-12
    season_end_month = Column(Integer, nullable=True)  # 1-12
    is_signature_item = Column(Boolean, default=False)

    # Audit fields
    created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
    updated_at = Column(DateTime(timezone=True),
                        default=lambda: datetime.now(timezone.utc),
                        onupdate=lambda: datetime.now(timezone.utc))
    created_by = Column(UUID(as_uuid=True), nullable=True)
    updated_by = Column(UUID(as_uuid=True), nullable=True)

    # Relationships
    ingredients = relationship("RecipeIngredient", back_populates="recipe", cascade="all, delete-orphan")
    production_batches = relationship("ProductionBatch", back_populates="recipe", cascade="all, delete-orphan")

    __table_args__ = (
        Index('idx_recipes_tenant_name', 'tenant_id', 'name'),
        Index('idx_recipes_tenant_product', 'tenant_id', 'finished_product_id'),
        Index('idx_recipes_status', 'tenant_id', 'status'),
        Index('idx_recipes_category', 'tenant_id', 'category', 'status'),
        Index('idx_recipes_seasonal', 'tenant_id', 'is_seasonal', 'season_start_month', 'season_end_month'),
        Index('idx_recipes_signature', 'tenant_id', 'is_signature_item', 'status'),
    )

    def to_dict(self) -> Dict[str, Any]:
        """Convert model to dictionary for API responses"""
        return {
            'id': str(self.id),
            'tenant_id': str(self.tenant_id),
            'name': self.name,
            'recipe_code': self.recipe_code,
            'version': self.version,
            'finished_product_id': str(self.finished_product_id),
            'description': self.description,
            'category': self.category,
            'cuisine_type': self.cuisine_type,
            'difficulty_level': self.difficulty_level,
            'yield_quantity': self.yield_quantity,
            'yield_unit': self.yield_unit.value if self.yield_unit else None,
            'prep_time_minutes': self.prep_time_minutes,
            'cook_time_minutes': self.cook_time_minutes,
            'total_time_minutes': self.total_time_minutes,
            'rest_time_minutes': self.rest_time_minutes,
            'estimated_cost_per_unit': float(self.estimated_cost_per_unit) if self.estimated_cost_per_unit else None,
            'last_calculated_cost': float(self.last_calculated_cost) if self.last_calculated_cost else None,
            'cost_calculation_date': self.cost_calculation_date.isoformat() if self.cost_calculation_date else None,
            'target_margin_percentage': self.target_margin_percentage,
            'suggested_selling_price': float(self.suggested_selling_price) if self.suggested_selling_price else None,
            'instructions': self.instructions,
            'preparation_notes': self.preparation_notes,
            'storage_instructions': self.storage_instructions,
            'serves_count': self.serves_count,
            'nutritional_info': self.nutritional_info,
            'allergen_info': self.allergen_info,
            'dietary_tags': self.dietary_tags,
            'batch_size_multiplier': self.batch_size_multiplier,
            'minimum_batch_size': self.minimum_batch_size,
            'maximum_batch_size': self.maximum_batch_size,
            'optimal_production_temperature': self.optimal_production_temperature,
            'optimal_humidity': self.optimal_humidity,
            'quality_check_configuration': self.quality_check_configuration,
            'status': self.status.value if self.status else None,
            'is_seasonal': self.is_seasonal,
            'season_start_month': self.season_start_month,
            'season_end_month': self.season_end_month,
            'is_signature_item': self.is_signature_item,
            'created_at': self.created_at.isoformat() if self.created_at else None,
            'updated_at': self.updated_at.isoformat() if self.updated_at else None,
            'created_by': str(self.created_by) if self.created_by else None,
            'updated_by': str(self.updated_by) if self.updated_by else None,
        }
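All of the `to_dict` methods in this module follow the same serialization pattern: `Decimal` columns become floats, datetimes become ISO 8601 strings, enums expose `.value`, and UUIDs become strings. A minimal standalone illustration of the Decimal/datetime half of that pattern (`serialize` is an illustrative helper, not part of the service):

```python
from datetime import datetime, timezone
from decimal import Decimal

def serialize(value):
    """Mirror the Decimal -> float and datetime -> ISO-string conversions."""
    if isinstance(value, Decimal):
        return float(value)
    if isinstance(value, datetime):
        return value.isoformat()
    return value

row = {
    "estimated_cost_per_unit": Decimal("2.50"),
    "cost_calculation_date": datetime(2024, 1, 15, tzinfo=timezone.utc),
    "name": "Sourdough",
}
out = {k: serialize(v) for k, v in row.items()}
print(out)
# prints {'estimated_cost_per_unit': 2.5, 'cost_calculation_date': '2024-01-15T00:00:00+00:00', 'name': 'Sourdough'}
```

One observation: the models use `float(x) if x else None`, which maps a legitimate zero cost (`Decimal("0.00")`) to `None`; `if x is not None` would preserve it.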
class RecipeIngredient(Base):
    """Ingredients required for each recipe"""
    __tablename__ = "recipe_ingredients"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
    recipe_id = Column(UUID(as_uuid=True), ForeignKey('recipes.id'), nullable=False, index=True)
    ingredient_id = Column(UUID(as_uuid=True), nullable=False, index=True)  # Links to inventory ingredients

    # Quantity specifications
    quantity = Column(Float, nullable=False)
    unit = Column(SQLEnum(MeasurementUnit), nullable=False)
    quantity_in_base_unit = Column(Float, nullable=True)  # Converted to ingredient's base unit

    # Alternative measurements
    alternative_quantity = Column(Float, nullable=True)  # e.g., "2 cups" vs "240ml"
    alternative_unit = Column(SQLEnum(MeasurementUnit), nullable=True)

    # Ingredient specifications
    preparation_method = Column(String(255), nullable=True)  # "sifted", "room temperature", "chopped"
    ingredient_notes = Column(Text, nullable=True)  # Special instructions for this ingredient
    is_optional = Column(Boolean, default=False)

    # Recipe organization
    ingredient_order = Column(Integer, nullable=False, default=1)  # Order in recipe
    ingredient_group = Column(String(100), nullable=True)  # "wet ingredients", "dry ingredients", etc.

    # Substitutions
    substitution_options = Column(JSONB, nullable=True)  # Alternative ingredients
    substitution_ratio = Column(Float, nullable=True)  # 1:1, 1:2, etc.

    # Cost tracking
    unit_cost = Column(Numeric(10, 2), nullable=True)
    total_cost = Column(Numeric(10, 2), nullable=True)
    cost_updated_at = Column(DateTime(timezone=True), nullable=True)

    # Relationships
    recipe = relationship("Recipe", back_populates="ingredients")

    __table_args__ = (
        Index('idx_recipe_ingredients_recipe', 'recipe_id', 'ingredient_order'),
        Index('idx_recipe_ingredients_ingredient', 'ingredient_id'),
        Index('idx_recipe_ingredients_tenant', 'tenant_id', 'recipe_id'),
        Index('idx_recipe_ingredients_group', 'recipe_id', 'ingredient_group', 'ingredient_order'),
    )

    def to_dict(self) -> Dict[str, Any]:
        """Convert model to dictionary for API responses"""
        return {
            'id': str(self.id),
            'tenant_id': str(self.tenant_id),
            'recipe_id': str(self.recipe_id),
            'ingredient_id': str(self.ingredient_id),
            'quantity': self.quantity,
            'unit': self.unit.value if self.unit else None,
            'quantity_in_base_unit': self.quantity_in_base_unit,
            'alternative_quantity': self.alternative_quantity,
            'alternative_unit': self.alternative_unit.value if self.alternative_unit else None,
            'preparation_method': self.preparation_method,
            'ingredient_notes': self.ingredient_notes,
            'is_optional': self.is_optional,
            'ingredient_order': self.ingredient_order,
            'ingredient_group': self.ingredient_group,
            'substitution_options': self.substitution_options,
            'substitution_ratio': self.substitution_ratio,
            'unit_cost': float(self.unit_cost) if self.unit_cost else None,
            'total_cost': float(self.total_cost) if self.total_cost else None,
            'cost_updated_at': self.cost_updated_at.isoformat() if self.cost_updated_at else None,
        }
class ProductionBatch(Base):
    """Track production batches and inventory consumption"""
    __tablename__ = "production_batches"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
    recipe_id = Column(UUID(as_uuid=True), ForeignKey('recipes.id'), nullable=False, index=True)

    # Batch identification
    batch_number = Column(String(100), nullable=False, index=True)
    production_date = Column(DateTime(timezone=True), nullable=False, index=True)
    planned_start_time = Column(DateTime(timezone=True), nullable=True)
    actual_start_time = Column(DateTime(timezone=True), nullable=True)
    planned_end_time = Column(DateTime(timezone=True), nullable=True)
    actual_end_time = Column(DateTime(timezone=True), nullable=True)

    # Production planning
    planned_quantity = Column(Float, nullable=False)
    actual_quantity = Column(Float, nullable=True)
    yield_percentage = Column(Float, nullable=True)  # actual/planned * 100
    batch_size_multiplier = Column(Float, nullable=False, default=1.0)

    # Production details
    status = Column(SQLEnum(ProductionStatus), nullable=False, default=ProductionStatus.PLANNED, index=True)
    priority = Column(SQLEnum(ProductionPriority), nullable=False, default=ProductionPriority.MEDIUM)
    assigned_staff = Column(JSONB, nullable=True)  # List of staff assigned to this batch
    production_notes = Column(Text, nullable=True)

    # Quality metrics
    quality_score = Column(Float, nullable=True)  # 1-10 scale
    quality_notes = Column(Text, nullable=True)
    defect_rate = Column(Float, nullable=True)  # Percentage of defective products
    rework_required = Column(Boolean, default=False)

    # Cost tracking
    planned_material_cost = Column(Numeric(10, 2), nullable=True)
    actual_material_cost = Column(Numeric(10, 2), nullable=True)
    labor_cost = Column(Numeric(10, 2), nullable=True)
    overhead_cost = Column(Numeric(10, 2), nullable=True)
    total_production_cost = Column(Numeric(10, 2), nullable=True)
    cost_per_unit = Column(Numeric(10, 2), nullable=True)

    # Environmental conditions
    production_temperature = Column(Float, nullable=True)
    production_humidity = Column(Float, nullable=True)
    oven_temperature = Column(Float, nullable=True)
    baking_time_minutes = Column(Integer, nullable=True)

    # Waste and efficiency
    waste_quantity = Column(Float, nullable=False, default=0.0)
    waste_reason = Column(String(255), nullable=True)
    efficiency_percentage = Column(Float, nullable=True)  # Based on time vs planned

    # Sales integration
    customer_order_reference = Column(String(100), nullable=True)  # If made to order
    pre_order_quantity = Column(Float, nullable=True)  # Pre-sold quantity
    shelf_quantity = Column(Float, nullable=True)  # For shelf/display

    # Audit fields
    created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
    updated_at = Column(DateTime(timezone=True),
                        default=lambda: datetime.now(timezone.utc),
                        onupdate=lambda: datetime.now(timezone.utc))
    created_by = Column(UUID(as_uuid=True), nullable=True)
    completed_by = Column(UUID(as_uuid=True), nullable=True)

    # Relationships
    recipe = relationship("Recipe", back_populates="production_batches")
    ingredient_consumptions = relationship("ProductionIngredientConsumption", back_populates="production_batch", cascade="all, delete-orphan")

    __table_args__ = (
        Index('idx_production_batches_tenant_date', 'tenant_id', 'production_date'),
        Index('idx_production_batches_recipe', 'recipe_id', 'production_date'),
        Index('idx_production_batches_status', 'tenant_id', 'status', 'production_date'),
        Index('idx_production_batches_batch_number', 'tenant_id', 'batch_number'),
        Index('idx_production_batches_priority', 'tenant_id', 'priority', 'planned_start_time'),
    )

    def to_dict(self) -> Dict[str, Any]:
        """Convert model to dictionary for API responses"""
        return {
            'id': str(self.id),
            'tenant_id': str(self.tenant_id),
            'recipe_id': str(self.recipe_id),
            'batch_number': self.batch_number,
            'production_date': self.production_date.isoformat() if self.production_date else None,
            'planned_start_time': self.planned_start_time.isoformat() if self.planned_start_time else None,
            'actual_start_time': self.actual_start_time.isoformat() if self.actual_start_time else None,
            'planned_end_time': self.planned_end_time.isoformat() if self.planned_end_time else None,
            'actual_end_time': self.actual_end_time.isoformat() if self.actual_end_time else None,
            'planned_quantity': self.planned_quantity,
            'actual_quantity': self.actual_quantity,
            'yield_percentage': self.yield_percentage,
            'batch_size_multiplier': self.batch_size_multiplier,
            'status': self.status.value if self.status else None,
            'priority': self.priority.value if self.priority else None,
            'assigned_staff': self.assigned_staff,
            'production_notes': self.production_notes,
            'quality_score': self.quality_score,
            'quality_notes': self.quality_notes,
            'defect_rate': self.defect_rate,
            'rework_required': self.rework_required,
            'planned_material_cost': float(self.planned_material_cost) if self.planned_material_cost else None,
            'actual_material_cost': float(self.actual_material_cost) if self.actual_material_cost else None,
            'labor_cost': float(self.labor_cost) if self.labor_cost else None,
            'overhead_cost': float(self.overhead_cost) if self.overhead_cost else None,
            'total_production_cost': float(self.total_production_cost) if self.total_production_cost else None,
            'cost_per_unit': float(self.cost_per_unit) if self.cost_per_unit else None,
            'production_temperature': self.production_temperature,
            'production_humidity': self.production_humidity,
            'oven_temperature': self.oven_temperature,
            'baking_time_minutes': self.baking_time_minutes,
            'waste_quantity': self.waste_quantity,
            'waste_reason': self.waste_reason,
            'efficiency_percentage': self.efficiency_percentage,
            'customer_order_reference': self.customer_order_reference,
            'pre_order_quantity': self.pre_order_quantity,
            'shelf_quantity': self.shelf_quantity,
            'created_at': self.created_at.isoformat() if self.created_at else None,
            'updated_at': self.updated_at.isoformat() if self.updated_at else None,
            'created_by': str(self.created_by) if self.created_by else None,
            'completed_by': str(self.completed_by) if self.completed_by else None,
        }
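The cost columns on `ProductionBatch` are stored values, and the model does not show how they are computed. As a labeled assumption for illustration only, the sketch below takes total production cost as material + labor + overhead and divides by actual quantity for `cost_per_unit`, using `Decimal` to mirror the `Numeric(10, 2)` columns (`batch_costs` is a hypothetical helper, not part of the service):

```python
from decimal import Decimal, ROUND_HALF_UP

# ASSUMPTION: total = material + labor + overhead; the service may compute
# these figures differently (e.g. via OVERHEAD_PERCENTAGE in its settings).
def batch_costs(material: Decimal, labor: Decimal, overhead: Decimal,
                actual_quantity: int) -> tuple:
    total = material + labor + overhead
    # Round per-unit cost to cents, matching the Numeric(10, 2) precision
    per_unit = (total / actual_quantity).quantize(Decimal("0.01"), ROUND_HALF_UP)
    return total, per_unit

total, per_unit = batch_costs(Decimal("40.00"), Decimal("25.00"),
                              Decimal("9.75"), 50)
print(total, per_unit)  # prints 74.75 1.50
```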
class ProductionIngredientConsumption(Base):
    """Track actual ingredient consumption during production"""
    __tablename__ = "production_ingredient_consumption"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
    production_batch_id = Column(UUID(as_uuid=True), ForeignKey('production_batches.id'), nullable=False, index=True)
    recipe_ingredient_id = Column(UUID(as_uuid=True), ForeignKey('recipe_ingredients.id'), nullable=False, index=True)
    ingredient_id = Column(UUID(as_uuid=True), nullable=False, index=True)  # Links to inventory ingredients
    stock_id = Column(UUID(as_uuid=True), nullable=True, index=True)  # Specific stock batch used

    # Consumption quantities
    planned_quantity = Column(Float, nullable=False)
    actual_quantity = Column(Float, nullable=False)
    unit = Column(SQLEnum(MeasurementUnit), nullable=False)
    variance_quantity = Column(Float, nullable=True)  # actual - planned
    variance_percentage = Column(Float, nullable=True)  # (actual - planned) / planned * 100

    # Cost tracking
    unit_cost = Column(Numeric(10, 2), nullable=True)
    total_cost = Column(Numeric(10, 2), nullable=True)

    # Consumption event details
    consumption_time = Column(DateTime(timezone=True), nullable=False,
                              default=lambda: datetime.now(timezone.utc))
    consumption_notes = Column(Text, nullable=True)
    staff_member = Column(UUID(as_uuid=True), nullable=True)

    # Quality and condition
    ingredient_condition = Column(String(50), nullable=True)  # fresh, near_expiry, etc.
    quality_impact = Column(String(255), nullable=True)  # Impact on final product quality
    substitution_used = Column(Boolean, default=False)
    substitution_details = Column(Text, nullable=True)

    # Relationships
    production_batch = relationship("ProductionBatch", back_populates="ingredient_consumptions")

    __table_args__ = (
        Index('idx_consumption_batch', 'production_batch_id'),
        Index('idx_consumption_ingredient', 'ingredient_id', 'consumption_time'),
        Index('idx_consumption_tenant', 'tenant_id', 'consumption_time'),
        Index('idx_consumption_recipe_ingredient', 'recipe_ingredient_id'),
        Index('idx_consumption_stock', 'stock_id'),
    )

    def to_dict(self) -> Dict[str, Any]:
        """Convert model to dictionary for API responses"""
        return {
            'id': str(self.id),
            'tenant_id': str(self.tenant_id),
            'production_batch_id': str(self.production_batch_id),
            'recipe_ingredient_id': str(self.recipe_ingredient_id),
            'ingredient_id': str(self.ingredient_id),
            'stock_id': str(self.stock_id) if self.stock_id else None,
            'planned_quantity': self.planned_quantity,
            'actual_quantity': self.actual_quantity,
            'unit': self.unit.value if self.unit else None,
            'variance_quantity': self.variance_quantity,
            'variance_percentage': self.variance_percentage,
            'unit_cost': float(self.unit_cost) if self.unit_cost else None,
            'total_cost': float(self.total_cost) if self.total_cost else None,
            'consumption_time': self.consumption_time.isoformat() if self.consumption_time else None,
            'consumption_notes': self.consumption_notes,
            'staff_member': str(self.staff_member) if self.staff_member else None,
            'ingredient_condition': self.ingredient_condition,
            'quality_impact': self.quality_impact,
            'substitution_used': self.substitution_used,
            'substitution_details': self.substitution_details,
        }
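The variance columns on the consumption model are derived values; their formulas are given in the column comments (`actual - planned` and `(actual - planned) / planned * 100`). A sketch of that computation, guarding against a zero planned quantity (`variance` is an illustrative helper, not part of the service):

```python
def variance(planned: float, actual: float) -> tuple:
    """Return (variance_quantity, variance_percentage) per the model comments."""
    diff = actual - planned
    # Percentage is undefined when nothing was planned; return None instead
    pct = (diff / planned) * 100 if planned else None
    return diff, pct

print(variance(8.0, 10.0))  # prints (2.0, 25.0)
print(variance(0.0, 3.0))   # prints (3.0, None)
```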
class ProductionSchedule(Base):
    """Production planning and scheduling"""
    __tablename__ = "production_schedules"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)

    # Schedule details
    schedule_date = Column(DateTime(timezone=True), nullable=False, index=True)
    schedule_name = Column(String(255), nullable=True)

    # Production planning
    total_planned_batches = Column(Integer, nullable=False, default=0)
    total_planned_items = Column(Float, nullable=False, default=0.0)
    estimated_production_hours = Column(Float, nullable=True)
    estimated_material_cost = Column(Numeric(10, 2), nullable=True)

    # Schedule status
    is_published = Column(Boolean, default=False)
    is_completed = Column(Boolean, default=False)
    completion_percentage = Column(Float, nullable=True)

    # Planning constraints
    available_staff_hours = Column(Float, nullable=True)
    oven_capacity_hours = Column(Float, nullable=True)
    production_capacity_limit = Column(Float, nullable=True)

    # Notes and instructions
    schedule_notes = Column(Text, nullable=True)
    preparation_instructions = Column(Text, nullable=True)
    special_requirements = Column(JSONB, nullable=True)

    # Audit fields
    created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
    updated_at = Column(DateTime(timezone=True),
                        default=lambda: datetime.now(timezone.utc),
                        onupdate=lambda: datetime.now(timezone.utc))
    created_by = Column(UUID(as_uuid=True), nullable=True)
    published_by = Column(UUID(as_uuid=True), nullable=True)
    published_at = Column(DateTime(timezone=True), nullable=True)

    __table_args__ = (
        Index('idx_production_schedules_tenant_date', 'tenant_id', 'schedule_date'),
        Index('idx_production_schedules_published', 'tenant_id', 'is_published', 'schedule_date'),
        Index('idx_production_schedules_completed', 'tenant_id', 'is_completed', 'schedule_date'),
    )

    def to_dict(self) -> Dict[str, Any]:
        """Convert model to dictionary for API responses"""
        return {
            'id': str(self.id),
            'tenant_id': str(self.tenant_id),
            'schedule_date': self.schedule_date.isoformat() if self.schedule_date else None,
            'schedule_name': self.schedule_name,
            'total_planned_batches': self.total_planned_batches,
            'total_planned_items': self.total_planned_items,
            'estimated_production_hours': self.estimated_production_hours,
|
||||
'estimated_material_cost': float(self.estimated_material_cost) if self.estimated_material_cost else None,
|
||||
'is_published': self.is_published,
|
||||
'is_completed': self.is_completed,
|
||||
'completion_percentage': self.completion_percentage,
|
||||
'available_staff_hours': self.available_staff_hours,
|
||||
'oven_capacity_hours': self.oven_capacity_hours,
|
||||
'production_capacity_limit': self.production_capacity_limit,
|
||||
'schedule_notes': self.schedule_notes,
|
||||
'preparation_instructions': self.preparation_instructions,
|
||||
'special_requirements': self.special_requirements,
|
||||
'created_at': self.created_at.isoformat() if self.created_at else None,
|
||||
'updated_at': self.updated_at.isoformat() if self.updated_at else None,
|
||||
'created_by': str(self.created_by) if self.created_by else None,
|
||||
'published_by': str(self.published_by) if self.published_by else None,
|
||||
'published_at': self.published_at.isoformat() if self.published_at else None,
|
||||
}
|
||||
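The `to_dict` methods above follow one serialization convention throughout: UUIDs become strings, `Numeric` money columns become floats, datetimes become ISO-8601 strings, and `None` passes through. That convention can be sketched as a standalone helper (the function name is illustrative, not part of the models):

```python
from datetime import datetime, timezone
from decimal import Decimal
from typing import Any
from uuid import UUID

def serialize_value(value: Any) -> Any:
    """Apply the to_dict conventions: UUID -> str, Decimal -> float,
    datetime -> ISO-8601 string; None and plain values pass through."""
    if value is None:
        return value
    if isinstance(value, UUID):
        return str(value)
    if isinstance(value, Decimal):
        return float(value)
    if isinstance(value, datetime):
        return value.isoformat()
    return value

# A row shaped like a ProductionSchedule, serialized field by field
row = {
    "id": UUID("12345678-1234-5678-1234-567812345678"),
    "estimated_material_cost": Decimal("42.50"),
    "published_at": datetime(2024, 1, 15, 8, 30, tzinfo=timezone.utc),
    "schedule_name": None,
}
payload = {key: serialize_value(value) for key, value in row.items()}
```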
7
services/recipes/app/repositories/__init__.py
Normal file
@@ -0,0 +1,7 @@
# services/recipes/app/repositories/__init__.py

from .recipe_repository import RecipeRepository

__all__ = [
    "RecipeRepository"
]
270
services/recipes/app/repositories/recipe_repository.py
Normal file
@@ -0,0 +1,270 @@
# services/recipes/app/repositories/recipe_repository.py
"""
Async recipe repository for database operations
"""

from typing import List, Optional, Dict, Any
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func, and_, or_
from sqlalchemy.orm import selectinload
from uuid import UUID
from datetime import datetime
import structlog

from shared.database.repository import BaseRepository
from ..models.recipes import Recipe, RecipeIngredient, RecipeStatus
from ..schemas.recipes import RecipeCreate, RecipeUpdate

logger = structlog.get_logger()


class RecipeRepository(BaseRepository[Recipe, RecipeCreate, RecipeUpdate]):
    """Async repository for recipe operations"""

    def __init__(self, session: AsyncSession):
        super().__init__(Recipe, session)

    async def get_recipe_with_ingredients(self, recipe_id: UUID) -> Optional[Dict[str, Any]]:
        """Get recipe with ingredients loaded"""
        result = await self.session.execute(
            select(Recipe)
            .options(selectinload(Recipe.ingredients))
            .where(Recipe.id == recipe_id)
        )
        recipe = result.scalar_one_or_none()

        if not recipe:
            return None

        return {
            "id": str(recipe.id),
            "tenant_id": str(recipe.tenant_id),
            "name": recipe.name,
            "recipe_code": recipe.recipe_code,
            "version": recipe.version,
            "finished_product_id": str(recipe.finished_product_id),
            "description": recipe.description,
            "category": recipe.category,
            "cuisine_type": recipe.cuisine_type,
            "difficulty_level": recipe.difficulty_level,
            "yield_quantity": float(recipe.yield_quantity),
            "yield_unit": recipe.yield_unit.value if hasattr(recipe.yield_unit, 'value') else recipe.yield_unit,
            "prep_time_minutes": recipe.prep_time_minutes,
            "cook_time_minutes": recipe.cook_time_minutes,
            "total_time_minutes": recipe.total_time_minutes,
            "rest_time_minutes": recipe.rest_time_minutes,
            "estimated_cost_per_unit": float(recipe.estimated_cost_per_unit) if recipe.estimated_cost_per_unit else None,
            "last_calculated_cost": float(recipe.last_calculated_cost) if recipe.last_calculated_cost else None,
            "cost_calculation_date": recipe.cost_calculation_date.isoformat() if recipe.cost_calculation_date else None,
            "target_margin_percentage": recipe.target_margin_percentage,
            "suggested_selling_price": float(recipe.suggested_selling_price) if recipe.suggested_selling_price else None,
            "instructions": recipe.instructions,
            "preparation_notes": recipe.preparation_notes,
            "storage_instructions": recipe.storage_instructions,
            "quality_check_configuration": recipe.quality_check_configuration,
            "serves_count": recipe.serves_count,
            "nutritional_info": recipe.nutritional_info,
            "allergen_info": recipe.allergen_info,
            "dietary_tags": recipe.dietary_tags,
            "batch_size_multiplier": float(recipe.batch_size_multiplier),
            "minimum_batch_size": float(recipe.minimum_batch_size) if recipe.minimum_batch_size else None,
            "maximum_batch_size": float(recipe.maximum_batch_size) if recipe.maximum_batch_size else None,
            "optimal_production_temperature": float(recipe.optimal_production_temperature) if recipe.optimal_production_temperature else None,
            "optimal_humidity": float(recipe.optimal_humidity) if recipe.optimal_humidity else None,
            "status": recipe.status.value if hasattr(recipe.status, 'value') else recipe.status,
            "is_seasonal": recipe.is_seasonal,
            "season_start_month": recipe.season_start_month,
            "season_end_month": recipe.season_end_month,
            "is_signature_item": recipe.is_signature_item,
            "created_at": recipe.created_at.isoformat() if recipe.created_at else None,
            "updated_at": recipe.updated_at.isoformat() if recipe.updated_at else None,
            "created_by": str(recipe.created_by) if recipe.created_by else None,
            "updated_by": str(recipe.updated_by) if hasattr(recipe, 'updated_by') and recipe.updated_by else None,
            "ingredients": [
                {
                    "id": str(ingredient.id),
                    "tenant_id": str(ingredient.tenant_id),
                    "recipe_id": str(ingredient.recipe_id),
                    "ingredient_id": str(ingredient.ingredient_id),
                    "quantity": float(ingredient.quantity),
                    "unit": ingredient.unit.value if hasattr(ingredient.unit, 'value') else ingredient.unit,
                    "quantity_in_base_unit": float(ingredient.quantity_in_base_unit) if ingredient.quantity_in_base_unit else None,
                    "alternative_quantity": float(ingredient.alternative_quantity) if ingredient.alternative_quantity else None,
                    "alternative_unit": ingredient.alternative_unit.value if hasattr(ingredient.alternative_unit, 'value') and ingredient.alternative_unit else None,
                    "preparation_method": ingredient.preparation_method,
                    "ingredient_notes": ingredient.ingredient_notes,
                    "is_optional": ingredient.is_optional,
                    "ingredient_order": ingredient.ingredient_order,
                    "ingredient_group": ingredient.ingredient_group,
                    "substitution_options": ingredient.substitution_options,
                    "substitution_ratio": float(ingredient.substitution_ratio) if ingredient.substitution_ratio else None,
                    "unit_cost": float(ingredient.unit_cost) if hasattr(ingredient, 'unit_cost') and ingredient.unit_cost else None,
                    "total_cost": float(ingredient.total_cost) if hasattr(ingredient, 'total_cost') and ingredient.total_cost else None,
                    "cost_updated_at": ingredient.cost_updated_at.isoformat() if hasattr(ingredient, 'cost_updated_at') and ingredient.cost_updated_at else None
                }
                for ingredient in recipe.ingredients
            ] if hasattr(recipe, 'ingredients') else []
        }

    async def search_recipes(
        self,
        tenant_id: UUID,
        search_term: Optional[str] = None,
        status: Optional[str] = None,
        category: Optional[str] = None,
        is_seasonal: Optional[bool] = None,
        is_signature: Optional[bool] = None,
        difficulty_level: Optional[int] = None,
        limit: int = 100,
        offset: int = 0
    ) -> List[Dict[str, Any]]:
        """Search recipes with multiple filters"""
        query = select(Recipe).where(Recipe.tenant_id == tenant_id)

        # Text search
        if search_term:
            query = query.where(
                or_(
                    Recipe.name.ilike(f"%{search_term}%"),
                    Recipe.description.ilike(f"%{search_term}%")
                )
            )

        # Status filter
        if status:
            query = query.where(Recipe.status == status)

        # Category filter
        if category:
            query = query.where(Recipe.category == category)

        # Seasonal filter
        if is_seasonal is not None:
            query = query.where(Recipe.is_seasonal == is_seasonal)

        # Signature filter
        if is_signature is not None:
            query = query.where(Recipe.is_signature_item == is_signature)

        # Difficulty filter
        if difficulty_level is not None:
            query = query.where(Recipe.difficulty_level == difficulty_level)

        # Apply ordering and pagination
        query = query.order_by(Recipe.name).limit(limit).offset(offset)

        result = await self.session.execute(query)
        recipes = result.scalars().all()

        return [
            {
                "id": str(recipe.id),
                "tenant_id": str(recipe.tenant_id),
                "name": recipe.name,
                "recipe_code": recipe.recipe_code,
                "version": recipe.version,
                "finished_product_id": str(recipe.finished_product_id),
                "description": recipe.description,
                "category": recipe.category,
                "cuisine_type": recipe.cuisine_type,
                "difficulty_level": recipe.difficulty_level,
                "yield_quantity": float(recipe.yield_quantity),
                "yield_unit": recipe.yield_unit.value if hasattr(recipe.yield_unit, 'value') else recipe.yield_unit,
                "prep_time_minutes": recipe.prep_time_minutes,
                "cook_time_minutes": recipe.cook_time_minutes,
                "total_time_minutes": recipe.total_time_minutes,
                "rest_time_minutes": recipe.rest_time_minutes,
                "estimated_cost_per_unit": float(recipe.estimated_cost_per_unit) if recipe.estimated_cost_per_unit else None,
                "last_calculated_cost": float(recipe.last_calculated_cost) if recipe.last_calculated_cost else None,
                "cost_calculation_date": recipe.cost_calculation_date.isoformat() if recipe.cost_calculation_date else None,
                "target_margin_percentage": recipe.target_margin_percentage,
                "suggested_selling_price": float(recipe.suggested_selling_price) if recipe.suggested_selling_price else None,
                "instructions": recipe.instructions,
                "preparation_notes": recipe.preparation_notes,
                "storage_instructions": recipe.storage_instructions,
                "quality_check_configuration": recipe.quality_check_configuration,
                "serves_count": recipe.serves_count,
                "nutritional_info": recipe.nutritional_info,
                "allergen_info": recipe.allergen_info,
                "dietary_tags": recipe.dietary_tags,
                "batch_size_multiplier": float(recipe.batch_size_multiplier),
                "minimum_batch_size": float(recipe.minimum_batch_size) if recipe.minimum_batch_size else None,
                "maximum_batch_size": float(recipe.maximum_batch_size) if recipe.maximum_batch_size else None,
                "optimal_production_temperature": float(recipe.optimal_production_temperature) if recipe.optimal_production_temperature else None,
                "optimal_humidity": float(recipe.optimal_humidity) if recipe.optimal_humidity else None,
                "status": recipe.status.value if hasattr(recipe.status, 'value') else recipe.status,
                "is_seasonal": recipe.is_seasonal,
                "season_start_month": recipe.season_start_month,
                "season_end_month": recipe.season_end_month,
                "is_signature_item": recipe.is_signature_item,
                "created_at": recipe.created_at.isoformat() if recipe.created_at else None,
                "updated_at": recipe.updated_at.isoformat() if recipe.updated_at else None,
                "created_by": str(recipe.created_by) if recipe.created_by else None,
                "updated_by": str(recipe.updated_by) if hasattr(recipe, 'updated_by') and recipe.updated_by else None,
                "ingredients": []  # For list view, don't load ingredients to improve performance
            }
            for recipe in recipes
        ]

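`search_recipes` builds its statement by conditionally appending WHERE clauses: each filter narrows the query only when its argument is provided, and boolean filters check `is not None` so that an explicit `False` still filters. The same progressive-narrowing pattern can be sketched without SQLAlchemy, over plain dicts (illustrative only, not the repository code):

```python
from typing import Any, Callable, Dict, List, Optional

def filter_recipes(
    recipes: List[Dict[str, Any]],
    search_term: Optional[str] = None,
    category: Optional[str] = None,
    is_seasonal: Optional[bool] = None,
) -> List[Dict[str, Any]]:
    """Mirror the repository's conditional narrowing in plain Python."""
    predicates: List[Callable[[Dict[str, Any]], bool]] = []
    if search_term:
        term = search_term.lower()
        predicates.append(lambda r: term in r["name"].lower())
    if category:
        predicates.append(lambda r: r["category"] == category)
    if is_seasonal is not None:  # False is a real filter, unlike None
        predicates.append(lambda r: r["is_seasonal"] == is_seasonal)
    return [r for r in recipes if all(p(r) for p in predicates)]

recipes = [
    {"name": "Sourdough Loaf", "category": "bread", "is_seasonal": False},
    {"name": "Pumpkin Pie", "category": "pastry", "is_seasonal": True},
]
```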
    async def get_recipe_statistics(self, tenant_id: UUID) -> Dict[str, Any]:
        """Get recipe statistics for dashboard"""
        # Total recipes
        total_result = await self.session.execute(
            select(func.count(Recipe.id)).where(Recipe.tenant_id == tenant_id)
        )
        total_recipes = total_result.scalar() or 0

        # Active recipes
        active_result = await self.session.execute(
            select(func.count(Recipe.id)).where(
                and_(
                    Recipe.tenant_id == tenant_id,
                    Recipe.status == RecipeStatus.ACTIVE
                )
            )
        )
        active_recipes = active_result.scalar() or 0

        # Signature recipes
        signature_result = await self.session.execute(
            select(func.count(Recipe.id)).where(
                and_(
                    Recipe.tenant_id == tenant_id,
                    Recipe.is_signature_item == True
                )
            )
        )
        signature_recipes = signature_result.scalar() or 0

        # Seasonal recipes
        seasonal_result = await self.session.execute(
            select(func.count(Recipe.id)).where(
                and_(
                    Recipe.tenant_id == tenant_id,
                    Recipe.is_seasonal == True
                )
            )
        )
        seasonal_recipes = seasonal_result.scalar() or 0

        # Category breakdown
        category_result = await self.session.execute(
            select(Recipe.category, func.count(Recipe.id))
            .where(Recipe.tenant_id == tenant_id)
            .group_by(Recipe.category)
        )
        category_data = category_result.all()

        # Convert to a list of dicts for the schema
        category_breakdown = [
            {"category": category or "Uncategorized", "count": count}
            for category, count in category_data
        ]

        return {
            "total_recipes": total_recipes,
            "active_recipes": active_recipes,
            "signature_recipes": signature_recipes,
            "seasonal_recipes": seasonal_recipes,
            "category_breakdown": category_breakdown
        }
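The category breakdown above relies on a SQL GROUP BY and maps NULL categories to "Uncategorized". The same shape can be produced in plain Python with `collections.Counter` (a sketch of the transformation, not the repository code):

```python
from collections import Counter
from typing import Any, Dict, List, Optional

def category_breakdown(categories: List[Optional[str]]) -> List[Dict[str, Any]]:
    """Group recipe categories into the schema's list-of-dicts shape,
    folding None into the "Uncategorized" bucket."""
    counts = Counter(category or "Uncategorized" for category in categories)
    return [{"category": category, "count": count}
            for category, count in sorted(counts.items())]

breakdown = category_breakdown(["bread", "pastry", None, "bread"])
```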
25
services/recipes/app/schemas/__init__.py
Normal file
@@ -0,0 +1,25 @@
# services/recipes/app/schemas/__init__.py

from .recipes import (
    RecipeCreate,
    RecipeUpdate,
    RecipeResponse,
    RecipeIngredientCreate,
    RecipeIngredientResponse,
    RecipeSearchRequest,
    RecipeFeasibilityResponse,
    RecipeDuplicateRequest,
    RecipeStatisticsResponse
)

__all__ = [
    "RecipeCreate",
    "RecipeUpdate",
    "RecipeResponse",
    "RecipeIngredientCreate",
    "RecipeIngredientResponse",
    "RecipeSearchRequest",
    "RecipeFeasibilityResponse",
    "RecipeDuplicateRequest",
    "RecipeStatisticsResponse"
]
273
services/recipes/app/schemas/recipes.py
Normal file
@@ -0,0 +1,273 @@
# services/recipes/app/schemas/recipes.py
"""
Pydantic schemas for recipe-related API requests and responses
"""

from pydantic import BaseModel, Field
from typing import List, Optional, Dict, Any
from uuid import UUID
from datetime import datetime
from enum import Enum

from ..models.recipes import RecipeStatus, MeasurementUnit


# Quality Template Association Schemas
class QualityStageConfiguration(BaseModel):
    """Schema for quality checks configuration per production stage"""
    template_ids: List[UUID] = Field(default_factory=list, description="Quality template IDs for this stage")
    required_checks: List[str] = Field(default_factory=list, description="Required quality check types")
    optional_checks: List[str] = Field(default_factory=list, description="Optional quality check types")
    blocking_on_failure: bool = Field(default=True, description="Block stage progression on critical failures")
    min_quality_score: Optional[float] = Field(None, ge=0, le=10, description="Minimum quality score to pass stage")


class RecipeQualityConfiguration(BaseModel):
    """Schema for recipe quality configuration across all stages"""
    stages: Dict[str, QualityStageConfiguration] = Field(default_factory=dict, description="Quality configuration per stage")
    overall_quality_threshold: float = Field(default=7.0, ge=0, le=10, description="Overall quality threshold for batch")
    critical_stage_blocking: bool = Field(default=True, description="Block progression if critical checks fail")
    auto_create_quality_checks: bool = Field(default=True, description="Automatically create quality checks for batches")
    quality_manager_approval_required: bool = Field(default=False, description="Require quality manager approval")


class RecipeQualityConfigurationUpdate(BaseModel):
    """Schema for updating recipe quality configuration"""
    stages: Optional[Dict[str, QualityStageConfiguration]] = None
    overall_quality_threshold: Optional[float] = Field(None, ge=0, le=10)
    critical_stage_blocking: Optional[bool] = None
    auto_create_quality_checks: Optional[bool] = None
    quality_manager_approval_required: Optional[bool] = None

class RecipeIngredientCreate(BaseModel):
    """Schema for creating recipe ingredients"""
    ingredient_id: UUID
    quantity: float = Field(..., gt=0)
    unit: MeasurementUnit
    alternative_quantity: Optional[float] = None
    alternative_unit: Optional[MeasurementUnit] = None
    preparation_method: Optional[str] = None
    ingredient_notes: Optional[str] = None
    is_optional: bool = False
    ingredient_order: int = Field(..., ge=1)
    ingredient_group: Optional[str] = None
    substitution_options: Optional[Dict[str, Any]] = None
    substitution_ratio: Optional[float] = None


class RecipeIngredientUpdate(BaseModel):
    """Schema for updating recipe ingredients"""
    ingredient_id: Optional[UUID] = None
    quantity: Optional[float] = Field(None, gt=0)
    unit: Optional[MeasurementUnit] = None
    alternative_quantity: Optional[float] = None
    alternative_unit: Optional[MeasurementUnit] = None
    preparation_method: Optional[str] = None
    ingredient_notes: Optional[str] = None
    is_optional: Optional[bool] = None
    ingredient_order: Optional[int] = Field(None, ge=1)
    ingredient_group: Optional[str] = None
    substitution_options: Optional[Dict[str, Any]] = None
    substitution_ratio: Optional[float] = None


class RecipeIngredientResponse(BaseModel):
    """Schema for recipe ingredient responses"""
    id: UUID
    tenant_id: UUID
    recipe_id: UUID
    ingredient_id: UUID
    quantity: float
    unit: str
    quantity_in_base_unit: Optional[float] = None
    alternative_quantity: Optional[float] = None
    alternative_unit: Optional[str] = None
    preparation_method: Optional[str] = None
    ingredient_notes: Optional[str] = None
    is_optional: bool
    ingredient_order: int
    ingredient_group: Optional[str] = None
    substitution_options: Optional[Dict[str, Any]] = None
    substitution_ratio: Optional[float] = None
    unit_cost: Optional[float] = None
    total_cost: Optional[float] = None
    cost_updated_at: Optional[datetime] = None

    class Config:
        from_attributes = True

class RecipeCreate(BaseModel):
    """Schema for creating recipes"""
    name: str = Field(..., min_length=1, max_length=255)
    recipe_code: Optional[str] = Field(None, max_length=100)
    version: str = Field(default="1.0", max_length=20)
    finished_product_id: UUID
    description: Optional[str] = None
    category: Optional[str] = Field(None, max_length=100)
    cuisine_type: Optional[str] = Field(None, max_length=100)
    difficulty_level: int = Field(default=1, ge=1, le=5)
    yield_quantity: float = Field(..., gt=0)
    yield_unit: MeasurementUnit
    prep_time_minutes: Optional[int] = Field(None, ge=0)
    cook_time_minutes: Optional[int] = Field(None, ge=0)
    total_time_minutes: Optional[int] = Field(None, ge=0)
    rest_time_minutes: Optional[int] = Field(None, ge=0)
    instructions: Optional[Dict[str, Any]] = None
    preparation_notes: Optional[str] = None
    storage_instructions: Optional[str] = None
    quality_check_configuration: Optional[RecipeQualityConfiguration] = None
    serves_count: Optional[int] = Field(None, ge=1)
    nutritional_info: Optional[Dict[str, Any]] = None
    allergen_info: Optional[Dict[str, Any]] = None
    dietary_tags: Optional[Dict[str, Any]] = None
    batch_size_multiplier: float = Field(default=1.0, gt=0)
    minimum_batch_size: Optional[float] = Field(None, gt=0)
    maximum_batch_size: Optional[float] = Field(None, gt=0)
    optimal_production_temperature: Optional[float] = None
    optimal_humidity: Optional[float] = Field(None, ge=0, le=100)
    is_seasonal: bool = False
    season_start_month: Optional[int] = Field(None, ge=1, le=12)
    season_end_month: Optional[int] = Field(None, ge=1, le=12)
    is_signature_item: bool = False
    target_margin_percentage: Optional[float] = Field(None, ge=0)
    ingredients: List[RecipeIngredientCreate] = Field(..., min_items=1)


class RecipeUpdate(BaseModel):
    """Schema for updating recipes"""
    name: Optional[str] = Field(None, min_length=1, max_length=255)
    recipe_code: Optional[str] = Field(None, max_length=100)
    version: Optional[str] = Field(None, max_length=20)
    description: Optional[str] = None
    category: Optional[str] = Field(None, max_length=100)
    cuisine_type: Optional[str] = Field(None, max_length=100)
    difficulty_level: Optional[int] = Field(None, ge=1, le=5)
    yield_quantity: Optional[float] = Field(None, gt=0)
    yield_unit: Optional[MeasurementUnit] = None
    prep_time_minutes: Optional[int] = Field(None, ge=0)
    cook_time_minutes: Optional[int] = Field(None, ge=0)
    total_time_minutes: Optional[int] = Field(None, ge=0)
    rest_time_minutes: Optional[int] = Field(None, ge=0)
    instructions: Optional[Dict[str, Any]] = None
    preparation_notes: Optional[str] = None
    storage_instructions: Optional[str] = None
    quality_check_configuration: Optional[RecipeQualityConfigurationUpdate] = None
    serves_count: Optional[int] = Field(None, ge=1)
    nutritional_info: Optional[Dict[str, Any]] = None
    allergen_info: Optional[Dict[str, Any]] = None
    dietary_tags: Optional[Dict[str, Any]] = None
    batch_size_multiplier: Optional[float] = Field(None, gt=0)
    minimum_batch_size: Optional[float] = Field(None, gt=0)
    maximum_batch_size: Optional[float] = Field(None, gt=0)
    optimal_production_temperature: Optional[float] = None
    optimal_humidity: Optional[float] = Field(None, ge=0, le=100)
    status: Optional[RecipeStatus] = None
    is_seasonal: Optional[bool] = None
    season_start_month: Optional[int] = Field(None, ge=1, le=12)
    season_end_month: Optional[int] = Field(None, ge=1, le=12)
    is_signature_item: Optional[bool] = None
    target_margin_percentage: Optional[float] = Field(None, ge=0)
    ingredients: Optional[List[RecipeIngredientCreate]] = None

class RecipeResponse(BaseModel):
    """Schema for recipe responses"""
    id: UUID
    tenant_id: UUID
    name: str
    recipe_code: Optional[str] = None
    version: str
    finished_product_id: UUID
    description: Optional[str] = None
    category: Optional[str] = None
    cuisine_type: Optional[str] = None
    difficulty_level: int
    yield_quantity: float
    yield_unit: str
    prep_time_minutes: Optional[int] = None
    cook_time_minutes: Optional[int] = None
    total_time_minutes: Optional[int] = None
    rest_time_minutes: Optional[int] = None
    estimated_cost_per_unit: Optional[float] = None
    last_calculated_cost: Optional[float] = None
    cost_calculation_date: Optional[datetime] = None
    target_margin_percentage: Optional[float] = None
    suggested_selling_price: Optional[float] = None
    instructions: Optional[Dict[str, Any]] = None
    preparation_notes: Optional[str] = None
    storage_instructions: Optional[str] = None
    quality_check_configuration: Optional[RecipeQualityConfiguration] = None
    serves_count: Optional[int] = None
    nutritional_info: Optional[Dict[str, Any]] = None
    allergen_info: Optional[Dict[str, Any]] = None
    dietary_tags: Optional[Dict[str, Any]] = None
    batch_size_multiplier: float
    minimum_batch_size: Optional[float] = None
    maximum_batch_size: Optional[float] = None
    optimal_production_temperature: Optional[float] = None
    optimal_humidity: Optional[float] = None
    status: str
    is_seasonal: bool
    season_start_month: Optional[int] = None
    season_end_month: Optional[int] = None
    is_signature_item: bool
    created_at: datetime
    updated_at: datetime
    created_by: Optional[UUID] = None
    updated_by: Optional[UUID] = None
    ingredients: Optional[List[RecipeIngredientResponse]] = None

    class Config:
        from_attributes = True


class RecipeDeletionSummary(BaseModel):
    """Summary of what will be deleted when hard-deleting a recipe"""
    recipe_id: UUID
    recipe_name: str
    recipe_code: str
    production_batches_count: int
    recipe_ingredients_count: int
    dependent_recipes_count: int  # Recipes that use this as an ingredient/sub-recipe
    affected_orders_count: int  # Orders that include this recipe
    last_used_date: Optional[datetime] = None
    can_delete: bool
    warnings: List[str] = []

class RecipeSearchRequest(BaseModel):
    """Schema for recipe search requests"""
    search_term: Optional[str] = None
    status: Optional[RecipeStatus] = None
    category: Optional[str] = None
    is_seasonal: Optional[bool] = None
    is_signature: Optional[bool] = None
    difficulty_level: Optional[int] = Field(None, ge=1, le=5)
    limit: int = Field(default=100, ge=1, le=1000)
    offset: int = Field(default=0, ge=0)


class RecipeDuplicateRequest(BaseModel):
    """Schema for recipe duplication requests"""
    new_name: str = Field(..., min_length=1, max_length=255)


class RecipeFeasibilityResponse(BaseModel):
    """Schema for recipe feasibility check responses"""
    recipe_id: UUID
    recipe_name: str
    batch_multiplier: float
    feasible: bool
    missing_ingredients: List[Dict[str, Any]] = []
    insufficient_ingredients: List[Dict[str, Any]] = []


class RecipeStatisticsResponse(BaseModel):
    """Schema for recipe statistics responses"""
    total_recipes: int
    active_recipes: int
    signature_recipes: int
    seasonal_recipes: int
    category_breakdown: List[Dict[str, Any]]
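`season_start_month` and `season_end_month` are plain 1-12 integers, so a season that wraps the year end (say, November through February) needs explicit handling when checking availability. One way to interpret those fields (an assumption for illustration; the service's actual rule may differ):

```python
def is_in_season(month: int, start_month: int, end_month: int) -> bool:
    """True when `month` falls inside the inclusive start..end window.

    When start_month > end_month the season is assumed to wrap the
    year end, e.g. 11..2 covers Nov, Dec, Jan, and Feb.
    """
    if start_month <= end_month:
        return start_month <= month <= end_month
    return month >= start_month or month <= end_month

# A November-to-February seasonal item is available in December
```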
7
services/recipes/app/services/__init__.py
Normal file
@@ -0,0 +1,7 @@
# services/recipes/app/services/__init__.py

from .recipe_service import RecipeService

__all__ = [
    "RecipeService"
]
519
services/recipes/app/services/recipe_service.py
Normal file
@@ -0,0 +1,519 @@
# services/recipes/app/services/recipe_service.py
"""
Service layer for recipe management operations
"""

import logging
from typing import List, Optional, Dict, Any
from uuid import UUID
from datetime import datetime
from sqlalchemy.ext.asyncio import AsyncSession

from ..repositories.recipe_repository import RecipeRepository
from ..schemas.recipes import RecipeCreate, RecipeUpdate

logger = logging.getLogger(__name__)


class RecipeService:
    """Async service for recipe management operations"""

    def __init__(self, session: AsyncSession):
        self.session = session
        self.recipe_repo = RecipeRepository(session)

    async def get_recipe_with_ingredients(self, recipe_id: UUID) -> Optional[Dict[str, Any]]:
        """Get recipe by ID with ingredients"""
        try:
            return await self.recipe_repo.get_recipe_with_ingredients(recipe_id)
        except Exception as e:
            logger.error(f"Error getting recipe {recipe_id}: {e}")
            return None

    async def search_recipes(
        self,
        tenant_id: UUID,
        search_term: Optional[str] = None,
        status: Optional[str] = None,
        category: Optional[str] = None,
        is_seasonal: Optional[bool] = None,
        is_signature: Optional[bool] = None,
        difficulty_level: Optional[int] = None,
        limit: int = 100,
        offset: int = 0
    ) -> List[Dict[str, Any]]:
        """Search recipes with filters"""
        try:
            return await self.recipe_repo.search_recipes(
                tenant_id=tenant_id,
                search_term=search_term,
                status=status,
                category=category,
                is_seasonal=is_seasonal,
                is_signature=is_signature,
                difficulty_level=difficulty_level,
                limit=limit,
                offset=offset
            )
        except Exception as e:
            logger.error(f"Error searching recipes: {e}")
            return []

    async def get_recipe_statistics(self, tenant_id: UUID) -> Dict[str, Any]:
        """Get recipe statistics for dashboard"""
        try:
            return await self.recipe_repo.get_recipe_statistics(tenant_id)
        except Exception as e:
            logger.error(f"Error getting recipe statistics: {e}")
            # Include category_breakdown so the fallback matches
            # RecipeStatisticsResponse
            return {
                "total_recipes": 0,
                "active_recipes": 0,
                "signature_recipes": 0,
                "seasonal_recipes": 0,
                "category_breakdown": []
            }

    async def get_deletion_summary(self, recipe_id: UUID) -> Dict[str, Any]:
        """Get a summary of what will be affected by deleting this recipe"""
        try:
            from sqlalchemy import select, func
            from ..models.recipes import RecipeIngredient, RecipeStatus

            # Get recipe info
            recipe = await self.recipe_repo.get_by_id(recipe_id)
            if not recipe:
                return {"success": False, "error": "Recipe not found"}

            # Count recipe ingredients
            ingredients_result = await self.session.execute(
                select(func.count(RecipeIngredient.id))
                .where(RecipeIngredient.recipe_id == recipe_id)
            )
            ingredients_count = ingredients_result.scalar() or 0

            # Count production batches using this recipe.
            # TODO: query the production batches table once the production
            # models are available; until then this is always 0.
            production_batches_count = 0

            # Count dependent recipes (recipes using this one as an ingredient) - future feature
            dependent_recipes_count = 0

            # Count affected orders - would need orders service integration
            affected_orders_count = 0

            # Determine whether deletion is safe
            warnings = []
            can_delete = True

            if production_batches_count > 0:
                warnings.append(f"This recipe has {production_batches_count} associated production batches")
                can_delete = False

            if affected_orders_count > 0:
                warnings.append(f"This recipe is used in {affected_orders_count} orders")
                can_delete = False

            if dependent_recipes_count > 0:
                warnings.append(f"{dependent_recipes_count} recipes depend on this one")

            if recipe.status == RecipeStatus.ACTIVE:
                warnings.append("This recipe is active. Consider archiving it first.")

            return {
                "success": True,
                "data": {
                    "recipe_id": str(recipe.id),
                    "recipe_name": recipe.name,
                    "recipe_code": recipe.recipe_code or "",
                    "production_batches_count": production_batches_count,
                    "recipe_ingredients_count": ingredients_count,
                    "dependent_recipes_count": dependent_recipes_count,
                    "affected_orders_count": affected_orders_count,
                    "last_used_date": None,
                    "can_delete": can_delete,
                    "warnings": warnings
                }
            }
        except Exception as e:
            logger.error(f"Error getting deletion summary: {e}")
            return {"success": False, "error": str(e)}

    async def create_recipe(
        self,
        recipe_data: Dict[str, Any],
        ingredients_data: List[Dict[str, Any]],
        created_by: UUID
    ) -> Dict[str, Any]:
        """Create a new recipe with ingredients"""
        from ..models.recipes import Recipe, RecipeIngredient, RecipeStatus

        try:
            # Add metadata
            recipe_data["created_by"] = created_by
            recipe_data["created_at"] = datetime.utcnow()
            recipe_data["updated_at"] = datetime.utcnow()
            recipe_data["status"] = recipe_data.get("status", RecipeStatus.DRAFT)

            # Create the Recipe model directly (without ingredients)
            recipe = Recipe(**recipe_data)
            self.session.add(recipe)
            await self.session.flush()  # Populate the recipe ID

            # Create ingredients with the recipe_id and tenant_id
            for ing_data in ingredients_data:
                ingredient = RecipeIngredient(
                    recipe_id=recipe.id,
                    tenant_id=recipe.tenant_id,  # Inherit tenant_id from the recipe
                    **ing_data
                )
                self.session.add(ingredient)

            await self.session.flush()

            # Commit the transaction to persist changes
            await self.session.commit()

            # Return the created recipe with its ingredients
            result = await self.recipe_repo.get_recipe_with_ingredients(recipe.id)

            return {
                "success": True,
                "data": result
            }

        except Exception as e:
            logger.error(f"Error creating recipe: {e}")
            await self.session.rollback()
            return {
                "success": False,
                "error": str(e)
            }

    async def update_recipe(
        self,
        recipe_id: UUID,
        recipe_data: Dict[str, Any],
        ingredients_data: Optional[List[Dict[str, Any]]] = None,
        updated_by: Optional[UUID] = None
    ) -> Dict[str, Any]:
        """Update an existing recipe"""
        try:
            # Check that the recipe exists
            existing_recipe = await self.recipe_repo.get_by_id(recipe_id)
            if not existing_recipe:
                return {
                    "success": False,
                    "error": "Recipe not found"
                }

            # Status transition business rules
            if "status" in recipe_data:
                from ..models.recipes import RecipeStatus
                new_status = recipe_data["status"]
                current_status = existing_recipe.status

                # Cannot reactivate discontinued recipes
                if current_status == RecipeStatus.DISCONTINUED:
                    if new_status != RecipeStatus.DISCONTINUED:
                        return {
                            "success": False,
                            "error": "Cannot reactivate a discontinued recipe. Create a new version instead."
                        }

                # Can only archive active or testing recipes
                if new_status == RecipeStatus.ARCHIVED:
                    if current_status not in [RecipeStatus.ACTIVE, RecipeStatus.TESTING]:
                        return {
                            "success": False,
                            "error": "Can only archive active or testing recipes."
                        }

                # Cannot activate drafts without ingredients
                if new_status == RecipeStatus.ACTIVE and current_status == RecipeStatus.DRAFT:
                    from sqlalchemy import select, func
                    from ..models.recipes import RecipeIngredient

                    result = await self.session.execute(
                        select(func.count(RecipeIngredient.id)).where(RecipeIngredient.recipe_id == recipe_id)
                    )
                    ingredient_count = result.scalar()

                    if ingredient_count == 0:
                        return {
                            "success": False,
                            "error": "Cannot activate a recipe without ingredients."
                        }

            # Add metadata
            if updated_by:
                recipe_data["updated_by"] = updated_by
            recipe_data["updated_at"] = datetime.utcnow()

            # Use the shared repository's update method
            recipe_update = RecipeUpdate(**recipe_data)
            await self.recipe_repo.update(recipe_id, recipe_update)

            # Return the updated recipe with its ingredients
            result = await self.recipe_repo.get_recipe_with_ingredients(recipe_id)

            return {
                "success": True,
                "data": result
            }

        except Exception as e:
            logger.error(f"Error updating recipe {recipe_id}: {e}")
            await self.session.rollback()
            return {
                "success": False,
                "error": str(e)
            }

    async def delete_recipe(self, recipe_id: UUID) -> bool:
        """Delete a recipe"""
        try:
            return await self.recipe_repo.delete(recipe_id)
        except Exception as e:
            logger.error(f"Error deleting recipe {recipe_id}: {e}")
            return False

    async def check_recipe_feasibility(self, recipe_id: UUID, batch_multiplier: float = 1.0) -> Dict[str, Any]:
        """Check whether the recipe can be produced with current inventory"""
        try:
            recipe = await self.recipe_repo.get_recipe_with_ingredients(recipe_id)
            if not recipe:
                return {
                    "success": False,
                    "error": "Recipe not found"
                }

            # Simplified feasibility check - can be enhanced later with inventory service integration
            return {
                "success": True,
                "data": {
                    "recipe_id": str(recipe_id),
                    "recipe_name": recipe["name"],
                    "batch_multiplier": batch_multiplier,
                    "feasible": True,
                    "missing_ingredients": [],
                    "insufficient_ingredients": []
                }
            }

        except Exception as e:
            logger.error(f"Error checking recipe feasibility {recipe_id}: {e}")
            return {
                "success": False,
                "error": str(e)
            }

    async def duplicate_recipe(
        self,
        recipe_id: UUID,
        new_name: str,
        created_by: UUID
    ) -> Dict[str, Any]:
        """Create a duplicate of an existing recipe"""
        try:
            # Get the original recipe
            original_recipe = await self.recipe_repo.get_recipe_with_ingredients(recipe_id)
            if not original_recipe:
                return {
                    "success": False,
                    "error": "Recipe not found"
                }

            # Build the new recipe data
            new_recipe_data = original_recipe.copy()
            new_recipe_data["name"] = new_name

            # Remove fields that should be auto-generated
            new_recipe_data.pop("id", None)
            new_recipe_data.pop("created_at", None)
            new_recipe_data.pop("updated_at", None)

            # Handle ingredients separately
            ingredients = new_recipe_data.pop("ingredients", [])

            # Create the duplicate
            return await self.create_recipe(new_recipe_data, ingredients, created_by)

        except Exception as e:
            logger.error(f"Error duplicating recipe {recipe_id}: {e}")
            await self.session.rollback()
            return {
                "success": False,
                "error": str(e)
            }

    async def activate_recipe(self, recipe_id: UUID, activated_by: UUID) -> Dict[str, Any]:
        """Activate a recipe for production"""
        try:
            # Check that the recipe exists
            recipe = await self.recipe_repo.get_recipe_with_ingredients(recipe_id)
            if not recipe:
                return {
                    "success": False,
                    "error": "Recipe not found"
                }

            if not recipe.get("ingredients"):
                return {
                    "success": False,
                    "error": "Recipe must have at least one ingredient"
                }

            # Update the recipe status
            update_data = {
                "status": "active",
                "updated_by": activated_by,
                "updated_at": datetime.utcnow()
            }

            recipe_update = RecipeUpdate(**update_data)
            await self.recipe_repo.update(recipe_id, recipe_update)

            # Return the updated recipe
            result = await self.recipe_repo.get_recipe_with_ingredients(recipe_id)

            return {
                "success": True,
                "data": result
            }

        except Exception as e:
            logger.error(f"Error activating recipe {recipe_id}: {e}")
            return {
                "success": False,
                "error": str(e)
            }

    # Quality Configuration Methods

    async def update_recipe_quality_configuration(
        self,
        tenant_id: UUID,
        recipe_id: UUID,
        quality_config_update: Dict[str, Any],
        user_id: UUID
    ) -> Dict[str, Any]:
        """Update the quality configuration for a recipe"""
        try:
            # Get the current recipe
            recipe = await self.recipe_repo.get_recipe(tenant_id, recipe_id)
            if not recipe:
                raise ValueError("Recipe not found")

            # Get the existing quality configuration or create a default
            current_config = recipe.get("quality_check_configuration", {
                "stages": {},
                "overall_quality_threshold": 7.0,
                "critical_stage_blocking": True,
                "auto_create_quality_checks": True,
                "quality_manager_approval_required": False
            })

            # Merge in the updates
            if "stages" in quality_config_update:
                current_config["stages"].update(quality_config_update["stages"])

            for key in ["overall_quality_threshold", "critical_stage_blocking",
                        "auto_create_quality_checks", "quality_manager_approval_required"]:
                if key in quality_config_update:
                    current_config[key] = quality_config_update[key]

            # Persist the new configuration
            recipe_update = RecipeUpdate(quality_check_configuration=current_config)
            await self.recipe_repo.update_recipe(tenant_id, recipe_id, recipe_update, user_id)

            # Return the updated recipe
            return await self.recipe_repo.get_recipe(tenant_id, recipe_id)

        except Exception as e:
            logger.error(f"Error updating recipe quality configuration: {e}")
            raise

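The merge logic above is easiest to reason about in isolation. Below is a hedged, dependency-free sketch of the same shallow-merge behavior (`merge_quality_config` is a hypothetical helper, not part of the service):

```python
def merge_quality_config(current: dict, update: dict) -> dict:
    """Merge a quality-config update the way the service does:
    stage entries are merged key-by-key, scalar settings are
    overwritten only when present in the update."""
    merged = {**current, "stages": dict(current.get("stages", {}))}
    merged["stages"].update(update.get("stages", {}))
    for key in ("overall_quality_threshold", "critical_stage_blocking",
                "auto_create_quality_checks", "quality_manager_approval_required"):
        if key in update:
            merged[key] = update[key]
    return merged
```

Note that, unlike the in-place mutation in the service method, this sketch leaves the original configuration untouched.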
    async def add_quality_templates_to_stage(
        self,
        tenant_id: UUID,
        recipe_id: UUID,
        stage: str,
        template_ids: List[UUID],
        user_id: UUID
    ):
        """Add quality templates to a specific recipe stage"""
        try:
            # Get the current recipe
            recipe = await self.recipe_repo.get_recipe(tenant_id, recipe_id)
            if not recipe:
                raise ValueError("Recipe not found")

            # Get the existing quality configuration
            quality_config = recipe.get("quality_check_configuration", {"stages": {}})

            # Initialize the stage if it doesn't exist
            if stage not in quality_config["stages"]:
                quality_config["stages"][stage] = {
                    "template_ids": [],
                    "required_checks": [],
                    "optional_checks": [],
                    "blocking_on_failure": True,
                    "min_quality_score": None
                }

            # Add template IDs, avoiding duplicates
            stage_config = quality_config["stages"][stage]
            existing_ids = set(stage_config.get("template_ids", []))
            new_ids = [str(tid) for tid in template_ids if str(tid) not in existing_ids]
            stage_config["template_ids"].extend(new_ids)

            # Persist the updated configuration
            recipe_update = RecipeUpdate(quality_check_configuration=quality_config)
            await self.recipe_repo.update_recipe(tenant_id, recipe_id, recipe_update, user_id)

        except Exception as e:
            logger.error(f"Error adding quality templates to stage: {e}")
            raise

    async def remove_quality_template_from_stage(
        self,
        tenant_id: UUID,
        recipe_id: UUID,
        stage: str,
        template_id: UUID,
        user_id: UUID
    ):
        """Remove a quality template from a specific recipe stage"""
        try:
            # Get the current recipe
            recipe = await self.recipe_repo.get_recipe(tenant_id, recipe_id)
            if not recipe:
                raise ValueError("Recipe not found")

            # Get the existing quality configuration
            quality_config = recipe.get("quality_check_configuration", {"stages": {}})

            # Remove the template ID from the stage
            if stage in quality_config["stages"]:
                stage_config = quality_config["stages"][stage]
                template_ids = stage_config.get("template_ids", [])
                stage_config["template_ids"] = [tid for tid in template_ids if str(tid) != str(template_id)]

            # Persist the updated configuration
            recipe_update = RecipeUpdate(quality_check_configuration=quality_config)
            await self.recipe_repo.update_recipe(tenant_id, recipe_id, recipe_update, user_id)

        except Exception as e:
            logger.error(f"Error removing quality template from stage: {e}")
            raise
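The status-transition rules enforced in `update_recipe` can be restated as a small pure function, which makes them easy to unit-test without a database. This is a sketch only; the `RecipeStatus` values are assumed to mirror the service's enum:

```python
from enum import Enum

class RecipeStatus(str, Enum):
    # Assumed mirror of the app's RecipeStatus enum
    DRAFT = "draft"
    TESTING = "testing"
    ACTIVE = "active"
    ARCHIVED = "archived"
    DISCONTINUED = "discontinued"

def can_transition(current: RecipeStatus, new: RecipeStatus, ingredient_count: int) -> bool:
    """Pure restatement of the status rules enforced in update_recipe."""
    if current == RecipeStatus.DISCONTINUED and new != RecipeStatus.DISCONTINUED:
        return False  # discontinued recipes cannot be reactivated
    if new == RecipeStatus.ARCHIVED and current not in (RecipeStatus.ACTIVE, RecipeStatus.TESTING):
        return False  # only active or testing recipes can be archived
    if new == RecipeStatus.ACTIVE and current == RecipeStatus.DRAFT and ingredient_count == 0:
        return False  # drafts need at least one ingredient to go active
    return True
```

Keeping the rules in one place like this would also let the API layer validate a transition before touching the session.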
134
services/recipes/app/services/tenant_deletion_service.py
Normal file
@@ -0,0 +1,134 @@
"""
|
||||
Recipes Service - Tenant Data Deletion
|
||||
Handles deletion of all recipe-related data for a tenant
|
||||
"""
|
||||
from typing import Dict
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select, delete, func
|
||||
import structlog
|
||||
|
||||
from shared.services.tenant_deletion import BaseTenantDataDeletionService, TenantDataDeletionResult
|
||||
from app.models.recipes import Recipe, RecipeIngredient, ProductionBatch
|
||||
|
||||
logger = structlog.get_logger()
|
||||
|
||||
|
||||
class RecipesTenantDeletionService(BaseTenantDataDeletionService):
    """Service for deleting all recipe-related data for a tenant"""

    def __init__(self, db_session: AsyncSession):
        super().__init__("recipes-service")
        self.db = db_session

    async def get_tenant_data_preview(self, tenant_id: str) -> Dict[str, int]:
        """Get counts of what would be deleted"""
        try:
            preview = {}

            # Count recipes
            recipe_count = await self.db.scalar(
                select(func.count(Recipe.id)).where(Recipe.tenant_id == tenant_id)
            )
            preview["recipes"] = recipe_count or 0

            # Count recipe ingredients
            ingredient_count = await self.db.scalar(
                select(func.count(RecipeIngredient.id))
                .where(RecipeIngredient.tenant_id == tenant_id)
            )
            preview["recipe_ingredients"] = ingredient_count or 0

            # Count production batches
            batch_count = await self.db.scalar(
                select(func.count(ProductionBatch.id))
                .where(ProductionBatch.tenant_id == tenant_id)
            )
            preview["production_batches"] = batch_count or 0

            return preview

        except Exception as e:
            logger.error("Error getting deletion preview",
                         tenant_id=tenant_id,
                         error=str(e))
            return {}

    async def delete_tenant_data(self, tenant_id: str) -> TenantDataDeletionResult:
        """Delete all data for a tenant"""
        result = TenantDataDeletionResult(tenant_id, self.service_name)

        try:
            # Get a preview before deletion for reporting
            preview = await self.get_tenant_data_preview(tenant_id)

            # Delete production batches first (foreign key to recipes)
            try:
                batch_delete = await self.db.execute(
                    delete(ProductionBatch).where(ProductionBatch.tenant_id == tenant_id)
                )
                deleted_batches = batch_delete.rowcount
                result.add_deleted_items("production_batches", deleted_batches)

                logger.info("Deleted production batches for tenant",
                            tenant_id=tenant_id,
                            count=deleted_batches)

            except Exception as e:
                logger.error("Error deleting production batches",
                             tenant_id=tenant_id,
                             error=str(e))
                result.add_error(f"Production batch deletion: {str(e)}")

            # Delete recipe ingredients (foreign key to recipes)
            try:
                ingredient_delete = await self.db.execute(
                    delete(RecipeIngredient).where(RecipeIngredient.tenant_id == tenant_id)
                )
                deleted_ingredients = ingredient_delete.rowcount
                result.add_deleted_items("recipe_ingredients", deleted_ingredients)

                logger.info("Deleted recipe ingredients for tenant",
                            tenant_id=tenant_id,
                            count=deleted_ingredients)

            except Exception as e:
                logger.error("Error deleting recipe ingredients",
                             tenant_id=tenant_id,
                             error=str(e))
                result.add_error(f"Recipe ingredient deletion: {str(e)}")

            # Delete recipes (parent table)
            try:
                recipe_delete = await self.db.execute(
                    delete(Recipe).where(Recipe.tenant_id == tenant_id)
                )
                deleted_recipes = recipe_delete.rowcount
                result.add_deleted_items("recipes", deleted_recipes)

                logger.info("Deleted recipes for tenant",
                            tenant_id=tenant_id,
                            count=deleted_recipes)

            except Exception as e:
                logger.error("Error deleting recipes",
                             tenant_id=tenant_id,
                             error=str(e))
                result.add_error(f"Recipe deletion: {str(e)}")

            # Commit all deletions
            await self.db.commit()

            logger.info("Tenant data deletion completed",
                        tenant_id=tenant_id,
                        deleted_counts=result.deleted_counts)

        except Exception as e:
            logger.error("Fatal error during tenant data deletion",
                         tenant_id=tenant_id,
                         error=str(e))
            await self.db.rollback()
            result.add_error(f"Fatal error: {str(e)}")

        return result
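The deletion service runs child tables before the parent so no foreign-key references remain when the `recipes` rows go. That ordering can be captured as data, as in this minimal sketch (`deletion_plan` is a hypothetical helper, not part of the service):

```python
# Child tables must be cleared before the parent `recipes` table,
# matching the order used in delete_tenant_data above.
DELETION_ORDER = ["production_batches", "recipe_ingredients", "recipes"]

def deletion_plan(counts: dict) -> list:
    """Return (table, row_count) pairs in FK-safe order, skipping empty tables."""
    return [(table, counts[table]) for table in DELETION_ORDER if counts.get(table)]
```

Driving the per-table deletes from a single ordered list would keep the FK ordering in one place if more recipe-related tables are added later.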
141
services/recipes/migrations/env.py
Normal file
@@ -0,0 +1,141 @@
"""Alembic environment configuration for recipes service"""
|
||||
|
||||
import asyncio
|
||||
import os
|
||||
import sys
|
||||
from logging.config import fileConfig
|
||||
from sqlalchemy import pool
|
||||
from sqlalchemy.engine import Connection
|
||||
from sqlalchemy.ext.asyncio import async_engine_from_config
|
||||
from alembic import context
|
||||
|
||||
# Add the service directory to the Python path
|
||||
service_path = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
|
||||
if service_path not in sys.path:
|
||||
sys.path.insert(0, service_path)
|
||||
|
||||
# Add shared modules to path
|
||||
shared_path = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "shared"))
|
||||
if shared_path not in sys.path:
|
||||
sys.path.insert(0, shared_path)
|
||||
|
||||
try:
|
||||
from app.core.config import settings
|
||||
from shared.database.base import Base
|
||||
|
||||
# Import all models to ensure they are registered with Base.metadata
|
||||
from app.models import * # noqa: F401, F403
|
||||
|
||||
except ImportError as e:
|
||||
print(f"Import error in migrations env.py: {e}")
|
||||
print(f"Current Python path: {sys.path}")
|
||||
raise
|
||||
|
||||
# this is the Alembic Config object
|
||||
config = context.config
|
||||
|
||||
# Determine service name from file path
|
||||
service_name = os.path.basename(os.path.dirname(os.path.dirname(__file__)))
|
||||
service_name_upper = service_name.upper().replace('-', '_')
|
||||
|
||||
# Set database URL from environment variables with multiple fallback strategies
|
||||
database_url = (
|
||||
os.getenv(f'{service_name_upper}_DATABASE_URL') or # Service-specific
|
||||
os.getenv('DATABASE_URL') # Generic fallback
|
||||
)
|
||||
|
||||
# If DATABASE_URL is not set, construct from individual components
|
||||
if not database_url:
|
||||
# Try generic PostgreSQL environment variables first
|
||||
postgres_host = os.getenv('POSTGRES_HOST')
|
||||
postgres_port = os.getenv('POSTGRES_PORT', '5432')
|
||||
postgres_db = os.getenv('POSTGRES_DB')
|
||||
postgres_user = os.getenv('POSTGRES_USER')
|
||||
postgres_password = os.getenv('POSTGRES_PASSWORD')
|
||||
|
||||
if all([postgres_host, postgres_db, postgres_user, postgres_password]):
|
||||
database_url = f"postgresql+asyncpg://{postgres_user}:{postgres_password}@{postgres_host}:{postgres_port}/{postgres_db}"
|
||||
else:
|
||||
# Try service-specific environment variables
|
||||
db_host = os.getenv(f'{service_name_upper}_DB_HOST', f'{service_name}-db-service')
|
||||
db_port = os.getenv(f'{service_name_upper}_DB_PORT', '5432')
|
||||
db_name = os.getenv(f'{service_name_upper}_DB_NAME', f'{service_name.replace("-", "_")}_db')
|
||||
db_user = os.getenv(f'{service_name_upper}_DB_USER', f'{service_name.replace("-", "_")}_user')
|
||||
db_password = os.getenv(f'{service_name_upper}_DB_PASSWORD')
|
||||
|
||||
if db_password:
|
||||
database_url = f"postgresql+asyncpg://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}"
|
||||
else:
|
||||
# Final fallback: try to get from settings object
|
||||
try:
|
||||
database_url = getattr(settings, 'DATABASE_URL', None)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
if not database_url:
|
||||
error_msg = f"ERROR: No database URL configured for {service_name} service"
|
||||
print(error_msg)
|
||||
raise Exception(error_msg)
|
||||
|
||||
config.set_main_option("sqlalchemy.url", database_url)
|
||||
|
||||
# Interpret the config file for Python logging
|
||||
if config.config_file_name is not None:
|
||||
fileConfig(config.config_file_name)
|
||||
|
||||
# Set target metadata
|
||||
target_metadata = Base.metadata
|
||||
|
||||
|
||||
def run_migrations_offline() -> None:
|
||||
"""Run migrations in 'offline' mode."""
|
||||
url = config.get_main_option("sqlalchemy.url")
|
||||
context.configure(
|
||||
url=url,
|
||||
target_metadata=target_metadata,
|
||||
literal_binds=True,
|
||||
dialect_opts={"paramstyle": "named"},
|
||||
compare_type=True,
|
||||
compare_server_default=True,
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
def do_run_migrations(connection: Connection) -> None:
|
||||
"""Execute migrations with the given connection."""
|
||||
context.configure(
|
||||
connection=connection,
|
||||
target_metadata=target_metadata,
|
||||
compare_type=True,
|
||||
compare_server_default=True,
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
async def run_async_migrations() -> None:
|
||||
"""Run migrations in 'online' mode with async support."""
|
||||
connectable = async_engine_from_config(
|
||||
config.get_section(config.config_ini_section, {}),
|
||||
prefix="sqlalchemy.",
|
||||
poolclass=pool.NullPool,
|
||||
)
|
||||
|
||||
async with connectable.connect() as connection:
|
||||
await connection.run_sync(do_run_migrations)
|
||||
|
||||
await connectable.dispose()
|
||||
|
||||
|
||||
def run_migrations_online() -> None:
|
||||
"""Run migrations in 'online' mode."""
|
||||
asyncio.run(run_async_migrations())
|
||||
|
||||
|
||||
if context.is_offline_mode():
|
||||
run_migrations_offline()
|
||||
else:
|
||||
run_migrations_online()
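The URL-resolution chain in this `env.py` (service-specific URL, then generic URL, then a URL assembled from `POSTGRES_*` components) can be sketched as a testable function. This is an illustrative restatement under the same environment-variable names, not code from the repo:

```python
import os

def resolve_database_url(service_name: str, env=None):
    """Sketch of env.py's fallback chain for the asyncpg database URL."""
    env = os.environ if env is None else env
    prefix = service_name.upper().replace("-", "_")

    # 1) Explicit URLs win: service-specific, then generic
    url = env.get(f"{prefix}_DATABASE_URL") or env.get("DATABASE_URL")
    if url:
        return url

    # 2) Otherwise assemble from generic POSTGRES_* components
    host = env.get("POSTGRES_HOST")
    port = env.get("POSTGRES_PORT", "5432")
    db = env.get("POSTGRES_DB")
    user = env.get("POSTGRES_USER")
    password = env.get("POSTGRES_PASSWORD")
    if all([host, db, user, password]):
        return f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{db}"

    return None  # caller decides how to fail
```

Passing `env` as a plain dict makes the chain unit-testable without mutating `os.environ`.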
26
services/recipes/migrations/script.py.mako
Normal file
@@ -0,0 +1,26 @@
"""${message}
|
||||
|
||||
Revision ID: ${up_revision}
|
||||
Revises: ${down_revision | comma,n}
|
||||
Create Date: ${create_date}
|
||||
|
||||
"""
|
||||
from typing import Sequence, Union
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
${imports if imports else ""}
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision: str = ${repr(up_revision)}
|
||||
down_revision: Union[str, None] = ${repr(down_revision)}
|
||||
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
|
||||
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
${upgrades if upgrades else "pass"}
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
${downgrades if downgrades else "pass"}
|
||||
@@ -0,0 +1,334 @@
|
||||
"""initial_schema_20251015_1228
|
||||
|
||||
Revision ID: 3c4d0f57a312
|
||||
Revises:
|
||||
Create Date: 2025-10-15 12:28:57.066635+02:00
|
||||
|
||||
"""
|
||||
from typing import Sequence, Union
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
from sqlalchemy.dialects import postgresql
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision: str = '3c4d0f57a312'
|
||||
down_revision: Union[str, None] = None
|
||||
branch_labels: Union[str, Sequence[str], None] = None
|
||||
depends_on: Union[str, Sequence[str], None] = None
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
op.create_table('audit_logs',
|
||||
sa.Column('id', sa.UUID(), nullable=False),
|
||||
sa.Column('tenant_id', sa.UUID(), nullable=False),
|
||||
sa.Column('user_id', sa.UUID(), nullable=False),
|
||||
sa.Column('action', sa.String(length=100), nullable=False),
|
||||
sa.Column('resource_type', sa.String(length=100), nullable=False),
|
||||
sa.Column('resource_id', sa.String(length=255), nullable=True),
|
||||
sa.Column('severity', sa.String(length=20), nullable=False),
|
||||
sa.Column('service_name', sa.String(length=100), nullable=False),
|
||||
sa.Column('description', sa.Text(), nullable=True),
|
||||
sa.Column('changes', postgresql.JSON(astext_type=sa.Text()), nullable=True),
|
||||
sa.Column('audit_metadata', postgresql.JSON(astext_type=sa.Text()), nullable=True),
|
||||
sa.Column('ip_address', sa.String(length=45), nullable=True),
|
||||
sa.Column('user_agent', sa.Text(), nullable=True),
|
||||
sa.Column('endpoint', sa.String(length=255), nullable=True),
|
||||
sa.Column('method', sa.String(length=10), nullable=True),
|
||||
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
op.create_index('idx_audit_resource_type_action', 'audit_logs', ['resource_type', 'action'], unique=False)
|
||||
op.create_index('idx_audit_service_created', 'audit_logs', ['service_name', 'created_at'], unique=False)
|
||||
op.create_index('idx_audit_severity_created', 'audit_logs', ['severity', 'created_at'], unique=False)
|
||||
op.create_index('idx_audit_tenant_created', 'audit_logs', ['tenant_id', 'created_at'], unique=False)
|
||||
op.create_index('idx_audit_user_created', 'audit_logs', ['user_id', 'created_at'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_action'), 'audit_logs', ['action'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_created_at'), 'audit_logs', ['created_at'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_resource_id'), 'audit_logs', ['resource_id'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_resource_type'), 'audit_logs', ['resource_type'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_service_name'), 'audit_logs', ['service_name'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_severity'), 'audit_logs', ['severity'], unique=False)
|
||||
op.create_index(op.f('ix_audit_logs_tenant_id'), 'audit_logs', ['tenant_id'], unique=False)
|
||||
    op.create_index(op.f('ix_audit_logs_user_id'), 'audit_logs', ['user_id'], unique=False)
    op.create_table('production_schedules',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tenant_id', sa.UUID(), nullable=False),
    sa.Column('schedule_date', sa.DateTime(timezone=True), nullable=False),
    sa.Column('schedule_name', sa.String(length=255), nullable=True),
    sa.Column('total_planned_batches', sa.Integer(), nullable=False),
    sa.Column('total_planned_items', sa.Float(), nullable=False),
    sa.Column('estimated_production_hours', sa.Float(), nullable=True),
    sa.Column('estimated_material_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('is_published', sa.Boolean(), nullable=True),
    sa.Column('is_completed', sa.Boolean(), nullable=True),
    sa.Column('completion_percentage', sa.Float(), nullable=True),
    sa.Column('available_staff_hours', sa.Float(), nullable=True),
    sa.Column('oven_capacity_hours', sa.Float(), nullable=True),
    sa.Column('production_capacity_limit', sa.Float(), nullable=True),
    sa.Column('schedule_notes', sa.Text(), nullable=True),
    sa.Column('preparation_instructions', sa.Text(), nullable=True),
    sa.Column('special_requirements', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('created_by', sa.UUID(), nullable=True),
    sa.Column('published_by', sa.UUID(), nullable=True),
    sa.Column('published_at', sa.DateTime(timezone=True), nullable=True),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_production_schedules_completed', 'production_schedules', ['tenant_id', 'is_completed', 'schedule_date'], unique=False)
    op.create_index('idx_production_schedules_published', 'production_schedules', ['tenant_id', 'is_published', 'schedule_date'], unique=False)
    op.create_index('idx_production_schedules_tenant_date', 'production_schedules', ['tenant_id', 'schedule_date'], unique=False)
    op.create_index(op.f('ix_production_schedules_schedule_date'), 'production_schedules', ['schedule_date'], unique=False)
    op.create_index(op.f('ix_production_schedules_tenant_id'), 'production_schedules', ['tenant_id'], unique=False)
    op.create_table('recipes',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tenant_id', sa.UUID(), nullable=False),
    sa.Column('name', sa.String(length=255), nullable=False),
    sa.Column('recipe_code', sa.String(length=100), nullable=True),
    sa.Column('version', sa.String(length=20), nullable=False),
    sa.Column('finished_product_id', sa.UUID(), nullable=False),
    sa.Column('description', sa.Text(), nullable=True),
    sa.Column('category', sa.String(length=100), nullable=True),
    sa.Column('cuisine_type', sa.String(length=100), nullable=True),
    sa.Column('difficulty_level', sa.Integer(), nullable=False),
    sa.Column('yield_quantity', sa.Float(), nullable=False),
    sa.Column('yield_unit', sa.Enum('GRAMS', 'KILOGRAMS', 'MILLILITERS', 'LITERS', 'CUPS', 'TABLESPOONS', 'TEASPOONS', 'UNITS', 'PIECES', 'PERCENTAGE', name='measurementunit'), nullable=False),
    sa.Column('prep_time_minutes', sa.Integer(), nullable=True),
    sa.Column('cook_time_minutes', sa.Integer(), nullable=True),
    sa.Column('total_time_minutes', sa.Integer(), nullable=True),
    sa.Column('rest_time_minutes', sa.Integer(), nullable=True),
    sa.Column('estimated_cost_per_unit', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('last_calculated_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('cost_calculation_date', sa.DateTime(timezone=True), nullable=True),
    sa.Column('target_margin_percentage', sa.Float(), nullable=True),
    sa.Column('suggested_selling_price', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('instructions', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('preparation_notes', sa.Text(), nullable=True),
    sa.Column('storage_instructions', sa.Text(), nullable=True),
    sa.Column('quality_standards', sa.Text(), nullable=True),
    sa.Column('serves_count', sa.Integer(), nullable=True),
    sa.Column('nutritional_info', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('allergen_info', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('dietary_tags', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('batch_size_multiplier', sa.Float(), nullable=False),
    sa.Column('minimum_batch_size', sa.Float(), nullable=True),
    sa.Column('maximum_batch_size', sa.Float(), nullable=True),
    sa.Column('optimal_production_temperature', sa.Float(), nullable=True),
    sa.Column('optimal_humidity', sa.Float(), nullable=True),
    sa.Column('quality_check_points', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('quality_check_configuration', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('common_issues', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('status', sa.Enum('DRAFT', 'ACTIVE', 'TESTING', 'ARCHIVED', 'DISCONTINUED', name='recipestatus'), nullable=False),
    sa.Column('is_seasonal', sa.Boolean(), nullable=True),
    sa.Column('season_start_month', sa.Integer(), nullable=True),
    sa.Column('season_end_month', sa.Integer(), nullable=True),
    sa.Column('is_signature_item', sa.Boolean(), nullable=True),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('created_by', sa.UUID(), nullable=True),
    sa.Column('updated_by', sa.UUID(), nullable=True),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_recipes_category', 'recipes', ['tenant_id', 'category', 'status'], unique=False)
    op.create_index('idx_recipes_seasonal', 'recipes', ['tenant_id', 'is_seasonal', 'season_start_month', 'season_end_month'], unique=False)
    op.create_index('idx_recipes_signature', 'recipes', ['tenant_id', 'is_signature_item', 'status'], unique=False)
    op.create_index('idx_recipes_status', 'recipes', ['tenant_id', 'status'], unique=False)
    op.create_index('idx_recipes_tenant_name', 'recipes', ['tenant_id', 'name'], unique=False)
    op.create_index('idx_recipes_tenant_product', 'recipes', ['tenant_id', 'finished_product_id'], unique=False)
    op.create_index(op.f('ix_recipes_category'), 'recipes', ['category'], unique=False)
    op.create_index(op.f('ix_recipes_finished_product_id'), 'recipes', ['finished_product_id'], unique=False)
    op.create_index(op.f('ix_recipes_name'), 'recipes', ['name'], unique=False)
    op.create_index(op.f('ix_recipes_recipe_code'), 'recipes', ['recipe_code'], unique=False)
    op.create_index(op.f('ix_recipes_status'), 'recipes', ['status'], unique=False)
    op.create_index(op.f('ix_recipes_tenant_id'), 'recipes', ['tenant_id'], unique=False)
    op.create_table('production_batches',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tenant_id', sa.UUID(), nullable=False),
    sa.Column('recipe_id', sa.UUID(), nullable=False),
    sa.Column('batch_number', sa.String(length=100), nullable=False),
    sa.Column('production_date', sa.DateTime(timezone=True), nullable=False),
    sa.Column('planned_start_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('actual_start_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('planned_end_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('actual_end_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('planned_quantity', sa.Float(), nullable=False),
    sa.Column('actual_quantity', sa.Float(), nullable=True),
    sa.Column('yield_percentage', sa.Float(), nullable=True),
    sa.Column('batch_size_multiplier', sa.Float(), nullable=False),
    sa.Column('status', sa.Enum('PLANNED', 'IN_PROGRESS', 'COMPLETED', 'FAILED', 'CANCELLED', name='productionstatus'), nullable=False),
    sa.Column('priority', sa.Enum('LOW', 'NORMAL', 'HIGH', 'URGENT', name='productionpriority'), nullable=False),
    sa.Column('assigned_staff', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('production_notes', sa.Text(), nullable=True),
    sa.Column('quality_score', sa.Float(), nullable=True),
    sa.Column('quality_notes', sa.Text(), nullable=True),
    sa.Column('defect_rate', sa.Float(), nullable=True),
    sa.Column('rework_required', sa.Boolean(), nullable=True),
    sa.Column('planned_material_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('actual_material_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('labor_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('overhead_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('total_production_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('cost_per_unit', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('production_temperature', sa.Float(), nullable=True),
    sa.Column('production_humidity', sa.Float(), nullable=True),
    sa.Column('oven_temperature', sa.Float(), nullable=True),
    sa.Column('baking_time_minutes', sa.Integer(), nullable=True),
    sa.Column('waste_quantity', sa.Float(), nullable=False),
    sa.Column('waste_reason', sa.String(length=255), nullable=True),
    sa.Column('efficiency_percentage', sa.Float(), nullable=True),
    sa.Column('customer_order_reference', sa.String(length=100), nullable=True),
    sa.Column('pre_order_quantity', sa.Float(), nullable=True),
    sa.Column('shelf_quantity', sa.Float(), nullable=True),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
    sa.Column('created_by', sa.UUID(), nullable=True),
    sa.Column('completed_by', sa.UUID(), nullable=True),
    sa.ForeignKeyConstraint(['recipe_id'], ['recipes.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_production_batches_batch_number', 'production_batches', ['tenant_id', 'batch_number'], unique=False)
    op.create_index('idx_production_batches_priority', 'production_batches', ['tenant_id', 'priority', 'planned_start_time'], unique=False)
    op.create_index('idx_production_batches_recipe', 'production_batches', ['recipe_id', 'production_date'], unique=False)
    op.create_index('idx_production_batches_status', 'production_batches', ['tenant_id', 'status', 'production_date'], unique=False)
    op.create_index('idx_production_batches_tenant_date', 'production_batches', ['tenant_id', 'production_date'], unique=False)
    op.create_index(op.f('ix_production_batches_batch_number'), 'production_batches', ['batch_number'], unique=False)
    op.create_index(op.f('ix_production_batches_production_date'), 'production_batches', ['production_date'], unique=False)
    op.create_index(op.f('ix_production_batches_recipe_id'), 'production_batches', ['recipe_id'], unique=False)
    op.create_index(op.f('ix_production_batches_status'), 'production_batches', ['status'], unique=False)
    op.create_index(op.f('ix_production_batches_tenant_id'), 'production_batches', ['tenant_id'], unique=False)
    op.create_table('recipe_ingredients',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tenant_id', sa.UUID(), nullable=False),
    sa.Column('recipe_id', sa.UUID(), nullable=False),
    sa.Column('ingredient_id', sa.UUID(), nullable=False),
    sa.Column('quantity', sa.Float(), nullable=False),
    sa.Column('unit', sa.Enum('GRAMS', 'KILOGRAMS', 'MILLILITERS', 'LITERS', 'CUPS', 'TABLESPOONS', 'TEASPOONS', 'UNITS', 'PIECES', 'PERCENTAGE', name='measurementunit'), nullable=False),
    sa.Column('quantity_in_base_unit', sa.Float(), nullable=True),
    sa.Column('alternative_quantity', sa.Float(), nullable=True),
    sa.Column('alternative_unit', sa.Enum('GRAMS', 'KILOGRAMS', 'MILLILITERS', 'LITERS', 'CUPS', 'TABLESPOONS', 'TEASPOONS', 'UNITS', 'PIECES', 'PERCENTAGE', name='measurementunit'), nullable=True),
    sa.Column('preparation_method', sa.String(length=255), nullable=True),
    sa.Column('ingredient_notes', sa.Text(), nullable=True),
    sa.Column('is_optional', sa.Boolean(), nullable=True),
    sa.Column('ingredient_order', sa.Integer(), nullable=False),
    sa.Column('ingredient_group', sa.String(length=100), nullable=True),
    sa.Column('substitution_options', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
    sa.Column('substitution_ratio', sa.Float(), nullable=True),
    sa.Column('unit_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('total_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('cost_updated_at', sa.DateTime(timezone=True), nullable=True),
    sa.ForeignKeyConstraint(['recipe_id'], ['recipes.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_recipe_ingredients_group', 'recipe_ingredients', ['recipe_id', 'ingredient_group', 'ingredient_order'], unique=False)
    op.create_index('idx_recipe_ingredients_ingredient', 'recipe_ingredients', ['ingredient_id'], unique=False)
    op.create_index('idx_recipe_ingredients_recipe', 'recipe_ingredients', ['recipe_id', 'ingredient_order'], unique=False)
    op.create_index('idx_recipe_ingredients_tenant', 'recipe_ingredients', ['tenant_id', 'recipe_id'], unique=False)
    op.create_index(op.f('ix_recipe_ingredients_ingredient_id'), 'recipe_ingredients', ['ingredient_id'], unique=False)
    op.create_index(op.f('ix_recipe_ingredients_recipe_id'), 'recipe_ingredients', ['recipe_id'], unique=False)
    op.create_index(op.f('ix_recipe_ingredients_tenant_id'), 'recipe_ingredients', ['tenant_id'], unique=False)
    op.create_table('production_ingredient_consumption',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tenant_id', sa.UUID(), nullable=False),
    sa.Column('production_batch_id', sa.UUID(), nullable=False),
    sa.Column('recipe_ingredient_id', sa.UUID(), nullable=False),
    sa.Column('ingredient_id', sa.UUID(), nullable=False),
    sa.Column('stock_id', sa.UUID(), nullable=True),
    sa.Column('planned_quantity', sa.Float(), nullable=False),
    sa.Column('actual_quantity', sa.Float(), nullable=False),
    sa.Column('unit', sa.Enum('GRAMS', 'KILOGRAMS', 'MILLILITERS', 'LITERS', 'CUPS', 'TABLESPOONS', 'TEASPOONS', 'UNITS', 'PIECES', 'PERCENTAGE', name='measurementunit'), nullable=False),
    sa.Column('variance_quantity', sa.Float(), nullable=True),
    sa.Column('variance_percentage', sa.Float(), nullable=True),
    sa.Column('unit_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('total_cost', sa.Numeric(precision=10, scale=2), nullable=True),
    sa.Column('consumption_time', sa.DateTime(timezone=True), nullable=False),
    sa.Column('consumption_notes', sa.Text(), nullable=True),
    sa.Column('staff_member', sa.UUID(), nullable=True),
    sa.Column('ingredient_condition', sa.String(length=50), nullable=True),
    sa.Column('quality_impact', sa.String(length=255), nullable=True),
    sa.Column('substitution_used', sa.Boolean(), nullable=True),
    sa.Column('substitution_details', sa.Text(), nullable=True),
    sa.ForeignKeyConstraint(['production_batch_id'], ['production_batches.id'], ),
    sa.ForeignKeyConstraint(['recipe_ingredient_id'], ['recipe_ingredients.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_consumption_batch', 'production_ingredient_consumption', ['production_batch_id'], unique=False)
    op.create_index('idx_consumption_ingredient', 'production_ingredient_consumption', ['ingredient_id', 'consumption_time'], unique=False)
    op.create_index('idx_consumption_recipe_ingredient', 'production_ingredient_consumption', ['recipe_ingredient_id'], unique=False)
    op.create_index('idx_consumption_stock', 'production_ingredient_consumption', ['stock_id'], unique=False)
    op.create_index('idx_consumption_tenant', 'production_ingredient_consumption', ['tenant_id', 'consumption_time'], unique=False)
    op.create_index(op.f('ix_production_ingredient_consumption_ingredient_id'), 'production_ingredient_consumption', ['ingredient_id'], unique=False)
    op.create_index(op.f('ix_production_ingredient_consumption_production_batch_id'), 'production_ingredient_consumption', ['production_batch_id'], unique=False)
    op.create_index(op.f('ix_production_ingredient_consumption_recipe_ingredient_id'), 'production_ingredient_consumption', ['recipe_ingredient_id'], unique=False)
    op.create_index(op.f('ix_production_ingredient_consumption_stock_id'), 'production_ingredient_consumption', ['stock_id'], unique=False)
    op.create_index(op.f('ix_production_ingredient_consumption_tenant_id'), 'production_ingredient_consumption', ['tenant_id'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_production_ingredient_consumption_tenant_id'), table_name='production_ingredient_consumption')
    op.drop_index(op.f('ix_production_ingredient_consumption_stock_id'), table_name='production_ingredient_consumption')
    op.drop_index(op.f('ix_production_ingredient_consumption_recipe_ingredient_id'), table_name='production_ingredient_consumption')
    op.drop_index(op.f('ix_production_ingredient_consumption_production_batch_id'), table_name='production_ingredient_consumption')
    op.drop_index(op.f('ix_production_ingredient_consumption_ingredient_id'), table_name='production_ingredient_consumption')
    op.drop_index('idx_consumption_tenant', table_name='production_ingredient_consumption')
    op.drop_index('idx_consumption_stock', table_name='production_ingredient_consumption')
    op.drop_index('idx_consumption_recipe_ingredient', table_name='production_ingredient_consumption')
    op.drop_index('idx_consumption_ingredient', table_name='production_ingredient_consumption')
    op.drop_index('idx_consumption_batch', table_name='production_ingredient_consumption')
    op.drop_table('production_ingredient_consumption')
    op.drop_index(op.f('ix_recipe_ingredients_tenant_id'), table_name='recipe_ingredients')
    op.drop_index(op.f('ix_recipe_ingredients_recipe_id'), table_name='recipe_ingredients')
    op.drop_index(op.f('ix_recipe_ingredients_ingredient_id'), table_name='recipe_ingredients')
    op.drop_index('idx_recipe_ingredients_tenant', table_name='recipe_ingredients')
    op.drop_index('idx_recipe_ingredients_recipe', table_name='recipe_ingredients')
    op.drop_index('idx_recipe_ingredients_ingredient', table_name='recipe_ingredients')
    op.drop_index('idx_recipe_ingredients_group', table_name='recipe_ingredients')
    op.drop_table('recipe_ingredients')
    op.drop_index(op.f('ix_production_batches_tenant_id'), table_name='production_batches')
    op.drop_index(op.f('ix_production_batches_status'), table_name='production_batches')
    op.drop_index(op.f('ix_production_batches_recipe_id'), table_name='production_batches')
    op.drop_index(op.f('ix_production_batches_production_date'), table_name='production_batches')
    op.drop_index(op.f('ix_production_batches_batch_number'), table_name='production_batches')
    op.drop_index('idx_production_batches_tenant_date', table_name='production_batches')
    op.drop_index('idx_production_batches_status', table_name='production_batches')
    op.drop_index('idx_production_batches_recipe', table_name='production_batches')
    op.drop_index('idx_production_batches_priority', table_name='production_batches')
    op.drop_index('idx_production_batches_batch_number', table_name='production_batches')
    op.drop_table('production_batches')
    op.drop_index(op.f('ix_recipes_tenant_id'), table_name='recipes')
    op.drop_index(op.f('ix_recipes_status'), table_name='recipes')
    op.drop_index(op.f('ix_recipes_recipe_code'), table_name='recipes')
    op.drop_index(op.f('ix_recipes_name'), table_name='recipes')
    op.drop_index(op.f('ix_recipes_finished_product_id'), table_name='recipes')
    op.drop_index(op.f('ix_recipes_category'), table_name='recipes')
    op.drop_index('idx_recipes_tenant_product', table_name='recipes')
    op.drop_index('idx_recipes_tenant_name', table_name='recipes')
    op.drop_index('idx_recipes_status', table_name='recipes')
    op.drop_index('idx_recipes_signature', table_name='recipes')
    op.drop_index('idx_recipes_seasonal', table_name='recipes')
    op.drop_index('idx_recipes_category', table_name='recipes')
    op.drop_table('recipes')
    op.drop_index(op.f('ix_production_schedules_tenant_id'), table_name='production_schedules')
    op.drop_index(op.f('ix_production_schedules_schedule_date'), table_name='production_schedules')
    op.drop_index('idx_production_schedules_tenant_date', table_name='production_schedules')
    op.drop_index('idx_production_schedules_published', table_name='production_schedules')
    op.drop_index('idx_production_schedules_completed', table_name='production_schedules')
    op.drop_table('production_schedules')
    op.drop_index(op.f('ix_audit_logs_user_id'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_tenant_id'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_severity'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_service_name'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_resource_type'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_resource_id'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_created_at'), table_name='audit_logs')
    op.drop_index(op.f('ix_audit_logs_action'), table_name='audit_logs')
    op.drop_index('idx_audit_user_created', table_name='audit_logs')
    op.drop_index('idx_audit_tenant_created', table_name='audit_logs')
    op.drop_index('idx_audit_severity_created', table_name='audit_logs')
    op.drop_index('idx_audit_service_created', table_name='audit_logs')
    op.drop_index('idx_audit_resource_type_action', table_name='audit_logs')
    op.drop_table('audit_logs')
    # ### end Alembic commands ###
@@ -0,0 +1,34 @@
"""remove legacy quality fields

Revision ID: 20251027_remove_quality
Revises: 3c4d0f57a312
Create Date: 2025-10-27

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql


# revision identifiers, used by Alembic.
revision = '20251027_remove_quality'
down_revision = '3c4d0f57a312'
branch_labels = None
depends_on = None


def upgrade() -> None:
    """Remove deprecated quality fields from recipes table"""
    # Drop columns that are no longer used
    # Using batch operations for safer column drops
    with op.batch_alter_table('recipes', schema=None) as batch_op:
        batch_op.drop_column('quality_standards')
        batch_op.drop_column('quality_check_points')
        batch_op.drop_column('common_issues')


def downgrade() -> None:
    """Restore deprecated quality fields (for rollback purposes only)"""
    # Add back the columns in case of rollback
    op.add_column('recipes', sa.Column('quality_standards', sa.Text(), nullable=True))
    op.add_column('recipes', sa.Column('quality_check_points', postgresql.JSONB(astext_type=sa.Text()), nullable=True))
    op.add_column('recipes', sa.Column('common_issues', postgresql.JSONB(astext_type=sa.Text()), nullable=True))
62
services/recipes/requirements.txt
Normal file
@@ -0,0 +1,62 @@
# Recipe Service Dependencies

# FastAPI and server
fastapi==0.119.0
uvicorn[standard]==0.32.1
python-multipart==0.0.6

# Database
sqlalchemy==2.0.44
psycopg2-binary==2.9.10
asyncpg==0.30.0
alembic==1.17.0

# Data validation
pydantic==2.12.3
pydantic-settings==2.7.1
email-validator==2.2.0

# HTTP requests
httpx==0.28.1
requests==2.32.3

# Async support
asyncio-mqtt==0.16.2
aiofiles==24.1.0

# Messaging
aio-pika==9.4.3

# Caching
redis==6.4.0
python-redis-cache==0.1.0

# Monitoring and logging
structlog==25.4.0
python-json-logger==3.3.0
psutil==5.9.8

# Date/time handling
python-dateutil==2.9.0.post0
pytz==2024.2

# Data processing
pandas==2.2.3
numpy==2.2.2

# Authentication and security
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4
cryptography==44.0.0

# Environment management
python-dotenv==1.0.1

# Testing
pytest==8.3.4
pytest-asyncio==0.25.2

# Development
black==24.10.0
flake8==7.1.1
mypy==1.14.1
447
services/recipes/scripts/demo/recetas_es.json
Normal file
@@ -0,0 +1,447 @@
{
  "recetas": [
    {
      "id": "30000000-0000-0000-0000-000000000001",
      "finished_product_id": "20000000-0000-0000-0000-000000000001",
      "name": "Baguette Francesa Tradicional",
      "category": "Panes",
      "cuisine_type": "Francesa",
      "difficulty_level": 2,
      "yield_quantity": 10.0,
      "yield_unit": "units",
      "prep_time_minutes": 20,
      "cook_time_minutes": 25,
      "total_time_minutes": 165,
      "rest_time_minutes": 120,
      "description": "Baguette francesa tradicional con corteza crujiente y miga alveolada. Perfecta para acompañar cualquier comida.",
      "instructions": {
        "steps": [
          {
            "step": 1,
            "title": "Amasado",
            "description": "Mezclar harina, agua, sal y levadura. Amasar durante 15 minutos hasta obtener una masa lisa y elástica.",
            "duration_minutes": 15
          },
          {
            "step": 2,
            "title": "Primera Fermentación",
            "description": "Dejar reposar la masa en un recipiente tapado durante 60 minutos a temperatura ambiente (22-24°C).",
            "duration_minutes": 60
          },
          {
            "step": 3,
            "title": "División y Formado",
            "description": "Dividir la masa en 10 piezas de 250g cada una. Formar las baguettes dándoles la forma alargada característica.",
            "duration_minutes": 20
          },
          {
            "step": 4,
            "title": "Segunda Fermentación",
            "description": "Colocar las baguettes en un lienzo enharinado y dejar fermentar 60 minutos más.",
            "duration_minutes": 60
          },
          {
            "step": 5,
            "title": "Greñado y Horneado",
            "description": "Hacer cortes diagonales en la superficie con una cuchilla. Hornear a 240°C con vapor inicial durante 25 minutos.",
            "duration_minutes": 25
          }
        ]
      },
      "preparation_notes": "Es crucial usar vapor al inicio del horneado para lograr una corteza crujiente. La temperatura del agua debe estar entre 18-20°C.",
      "storage_instructions": "Consumir el mismo día de producción. Se puede congelar después del horneado.",

      "is_seasonal": false,
      "is_signature_item": true,
      "ingredientes": [
        {
          "ingredient_id": "10000000-0000-0000-0000-000000000001",
          "ingredient_sku": "HAR-T55-001",
          "quantity": 1000.0,
          "unit": "g",
          "preparation_method": "tamizada",
          "ingredient_order": 1,
          "ingredient_group": "Secos"
        },
        {
          "ingredient_id": "10000000-0000-0000-0000-000000000033",
          "ingredient_sku": "BAS-AGU-003",
          "quantity": 650.0,
          "unit": "ml",
          "preparation_method": "temperatura ambiente",
          "ingredient_order": 2,
          "ingredient_group": "Líquidos"
        },
        {
          "ingredient_id": "10000000-0000-0000-0000-000000000031",
          "ingredient_sku": "BAS-SAL-001",
          "quantity": 20.0,
          "unit": "g",
          "ingredient_order": 3,
          "ingredient_group": "Secos"
        },
        {
          "ingredient_id": "10000000-0000-0000-0000-000000000021",
          "ingredient_sku": "LEV-FRE-001",
          "quantity": 15.0,
          "unit": "g",
          "preparation_method": "desmenuzada",
          "ingredient_order": 4,
          "ingredient_group": "Fermentos"
        }
      ]
    },
    {
      "id": "30000000-0000-0000-0000-000000000002",
      "finished_product_id": "20000000-0000-0000-0000-000000000002",
      "name": "Croissant de Mantequilla Artesanal",
      "category": "Bollería",
      "cuisine_type": "Francesa",
      "difficulty_level": 4,
      "yield_quantity": 12.0,
      "yield_unit": "units",
      "prep_time_minutes": 45,
      "cook_time_minutes": 18,
      "total_time_minutes": 333,
      "rest_time_minutes": 270,
      "description": "Croissant de mantequilla con laminado perfecto y textura hojaldrada. Elaboración artesanal con mantequilla de alta calidad.",
      "instructions": {
        "steps": [
          {
            "step": 1,
            "title": "Preparación de la Masa Base",
            "description": "Mezclar todos los ingredientes excepto la mantequilla de laminado. Amasar hasta obtener una masa homogénea.",
            "duration_minutes": 20
          },
          {
            "step": 2,
            "title": "Reposo en Frío",
            "description": "Envolver la masa en film y refrigerar durante 2 horas.",
            "duration_minutes": 120
          },
          {
            "step": 3,
            "title": "Laminado",
            "description": "Extender la masa en rectángulo. Colocar la mantequilla en el centro y hacer 3 dobleces sencillos con 30 minutos de reposo entre cada uno.",
            "duration_minutes": 90
          },
          {
            "step": 4,
            "title": "Formado",
            "description": "Extender a 3mm de grosor, cortar triángulos y enrollar para formar los croissants.",
            "duration_minutes": 25
          },
          {
            "step": 5,
            "title": "Fermentación Final",
            "description": "Dejar fermentar a 26°C durante 2-3 horas hasta que dupliquen su volumen.",
            "duration_minutes": 150
          },
          {
            "step": 6,
            "title": "Horneado",
            "description": "Pintar con huevo batido y hornear a 200°C durante 18 minutos hasta dorar.",
            "duration_minutes": 18
          }
        ]
      },
      "preparation_notes": "La mantequilla para laminar debe estar a 15-16°C, flexible pero no blanda. Trabajar en ambiente fresco.",
      "storage_instructions": "Consumir el día de producción. Se puede congelar la masa formada antes de la fermentación final.",

      "is_seasonal": false,
      "is_signature_item": true,
      "ingredientes": [
        {
          "ingredient_sku": "HAR-T55-001",
          "quantity": 500.0,
          "unit": "g",
          "ingredient_order": 1,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000001"
        },
        {
          "ingredient_sku": "LAC-LEC-002",
          "quantity": 120.0,
          "unit": "ml",
          "preparation_method": "tibia",
          "ingredient_order": 2,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000012"
        },
        {
          "ingredient_sku": "BAS-AGU-003",
          "quantity": 80.0,
          "unit": "ml",
          "ingredient_order": 3,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000033"
        },
        {
          "ingredient_sku": "BAS-AZU-002",
          "quantity": 50.0,
          "unit": "g",
          "ingredient_order": 4,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000032"
        },
        {
          "ingredient_sku": "BAS-SAL-001",
          "quantity": 10.0,
          "unit": "g",
          "ingredient_order": 5,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000031"
        },
        {
          "ingredient_sku": "LEV-FRE-001",
          "quantity": 20.0,
          "unit": "g",
          "ingredient_order": 6,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000021"
        },
        {
          "ingredient_sku": "LAC-MAN-001",
          "quantity": 25.0,
          "unit": "g",
          "preparation_method": "en la masa",
          "ingredient_order": 7,
          "ingredient_group": "Masa base",
          "ingredient_id": "10000000-0000-0000-0000-000000000011"
        },
        {
          "ingredient_sku": "LAC-MAN-001",
          "quantity": 250.0,
          "unit": "g",
          "preparation_method": "para laminar (15-16°C)",
          "ingredient_order": 8,
          "ingredient_group": "Laminado",
          "ingredient_id": "10000000-0000-0000-0000-000000000011"
        }
      ]
    },
    {
      "name": "Pan de Pueblo con Masa Madre",
      "category": "Panes Artesanales",
      "cuisine_type": "Española",
      "difficulty_level": 3,
      "yield_quantity": 4.0,
      "yield_unit": "units",
      "prep_time_minutes": 30,
      "cook_time_minutes": 45,
      "total_time_minutes": 435,
      "rest_time_minutes": 360,
      "description": "Hogaza de pan rústico elaborada con masa madre natural. Corteza gruesa y miga densa con sabor ligeramente ácido.",
      "instructions": {
        "steps": [
          {
            "step": 1,
            "title": "Autolisis",
            "description": "Mezclar harinas y agua, dejar reposar 30 minutos para desarrollar el gluten.",
            "duration_minutes": 30
          },
          {
            "step": 2,
            "title": "Incorporación de Masa Madre y Sal",
            "description": "Añadir la masa madre y la sal. Amasar suavemente hasta integrar completamente.",
            "duration_minutes": 15
          },
          {
            "step": 3,
            "title": "Fermentación en Bloque con Pliegues",
            "description": "Realizar 4 series de pliegues cada 30 minutos durante las primeras 2 horas. Luego dejar reposar 2 horas más.",
            "duration_minutes": 240
          },
          {
            "step": 4,
            "title": "División y Preformado",
            "description": "Dividir en 4 piezas de 800g. Preformar en bolas y dejar reposar 30 minutos.",
            "duration_minutes": 30
          },
          {
            "step": 5,
            "title": "Formado Final",
            "description": "Formar las hogazas dándoles tensión superficial. Colocar en banneton o lienzo enharinado.",
            "duration_minutes": 15
          },
          {
            "step": 6,
            "title": "Fermentación Final",
            "description": "Dejar fermentar a temperatura ambiente durante 2 horas o en frío durante la noche.",
            "duration_minutes": 120
          },
          {
            "step": 7,
            "title": "Horneado",
            "description": "Hacer cortes en la superficie. Hornear a 230°C con vapor inicial durante 45 minutos.",
            "duration_minutes": 45
          }
        ]
      },
      "preparation_notes": "La masa madre debe estar activa y en su punto óptimo. La temperatura final de la masa debe ser 24-25°C.",
      "storage_instructions": "Se conserva hasta 5-7 días en bolsa de papel. Mejora al segundo día.
|
||||
|
||||
"is_seasonal": false,
|
||||
"is_signature_item": true,
|
||||
"ingredientes": [
|
||||
{
|
||||
"ingredient_sku": "HAR-T65-002",
|
||||
"quantity": 800.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 1,
|
||||
"ingredient_group": "Harinas",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000002"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "HAR-INT-004",
|
||||
"quantity": 200.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 2,
|
||||
"ingredient_group": "Harinas",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000004"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "LEV-MAD-003",
|
||||
"quantity": 300.0,
|
||||
"unit": "g",
|
||||
"preparation_method": "activa y alimentada",
|
||||
"ingredient_order": 3,
|
||||
"ingredient_group": "Fermentos",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000023"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "BAS-AGU-003",
|
||||
"quantity": 650.0,
|
||||
"unit": "ml",
|
||||
"preparation_method": "temperatura ambiente",
|
||||
"ingredient_order": 4,
|
||||
"ingredient_group": "Líquidos",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000033"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "BAS-SAL-001",
|
||||
"quantity": 22.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 5,
|
||||
"ingredient_group": "Condimentos",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000031"
|
||||
}
|
||||
],
|
||||
"id": "30000000-0000-0000-0000-000000000003",
|
||||
"finished_product_id": "20000000-0000-0000-0000-000000000003"
|
||||
},
|
||||
{
|
||||
"name": "Napolitana de Chocolate",
|
||||
"category": "Bollería",
|
||||
"cuisine_type": "Española",
|
||||
"difficulty_level": 3,
|
||||
"yield_quantity": 16.0,
|
||||
"yield_unit": "units",
|
||||
"prep_time_minutes": 40,
|
||||
"cook_time_minutes": 15,
|
||||
"total_time_minutes": 325,
|
||||
"rest_time_minutes": 270,
|
||||
"description": "Bollería de hojaldre rectangular rellena de chocolate. Clásico de las panaderías españolas.",
|
||||
"instructions": {
|
||||
"steps": [
|
||||
{
|
||||
"step": 1,
|
||||
"title": "Masa Base y Laminado",
|
||||
"description": "Preparar masa de hojaldre siguiendo el mismo proceso que los croissants.",
|
||||
"duration_minutes": 180
|
||||
},
|
||||
{
|
||||
"step": 2,
|
||||
"title": "Corte y Formado",
|
||||
"description": "Extender la masa y cortar rectángulos de 10x15cm. Colocar barritas de chocolate en el centro.",
|
||||
"duration_minutes": 20
|
||||
},
|
||||
{
|
||||
"step": 3,
|
||||
"title": "Sellado",
|
||||
"description": "Doblar la masa sobre sí misma para cubrir el chocolate. Sellar bien los bordes.",
|
||||
"duration_minutes": 20
|
||||
},
|
||||
{
|
||||
"step": 4,
|
||||
"title": "Fermentación",
|
||||
"description": "Dejar fermentar a 26°C durante 90 minutos.",
|
||||
"duration_minutes": 90
|
||||
},
|
||||
{
|
||||
"step": 5,
|
||||
"title": "Horneado",
|
||||
"description": "Pintar con huevo y hornear a 190°C durante 15 minutos.",
|
||||
"duration_minutes": 15
|
||||
}
|
||||
]
|
||||
},
|
||||
"preparation_notes": "El chocolate debe ser de buena calidad para un mejor resultado. No sobrecargar de chocolate.",
|
||||
"storage_instructions": "Consumir preferiblemente el día de producción.",
|
||||
|
||||
"is_seasonal": false,
|
||||
"is_signature_item": false,
|
||||
"ingredientes": [
|
||||
{
|
||||
"ingredient_sku": "HAR-T55-001",
|
||||
"quantity": 500.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 1,
|
||||
"ingredient_group": "Masa",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000001"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "LAC-MAN-001",
|
||||
"quantity": 300.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 2,
|
||||
"ingredient_group": "Laminado",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000011"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "ESP-CHO-001",
|
||||
"quantity": 200.0,
|
||||
"unit": "g",
|
||||
"preparation_method": "en barritas",
|
||||
"ingredient_order": 3,
|
||||
"ingredient_group": "Relleno",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000041"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "BAS-AZU-002",
|
||||
"quantity": 60.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 4,
|
||||
"ingredient_group": "Masa",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000032"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "BAS-SAL-001",
|
||||
"quantity": 10.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 5,
|
||||
"ingredient_group": "Masa",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000031"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "LEV-FRE-001",
|
||||
"quantity": 15.0,
|
||||
"unit": "g",
|
||||
"ingredient_order": 6,
|
||||
"ingredient_group": "Masa",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000021"
|
||||
},
|
||||
{
|
||||
"ingredient_sku": "LAC-LEC-002",
|
||||
"quantity": 150.0,
|
||||
"unit": "ml",
|
||||
"ingredient_order": 7,
|
||||
"ingredient_group": "Masa",
|
||||
"ingredient_id": "10000000-0000-0000-0000-000000000012"
|
||||
}
|
||||
],
|
||||
"id": "30000000-0000-0000-0000-000000000004",
|
||||
"finished_product_id": "20000000-0000-0000-0000-000000000004"
|
||||
}
|
||||
]
|
||||
}