REFACTOR data service
216
MVP_GAP_ANALYSIS_REPORT.md
Normal file
@@ -0,0 +1,216 @@
# Bakery AI Platform - MVP Gap Analysis Report

## Executive Summary

Based on the detailed bakery research report and analysis of the current platform, this document identifies critical missing features that are preventing the platform from delivering value to Madrid's small bakery owners. While the platform has a solid technical foundation with microservices architecture and AI forecasting capabilities, it lacks several core operational features that are essential for day-to-day bakery management.

## Current Platform Status

### ✅ **Implemented Features**

#### Backend Services (Functional)
- **Authentication Service**: Complete user registration, login, JWT tokens, role-based access
- **Tenant Service**: Multi-tenant architecture, subscription management, team member access
- **Training Service**: ML model training using Prophet for demand forecasting
- **Forecasting Service**: AI-powered demand predictions and alerts
- **Data Service**: Weather data integration (AEMET), traffic data, external data processing
- **Notification Service**: Email and WhatsApp notifications
- **API Gateway**: Centralized routing, rate limiting, service discovery

#### Frontend Features (Functional)
- **Dashboard**: Revenue metrics, weather display, production overview
- **Authentication**: Login/registration pages with proper validation
- **Forecasting**: Demand prediction visualizations, forecast charts
- **Production Planning**: Basic production scheduling interface
- **Order Management**: Mock order display with supplier information
- **Settings**: User profile and basic configuration

#### Technical Infrastructure
- Microservices architecture with Docker containerization
- PostgreSQL databases per service with proper migrations
- RabbitMQ message queuing for inter-service communication
- Monitoring with Prometheus and Grafana
- Comprehensive error handling and logging

### ❌ **Critical Missing Features for MVP Launch**

## 1. **INVENTORY MANAGEMENT SYSTEM** 🚨 **HIGHEST PRIORITY**

### **Problem Identified**:
According to the bakery research, manual inventory tracking is described as "too cumbersome," "time-consuming," and highly susceptible to "mistakes." This leads to:
- 1.5% to 20% losses due to spoilage and waste
- Production delays during peak hours
- Quality inconsistencies
- Lost sales opportunities

### **Missing Components**:
- **Ingredient tracking**: Real-time stock levels for flour, yeast, dairy products
- **Automatic reordering**: Low-stock reorder triggers combined with FIFO/FEFO expiration date management
- **Spoilage monitoring**: Track and predict ingredient expiration
- **Stock alerts**: Low stock warnings integrated with production planning
- **Barcode/QR scanning**: Easy inventory updates without manual entry
- **Supplier integration**: Automated ordering from suppliers like Harinas Castellana

### **Required Implementation**:
```
Backend Services Needed:
- Inventory Service (new microservice)
- Supplier Service (new microservice)
- Integration with existing Forecasting Service

Frontend Components Needed:
- Real-time inventory dashboard
- Mobile-friendly inventory scanning
- Automated reorder interface
- Expiration date tracking
```

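To make the frontend side of this more concrete, here is a minimal sketch of how an inventory hook could follow the pattern already used by the platform's other API hooks (useSales, useExternal). The `inventoryService` module, its methods, and the field names are assumptions for illustration only — no such service exists yet.

```typescript
// frontend/src/api/hooks/useInventory.ts — hypothetical, for illustration only
import { useState, useCallback } from 'react';
// Assumed service wrapper; it would have to be implemented against the new Inventory Service.
import { inventoryService } from '../services/inventory.service';

export interface IngredientStock {
  ingredient_id: string;
  name: string;
  quantity: number;
  unit: string;
  expires_at?: string;    // used for FIFO/FEFO rotation
  reorder_level?: number; // threshold for low-stock alerts
}

export const useInventory = () => {
  const [stock, setStock] = useState<IngredientStock[]>([]);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);

  // Fetch current stock levels for a tenant, mirroring the existing hook pattern.
  const getStockLevels = useCallback(async (tenantId: string): Promise<IngredientStock[]> => {
    try {
      setIsLoading(true);
      setError(null);
      const levels = await inventoryService.getStockLevels(tenantId);
      setStock(levels);
      return levels;
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Failed to load stock levels');
      throw err;
    } finally {
      setIsLoading(false);
    }
  }, []);

  return { stock, isLoading, error, getStockLevels, clearError: () => setError(null) };
};
```
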
## 2. **RECIPE & PRODUCTION MANAGEMENT** 🚨 **HIGH PRIORITY**

### **Problem Identified**:
Individual bakeries struggle with production planning complexity due to:
- Wide variety of products with different preparation times
- Manual calculation of ingredient quantities
- Lack of standardized recipes affecting quality consistency

### **Missing Components**:
- **Digital recipe management**: Store recipes with exact measurements
- **Bill of Materials (BOM)**: Automatic ingredient calculation based on production volume (see the sketch after this list)
- **Yield tracking**: Compare actual vs. expected production output
- **Cost calculation**: Real-time cost per product based on current ingredient prices
- **Production workflow**: Step-by-step production guidance
- **Quality control**: Track temperature, humidity, timing parameters

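As referenced in the BOM item above, a small sketch of the ingredient-scaling calculation. The recipe shapes and numbers are hypothetical; they only illustrate deriving ingredient requirements from a planned production volume.

```typescript
// Hypothetical recipe/BOM shapes for illustration only.
interface RecipeIngredient {
  name: string;
  quantityPerBatch: number; // e.g. grams needed for one batch
  unit: string;
}

interface Recipe {
  product: string;
  batchYield: number; // units produced per batch
  ingredients: RecipeIngredient[];
}

// Scale ingredient requirements to a planned production volume (in units).
function calculateBOM(recipe: Recipe, plannedUnits: number) {
  const batches = plannedUnits / recipe.batchYield;
  return recipe.ingredients.map((i) => ({
    name: i.name,
    unit: i.unit,
    quantityRequired: i.quantityPerBatch * batches,
  }));
}

// Example: 120 baguettes from a recipe that yields 40 per batch -> 3x each ingredient.
const baguetteRecipe: Recipe = {
  product: 'Baguette',
  batchYield: 40,
  ingredients: [
    { name: 'Flour', quantityPerBatch: 10000, unit: 'g' },
    { name: 'Yeast', quantityPerBatch: 100, unit: 'g' },
  ],
};
const requirements = calculateBOM(baguetteRecipe, 120); // Flour 30000 g, Yeast 300 g
console.log(requirements);
```
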
## 3. **SUPPLIER & PROCUREMENT SYSTEM** 🚨 **HIGH PRIORITY**

### **Problem Identified**:
Research shows small bakeries face "low buyer power" and struggle with:
- Manual ordering processes via phone/WhatsApp
- Difficulty tracking supplier performance
- Limited negotiation power with suppliers

### **Missing Components**:
- **Supplier database**: Contact information, lead times, reliability ratings
- **Purchase order system**: Digital ordering with approval workflows
- **Price comparison**: Compare prices across multiple suppliers
- **Delivery tracking**: Monitor order status and delivery reliability
- **Payment terms**: Track payment schedules and supplier agreements
- **Performance analytics**: Supplier reliability and cost analysis

## 4. **SALES DATA INTEGRATION** 🚨 **HIGH PRIORITY**

### **Problem Identified**:
Current forecasting relies on manual data entry. Research shows bakeries need:
- Integration with POS systems
- Historical sales pattern analysis
- External factor correlation (weather, events, holidays)

### **Missing Components**:
- **POS Integration**: Automatic sales data import from common Spanish POS systems
- **Manual sales entry**: Simple interface for bakeries without POS
- **Product categorization**: Organize sales by bread types, pastries, seasonal items
- **Customer analytics**: Track popular products and buying patterns
- **Seasonal adjustments**: Account for holidays, local events, weather impacts

## 5. **WASTE TRACKING & REDUCTION** 🚨 **MEDIUM PRIORITY**

### **Problem Identified**:
Research indicates a waste reduction potential of 20-40% through AI optimization. The main waste sources are:
- Unsold products (1.5% of production)
- Ingredient spoilage
- Production errors

### **Missing Components**:
- **Daily waste logging**: Track unsold products, spoiled ingredients
- **Waste analytics**: Identify patterns in waste generation
- **Dynamic pricing**: Reduce prices on items approaching expiration
- **Donation tracking**: Manage food donations to reduce total waste
- **Cost impact analysis**: Calculate financial impact of waste reduction

## 6. **MOBILE-FIRST INTERFACE** 🚨 **MEDIUM PRIORITY**

### **Problem Identified**:
Research emphasizes that bakery owners work demanding schedules starting at 4:30 AM and need "mobile accessibility" for on-the-go management.

### **Missing Components**:
- **Mobile-responsive design**: Current frontend is not optimized for mobile
- **Offline capabilities**: Work without internet connection
- **Quick actions**: Fast inventory checks, order placement
- **Voice input**: Hands-free operation in production environment
- **QR code scanning**: For inventory and product management

## 7. **FINANCIAL MANAGEMENT** 🚨 **LOW PRIORITY**

### **Problem Identified**:
With 75-85% of revenue consumed by operating costs and 4-9% profit margins, bakeries need precise cost control.

### **Missing Components**:
- **Cost tracking**: Monitor food costs (25-35% of sales) and labor costs (24-40% of sales)
- **Profit analysis**: Real-time profit margins per product
- **Budget planning**: Monthly expense forecasting
- **Tax preparation**: VAT calculations, expense categorization
- **Financial reporting**: P&L statements, cash flow analysis

## Implementation Priority Matrix

| Feature | Business Impact | Technical Complexity | Implementation Time | Priority |
|---------|----------------|---------------------|-------------------|----------|
| Inventory Management | Very High | Medium | 6-8 weeks | 1 |
| Recipe & BOM System | Very High | Medium | 4-6 weeks | 2 |
| Supplier Management | High | Low-Medium | 4-5 weeks | 3 |
| Sales Data Integration | High | Medium | 3-4 weeks | 4 |
| Waste Tracking | Medium | Low | 2-3 weeks | 5 |
| Mobile Optimization | Medium | Medium | 4-6 weeks | 6 |
| Financial Management | Low | High | 8-10 weeks | 7 |

## Technical Architecture Requirements

### New Microservices Needed:
1. **Inventory Service** - Real-time stock management, expiration tracking
2. **Recipe Service** - Digital recipes, BOM calculations, cost management
3. **Supplier Service** - Supplier database, purchase orders, performance tracking
4. **Integration Service** - POS system connectors, external data feeds

### Database Schema Extensions:
- Products table with recipes and ingredient relationships
- Inventory transactions with batch/lot tracking
- Supplier master data with performance metrics
- Purchase orders with approval workflows (illustrative types are sketched after this list)

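If these tables were exposed through the API, the corresponding frontend types might look roughly like the sketch below, following the style of the existing types in `frontend/src/api/types`. Every name here is illustrative, not an existing schema.

```typescript
// Hypothetical frontend types mirroring the proposed schema extensions (illustrative only).
export interface InventoryTransaction {
  id: string;
  tenant_id: string;
  ingredient_id: string;
  batch_number?: string;   // batch/lot tracking
  quantity_change: number; // positive = received, negative = consumed or spoiled
  expiration_date?: string;
  created_at: string;
}

export interface Supplier {
  id: string;
  tenant_id: string;
  name: string;
  contact_email?: string;
  lead_time_days?: number;
  reliability_score?: number; // performance metric
}

export interface PurchaseOrder {
  id: string;
  tenant_id: string;
  supplier_id: string;
  status: 'draft' | 'pending_approval' | 'approved' | 'delivered';
  items: { ingredient_id: string; quantity: number; unit_price?: number }[];
  created_at: string;
}
```
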
### Frontend Components Required:
- Mobile-responsive inventory management interface
- Recipe editor with drag-drop ingredient addition
- Supplier portal for order placement and tracking
- Real-time dashboard with critical alerts

## MVP Launch Recommendations

### Phase 1 (8-10 weeks): Core Operations
- Implement Inventory Management System
- Build Recipe & BOM functionality
- Create Supplier Management portal
- Mobile UI optimization

### Phase 2 (4-6 weeks): Data Integration
- POS system integrations
- Enhanced sales data processing
- Waste tracking implementation

### Phase 3 (6-8 weeks): Advanced Features
- Financial management tools
- Advanced analytics and reporting
- Performance optimization

## Conclusion

The current platform has excellent technical foundations but lacks the core operational features that small Madrid bakeries desperately need. The research clearly shows that **inventory management inefficiencies are the #1 pain point**, causing 1.5-20% losses and significant operational stress.

**Without implementing inventory management, recipe management, and supplier systems, the platform cannot deliver the value proposition of waste reduction and cost savings that bakeries require for survival.**

The recommended approach is to focus on the top 4 priority features for MVP launch, which will provide immediate tangible value to bakery owners and justify the platform subscription costs.

---

**Report Generated**: January 2025
**Status**: MVP Gap Analysis Complete
**Next Actions**: Begin Phase 1 implementation planning
@@ -18,7 +18,8 @@ volumes:
   auth_db_data:
   training_db_data:
   forecasting_db_data:
-  data_db_data:
+  sales_db_data:
+  external_db_data:
   tenant_db_data:
   notification_db_data:
   redis_data:
@@ -153,23 +154,44 @@ services:
       timeout: 5s
       retries: 5

-  data-db:
+  sales-db:
     image: postgres:15-alpine
-    container_name: bakery-data-db
+    container_name: bakery-sales-db
     restart: unless-stopped
     environment:
-      - POSTGRES_DB=${DATA_DB_NAME}
-      - POSTGRES_USER=${DATA_DB_USER}
-      - POSTGRES_PASSWORD=${DATA_DB_PASSWORD}
+      - POSTGRES_DB=${SALES_DB_NAME}
+      - POSTGRES_USER=${SALES_DB_USER}
+      - POSTGRES_PASSWORD=${SALES_DB_PASSWORD}
       - POSTGRES_INITDB_ARGS=${POSTGRES_INITDB_ARGS}
       - PGDATA=/var/lib/postgresql/data/pgdata
     volumes:
-      - data_db_data:/var/lib/postgresql/data
+      - sales_db_data:/var/lib/postgresql/data
     networks:
       bakery-network:
         ipv4_address: 172.20.0.23
     healthcheck:
-      test: ["CMD-SHELL", "pg_isready -U ${DATA_DB_USER} -d ${DATA_DB_NAME}"]
+      test: ["CMD-SHELL", "pg_isready -U ${SALES_DB_USER} -d ${SALES_DB_NAME}"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+
+  external-db:
+    image: postgres:15-alpine
+    container_name: bakery-external-db
+    restart: unless-stopped
+    environment:
+      - POSTGRES_DB=${EXTERNAL_DB_NAME}
+      - POSTGRES_USER=${EXTERNAL_DB_USER}
+      - POSTGRES_PASSWORD=${EXTERNAL_DB_PASSWORD}
+      - POSTGRES_INITDB_ARGS=${POSTGRES_INITDB_ARGS}
+      - PGDATA=/var/lib/postgresql/data/pgdata
+    volumes:
+      - external_db_data:/var/lib/postgresql/data
+    networks:
+      bakery-network:
+        ipv4_address: 172.20.0.26
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U ${EXTERNAL_DB_USER} -d ${EXTERNAL_DB_NAME}"]
       interval: 10s
       timeout: 5s
       retries: 5
@@ -409,7 +431,9 @@ services:
         condition: service_healthy
       auth-service:
         condition: service_healthy
-      data-service:
+      sales-service:
+        condition: service_healthy
+      external-service:
         condition: service_healthy
     networks:
       bakery-network:
@@ -468,21 +492,21 @@ services:
       timeout: 10s
       retries: 3

-  data-service:
+  sales-service:
     build:
       context: .
-      dockerfile: ./services/data/Dockerfile
+      dockerfile: ./services/sales/Dockerfile
       args:
         - ENVIRONMENT=${ENVIRONMENT}
         - BUILD_DATE=${BUILD_DATE}
-    image: bakery/data-service:${IMAGE_TAG}
-    container_name: bakery-data-service
+    image: bakery/sales-service:${IMAGE_TAG}
+    container_name: bakery-sales-service
     restart: unless-stopped
     env_file: .env
     ports:
-      - "${DATA_SERVICE_PORT}:8000"
+      - "${SALES_SERVICE_PORT}:8000"
     depends_on:
-      data-db:
+      sales-db:
         condition: service_healthy
       redis:
         condition: service_healthy
@@ -495,7 +519,42 @@ services:
         ipv4_address: 172.20.0.104
     volumes:
       - log_storage:/app/logs
-      - ./services/data:/app
+      - ./services/sales:/app
+      - ./shared:/app/shared
+    healthcheck:
+      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8000/health').read()"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
+
+  external-service:
+    build:
+      context: .
+      dockerfile: ./services/external/Dockerfile
+      args:
+        - ENVIRONMENT=${ENVIRONMENT}
+        - BUILD_DATE=${BUILD_DATE}
+    image: bakery/external-service:${IMAGE_TAG}
+    container_name: bakery-external-service
+    restart: unless-stopped
+    env_file: .env
+    ports:
+      - "${EXTERNAL_SERVICE_PORT}:8000"
+    depends_on:
+      external-db:
+        condition: service_healthy
+      redis:
+        condition: service_healthy
+      rabbitmq:
+        condition: service_healthy
+      auth-service:
+        condition: service_healthy
+    networks:
+      bakery-network:
+        ipv4_address: 172.20.0.107
+    volumes:
+      - log_storage:/app/logs
+      - ./services/external:/app
       - ./shared:/app/shared
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
@@ -193,7 +193,8 @@ const App: React.FC = () => {
       isAuthenticated: false,
       isLoading: false,
       user: null,
-      currentPage: 'landing' // 👈 Return to landing page after logout
+      currentPage: 'landing', // 👈 Return to landing page after logout
+      routingDecision: null
     });
   };

@@ -333,7 +333,12 @@ private buildURL(endpoint: string): string {
         size: JSON.stringify(result).length,
       });

-      return result;
+      // Handle both wrapped and unwrapped responses
+      // If result has a 'data' property, return it; otherwise return the result itself
+      if (result && typeof result === 'object' && 'data' in result) {
+        return result.data as T;
+      }
+      return result as T;
     } catch (error) {
       // Record error metrics
       this.recordMetrics({
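For context on the change above: the client now tolerates endpoints that return either the raw payload or a wrapped envelope (see the ApiResponse type in the next hunk). A hedged illustration with made-up values:

```typescript
// Two response shapes the client may now receive for the same endpoint (values are made up).
const unwrapped = { id: 'abc', product_name: 'Croissant', quantity: 12 };

const wrapped = {
  status: 'success',
  message: 'OK',
  data: { id: 'abc', product_name: 'Croissant', quantity: 12 },
};

// With the change above, a call typed to return the record resolves to the inner object in
// both cases: `wrapped.data` when a `data` property exists, otherwise the raw object.
```
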
@@ -7,12 +7,15 @@ export interface RequestConfig {
   method?: 'GET' | 'POST' | 'PUT' | 'PATCH' | 'DELETE';
   headers?: Record<string, string>;
   params?: Record<string, any>;
+  body?: any;
+  url?: string;
   timeout?: number;
   retries?: number;
   cache?: boolean;
   cacheTTL?: number;
   optimistic?: boolean;
   background?: boolean;
+  metadata?: any;
 }

 export interface ApiResponse<T = any> {
@@ -20,12 +23,14 @@ export interface ApiResponse<T = any> {
   message?: string;
   status: string;
   timestamp?: string;
+  metadata?: any;
   meta?: {
     page?: number;
     limit?: number;
     total?: number;
     hasNext?: boolean;
     hasPrev?: boolean;
+    requestId?: string;
   };
 }

@@ -5,7 +5,8 @@

 export { useAuth, useAuthHeaders } from './useAuth';
 export { useTenant } from './useTenant';
-export { useData } from './useData';
+export { useSales } from './useSales';
+export { useExternal } from './useExternal';
 export { useTraining } from './useTraining';
 export { useForecast } from './useForecast';
 export { useNotification } from './useNotification';
@@ -14,7 +15,8 @@ export { useOnboarding, useOnboardingStep } from './useOnboarding';
 // Import hooks for combined usage
 import { useAuth } from './useAuth';
 import { useTenant } from './useTenant';
-import { useData } from './useData';
+import { useSales } from './useSales';
+import { useExternal } from './useExternal';
 import { useTraining } from './useTraining';
 import { useForecast } from './useForecast';
 import { useNotification } from './useNotification';
@@ -24,7 +26,8 @@ import { useOnboarding } from './useOnboarding';
 export const useApiHooks = () => {
   const auth = useAuth();
   const tenant = useTenant();
-  const data = useData();
+  const sales = useSales();
+  const external = useExternal();
   const training = useTraining();
   const forecast = useForecast();
   const notification = useNotification();
@@ -33,7 +36,8 @@ export const useApiHooks = () => {
   return {
     auth,
     tenant,
-    data,
+    sales,
+    external,
     training,
     forecast,
     notification,
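A brief, illustrative sketch of how a component might consume the split hooks after this refactor; the component, its props, and the rendered fields are assumptions, not code from this repository.

```tsx
// Illustrative only: consuming the renamed hooks from a dashboard-style component.
import React, { useEffect } from 'react';
import { useSales, useExternal } from '../api/hooks';

const DashboardSummary: React.FC<{ tenantId: string }> = ({ tenantId }) => {
  const { dashboardStats, getDashboardStats } = useSales();
  const { weatherData, getCurrentWeather } = useExternal();

  useEffect(() => {
    // Sales metrics now come from the sales service, weather from the external service.
    getDashboardStats(tenantId);
    getCurrentWeather(tenantId, 40.4168, -3.7038); // Madrid coordinates, as used elsewhere in the codebase
  }, [tenantId, getDashboardStats, getCurrentWeather]);

  return (
    <div>
      <p>{dashboardStats ? 'Sales stats loaded' : 'Loading sales…'}</p>
      <p>{weatherData ? `Weather: ${weatherData.description ?? 'n/a'}` : 'Loading weather…'}</p>
    </div>
  );
};

export default DashboardSummary;
```
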
@@ -99,7 +99,11 @@ export const useAuth = () => {
       const response = await authService.register(data);

       // Auto-login after successful registration
-      if (response.user) {
+      if (response && response.user) {
+        await login({ email: data.email, password: data.password });
+      } else {
+        // If response doesn't have user property, registration might still be successful
+        // Try to login anyway in case the user was created but response format is different
         await login({ email: data.email, password: data.password });
       }
     } catch (error) {
209
frontend/src/api/hooks/useExternal.ts
Normal file
@@ -0,0 +1,209 @@
|
|||||||
|
// frontend/src/api/hooks/useExternal.ts
|
||||||
|
/**
|
||||||
|
* External Data Management Hooks
|
||||||
|
* Handles weather and traffic data operations
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { useState, useCallback } from 'react';
|
||||||
|
import { externalService } from '../services/external.service';
|
||||||
|
import type { WeatherData, TrafficData, WeatherForecast } from '../services/external.service';
|
||||||
|
|
||||||
|
export const useExternal = () => {
|
||||||
|
const [weatherData, setWeatherData] = useState<WeatherData | null>(null);
|
||||||
|
const [trafficData, setTrafficData] = useState<TrafficData | null>(null);
|
||||||
|
const [weatherForecast, setWeatherForecast] = useState<WeatherForecast[]>([]);
|
||||||
|
const [trafficForecast, setTrafficForecast] = useState<TrafficData[]>([]);
|
||||||
|
const [isLoading, setIsLoading] = useState(false);
|
||||||
|
const [error, setError] = useState<string | null>(null);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Current Weather
|
||||||
|
*/
|
||||||
|
const getCurrentWeather = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number
|
||||||
|
): Promise<WeatherData> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const weather = await externalService.getCurrentWeather(tenantId, lat, lon);
|
||||||
|
setWeatherData(weather);
|
||||||
|
|
||||||
|
return weather;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get weather data';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Weather Forecast
|
||||||
|
*/
|
||||||
|
const getWeatherForecast = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
days: number = 7
|
||||||
|
): Promise<WeatherForecast[]> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const forecast = await externalService.getWeatherForecast(tenantId, lat, lon, days);
|
||||||
|
setWeatherForecast(forecast);
|
||||||
|
|
||||||
|
return forecast;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get weather forecast';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Historical Weather Data
|
||||||
|
*/
|
||||||
|
const getHistoricalWeather = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
startDate: string,
|
||||||
|
endDate: string
|
||||||
|
): Promise<WeatherData[]> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const data = await externalService.getHistoricalWeather(tenantId, lat, lon, startDate, endDate);
|
||||||
|
|
||||||
|
return data;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get historical weather';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Current Traffic
|
||||||
|
*/
|
||||||
|
const getCurrentTraffic = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number
|
||||||
|
): Promise<TrafficData> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const traffic = await externalService.getCurrentTraffic(tenantId, lat, lon);
|
||||||
|
setTrafficData(traffic);
|
||||||
|
|
||||||
|
return traffic;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get traffic data';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Traffic Forecast
|
||||||
|
*/
|
||||||
|
const getTrafficForecast = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
hours: number = 24
|
||||||
|
): Promise<TrafficData[]> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const forecast = await externalService.getTrafficForecast(tenantId, lat, lon, hours);
|
||||||
|
setTrafficForecast(forecast);
|
||||||
|
|
||||||
|
return forecast;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get traffic forecast';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Historical Traffic Data
|
||||||
|
*/
|
||||||
|
const getHistoricalTraffic = useCallback(async (
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
startDate: string,
|
||||||
|
endDate: string
|
||||||
|
): Promise<TrafficData[]> => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const data = await externalService.getHistoricalTraffic(tenantId, lat, lon, startDate, endDate);
|
||||||
|
|
||||||
|
return data;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to get historical traffic';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Test External Services Connectivity
|
||||||
|
*/
|
||||||
|
const testConnectivity = useCallback(async (tenantId: string) => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
const results = await externalService.testConnectivity(tenantId);
|
||||||
|
|
||||||
|
return results;
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : 'Failed to test connectivity';
|
||||||
|
setError(message);
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
return {
|
||||||
|
weatherData,
|
||||||
|
trafficData,
|
||||||
|
weatherForecast,
|
||||||
|
trafficForecast,
|
||||||
|
isLoading,
|
||||||
|
error,
|
||||||
|
getCurrentWeather,
|
||||||
|
getWeatherForecast,
|
||||||
|
getHistoricalWeather,
|
||||||
|
getCurrentTraffic,
|
||||||
|
getTrafficForecast,
|
||||||
|
getHistoricalTraffic,
|
||||||
|
testConnectivity,
|
||||||
|
clearError: () => setError(null),
|
||||||
|
};
|
||||||
|
};
|
||||||
@@ -1,20 +1,21 @@
-// frontend/src/api/hooks/useData.ts
+// frontend/src/api/hooks/useSales.ts
 /**
- * Data Management Hooks
+ * Sales Data Management Hooks
  */

 import { useState, useCallback } from 'react';
-import { dataService } from '../services';
+import { salesService } from '../services/sales.service';
 import type {
   SalesData,
   SalesValidationResult,
   SalesDataQuery,
+  SalesDataImport,
   SalesImportResult,
   DashboardStats,
   ActivityItem,
 } from '../types';

-export const useData = () => {
+export const useSales = () => {
   const [salesData, setSalesData] = useState<SalesData[]>([]);
   const [dashboardStats, setDashboardStats] = useState<DashboardStats | null>(null);
   const [recentActivity, setRecentActivity] = useState<ActivityItem[]>([]);
@@ -32,7 +33,7 @@ export const useData = () => {
       setError(null);
       setUploadProgress(0);

-      const result = await dataService.uploadSalesHistory(tenantId, file, {
+      const result = await salesService.uploadSalesHistory(tenantId, file, {
         ...additionalData,
         onProgress: (progress) => {
           setUploadProgress(progress.percentage);
@@ -58,7 +59,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const result = await dataService.validateSalesData(tenantId, file);
+      const result = await salesService.validateSalesData(tenantId, file);
       return result;
     } catch (error) {
       const message = error instanceof Error ? error.message : 'Validation failed';
@@ -77,7 +78,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const response = await dataService.getSalesData(tenantId, query);
+      const response = await salesService.getSalesData(tenantId, query);
       setSalesData(response.data);

       return response.data;
@@ -95,7 +96,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const stats = await dataService.getDashboardStats(tenantId);
+      const stats = await salesService.getDashboardStats(tenantId);
       setDashboardStats(stats);

       return stats;
@@ -113,7 +114,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const activity = await dataService.getRecentActivity(tenantId, limit);
+      const activity = await salesService.getRecentActivity(tenantId, limit);
       setRecentActivity(activity);

       return activity;
@@ -135,7 +136,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const blob = await dataService.exportSalesData(tenantId, format, query);
+      const blob = await salesService.exportSalesData(tenantId, format, query);

       // Create download link
       const url = window.URL.createObjectURL(blob);
@@ -157,14 +158,13 @@ export const useData = () => {

   /**
    * Get Products List
-   * Add this method to the useData hook
    */
   const getProductsList = useCallback(async (tenantId: string): Promise<string[]> => {
     try {
       setIsLoading(true);
       setError(null);

-      const products = await dataService.getProductsList(tenantId);
+      const products = await salesService.getProductsList(tenantId);

       return products;
     } catch (error) {
@@ -176,30 +176,8 @@ export const useData = () => {
     }
   }, []);

-  /**
-   * Get Current Weather
-   * Add this method to the useData hook
-   */
-  const getCurrentWeather = useCallback(async (tenantId: string, lat: number, lon: number) => {
-    try {
-      setIsLoading(true);
-      setError(null);
-
-      const weather = await dataService.getCurrentWeather(tenantId, lat, lon);
-
-      return weather;
-    } catch (error) {
-      const message = error instanceof Error ? error.message : 'Failed to get weather data';
-      setError(message);
-      throw error;
-    } finally {
-      setIsLoading(false);
-    }
-  }, []);
-
   /**
    * Get Sales Analytics
-   * Add this method to the useData hook
    */
   const getSalesAnalytics = useCallback(async (
     tenantId: string,
@@ -210,7 +188,7 @@ export const useData = () => {
       setIsLoading(true);
       setError(null);

-      const analytics = await dataService.getSalesAnalytics(tenantId, startDate, endDate);
+      const analytics = await salesService.getSalesAnalytics(tenantId, startDate, endDate);

       return analytics;
     } catch (error) {
@@ -236,7 +214,6 @@ export const useData = () => {
     getRecentActivity,
     exportSalesData,
     getProductsList,
-    getCurrentWeather,
     getSalesAnalytics,
     clearError: () => setError(null),
   };
@@ -11,7 +11,8 @@ export { apiClient } from './client';
 export {
   authService,
   tenantService,
-  dataService,
+  salesService,
+  externalService,
   trainingService,
   forecastingService,
   notificationService,
@@ -23,7 +24,8 @@
   useAuth,
   useAuthHeaders,
   useTenant,
-  useData,
+  useSales,
+  useExternal,
   useTraining,
   useForecast,
   useNotification,
@@ -24,7 +24,7 @@ export class AuthService {
   /**
    * User Registration
    */
-  async register(data: RegisterRequest): Promise<{ user: UserResponse }> {
+  async register(data: RegisterRequest): Promise<LoginResponse> {
     return apiClient.post(`${this.baseEndpoint}/register`, data);
   }

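The widened return type ties into the registration flow shown earlier in useAuth, which now checks `response && response.user` before auto-login. Assuming LoginResponse has roughly the shape below (not verified against the backend types):

```typescript
// Assumed shape only — the real LoginResponse lives in the frontend auth types.
interface LoginResponse {
  access_token?: string;
  token_type?: string;
  user?: { id: string; email: string; role?: string };
}

// Because `user` may be absent from the registration response, useAuth now guards on it
// before auto-login and falls back to an explicit login attempt otherwise.
```
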
264
frontend/src/api/services/external.service.ts
Normal file
@@ -0,0 +1,264 @@
|
|||||||
|
// frontend/src/api/services/external.service.ts
|
||||||
|
/**
|
||||||
|
* External Data Service
|
||||||
|
* Handles weather and traffic data operations for the external microservice
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { apiClient } from '../client';
|
||||||
|
import { RequestTimeouts } from '../client/config';
|
||||||
|
|
||||||
|
// Align with backend WeatherDataResponse schema
|
||||||
|
export interface WeatherData {
|
||||||
|
date: string;
|
||||||
|
temperature?: number;
|
||||||
|
precipitation?: number;
|
||||||
|
humidity?: number;
|
||||||
|
wind_speed?: number;
|
||||||
|
pressure?: number;
|
||||||
|
description?: string;
|
||||||
|
source: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Align with backend TrafficDataResponse schema
|
||||||
|
export interface TrafficData {
|
||||||
|
date: string;
|
||||||
|
traffic_volume?: number;
|
||||||
|
pedestrian_count?: number;
|
||||||
|
congestion_level?: string;
|
||||||
|
average_speed?: number;
|
||||||
|
source: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface WeatherForecast {
|
||||||
|
date: string;
|
||||||
|
temperature_min: number;
|
||||||
|
temperature_max: number;
|
||||||
|
temperature_avg: number;
|
||||||
|
precipitation: number;
|
||||||
|
description: string;
|
||||||
|
humidity?: number;
|
||||||
|
wind_speed?: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
export class ExternalService {
|
||||||
|
/**
|
||||||
|
* Get Current Weather Data
|
||||||
|
*/
|
||||||
|
async getCurrentWeather(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number
|
||||||
|
): Promise<WeatherData> {
|
||||||
|
try {
|
||||||
|
// ✅ FIX 1: Correct endpoint path with tenant ID
|
||||||
|
const endpoint = `/tenants/${tenantId}/weather/current`;
|
||||||
|
|
||||||
|
// ✅ FIX 2: Correct parameter names (latitude/longitude, not lat/lon)
|
||||||
|
const response = await apiClient.get(endpoint, {
|
||||||
|
params: {
|
||||||
|
latitude: lat, // Backend expects 'latitude'
|
||||||
|
longitude: lon // Backend expects 'longitude'
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
console.log('Weather API response:', response);
|
||||||
|
|
||||||
|
// Return backend response directly (matches WeatherData interface)
|
||||||
|
return response;
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch weather from backend:', error);
|
||||||
|
|
||||||
|
// Fallback weather for Madrid (matching WeatherData schema)
|
||||||
|
return {
|
||||||
|
date: new Date().toISOString(),
|
||||||
|
temperature: 18,
|
||||||
|
description: 'Parcialmente nublado',
|
||||||
|
precipitation: 0,
|
||||||
|
humidity: 65,
|
||||||
|
wind_speed: 10,
|
||||||
|
source: 'fallback'
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Weather Forecast
|
||||||
|
*/
|
||||||
|
async getWeatherForecast(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
days: number = 7
|
||||||
|
): Promise<WeatherForecast[]> {
|
||||||
|
try {
|
||||||
|
// Fix: Use POST with JSON body as expected by backend
|
||||||
|
const response = await apiClient.post(`/tenants/${tenantId}/weather/forecast`, {
|
||||||
|
latitude: lat,
|
||||||
|
longitude: lon,
|
||||||
|
days: days
|
||||||
|
});
|
||||||
|
|
||||||
|
// Handle response format
|
||||||
|
if (Array.isArray(response)) {
|
||||||
|
return response;
|
||||||
|
} else if (response && response.forecasts) {
|
||||||
|
return response.forecasts;
|
||||||
|
} else {
|
||||||
|
console.warn('Unexpected weather forecast response format:', response);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch weather forecast:', error);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Historical Weather Data
|
||||||
|
*/
|
||||||
|
async getHistoricalWeather(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
startDate: string,
|
||||||
|
endDate: string
|
||||||
|
): Promise<WeatherData[]> {
|
||||||
|
try {
|
||||||
|
// Fix: Use POST with JSON body as expected by backend
|
||||||
|
const response = await apiClient.post(`/tenants/${tenantId}/weather/historical`, {
|
||||||
|
latitude: lat,
|
||||||
|
longitude: lon,
|
||||||
|
start_date: startDate,
|
||||||
|
end_date: endDate
|
||||||
|
});
|
||||||
|
|
||||||
|
// Return backend response directly (matches WeatherData interface)
|
||||||
|
return Array.isArray(response) ? response : response.data || [];
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch historical weather:', error);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Current Traffic Data
|
||||||
|
*/
|
||||||
|
async getCurrentTraffic(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number
|
||||||
|
): Promise<TrafficData> {
|
||||||
|
try {
|
||||||
|
const response = await apiClient.get(`/tenants/${tenantId}/traffic/current`, {
|
||||||
|
params: {
|
||||||
|
latitude: lat,
|
||||||
|
longitude: lon
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Return backend response directly (matches TrafficData interface)
|
||||||
|
return response;
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch traffic data:', error);
|
||||||
|
|
||||||
|
// Fallback traffic data (matching TrafficData schema)
|
||||||
|
return {
|
||||||
|
date: new Date().toISOString(),
|
||||||
|
traffic_volume: 50,
|
||||||
|
pedestrian_count: 25,
|
||||||
|
congestion_level: 'medium',
|
||||||
|
average_speed: 30,
|
||||||
|
source: 'fallback'
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Traffic Forecast
|
||||||
|
*/
|
||||||
|
async getTrafficForecast(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
hours: number = 24
|
||||||
|
): Promise<TrafficData[]> {
|
||||||
|
try {
|
||||||
|
// Fix: Use POST with JSON body as expected by backend
|
||||||
|
const response = await apiClient.post(`/tenants/${tenantId}/traffic/forecast`, {
|
||||||
|
latitude: lat,
|
||||||
|
longitude: lon,
|
||||||
|
hours: hours
|
||||||
|
});
|
||||||
|
|
||||||
|
// Return backend response directly (matches TrafficData interface)
|
||||||
|
return Array.isArray(response) ? response : response.data || [];
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch traffic forecast:', error);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get Historical Traffic Data
|
||||||
|
*/
|
||||||
|
async getHistoricalTraffic(
|
||||||
|
tenantId: string,
|
||||||
|
lat: number,
|
||||||
|
lon: number,
|
||||||
|
startDate: string,
|
||||||
|
endDate: string
|
||||||
|
): Promise<TrafficData[]> {
|
||||||
|
try {
|
||||||
|
// Fix: Use POST with JSON body as expected by backend
|
||||||
|
const response = await apiClient.post(`/tenants/${tenantId}/traffic/historical`, {
|
||||||
|
latitude: lat,
|
||||||
|
longitude: lon,
|
||||||
|
start_date: startDate,
|
||||||
|
end_date: endDate
|
||||||
|
});
|
||||||
|
|
||||||
|
// Return backend response directly (matches TrafficData interface)
|
||||||
|
return Array.isArray(response) ? response : response.data || [];
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to fetch historical traffic:', error);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Test External Service Connectivity
|
||||||
|
*/
|
||||||
|
async testConnectivity(tenantId: string): Promise<{
|
||||||
|
weather: boolean;
|
||||||
|
traffic: boolean;
|
||||||
|
overall: boolean;
|
||||||
|
}> {
|
||||||
|
const results = {
|
||||||
|
weather: false,
|
||||||
|
traffic: false,
|
||||||
|
overall: false
|
||||||
|
};
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Test weather service
|
||||||
|
await this.getCurrentWeather(tenantId, 40.4168, -3.7038); // Madrid coordinates
|
||||||
|
results.weather = true;
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Weather service connectivity test failed:', error);
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Test traffic service
|
||||||
|
await this.getCurrentTraffic(tenantId, 40.4168, -3.7038); // Madrid coordinates
|
||||||
|
results.traffic = true;
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Traffic service connectivity test failed:', error);
|
||||||
|
}
|
||||||
|
|
||||||
|
results.overall = results.weather && results.traffic;
|
||||||
|
return results;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export const externalService = new ExternalService();
|
||||||
@@ -7,7 +7,8 @@
 // Import and export individual services
 import { AuthService } from './auth.service';
 import { TenantService } from './tenant.service';
-import { DataService } from './data.service';
+import { SalesService } from './sales.service';
+import { ExternalService } from './external.service';
 import { TrainingService } from './training.service';
 import { ForecastingService } from './forecasting.service';
 import { NotificationService } from './notification.service';
@@ -16,17 +17,28 @@ import { OnboardingService } from './onboarding.service';
 // Create service instances
 export const authService = new AuthService();
 export const tenantService = new TenantService();
-export const dataService = new DataService();
+export const salesService = new SalesService();
+export const externalService = new ExternalService();
 export const trainingService = new TrainingService();
 export const forecastingService = new ForecastingService();
 export const notificationService = new NotificationService();
 export const onboardingService = new OnboardingService();

 // Export the classes as well
-export { AuthService, TenantService, DataService, TrainingService, ForecastingService, NotificationService, OnboardingService };
+export {
+  AuthService,
+  TenantService,
+  SalesService,
+  ExternalService,
+  TrainingService,
+  ForecastingService,
+  NotificationService,
+  OnboardingService
+};

 // Import base client
-export { apiClient } from '../client';
+import { apiClient } from '../client';
+export { apiClient };

 // Re-export all types
 export * from '../types';
@@ -35,7 +47,8 @@ export * from '../types';
 export const api = {
   auth: authService,
   tenant: tenantService,
-  data: dataService,
+  sales: salesService,
+  external: externalService,
   training: trainingService,
   forecasting: forecastingService,
   notification: notificationService,
@@ -56,7 +69,8 @@ export class HealthService {
   const services = [
     { name: 'Auth', endpoint: '/auth/health' },
     { name: 'Tenant', endpoint: '/tenants/health' },
-    { name: 'Data', endpoint: '/data/health' },
+    { name: 'Sales', endpoint: '/sales/health' },
+    { name: 'External', endpoint: '/external/health' },
     { name: 'Training', endpoint: '/training/health' },
     { name: 'Forecasting', endpoint: '/forecasting/health' },
     { name: 'Notification', endpoint: '/notifications/health' },
@@ -1,7 +1,7 @@
-// frontend/src/api/services/data.service.ts
+// frontend/src/api/services/sales.service.ts
 /**
- * Data Management Service
- * Handles sales data operations
+ * Sales Data Service
+ * Handles sales data operations for the sales microservice
  */

 import { apiClient } from '../client';
@@ -17,7 +17,7 @@ import type {
   ActivityItem,
 } from '../types';

-export class DataService {
+export class SalesService {
   /**
    * Upload Sales History File
    */
@@ -143,7 +143,7 @@ export class DataService {
       metrics?: string[];
     }
   ): Promise<any> {
-    return apiClient.get(`/tenants/${tenantId}/analytics`, { params });
+    return apiClient.get(`/tenants/${tenantId}/sales/analytics`, { params });
   }

   /**
@@ -175,14 +175,13 @@ export class DataService {
    * Get Recent Activity
    */
   async getRecentActivity(tenantId: string, limit?: number): Promise<ActivityItem[]> {
-    return apiClient.get(`/tenants/${tenantId}/activity`, {
+    return apiClient.get(`/tenants/${tenantId}/sales/activity`, {
       params: { limit },
     });
   }

   /**
    * Get Products List from Sales Data
-   * This should be added to the DataService class
    */
   async getProductsList(tenantId: string): Promise<string[]> {
     try {
@@ -261,90 +260,8 @@ export class DataService {
     }
   }

-  /**
-   * Get Current Weather Data
-   * This should be added to the DataService class
-   */
-  async getCurrentWeather(
-    tenantId: string,
-    lat: number,
-    lon: number
-  ): Promise<{
-    temperature: number;
-    description: string;
-    precipitation: number;
-    humidity?: number;
-    wind_speed?: number;
-  }> {
-    try {
-      // ✅ FIX 1: Correct endpoint path with tenant ID
-      const endpoint = `/tenants/${tenantId}/weather/current`;
-
-      // ✅ FIX 2: Correct parameter names (latitude/longitude, not lat/lon)
-      const response = await apiClient.get(endpoint, {
-        params: {
-          latitude: lat,   // Backend expects 'latitude'
-          longitude: lon   // Backend expects 'longitude'
-        }
-      });
-
-      // ✅ FIX 3: Handle the actual backend response structure
-      // Backend returns WeatherDataResponse:
-      // {
-      //   "date": "2025-08-04T12:00:00Z",
-      //   "temperature": 25.5,
-      //   "precipitation": 0.0,
-      //   "humidity": 65.0,
-      //   "wind_speed": 10.2,
-      //   "pressure": 1013.2,
-      //   "description": "Partly cloudy",
-      //   "source": "aemet"
-      // }
-
-      console.log('Weather API response:', response);
-
-      // Map backend response to expected frontend format
-      return {
-        temperature: response.temperature || 18,
-        description: response.description || 'Parcialmente nublado',
-        precipitation: response.precipitation || 0,
-        humidity: response.humidity || 65,
-        wind_speed: response.wind_speed || 10
-      };
-
-    } catch (error) {
-      console.error('Failed to fetch weather from backend:', error);
-
-      // Fallback weather for Madrid
-      return {
-        temperature: 18,
-        description: 'Parcialmente nublado',
-        precipitation: 0,
-        humidity: 65,
-        wind_speed: 10
-      };
-    }
-  }
-
-  /**
-   * Get Weather Forecast
-   * This should be added to the DataService class
-   */
-  async getWeatherForecast(
-    lat: number,
-    lon: number,
-    days: number = 7
-  ): Promise<any[]> {
-    return apiClient.get(`/data/weather/forecast`, {
-      params: { lat, lon, days }
-    });
-  }

   /**
    * Get Sales Summary by Period
-   * This should be added to the DataService class
    */
   async getSalesSummary(
     tenantId: string,
@@ -357,7 +274,6 @@ export class DataService {

   /**
    * Get Sales Analytics
-   * This should be added to the DataService class
    */
   async getSalesAnalytics(
     tenantId: string,
@@ -369,14 +285,13 @@ export class DataService {
     forecast_accuracy?: number;
     stockout_events?: number;
   }> {
-    return apiClient.get(`/tenants/${tenantId}/sales/analytics`, {
+    return apiClient.get(`/tenants/${tenantId}/sales/analytics/summary`, {
       params: {
         start_date: startDate,
         end_date: endDate
       }
     });
   }

 }

-export const dataService = new DataService();
+export const salesService = new SalesService();
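
With the weather helpers removed from this class, callers fetch weather from the external-data client and analytics from the renamed `salesService`. A hedged sketch; only the `/sales/analytics/summary` path and the `start_date`/`end_date` params are confirmed by the diff, while the `externalService` client name and the argument order are assumptions:

```typescript
// Sketch only: externalService is assumed to expose the weather helpers
// that were deleted from DataService above.
const analytics = await salesService.getSalesAnalytics(
  tenantId,
  '2025-07-01T00:00:00Z',
  '2025-07-31T23:59:59Z'
); // -> GET /tenants/{tenantId}/sales/analytics/summary

const weather = await externalService.getCurrentWeather(tenantId, 40.4168, -3.7038);
```
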
@@ -3,6 +3,8 @@
  * Data Management Types
  */

+import { BaseQueryParams } from './common';
+
 export interface SalesData {
   id: string;
   tenant_id: string;
@@ -26,6 +28,8 @@ export interface SalesValidationResult {
   errors: ValidationError[];
   warnings: ValidationError[];
   summary: Record<string, any>;
+  message?: string;
+  details?: Record<string, any>;
 }

 export interface ExternalFactors {
@@ -63,6 +67,7 @@ export interface SalesImportResult {
   success: boolean;
   message: string;
   imported_count: number;
+  records_imported: number;
   skipped_count: number;
   error_count: number;
   validation_errors?: ValidationError[];
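
A small sketch of the extended import result, with illustrative values only; `Partial` is used because the full interface is not shown in this hunk:

```typescript
// Illustrative values; the new records_imported field mirrors imported_count
// for consumers that expect the backend's newer naming.
const importResult: Partial<SalesImportResult> = {
  success: true,
  message: 'Import completed',
  imported_count: 120,
  records_imported: 120,
  skipped_count: 3,
  error_count: 0,
};
```
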
@@ -3,6 +3,8 @@
  * Forecasting Service Types
  */

+import { ExternalFactors } from './data';
+
 export interface SingleForecastRequest {
   product_name: string;
   forecast_date: string;
@@ -61,11 +61,16 @@ export interface TenantSubscription {

 export interface TenantCreate {
   name: string;
+  address?: string;
+  business_type?: 'individual' | 'central_workshop';
   postal_code: string;
   phone: string;
   description?: string;
   settings?: Partial<TenantSettings>;
   location?: TenantLocation;
+  coordinates?: { lat: number; lng: number };
+  products?: string[];
+  has_historical_data?: boolean;
 }

 export interface TenantUpdate {
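
An example payload using the new optional fields; all values are illustrative and only the field names come from the interface above:

```typescript
const newBakery: TenantCreate = {
  name: 'Panadería Ejemplo',
  address: 'Calle del Arenal 2',
  business_type: 'individual',          // or 'central_workshop'
  postal_code: '28013',
  phone: '+34910000000',
  coordinates: { lat: 40.4169, lng: -3.7066 },
  products: ['barra de pan', 'croissant'],
  has_historical_data: false,
};
```
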
@@ -7,6 +7,10 @@ export interface TrainingJobRequest {
   config?: TrainingJobConfig;
   priority?: number;
   schedule_time?: string;
+  include_weather?: boolean;
+  include_traffic?: boolean;
+  min_data_points?: number;
+  use_default_data?: boolean;
 }

 export interface SingleProductTrainingRequest {
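
A hedged example of the new training switches; `Partial` is used because the request interface has other fields not shown in this hunk, and the values are illustrative:

```typescript
const trainingOptions: Partial<TrainingJobRequest> = {
  include_weather: true,   // join external weather features into training
  include_traffic: false,
  min_data_points: 30,     // assumed meaning: skip products with too little history
  use_default_data: false,
};
```
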
@@ -3,7 +3,7 @@
  * Error Handling Utilities
  */

-import type { ApiError } from '../types';
+import type { ApiError } from '../client/types';

 export class ApiErrorHandler {
   static formatError(error: any): string {
@@ -2,7 +2,7 @@
 // Complete dashboard hook using your API infrastructure

 import { useState, useEffect, useCallback } from 'react';
-import { useAuth, useData, useForecast } from '../api';
+import { useAuth, useSales, useExternal, useForecast } from '../api';

 import { useTenantId } from './useTenantId';

@@ -31,12 +31,16 @@ export const useDashboard = () => {
   const { user } = useAuth();
   const {
     getProductsList,
-    getCurrentWeather,
     getSalesAnalytics,
     getDashboardStats,
-    isLoading: dataLoading,
-    error: dataError
-  } = useData();
+    isLoading: salesLoading,
+    error: salesError
+  } = useSales();
+  const {
+    getCurrentWeather,
+    isLoading: externalLoading,
+    error: externalError
+  } = useExternal();

   const {
     createSingleForecast,
@@ -236,8 +240,8 @@ export const useDashboard = () => {

   return {
     ...dashboardData,
-    isLoading: isLoading || dataLoading || forecastLoading,
-    error: error || dataError || forecastError,
+    isLoading: isLoading || salesLoading || externalLoading || forecastLoading,
+    error: error || salesError || externalError || forecastError,
     reload: () => tenantId ? loadDashboardData(tenantId) : Promise.resolve(),
     clearError: () => setError(null)
   };
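
A short sketch of how a component would consume the split hooks directly, mirroring the aggregation `useDashboard` now does (hook return shapes assumed from the destructuring above):

```typescript
// Sales-scoped data and external factors now come from separate hooks.
const { getSalesAnalytics, isLoading: salesLoading, error: salesError } = useSales();
const { getCurrentWeather, isLoading: externalLoading, error: externalError } = useExternal();

const busy = salesLoading || externalLoading;
const problem = salesError ?? externalError;
```
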
@@ -1,6 +1,6 @@
 // Real API hook for Order Suggestions using backend data
 import { useState, useCallback, useEffect } from 'react';
-import { useData, useForecast } from '../api';
+import { useSales, useExternal, useForecast } from '../api';
 import { useTenantId } from './useTenantId';
 import type { DailyOrderItem, WeeklyOrderItem } from '../components/simple/OrderSuggestions';

@@ -44,9 +44,11 @@ export const useOrderSuggestions = () => {
   const {
     getProductsList,
     getSalesAnalytics,
-    getDashboardStats,
+    getDashboardStats
+  } = useSales();
+  const {
     getCurrentWeather
-  } = useData();
+  } = useExternal();
   const {
     createSingleForecast,
     getQuickForecasts,
@@ -158,8 +160,8 @@ export const useOrderSuggestions = () => {
       console.log('📊 OrderSuggestions: Generating weekly suggestions for tenant:', tenantId);

       // Get sales analytics for the past month
-      const endDate = new Date().toISOString().split('T')[0];
-      const startDate = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString().split('T')[0];
+      const endDate = new Date().toISOString();
+      const startDate = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString();

       let analytics: any = null;
       try {
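
The analytics window is now sent as full ISO timestamps rather than date-only strings, which matches the backend's `datetime` query parameters. A tiny comparison of the two formats:

```typescript
// Old format: date-only string
new Date('2025-08-04').toISOString().split('T')[0]; // "2025-08-04"
// New format: full ISO-8601 timestamp
new Date('2025-08-04').toISOString();               // "2025-08-04T00:00:00.000Z"
```
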
@@ -58,6 +58,16 @@ interface RegisterForm {
   paymentCompleted: boolean;
 }

+interface RegisterFormErrors {
+  fullName?: string;
+  email?: string;
+  confirmEmail?: string;
+  password?: string;
+  confirmPassword?: string;
+  acceptTerms?: string;
+  paymentCompleted?: string;
+}
+
 const RegisterPage: React.FC<RegisterPageProps> = ({ onLogin, onNavigateToLogin }) => {
   const { register, isLoading } = useAuth();

@@ -73,7 +83,7 @@ const RegisterPage: React.FC<RegisterPageProps> = ({ onLogin, onNavigateToLogin
   const [showPassword, setShowPassword] = useState(false);
   const [showConfirmPassword, setShowConfirmPassword] = useState(false);
-  const [errors, setErrors] = useState<Partial<RegisterForm>>({});
+  const [errors, setErrors] = useState<RegisterFormErrors>({});
   const [passwordStrength, setPasswordStrength] = useState<{
     score: number;
     checks: { [key: string]: boolean };
@@ -246,7 +256,7 @@ const RegisterPage: React.FC<RegisterPageProps> = ({ onLogin, onNavigateToLogin
     e.preventDefault();

     // Validate form but exclude payment requirement for first step
-    const newErrors: Partial<RegisterForm> = {};
+    const newErrors: RegisterFormErrors = {};

     if (!formData.fullName.trim()) {
       newErrors.fullName = 'El nombre completo es obligatorio';
@@ -346,7 +356,7 @@ const RegisterPage: React.FC<RegisterPageProps> = ({ onLogin, onNavigateToLogin
     }));

     // Clear error when user starts typing
-    if (errors[name as keyof RegisterForm]) {
+    if (errors[name as keyof RegisterFormErrors]) {
       setErrors(prev => ({
         ...prev,
         [name]: undefined
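
The dedicated error interface keeps every error value a string, whereas `Partial<RegisterForm>` reused the form field types, so (presumably boolean) fields like `paymentCompleted` could not hold a message. A hedged illustration:

```typescript
// With Partial<RegisterForm>, paymentCompleted was typed as boolean | undefined,
// so assigning a validation message did not type-check. With the new interface it does:
const errs: RegisterFormErrors = { paymentCompleted: 'Completa el pago para continuar' };
```
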
@@ -219,9 +219,6 @@ const DashboardPage: React.FC<DashboardPageProps> = ({
             onUpdateStatus={(itemId: string, status: any) => {
               console.log('Update status:', itemId, status);
             }}
-            onViewDetails={() => {
-              onNavigateToProduction?.();
-            }}
           />

           {/* Quick Overview - Supporting Information */}
@@ -7,7 +7,7 @@ import SimplifiedTrainingProgress from '../../components/SimplifiedTrainingProgr
 import {
   useTenant,
   useTraining,
-  useData,
+  useSales,
   useTrainingWebSocket,
   useOnboarding,
   TenantCreate,
@@ -75,7 +75,7 @@ const OnboardingPage: React.FC<OnboardingPageProps> = ({ user, onComplete }) =>
   const [trainingJobId, setTrainingJobId] = useState<string>('');
   const { createTenant, getUserTenants, isLoading: tenantLoading } = useTenant();
   const { startTrainingJob } = useTraining({ disablePolling: true });
-  const { uploadSalesHistory, validateSalesData } = useData();
+  const { uploadSalesHistory, validateSalesData } = useSales();

   const steps = [
     { id: 1, title: 'Datos de Panadería', icon: Store },
@@ -315,7 +315,7 @@ const OnboardingPage: React.FC<OnboardingPageProps> = ({ user, onComplete }) =>
       const tenantData: TenantCreate = {
         name: bakeryData.name,
         address: bakeryData.address,
-        business_type: "bakery",
+        business_type: "individual",
         postal_code: "28010",
         phone: "+34655334455",
         coordinates: bakeryData.coordinates,
@@ -355,9 +355,9 @@ const OrdersPage: React.FC = () => {
           <Package className="mx-auto h-12 w-12 text-gray-400 mb-4" />
           <h3 className="text-lg font-medium text-gray-900 mb-2">No hay pedidos</h3>
           <p className="text-gray-600 mb-4">
-            {activeTab === 'all'
+            {activeTab === 'orders'
               ? 'Aún no has creado ningún pedido'
-              : `No hay pedidos ${activeTab === 'pending' ? 'pendientes' : 'entregados'}`
+              : 'No hay datos disponibles para esta sección'
             }
           </p>
           <button
@@ -26,6 +26,6 @@
       "@/*": ["./src/*"]
     }
   },
-  "include": ["src"],
+  "include": ["src", "vite-env.d.ts"],
   "references": [{ "path": "./tsconfig.node.json" }]
 }
@@ -66,25 +66,25 @@ async def proxy_all_tenant_sales_alternative(request: Request, tenant_id: str =
         path = "/" + path
     target_path = base_path + path

-    return await _proxy_to_data_service(request, target_path)
+    return await _proxy_to_sales_service(request, target_path)

 @router.api_route("/{tenant_id}/weather/{path:path}", methods=["GET", "POST", "OPTIONS"])
 async def proxy_tenant_weather(request: Request, tenant_id: str = Path(...), path: str = ""):
-    """Proxy tenant weather requests to data service"""
+    """Proxy tenant weather requests to external service"""
     target_path = f"/api/v1/tenants/{tenant_id}/weather/{path}".rstrip("/")
-    return await _proxy_to_data_service(request, target_path)
+    return await _proxy_to_external_service(request, target_path)

 @router.api_route("/{tenant_id}/traffic/{path:path}", methods=["GET", "POST", "OPTIONS"])
 async def proxy_tenant_traffic(request: Request, tenant_id: str = Path(...), path: str = ""):
-    """Proxy tenant traffic requests to data service"""
+    """Proxy tenant traffic requests to external service"""
     target_path = f"/api/v1/tenants/{tenant_id}/traffic/{path}".rstrip("/")
-    return await _proxy_to_data_service(request, target_path)
+    return await _proxy_to_external_service(request, target_path)

 @router.api_route("/{tenant_id}/analytics/{path:path}", methods=["GET", "POST", "OPTIONS"])
 async def proxy_tenant_analytics(request: Request, tenant_id: str = Path(...), path: str = ""):
-    """Proxy tenant analytics requests to data service"""
+    """Proxy tenant analytics requests to sales service"""
     target_path = f"/api/v1/tenants/{tenant_id}/analytics/{path}".rstrip("/")
-    return await _proxy_to_data_service(request, target_path)
+    return await _proxy_to_sales_service(request, target_path)

 # ================================================================
 # TENANT-SCOPED TRAINING SERVICE ENDPOINTS
@@ -136,9 +136,13 @@ async def _proxy_to_tenant_service(request: Request, target_path: str):
     """Proxy request to tenant service"""
     return await _proxy_request(request, target_path, settings.TENANT_SERVICE_URL)

-async def _proxy_to_data_service(request: Request, target_path: str):
-    """Proxy request to data service"""
-    return await _proxy_request(request, target_path, settings.DATA_SERVICE_URL)
+async def _proxy_to_sales_service(request: Request, target_path: str):
+    """Proxy request to sales service"""
+    return await _proxy_request(request, target_path, settings.SALES_SERVICE_URL)
+
+async def _proxy_to_external_service(request: Request, target_path: str):
+    """Proxy request to external service"""
+    return await _proxy_request(request, target_path, settings.EXTERNAL_SERVICE_URL)

 async def _proxy_to_training_service(request: Request, target_path: str):
     """Proxy request to training service"""
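
From the frontend's point of view, the gateway change only reroutes where each tenant-scoped path lands. A rough map, sketched against the generic `apiClient` wrapper used elsewhere in the diff; the concrete sub-paths other than `weather/current` and `sales/analytics/summary` are illustrative:

```typescript
// Routed by the proxy handlers above (paths relative to the gateway).
await apiClient.get(`/tenants/${tenantId}/sales/analytics/summary`); // -> sales service
await apiClient.get(`/tenants/${tenantId}/weather/current`);         // -> external service
await apiClient.get(`/tenants/${tenantId}/traffic/current`);         // -> external service (illustrative sub-path)
```
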
@@ -50,9 +50,15 @@ scrape_configs:
     metrics_path: '/metrics'
     scrape_interval: 30s

-  - job_name: 'data-service'
+  - job_name: 'sales-service'
     static_configs:
-      - targets: ['data-service:8000']
+      - targets: ['sales-service:8000']
+    metrics_path: '/metrics'
+    scrape_interval: 30s
+
+  - job_name: 'external-service'
+    static_configs:
+      - targets: ['external-service:8000']
     metrics_path: '/metrics'
     scrape_interval: 30s

@@ -1,40 +0,0 @@
-# Add this stage at the top of each service Dockerfile
-FROM python:3.11-slim as shared
-WORKDIR /shared
-COPY shared/ /shared/
-
-# Then your main service stage
-FROM python:3.11-slim
-
-WORKDIR /app
-
-# Install system dependencies
-RUN apt-get update && apt-get install -y \
-    gcc \
-    curl \
-    && rm -rf /var/lib/apt/lists/*
-
-# Copy requirements
-COPY services/data/requirements.txt .
-
-# Install Python dependencies
-RUN pip install --no-cache-dir -r requirements.txt
-
-# Copy shared libraries from the shared stage
-COPY --from=shared /shared /app/shared
-
-# Copy application code
-COPY services/data/ .
-
-# Add shared libraries to Python path
-ENV PYTHONPATH="/app:/app/shared:$PYTHONPATH"
-
-# Expose port
-EXPOSE 8000
-
-# Health check
-HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
-    CMD curl -f http://localhost:8000/health || exit 1
-
-# Run application
-CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
@@ -1,14 +0,0 @@
-"""
-Data Service API Layer
-API endpoints for data operations
-"""
-
-from .sales import router as sales_router
-from .traffic import router as traffic_router
-from .weather import router as weather_router
-
-__all__ = [
-    "sales_router",
-    "traffic_router",
-    "weather_router"
-]
@@ -1,500 +0,0 @@
-"""
-Enhanced Sales API Endpoints
-Updated to use repository pattern and enhanced services with dependency injection
-"""
-
-from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form, Query, Response, Path
-from fastapi.responses import StreamingResponse
-from typing import List, Optional, Dict, Any
-from uuid import UUID
-from datetime import datetime
-import structlog
-
-from app.schemas.sales import (
-    SalesDataCreate,
-    SalesDataResponse,
-    SalesDataQuery,
-    SalesDataImport,
-    SalesImportResult,
-    SalesValidationResult,
-    SalesValidationRequest,
-    SalesExportRequest
-)
-from app.services.sales_service import SalesService
-from app.services.data_import_service import EnhancedDataImportService
-from app.services.messaging import (
-    publish_sales_created,
-    publish_data_imported,
-    publish_export_completed
-)
-from shared.database.base import create_database_manager
-from shared.auth.decorators import get_current_user_dep
-
-router = APIRouter(tags=["enhanced-sales"])
-logger = structlog.get_logger()
-
-
-def get_sales_service():
-    """Dependency injection for SalesService"""
-    from app.core.config import settings
-    database_manager = create_database_manager(settings.DATABASE_URL, "data-service")
-    return SalesService(database_manager)
-
-
-def get_import_service():
-    """Dependency injection for EnhancedDataImportService"""
-    from app.core.config import settings
-    database_manager = create_database_manager(settings.DATABASE_URL, "data-service")
-    return EnhancedDataImportService(database_manager)
-
-
-@router.post("/tenants/{tenant_id}/sales", response_model=SalesDataResponse)
-async def create_sales_record(
-    sales_data: SalesDataCreate,
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Create a new sales record using repository pattern"""
-    try:
-        logger.info("Creating sales record with repository pattern",
-                    product=sales_data.product_name,
-                    quantity=sales_data.quantity_sold,
-                    tenant_id=tenant_id,
-                    user_id=current_user["user_id"])
-
-        # Override tenant_id from URL path
-        sales_data.tenant_id = tenant_id
-
-        record = await sales_service.create_sales_record(sales_data, str(tenant_id))
-
-        # Publish event (non-blocking)
-        try:
-            await publish_sales_created({
-                "tenant_id": str(tenant_id),
-                "product_name": sales_data.product_name,
-                "quantity_sold": sales_data.quantity_sold,
-                "revenue": sales_data.revenue,
-                "source": sales_data.source,
-                "created_by": current_user["user_id"],
-                "timestamp": datetime.utcnow().isoformat()
-            })
-        except Exception as pub_error:
-            logger.warning("Failed to publish sales created event", error=str(pub_error))
-
-        logger.info("Successfully created sales record using repository",
-                    record_id=record.id,
-                    tenant_id=tenant_id)
-        return record
-
-    except Exception as e:
-        logger.error("Failed to create sales record",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to create sales record: {str(e)}")
-
-
-@router.get("/tenants/{tenant_id}/sales", response_model=List[SalesDataResponse])
-async def get_sales_data(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    start_date: Optional[datetime] = Query(None, description="Start date filter"),
-    end_date: Optional[datetime] = Query(None, description="End date filter"),
-    product_name: Optional[str] = Query(None, description="Product name filter"),
-    limit: Optional[int] = Query(1000, le=5000, description="Maximum number of records to return"),
-    offset: Optional[int] = Query(0, ge=0, description="Number of records to skip"),
-    product_names: Optional[List[str]] = Query(None, description="Multiple product name filters"),
-    location_ids: Optional[List[str]] = Query(None, description="Location ID filters"),
-    sources: Optional[List[str]] = Query(None, description="Source filters"),
-    min_quantity: Optional[int] = Query(None, description="Minimum quantity filter"),
-    max_quantity: Optional[int] = Query(None, description="Maximum quantity filter"),
-    min_revenue: Optional[float] = Query(None, description="Minimum revenue filter"),
-    max_revenue: Optional[float] = Query(None, description="Maximum revenue filter"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Get sales data using repository pattern with enhanced filtering"""
-    try:
-        logger.debug("Querying sales data with repository pattern",
-                     tenant_id=tenant_id,
-                     start_date=start_date,
-                     end_date=end_date,
-                     limit=limit,
-                     offset=offset)
-
-        # Create enhanced query
-        query = SalesDataQuery(
-            tenant_id=tenant_id,
-            start_date=start_date,
-            end_date=end_date,
-            product_names=[product_name] if product_name else product_names,
-            location_ids=location_ids,
-            sources=sources,
-            min_quantity=min_quantity,
-            max_quantity=max_quantity,
-            min_revenue=min_revenue,
-            max_revenue=max_revenue,
-            limit=limit,
-            offset=offset
-        )
-
-        records = await sales_service.get_sales_data(query)
-
-        logger.debug("Successfully retrieved sales data using repository",
-                     count=len(records),
-                     tenant_id=tenant_id)
-        return records
-
-    except Exception as e:
-        logger.error("Failed to query sales data",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to query sales data: {str(e)}")
-
-
-@router.get("/tenants/{tenant_id}/sales/analytics")
-async def get_sales_analytics(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    start_date: Optional[datetime] = Query(None, description="Start date"),
-    end_date: Optional[datetime] = Query(None, description="End date"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Get sales analytics using repository pattern"""
-    try:
-        logger.debug("Getting sales analytics with repository pattern",
-                     tenant_id=tenant_id,
-                     start_date=start_date,
-                     end_date=end_date)
-
-        analytics = await sales_service.get_sales_analytics(
-            str(tenant_id), start_date, end_date
-        )
-
-        logger.debug("Analytics generated successfully using repository", tenant_id=tenant_id)
-        return analytics
-
-    except Exception as e:
-        logger.error("Failed to generate sales analytics",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to generate analytics: {str(e)}")
-
-
-@router.get("/tenants/{tenant_id}/sales/aggregation")
-async def get_sales_aggregation(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    start_date: Optional[datetime] = Query(None, description="Start date"),
-    end_date: Optional[datetime] = Query(None, description="End date"),
-    group_by: str = Query("daily", description="Aggregation period: daily, weekly, monthly"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Get sales aggregation data using repository pattern"""
-    try:
-        logger.debug("Getting sales aggregation with repository pattern",
-                     tenant_id=tenant_id,
-                     group_by=group_by)
-
-        aggregation = await sales_service.get_sales_aggregation(
-            str(tenant_id), start_date, end_date, group_by
-        )
-
-        logger.debug("Aggregation generated successfully using repository",
-                     tenant_id=tenant_id,
-                     group_by=group_by)
-        return aggregation
-
-    except Exception as e:
-        logger.error("Failed to get sales aggregation",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to get aggregation: {str(e)}")
-
-
-@router.post("/tenants/{tenant_id}/sales/import", response_model=SalesImportResult)
-async def import_sales_data(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    file: UploadFile = File(...),
-    file_format: str = Form(...),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    import_service: EnhancedDataImportService = Depends(get_import_service)
-):
-    """Import sales data using enhanced repository pattern"""
-    try:
-        logger.info("Importing sales data with enhanced repository pattern",
-                    tenant_id=tenant_id,
-                    format=file_format,
-                    filename=file.filename,
-                    user_id=current_user["user_id"])
-
-        # Read file content
-        content = await file.read()
-        file_content = content.decode('utf-8')
-
-        # Process using enhanced import service
-        result = await import_service.process_import(
-            str(tenant_id),
-            file_content,
-            file_format,
-            filename=file.filename
-        )
-
-        if result.success:
-            # Publish event
-            try:
-                await publish_data_imported({
-                    "tenant_id": str(tenant_id),
-                    "type": "file_import",
-                    "format": file_format,
-                    "filename": file.filename,
-                    "records_created": result.records_created,
-                    "imported_by": current_user["user_id"],
-                    "timestamp": datetime.utcnow().isoformat()
-                })
-            except Exception as pub_error:
-                logger.warning("Failed to publish import event", error=str(pub_error))
-
-        logger.info("Import completed with enhanced repository pattern",
-                    success=result.success,
-                    records_created=result.records_created,
-                    tenant_id=tenant_id)
-        return result
-
-    except Exception as e:
-        logger.error("Failed to import sales data",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to import sales data: {str(e)}")
-
-
-@router.post("/tenants/{tenant_id}/sales/import/validate", response_model=SalesValidationResult)
-async def validate_import_data(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    file: UploadFile = File(..., description="File to validate"),
-    file_format: str = Form(default="csv", description="File format: csv, json, excel"),
-    validate_only: bool = Form(default=True, description="Only validate, don't import"),
-    source: str = Form(default="onboarding_upload", description="Source of the upload"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    import_service: EnhancedDataImportService = Depends(get_import_service)
-):
-    """Validate import data using enhanced repository pattern"""
-    try:
-        logger.info("Validating import data with enhanced repository pattern",
-                    tenant_id=tenant_id,
-                    format=file_format,
-                    filename=file.filename,
-                    user_id=current_user["user_id"])
-
-        # Read file content
-        content = await file.read()
-        file_content = content.decode('utf-8')
-
-        # Create validation data structure
-        validation_data = {
-            "tenant_id": str(tenant_id),
-            "data": file_content,
-            "data_format": file_format,
-            "source": source,
-            "validate_only": validate_only
-        }
-
-        # Use enhanced validation service
-        validation_result = await import_service.validate_import_data(validation_data)
-
-        logger.info("Validation completed with enhanced repository pattern",
-                    is_valid=validation_result.is_valid,
-                    total_records=validation_result.total_records,
-                    tenant_id=tenant_id)
-
-        return validation_result
-
-    except Exception as e:
-        logger.error("Failed to validate import data",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to validate import data: {str(e)}")
-
-
-@router.post("/tenants/{tenant_id}/sales/import/validate-json", response_model=SalesValidationResult)
-async def validate_import_data_json(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    request: SalesValidationRequest = ...,
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    import_service: EnhancedDataImportService = Depends(get_import_service)
-):
-    """Validate import data from JSON request for onboarding flow"""
-    try:
-        logger.info("Starting JSON-based data validation",
-                    tenant_id=str(tenant_id),
-                    data_format=request.data_format,
-                    data_length=len(request.data),
-                    validate_only=request.validate_only)
-
-        # Create validation data structure
-        validation_data = {
-            "tenant_id": str(tenant_id),
-            "data": request.data,  # Fixed: use 'data' not 'content'
-            "data_format": request.data_format,
-            "filename": f"onboarding_data.{request.data_format}",
-            "source": request.source,
-            "validate_only": request.validate_only
-        }
-
-        # Use enhanced validation service
-        validation_result = await import_service.validate_import_data(validation_data)
-
-        logger.info("JSON validation completed",
-                    is_valid=validation_result.is_valid,
-                    total_records=validation_result.total_records,
-                    tenant_id=tenant_id)
-
-        return validation_result
-
-    except Exception as e:
-        logger.error("Failed to validate JSON import data",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to validate import data: {str(e)}")
-
-
-@router.post("/tenants/{tenant_id}/sales/export")
-async def export_sales_data(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    export_format: str = Query("csv", description="Export format: csv, excel, json"),
-    start_date: Optional[datetime] = Query(None, description="Start date"),
-    end_date: Optional[datetime] = Query(None, description="End date"),
-    products: Optional[List[str]] = Query(None, description="Filter by products"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Export sales data using repository pattern"""
-    try:
-        logger.info("Exporting sales data with repository pattern",
-                    tenant_id=tenant_id,
-                    format=export_format,
-                    user_id=current_user["user_id"])
-
-        export_result = await sales_service.export_sales_data(
-            str(tenant_id), export_format, start_date, end_date, products
-        )
-
-        if not export_result:
-            raise HTTPException(status_code=404, detail="No data found for export")
-
-        # Publish export event
-        try:
-            await publish_export_completed({
-                "tenant_id": str(tenant_id),
-                "format": export_format,
-                "exported_by": current_user["user_id"],
-                "record_count": export_result.get("record_count", 0),
-                "timestamp": datetime.utcnow().isoformat()
-            })
-        except Exception as pub_error:
-            logger.warning("Failed to publish export event", error=str(pub_error))
-
-        logger.info("Export completed successfully using repository",
-                    tenant_id=tenant_id,
-                    format=export_format)
-
-        return StreamingResponse(
-            iter([export_result["content"]]),
-            media_type=export_result["media_type"],
-            headers={"Content-Disposition": f"attachment; filename={export_result['filename']}"}
-        )
-
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error("Failed to export sales data",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to export sales data: {str(e)}")
-
-
-@router.delete("/tenants/{tenant_id}/sales/{record_id}")
-async def delete_sales_record(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    record_id: str = Path(..., description="Sales record ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Delete a sales record using repository pattern"""
-    try:
-        logger.info("Deleting sales record with repository pattern",
-                    record_id=record_id,
-                    tenant_id=tenant_id,
-                    user_id=current_user["user_id"])
-
-        success = await sales_service.delete_sales_record(record_id, str(tenant_id))
-
-        if not success:
-            raise HTTPException(status_code=404, detail="Sales record not found")
-
-        logger.info("Sales record deleted successfully using repository",
-                    record_id=record_id,
-                    tenant_id=tenant_id)
-        return {"status": "success", "message": "Sales record deleted successfully"}
-
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error("Failed to delete sales record",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to delete sales record: {str(e)}")
-
-
-@router.get("/tenants/{tenant_id}/sales/products")
-async def get_products_list(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Get list of products using repository pattern"""
-    try:
-        logger.debug("Getting products list with repository pattern", tenant_id=tenant_id)
-
-        products = await sales_service.get_products_list(str(tenant_id))
-
-        logger.debug("Products list retrieved using repository",
-                     count=len(products),
-                     tenant_id=tenant_id)
-        return products
-
-    except Exception as e:
-        logger.error("Failed to get products list",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to get products list: {str(e)}")
-
-
-@router.get("/tenants/{tenant_id}/sales/statistics")
-async def get_sales_statistics(
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-    sales_service: SalesService = Depends(get_sales_service)
-):
-    """Get comprehensive sales statistics using repository pattern"""
-    try:
-        logger.debug("Getting sales statistics with repository pattern", tenant_id=tenant_id)
-
-        # Get analytics which includes comprehensive statistics
-        analytics = await sales_service.get_sales_analytics(str(tenant_id))
-
-        # Create enhanced statistics response
-        statistics = {
-            "tenant_id": str(tenant_id),
-            "analytics": analytics,
-            "generated_at": datetime.utcnow().isoformat()
-        }
-
-        logger.debug("Sales statistics retrieved using repository", tenant_id=tenant_id)
-        return statistics
-
-    except Exception as e:
-        logger.error("Failed to get sales statistics",
-                     error=str(e),
-                     tenant_id=tenant_id)
-        raise HTTPException(status_code=500, detail=f"Failed to get statistics: {str(e)}")
@@ -1,196 +0,0 @@
|
|||||||
"""
|
|
||||||
Database configuration for data service
|
|
||||||
Uses shared database infrastructure for consistency
|
|
||||||
"""
|
|
||||||
|
|
||||||
import structlog
|
|
||||||
from typing import AsyncGenerator
|
|
||||||
from sqlalchemy.ext.asyncio import AsyncSession
|
|
||||||
from sqlalchemy import text
|
|
||||||
|
|
||||||
from shared.database.base import DatabaseManager, Base
|
|
||||||
from app.core.config import settings
|
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
|
||||||
|
|
||||||
# Initialize database manager using shared infrastructure
|
|
||||||
database_manager = DatabaseManager(
|
|
||||||
database_url=settings.DATABASE_URL,
|
|
||||||
service_name="data",
|
|
||||||
pool_size=15,
|
|
||||||
max_overflow=25,
|
|
||||||
echo=settings.DEBUG if hasattr(settings, 'DEBUG') else False
|
|
||||||
)
|
|
||||||
|
|
||||||
# Alias for convenience - matches the existing interface
|
|
||||||
get_db = database_manager.get_db
|
|
||||||
|
|
||||||
# Use the shared background session method
|
|
||||||
get_background_db_session = database_manager.get_background_session
|
|
||||||
|
|
||||||
async def get_db_health() -> bool:
|
|
||||||
"""Health check function for database connectivity"""
|
|
||||||
try:
|
|
||||||
async with database_manager.async_engine.begin() as conn:
|
|
||||||
await conn.execute(text("SELECT 1"))
|
|
||||||
logger.debug("Database health check passed")
|
|
||||||
return True
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Database health check failed", error=str(e))
|
|
||||||
return False
|
|
||||||
|
|
||||||
async def init_db():
|
|
||||||
"""Initialize database tables using shared infrastructure"""
|
|
||||||
try:
|
|
||||||
logger.info("Initializing data service database")
|
|
||||||
|
|
||||||
# Import models to ensure they're registered
|
|
||||||
from app.models.sales import SalesData
|
|
||||||
from app.models.traffic import TrafficData
|
|
||||||
from app.models.weather import WeatherData
|
|
||||||
|
|
||||||
# Create tables using shared infrastructure
|
|
||||||
await database_manager.create_tables()
|
|
||||||
|
|
||||||
logger.info("Data service database initialized successfully")
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to initialize data service database", error=str(e))
|
|
||||||
raise
|
|
||||||
|
|
||||||
# Data service specific database utilities
|
|
||||||
class DataDatabaseUtils:
|
|
||||||
"""Data service specific database utilities"""
|
|
||||||
|
|
||||||
@staticmethod
|
|
||||||
async def cleanup_old_sales_data(days_old: int = 730):
|
|
||||||
"""Clean up old sales data (default 2 years)"""
|
|
||||||
try:
|
|
||||||
async with database_manager.get_background_session() as session:
|
|
||||||
if settings.DATABASE_URL.startswith("sqlite"):
|
|
||||||
query = text(
|
|
||||||
"DELETE FROM sales_data "
|
|
||||||
"WHERE created_at < datetime('now', :days_param)"
|
|
||||||
)
|
|
||||||
params = {"days_param": f"-{days_old} days"}
|
|
||||||
else:
|
|
||||||
query = text(
|
|
||||||
"DELETE FROM sales_data "
|
|
||||||
"WHERE created_at < NOW() - INTERVAL :days_param"
|
|
||||||
)
|
|
||||||
params = {"days_param": f"{days_old} days"}
|
|
||||||
|
|
||||||
result = await session.execute(query, params)
|
|
||||||
deleted_count = result.rowcount
|
|
||||||
|
|
||||||
logger.info("Cleaned up old sales data",
|
|
||||||
deleted_count=deleted_count,
|
|
||||||
days_old=days_old)
|
|
||||||
|
|
||||||
return deleted_count
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to cleanup old sales data", error=str(e))
|
|
||||||
return 0
|
|
||||||
|
|
||||||
@staticmethod
|
|
||||||
async def get_data_statistics(tenant_id: str = None) -> dict:
|
|
||||||
"""Get data service statistics"""
|
|
||||||
try:
|
|
||||||
async with database_manager.get_background_session() as session:
|
|
||||||
# Get sales data statistics
|
|
||||||
if tenant_id:
|
|
||||||
sales_query = text(
|
|
||||||
"SELECT COUNT(*) as count "
|
|
||||||
"FROM sales_data "
|
|
||||||
"WHERE tenant_id = :tenant_id"
|
|
||||||
)
|
|
||||||
params = {"tenant_id": tenant_id}
|
|
||||||
else:
|
|
||||||
sales_query = text("SELECT COUNT(*) as count FROM sales_data")
|
|
||||||
params = {}
|
|
||||||
|
|
||||||
sales_result = await session.execute(sales_query, params)
|
|
||||||
sales_count = sales_result.scalar() or 0
|
|
||||||
|
|
||||||
# Get traffic data statistics (if exists)
|
|
||||||
try:
|
|
||||||
traffic_query = text("SELECT COUNT(*) as count FROM traffic_data")
|
|
||||||
if tenant_id:
|
|
||||||
# Traffic data might not have tenant_id, check table structure
|
|
||||||
pass
|
|
||||||
|
|
||||||
traffic_result = await session.execute(traffic_query)
|
|
||||||
traffic_count = traffic_result.scalar() or 0
|
|
||||||
except:
|
|
||||||
traffic_count = 0
|
|
||||||
|
|
||||||
# Get weather data statistics (if exists)
|
|
||||||
try:
|
|
||||||
weather_query = text("SELECT COUNT(*) as count FROM weather_data")
|
|
||||||
weather_result = await session.execute(weather_query)
|
|
||||||
weather_count = weather_result.scalar() or 0
|
|
||||||
except:
|
|
||||||
weather_count = 0
|
|
||||||
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"sales_records": sales_count,
|
|
||||||
"traffic_records": traffic_count,
|
|
||||||
"weather_records": weather_count,
|
|
||||||
"total_records": sales_count + traffic_count + weather_count
|
|
||||||
}
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get data statistics", error=str(e))
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"sales_records": 0,
|
|
||||||
"traffic_records": 0,
|
|
||||||
"weather_records": 0,
|
|
||||||
"total_records": 0
|
|
||||||
}
|
|
||||||
|
|
||||||
# Enhanced database session dependency with better error handling
|
|
||||||
async def get_db_session() -> AsyncGenerator[AsyncSession, None]:
|
|
||||||
"""Enhanced database session dependency with better logging and error handling"""
|
|
||||||
async with database_manager.async_session_local() as session:
|
|
||||||
try:
|
|
||||||
logger.debug("Database session created")
|
|
||||||
yield session
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Database session error", error=str(e), exc_info=True)
|
|
||||||
await session.rollback()
|
|
||||||
raise
|
|
||||||
finally:
|
|
||||||
await session.close()
|
|
||||||
logger.debug("Database session closed")
|
|
||||||
|
|
||||||
# Database cleanup for data service
|
|
||||||
async def cleanup_data_database():
|
|
||||||
"""Cleanup database connections for data service"""
|
|
||||||
try:
|
|
||||||
logger.info("Cleaning up data service database connections")
|
|
||||||
|
|
||||||
# Close engine connections
|
|
||||||
if hasattr(database_manager, 'async_engine') and database_manager.async_engine:
|
|
||||||
await database_manager.async_engine.dispose()
|
|
||||||
|
|
||||||
logger.info("Data service database cleanup completed")
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to cleanup data service database", error=str(e))
|
|
||||||
|
|
||||||
# Export the commonly used items to maintain compatibility
|
|
||||||
__all__ = [
|
|
||||||
'Base',
|
|
||||||
'database_manager',
|
|
||||||
'get_db',
|
|
||||||
'get_background_db_session',
|
|
||||||
'get_db_session',
|
|
||||||
'get_db_health',
|
|
||||||
'DataDatabaseUtils',
|
|
||||||
'init_db',
|
|
||||||
'cleanup_data_database'
|
|
||||||
]
|
|
||||||
@@ -1,312 +0,0 @@
|
|||||||
# ================================================================
|
|
||||||
# services/data/app/core/performance.py
|
|
||||||
# ================================================================
|
|
||||||
"""
|
|
||||||
Performance optimization utilities for async operations
|
|
||||||
"""
|
|
||||||
|
|
||||||
import asyncio
|
|
||||||
import functools
|
|
||||||
from typing import Any, Callable, Dict, Optional, TypeVar
|
|
||||||
from datetime import datetime, timedelta, timezone
|
|
||||||
import hashlib
|
|
||||||
import json
|
|
||||||
import structlog
|
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
|
||||||
|
|
||||||
T = TypeVar('T')
|
|
||||||
|
|
||||||
|
|
||||||
class AsyncCache:
|
|
||||||
"""Simple in-memory async cache with TTL"""
|
|
||||||
|
|
||||||
def __init__(self, default_ttl: int = 300):
|
|
||||||
self.cache: Dict[str, Dict[str, Any]] = {}
|
|
||||||
self.default_ttl = default_ttl
|
|
||||||
|
|
||||||
def _generate_key(self, *args, **kwargs) -> str:
|
|
||||||
"""Generate cache key from arguments"""
|
|
||||||
key_data = {
|
|
||||||
'args': args,
|
|
||||||
'kwargs': sorted(kwargs.items())
|
|
||||||
}
|
|
||||||
key_string = json.dumps(key_data, sort_keys=True, default=str)
|
|
||||||
return hashlib.md5(key_string.encode()).hexdigest()
|
|
||||||
|
|
||||||
def _is_expired(self, entry: Dict[str, Any]) -> bool:
|
|
||||||
"""Check if cache entry is expired"""
|
|
||||||
expires_at = entry.get('expires_at')
|
|
||||||
if not expires_at:
|
|
||||||
return True
|
|
||||||
return datetime.now(timezone.utc) > expires_at
|
|
||||||
|
|
||||||
async def get(self, key: str) -> Optional[Any]:
|
|
||||||
"""Get value from cache"""
|
|
||||||
if key in self.cache:
|
|
||||||
entry = self.cache[key]
|
|
||||||
if not self._is_expired(entry):
|
|
||||||
logger.debug("Cache hit", cache_key=key)
|
|
||||||
return entry['value']
|
|
||||||
else:
|
|
||||||
# Clean up expired entry
|
|
||||||
del self.cache[key]
|
|
||||||
logger.debug("Cache expired", cache_key=key)
|
|
||||||
|
|
||||||
logger.debug("Cache miss", cache_key=key)
|
|
||||||
return None
|
|
||||||
|
|
||||||
async def set(self, key: str, value: Any, ttl: Optional[int] = None) -> None:
|
|
||||||
"""Set value in cache"""
|
|
||||||
ttl = ttl or self.default_ttl
|
|
||||||
expires_at = datetime.now(timezone.utc) + timedelta(seconds=ttl)
|
|
||||||
|
|
||||||
self.cache[key] = {
|
|
||||||
'value': value,
|
|
||||||
'expires_at': expires_at,
|
|
||||||
'created_at': datetime.now(timezone.utc)
|
|
||||||
}
|
|
||||||
|
|
||||||
logger.debug("Cache set", cache_key=key, ttl=ttl)
|
|
||||||
|
|
||||||
async def clear(self) -> None:
|
|
||||||
"""Clear all cache entries"""
|
|
||||||
self.cache.clear()
|
|
||||||
logger.info("Cache cleared")
|
|
||||||
|
|
||||||
async def cleanup_expired(self) -> int:
|
|
||||||
"""Clean up expired entries"""
|
|
||||||
expired_keys = [
|
|
||||||
key for key, entry in self.cache.items()
|
|
||||||
if self._is_expired(entry)
|
|
||||||
]
|
|
||||||
|
|
||||||
for key in expired_keys:
|
|
||||||
del self.cache[key]
|
|
||||||
|
|
||||||
if expired_keys:
|
|
||||||
logger.info("Cleaned up expired cache entries", count=len(expired_keys))
|
|
||||||
|
|
||||||
return len(expired_keys)
|
|
||||||
|
|
||||||
|
|
||||||
def async_cache(ttl: int = 300, cache_instance: Optional[AsyncCache] = None):
|
|
||||||
"""Decorator for caching async function results"""
|
|
||||||
|
|
||||||
def decorator(func: Callable[..., T]) -> Callable[..., T]:
|
|
||||||
_cache = cache_instance or AsyncCache(ttl)
|
|
||||||
|
|
||||||
@functools.wraps(func)
|
|
||||||
async def wrapper(*args, **kwargs):
|
|
||||||
# Generate cache key
|
|
||||||
cache_key = _cache._generate_key(func.__name__, *args, **kwargs)
|
|
||||||
|
|
||||||
# Try to get from cache
|
|
||||||
cached_result = await _cache.get(cache_key)
|
|
||||||
if cached_result is not None:
|
|
||||||
return cached_result
|
|
||||||
|
|
||||||
# Execute function and cache result
|
|
||||||
result = await func(*args, **kwargs)
|
|
||||||
await _cache.set(cache_key, result, ttl)
|
|
||||||
|
|
||||||
return result
|
|
||||||
|
|
||||||
# Add cache management methods
|
|
||||||
wrapper.cache_clear = _cache.clear
|
|
||||||
wrapper.cache_cleanup = _cache.cleanup_expired
|
|
||||||
|
|
||||||
return wrapper
|
|
||||||
|
|
||||||
return decorator
|
|
||||||
|
|
||||||
|
|
||||||
class ConnectionPool:
|
|
||||||
"""Simple connection pool for HTTP clients"""
|
|
||||||
|
|
||||||
def __init__(self, max_connections: int = 10):
|
|
||||||
self.max_connections = max_connections
|
|
||||||
self.semaphore = asyncio.Semaphore(max_connections)
|
|
||||||
self._active_connections = 0
|
|
||||||
|
|
||||||
async def acquire(self):
|
|
||||||
"""Acquire a connection slot"""
|
|
||||||
await self.semaphore.acquire()
|
|
||||||
self._active_connections += 1
|
|
||||||
logger.debug("Connection acquired", active=self._active_connections, max=self.max_connections)
|
|
||||||
|
|
||||||
async def release(self):
|
|
||||||
"""Release a connection slot"""
|
|
||||||
self.semaphore.release()
|
|
||||||
self._active_connections = max(0, self._active_connections - 1)
|
|
||||||
logger.debug("Connection released", active=self._active_connections, max=self.max_connections)
|
|
||||||
|
|
||||||
async def __aenter__(self):
|
|
||||||
await self.acquire()
|
|
||||||
return self
|
|
||||||
|
|
||||||
async def __aexit__(self, exc_type, exc_val, exc_tb):
|
|
||||||
await self.release()
|
|
||||||
|
|
||||||
|
|
||||||
def rate_limit(calls: int, period: int):
    """Rate limiting decorator"""

    def decorator(func: Callable[..., T]) -> Callable[..., T]:
        call_times = []
        lock = asyncio.Lock()

        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            async with lock:
                now = datetime.now(timezone.utc)

                # Remove old call times
                cutoff = now - timedelta(seconds=period)
                call_times[:] = [t for t in call_times if t > cutoff]

                # Check rate limit
                if len(call_times) >= calls:
                    sleep_time = (call_times[0] + timedelta(seconds=period) - now).total_seconds()
                    if sleep_time > 0:
                        logger.warning("Rate limit reached, sleeping", sleep_time=sleep_time)
                        await asyncio.sleep(sleep_time)

                # Record this call
                call_times.append(now)

            return await func(*args, **kwargs)

        return wrapper

    return decorator

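# Illustrative usage sketch (not part of the committed module): limits the wrapped
# coroutine to 10 calls per 60 seconds; once the window is full, callers sleep
# until a slot frees up. The _demo_poll_weather_api name is hypothetical.
@rate_limit(calls=10, period=60)
async def _demo_poll_weather_api(station_id: str) -> dict:
    return {"station": station_id, "status": "ok"}
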
async def batch_process(
    items: list,
    process_func: Callable,
    batch_size: int = 10,
    max_concurrency: int = 5
) -> list:
    """Process items in batches with controlled concurrency"""

    results = []
    semaphore = asyncio.Semaphore(max_concurrency)

    async def process_batch(batch):
        async with semaphore:
            return await process_func(batch)

    # Create batches
    batches = [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

    logger.info("Processing items in batches",
                total_items=len(items),
                batches=len(batches),
                batch_size=batch_size,
                max_concurrency=max_concurrency)

    # Process batches concurrently
    batch_results = await asyncio.gather(
        *[process_batch(batch) for batch in batches],
        return_exceptions=True
    )

    # Flatten results
    for batch_result in batch_results:
        if isinstance(batch_result, Exception):
            logger.error("Batch processing error", error=str(batch_result))
            continue

        if isinstance(batch_result, list):
            results.extend(batch_result)
        else:
            results.append(batch_result)

    logger.info("Batch processing completed",
                processed_items=len(results),
                total_batches=len(batches))

    return results

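# Illustrative usage sketch (not part of the committed module): batch_process splits
# a list into fixed-size batches and runs the handler with bounded concurrency.
# The _demo_store_batch handler is hypothetical and simply echoes its input.
async def _demo_store_batch(batch: list) -> list:
    # A real handler would, for example, bulk-insert the batch into the database.
    return batch

async def _demo_batch_process_usage() -> list:
    items = list(range(95))
    # 10 batches of up to 10 items each, with at most 5 batches in flight at a time.
    return await batch_process(items, _demo_store_batch, batch_size=10, max_concurrency=5)
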
class PerformanceMonitor:
    """Simple performance monitoring for async functions"""

    def __init__(self):
        self.metrics = {}

    def record_execution(self, func_name: str, duration: float, success: bool = True):
        """Record function execution metrics"""
        if func_name not in self.metrics:
            self.metrics[func_name] = {
                'call_count': 0,
                'success_count': 0,
                'error_count': 0,
                'total_duration': 0.0,
                'min_duration': float('inf'),
                'max_duration': 0.0
            }

        metric = self.metrics[func_name]
        metric['call_count'] += 1
        metric['total_duration'] += duration
        metric['min_duration'] = min(metric['min_duration'], duration)
        metric['max_duration'] = max(metric['max_duration'], duration)

        if success:
            metric['success_count'] += 1
        else:
            metric['error_count'] += 1

    def get_metrics(self, func_name: str = None) -> dict:
        """Get performance metrics"""
        if func_name:
            metric = self.metrics.get(func_name, {})
            if metric and metric['call_count'] > 0:
                metric['avg_duration'] = metric['total_duration'] / metric['call_count']
                metric['success_rate'] = metric['success_count'] / metric['call_count']
            return metric

        return self.metrics


def monitor_performance(monitor: Optional[PerformanceMonitor] = None):
    """Decorator to monitor function performance"""

    def decorator(func: Callable[..., T]) -> Callable[..., T]:
        _monitor = monitor or PerformanceMonitor()

        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            start_time = datetime.now(timezone.utc)
            success = True

            try:
                result = await func(*args, **kwargs)
                return result
            except Exception as e:
                success = False
                raise
            finally:
                end_time = datetime.now(timezone.utc)
                duration = (end_time - start_time).total_seconds()
                _monitor.record_execution(func.__name__, duration, success)

                logger.debug("Function performance",
                             function=func.__name__,
                             duration=duration,
                             success=success)

        # Add metrics access
        wrapper.get_metrics = lambda: _monitor.get_metrics(func.__name__)

        return wrapper

    return decorator

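# Illustrative usage sketch (not part of the committed module): wrapping a coroutine
# so every call is timed and counted; wrapper.get_metrics() then exposes the
# aggregates. The _demo_import_sales name and its argument are hypothetical.
@monitor_performance()
async def _demo_import_sales(rows: list) -> int:
    return len(rows)

async def _demo_monitor_usage() -> dict:
    await _demo_import_sales([{"product_name": "baguette", "quantity_sold": 12}])
    # Returns call_count, avg_duration, success_rate, etc. for this function.
    return _demo_import_sales.get_metrics()
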
# Global instances
global_cache = AsyncCache(default_ttl=300)
global_connection_pool = ConnectionPool(max_connections=20)
global_performance_monitor = PerformanceMonitor()
services/data/app/external/__init__.py (vendored)
@@ -1,34 +0,0 @@
# ================================================================
# services/data/app/models/sales.py - MISSING FILE
# ================================================================
"""Sales data models"""

from sqlalchemy import Column, String, DateTime, Float, Integer, Text, Index
from sqlalchemy.dialects.postgresql import UUID
import uuid
from datetime import datetime, timezone

from app.core.database import Base

class SalesData(Base):
    __tablename__ = "sales_data"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    tenant_id = Column(UUID(as_uuid=True), nullable=False, index=True)
    date = Column(DateTime(timezone=True), nullable=False, index=True)
    product_name = Column(String(255), nullable=False, index=True)
    quantity_sold = Column(Integer, nullable=False)
    revenue = Column(Float, nullable=False)
    location_id = Column(String(100), nullable=True, index=True)
    source = Column(String(50), nullable=False, default="manual")
    notes = Column(Text, nullable=True)
    created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
    updated_at = Column(DateTime(timezone=True),
                        default=lambda: datetime.now(timezone.utc),
                        onupdate=lambda: datetime.now(timezone.utc))

    __table_args__ = (
        Index('idx_sales_tenant_date', 'tenant_id', 'date'),
        Index('idx_sales_tenant_product', 'tenant_id', 'product_name'),
        Index('idx_sales_tenant_location', 'tenant_id', 'location_id'),
    )
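# Illustrative sketch (not part of the deleted file above): how a SalesData row
# defined by this model would be constructed before being staged on an
# AsyncSession. All values shown are hypothetical.
def _demo_build_sales_row() -> "SalesData":
    return SalesData(
        tenant_id=uuid.uuid4(),
        date=datetime.now(timezone.utc),
        product_name="croissant",
        quantity_sold=24,
        revenue=36.0,
        source="manual",
    )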
@@ -1,12 +0,0 @@
"""
|
|
||||||
Data Service Repositories
|
|
||||||
Repository implementations for data service
|
|
||||||
"""
|
|
||||||
|
|
||||||
from .base import DataBaseRepository
|
|
||||||
from .sales_repository import SalesRepository
|
|
||||||
|
|
||||||
__all__ = [
|
|
||||||
"DataBaseRepository",
|
|
||||||
"SalesRepository"
|
|
||||||
]
|
|
||||||
@@ -1,167 +0,0 @@
"""
|
|
||||||
Base Repository for Data Service
|
|
||||||
Service-specific repository base class with data service utilities
|
|
||||||
"""
|
|
||||||
|
|
||||||
from typing import Optional, List, Dict, Any, Type, TypeVar, Generic
|
|
||||||
from sqlalchemy.ext.asyncio import AsyncSession
|
|
||||||
from datetime import datetime, timezone
|
|
||||||
import structlog
|
|
||||||
|
|
||||||
from shared.database.repository import BaseRepository
|
|
||||||
from shared.database.exceptions import DatabaseError, ValidationError
|
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
|
||||||
|
|
||||||
# Type variables for the data service repository
|
|
||||||
Model = TypeVar('Model')
|
|
||||||
CreateSchema = TypeVar('CreateSchema')
|
|
||||||
UpdateSchema = TypeVar('UpdateSchema')
|
|
||||||
|
|
||||||
|
|
||||||
class DataBaseRepository(BaseRepository[Model, CreateSchema, UpdateSchema], Generic[Model, CreateSchema, UpdateSchema]):
|
|
||||||
"""Base repository for data service with common data operations"""
|
|
||||||
|
|
||||||
def __init__(self, model: Type, session: AsyncSession, cache_ttl: Optional[int] = 300):
|
|
||||||
super().__init__(model, session, cache_ttl)
|
|
||||||
|
|
||||||
async def get_by_tenant_id(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 100
|
|
||||||
) -> List:
|
|
||||||
"""Get records filtered by tenant_id"""
|
|
||||||
return await self.get_multi(
|
|
||||||
skip=skip,
|
|
||||||
limit=limit,
|
|
||||||
filters={"tenant_id": tenant_id}
|
|
||||||
)
|
|
||||||
|
|
||||||
async def get_by_date_range(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 100
|
|
||||||
) -> List:
|
|
||||||
"""Get records filtered by tenant and date range"""
|
|
||||||
try:
|
|
||||||
filters = {"tenant_id": tenant_id}
|
|
||||||
|
|
||||||
# Build date range filter
|
|
||||||
if start_date or end_date:
|
|
||||||
if not hasattr(self.model, 'date'):
|
|
||||||
raise ValidationError("Model does not have 'date' field for date filtering")
|
|
||||||
|
|
||||||
# This would need a more complex implementation for date ranges
|
|
||||||
# For now, we'll use the basic filter
|
|
||||||
if start_date and end_date:
|
|
||||||
# Would need custom query building for date ranges
|
|
||||||
pass
|
|
||||||
|
|
||||||
return await self.get_multi(
|
|
||||||
skip=skip,
|
|
||||||
limit=limit,
|
|
||||||
filters=filters,
|
|
||||||
order_by="date",
|
|
||||||
order_desc=True
|
|
||||||
)
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error(f"Failed to get records by date range",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
start_date=start_date,
|
|
||||||
end_date=end_date,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Date range query failed: {str(e)}")
|
|
||||||
|
|
||||||
async def count_by_tenant(self, tenant_id: str) -> int:
|
|
||||||
"""Count records for a specific tenant"""
|
|
||||||
return await self.count(filters={"tenant_id": tenant_id})
|
|
||||||
|
|
||||||
async def validate_tenant_access(self, tenant_id: str, record_id: Any) -> bool:
|
|
||||||
"""Validate that a record belongs to the specified tenant"""
|
|
||||||
try:
|
|
||||||
record = await self.get_by_id(record_id)
|
|
||||||
if not record:
|
|
||||||
return False
|
|
||||||
|
|
||||||
# Check if record has tenant_id field and matches
|
|
||||||
if hasattr(record, 'tenant_id'):
|
|
||||||
return str(record.tenant_id) == str(tenant_id)
|
|
||||||
|
|
||||||
return True # If no tenant_id field, allow access
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to validate tenant access",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
record_id=record_id,
|
|
||||||
error=str(e))
|
|
||||||
return False
|
|
||||||
|
|
||||||
async def get_tenant_stats(self, tenant_id: str) -> Dict[str, Any]:
|
|
||||||
"""Get statistics for a specific tenant"""
|
|
||||||
try:
|
|
||||||
total_records = await self.count_by_tenant(tenant_id)
|
|
||||||
|
|
||||||
# Get recent activity (if model has created_at)
|
|
||||||
recent_records = 0
|
|
||||||
if hasattr(self.model, 'created_at'):
|
|
||||||
# This would need custom query for date filtering
|
|
||||||
# For now, return basic stats
|
|
||||||
pass
|
|
||||||
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"total_records": total_records,
|
|
||||||
"recent_records": recent_records,
|
|
||||||
"model_type": self.model.__name__
|
|
||||||
}
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get tenant statistics",
|
|
||||||
tenant_id=tenant_id, error=str(e))
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"total_records": 0,
|
|
||||||
"recent_records": 0,
|
|
||||||
"model_type": self.model.__name__,
|
|
||||||
"error": str(e)
|
|
||||||
}
|
|
||||||
|
|
||||||
async def cleanup_old_records(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
days_old: int = 365,
|
|
||||||
batch_size: int = 1000
|
|
||||||
) -> int:
|
|
||||||
"""Clean up old records for a tenant (if model has date/created_at field)"""
|
|
||||||
try:
|
|
||||||
if not hasattr(self.model, 'created_at') and not hasattr(self.model, 'date'):
|
|
||||||
logger.warning(f"Model {self.model.__name__} has no date field for cleanup")
|
|
||||||
return 0
|
|
||||||
|
|
||||||
# This would need custom implementation with raw SQL
|
|
||||||
# For now, return 0 to indicate no cleanup performed
|
|
||||||
logger.info(f"Cleanup requested for {self.model.__name__} but not implemented")
|
|
||||||
return 0
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to cleanup old records",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
days_old=days_old,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Cleanup failed: {str(e)}")
|
|
||||||
|
|
||||||
def _ensure_utc_datetime(self, dt: Optional[datetime]) -> Optional[datetime]:
|
|
||||||
"""Ensure datetime is UTC timezone aware"""
|
|
||||||
if dt is None:
|
|
||||||
return None
|
|
||||||
|
|
||||||
if dt.tzinfo is None:
|
|
||||||
# Assume naive datetime is UTC
|
|
||||||
return dt.replace(tzinfo=timezone.utc)
|
|
||||||
|
|
||||||
return dt.astimezone(timezone.utc)
|
|
||||||
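# Illustrative sketch (not part of the deleted file above): a concrete repository
# would subclass DataBaseRepository and inherit the tenant-scoped helpers such as
# get_by_tenant_id, count_by_tenant and validate_tenant_access. The WeatherData
# model and WeatherDataCreate schema names are hypothetical.
class _DemoWeatherRepository(DataBaseRepository["WeatherData", "WeatherDataCreate", Dict]):
    """Only the model-specific queries would need to be added here."""
    pass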
@@ -1,517 +0,0 @@
"""
|
|
||||||
Sales Repository
|
|
||||||
Repository for sales data operations with business-specific queries
|
|
||||||
"""
|
|
||||||
|
|
||||||
from typing import Optional, List, Dict, Any, Type
|
|
||||||
from sqlalchemy.ext.asyncio import AsyncSession
|
|
||||||
from sqlalchemy import select, and_, or_, func, desc, asc, text
|
|
||||||
from datetime import datetime, timezone
|
|
||||||
import structlog
|
|
||||||
|
|
||||||
from .base import DataBaseRepository
|
|
||||||
from app.models.sales import SalesData
|
|
||||||
from app.schemas.sales import SalesDataCreate, SalesDataResponse
|
|
||||||
from shared.database.exceptions import DatabaseError, ValidationError
|
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
|
||||||
|
|
||||||
|
|
||||||
class SalesRepository(DataBaseRepository[SalesData, SalesDataCreate, Dict]):
|
|
||||||
"""Repository for sales data operations"""
|
|
||||||
|
|
||||||
def __init__(self, model_class: Type, session: AsyncSession, cache_ttl: Optional[int] = 300):
|
|
||||||
super().__init__(model_class, session, cache_ttl)
|
|
||||||
|
|
||||||
async def get_by_tenant_and_date_range(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
product_names: Optional[List[str]] = None,
|
|
||||||
location_ids: Optional[List[str]] = None,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 100
|
|
||||||
) -> List[SalesData]:
|
|
||||||
"""Get sales data filtered by tenant, date range, and optional filters"""
|
|
||||||
try:
|
|
||||||
query = select(self.model).where(self.model.tenant_id == tenant_id)
|
|
||||||
|
|
||||||
# Add date range filter
|
|
||||||
if start_date:
|
|
||||||
start_date = self._ensure_utc_datetime(start_date)
|
|
||||||
query = query.where(self.model.date >= start_date)
|
|
||||||
|
|
||||||
if end_date:
|
|
||||||
end_date = self._ensure_utc_datetime(end_date)
|
|
||||||
query = query.where(self.model.date <= end_date)
|
|
||||||
|
|
||||||
# Add product filter
|
|
||||||
if product_names:
|
|
||||||
query = query.where(self.model.product_name.in_(product_names))
|
|
||||||
|
|
||||||
# Add location filter
|
|
||||||
if location_ids:
|
|
||||||
query = query.where(self.model.location_id.in_(location_ids))
|
|
||||||
|
|
||||||
# Order by date descending (most recent first)
|
|
||||||
query = query.order_by(desc(self.model.date))
|
|
||||||
|
|
||||||
# Apply pagination
|
|
||||||
query = query.offset(skip).limit(limit)
|
|
||||||
|
|
||||||
result = await self.session.execute(query)
|
|
||||||
return result.scalars().all()
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get sales by tenant and date range",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
start_date=start_date,
|
|
||||||
end_date=end_date,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Failed to get sales data: {str(e)}")
|
|
||||||
|
|
||||||
async def get_sales_aggregation(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
group_by: str = "daily",
|
|
||||||
product_name: Optional[str] = None
|
|
||||||
) -> List[Dict[str, Any]]:
|
|
||||||
"""Get aggregated sales data for analytics"""
|
|
||||||
try:
|
|
||||||
# Determine date truncation based on group_by
|
|
||||||
if group_by == "daily":
|
|
||||||
date_trunc = "day"
|
|
||||||
elif group_by == "weekly":
|
|
||||||
date_trunc = "week"
|
|
||||||
elif group_by == "monthly":
|
|
||||||
date_trunc = "month"
|
|
||||||
else:
|
|
||||||
raise ValidationError(f"Invalid group_by value: {group_by}")
|
|
||||||
|
|
||||||
# Build base query
|
|
||||||
if self.session.bind.dialect.name == 'postgresql':
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
DATE_TRUNC(:date_trunc, date) as period,
|
|
||||||
product_name,
|
|
||||||
COUNT(*) as record_count,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(quantity_sold) as average_quantity,
|
|
||||||
AVG(revenue) as average_revenue
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
""")
|
|
||||||
else:
|
|
||||||
# SQLite fallback
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
DATE(date) as period,
|
|
||||||
product_name,
|
|
||||||
COUNT(*) as record_count,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(quantity_sold) as average_quantity,
|
|
||||||
AVG(revenue) as average_revenue
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
""")
|
|
||||||
|
|
||||||
params = {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"date_trunc": date_trunc
|
|
||||||
}
|
|
||||||
|
|
||||||
# Add date filters
|
|
||||||
if start_date:
|
|
||||||
query = text(str(query) + " AND date >= :start_date")
|
|
||||||
params["start_date"] = self._ensure_utc_datetime(start_date)
|
|
||||||
|
|
||||||
if end_date:
|
|
||||||
query = text(str(query) + " AND date <= :end_date")
|
|
||||||
params["end_date"] = self._ensure_utc_datetime(end_date)
|
|
||||||
|
|
||||||
# Add product filter
|
|
||||||
if product_name:
|
|
||||||
query = text(str(query) + " AND product_name = :product_name")
|
|
||||||
params["product_name"] = product_name
|
|
||||||
|
|
||||||
# Add GROUP BY and ORDER BY
|
|
||||||
query = text(str(query) + " GROUP BY period, product_name ORDER BY period DESC")
|
|
||||||
|
|
||||||
result = await self.session.execute(query, params)
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
# Convert to list of dictionaries
|
|
||||||
aggregations = []
|
|
||||||
for row in rows:
|
|
||||||
aggregations.append({
|
|
||||||
"period": group_by,
|
|
||||||
"date": row.period,
|
|
||||||
"product_name": row.product_name,
|
|
||||||
"record_count": row.record_count,
|
|
||||||
"total_quantity": row.total_quantity,
|
|
||||||
"total_revenue": float(row.total_revenue),
|
|
||||||
"average_quantity": float(row.average_quantity),
|
|
||||||
"average_revenue": float(row.average_revenue)
|
|
||||||
})
|
|
||||||
|
|
||||||
return aggregations
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get sales aggregation",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
group_by=group_by,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Sales aggregation failed: {str(e)}")
|
|
||||||
|
|
||||||
async def get_top_products(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
limit: int = 10,
|
|
||||||
by_metric: str = "revenue"
|
|
||||||
) -> List[Dict[str, Any]]:
|
|
||||||
"""Get top products by quantity or revenue"""
|
|
||||||
try:
|
|
||||||
if by_metric not in ["revenue", "quantity"]:
|
|
||||||
raise ValidationError(f"Invalid metric: {by_metric}")
|
|
||||||
|
|
||||||
# Choose the aggregation column
|
|
||||||
metric_column = "revenue" if by_metric == "revenue" else "quantity_sold"
|
|
||||||
|
|
||||||
query = text(f"""
|
|
||||||
SELECT
|
|
||||||
product_name,
|
|
||||||
COUNT(*) as sale_count,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(revenue) as avg_revenue_per_sale
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
{('AND date >= :start_date' if start_date else '')}
|
|
||||||
{('AND date <= :end_date' if end_date else '')}
|
|
||||||
GROUP BY product_name
|
|
||||||
ORDER BY SUM({metric_column}) DESC
|
|
||||||
LIMIT :limit
|
|
||||||
""")
|
|
||||||
|
|
||||||
params = {"tenant_id": tenant_id, "limit": limit}
|
|
||||||
if start_date:
|
|
||||||
params["start_date"] = self._ensure_utc_datetime(start_date)
|
|
||||||
if end_date:
|
|
||||||
params["end_date"] = self._ensure_utc_datetime(end_date)
|
|
||||||
|
|
||||||
result = await self.session.execute(query, params)
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
products = []
|
|
||||||
for row in rows:
|
|
||||||
products.append({
|
|
||||||
"product_name": row.product_name,
|
|
||||||
"sale_count": row.sale_count,
|
|
||||||
"total_quantity": row.total_quantity,
|
|
||||||
"total_revenue": float(row.total_revenue),
|
|
||||||
"avg_revenue_per_sale": float(row.avg_revenue_per_sale),
|
|
||||||
"metric_used": by_metric
|
|
||||||
})
|
|
||||||
|
|
||||||
return products
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get top products",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
by_metric=by_metric,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Top products query failed: {str(e)}")
|
|
||||||
|
|
||||||
async def get_sales_by_location(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None
|
|
||||||
) -> List[Dict[str, Any]]:
|
|
||||||
"""Get sales statistics by location"""
|
|
||||||
try:
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
COALESCE(location_id, 'unknown') as location_id,
|
|
||||||
COUNT(*) as sale_count,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(revenue) as avg_revenue_per_sale
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
{date_filters}
|
|
||||||
GROUP BY location_id
|
|
||||||
ORDER BY SUM(revenue) DESC
|
|
||||||
""".format(
|
|
||||||
date_filters=(
|
|
||||||
"AND date >= :start_date" if start_date else ""
|
|
||||||
) + (
|
|
||||||
" AND date <= :end_date" if end_date else ""
|
|
||||||
)
|
|
||||||
))
|
|
||||||
|
|
||||||
params = {"tenant_id": tenant_id}
|
|
||||||
if start_date:
|
|
||||||
params["start_date"] = self._ensure_utc_datetime(start_date)
|
|
||||||
if end_date:
|
|
||||||
params["end_date"] = self._ensure_utc_datetime(end_date)
|
|
||||||
|
|
||||||
result = await self.session.execute(query, params)
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
locations = []
|
|
||||||
for row in rows:
|
|
||||||
locations.append({
|
|
||||||
"location_id": row.location_id,
|
|
||||||
"sale_count": row.sale_count,
|
|
||||||
"total_quantity": row.total_quantity,
|
|
||||||
"total_revenue": float(row.total_revenue),
|
|
||||||
"avg_revenue_per_sale": float(row.avg_revenue_per_sale)
|
|
||||||
})
|
|
||||||
|
|
||||||
return locations
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get sales by location",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Sales by location query failed: {str(e)}")
|
|
||||||
|
|
||||||
async def create_bulk_sales(
|
|
||||||
self,
|
|
||||||
sales_records: List[Dict[str, Any]],
|
|
||||||
tenant_id: str
|
|
||||||
) -> List[SalesData]:
|
|
||||||
"""Create multiple sales records in bulk"""
|
|
||||||
try:
|
|
||||||
# Ensure all records have tenant_id
|
|
||||||
for record in sales_records:
|
|
||||||
record["tenant_id"] = tenant_id
|
|
||||||
# Ensure dates are timezone-aware
|
|
||||||
if "date" in record and record["date"]:
|
|
||||||
record["date"] = self._ensure_utc_datetime(record["date"])
|
|
||||||
|
|
||||||
return await self.bulk_create(sales_records)
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to create bulk sales",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
record_count=len(sales_records),
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Bulk sales creation failed: {str(e)}")
|
|
||||||
|
|
||||||
async def search_sales(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
search_term: str,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 100
|
|
||||||
) -> List[SalesData]:
|
|
||||||
"""Search sales by product name or notes"""
|
|
||||||
try:
|
|
||||||
# Use the parent search method with sales-specific fields
|
|
||||||
search_fields = ["product_name", "notes", "location_id"]
|
|
||||||
|
|
||||||
# Filter by tenant first
|
|
||||||
query = select(self.model).where(
|
|
||||||
and_(
|
|
||||||
self.model.tenant_id == tenant_id,
|
|
||||||
or_(
|
|
||||||
self.model.product_name.ilike(f"%{search_term}%"),
|
|
||||||
self.model.notes.ilike(f"%{search_term}%") if hasattr(self.model, 'notes') else False,
|
|
||||||
self.model.location_id.ilike(f"%{search_term}%") if hasattr(self.model, 'location_id') else False
|
|
||||||
)
|
|
||||||
)
|
|
||||||
).order_by(desc(self.model.date)).offset(skip).limit(limit)
|
|
||||||
|
|
||||||
result = await self.session.execute(query)
|
|
||||||
return result.scalars().all()
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to search sales",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
search_term=search_term,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Sales search failed: {str(e)}")
|
|
||||||
|
|
||||||
async def get_sales_summary(
|
|
||||||
self,
|
|
||||||
tenant_id: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None
|
|
||||||
) -> Dict[str, Any]:
|
|
||||||
"""Get comprehensive sales summary for a tenant"""
|
|
||||||
try:
|
|
||||||
base_filters = {"tenant_id": tenant_id}
|
|
||||||
|
|
||||||
# Build date filter for count
|
|
||||||
date_query = select(func.count(self.model.id)).where(self.model.tenant_id == tenant_id)
|
|
||||||
|
|
||||||
if start_date:
|
|
||||||
date_query = date_query.where(self.model.date >= self._ensure_utc_datetime(start_date))
|
|
||||||
if end_date:
|
|
||||||
date_query = date_query.where(self.model.date <= self._ensure_utc_datetime(end_date))
|
|
||||||
|
|
||||||
# Get basic counts
|
|
||||||
total_result = await self.session.execute(date_query)
|
|
||||||
total_sales = total_result.scalar() or 0
|
|
||||||
|
|
||||||
# Get revenue and quantity totals
|
|
||||||
summary_query = text("""
|
|
||||||
SELECT
|
|
||||||
COUNT(*) as total_records,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(revenue) as avg_revenue,
|
|
||||||
MIN(date) as earliest_sale,
|
|
||||||
MAX(date) as latest_sale,
|
|
||||||
COUNT(DISTINCT product_name) as unique_products,
|
|
||||||
COUNT(DISTINCT location_id) as unique_locations
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
{date_filters}
|
|
||||||
""".format(
|
|
||||||
date_filters=(
|
|
||||||
"AND date >= :start_date" if start_date else ""
|
|
||||||
) + (
|
|
||||||
" AND date <= :end_date" if end_date else ""
|
|
||||||
)
|
|
||||||
))
|
|
||||||
|
|
||||||
params = {"tenant_id": tenant_id}
|
|
||||||
if start_date:
|
|
||||||
params["start_date"] = self._ensure_utc_datetime(start_date)
|
|
||||||
if end_date:
|
|
||||||
params["end_date"] = self._ensure_utc_datetime(end_date)
|
|
||||||
|
|
||||||
result = await self.session.execute(summary_query, params)
|
|
||||||
row = result.fetchone()
|
|
||||||
|
|
||||||
if row:
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"period_start": start_date,
|
|
||||||
"period_end": end_date,
|
|
||||||
"total_sales": row.total_records or 0,
|
|
||||||
"total_quantity": row.total_quantity or 0,
|
|
||||||
"total_revenue": float(row.total_revenue or 0),
|
|
||||||
"average_revenue": float(row.avg_revenue or 0),
|
|
||||||
"earliest_sale": row.earliest_sale,
|
|
||||||
"latest_sale": row.latest_sale,
|
|
||||||
"unique_products": row.unique_products or 0,
|
|
||||||
"unique_locations": row.unique_locations or 0
|
|
||||||
}
|
|
||||||
else:
|
|
||||||
return {
|
|
||||||
"tenant_id": tenant_id,
|
|
||||||
"period_start": start_date,
|
|
||||||
"period_end": end_date,
|
|
||||||
"total_sales": 0,
|
|
||||||
"total_quantity": 0,
|
|
||||||
"total_revenue": 0.0,
|
|
||||||
"average_revenue": 0.0,
|
|
||||||
"earliest_sale": None,
|
|
||||||
"latest_sale": None,
|
|
||||||
"unique_products": 0,
|
|
||||||
"unique_locations": 0
|
|
||||||
}
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get sales summary",
|
|
||||||
tenant_id=tenant_id,
|
|
||||||
error=str(e))
|
|
||||||
raise DatabaseError(f"Sales summary failed: {str(e)}")
|
|
||||||
|
|
||||||
async def validate_sales_data(self, sales_data: Dict[str, Any]) -> Dict[str, Any]:
|
|
||||||
"""Validate sales data before insertion"""
|
|
||||||
errors = []
|
|
||||||
warnings = []
|
|
||||||
|
|
||||||
try:
|
|
||||||
# Check required fields
|
|
||||||
required_fields = ["date", "product_name", "quantity_sold", "revenue"]
|
|
||||||
for field in required_fields:
|
|
||||||
if field not in sales_data or sales_data[field] is None:
|
|
||||||
errors.append(f"Missing required field: {field}")
|
|
||||||
|
|
||||||
# Validate data types and ranges
|
|
||||||
if "quantity_sold" in sales_data:
|
|
||||||
if not isinstance(sales_data["quantity_sold"], (int, float)) or sales_data["quantity_sold"] <= 0:
|
|
||||||
errors.append("quantity_sold must be a positive number")
|
|
||||||
|
|
||||||
if "revenue" in sales_data:
|
|
||||||
if not isinstance(sales_data["revenue"], (int, float)) or sales_data["revenue"] <= 0:
|
|
||||||
errors.append("revenue must be a positive number")
|
|
||||||
|
|
||||||
# Validate string lengths
|
|
||||||
if "product_name" in sales_data and len(str(sales_data["product_name"])) > 255:
|
|
||||||
errors.append("product_name exceeds maximum length of 255 characters")
|
|
||||||
|
|
||||||
# Check for suspicious data
|
|
||||||
if "quantity_sold" in sales_data and "revenue" in sales_data:
|
|
||||||
unit_price = sales_data["revenue"] / sales_data["quantity_sold"]
|
|
||||||
if unit_price > 10000: # Arbitrary high price threshold
|
|
||||||
warnings.append(f"Unusually high unit price: {unit_price:.2f}")
|
|
||||||
elif unit_price < 0.01: # Very low price
|
|
||||||
warnings.append(f"Unusually low unit price: {unit_price:.2f}")
|
|
||||||
|
|
||||||
return {
|
|
||||||
"is_valid": len(errors) == 0,
|
|
||||||
"errors": errors,
|
|
||||||
"warnings": warnings
|
|
||||||
}
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to validate sales data", error=str(e))
|
|
||||||
return {
|
|
||||||
"is_valid": False,
|
|
||||||
"errors": [f"Validation error: {str(e)}"],
|
|
||||||
"warnings": []
|
|
||||||
}
|
|
||||||
|
|
||||||
async def get_product_statistics(self, tenant_id: str) -> List[Dict[str, Any]]:
|
|
||||||
"""Get product statistics for tenant"""
|
|
||||||
try:
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
product_name,
|
|
||||||
COUNT(*) as total_sales,
|
|
||||||
SUM(quantity_sold) as total_quantity,
|
|
||||||
SUM(revenue) as total_revenue,
|
|
||||||
AVG(revenue) as avg_revenue,
|
|
||||||
MIN(date) as first_sale,
|
|
||||||
MAX(date) as last_sale
|
|
||||||
FROM sales_data
|
|
||||||
WHERE tenant_id = :tenant_id
|
|
||||||
GROUP BY product_name
|
|
||||||
ORDER BY SUM(revenue) DESC
|
|
||||||
""")
|
|
||||||
|
|
||||||
result = await self.session.execute(query, {"tenant_id": tenant_id})
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
products = []
|
|
||||||
for row in rows:
|
|
||||||
products.append({
|
|
||||||
"product_name": row.product_name,
|
|
||||||
"total_sales": int(row.total_sales or 0),
|
|
||||||
"total_quantity": int(row.total_quantity or 0),
|
|
||||||
"total_revenue": float(row.total_revenue or 0),
|
|
||||||
"avg_revenue": float(row.avg_revenue or 0),
|
|
||||||
"first_sale": row.first_sale.isoformat() if row.first_sale else None,
|
|
||||||
"last_sale": row.last_sale.isoformat() if row.last_sale else None
|
|
||||||
})
|
|
||||||
|
|
||||||
logger.debug(f"Found {len(products)} products for tenant {tenant_id}")
|
|
||||||
return products
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error(f"Error getting product statistics: {str(e)}", tenant_id=tenant_id)
|
|
||||||
return []
|
|
||||||
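# Illustrative sketch (not part of the deleted file above): how the repository
# would typically be driven from a service layer. The session argument and
# tenant id are assumed to come from the caller.
async def _demo_weekly_revenue(session: AsyncSession, tenant_id: str) -> list:
    repo = SalesRepository(SalesData, session)
    # Weekly revenue and quantity aggregates per product for one tenant.
    return await repo.get_sales_aggregation(tenant_id=tenant_id, group_by="weekly")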
@@ -1,874 +0,0 @@
# ================================================================
|
|
||||||
# services/data/app/repositories/traffic_repository.py
|
|
||||||
# ================================================================
|
|
||||||
"""
|
|
||||||
Traffic Repository - Enhanced for multiple cities with comprehensive data access patterns
|
|
||||||
Follows existing repository architecture while adding city-specific functionality
|
|
||||||
"""
|
|
||||||
|
|
||||||
from typing import Optional, List, Dict, Any, Type, Tuple
|
|
||||||
from sqlalchemy.ext.asyncio import AsyncSession
|
|
||||||
from sqlalchemy import select, and_, or_, func, desc, asc, text, update, delete
|
|
||||||
from sqlalchemy.orm import selectinload
|
|
||||||
from datetime import datetime, timezone, timedelta
|
|
||||||
import structlog
|
|
||||||
|
|
||||||
from .base import DataBaseRepository
|
|
||||||
from app.models.traffic import TrafficData, TrafficMeasurementPoint, TrafficDataBackgroundJob
|
|
||||||
from app.schemas.traffic import TrafficDataCreate, TrafficDataResponse
|
|
||||||
from shared.database.exceptions import DatabaseError, ValidationError
|
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
|
||||||
|
|
||||||
|
|
||||||
class TrafficRepository(DataBaseRepository[TrafficData, TrafficDataCreate, Dict]):
|
|
||||||
"""
|
|
||||||
Enhanced repository for traffic data operations across multiple cities
|
|
||||||
Provides city-aware queries and advanced traffic analytics
|
|
||||||
"""
|
|
||||||
|
|
||||||
def __init__(self, model_class: Type, session: AsyncSession, cache_ttl: Optional[int] = 300):
|
|
||||||
super().__init__(model_class, session, cache_ttl)
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# CORE TRAFFIC DATA OPERATIONS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def get_by_location_and_date_range(
|
|
||||||
self,
|
|
||||||
latitude: float,
|
|
||||||
longitude: float,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
city: Optional[str] = None,
|
|
||||||
tenant_id: Optional[str] = None,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 100
|
|
||||||
) -> List[TrafficData]:
|
|
||||||
"""Get traffic data by location and date range with city filtering"""
|
|
||||||
try:
|
|
||||||
location_id = f"{latitude:.4f},{longitude:.4f}"
|
|
||||||
|
|
||||||
# Build base query
|
|
||||||
query = select(self.model).where(self.model.location_id == location_id)
|
|
||||||
|
|
||||||
# Add city filter if specified
|
|
||||||
if city:
|
|
||||||
query = query.where(self.model.city == city)
|
|
||||||
|
|
||||||
# Add tenant filter if specified
|
|
||||||
if tenant_id:
|
|
||||||
query = query.where(self.model.tenant_id == tenant_id)
|
|
||||||
|
|
||||||
# Add date range filters
|
|
||||||
if start_date:
|
|
||||||
start_date = self._ensure_utc_datetime(start_date)
|
|
||||||
query = query.where(self.model.date >= start_date)
|
|
||||||
|
|
||||||
if end_date:
|
|
||||||
end_date = self._ensure_utc_datetime(end_date)
|
|
||||||
query = query.where(self.model.date <= end_date)
|
|
||||||
|
|
||||||
# Order by date descending (most recent first)
|
|
||||||
query = query.order_by(desc(self.model.date))
|
|
||||||
|
|
||||||
# Apply pagination
|
|
||||||
query = query.offset(skip).limit(limit)
|
|
||||||
|
|
||||||
result = await self.session.execute(query)
|
|
||||||
return result.scalars().all()
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get traffic data by location and date range",
|
|
||||||
latitude=latitude, longitude=longitude,
|
|
||||||
city=city, error=str(e))
|
|
||||||
raise DatabaseError(f"Failed to get traffic data: {str(e)}")
|
|
||||||
|
|
||||||
async def get_by_city_and_date_range(
|
|
||||||
self,
|
|
||||||
city: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
district: Optional[str] = None,
|
|
||||||
measurement_point_ids: Optional[List[str]] = None,
|
|
||||||
include_synthetic: bool = True,
|
|
||||||
tenant_id: Optional[str] = None,
|
|
||||||
skip: int = 0,
|
|
||||||
limit: int = 1000
|
|
||||||
) -> List[TrafficData]:
|
|
||||||
"""Get traffic data by city with advanced filtering options"""
|
|
||||||
try:
|
|
||||||
# Build base query
|
|
||||||
query = select(self.model).where(self.model.city == city)
|
|
||||||
|
|
||||||
# Add tenant filter if specified
|
|
||||||
if tenant_id:
|
|
||||||
query = query.where(self.model.tenant_id == tenant_id)
|
|
||||||
|
|
||||||
# Add date range filters
|
|
||||||
if start_date:
|
|
||||||
start_date = self._ensure_utc_datetime(start_date)
|
|
||||||
query = query.where(self.model.date >= start_date)
|
|
||||||
|
|
||||||
if end_date:
|
|
||||||
end_date = self._ensure_utc_datetime(end_date)
|
|
||||||
query = query.where(self.model.date <= end_date)
|
|
||||||
|
|
||||||
# Add district filter
|
|
||||||
if district:
|
|
||||||
query = query.where(self.model.district == district)
|
|
||||||
|
|
||||||
# Add measurement point filter
|
|
||||||
if measurement_point_ids:
|
|
||||||
query = query.where(self.model.measurement_point_id.in_(measurement_point_ids))
|
|
||||||
|
|
||||||
# Filter synthetic data if requested
|
|
||||||
if not include_synthetic:
|
|
||||||
query = query.where(self.model.is_synthetic == False)
|
|
||||||
|
|
||||||
# Order by date and measurement point
|
|
||||||
query = query.order_by(desc(self.model.date), self.model.measurement_point_id)
|
|
||||||
|
|
||||||
# Apply pagination
|
|
||||||
query = query.offset(skip).limit(limit)
|
|
||||||
|
|
||||||
result = await self.session.execute(query)
|
|
||||||
return result.scalars().all()
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get traffic data by city",
|
|
||||||
city=city, district=district, error=str(e))
|
|
||||||
raise DatabaseError(f"Failed to get traffic data: {str(e)}")
|
|
||||||
|
|
||||||
async def get_latest_by_measurement_points(
|
|
||||||
self,
|
|
||||||
measurement_point_ids: List[str],
|
|
||||||
city: str,
|
|
||||||
hours_back: int = 24
|
|
||||||
) -> List[TrafficData]:
|
|
||||||
"""Get latest traffic data for specific measurement points"""
|
|
||||||
try:
|
|
||||||
cutoff_time = datetime.now(timezone.utc) - timedelta(hours=hours_back)
|
|
||||||
|
|
||||||
query = select(self.model).where(
|
|
||||||
and_(
|
|
||||||
self.model.city == city,
|
|
||||||
self.model.measurement_point_id.in_(measurement_point_ids),
|
|
||||||
self.model.date >= cutoff_time
|
|
||||||
)
|
|
||||||
).order_by(
|
|
||||||
self.model.measurement_point_id,
|
|
||||||
desc(self.model.date)
|
|
||||||
)
|
|
||||||
|
|
||||||
result = await self.session.execute(query)
|
|
||||||
all_records = result.scalars().all()
|
|
||||||
|
|
||||||
# Get the latest record for each measurement point
|
|
||||||
latest_records = {}
|
|
||||||
for record in all_records:
|
|
||||||
point_id = record.measurement_point_id
|
|
||||||
if point_id not in latest_records:
|
|
||||||
latest_records[point_id] = record
|
|
||||||
|
|
||||||
return list(latest_records.values())
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get latest traffic data by measurement points",
|
|
||||||
city=city, points=len(measurement_point_ids), error=str(e))
|
|
||||||
raise DatabaseError(f"Failed to get latest traffic data: {str(e)}")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# ANALYTICS AND AGGREGATIONS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def get_traffic_statistics_by_city(
|
|
||||||
self,
|
|
||||||
city: str,
|
|
||||||
start_date: Optional[datetime] = None,
|
|
||||||
end_date: Optional[datetime] = None,
|
|
||||||
group_by: str = "daily"
|
|
||||||
) -> List[Dict[str, Any]]:
|
|
||||||
"""Get aggregated traffic statistics by city"""
|
|
||||||
try:
|
|
||||||
# Determine date truncation based on group_by
|
|
||||||
if group_by == "hourly":
|
|
||||||
date_trunc = "hour"
|
|
||||||
elif group_by == "daily":
|
|
||||||
date_trunc = "day"
|
|
||||||
elif group_by == "weekly":
|
|
||||||
date_trunc = "week"
|
|
||||||
elif group_by == "monthly":
|
|
||||||
date_trunc = "month"
|
|
||||||
else:
|
|
||||||
raise ValidationError(f"Invalid group_by value: {group_by}")
|
|
||||||
|
|
||||||
# Build aggregation query
|
|
||||||
if self.session.bind.dialect.name == 'postgresql':
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
DATE_TRUNC(:date_trunc, date) as period,
|
|
||||||
city,
|
|
||||||
district,
|
|
||||||
COUNT(*) as record_count,
|
|
||||||
AVG(traffic_volume) as avg_traffic_volume,
|
|
||||||
MAX(traffic_volume) as max_traffic_volume,
|
|
||||||
AVG(pedestrian_count) as avg_pedestrian_count,
|
|
||||||
AVG(average_speed) as avg_speed,
|
|
||||||
COUNT(CASE WHEN congestion_level = 'high' THEN 1 END) as high_congestion_count,
|
|
||||||
COUNT(CASE WHEN is_synthetic = false THEN 1 END) as real_data_count,
|
|
||||||
COUNT(CASE WHEN has_pedestrian_inference = true THEN 1 END) as pedestrian_inference_count
|
|
||||||
FROM traffic_data
|
|
||||||
WHERE city = :city
|
|
||||||
""")
|
|
||||||
else:
|
|
||||||
# SQLite fallback
|
|
||||||
query = text("""
|
|
||||||
SELECT
|
|
||||||
DATE(date) as period,
|
|
||||||
city,
|
|
||||||
district,
|
|
||||||
COUNT(*) as record_count,
|
|
||||||
AVG(traffic_volume) as avg_traffic_volume,
|
|
||||||
MAX(traffic_volume) as max_traffic_volume,
|
|
||||||
AVG(pedestrian_count) as avg_pedestrian_count,
|
|
||||||
AVG(average_speed) as avg_speed,
|
|
||||||
SUM(CASE WHEN congestion_level = 'high' THEN 1 ELSE 0 END) as high_congestion_count,
|
|
||||||
SUM(CASE WHEN is_synthetic = 0 THEN 1 ELSE 0 END) as real_data_count,
|
|
||||||
SUM(CASE WHEN has_pedestrian_inference = 1 THEN 1 ELSE 0 END) as pedestrian_inference_count
|
|
||||||
FROM traffic_data
|
|
||||||
WHERE city = :city
|
|
||||||
""")
|
|
||||||
|
|
||||||
params = {
|
|
||||||
"city": city,
|
|
||||||
"date_trunc": date_trunc
|
|
||||||
}
|
|
||||||
|
|
||||||
# Add date filters
|
|
||||||
if start_date:
|
|
||||||
query = text(str(query) + " AND date >= :start_date")
|
|
||||||
params["start_date"] = self._ensure_utc_datetime(start_date)
|
|
||||||
|
|
||||||
if end_date:
|
|
||||||
query = text(str(query) + " AND date <= :end_date")
|
|
||||||
params["end_date"] = self._ensure_utc_datetime(end_date)
|
|
||||||
|
|
||||||
# Add GROUP BY and ORDER BY
|
|
||||||
query = text(str(query) + " GROUP BY period, city, district ORDER BY period DESC")
|
|
||||||
|
|
||||||
result = await self.session.execute(query, params)
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
# Convert to list of dictionaries
|
|
||||||
statistics = []
|
|
||||||
for row in rows:
|
|
||||||
statistics.append({
|
|
||||||
"period": group_by,
|
|
||||||
"date": row.period,
|
|
||||||
"city": row.city,
|
|
||||||
"district": row.district,
|
|
||||||
"record_count": row.record_count,
|
|
||||||
"avg_traffic_volume": float(row.avg_traffic_volume or 0),
|
|
||||||
"max_traffic_volume": row.max_traffic_volume or 0,
|
|
||||||
"avg_pedestrian_count": float(row.avg_pedestrian_count or 0),
|
|
||||||
"avg_speed": float(row.avg_speed or 0),
|
|
||||||
"high_congestion_count": row.high_congestion_count or 0,
|
|
||||||
"real_data_percentage": round((row.real_data_count or 0) / max(1, row.record_count) * 100, 2),
|
|
||||||
"pedestrian_inference_percentage": round((row.pedestrian_inference_count or 0) / max(1, row.record_count) * 100, 2)
|
|
||||||
})
|
|
||||||
|
|
||||||
return statistics
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get traffic statistics by city",
|
|
||||||
city=city, group_by=group_by, error=str(e))
|
|
||||||
raise DatabaseError(f"Traffic statistics query failed: {str(e)}")
|
|
||||||
|
|
||||||
async def get_congestion_heatmap_data(
|
|
||||||
self,
|
|
||||||
city: str,
|
|
||||||
start_date: datetime,
|
|
||||||
end_date: datetime,
|
|
||||||
time_granularity: str = "hour"
|
|
||||||
) -> List[Dict[str, Any]]:
|
|
||||||
"""Get congestion data for heatmap visualization"""
|
|
||||||
try:
|
|
||||||
if time_granularity == "hour":
|
|
||||||
time_extract = "EXTRACT(hour FROM date)"
|
|
||||||
elif time_granularity == "day_of_week":
|
|
||||||
time_extract = "EXTRACT(dow FROM date)"
|
|
||||||
else:
|
|
||||||
time_extract = "EXTRACT(hour FROM date)"
|
|
||||||
|
|
||||||
query = text(f"""
|
|
||||||
SELECT
|
|
||||||
{time_extract} as time_period,
|
|
||||||
district,
|
|
||||||
measurement_point_id,
|
|
||||||
latitude,
|
|
||||||
longitude,
|
|
||||||
AVG(CASE
|
|
||||||
WHEN congestion_level = 'low' THEN 1
|
|
||||||
WHEN congestion_level = 'medium' THEN 2
|
|
||||||
WHEN congestion_level = 'high' THEN 3
|
|
||||||
WHEN congestion_level = 'blocked' THEN 4
|
|
||||||
ELSE 1
|
|
||||||
END) as avg_congestion_score,
|
|
||||||
COUNT(*) as data_points,
|
|
||||||
AVG(traffic_volume) as avg_traffic_volume,
|
|
||||||
AVG(pedestrian_count) as avg_pedestrian_count
|
|
||||||
FROM traffic_data
|
|
||||||
WHERE city = :city
|
|
||||||
AND date >= :start_date
|
|
||||||
AND date <= :end_date
|
|
||||||
AND latitude IS NOT NULL
|
|
||||||
AND longitude IS NOT NULL
|
|
||||||
GROUP BY time_period, district, measurement_point_id, latitude, longitude
|
|
||||||
ORDER BY time_period, district, avg_congestion_score DESC
|
|
||||||
""")
|
|
||||||
|
|
||||||
params = {
|
|
||||||
"city": city,
|
|
||||||
"start_date": self._ensure_utc_datetime(start_date),
|
|
||||||
"end_date": self._ensure_utc_datetime(end_date)
|
|
||||||
}
|
|
||||||
|
|
||||||
result = await self.session.execute(query, params)
|
|
||||||
rows = result.fetchall()
|
|
||||||
|
|
||||||
heatmap_data = []
|
|
||||||
for row in rows:
|
|
||||||
heatmap_data.append({
|
|
||||||
"time_period": int(row.time_period or 0),
|
|
||||||
"district": row.district,
|
|
||||||
"measurement_point_id": row.measurement_point_id,
|
|
||||||
"latitude": float(row.latitude),
|
|
||||||
"longitude": float(row.longitude),
|
|
||||||
"avg_congestion_score": float(row.avg_congestion_score),
|
|
||||||
"data_points": row.data_points,
|
|
||||||
"avg_traffic_volume": float(row.avg_traffic_volume or 0),
|
|
||||||
"avg_pedestrian_count": float(row.avg_pedestrian_count or 0)
|
|
||||||
})
|
|
||||||
|
|
||||||
return heatmap_data
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to get congestion heatmap data",
|
|
||||||
city=city, error=str(e))
|
|
||||||
raise DatabaseError(f"Congestion heatmap query failed: {str(e)}")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# BULK OPERATIONS AND DATA MANAGEMENT
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def create_bulk_traffic_data(
|
|
||||||
self,
|
|
||||||
traffic_records: List[Dict[str, Any]],
|
|
||||||
city: str,
|
|
||||||
tenant_id: Optional[str] = None
|
|
||||||
) -> List[TrafficData]:
|
|
||||||
"""Create multiple traffic records in bulk with enhanced validation"""
|
|
||||||
try:
|
|
||||||
# Ensure all records have city and tenant_id
|
|
||||||
for record in traffic_records:
|
|
||||||
record["city"] = city
|
|
||||||
if tenant_id:
|
|
||||||
record["tenant_id"] = tenant_id
|
|
||||||
# Ensure dates are timezone-aware
|
|
||||||
if "date" in record and record["date"]:
|
|
||||||
record["date"] = self._ensure_utc_datetime(record["date"])
|
|
||||||
|
|
||||||
# Enhanced validation
|
|
||||||
validated_records = []
|
|
||||||
for record in traffic_records:
|
|
||||||
if self._validate_traffic_record(record):
|
|
||||||
validated_records.append(record)
|
|
||||||
else:
|
|
||||||
logger.warning("Invalid traffic record skipped",
|
|
||||||
city=city, record_keys=list(record.keys()))
|
|
||||||
|
|
||||||
if not validated_records:
|
|
||||||
logger.warning("No valid traffic records to create", city=city)
|
|
||||||
return []
|
|
||||||
|
|
||||||
# Use bulk create with deduplication
|
|
||||||
created_records = await self.bulk_create_with_deduplication(validated_records)
|
|
||||||
|
|
||||||
logger.info("Bulk traffic data creation completed",
|
|
||||||
city=city, requested=len(traffic_records),
|
|
||||||
validated=len(validated_records), created=len(created_records))
|
|
||||||
|
|
||||||
return created_records
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to create bulk traffic data",
|
|
||||||
city=city, record_count=len(traffic_records), error=str(e))
|
|
||||||
raise DatabaseError(f"Bulk traffic creation failed: {str(e)}")
|
|
||||||
|
|
||||||
async def bulk_create_with_deduplication(
|
|
||||||
self,
|
|
||||||
records: List[Dict[str, Any]]
|
|
||||||
) -> List[TrafficData]:
|
|
||||||
"""Bulk create with automatic deduplication based on location, city, and date"""
|
|
||||||
try:
|
|
||||||
if not records:
|
|
||||||
return []
|
|
||||||
|
|
||||||
# Extract unique keys for deduplication check
|
|
||||||
unique_keys = []
|
|
||||||
for record in records:
|
|
||||||
key = (
|
|
||||||
record.get('location_id'),
|
|
||||||
record.get('city'),
|
|
||||||
record.get('date'),
|
|
||||||
record.get('measurement_point_id')
|
|
||||||
)
|
|
||||||
unique_keys.append(key)
|
|
||||||
|
|
||||||
# Check for existing records
|
|
||||||
location_ids = [key[0] for key in unique_keys if key[0]]
|
|
||||||
cities = [key[1] for key in unique_keys if key[1]]
|
|
||||||
dates = [key[2] for key in unique_keys if key[2]]
|
|
||||||
|
|
||||||
# For large datasets, use chunked deduplication to avoid memory issues
|
|
||||||
if len(location_ids) > 1000:
|
|
||||||
logger.info(f"Large dataset detected ({len(records)} records), using chunked deduplication")
|
|
||||||
new_records = []
|
|
||||||
chunk_size = 1000
|
|
||||||
|
|
||||||
for i in range(0, len(records), chunk_size):
|
|
||||||
chunk_records = records[i:i + chunk_size]
|
|
||||||
chunk_keys = unique_keys[i:i + chunk_size]
|
|
||||||
|
|
||||||
# Get unique values for this chunk
|
|
||||||
chunk_location_ids = list(set(key[0] for key in chunk_keys if key[0]))
|
|
||||||
chunk_cities = list(set(key[1] for key in chunk_keys if key[1]))
|
|
||||||
chunk_dates = list(set(key[2] for key in chunk_keys if key[2]))
|
|
||||||
|
|
||||||
if chunk_location_ids and chunk_cities and chunk_dates:
|
|
||||||
existing_query = select(
|
|
||||||
self.model.location_id,
|
|
||||||
self.model.city,
|
|
||||||
self.model.date,
|
|
||||||
self.model.measurement_point_id
|
|
||||||
).where(
|
|
||||||
and_(
|
|
||||||
self.model.location_id.in_(chunk_location_ids),
|
|
||||||
self.model.city.in_(chunk_cities),
|
|
||||||
self.model.date.in_(chunk_dates)
|
|
||||||
)
|
|
||||||
)
|
|
||||||
|
|
||||||
result = await self.session.execute(existing_query)
|
|
||||||
chunk_existing_keys = set(result.fetchall())
|
|
||||||
|
|
||||||
# Filter chunk duplicates
|
|
||||||
for j, record in enumerate(chunk_records):
|
|
||||||
key = chunk_keys[j]
|
|
||||||
if key not in chunk_existing_keys:
|
|
||||||
new_records.append(record)
|
|
||||||
else:
|
|
||||||
new_records.extend(chunk_records)
|
|
||||||
|
|
||||||
logger.debug("Chunked deduplication completed",
|
|
||||||
total_records=len(records),
|
|
||||||
new_records=len(new_records))
|
|
||||||
records = new_records
|
|
||||||
|
|
||||||
elif location_ids and cities and dates:
|
|
||||||
existing_query = select(
|
|
||||||
self.model.location_id,
|
|
||||||
self.model.city,
|
|
||||||
self.model.date,
|
|
||||||
self.model.measurement_point_id
|
|
||||||
).where(
|
|
||||||
and_(
|
|
||||||
self.model.location_id.in_(location_ids),
|
|
||||||
self.model.city.in_(cities),
|
|
||||||
self.model.date.in_(dates)
|
|
||||||
)
|
|
||||||
)
|
|
||||||
|
|
||||||
result = await self.session.execute(existing_query)
|
|
||||||
existing_keys = set(result.fetchall())
|
|
||||||
|
|
||||||
# Filter out duplicates
|
|
||||||
new_records = []
|
|
||||||
for i, record in enumerate(records):
|
|
||||||
key = unique_keys[i]
|
|
||||||
if key not in existing_keys:
|
|
||||||
new_records.append(record)
|
|
||||||
|
|
||||||
logger.debug("Standard deduplication completed",
|
|
||||||
total_records=len(records),
|
|
||||||
existing_records=len(existing_keys),
|
|
||||||
new_records=len(new_records))
|
|
||||||
|
|
||||||
records = new_records
|
|
||||||
|
|
||||||
# Proceed with bulk creation
|
|
||||||
return await self.bulk_create(records)
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed bulk create with deduplication", error=str(e))
|
|
||||||
raise DatabaseError(f"Bulk create with deduplication failed: {str(e)}")
|
|
||||||
|
|
||||||
def _validate_traffic_record(self, record: Dict[str, Any]) -> bool:
|
|
||||||
"""Enhanced validation for traffic records"""
|
|
||||||
required_fields = ['date', 'city']
|
|
||||||
|
|
||||||
# Check required fields
|
|
||||||
for field in required_fields:
|
|
||||||
if not record.get(field):
|
|
||||||
return False
|
|
||||||
|
|
||||||
# Validate city
|
|
||||||
city = record.get('city', '').lower()
|
|
||||||
if city not in ['madrid', 'barcelona', 'valencia', 'test']: # Extendable list
|
|
||||||
return False
|
|
||||||
|
|
||||||
# Validate data ranges
|
|
||||||
traffic_volume = record.get('traffic_volume')
|
|
||||||
if traffic_volume is not None and (traffic_volume < 0 or traffic_volume > 50000):
|
|
||||||
return False
|
|
||||||
|
|
||||||
pedestrian_count = record.get('pedestrian_count')
|
|
||||||
if pedestrian_count is not None and (pedestrian_count < 0 or pedestrian_count > 10000):
|
|
||||||
return False
|
|
||||||
|
|
||||||
average_speed = record.get('average_speed')
|
|
||||||
if average_speed is not None and (average_speed < 0 or average_speed > 200):
|
|
||||||
return False
|
|
||||||
|
|
||||||
congestion_level = record.get('congestion_level')
|
|
||||||
if congestion_level and congestion_level not in ['low', 'medium', 'high', 'blocked']:
|
|
||||||
return False
|
|
||||||
|
|
||||||
return True
|
|
||||||
|
|
||||||
    # ================================================================
    # TRAINING DATA SPECIFIC OPERATIONS
    # ================================================================

    async def get_training_data_by_location(
        self,
        latitude: float,
        longitude: float,
        start_date: datetime,
        end_date: datetime,
        tenant_id: Optional[str] = None,
        include_pedestrian_inference: bool = True
    ) -> List[Dict[str, Any]]:
        """Get optimized training data for ML models"""
        try:
            location_id = f"{latitude:.4f},{longitude:.4f}"

            query = select(self.model).where(
                and_(
                    self.model.location_id == location_id,
                    self.model.date >= self._ensure_utc_datetime(start_date),
                    self.model.date <= self._ensure_utc_datetime(end_date)
                )
            )

            if tenant_id:
                query = query.where(self.model.tenant_id == tenant_id)

            if include_pedestrian_inference:
                # Prefer records with pedestrian inference
                query = query.order_by(
                    desc(self.model.has_pedestrian_inference),
                    desc(self.model.data_quality_score),
                    self.model.date
                )
            else:
                query = query.order_by(
                    desc(self.model.data_quality_score),
                    self.model.date
                )

            result = await self.session.execute(query)
            records = result.scalars().all()

            # Convert to training format with enhanced features
            training_data = []
            for record in records:
                training_record = {
                    'date': record.date,
                    'traffic_volume': record.traffic_volume or 0,
                    'pedestrian_count': record.pedestrian_count or 0,
                    'congestion_level': record.congestion_level or 'medium',
                    'average_speed': record.average_speed or 25.0,
                    'city': record.city,
                    'district': record.district,
                    'measurement_point_id': record.measurement_point_id,
                    'source': record.source,
                    'is_synthetic': record.is_synthetic or False,
                    'has_pedestrian_inference': record.has_pedestrian_inference or False,
                    'data_quality_score': record.data_quality_score or 50.0,

                    # Enhanced features for training
                    'hour_of_day': record.date.hour if record.date else 12,
                    'day_of_week': record.date.weekday() if record.date else 0,
                    'month': record.date.month if record.date else 1,

                    # City-specific features
                    'city_specific_data': record.city_specific_data or {}
                }

                training_data.append(training_record)

            logger.info("Retrieved training data",
                        location_id=location_id, records=len(training_data),
                        with_pedestrian_inference=sum(1 for r in training_data if r['has_pedestrian_inference']))

            return training_data

        except Exception as e:
            logger.error("Failed to get training data",
                         latitude=latitude, longitude=longitude, error=str(e))
            raise DatabaseError(f"Training data retrieval failed: {str(e)}")

    async def get_historical_data_by_location(
        self,
        latitude: float,
        longitude: float,
        start_date: datetime,
        end_date: datetime,
        tenant_id: Optional[str] = None
    ) -> List[TrafficData]:
        """Get historical traffic data for a specific location and date range"""
        return await self.get_by_location_and_date_range(
            latitude=latitude,
            longitude=longitude,
            start_date=start_date,
            end_date=end_date,
            tenant_id=tenant_id,
            limit=1000000  # Large limit for historical data
        )

    async def count_records_in_period(
        self,
        latitude: float,
        longitude: float,
        start_date: datetime,
        end_date: datetime,
        city: Optional[str] = None,
        tenant_id: Optional[str] = None
    ) -> int:
        """Count traffic records for a specific location and time period"""
        try:
            location_id = f"{latitude:.4f},{longitude:.4f}"

            query = select(func.count(self.model.id)).where(
                and_(
                    self.model.location_id == location_id,
                    self.model.date >= self._ensure_utc_datetime(start_date),
                    self.model.date <= self._ensure_utc_datetime(end_date)
                )
            )

            if city:
                query = query.where(self.model.city == city)

            if tenant_id:
                query = query.where(self.model.tenant_id == tenant_id)

            result = await self.session.execute(query)
            count = result.scalar()

            return count or 0

        except Exception as e:
            logger.error("Failed to count records in period",
                         latitude=latitude, longitude=longitude, error=str(e))
            raise DatabaseError(f"Record count failed: {str(e)}")

    # ================================================================
    # DATA QUALITY AND MAINTENANCE
    # ================================================================

    async def update_data_quality_scores(self, city: str) -> int:
        """Update data quality scores based on various criteria"""
        try:
            # Calculate quality scores based on data completeness and consistency
            query = text("""
                UPDATE traffic_data
                SET data_quality_score = (
                    CASE WHEN traffic_volume IS NOT NULL THEN 20 ELSE 0 END +
                    CASE WHEN pedestrian_count IS NOT NULL THEN 20 ELSE 0 END +
                    CASE WHEN average_speed IS NOT NULL AND average_speed > 0 THEN 20 ELSE 0 END +
                    CASE WHEN congestion_level IS NOT NULL THEN 15 ELSE 0 END +
                    CASE WHEN measurement_point_id IS NOT NULL THEN 10 ELSE 0 END +
                    CASE WHEN district IS NOT NULL THEN 10 ELSE 0 END +
                    CASE WHEN has_pedestrian_inference = true THEN 5 ELSE 0 END
                ),
                updated_at = :updated_at
                WHERE city = :city AND data_quality_score IS NULL
            """)

            params = {
                "city": city,
                "updated_at": datetime.now(timezone.utc)
            }

            result = await self.session.execute(query, params)
            updated_count = result.rowcount
            await self.session.commit()

            logger.info("Updated data quality scores",
                        city=city, updated_count=updated_count)

            return updated_count

        except Exception as e:
            logger.error("Failed to update data quality scores",
                         city=city, error=str(e))
            await self.session.rollback()
            raise DatabaseError(f"Data quality update failed: {str(e)}")

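    # --- Illustrative note (not part of the original file) ---------------------
    # The UPDATE above scores completeness additively; with every field populated
    # the score reaches 100:
    #   traffic_volume present             +20
    #   pedestrian_count present           +20
    #   average_speed present and > 0      +20
    #   congestion_level present           +15
    #   measurement_point_id present       +10
    #   district present                   +10
    #   has_pedestrian_inference = true     +5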
    async def cleanup_old_synthetic_data(
        self,
        city: str,
        days_to_keep: int = 90
    ) -> int:
        """Clean up old synthetic data while preserving real data"""
        try:
            cutoff_date = datetime.now(timezone.utc) - timedelta(days=days_to_keep)

            query = delete(self.model).where(
                and_(
                    self.model.city == city,
                    self.model.is_synthetic == True,
                    self.model.date < cutoff_date
                )
            )

            result = await self.session.execute(query)
            deleted_count = result.rowcount
            await self.session.commit()

            logger.info("Cleaned up old synthetic data",
                        city=city, deleted_count=deleted_count, days_kept=days_to_keep)

            return deleted_count

        except Exception as e:
            logger.error("Failed to cleanup old synthetic data",
                         city=city, error=str(e))
            await self.session.rollback()
            raise DatabaseError(f"Synthetic data cleanup failed: {str(e)}")

class TrafficMeasurementPointRepository(DataBaseRepository[TrafficMeasurementPoint, Dict, Dict]):
    """Repository for traffic measurement points across cities"""

    async def get_points_near_location(
        self,
        latitude: float,
        longitude: float,
        city: str,
        radius_km: float = 10.0,
        limit: int = 20
    ) -> List[TrafficMeasurementPoint]:
        """Get measurement points near a location using spatial query"""
        try:
            # Simple distance calculation (for more precise, use PostGIS).
            # The distance expression lives in a subquery because PostgreSQL cannot
            # reference a SELECT alias from WHERE/HAVING at the same query level.
            query = text("""
                SELECT * FROM (
                    SELECT *,
                           (6371 * acos(
                               cos(radians(:lat)) * cos(radians(latitude)) *
                               cos(radians(longitude) - radians(:lon)) +
                               sin(radians(:lat)) * sin(radians(latitude))
                           )) AS distance_km
                    FROM traffic_measurement_points
                    WHERE city = :city
                      AND is_active = true
                ) AS nearby
                WHERE distance_km <= :radius_km
                ORDER BY distance_km
                LIMIT :limit
            """)

            params = {
                "lat": latitude,
                "lon": longitude,
                "city": city,
                "radius_km": radius_km,
                "limit": limit
            }

            result = await self.session.execute(query, params)
            rows = result.fetchall()

            # Convert rows to model instances
            points = []
            for row in rows:
                point = TrafficMeasurementPoint()
                for key, value in row._mapping.items():
                    if hasattr(point, key) and key != 'distance_km':
                        setattr(point, key, value)
                points.append(point)

            return points

        except Exception as e:
            logger.error("Failed to get measurement points near location",
                         latitude=latitude, longitude=longitude, city=city, error=str(e))
            raise DatabaseError(f"Measurement points query failed: {str(e)}")

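# --- Illustrative sketch (not part of the original file) -----------------------
# The distance_km expression above is the spherical law of cosines with a
# 6371 km Earth radius. A standalone equivalent, handy for sanity-checking the
# SQL against known coordinates (function name is ours):
import math

def great_circle_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Spherical law of cosines, mirroring the SQL expression (Earth radius 6371 km)."""
    x = (math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) *
         math.cos(math.radians(lon2) - math.radians(lon1)) +
         math.sin(math.radians(lat1)) * math.sin(math.radians(lat2)))
    # Clamp to [-1, 1] so floating-point noise never pushes acos out of its domain
    return 6371 * math.acos(min(1.0, max(-1.0, x)))

# Example: two points in central Madrid are well under 1 km apart
# print(round(great_circle_km(40.4168, -3.7038, 40.4169, -3.7035), 3))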
class TrafficBackgroundJobRepository(DataBaseRepository[TrafficDataBackgroundJob, Dict, Dict]):
    """Repository for managing background traffic data jobs"""

    async def get_pending_jobs_by_city(self, city: str) -> List[TrafficDataBackgroundJob]:
        """Get pending background jobs for a specific city"""
        try:
            query = select(self.model).where(
                and_(
                    self.model.city == city,
                    self.model.status == 'pending'
                )
            ).order_by(self.model.scheduled_at)

            result = await self.session.execute(query)
            return result.scalars().all()

        except Exception as e:
            logger.error("Failed to get pending jobs by city", city=city, error=str(e))
            raise DatabaseError(f"Background jobs query failed: {str(e)}")

    async def update_job_progress(
        self,
        job_id: str,
        progress_percentage: float,
        records_processed: int = 0,
        records_stored: int = 0
    ) -> bool:
        """Update job progress"""
        try:
            query = update(self.model).where(
                self.model.id == job_id
            ).values(
                progress_percentage=progress_percentage,
                records_processed=records_processed,
                records_stored=records_stored,
                updated_at=datetime.now(timezone.utc)
            )

            result = await self.session.execute(query)
            await self.session.commit()

            return result.rowcount > 0

        except Exception as e:
            logger.error("Failed to update job progress", job_id=job_id, error=str(e))
            await self.session.rollback()
            raise DatabaseError(f"Job progress update failed: {str(e)}")

@@ -1,62 +0,0 @@
# ================================================================
# services/data/app/schemas/external.py
# ================================================================
"""External API response schemas"""

from pydantic import BaseModel
from datetime import datetime
from typing import Optional, List

class WeatherDataResponse(BaseModel):
    date: datetime
    temperature: Optional[float]
    precipitation: Optional[float]
    humidity: Optional[float]
    wind_speed: Optional[float]
    pressure: Optional[float]
    description: Optional[str]
    source: str

class WeatherForecastResponse(BaseModel):
    forecast_date: datetime
    generated_at: datetime
    temperature: Optional[float]
    precipitation: Optional[float]
    humidity: Optional[float]
    wind_speed: Optional[float]
    description: Optional[str]
    source: str

class TrafficDataResponse(BaseModel):
    date: datetime
    traffic_volume: Optional[int]
    pedestrian_count: Optional[int]
    congestion_level: Optional[str]
    average_speed: Optional[float]
    source: str

class LocationRequest(BaseModel):
    latitude: float
    longitude: float
    address: Optional[str] = None

class DateRangeRequest(BaseModel):
    start_date: datetime
    end_date: datetime

class HistoricalTrafficRequest(BaseModel):
    latitude: float
    longitude: float
    start_date: datetime
    end_date: datetime

class HistoricalWeatherRequest(BaseModel):
    latitude: float
    longitude: float
    start_date: datetime
    end_date: datetime

class WeatherForecastRequest(BaseModel):
    latitude: float
    longitude: float
    days: int
@@ -1,171 +0,0 @@
# ================================================================
# services/data/app/schemas/sales.py - MISSING FILE
# ================================================================
"""Sales data schemas"""

from pydantic import BaseModel, Field, field_validator
from datetime import datetime
from typing import Optional, List, Dict, Any
from uuid import UUID

class SalesDataCreate(BaseModel):
    """Schema for creating sales data - FIXED to work with gateway"""
    # ✅ FIX: Make tenant_id optional since it comes from URL path
    tenant_id: Optional[UUID] = Field(None, description="Tenant ID (auto-injected from URL path)")
    date: datetime
    product_name: str = Field(..., min_length=1, max_length=255)
    quantity_sold: int = Field(..., gt=0)
    revenue: float = Field(..., gt=0)
    location_id: Optional[str] = Field(None, max_length=100)
    source: str = Field(default="manual", max_length=50)
    notes: Optional[str] = Field(None, max_length=500)

    @field_validator('product_name')
    @classmethod
    def normalize_product_name(cls, v):
        return v.strip().lower()

    class Config:
        from_attributes = True
        json_schema_extra = {
            "example": {
                "date": "2024-01-15T10:00:00Z",
                "product_name": "Pan Integral",
                "quantity_sold": 25,
                "revenue": 37.50,
                "source": "manual"
                # Note: tenant_id is automatically injected from URL path by gateway
            }
        }

class SalesDataResponse(BaseModel):
    """Schema for sales data response"""
    id: UUID
    tenant_id: UUID
    date: datetime
    product_name: str
    quantity_sold: int
    revenue: float
    location_id: Optional[str]
    source: str
    notes: Optional[str]
    created_at: datetime
    updated_at: Optional[datetime]

    class Config:
        from_attributes = True

class SalesDataQuery(BaseModel):
    """Schema for querying sales data"""
    tenant_id: UUID
    start_date: Optional[datetime] = None
    end_date: Optional[datetime] = None
    product_names: Optional[List[str]] = None
    location_ids: Optional[List[str]] = None
    sources: Optional[List[str]] = None
    min_quantity: Optional[int] = None
    max_quantity: Optional[int] = None
    min_revenue: Optional[float] = None
    max_revenue: Optional[float] = None
    limit: Optional[int] = Field(default=1000, le=5000)
    offset: Optional[int] = Field(default=0, ge=0)

    class Config:
        from_attributes = True

class SalesDataImport(BaseModel):
    """Schema for importing sales data - FIXED to work with gateway"""
    # ✅ FIX: Make tenant_id optional since it comes from URL path
    tenant_id: Optional[UUID] = Field(None, description="Tenant ID (auto-injected from URL path)")
    data: str = Field(..., description="JSON string or CSV content")
    data_format: str = Field(..., pattern="^(csv|json|excel)$")
    source: str = Field(default="import", max_length=50)
    validate_only: bool = Field(default=False)

    class Config:
        from_attributes = True
        json_schema_extra = {
            "example": {
                "data": "date,product,quantity,revenue\n2024-01-01,bread,10,25.50",
                "data_format": "csv",
                # Note: tenant_id is automatically injected from URL path by gateway
            }
        }

class SalesDataBulkCreate(BaseModel):
    """Schema for bulk creating sales data"""
    tenant_id: UUID
    records: List[Dict[str, Any]]
    source: str = Field(default="bulk_import", max_length=50)

    class Config:
        from_attributes = True

class SalesValidationResult(BaseModel):
    """Schema for sales data validation result"""
    is_valid: bool
    total_records: int
    valid_records: int
    invalid_records: int
    errors: List[Dict[str, Any]]
    warnings: List[Dict[str, Any]]
    summary: Dict[str, Any]

    class Config:
        from_attributes = True

class SalesImportResult(BaseModel):
    """Complete schema that includes all expected fields"""
    success: bool
    records_processed: int  # total_rows
    records_created: int
    records_updated: int = 0  # Default to 0 if not tracking updates
    records_failed: int  # error_count or calculated
    errors: List[Dict[str, Any]]  # Structured error objects
    warnings: List[Dict[str, Any]]  # Structured warning objects
    processing_time_seconds: float

    # Optional additional fields
    source: Optional[str] = None
    filename: Optional[str] = None
    success_rate: Optional[float] = None

    class Config:
        from_attributes = True

class SalesAggregation(BaseModel):
    """Schema for sales aggregation results"""
    period: str  # "daily", "weekly", "monthly"
    date: datetime
    product_name: Optional[str] = None
    total_quantity: int
    total_revenue: float
    average_quantity: float
    average_revenue: float
    record_count: int

    class Config:
        from_attributes = True

class SalesExportRequest(BaseModel):
    """Schema for sales export request"""
    tenant_id: UUID
    format: str = Field(..., pattern="^(csv|json|excel)$")
    start_date: Optional[datetime] = None
    end_date: Optional[datetime] = None
    product_names: Optional[List[str]] = None
    location_ids: Optional[List[str]] = None
    include_metadata: bool = Field(default=True)

    class Config:
        from_attributes = True

class SalesValidationRequest(BaseModel):
    """Schema for JSON-based sales data validation request"""
    data: str = Field(..., description="Raw data content (CSV, JSON, etc.)")
    data_format: str = Field(..., pattern="^(csv|json|excel)$", description="Format of the data")
    validate_only: bool = Field(default=True, description="Only validate, don't import")
    source: str = Field(default="onboarding_upload", description="Source of the data")

    class Config:
        from_attributes = True
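# --- Illustrative usage sketch (not part of the original file) -----------------
# Shows the gateway-style create payload (no tenant_id in the body) and the
# product-name normalisation applied by the validator above. Values are examples.
payload = {
    "date": "2024-01-15T10:00:00Z",
    "product_name": "  Pan Integral  ",
    "quantity_sold": 25,
    "revenue": 37.50,
}

record = SalesDataCreate(**payload)            # tenant_id stays None until the gateway injects it
assert record.product_name == "pan integral"   # normalize_product_name strips and lowercases
assert record.tenant_id is None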
@@ -1,20 +0,0 @@
"""
Data Service Layer
Business logic services for data operations
"""

from .sales_service import SalesService
from .data_import_service import DataImportService, EnhancedDataImportService
from .traffic_service import TrafficService
from .weather_service import WeatherService
from .messaging import publish_sales_data_imported, publish_data_updated

__all__ = [
    "SalesService",
    "DataImportService",
    "EnhancedDataImportService",
    "TrafficService",
    "WeatherService",
    "publish_sales_data_imported",
    "publish_data_updated"
]
@@ -1,134 +0,0 @@
# ================================================================
# services/data/app/services/messaging.py - FIXED VERSION
# ================================================================
"""Fixed messaging service with proper error handling"""

from shared.messaging.rabbitmq import RabbitMQClient
from app.core.config import settings
import structlog

logger = structlog.get_logger()

# Single global instance
data_publisher = RabbitMQClient(settings.RABBITMQ_URL, "data-service")

async def setup_messaging():
    """Initialize messaging for data service"""
    try:
        success = await data_publisher.connect()
        if success:
            logger.info("Data service messaging initialized")
        else:
            logger.warning("Data service messaging failed to initialize")
        return success
    except Exception as e:
        logger.warning("Failed to setup messaging", error=str(e))
        return False

async def cleanup_messaging():
    """Cleanup messaging for data service"""
    try:
        await data_publisher.disconnect()
        logger.info("Data service messaging cleaned up")
    except Exception as e:
        logger.warning("Error during messaging cleanup", error=str(e))

# Convenience functions for data-specific events with error handling
async def publish_data_imported(data: dict) -> bool:
    """Publish data imported event"""
    try:
        return await data_publisher.publish_data_event("imported", data)
    except Exception as e:
        logger.warning("Failed to publish data imported event", error=str(e))
        return False

async def publish_weather_updated(data: dict) -> bool:
    """Publish weather updated event"""
    try:
        return await data_publisher.publish_data_event("weather.updated", data)
    except Exception as e:
        logger.warning("Failed to publish weather updated event", error=str(e))
        return False

async def publish_traffic_updated(data: dict) -> bool:
    """Publish traffic updated event"""
    try:
        return await data_publisher.publish_data_event("traffic.updated", data)
    except Exception as e:
        logger.warning("Failed to publish traffic updated event", error=str(e))
        return False

async def publish_sales_created(data: dict) -> bool:
    """Publish sales created event"""
    try:
        return await data_publisher.publish_data_event("sales.created", data)
    except Exception as e:
        logger.warning("Failed to publish sales created event", error=str(e))
        return False

async def publish_analytics_generated(data: dict) -> bool:
    """Publish analytics generated event"""
    try:
        return await data_publisher.publish_data_event("analytics.generated", data)
    except Exception as e:
        logger.warning("Failed to publish analytics generated event", error=str(e))
        return False

async def publish_export_completed(data: dict) -> bool:
    """Publish export completed event"""
    try:
        return await data_publisher.publish_data_event("export.completed", data)
    except Exception as e:
        logger.warning("Failed to publish export completed event", error=str(e))
        return False

async def publish_import_started(data: dict) -> bool:
    """Publish import started event"""
    try:
        return await data_publisher.publish_data_event("import.started", data)
    except Exception as e:
        logger.warning("Failed to publish import started event", error=str(e))
        return False

async def publish_import_completed(data: dict) -> bool:
    """Publish import completed event"""
    try:
        return await data_publisher.publish_data_event("import.completed", data)
    except Exception as e:
        logger.warning("Failed to publish import completed event", error=str(e))
        return False

async def publish_import_failed(data: dict) -> bool:
    """Publish import failed event"""
    try:
        return await data_publisher.publish_data_event("import.failed", data)
    except Exception as e:
        logger.warning("Failed to publish import failed event", error=str(e))
        return False

async def publish_sales_data_imported(data: dict) -> bool:
    """Publish sales data imported event"""
    try:
        return await data_publisher.publish_data_event("sales.imported", data)
    except Exception as e:
        logger.warning("Failed to publish sales data imported event", error=str(e))
        return False

async def publish_data_updated(data: dict) -> bool:
    """Publish data updated event"""
    try:
        return await data_publisher.publish_data_event("data.updated", data)
    except Exception as e:
        logger.warning("Failed to publish data updated event", error=str(e))
        return False

# Health check for messaging
async def check_messaging_health() -> dict:
    """Check messaging system health"""
    try:
        if data_publisher.connected:
            return {"status": "healthy", "service": "rabbitmq", "connected": True}
        else:
            return {"status": "unhealthy", "service": "rabbitmq", "connected": False, "error": "Not connected"}
    except Exception as e:
        return {"status": "unhealthy", "service": "rabbitmq", "connected": False, "error": str(e)}
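# --- Illustrative usage sketch (not part of the original file) -----------------
# Shows how a caller might wire the helpers above into a startup/shutdown flow;
# the payload keys are examples only.
import asyncio

async def _messaging_demo():
    if await setup_messaging():
        await publish_sales_data_imported({"tenant_id": "example", "records_created": 42})
    await cleanup_messaging()

if __name__ == "__main__":
    asyncio.run(_messaging_demo())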
@@ -1,292 +0,0 @@
"""
Sales Service with Repository Pattern
Enhanced service using the new repository architecture for better separation of concerns
"""

from typing import List, Dict, Any, Optional
from datetime import datetime
import structlog

from app.repositories.sales_repository import SalesRepository
from app.models.sales import SalesData
from app.schemas.sales import (
    SalesDataCreate,
    SalesDataResponse,
    SalesDataQuery,
    SalesAggregation,
    SalesImportResult,
    SalesValidationResult
)
from shared.database.unit_of_work import UnitOfWork
from shared.database.transactions import transactional
from shared.database.exceptions import DatabaseError, ValidationError

logger = structlog.get_logger()

class SalesService:
    """Enhanced Sales Service using Repository Pattern and Unit of Work"""

    def __init__(self, database_manager):
        """Initialize service with database manager for dependency injection"""
        self.database_manager = database_manager

    async def create_sales_record(self, sales_data: SalesDataCreate, tenant_id: str) -> SalesDataResponse:
        """Create a new sales record using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    # Register sales repository
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # Ensure tenant_id is set
                    record_data = sales_data.model_dump()
                    record_data["tenant_id"] = tenant_id

                    # Validate the data first
                    validation_result = await sales_repo.validate_sales_data(record_data)
                    if not validation_result["is_valid"]:
                        raise ValidationError(f"Invalid sales data: {validation_result['errors']}")

                    # Create the record
                    db_record = await sales_repo.create(record_data)

                    # Commit transaction
                    await uow.commit()

                    logger.debug("Sales record created",
                                 record_id=db_record.id,
                                 product=db_record.product_name,
                                 tenant_id=tenant_id)

                    return SalesDataResponse.model_validate(db_record)

        except ValidationError:
            raise
        except Exception as e:
            logger.error("Failed to create sales record",
                         tenant_id=tenant_id,
                         product=sales_data.product_name,
                         error=str(e))
            raise DatabaseError(f"Failed to create sales record: {str(e)}")

    async def get_sales_data(self, query: SalesDataQuery) -> List[SalesDataResponse]:
        """Get sales data based on query parameters using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # Use repository's advanced query method
                    records = await sales_repo.get_by_tenant_and_date_range(
                        tenant_id=str(query.tenant_id),
                        start_date=query.start_date,
                        end_date=query.end_date,
                        product_names=query.product_names,
                        location_ids=query.location_ids,
                        skip=query.offset or 0,
                        limit=query.limit or 100
                    )

                    logger.debug("Sales data retrieved",
                                 count=len(records),
                                 tenant_id=query.tenant_id)

                    return [SalesDataResponse.model_validate(record) for record in records]

        except Exception as e:
            logger.error("Failed to retrieve sales data",
                         tenant_id=query.tenant_id,
                         error=str(e))
            raise DatabaseError(f"Failed to retrieve sales data: {str(e)}")

    async def get_sales_analytics(self, tenant_id: str, start_date: Optional[datetime] = None,
                                  end_date: Optional[datetime] = None) -> Dict[str, Any]:
        """Get comprehensive sales analytics using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # Get summary data
                    summary = await sales_repo.get_sales_summary(
                        tenant_id=tenant_id,
                        start_date=start_date,
                        end_date=end_date
                    )

                    # Get top products
                    top_products = await sales_repo.get_top_products(
                        tenant_id=tenant_id,
                        start_date=start_date,
                        end_date=end_date,
                        limit=5
                    )

                    # Get aggregated data by day
                    daily_aggregation = await sales_repo.get_sales_aggregation(
                        tenant_id=tenant_id,
                        start_date=start_date,
                        end_date=end_date,
                        group_by="daily"
                    )

                    analytics = {
                        **summary,
                        "top_products": top_products,
                        "daily_sales": daily_aggregation[:30],  # Last 30 days
                        "average_order_value": (
                            summary["total_revenue"] / max(summary["total_sales"], 1)
                            if summary["total_sales"] > 0 else 0.0
                        )
                    }

                    logger.debug("Sales analytics generated",
                                 tenant_id=tenant_id,
                                 total_records=analytics["total_sales"])

                    return analytics

        except Exception as e:
            logger.error("Failed to generate sales analytics",
                         tenant_id=tenant_id,
                         error=str(e))
            raise DatabaseError(f"Failed to generate analytics: {str(e)}")

    async def get_sales_aggregation(self, tenant_id: str, start_date: Optional[datetime] = None,
                                    end_date: Optional[datetime] = None, group_by: str = "daily") -> List[SalesAggregation]:
        """Get sales aggregation data"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    aggregations = await sales_repo.get_sales_aggregation(
                        tenant_id=tenant_id,
                        start_date=start_date,
                        end_date=end_date,
                        group_by=group_by
                    )

                    return [
                        SalesAggregation(
                            period=agg["period"],
                            date=agg["date"],
                            product_name=agg["product_name"],
                            total_quantity=agg["total_quantity"],
                            total_revenue=agg["total_revenue"],
                            average_quantity=agg["average_quantity"],
                            average_revenue=agg["average_revenue"],
                            record_count=agg["record_count"]
                        )
                        for agg in aggregations
                    ]

        except Exception as e:
            logger.error("Failed to get sales aggregation",
                         tenant_id=tenant_id,
                         error=str(e))
            raise DatabaseError(f"Failed to get aggregation: {str(e)}")

    async def export_sales_data(self, tenant_id: str, export_format: str, start_date: Optional[datetime] = None,
                                end_date: Optional[datetime] = None, products: Optional[List[str]] = None) -> Optional[Dict[str, Any]]:
        """Export sales data in specified format using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # Get sales data based on filters
                    records = await sales_repo.get_by_tenant_and_date_range(
                        tenant_id=tenant_id,
                        start_date=start_date,
                        end_date=end_date,
                        product_names=products,
                        skip=0,
                        limit=10000  # Large limit for export
                    )

                    if not records:
                        return None

                    # Simple CSV export
                    if export_format.lower() == "csv":
                        import io
                        output = io.StringIO()
                        output.write("date,product_name,quantity_sold,revenue,location_id,source\n")

                        for record in records:
                            output.write(f"{record.date},{record.product_name},{record.quantity_sold},{record.revenue},{record.location_id or ''},{record.source}\n")

                        logger.info("Sales data exported",
                                    tenant_id=tenant_id,
                                    format=export_format,
                                    record_count=len(records))

                        return {
                            "content": output.getvalue(),
                            "media_type": "text/csv",
                            "filename": f"sales_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.csv"
                        }

                    return None

        except Exception as e:
            logger.error("Failed to export sales data",
                         tenant_id=tenant_id,
                         error=str(e))
            raise DatabaseError(f"Failed to export sales data: {str(e)}")

    async def delete_sales_record(self, record_id: str, tenant_id: str) -> bool:
        """Delete a sales record using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # First verify the record exists and belongs to the tenant
                    record = await sales_repo.get_by_id(record_id)
                    if not record:
                        return False

                    if str(record.tenant_id) != tenant_id:
                        raise ValidationError("Record does not belong to the specified tenant")

                    # Delete the record
                    success = await sales_repo.delete(record_id)

                    if success:
                        logger.info("Sales record deleted",
                                    record_id=record_id,
                                    tenant_id=tenant_id)

                    return success

        except ValidationError:
            raise
        except Exception as e:
            logger.error("Failed to delete sales record",
                         record_id=record_id,
                         error=str(e))
            raise DatabaseError(f"Failed to delete sales record: {str(e)}")

    async def get_products_list(self, tenant_id: str) -> List[Dict[str, Any]]:
        """Get list of all products with sales data for tenant using repository pattern"""
        try:
            async with self.database_manager.get_session() as session:
                async with UnitOfWork(session) as uow:
                    sales_repo = uow.register_repository("sales", SalesRepository, SalesData)

                    # Use repository method for product statistics
                    products = await sales_repo.get_product_statistics(tenant_id)

                    logger.debug("Products list retrieved successfully",
                                 tenant_id=tenant_id,
                                 product_count=len(products))

                    return products

        except Exception as e:
            logger.error("Failed to get products list",
                         error=str(e),
                         tenant_id=tenant_id)
            raise DatabaseError(f"Failed to get products list: {str(e)}")
@@ -1,468 +0,0 @@
# ================================================================
# services/data/app/services/traffic_service.py
# ================================================================
"""
Abstracted Traffic Service - Universal interface for traffic data across multiple cities
"""

import asyncio
from datetime import datetime
from typing import Dict, List, Any, Optional, Tuple
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, and_
import structlog

from app.external.apis.traffic import UniversalTrafficClient
from app.models.traffic import TrafficData
from app.core.performance import (
    async_cache,
    monitor_performance,
    global_connection_pool,
    global_performance_monitor,
    batch_process
)

logger = structlog.get_logger()


class TrafficService:
    """
    Abstracted traffic service providing unified interface for traffic data
    Routes requests to appropriate city-specific clients automatically
    """

    def __init__(self):
        self.universal_client = UniversalTrafficClient()
        self.logger = structlog.get_logger(__name__)

    @async_cache(ttl=300)  # Cache for 5 minutes
    @monitor_performance(monitor=global_performance_monitor)
    async def get_current_traffic(
        self,
        latitude: float,
        longitude: float,
        tenant_id: Optional[str] = None
    ) -> Optional[Dict[str, Any]]:
        """
        Get current traffic data for any supported location

        Args:
            latitude: Query location latitude
            longitude: Query location longitude
            tenant_id: Optional tenant identifier for logging/analytics

        Returns:
            Dict with current traffic data or None if not available
        """
        try:
            self.logger.info("Getting current traffic data",
                             lat=latitude, lon=longitude, tenant_id=tenant_id)

            # Delegate to universal client
            traffic_data = await self.universal_client.get_current_traffic(latitude, longitude)

            if traffic_data:
                # Add service metadata
                traffic_data['service_metadata'] = {
                    'request_timestamp': datetime.now().isoformat(),
                    'tenant_id': tenant_id,
                    'service_version': '2.0',
                    'query_location': {'latitude': latitude, 'longitude': longitude}
                }

                self.logger.info("Successfully retrieved current traffic data",
                                 lat=latitude, lon=longitude,
                                 source=traffic_data.get('source', 'unknown'))

                return traffic_data
            else:
                self.logger.warning("No current traffic data available",
                                    lat=latitude, lon=longitude)
                return None

        except Exception as e:
            self.logger.error("Error getting current traffic data",
                              lat=latitude, lon=longitude, error=str(e))
            return None

    @async_cache(ttl=1800)  # Cache for 30 minutes (historical data changes less frequently)
    @monitor_performance(monitor=global_performance_monitor)
    async def get_historical_traffic(
        self,
        latitude: float,
        longitude: float,
        start_date: datetime,
        end_date: datetime,
        tenant_id: Optional[str] = None,
        db: Optional[AsyncSession] = None
    ) -> List[Dict[str, Any]]:
        """
        Get historical traffic data for any supported location with database storage

        Args:
            latitude: Query location latitude
            longitude: Query location longitude
            start_date: Start date for historical data
            end_date: End date for historical data
            tenant_id: Optional tenant identifier
            db: Optional database session for storage

        Returns:
            List of historical traffic data dictionaries
        """
        try:
            self.logger.info("Getting historical traffic data",
                             lat=latitude, lon=longitude,
                             start=start_date, end=end_date, tenant_id=tenant_id)

            # Validate date range
            if start_date >= end_date:
                self.logger.warning("Invalid date range", start=start_date, end=end_date)
                return []

            location_id = f"{latitude:.4f},{longitude:.4f}"

            # Check database first if session provided
            if db:
                stmt = select(TrafficData).where(
                    and_(
                        TrafficData.location_id == location_id,
                        TrafficData.date >= start_date,
                        TrafficData.date <= end_date
                    )
                ).order_by(TrafficData.date)

                result = await db.execute(stmt)
                db_records = result.scalars().all()

                if db_records:
                    self.logger.info("Historical traffic data found in database",
                                     count=len(db_records))
                    return [self._convert_db_record_to_dict(record) for record in db_records]

            # Delegate to universal client
            traffic_data = await self.universal_client.get_historical_traffic(
                latitude, longitude, start_date, end_date
            )

            if traffic_data:
                # Add service metadata to each record
                for record in traffic_data:
                    record['service_metadata'] = {
                        'request_timestamp': datetime.now().isoformat(),
                        'tenant_id': tenant_id,
                        'service_version': '2.0',
                        'query_location': {'latitude': latitude, 'longitude': longitude},
                        'date_range': {
                            'start': start_date.isoformat(),
                            'end': end_date.isoformat()
                        }
                    }

                # Store in database if session provided
                if db:
                    stored_count = await self._store_traffic_data_batch(
                        traffic_data, location_id, db
                    )
                    self.logger.info("Traffic data stored for re-training",
                                     fetched=len(traffic_data), stored=stored_count,
                                     location=location_id)

                self.logger.info("Successfully retrieved historical traffic data",
                                 lat=latitude, lon=longitude, records=len(traffic_data))

                return traffic_data
            else:
                self.logger.info("No historical traffic data available",
                                 lat=latitude, lon=longitude)
                return []

        except Exception as e:
            self.logger.error("Error getting historical traffic data",
                              lat=latitude, lon=longitude, error=str(e))
            return []

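    # --- Illustrative call-site sketch (not part of the original file) ---------
    # service = TrafficService()
    # records = await service.get_historical_traffic(
    #     40.4168, -3.7038,                       # Madrid city centre (example coords)
    #     start_date=datetime(2024, 1, 1),
    #     end_date=datetime(2024, 1, 8),
    # )
    # Repeated calls within the 30-minute TTL are served from the async cache;
    # passing a db session additionally persists the fetched records for re-training.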
    def _convert_db_record_to_dict(self, record: TrafficData) -> Dict[str, Any]:
        """Convert database record to dictionary format"""
        return {
            'date': record.date,
            'traffic_volume': record.traffic_volume,
            'pedestrian_count': record.pedestrian_count,
            'congestion_level': record.congestion_level,
            'average_speed': record.average_speed,
            'source': record.source,
            'location_id': record.location_id,
            'raw_data': record.raw_data
        }

    async def get_traffic_events(
        self,
        latitude: float,
        longitude: float,
        radius_km: float = 5.0,
        tenant_id: Optional[str] = None
    ) -> List[Dict[str, Any]]:
        """
        Get traffic events and incidents for any supported location

        Args:
            latitude: Query location latitude
            longitude: Query location longitude
            radius_km: Search radius in kilometers
            tenant_id: Optional tenant identifier

        Returns:
            List of traffic events
        """
        try:
            self.logger.info("Getting traffic events",
                             lat=latitude, lon=longitude, radius=radius_km, tenant_id=tenant_id)

            # Delegate to universal client
            events = await self.universal_client.get_events(latitude, longitude, radius_km)

            # Add metadata to events
            for event in events:
                event['service_metadata'] = {
                    'request_timestamp': datetime.now().isoformat(),
                    'tenant_id': tenant_id,
                    'service_version': '2.0',
                    'query_location': {'latitude': latitude, 'longitude': longitude},
                    'search_radius_km': radius_km
                }

            self.logger.info("Retrieved traffic events",
                             lat=latitude, lon=longitude, events=len(events))

            return events

        except Exception as e:
            self.logger.error("Error getting traffic events",
                              lat=latitude, lon=longitude, error=str(e))
            return []

    def get_location_info(self, latitude: float, longitude: float) -> Dict[str, Any]:
        """
        Get information about traffic data availability for location

        Args:
            latitude: Query location latitude
            longitude: Query location longitude

        Returns:
            Dict with location support information
        """
        try:
            info = self.universal_client.get_location_info(latitude, longitude)

            # Add service layer information
            info['service_layer'] = {
                'version': '2.0',
                'abstraction_level': 'universal',
                'supported_operations': [
                    'current_traffic',
                    'historical_traffic',
                    'traffic_events',
                    'bulk_requests'
                ]
            }

            return info

        except Exception as e:
            self.logger.error("Error getting location info",
                              lat=latitude, lon=longitude, error=str(e))
            return {
                'supported': False,
                'error': str(e),
                'service_layer': {'version': '2.0'}
            }

    async def store_traffic_data(self,
                                 latitude: float,
                                 longitude: float,
                                 traffic_data: Dict[str, Any],
                                 db: AsyncSession) -> bool:
        """Store single traffic data record to database"""
        try:
            location_id = f"{latitude:.4f},{longitude:.4f}"

            traffic_record = TrafficData(
                location_id=location_id,
                date=traffic_data.get("date", datetime.now()),
                traffic_volume=traffic_data.get("traffic_volume"),
                pedestrian_count=traffic_data.get("pedestrian_count"),
                congestion_level=traffic_data.get("congestion_level"),
                average_speed=traffic_data.get("average_speed"),
                source=traffic_data.get("source", "madrid_opendata"),
                raw_data=str(traffic_data) if traffic_data else None
            )

            db.add(traffic_record)
            await db.commit()

            logger.debug("Traffic data stored successfully", location_id=location_id)
            return True

        except Exception as e:
            logger.error("Failed to store traffic data", error=str(e))
            await db.rollback()
            return False

    async def _store_traffic_data_batch(self,
                                        traffic_data: List[Dict[str, Any]],
                                        location_id: str,
                                        db: AsyncSession) -> int:
        """Store batch of traffic data with enhanced validation and duplicate handling"""
        stored_count = 0

        try:
            # Check for existing records to avoid duplicates
            if traffic_data:
                dates = [data.get('date') for data in traffic_data if data.get('date')]
                if dates:
                    # Query existing records for this location and date range
                    existing_stmt = select(TrafficData.date).where(
                        and_(
                            TrafficData.location_id == location_id,
                            TrafficData.date.in_(dates)
                        )
                    )
                    result = await db.execute(existing_stmt)
                    existing_dates = {row[0] for row in result.fetchall()}

                    logger.debug(f"Found {len(existing_dates)} existing records for location {location_id}")
                else:
                    existing_dates = set()
            else:
                existing_dates = set()

            # Prepare batch of new records for bulk insert
            batch_records = []
            for data in traffic_data:
                try:
                    record_date = data.get('date')
                    if not record_date or record_date in existing_dates:
                        continue  # Skip duplicates

                    # Validate required fields
                    if not self._validate_traffic_data(data):
                        logger.warning("Invalid traffic data, skipping", data=data)
                        continue

                    # Prepare record data for bulk insert
                    record_data = {
                        'location_id': location_id,
                        'date': record_date,
                        'traffic_volume': data.get('traffic_volume'),
                        'pedestrian_count': data.get('pedestrian_count'),
                        'congestion_level': data.get('congestion_level'),
                        'average_speed': data.get('average_speed'),
                        'source': data.get('source', 'madrid_opendata'),
                        'raw_data': str(data)
                    }
                    batch_records.append(record_data)

                except Exception as record_error:
                    logger.warning("Failed to prepare traffic record",
                                   error=str(record_error), data=data)
                    continue

            # Use efficient bulk insert instead of individual records
            if batch_records:
                # Process in chunks to avoid memory issues
                chunk_size = 5000
                for i in range(0, len(batch_records), chunk_size):
                    chunk = batch_records[i:i + chunk_size]

                    # Use SQLAlchemy bulk insert for maximum performance
                    await db.execute(
                        TrafficData.__table__.insert(),
                        chunk
                    )
                    await db.commit()
                    stored_count += len(chunk)

                    logger.debug(f"Bulk inserted {len(chunk)} records (total: {stored_count})")

            logger.info(f"Successfully stored {stored_count} traffic records for location {location_id}")

        except Exception as e:
            logger.error("Failed to store traffic data batch",
                         error=str(e), location_id=location_id)
            await db.rollback()

        return stored_count

    def _validate_traffic_data(self, data: Dict[str, Any]) -> bool:
        """Validate traffic data before storage"""
        required_fields = ['date']

        # Check required fields
        for field in required_fields:
            if not data.get(field):
                return False

        # Validate data types and ranges
        traffic_volume = data.get('traffic_volume')
        if traffic_volume is not None and (traffic_volume < 0 or traffic_volume > 10000):
            return False

        pedestrian_count = data.get('pedestrian_count')
        if pedestrian_count is not None and (pedestrian_count < 0 or pedestrian_count > 10000):
            return False

        average_speed = data.get('average_speed')
        if average_speed is not None and (average_speed < 0 or average_speed > 200):
            return False

        congestion_level = data.get('congestion_level')
        if congestion_level and congestion_level not in ['low', 'medium', 'high', 'blocked']:
            return False

        return True

async def get_stored_traffic_for_training(self,
|
|
||||||
latitude: float,
|
|
||||||
longitude: float,
|
|
||||||
start_date: datetime,
|
|
||||||
end_date: datetime,
|
|
||||||
db: AsyncSession) -> List[Dict[str, Any]]:
|
|
||||||
"""Retrieve stored traffic data specifically for training purposes"""
|
|
||||||
try:
|
|
||||||
location_id = f"{latitude:.4f},{longitude:.4f}"
|
|
||||||
|
|
||||||
stmt = select(TrafficData).where(
|
|
||||||
and_(
|
|
||||||
TrafficData.location_id == location_id,
|
|
||||||
TrafficData.date >= start_date,
|
|
||||||
TrafficData.date <= end_date
|
|
||||||
)
|
|
||||||
).order_by(TrafficData.date)
|
|
||||||
|
|
||||||
result = await db.execute(stmt)
|
|
||||||
records = result.scalars().all()
|
|
||||||
|
|
||||||
# Convert to training format
|
|
||||||
training_data = []
|
|
||||||
for record in records:
|
|
||||||
training_data.append({
|
|
||||||
'date': record.date,
|
|
||||||
'traffic_volume': record.traffic_volume,
|
|
||||||
'pedestrian_count': record.pedestrian_count,
|
|
||||||
'congestion_level': record.congestion_level,
|
|
||||||
'average_speed': record.average_speed,
|
|
||||||
'location_id': record.location_id,
|
|
||||||
'source': record.source,
|
|
||||||
'measurement_point_id': record.raw_data # Contains additional metadata
|
|
||||||
})
|
|
||||||
|
|
||||||
logger.info(f"Retrieved {len(training_data)} traffic records for training",
|
|
||||||
location_id=location_id, start=start_date, end=end_date)
|
|
||||||
|
|
||||||
return training_data
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error("Failed to retrieve traffic data for training",
|
|
||||||
error=str(e), location_id=location_id)
|
|
||||||
return []
|
|
||||||
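# ----------------------------------------------------------------
# Illustrative usage sketch (not part of the commit): how a training job could
# pull the stored traffic history exposed above. The `async_session` factory
# and `traffic_service` instance names are hypothetical placeholders.
#
#   from datetime import datetime, timedelta
#
#   async def load_traffic_training_window(traffic_service, async_session):
#       async with async_session() as db:
#           end = datetime.now()
#           start = end - timedelta(days=90)
#           return await traffic_service.get_stored_traffic_for_training(
#               latitude=40.4168, longitude=-3.7038,
#               start_date=start, end_date=end, db=db,
#           )
# ----------------------------------------------------------------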
@@ -1,117 +0,0 @@
# ================================================================
# services/data/alembic.ini
# ================================================================
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version number format. This value is passed to the Python
# datetime.datetime.strftime() method for formatting the creation date.
# For UTC time zone add 'utc' prefix (ex: utc%Y_%m_%d_%H%M )
version_num_format = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses
# os.pathsep. If this key is omitted entirely, it falls back to the legacy
# behavior of splitting on spaces and/or commas.
# valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
@@ -1,68 +0,0 @@
# ================================================================
# services/data/migrations/env.py
# ================================================================
"""Alembic environment configuration"""

import asyncio
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from alembic import context

from app.core.config import settings
from app.core.database import Base
from app.models import sales, weather, traffic

# this is the Alembic Config object
config = context.config

# Interpret the config file for Python logging
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Set database URL
config.set_main_option("sqlalchemy.url", settings.DATABASE_URL.replace('+asyncpg', ''))

target_metadata = Base.metadata

def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode."""
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()

def do_run_migrations(connection: Connection) -> None:
    context.configure(connection=connection, target_metadata=target_metadata)

    with context.begin_transaction():
        context.run_migrations()

async def run_async_migrations() -> None:
    """Run migrations in 'online' mode with async engine."""
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)

    await connectable.dispose()

def run_migrations_online() -> None:
    """Run migrations in 'online' mode."""
    asyncio.run(run_async_migrations())

if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
@@ -1,29 +0,0 @@
# ================================================================
# services/data/migrations/script.py.mako
# ================================================================
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
@@ -1,54 +0,0 @@
"""Create traffic_data table for storing traffic data for re-training

Revision ID: 001_traffic_data
Revises:
Create Date: 2025-01-08 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import UUID

# revision identifiers, used by Alembic.
revision = '001_traffic_data'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    """Create traffic_data table"""
    op.create_table('traffic_data',
        sa.Column('id', UUID(as_uuid=True), nullable=False, primary_key=True),
        sa.Column('location_id', sa.String(100), nullable=False, index=True),
        sa.Column('date', sa.DateTime(timezone=True), nullable=False, index=True),
        sa.Column('traffic_volume', sa.Integer, nullable=True),
        sa.Column('pedestrian_count', sa.Integer, nullable=True),
        sa.Column('congestion_level', sa.String(20), nullable=True),
        sa.Column('average_speed', sa.Float, nullable=True),
        sa.Column('source', sa.String(50), nullable=False, server_default='madrid_opendata'),
        sa.Column('raw_data', sa.Text, nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    )

    # Create index for efficient querying by location and date
    op.create_index(
        'idx_traffic_location_date',
        'traffic_data',
        ['location_id', 'date']
    )

    # Create index for date range queries
    op.create_index(
        'idx_traffic_date_range',
        'traffic_data',
        ['date']
    )


def downgrade():
    """Drop traffic_data table"""
    op.drop_index('idx_traffic_date_range', table_name='traffic_data')
    op.drop_index('idx_traffic_location_date', table_name='traffic_data')
    op.drop_table('traffic_data')
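# Illustrative only (not part of the commit): the revision above can also be
# applied programmatically instead of via the alembic CLI. The config path
# below is a hypothetical example.
#
#   from alembic.config import Config
#   from alembic import command
#
#   cfg = Config("services/data/alembic.ini")
#   command.upgrade(cfg, "001_traffic_data")   # create the traffic_data table
#   command.downgrade(cfg, "-1")               # roll back one revision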
@@ -1,49 +0,0 @@
# ================================================================
# services/data/migrations/versions/20250727_add_timezone_to_datetime_columns.py
# ================================================================
"""Add timezone support to datetime columns

Revision ID: 20250727_193000
Revises:
Create Date: 2025-07-27 19:30:00.000000

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision = '20250727_193000'
down_revision = None  # Replace with actual previous revision if exists
branch_labels = None
depends_on = None

def upgrade() -> None:
    """Convert TIMESTAMP WITHOUT TIME ZONE to TIMESTAMP WITH TIME ZONE"""

    # Weather data table
    op.execute("ALTER TABLE weather_data ALTER COLUMN date TYPE TIMESTAMP WITH TIME ZONE USING date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE weather_data ALTER COLUMN created_at TYPE TIMESTAMP WITH TIME ZONE USING created_at AT TIME ZONE 'UTC'")

    # Weather forecasts table
    op.execute("ALTER TABLE weather_forecasts ALTER COLUMN forecast_date TYPE TIMESTAMP WITH TIME ZONE USING forecast_date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE weather_forecasts ALTER COLUMN generated_at TYPE TIMESTAMP WITH TIME ZONE USING generated_at AT TIME ZONE 'UTC'")

    # Traffic data table
    op.execute("ALTER TABLE traffic_data ALTER COLUMN date TYPE TIMESTAMP WITH TIME ZONE USING date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE traffic_data ALTER COLUMN created_at TYPE TIMESTAMP WITH TIME ZONE USING created_at AT TIME ZONE 'UTC'")

def downgrade() -> None:
    """Convert TIMESTAMP WITH TIME ZONE back to TIMESTAMP WITHOUT TIME ZONE"""

    # Weather data table
    op.execute("ALTER TABLE weather_data ALTER COLUMN date TYPE TIMESTAMP WITHOUT TIME ZONE USING date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE weather_data ALTER COLUMN created_at TYPE TIMESTAMP WITHOUT TIME ZONE USING created_at AT TIME ZONE 'UTC'")

    # Weather forecasts table
    op.execute("ALTER TABLE weather_forecasts ALTER COLUMN forecast_date TYPE TIMESTAMP WITHOUT TIME ZONE USING forecast_date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE weather_forecasts ALTER COLUMN generated_at TYPE TIMESTAMP WITHOUT TIME ZONE USING generated_at AT TIME ZONE 'UTC'")

    # Traffic data table
    op.execute("ALTER TABLE traffic_data ALTER COLUMN date TYPE TIMESTAMP WITHOUT TIME ZONE USING date AT TIME ZONE 'UTC'")
    op.execute("ALTER TABLE traffic_data ALTER COLUMN created_at TYPE TIMESTAMP WITHOUT TIME ZONE USING created_at AT TIME ZONE 'UTC'")
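# Illustrative note (not part of the commit): once the columns are TIMESTAMP
# WITH TIME ZONE, values should be written as timezone-aware UTC datetimes so
# comparisons behave consistently across sessions, e.g.:
#
#   from datetime import datetime, timezone
#   observed_at = datetime(2025, 7, 27, 19, 30, tzinfo=timezone.utc)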
@@ -1,52 +0,0 @@
# ================================================================
# services/data/requirements.txt - UPDATED
# ================================================================

# FastAPI and web framework
fastapi==0.104.1
uvicorn[standard]==0.24.0

# Database
sqlalchemy[asyncio]==2.0.23
asyncpg==0.29.0
alembic==1.12.1

# Data validation
pydantic==2.5.0
pydantic-settings==2.1.0

# Cache and messaging
redis==5.0.1
aio-pika==9.3.1

# HTTP client
httpx==0.25.2

# Data processing
pandas==2.1.3
numpy==1.25.2
openpyxl==3.1.2  # For Excel (.xlsx) files
xlrd==2.0.1  # For Excel (.xls) files
python-multipart==0.0.6

# Monitoring and logging
prometheus-client==0.19.0
structlog==23.2.0
python-logstash==0.4.8
python-json-logger==2.0.4

# Security
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4

# Testing
pytest==7.4.3
pytest-asyncio==0.21.1
pytest-cov==4.1.0
pytest-mock==3.12.0
pytest-xdist==3.5.0
pytest-timeout==2.2.0
psutil==5.9.8

# Cartographic projections and coordinate transformations library
pyproj==3.4.0
@@ -1,653 +0,0 @@
|
|||||||
# ================================================================
|
|
||||||
# services/data/tests/conftest.py - AEMET Test Configuration
|
|
||||||
# ================================================================
|
|
||||||
"""
|
|
||||||
Test configuration and fixtures for AEMET weather API client tests
|
|
||||||
Provides shared fixtures, mock data, and test utilities
|
|
||||||
"""
|
|
||||||
|
|
||||||
import pytest
|
|
||||||
import asyncio
|
|
||||||
from datetime import datetime, timedelta
|
|
||||||
from unittest.mock import Mock, AsyncMock, patch
|
|
||||||
from typing import Dict, List, Any, Generator
|
|
||||||
import os
|
|
||||||
|
|
||||||
# Import the classes we're testing
|
|
||||||
from app.external.aemet import (
|
|
||||||
AEMETClient,
|
|
||||||
WeatherDataParser,
|
|
||||||
SyntheticWeatherGenerator,
|
|
||||||
LocationService,
|
|
||||||
WeatherSource
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# PYTEST CONFIGURATION
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture(scope="session")
|
|
||||||
def event_loop():
|
|
||||||
"""Create an instance of the default event loop for the test session."""
|
|
||||||
loop = asyncio.get_event_loop_policy().new_event_loop()
|
|
||||||
yield loop
|
|
||||||
loop.close()
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# CLIENT AND SERVICE FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def aemet_client():
|
|
||||||
"""Create AEMET client instance for testing"""
|
|
||||||
return AEMETClient()
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def weather_parser():
|
|
||||||
"""Create WeatherDataParser instance for testing"""
|
|
||||||
return WeatherDataParser()
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def synthetic_generator():
|
|
||||||
"""Create SyntheticWeatherGenerator instance for testing"""
|
|
||||||
return SyntheticWeatherGenerator()
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def location_service():
|
|
||||||
"""Create LocationService instance for testing"""
|
|
||||||
return LocationService()
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# COORDINATE AND LOCATION FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def madrid_coords():
|
|
||||||
"""Standard Madrid coordinates for testing"""
|
|
||||||
return (40.4168, -3.7038) # Madrid city center
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def madrid_coords_variants():
|
|
||||||
"""Various Madrid area coordinates for testing"""
|
|
||||||
return {
|
|
||||||
"center": (40.4168, -3.7038), # Madrid center
|
|
||||||
"north": (40.4677, -3.5552), # Madrid north (near station)
|
|
||||||
"south": (40.2987, -3.7216), # Madrid south (near station)
|
|
||||||
"east": (40.4200, -3.6500), # Madrid east
|
|
||||||
"west": (40.4100, -3.7500), # Madrid west
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def invalid_coords():
|
|
||||||
"""Invalid coordinates for error testing"""
|
|
||||||
return [
|
|
||||||
(200, 200), # Out of range
|
|
||||||
(-200, -200), # Out of range
|
|
||||||
(0, 0), # Not in Madrid area
|
|
||||||
(50, 10), # Europe but not Madrid
|
|
||||||
(None, None), # None values
|
|
||||||
]
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# DATE AND TIME FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def test_dates():
|
|
||||||
"""Standard date ranges for testing"""
|
|
||||||
now = datetime.now()
|
|
||||||
return {
|
|
||||||
"now": now,
|
|
||||||
"yesterday": now - timedelta(days=1),
|
|
||||||
"last_week": now - timedelta(days=7),
|
|
||||||
"last_month": now - timedelta(days=30),
|
|
||||||
"last_quarter": now - timedelta(days=90),
|
|
||||||
"one_year_ago": now - timedelta(days=365),
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def historical_date_ranges():
|
|
||||||
"""Historical date ranges for testing"""
|
|
||||||
end_date = datetime.now()
|
|
||||||
return {
|
|
||||||
"one_day": {
|
|
||||||
"start": end_date - timedelta(days=1),
|
|
||||||
"end": end_date,
|
|
||||||
"expected_days": 1
|
|
||||||
},
|
|
||||||
"one_week": {
|
|
||||||
"start": end_date - timedelta(days=7),
|
|
||||||
"end": end_date,
|
|
||||||
"expected_days": 7
|
|
||||||
},
|
|
||||||
"one_month": {
|
|
||||||
"start": end_date - timedelta(days=30),
|
|
||||||
"end": end_date,
|
|
||||||
"expected_days": 30
|
|
||||||
},
|
|
||||||
"large_range": {
|
|
||||||
"start": end_date - timedelta(days=65),
|
|
||||||
"end": end_date,
|
|
||||||
"expected_days": 65
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# MOCK API RESPONSE FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_aemet_api_response():
|
|
||||||
"""Mock AEMET API initial response structure"""
|
|
||||||
return {
|
|
||||||
"datos": "https://opendata.aemet.es/opendata/sh/12345abcdef",
|
|
||||||
"metadatos": "https://opendata.aemet.es/opendata/sh/metadata123"
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_aemet_error_response():
|
|
||||||
"""Mock AEMET API error response"""
|
|
||||||
return {
|
|
||||||
"descripcion": "Error en la petición",
|
|
||||||
"estado": 404
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# WEATHER DATA FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_current_weather_data():
|
|
||||||
"""Mock current weather data from AEMET API"""
|
|
||||||
return {
|
|
||||||
"idema": "3195", # Station ID
|
|
||||||
"ubi": "MADRID", # Location
|
|
||||||
"fint": "2025-07-24T14:00:00", # Observation time
|
|
||||||
"ta": 18.5, # Temperature (°C)
|
|
||||||
"tamin": 12.3, # Min temperature
|
|
||||||
"tamax": 25.7, # Max temperature
|
|
||||||
"hr": 65.0, # Humidity (%)
|
|
||||||
"prec": 0.0, # Precipitation (mm)
|
|
||||||
"vv": 12.0, # Wind speed (km/h)
|
|
||||||
"dv": 180, # Wind direction (degrees)
|
|
||||||
"pres": 1015.2, # Pressure (hPa)
|
|
||||||
"presMax": 1018.5, # Max pressure
|
|
||||||
"presMin": 1012.1, # Min pressure
|
|
||||||
"descripcion": "Despejado" # Description
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_forecast_data():
|
|
||||||
"""Mock forecast data from AEMET API"""
|
|
||||||
return [{
|
|
||||||
"origen": {
|
|
||||||
"productor": "Agencia Estatal de Meteorología - AEMET"
|
|
||||||
},
|
|
||||||
"elaborado": "2025-07-24T12:00:00UTC",
|
|
||||||
"nombre": "Madrid",
|
|
||||||
"provincia": "Madrid",
|
|
||||||
"prediccion": {
|
|
||||||
"dia": [
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-25T00:00:00",
|
|
||||||
"temperatura": {
|
|
||||||
"maxima": 28,
|
|
||||||
"minima": 15,
|
|
||||||
"dato": [
|
|
||||||
{"value": 15, "hora": 6},
|
|
||||||
{"value": 28, "hora": 15}
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"sensTermica": {
|
|
||||||
"maxima": 30,
|
|
||||||
"minima": 16
|
|
||||||
},
|
|
||||||
"humedadRelativa": {
|
|
||||||
"maxima": 85,
|
|
||||||
"minima": 45,
|
|
||||||
"dato": [
|
|
||||||
{"value": 85, "hora": 6},
|
|
||||||
{"value": 45, "hora": 15}
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"probPrecipitacion": [
|
|
||||||
{"value": 10, "periodo": "00-24"}
|
|
||||||
],
|
|
||||||
"viento": [
|
|
||||||
{
|
|
||||||
"direccion": ["N"],
|
|
||||||
"velocidad": [15],
|
|
||||||
"periodo": "00-24"
|
|
||||||
}
|
|
||||||
],
|
|
||||||
"estadoCielo": [
|
|
||||||
{
|
|
||||||
"value": "11",
|
|
||||||
"descripcion": "Despejado",
|
|
||||||
"periodo": "00-24"
|
|
||||||
}
|
|
||||||
]
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-26T00:00:00",
|
|
||||||
"temperatura": {
|
|
||||||
"maxima": 30,
|
|
||||||
"minima": 17
|
|
||||||
},
|
|
||||||
"probPrecipitacion": [
|
|
||||||
{"value": 5, "periodo": "00-24"}
|
|
||||||
],
|
|
||||||
"viento": [
|
|
||||||
{
|
|
||||||
"direccion": ["NE"],
|
|
||||||
"velocidad": [10],
|
|
||||||
"periodo": "00-24"
|
|
||||||
}
|
|
||||||
]
|
|
||||||
}
|
|
||||||
]
|
|
||||||
}
|
|
||||||
}]
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_historical_data():
|
|
||||||
"""Mock historical weather data from AEMET API"""
|
|
||||||
return [
|
|
||||||
{
|
|
||||||
"indicativo": "3195",
|
|
||||||
"nombre": "MADRID",
|
|
||||||
"fecha": "2025-07-20",
|
|
||||||
"tmax": 25.2,
|
|
||||||
"horatmax": "1530",
|
|
||||||
"tmin": 14.8,
|
|
||||||
"horatmin": "0630",
|
|
||||||
"tmed": 20.0,
|
|
||||||
"prec": 0.0,
|
|
||||||
"racha": 25.0,
|
|
||||||
"horaracha": "1445",
|
|
||||||
"sol": 8.5,
|
|
||||||
"presMax": 1018.5,
|
|
||||||
"horaPresMax": "1000",
|
|
||||||
"presMin": 1012.3,
|
|
||||||
"horaPresMin": "1700",
|
|
||||||
"hr": 58,
|
|
||||||
"velmedia": 8.5,
|
|
||||||
"dir": "180"
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"indicativo": "3195",
|
|
||||||
"nombre": "MADRID",
|
|
||||||
"fecha": "2025-07-21",
|
|
||||||
"tmax": 27.1,
|
|
||||||
"horatmax": "1615",
|
|
||||||
"tmin": 16.2,
|
|
||||||
"horatmin": "0700",
|
|
||||||
"tmed": 21.6,
|
|
||||||
"prec": 2.5,
|
|
||||||
"racha": 30.0,
|
|
||||||
"horaracha": "1330",
|
|
||||||
"sol": 6.2,
|
|
||||||
"presMax": 1015.8,
|
|
||||||
"horaPresMax": "0930",
|
|
||||||
"presMin": 1010.1,
|
|
||||||
"horaPresMin": "1800",
|
|
||||||
"hr": 72,
|
|
||||||
"velmedia": 12.0,
|
|
||||||
"dir": "225"
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"indicativo": "3195",
|
|
||||||
"nombre": "MADRID",
|
|
||||||
"fecha": "2025-07-22",
|
|
||||||
"tmax": 23.8,
|
|
||||||
"horatmax": "1500",
|
|
||||||
"tmin": 13.5,
|
|
||||||
"horatmin": "0615",
|
|
||||||
"tmed": 18.7,
|
|
||||||
"prec": 0.2,
|
|
||||||
"racha": 22.0,
|
|
||||||
"horaracha": "1200",
|
|
||||||
"sol": 7.8,
|
|
||||||
"presMax": 1020.2,
|
|
||||||
"horaPresMax": "1100",
|
|
||||||
"presMin": 1014.7,
|
|
||||||
"horaPresMin": "1900",
|
|
||||||
"hr": 63,
|
|
||||||
"velmedia": 9.2,
|
|
||||||
"dir": "270"
|
|
||||||
}
|
|
||||||
]
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# EXPECTED RESULT FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def expected_current_weather_structure():
|
|
||||||
"""Expected structure for current weather results"""
|
|
||||||
return {
|
|
||||||
"required_fields": [
|
|
||||||
"date", "temperature", "precipitation", "humidity",
|
|
||||||
"wind_speed", "pressure", "description", "source"
|
|
||||||
],
|
|
||||||
"field_types": {
|
|
||||||
"date": datetime,
|
|
||||||
"temperature": (int, float),
|
|
||||||
"precipitation": (int, float),
|
|
||||||
"humidity": (int, float),
|
|
||||||
"wind_speed": (int, float),
|
|
||||||
"pressure": (int, float),
|
|
||||||
"description": str,
|
|
||||||
"source": str
|
|
||||||
},
|
|
||||||
"valid_ranges": {
|
|
||||||
"temperature": (-30, 50),
|
|
||||||
"precipitation": (0, 200),
|
|
||||||
"humidity": (0, 100),
|
|
||||||
"wind_speed": (0, 200),
|
|
||||||
"pressure": (900, 1100)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def expected_forecast_structure():
|
|
||||||
"""Expected structure for forecast results"""
|
|
||||||
return {
|
|
||||||
"required_fields": [
|
|
||||||
"forecast_date", "generated_at", "temperature", "precipitation",
|
|
||||||
"humidity", "wind_speed", "description", "source"
|
|
||||||
],
|
|
||||||
"field_types": {
|
|
||||||
"forecast_date": datetime,
|
|
||||||
"generated_at": datetime,
|
|
||||||
"temperature": (int, float),
|
|
||||||
"precipitation": (int, float),
|
|
||||||
"humidity": (int, float),
|
|
||||||
"wind_speed": (int, float),
|
|
||||||
"description": str,
|
|
||||||
"source": str
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def expected_historical_structure():
|
|
||||||
"""Expected structure for historical weather results"""
|
|
||||||
return {
|
|
||||||
"required_fields": [
|
|
||||||
"date", "temperature", "precipitation", "humidity",
|
|
||||||
"wind_speed", "pressure", "description", "source"
|
|
||||||
],
|
|
||||||
"field_types": {
|
|
||||||
"date": datetime,
|
|
||||||
"temperature": (int, float, type(None)),
|
|
||||||
"precipitation": (int, float),
|
|
||||||
"humidity": (int, float, type(None)),
|
|
||||||
"wind_speed": (int, float, type(None)),
|
|
||||||
"pressure": (int, float, type(None)),
|
|
||||||
"description": str,
|
|
||||||
"source": str
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# MOCK AND PATCH FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_successful_api_calls():
|
|
||||||
"""Mock successful AEMET API calls"""
|
|
||||||
def _mock_api_calls(client, response_data, fetch_data):
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get, \
|
|
||||||
patch.object(client, '_fetch_from_url', new_callable=AsyncMock) as mock_fetch:
|
|
||||||
|
|
||||||
mock_get.return_value = response_data
|
|
||||||
mock_fetch.return_value = fetch_data
|
|
||||||
|
|
||||||
return mock_get, mock_fetch
|
|
||||||
|
|
||||||
return _mock_api_calls
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_failed_api_calls():
|
|
||||||
"""Mock failed AEMET API calls"""
|
|
||||||
def _mock_failed_calls(client, error_type="network"):
|
|
||||||
if error_type == "network":
|
|
||||||
return patch.object(client, '_get', side_effect=Exception("Network error"))
|
|
||||||
elif error_type == "timeout":
|
|
||||||
return patch.object(client, '_get', side_effect=asyncio.TimeoutError("Request timeout"))
|
|
||||||
elif error_type == "invalid_response":
|
|
||||||
return patch.object(client, '_get', new_callable=AsyncMock, return_value=None)
|
|
||||||
else:
|
|
||||||
return patch.object(client, '_get', new_callable=AsyncMock, return_value={"error": "API error"})
|
|
||||||
|
|
||||||
return _mock_failed_calls
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# VALIDATION HELPER FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def weather_data_validator():
|
|
||||||
"""Weather data validation helper functions"""
|
|
||||||
|
|
||||||
def validate_weather_record(record: Dict[str, Any], expected_structure: Dict[str, Any]) -> None:
|
|
||||||
"""Validate a weather record against expected structure"""
|
|
||||||
# Check required fields
|
|
||||||
for field in expected_structure["required_fields"]:
|
|
||||||
assert field in record, f"Missing required field: {field}"
|
|
||||||
|
|
||||||
# Check field types
|
|
||||||
for field, expected_type in expected_structure["field_types"].items():
|
|
||||||
if field in record and record[field] is not None:
|
|
||||||
assert isinstance(record[field], expected_type), f"Field {field} has wrong type: {type(record[field])}"
|
|
||||||
|
|
||||||
# Check valid ranges where applicable
|
|
||||||
if "valid_ranges" in expected_structure:
|
|
||||||
for field, (min_val, max_val) in expected_structure["valid_ranges"].items():
|
|
||||||
if field in record and record[field] is not None:
|
|
||||||
value = record[field]
|
|
||||||
assert min_val <= value <= max_val, f"Field {field} value {value} outside valid range [{min_val}, {max_val}]"
|
|
||||||
|
|
||||||
def validate_weather_list(records: List[Dict[str, Any]], expected_structure: Dict[str, Any]) -> None:
|
|
||||||
"""Validate a list of weather records"""
|
|
||||||
assert isinstance(records, list), "Records should be a list"
|
|
||||||
|
|
||||||
for i, record in enumerate(records):
|
|
||||||
try:
|
|
||||||
validate_weather_record(record, expected_structure)
|
|
||||||
except AssertionError as e:
|
|
||||||
raise AssertionError(f"Record {i} validation failed: {e}")
|
|
||||||
|
|
||||||
def validate_date_sequence(records: List[Dict[str, Any]], date_field: str = "date") -> None:
|
|
||||||
"""Validate that dates in records are in chronological order"""
|
|
||||||
dates = [r[date_field] for r in records if date_field in r and r[date_field] is not None]
|
|
||||||
|
|
||||||
if len(dates) > 1:
|
|
||||||
assert dates == sorted(dates), "Dates should be in chronological order"
|
|
||||||
|
|
||||||
return {
|
|
||||||
"validate_record": validate_weather_record,
|
|
||||||
"validate_list": validate_weather_list,
|
|
||||||
"validate_dates": validate_date_sequence
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# PERFORMANCE TESTING FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def performance_tracker():
|
|
||||||
"""Performance tracking utilities for tests"""
|
|
||||||
|
|
||||||
class PerformanceTracker:
|
|
||||||
def __init__(self):
|
|
||||||
self.start_time = None
|
|
||||||
self.measurements = {}
|
|
||||||
|
|
||||||
def start(self, operation_name: str = "default"):
|
|
||||||
self.start_time = datetime.now()
|
|
||||||
self.operation_name = operation_name
|
|
||||||
|
|
||||||
def stop(self) -> float:
|
|
||||||
if self.start_time:
|
|
||||||
duration = (datetime.now() - self.start_time).total_seconds() * 1000
|
|
||||||
self.measurements[self.operation_name] = duration
|
|
||||||
return duration
|
|
||||||
return 0.0
|
|
||||||
|
|
||||||
def assert_performance(self, max_duration_ms: float, operation_name: str = "default"):
|
|
||||||
duration = self.measurements.get(operation_name, float('inf'))
|
|
||||||
assert duration <= max_duration_ms, f"Operation {operation_name} took {duration:.0f}ms, expected <= {max_duration_ms}ms"
|
|
||||||
|
|
||||||
return PerformanceTracker()
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# INTEGRATION TEST FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def integration_test_config():
|
|
||||||
"""Configuration for integration tests"""
|
|
||||||
return {
|
|
||||||
"api_timeout_ms": 5000,
|
|
||||||
"max_retries": 3,
|
|
||||||
"test_api_key": os.getenv("AEMET_API_KEY_TEST", ""),
|
|
||||||
"skip_real_api_tests": os.getenv("SKIP_REAL_API_TESTS", "false").lower() == "true",
|
|
||||||
"madrid_test_coords": (40.4168, -3.7038),
|
|
||||||
"performance_thresholds": {
|
|
||||||
"current_weather_ms": 5000,
|
|
||||||
"forecast_ms": 5000,
|
|
||||||
"historical_ms": 10000
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# TEST REPORTING FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def test_reporter():
|
|
||||||
"""Test reporting utilities"""
|
|
||||||
|
|
||||||
class TestReporter:
|
|
||||||
def __init__(self):
|
|
||||||
self.results = []
|
|
||||||
|
|
||||||
def log_success(self, test_name: str, details: str = ""):
|
|
||||||
message = f"✅ {test_name}"
|
|
||||||
if details:
|
|
||||||
message += f" - {details}"
|
|
||||||
print(message)
|
|
||||||
self.results.append({"test": test_name, "status": "PASS", "details": details})
|
|
||||||
|
|
||||||
def log_failure(self, test_name: str, error: str = ""):
|
|
||||||
message = f"❌ {test_name}"
|
|
||||||
if error:
|
|
||||||
message += f" - {error}"
|
|
||||||
print(message)
|
|
||||||
self.results.append({"test": test_name, "status": "FAIL", "error": error})
|
|
||||||
|
|
||||||
def log_info(self, test_name: str, info: str = ""):
|
|
||||||
message = f"ℹ️ {test_name}"
|
|
||||||
if info:
|
|
||||||
message += f" - {info}"
|
|
||||||
print(message)
|
|
||||||
self.results.append({"test": test_name, "status": "INFO", "info": info})
|
|
||||||
|
|
||||||
def summary(self):
|
|
||||||
passed = len([r for r in self.results if r["status"] == "PASS"])
|
|
||||||
failed = len([r for r in self.results if r["status"] == "FAIL"])
|
|
||||||
print(f"\n📊 Test Summary: {passed} passed, {failed} failed")
|
|
||||||
return passed, failed
|
|
||||||
|
|
||||||
return TestReporter()
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# CLEANUP FIXTURES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
@pytest.fixture(autouse=True)
|
|
||||||
def cleanup_after_test():
|
|
||||||
"""Automatic cleanup after each test"""
|
|
||||||
yield
|
|
||||||
# Add any cleanup logic here
|
|
||||||
# For example, clearing caches, resetting global state, etc.
|
|
||||||
pass
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# HELPER FUNCTIONS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
def assert_weather_data_structure(data: Dict[str, Any], data_type: str = "current"):
|
|
||||||
"""Assert that weather data has the correct structure"""
|
|
||||||
if data_type == "current":
|
|
||||||
required_fields = ["date", "temperature", "precipitation", "humidity", "wind_speed", "pressure", "description", "source"]
|
|
||||||
elif data_type == "forecast":
|
|
||||||
required_fields = ["forecast_date", "generated_at", "temperature", "precipitation", "humidity", "wind_speed", "description", "source"]
|
|
||||||
elif data_type == "historical":
|
|
||||||
required_fields = ["date", "temperature", "precipitation", "humidity", "wind_speed", "pressure", "description", "source"]
|
|
||||||
else:
|
|
||||||
raise ValueError(f"Unknown data type: {data_type}")
|
|
||||||
|
|
||||||
for field in required_fields:
|
|
||||||
assert field in data, f"Missing required field: {field}"
|
|
||||||
|
|
||||||
# Validate source
|
|
||||||
valid_sources = [WeatherSource.AEMET.value, WeatherSource.SYNTHETIC.value, WeatherSource.DEFAULT.value]
|
|
||||||
assert data["source"] in valid_sources, f"Invalid source: {data['source']}"
|
|
||||||
|
|
||||||
|
|
||||||
def assert_forecast_list_structure(forecast_list: List[Dict[str, Any]], expected_days: int):
|
|
||||||
"""Assert that forecast list has correct structure"""
|
|
||||||
assert isinstance(forecast_list, list), "Forecast should be a list"
|
|
||||||
assert len(forecast_list) == expected_days, f"Expected {expected_days} forecast days, got {len(forecast_list)}"
|
|
||||||
|
|
||||||
for i, day in enumerate(forecast_list):
|
|
||||||
assert_weather_data_structure(day, "forecast")
|
|
||||||
|
|
||||||
# Check date progression
|
|
||||||
if len(forecast_list) > 1:
|
|
||||||
for i in range(1, len(forecast_list)):
|
|
||||||
prev_date = forecast_list[i-1]["forecast_date"]
|
|
||||||
curr_date = forecast_list[i]["forecast_date"]
|
|
||||||
date_diff = (curr_date - prev_date).days
|
|
||||||
assert date_diff == 1, f"Forecast dates should be consecutive, got {date_diff} day difference"
|
|
||||||
|
|
||||||
|
|
||||||
def assert_historical_list_structure(historical_list: List[Dict[str, Any]]):
|
|
||||||
"""Assert that historical list has correct structure"""
|
|
||||||
assert isinstance(historical_list, list), "Historical data should be a list"
|
|
||||||
|
|
||||||
for i, record in enumerate(historical_list):
|
|
||||||
assert_weather_data_structure(record, "historical")
|
|
||||||
|
|
||||||
# Check date ordering
|
|
||||||
dates = [r["date"] for r in historical_list if "date" in r]
|
|
||||||
if len(dates) > 1:
|
|
||||||
assert dates == sorted(dates), "Historical dates should be in chronological order"
|
|
||||||
@@ -1,44 +0,0 @@
[tool:pytest]
# pytest.ini - Configuration file for AEMET tests

# Minimum version requirements
minversion = 6.0

# Add options
addopts =
    -ra
    --strict-markers
    --strict-config
    --disable-warnings
    --tb=short
    -v

# Test discovery
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*

# Async support
asyncio_mode = auto

# Markers
markers =
    unit: Unit tests
    integration: Integration tests
    api: API tests
    performance: Performance tests
    slow: Slow tests
    asyncio: Async tests

# Logging
log_cli = true
log_cli_level = INFO
log_cli_format = %(asctime)s [%(levelname)8s] %(name)s: %(message)s
log_cli_date_format = %Y-%m-%d %H:%M:%S

# Filtering
filterwarnings =
    ignore::DeprecationWarning
    ignore::PendingDeprecationWarning
    ignore::PytestUnhandledCoroutineWarning
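# Usage note (illustrative, not part of the commit): with these markers the
# suites can be selected individually, e.g. `pytest -m unit` or
# `pytest -m "integration and not slow"`.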
@@ -1,677 +0,0 @@
|
|||||||
# ================================================================
|
|
||||||
# services/data/tests/test_aemet.py
|
|
||||||
# ================================================================
|
|
||||||
"""
|
|
||||||
Comprehensive test suite for AEMET weather API client
|
|
||||||
Following the same patterns as test_madrid_opendata.py
|
|
||||||
"""
|
|
||||||
|
|
||||||
import pytest
|
|
||||||
import asyncio
|
|
||||||
from datetime import datetime, timedelta
|
|
||||||
from unittest.mock import Mock, patch, AsyncMock
|
|
||||||
import math
|
|
||||||
from typing import Dict, List, Any
|
|
||||||
|
|
||||||
from app.external.aemet import (
|
|
||||||
AEMETClient,
|
|
||||||
WeatherDataParser,
|
|
||||||
SyntheticWeatherGenerator,
|
|
||||||
LocationService,
|
|
||||||
AEMETConstants,
|
|
||||||
WeatherSource,
|
|
||||||
WeatherStation,
|
|
||||||
GeographicBounds
|
|
||||||
)
|
|
||||||
|
|
||||||
# Configure pytest-asyncio
|
|
||||||
pytestmark = pytest.mark.asyncio
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETClient:
|
|
||||||
"""Main test class for AEMET API client functionality"""
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def client(self):
|
|
||||||
"""Create AEMET client instance for testing"""
|
|
||||||
return AEMETClient()
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def madrid_coords(self):
|
|
||||||
"""Standard Madrid coordinates for testing"""
|
|
||||||
return (40.4168, -3.7038) # Madrid city center
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_aemet_response(self):
|
|
||||||
"""Mock AEMET API response structure"""
|
|
||||||
return {
|
|
||||||
"datos": "https://opendata.aemet.es/opendata/sh/12345",
|
|
||||||
"metadatos": "https://opendata.aemet.es/opendata/sh/metadata"
|
|
||||||
}
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_weather_data(self):
|
|
||||||
"""Mock current weather data from AEMET"""
|
|
||||||
return {
|
|
||||||
"ta": 18.5, # Temperature
|
|
||||||
"prec": 0.0, # Precipitation
|
|
||||||
"hr": 65.0, # Humidity
|
|
||||||
"vv": 12.0, # Wind speed
|
|
||||||
"pres": 1015.2, # Pressure
|
|
||||||
"descripcion": "Despejado"
|
|
||||||
}
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_forecast_data(self):
|
|
||||||
"""Mock forecast data from AEMET"""
|
|
||||||
return [{
|
|
||||||
"prediccion": {
|
|
||||||
"dia": [
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-25T00:00:00",
|
|
||||||
"temperatura": {
|
|
||||||
"maxima": 28,
|
|
||||||
"minima": 15
|
|
||||||
},
|
|
||||||
"probPrecipitacion": [
|
|
||||||
{"value": 10, "periodo": "00-24"}
|
|
||||||
],
|
|
||||||
"viento": [
|
|
||||||
{"velocidad": [15], "direccion": ["N"]}
|
|
||||||
]
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-26T00:00:00",
|
|
||||||
"temperatura": {
|
|
||||||
"maxima": 30,
|
|
||||||
"minima": 17
|
|
||||||
},
|
|
||||||
"probPrecipitacion": [
|
|
||||||
{"value": 5, "periodo": "00-24"}
|
|
||||||
],
|
|
||||||
"viento": [
|
|
||||||
{"velocidad": [10], "direccion": ["NE"]}
|
|
||||||
]
|
|
||||||
}
|
|
||||||
]
|
|
||||||
}
|
|
||||||
}]
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def mock_historical_data(self):
|
|
||||||
"""Mock historical weather data from AEMET"""
|
|
||||||
return [
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-20",
|
|
||||||
"tmax": 25.2,
|
|
||||||
"tmin": 14.8,
|
|
||||||
"prec": 0.0,
|
|
||||||
"hr": 58,
|
|
||||||
"velmedia": 8.5,
|
|
||||||
"presMax": 1018.5,
|
|
||||||
"presMin": 1012.3
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"fecha": "2025-07-21",
|
|
||||||
"tmax": 27.1,
|
|
||||||
"tmin": 16.2,
|
|
||||||
"prec": 2.5,
|
|
||||||
"hr": 72,
|
|
||||||
"velmedia": 12.0,
|
|
||||||
"presMax": 1015.8,
|
|
||||||
"presMin": 1010.1
|
|
||||||
}
|
|
||||||
]
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# CURRENT WEATHER TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_get_current_weather_success(self, client, madrid_coords, mock_aemet_response, mock_weather_data):
|
|
||||||
"""Test successful current weather retrieval"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get, \
|
|
||||||
patch.object(client, '_fetch_from_url', new_callable=AsyncMock) as mock_fetch:
|
|
||||||
|
|
||||||
mock_get.return_value = mock_aemet_response
|
|
||||||
mock_fetch.return_value = [mock_weather_data]
|
|
||||||
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
# Validate result structure
|
|
||||||
assert result is not None, "Should return weather data"
|
|
||||||
assert isinstance(result, dict), "Result should be a dictionary"
|
|
||||||
|
|
||||||
# Check required fields
|
|
||||||
required_fields = ['date', 'temperature', 'precipitation', 'humidity', 'wind_speed', 'pressure', 'description', 'source']
|
|
||||||
for field in required_fields:
|
|
||||||
assert field in result, f"Missing required field: {field}"
|
|
||||||
|
|
||||||
# Validate data types and ranges
|
|
||||||
assert isinstance(result['temperature'], float), "Temperature should be float"
|
|
||||||
assert -20 <= result['temperature'] <= 50, "Temperature should be reasonable"
|
|
||||||
assert isinstance(result['precipitation'], float), "Precipitation should be float"
|
|
||||||
assert result['precipitation'] >= 0, "Precipitation should be non-negative"
|
|
||||||
assert 0 <= result['humidity'] <= 100, "Humidity should be percentage"
|
|
||||||
assert result['wind_speed'] >= 0, "Wind speed should be non-negative"
|
|
||||||
assert result['pressure'] > 900, "Pressure should be reasonable"
|
|
||||||
assert result['source'] == WeatherSource.AEMET.value, "Source should be AEMET"
|
|
||||||
|
|
||||||
print(f"✅ Current weather test passed - Temp: {result['temperature']}°C, Source: {result['source']}")
|
|
||||||
|
|
||||||
async def test_get_current_weather_fallback_to_synthetic(self, client, madrid_coords):
|
|
||||||
"""Test fallback to synthetic data when AEMET API fails"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get:
|
|
||||||
mock_get.return_value = None # Simulate API failure
|
|
||||||
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
assert result is not None, "Should return synthetic data"
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should use synthetic source"
|
|
||||||
assert isinstance(result['temperature'], float), "Temperature should be float"
|
|
||||||
|
|
||||||
print(f"✅ Synthetic fallback test passed - Source: {result['source']}")
|
|
||||||
|
|
||||||
async def test_get_current_weather_invalid_coordinates(self, client):
|
|
||||||
"""Test current weather with invalid coordinates"""
|
|
||||||
invalid_coords = [
|
|
||||||
(200, 200), # Out of range
|
|
||||||
(-200, -200), # Out of range
|
|
||||||
(0, 0), # Not in Madrid area
|
|
||||||
]
|
|
||||||
|
|
||||||
for lat, lon in invalid_coords:
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
# Should still return data (synthetic)
|
|
||||||
assert result is not None, f"Should handle invalid coords ({lat}, {lon})"
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should use synthetic for invalid coords"
|
|
||||||
|
|
||||||
print(f"✅ Invalid coordinates test passed")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# FORECAST TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_get_forecast_success(self, client, madrid_coords, mock_aemet_response, mock_forecast_data):
|
|
||||||
"""Test successful weather forecast retrieval"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
days = 7
|
|
||||||
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get, \
|
|
||||||
patch.object(client, '_fetch_from_url', new_callable=AsyncMock) as mock_fetch:
|
|
||||||
|
|
||||||
mock_get.return_value = mock_aemet_response
|
|
||||||
mock_fetch.return_value = mock_forecast_data
|
|
||||||
|
|
||||||
result = await client.get_forecast(lat, lon, days)
|
|
||||||
|
|
||||||
# Validate result structure
|
|
||||||
assert isinstance(result, list), "Result should be a list"
|
|
||||||
assert len(result) == days, f"Should return {days} forecast days"
|
|
||||||
|
|
||||||
# Check first forecast day
|
|
||||||
if result:
|
|
||||||
forecast_day = result[0]
|
|
||||||
|
|
||||||
required_fields = ['forecast_date', 'generated_at', 'temperature', 'precipitation', 'humidity', 'wind_speed', 'description', 'source']
|
|
||||||
for field in required_fields:
|
|
||||||
assert field in forecast_day, f"Missing required field: {field}"
|
|
||||||
|
|
||||||
# Validate data types
|
|
||||||
assert isinstance(forecast_day['forecast_date'], datetime), "Forecast date should be datetime"
|
|
||||||
assert isinstance(forecast_day['temperature'], (int, float)), "Temperature should be numeric"
|
|
||||||
assert isinstance(forecast_day['precipitation'], (int, float)), "Precipitation should be numeric"
|
|
||||||
assert forecast_day['source'] in [WeatherSource.AEMET.value, WeatherSource.SYNTHETIC.value], "Valid source"
|
|
||||||
|
|
||||||
print(f"✅ Forecast test passed - {len(result)} days, Source: {forecast_day['source']}")
|
|
||||||
|
|
||||||
async def test_get_forecast_different_durations(self, client, madrid_coords):
|
|
||||||
"""Test forecast for different time durations"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
test_durations = [1, 3, 7, 14]
|
|
||||||
|
|
||||||
for days in test_durations:
|
|
||||||
result = await client.get_forecast(lat, lon, days)
|
|
||||||
|
|
||||||
assert isinstance(result, list), f"Result should be list for {days} days"
|
|
||||||
assert len(result) == days, f"Should return exactly {days} forecast days"
|
|
||||||
|
|
||||||
# Check date progression
|
|
||||||
if len(result) > 1:
|
|
||||||
for i in range(1, len(result)):
|
|
||||||
date_diff = result[i]['forecast_date'] - result[i-1]['forecast_date']
|
|
||||||
assert date_diff.days == 1, "Forecast dates should be consecutive days"
|
|
||||||
|
|
||||||
print(f"✅ Multiple duration forecast test passed")
|
|
||||||
|
|
||||||
async def test_get_forecast_fallback_to_synthetic(self, client, madrid_coords):
|
|
||||||
"""Test forecast fallback to synthetic data"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
with patch.object(client.location_service, 'get_municipality_code') as mock_municipality:
|
|
||||||
mock_municipality.return_value = None # No municipality found
|
|
||||||
|
|
||||||
result = await client.get_forecast(lat, lon, 7)
|
|
||||||
|
|
||||||
assert isinstance(result, list), "Should return synthetic forecast"
|
|
||||||
assert len(result) == 7, "Should return 7 days"
|
|
||||||
assert all(day['source'] == WeatherSource.SYNTHETIC.value for day in result), "All should be synthetic"
|
|
||||||
|
|
||||||
print(f"✅ Forecast synthetic fallback test passed")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# HISTORICAL WEATHER TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_get_historical_weather_success(self, client, madrid_coords, mock_aemet_response, mock_historical_data):
|
|
||||||
"""Test successful historical weather retrieval"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get, \
|
|
||||||
patch.object(client, '_fetch_from_url', new_callable=AsyncMock) as mock_fetch:
|
|
||||||
|
|
||||||
mock_get.return_value = mock_aemet_response
|
|
||||||
mock_fetch.return_value = mock_historical_data
|
|
||||||
|
|
||||||
result = await client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
# Validate result structure
|
|
||||||
assert isinstance(result, list), "Result should be a list"
|
|
||||||
assert len(result) > 0, "Should return historical data"
|
|
||||||
|
|
||||||
# Check first historical record
|
|
||||||
if result:
|
|
||||||
record = result[0]
|
|
||||||
|
|
||||||
required_fields = ['date', 'temperature', 'precipitation', 'humidity', 'wind_speed', 'pressure', 'description', 'source']
|
|
||||||
for field in required_fields:
|
|
||||||
assert field in record, f"Missing required field: {field}"
|
|
||||||
|
|
||||||
# Validate data types and ranges
|
|
||||||
assert isinstance(record['date'], datetime), "Date should be datetime"
|
|
||||||
assert isinstance(record['temperature'], (int, float, type(None))), "Temperature should be numeric or None"
|
|
||||||
if record['temperature']:
|
|
||||||
assert -30 <= record['temperature'] <= 50, "Temperature should be reasonable"
|
|
||||||
assert record['precipitation'] >= 0, "Precipitation should be non-negative"
|
|
||||||
assert record['source'] == WeatherSource.AEMET.value, "Source should be AEMET"
|
|
||||||
|
|
||||||
print(f"✅ Historical weather test passed - {len(result)} records, Source: {record['source']}")
|
|
||||||
|
|
||||||
async def test_get_historical_weather_date_ranges(self, client, madrid_coords):
|
|
||||||
"""Test historical weather with different date ranges"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
end_date = datetime.now()
|
|
||||||
|
|
||||||
test_ranges = [
|
|
||||||
1, # 1 day
|
|
||||||
7, # 1 week
|
|
||||||
30, # 1 month
|
|
||||||
90, # 3 months
|
|
||||||
]
|
|
||||||
|
|
||||||
for days in test_ranges:
|
|
||||||
start_date = end_date - timedelta(days=days)
|
|
||||||
|
|
||||||
result = await client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
assert isinstance(result, list), f"Result should be list for {days} days"
|
|
||||||
# Note: Actual count may vary due to chunking and data availability
|
|
||||||
assert len(result) >= 0, f"Should return non-negative count for {days} days"
|
|
||||||
|
|
||||||
if result:
|
|
||||||
# Check date ordering
|
|
||||||
dates = [r['date'] for r in result if 'date' in r]
|
|
||||||
if len(dates) > 1:
|
|
||||||
assert dates == sorted(dates), "Historical dates should be in chronological order"
|
|
||||||
|
|
||||||
print(f"✅ Historical date ranges test passed")
|
|
||||||
|
|
||||||
async def test_get_historical_weather_chunking(self, client, madrid_coords):
|
|
||||||
"""Test historical weather data chunking for large date ranges"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=65) # More than 30 days to trigger chunking
|
|
||||||
|
|
||||||
with patch.object(client, '_fetch_historical_chunk', new_callable=AsyncMock) as mock_chunk:
|
|
||||||
mock_chunk.return_value = [] # Empty chunks
|
|
||||||
|
|
||||||
result = await client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
# Should have called chunking at least twice (65 days > 30 day limit)
|
|
||||||
assert mock_chunk.call_count >= 2, "Should chunk large date ranges"
|
|
||||||
|
|
||||||
print(f"✅ Historical chunking test passed - {mock_chunk.call_count} chunks")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# COMPONENT TESTS
|
|
||||||
# ================================================================
|
|
||||||
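# NOTE: skip_asyncio appears to be a project-defined marker used to exempt these
# synchronous helper tests from the module-level asyncio mark; it is assumed to be
# registered in the pytest configuration rather than provided by pytest-asyncio itself.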
@pytest.mark.skip_asyncio
|
|
||||||
def test_weather_data_parser(self):
|
|
||||||
"""Test WeatherDataParser functionality"""
|
|
||||||
parser = WeatherDataParser()
|
|
||||||
|
|
||||||
# Test safe_float
|
|
||||||
assert parser.safe_float("15.5", 0.0) == 15.5
|
|
||||||
assert parser.safe_float(None, 10.0) == 10.0
|
|
||||||
assert parser.safe_float("invalid", 5.0) == 5.0
|
|
||||||
assert parser.safe_float(20) == 20.0
|
|
||||||
|
|
||||||
# Test extract_temperature_value
|
|
||||||
assert parser.extract_temperature_value(25.5) == 25.5
|
|
||||||
assert parser.extract_temperature_value("20.0") == 20.0
|
|
||||||
assert parser.extract_temperature_value({"valor": 18.5}) == 18.5
|
|
||||||
assert parser.extract_temperature_value([{"valor": 22.0}]) == 22.0
|
|
||||||
assert parser.extract_temperature_value(None) is None
|
|
||||||
|
|
||||||
# Test generate_weather_description
|
|
||||||
assert "Lluvioso" in parser.generate_weather_description(20, 6.0, 60)
|
|
||||||
assert "Nuboso con lluvia" in parser.generate_weather_description(20, 1.0, 60)
|
|
||||||
assert "Nuboso" in parser.generate_weather_description(20, 0, 85)
|
|
||||||
assert "Soleado y cálido" in parser.generate_weather_description(30, 0, 60)
|
|
||||||
assert "Frío" in parser.generate_weather_description(2, 0, 60)
|
|
||||||
|
|
||||||
print(f"✅ WeatherDataParser tests passed")
|
|
||||||
|
|
||||||
@pytest.mark.skip_asyncio
|
|
||||||
def test_synthetic_weather_generator(self):
|
|
||||||
"""Test SyntheticWeatherGenerator functionality"""
|
|
||||||
generator = SyntheticWeatherGenerator()
|
|
||||||
|
|
||||||
# Test current weather generation
|
|
||||||
current = generator.generate_current_weather()
|
|
||||||
|
|
||||||
assert isinstance(current, dict), "Should return dictionary"
|
|
||||||
assert 'temperature' in current, "Should have temperature"
|
|
||||||
assert 'precipitation' in current, "Should have precipitation"
|
|
||||||
assert current['source'] == WeatherSource.SYNTHETIC.value, "Should be synthetic source"
|
|
||||||
assert isinstance(current['date'], datetime), "Should have datetime"
|
|
||||||
|
|
||||||
# Test forecast generation
|
|
||||||
forecast = generator.generate_forecast_sync(5)
|
|
||||||
|
|
||||||
assert isinstance(forecast, list), "Should return list"
|
|
||||||
assert len(forecast) == 5, "Should return requested days"
|
|
||||||
assert all('forecast_date' in day for day in forecast), "All days should have forecast_date"
|
|
||||||
assert all(day['source'] == WeatherSource.SYNTHETIC.value for day in forecast), "All should be synthetic"
|
|
||||||
|
|
||||||
# Test historical generation
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
historical = generator.generate_historical_data(start_date, end_date)
|
|
||||||
|
|
||||||
assert isinstance(historical, list), "Should return list"
|
|
||||||
assert len(historical) == 8, "Should return 8 days (inclusive)"
|
|
||||||
assert all('date' in day for day in historical), "All days should have date"
|
|
||||||
assert all(day['source'] == WeatherSource.SYNTHETIC.value for day in historical), "All should be synthetic"
|
|
||||||
|
|
||||||
print(f"✅ SyntheticWeatherGenerator tests passed")
|
|
||||||
|
|
||||||
@pytest.mark.skip_asyncio
|
|
||||||
def test_location_service(self):
|
|
||||||
"""Test LocationService functionality"""
|
|
||||||
# Test distance calculation
|
|
||||||
madrid_center = (40.4168, -3.7038)
|
|
||||||
madrid_north = (40.4677, -3.5552)
|
|
||||||
|
|
||||||
distance = LocationService.calculate_distance(
|
|
||||||
madrid_center[0], madrid_center[1],
|
|
||||||
madrid_north[0], madrid_north[1]
|
|
||||||
)
|
|
||||||
|
|
||||||
assert isinstance(distance, float), "Distance should be float"
|
|
||||||
assert 0 < distance < 50, "Distance should be reasonable for Madrid area"
|
|
||||||
|
|
||||||
# Test nearest station finding
|
|
||||||
station_id = LocationService.find_nearest_station(madrid_center[0], madrid_center[1])
|
|
||||||
|
|
||||||
assert station_id is not None, "Should find a station"
|
|
||||||
assert station_id in [station.id for station in AEMETConstants.MADRID_STATIONS], "Should be valid station"
|
|
||||||
|
|
||||||
# Test municipality code
|
|
||||||
municipality = LocationService.get_municipality_code(madrid_center[0], madrid_center[1])
|
|
||||||
assert municipality == AEMETConstants.MADRID_MUNICIPALITY_CODE, "Should return Madrid code"
|
|
||||||
|
|
||||||
# Test outside Madrid
|
|
||||||
outside_madrid = LocationService.get_municipality_code(41.0, -4.0) # Outside bounds
|
|
||||||
assert outside_madrid is None, "Should return None for outside Madrid"
|
|
||||||
|
|
||||||
print(f"✅ LocationService tests passed")
|
|
||||||
|
|
||||||
@pytest.mark.skip_asyncio
|
|
||||||
def test_constants_and_enums(self):
|
|
||||||
"""Test constants and enum definitions"""
|
|
||||||
# Test WeatherSource enum
|
|
||||||
assert WeatherSource.AEMET.value == "aemet"
|
|
||||||
assert WeatherSource.SYNTHETIC.value == "synthetic"
|
|
||||||
assert WeatherSource.DEFAULT.value == "default"
|
|
||||||
|
|
||||||
# Test GeographicBounds
|
|
||||||
bounds = AEMETConstants.MADRID_BOUNDS
|
|
||||||
assert bounds.contains(40.4168, -3.7038), "Should contain Madrid center"
|
|
||||||
assert not bounds.contains(41.0, -4.0), "Should not contain coordinates outside Madrid"
|
|
||||||
|
|
||||||
# Test WeatherStation
|
|
||||||
station = AEMETConstants.MADRID_STATIONS[0]
|
|
||||||
assert isinstance(station, WeatherStation), "Should be WeatherStation instance"
|
|
||||||
assert station.id is not None, "Station should have ID"
|
|
||||||
assert station.name is not None, "Station should have name"
|
|
||||||
|
|
||||||
print(f"✅ Constants and enums tests passed")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# ERROR HANDLING TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_api_error_handling(self, client, madrid_coords):
|
|
||||||
"""Test handling of various API errors"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Test network error
|
|
||||||
with patch.object(client, '_get', side_effect=Exception("Network error")):
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback on network error"
|
|
||||||
|
|
||||||
# Test invalid API response
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get:
|
|
||||||
mock_get.return_value = {"error": "Invalid API key"}
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback on API error"
|
|
||||||
|
|
||||||
# Test malformed data
|
|
||||||
with patch.object(client, '_get', new_callable=AsyncMock) as mock_get, \
|
|
||||||
patch.object(client, '_fetch_from_url', new_callable=AsyncMock) as mock_fetch:
|
|
||||||
|
|
||||||
mock_get.return_value = {"datos": "http://example.com"}
|
|
||||||
mock_fetch.return_value = [{"invalid": "data"}] # Missing expected fields
|
|
||||||
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
assert result is not None, "Should handle malformed data gracefully"
|
|
||||||
|
|
||||||
print(f"✅ API error handling tests passed")
|
|
||||||
|
|
||||||
async def test_timeout_handling(self, client, madrid_coords):
|
|
||||||
"""Test timeout handling"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
with patch.object(client, '_get', side_effect=asyncio.TimeoutError("Request timeout")):
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback on timeout"
|
|
||||||
|
|
||||||
print(f"✅ Timeout handling test passed")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# PERFORMANCE TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_performance_current_weather(self, client, madrid_coords):
|
|
||||||
"""Test current weather performance"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
start_time = datetime.now()
|
|
||||||
result = await client.get_current_weather(lat, lon)
|
|
||||||
execution_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
assert result is not None, "Should return weather data"
|
|
||||||
assert execution_time < 5000, "Should execute within 5 seconds"
|
|
||||||
|
|
||||||
print(f"✅ Current weather performance test passed - {execution_time:.0f}ms")
|
|
||||||
|
|
||||||
async def test_performance_forecast(self, client, madrid_coords):
|
|
||||||
"""Test forecast performance"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
start_time = datetime.now()
|
|
||||||
result = await client.get_forecast(lat, lon, 7)
|
|
||||||
execution_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
assert isinstance(result, list), "Should return forecast list"
|
|
||||||
assert len(result) == 7, "Should return 7 days"
|
|
||||||
assert execution_time < 5000, "Should execute within 5 seconds"
|
|
||||||
|
|
||||||
print(f"✅ Forecast performance test passed - {execution_time:.0f}ms")
|
|
||||||
|
|
||||||
async def test_performance_historical(self, client, madrid_coords):
|
|
||||||
"""Test historical weather performance"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
|
|
||||||
start_time = datetime.now()
|
|
||||||
result = await client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
execution_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
assert isinstance(result, list), "Should return historical list"
|
|
||||||
assert execution_time < 10000, "Should execute within 10 seconds (allowing for API calls)"
|
|
||||||
|
|
||||||
print(f"✅ Historical performance test passed - {execution_time:.0f}ms")
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# INTEGRATION TESTS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def test_real_aemet_api_access(self, client, madrid_coords):
|
|
||||||
"""Test actual AEMET API access (if API key is available)"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
try:
|
|
||||||
# Test current weather
|
|
||||||
current_result = await client.get_current_weather(lat, lon)
|
|
||||||
assert current_result is not None, "Should get current weather"
|
|
||||||
|
|
||||||
if current_result['source'] == WeatherSource.AEMET.value:
|
|
||||||
print(f"🎉 SUCCESS: Got real AEMET current weather data!")
|
|
||||||
print(f" Temperature: {current_result['temperature']}°C")
|
|
||||||
print(f" Description: {current_result['description']}")
|
|
||||||
else:
|
|
||||||
print(f"ℹ️ Got synthetic current weather (API key may not be configured)")
|
|
||||||
|
|
||||||
# Test forecast
|
|
||||||
forecast_result = await client.get_forecast(lat, lon, 3)
|
|
||||||
assert len(forecast_result) == 3, "Should get 3-day forecast"
|
|
||||||
|
|
||||||
if forecast_result[0]['source'] == WeatherSource.AEMET.value:
|
|
||||||
print(f"🎉 SUCCESS: Got real AEMET forecast data!")
|
|
||||||
print(f" Tomorrow: {forecast_result[1]['temperature']}°C - {forecast_result[1]['description']}")
|
|
||||||
else:
|
|
||||||
print(f"ℹ️ Got synthetic forecast (API key may not be configured)")
|
|
||||||
|
|
||||||
# Test historical (last week)
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
historical_result = await client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
assert isinstance(historical_result, list), "Should get historical data"
|
|
||||||
|
|
||||||
real_historical = [r for r in historical_result if r['source'] == WeatherSource.AEMET.value]
|
|
||||||
if real_historical:
|
|
||||||
print(f"🎉 SUCCESS: Got real AEMET historical data!")
|
|
||||||
print(f" Records: {len(real_historical)} real + {len(historical_result) - len(real_historical)} synthetic")
|
|
||||||
else:
|
|
||||||
print(f"ℹ️ Got synthetic historical data (API limitations or key issues)")
|
|
||||||
|
|
||||||
print(f"✅ Real AEMET API integration test completed")
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
print(f"⚠️ AEMET API integration test failed: {e}")
|
|
||||||
# This is acceptable if API key is not configured
|
|
||||||
|
|
||||||
async def test_data_consistency(self, client, madrid_coords):
|
|
||||||
"""Test data consistency across different methods"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get current weather
|
|
||||||
current = await client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
# Get today's forecast
|
|
||||||
forecast = await client.get_forecast(lat, lon, 1)
|
|
||||||
today_forecast = forecast[0] if forecast else None
|
|
||||||
|
|
||||||
if current and today_forecast:
|
|
||||||
# Temperature should be somewhat consistent
|
|
||||||
temp_diff = abs(current['temperature'] - today_forecast['temperature'])
|
|
||||||
assert temp_diff < 15, "Current and forecast temperature should be reasonably consistent"
|
|
||||||
|
|
||||||
# Both should use same source type preference
|
|
||||||
if current['source'] == WeatherSource.AEMET.value:
|
|
||||||
assert today_forecast['source'] == WeatherSource.AEMET.value, "Should use consistent data sources"
|
|
||||||
|
|
||||||
print(f"✅ Data consistency test passed")
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# STANDALONE TEST FUNCTIONS
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def run_manual_test():
|
|
||||||
"""Manual test function that can be run directly"""
|
|
||||||
print("="*60)
|
|
||||||
print("AEMET WEATHER CLIENT TEST - JULY 2025")
|
|
||||||
print("="*60)
|
|
||||||
|
|
||||||
client = AEMETClient()
|
|
||||||
madrid_lat, madrid_lon = 40.4168, -3.7038 # Madrid center
|
|
||||||
|
|
||||||
print(f"\n=== Testing Madrid Weather ({madrid_lat}, {madrid_lon}) ===")
|
|
||||||
|
|
||||||
# Test current weather
|
|
||||||
print(f"\n1. Testing Current Weather...")
|
|
||||||
current = await client.get_current_weather(madrid_lat, madrid_lon)
|
|
||||||
if current:
|
|
||||||
print(f" Temperature: {current['temperature']}°C")
|
|
||||||
print(f" Description: {current['description']}")
|
|
||||||
print(f" Humidity: {current['humidity']}%")
|
|
||||||
print(f" Wind: {current['wind_speed']} km/h")
|
|
||||||
print(f" Source: {current['source']}")
|
|
||||||
|
|
||||||
# Test forecast
|
|
||||||
print(f"\n2. Testing 7-Day Forecast...")
|
|
||||||
forecast = await client.get_forecast(madrid_lat, madrid_lon, 7)
|
|
||||||
if forecast:
|
|
||||||
print(f" Forecast days: {len(forecast)}")
|
|
||||||
print(f" Tomorrow: {forecast[1]['temperature']}°C - {forecast[1]['description']}")
|
|
||||||
print(f" Source: {forecast[0]['source']}")
|
|
||||||
|
|
||||||
# Test historical
|
|
||||||
print(f"\n3. Testing Historical Weather (last 7 days)...")
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
historical = await client.get_historical_weather(madrid_lat, madrid_lon, start_date, end_date)
|
|
||||||
if historical:
|
|
||||||
print(f" Historical records: {len(historical)}")
|
|
||||||
if historical:
|
|
||||||
real_count = len([r for r in historical if r['source'] == WeatherSource.AEMET.value])
|
|
||||||
synthetic_count = len(historical) - real_count
|
|
||||||
print(f" Real data: {real_count}, Synthetic: {synthetic_count}")
|
|
||||||
|
|
||||||
print(f"\n✅ Manual test completed!")
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
# If run directly, execute manual test
|
|
||||||
asyncio.run(run_manual_test())
|
|
||||||
@@ -1,594 +0,0 @@
|
|||||||
# ================================================================
|
|
||||||
# services/data/tests/test_aemet_edge_cases.py
|
|
||||||
# ================================================================
|
|
||||||
"""
|
|
||||||
Edge cases and integration tests for AEMET weather API client
|
|
||||||
Covers boundary conditions, error scenarios, and complex integrations
|
|
||||||
"""
|
|
||||||
|
|
||||||
import pytest
|
|
||||||
import asyncio
|
|
||||||
from datetime import datetime, timedelta
|
|
||||||
from unittest.mock import Mock, patch, AsyncMock
|
|
||||||
import json
|
|
||||||
from typing import Dict, List, Any
|
|
||||||
|
|
||||||
from app.external.aemet import (
|
|
||||||
AEMETClient,
|
|
||||||
WeatherDataParser,
|
|
||||||
SyntheticWeatherGenerator,
|
|
||||||
LocationService,
|
|
||||||
AEMETConstants,
|
|
||||||
WeatherSource
|
|
||||||
)
|
|
||||||
|
|
||||||
# Configure pytest-asyncio
|
|
||||||
pytestmark = pytest.mark.asyncio
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETEdgeCases:
|
|
||||||
"""Test edge cases and boundary conditions"""
|
|
||||||
|
|
||||||
async def test_extreme_coordinates(self, aemet_client):
|
|
||||||
"""Test handling of extreme coordinate values"""
|
|
||||||
extreme_coords = [
|
|
||||||
(90, 180), # North pole, antimeridian
|
|
||||||
(-90, -180), # South pole, antimeridian
|
|
||||||
(0, 0), # Null island
|
|
||||||
(40.5, -180), # Valid latitude, extreme longitude
|
|
||||||
(90, -3.7), # Extreme latitude, Madrid longitude
|
|
||||||
]
|
|
||||||
|
|
||||||
for lat, lon in extreme_coords:
|
|
||||||
result = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
assert result is not None, f"Should handle extreme coords ({lat}, {lon})"
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback to synthetic for extreme coords"
|
|
||||||
assert isinstance(result['temperature'], (int, float)), "Should have valid temperature"
|
|
||||||
|
|
||||||
async def test_boundary_date_ranges(self, aemet_client, madrid_coords):
|
|
||||||
"""Test boundary conditions for date ranges"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
now = datetime.now()
|
|
||||||
|
|
||||||
# Test same start and end date
|
|
||||||
result = await aemet_client.get_historical_weather(lat, lon, now, now)
|
|
||||||
assert isinstance(result, list), "Should return list for same-day request"
|
|
||||||
|
|
||||||
# Test reverse date range (end before start)
|
|
||||||
start_date = now
|
|
||||||
end_date = now - timedelta(days=1)
|
|
||||||
result = await aemet_client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
assert isinstance(result, list), "Should handle reverse date range gracefully"
|
|
||||||
|
|
||||||
# Test extremely large date range
|
|
||||||
start_date = now - timedelta(days=1000)
|
|
||||||
end_date = now
|
|
||||||
result = await aemet_client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
assert isinstance(result, list), "Should handle very large date ranges"
|
|
||||||
|
|
||||||
async def test_forecast_edge_durations(self, aemet_client, madrid_coords):
|
|
||||||
"""Test forecast with edge case durations"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
edge_durations = [0, 1, 30, 365, -1, 1000]
|
|
||||||
|
|
||||||
for days in edge_durations:
|
|
||||||
try:
|
|
||||||
result = await aemet_client.get_forecast(lat, lon, days)
|
|
||||||
|
|
||||||
if days <= 0:
|
|
||||||
assert len(result) == 0 or result is None, f"Should handle non-positive days ({days})"
|
|
||||||
elif days > 100:
|
|
||||||
# Should handle gracefully, possibly with synthetic data
|
|
||||||
assert isinstance(result, list), f"Should handle large day count ({days})"
|
|
||||||
else:
|
|
||||||
assert len(result) == days, f"Should return {days} forecast days"
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
# Some edge cases might raise exceptions, which is acceptable
|
|
||||||
print(f"ℹ️ Days={days} raised exception: {e}")
|
|
||||||
|
|
||||||
def test_parser_edge_cases(self, weather_parser):
|
|
||||||
"""Test weather data parser with edge case inputs"""
|
|
||||||
# Test with None values
|
|
||||||
result = weather_parser.safe_float(None, 10.0)
|
|
||||||
assert result == 10.0, "Should return default for None"
|
|
||||||
|
|
||||||
# Test with empty strings
|
|
||||||
result = weather_parser.safe_float("", 5.0)
|
|
||||||
assert result == 5.0, "Should return default for empty string"
|
|
||||||
|
|
||||||
# Test with extreme values
|
|
||||||
result = weather_parser.safe_float("999999.99", 0.0)
|
|
||||||
assert result == 999999.99, "Should handle large numbers"
|
|
||||||
|
|
||||||
result = weather_parser.safe_float("-999.99", 0.0)
|
|
||||||
assert result == -999.99, "Should handle negative numbers"
|
|
||||||
|
|
||||||
# Test temperature extraction edge cases
|
|
||||||
assert weather_parser.extract_temperature_value([]) is None, "Should handle empty list"
|
|
||||||
assert weather_parser.extract_temperature_value({}) is None, "Should handle empty dict"
|
|
||||||
assert weather_parser.extract_temperature_value("invalid") is None, "Should handle invalid string"
|
|
||||||
|
|
||||||
def test_synthetic_generator_edge_cases(self, synthetic_generator):
|
|
||||||
"""Test synthetic weather generator edge cases"""
|
|
||||||
# Test with extreme date ranges
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=1000)
|
|
||||||
|
|
||||||
result = synthetic_generator.generate_historical_data(start_date, end_date)
|
|
||||||
assert isinstance(result, list), "Should handle large date ranges"
|
|
||||||
assert len(result) == 1001, "Should generate correct number of days"
|
|
||||||
|
|
||||||
# Test forecast with zero days
|
|
||||||
result = synthetic_generator.generate_forecast_sync(0)
|
|
||||||
assert result == [], "Should return empty list for zero days"
|
|
||||||
|
|
||||||
# Test forecast with large number of days
|
|
||||||
result = synthetic_generator.generate_forecast_sync(1000)
|
|
||||||
assert len(result) == 1000, "Should handle large forecast ranges"
|
|
||||||
|
|
||||||
def test_location_service_edge_cases(self):
|
|
||||||
"""Test location service edge cases"""
|
|
||||||
# Test distance calculation with same points
|
|
||||||
distance = LocationService.calculate_distance(40.4, -3.7, 40.4, -3.7)
|
|
||||||
assert distance == 0.0, "Distance between same points should be zero"
|
|
||||||
|
|
||||||
# Test distance calculation with antipodal points
|
|
||||||
distance = LocationService.calculate_distance(40.4, -3.7, -40.4, 176.3)
|
|
||||||
assert distance > 15000, "Antipodal points should be far apart"
|
|
||||||
|
|
||||||
# Test station finding with no stations (if list were empty)
|
|
||||||
with patch.object(AEMETConstants, 'MADRID_STATIONS', []):
|
|
||||||
station = LocationService.find_nearest_station(40.4, -3.7)
|
|
||||||
assert station is None, "Should return None when no stations available"
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETDataIntegrity:
|
|
||||||
"""Test data integrity and consistency"""
|
|
||||||
|
|
||||||
async def test_data_type_consistency(self, aemet_client, madrid_coords):
|
|
||||||
"""Test that data types are consistent across calls"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get current weather multiple times
|
|
||||||
results = []
|
|
||||||
for _ in range(3):
|
|
||||||
result = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
results.append(result)
|
|
||||||
|
|
||||||
# Check that field types are consistent
|
|
||||||
if all(r is not None for r in results):
|
|
||||||
for field in ['temperature', 'precipitation', 'humidity', 'wind_speed', 'pressure']:
|
|
||||||
types = [type(r[field]) for r in results if field in r]
|
|
||||||
if types:
|
|
||||||
first_type = types[0]
|
|
||||||
assert all(t == first_type for t in types), f"Inconsistent types for {field}: {types}"
|
|
||||||
|
|
||||||
async def test_temperature_consistency(self, aemet_client, madrid_coords):
|
|
||||||
"""Test temperature consistency between different data sources"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get current weather and today's forecast
|
|
||||||
current = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
forecast = await aemet_client.get_forecast(lat, lon, 1)
|
|
||||||
|
|
||||||
if current and forecast and len(forecast) > 0:
|
|
||||||
current_temp = current['temperature']
|
|
||||||
forecast_temp = forecast[0]['temperature']
|
|
||||||
|
|
||||||
# Temperatures should be reasonably close (within 15°C)
|
|
||||||
temp_diff = abs(current_temp - forecast_temp)
|
|
||||||
assert temp_diff < 15, f"Temperature difference too large: current={current_temp}°C, forecast={forecast_temp}°C"
|
|
||||||
|
|
||||||
async def test_source_consistency(self, aemet_client, madrid_coords):
|
|
||||||
"""Test that data source is consistent within same time period"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get multiple current weather readings
|
|
||||||
current1 = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
current2 = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
if current1 and current2:
|
|
||||||
# Should use same source type (both real or both synthetic)
|
|
||||||
assert current1['source'] == current2['source'], "Should use consistent data source"
|
|
||||||
|
|
||||||
def test_historical_data_ordering(self, weather_parser, mock_historical_data):
|
|
||||||
"""Test that historical data is properly ordered"""
|
|
||||||
parsed_data = weather_parser.parse_historical_data(mock_historical_data)
|
|
||||||
|
|
||||||
if len(parsed_data) > 1:
|
|
||||||
dates = [record['date'] for record in parsed_data]
|
|
||||||
assert dates == sorted(dates), "Historical data should be chronologically ordered"
|
|
||||||
|
|
||||||
def test_forecast_date_progression(self, weather_parser, mock_forecast_data):
|
|
||||||
"""Test that forecast dates progress correctly"""
|
|
||||||
parsed_forecast = weather_parser.parse_forecast_data(mock_forecast_data, 7)
|
|
||||||
|
|
||||||
if len(parsed_forecast) > 1:
|
|
||||||
for i in range(1, len(parsed_forecast)):
|
|
||||||
prev_date = parsed_forecast[i-1]['forecast_date']
|
|
||||||
curr_date = parsed_forecast[i]['forecast_date']
|
|
||||||
diff = (curr_date - prev_date).days
|
|
||||||
assert diff == 1, f"Forecast dates should be consecutive days, got {diff} day difference"
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETErrorRecovery:
|
|
||||||
"""Test error recovery and resilience"""
|
|
||||||
|
|
||||||
async def test_network_interruption_recovery(self, aemet_client, madrid_coords):
|
|
||||||
"""Test recovery from network interruptions"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Mock intermittent network failures
|
|
||||||
call_count = 0
|
|
||||||
|
|
||||||
async def mock_get_with_failures(*args, **kwargs):
|
|
||||||
nonlocal call_count
|
|
||||||
call_count += 1
|
|
||||||
if call_count <= 2: # Fail first two calls
|
|
||||||
raise Exception("Network timeout")
|
|
||||||
else:
|
|
||||||
return {"datos": "http://example.com/data"}
|
|
||||||
|
|
||||||
with patch.object(aemet_client, '_get', side_effect=mock_get_with_failures):
|
|
||||||
result = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
# Should eventually succeed or fallback to synthetic
|
|
||||||
assert result is not None, "Should recover from network failures"
|
|
||||||
assert result['source'] in [WeatherSource.AEMET.value, WeatherSource.SYNTHETIC.value]
|
|
||||||
|
|
||||||
async def test_partial_data_recovery(self, aemet_client, madrid_coords, weather_parser):
|
|
||||||
"""Test recovery from partial/corrupted data"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Mock corrupted historical data (some records missing fields)
|
|
||||||
corrupted_data = [
|
|
||||||
{"fecha": "2025-07-20", "tmax": 25.2}, # Missing tmin and other fields
|
|
||||||
{"fecha": "2025-07-21"}, # Only has date
|
|
||||||
{"tmax": 27.0, "tmin": 15.0}, # Missing date
|
|
||||||
{"fecha": "2025-07-22", "tmax": 23.0, "tmin": 14.0, "prec": 0.0} # Complete record
|
|
||||||
]
|
|
||||||
|
|
||||||
parsed_data = weather_parser.parse_historical_data(corrupted_data)
|
|
||||||
|
|
||||||
# Should only return valid records and handle corrupted ones gracefully
|
|
||||||
assert isinstance(parsed_data, list), "Should return list even with corrupted data"
|
|
||||||
valid_records = [r for r in parsed_data if 'date' in r and r['date'] is not None]
|
|
||||||
assert len(valid_records) >= 1, "Should salvage at least some valid records"
|
|
||||||
|
|
||||||
async def test_malformed_json_recovery(self, aemet_client, madrid_coords):
|
|
||||||
"""Test recovery from malformed JSON responses"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Mock malformed responses
|
|
||||||
malformed_responses = [
|
|
||||||
None,
|
|
||||||
"",
|
|
||||||
"invalid json",
|
|
||||||
{"incomplete": "response"},
|
|
||||||
{"datos": None},
|
|
||||||
{"datos": ""},
|
|
||||||
]
|
|
||||||
|
|
||||||
for response in malformed_responses:
|
|
||||||
with patch.object(aemet_client, '_get', new_callable=AsyncMock, return_value=response):
|
|
||||||
result = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
assert result is not None, f"Should handle malformed response: {response}"
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback to synthetic"
|
|
||||||
|
|
||||||
async def test_api_rate_limiting_recovery(self, aemet_client, madrid_coords):
|
|
||||||
"""Test recovery from API rate limiting"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Mock rate limiting responses
|
|
||||||
rate_limit_response = {
|
|
||||||
"descripcion": "Demasiadas peticiones",
|
|
||||||
"estado": 429
|
|
||||||
}
|
|
||||||
|
|
||||||
with patch.object(aemet_client, '_get', new_callable=AsyncMock, return_value=rate_limit_response):
|
|
||||||
result = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
|
|
||||||
assert result is not None, "Should handle rate limiting"
|
|
||||||
assert result['source'] == WeatherSource.SYNTHETIC.value, "Should fallback to synthetic on rate limit"
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETPerformanceAndScaling:
|
|
||||||
"""Test performance characteristics and scaling behavior"""
|
|
||||||
|
|
||||||
async def test_concurrent_requests_performance(self, aemet_client, madrid_coords):
|
|
||||||
"""Test performance with concurrent requests"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Create multiple concurrent requests
|
|
||||||
tasks = []
|
|
||||||
for i in range(10):
|
|
||||||
task = aemet_client.get_current_weather(lat, lon)
|
|
||||||
tasks.append(task)
|
|
||||||
|
|
||||||
start_time = datetime.now()
|
|
||||||
results = await asyncio.gather(*tasks, return_exceptions=True)
|
|
||||||
execution_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
# Check that most requests succeeded
|
|
||||||
successful_results = [r for r in results if isinstance(r, dict) and 'temperature' in r]
|
|
||||||
assert len(successful_results) >= 8, "Most concurrent requests should succeed"
|
|
||||||
|
|
||||||
# Should complete in reasonable time (allowing for potential API rate limiting)
|
|
||||||
assert execution_time < 15000, f"Concurrent requests took too long: {execution_time:.0f}ms"
|
|
||||||
|
|
||||||
print(f"✅ Concurrent requests test - {len(successful_results)}/10 succeeded in {execution_time:.0f}ms")
|
|
||||||
|
|
||||||
async def test_memory_usage_with_large_datasets(self, aemet_client, madrid_coords):
|
|
||||||
"""Test memory usage with large historical datasets"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Request large historical dataset
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=90) # 3 months
|
|
||||||
|
|
||||||
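# NOTE: psutil is assumed to be installed in the test environment; it is used
# below only to sample process RSS before and after the large historical request.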
import psutil
|
|
||||||
import os
|
|
||||||
|
|
||||||
# Get initial memory usage
|
|
||||||
process = psutil.Process(os.getpid())
|
|
||||||
initial_memory = process.memory_info().rss / 1024 / 1024 # MB
|
|
||||||
|
|
||||||
result = await aemet_client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
# Get final memory usage
|
|
||||||
final_memory = process.memory_info().rss / 1024 / 1024 # MB
|
|
||||||
memory_increase = final_memory - initial_memory
|
|
||||||
|
|
||||||
assert isinstance(result, list), "Should return historical data"
|
|
||||||
|
|
||||||
# Memory increase should be reasonable (less than 100MB for 90 days)
|
|
||||||
assert memory_increase < 100, f"Memory usage increased too much: {memory_increase:.1f}MB"
|
|
||||||
|
|
||||||
print(f"✅ Memory usage test - {len(result)} records, +{memory_increase:.1f}MB")
|
|
||||||
|
|
||||||
async def test_caching_behavior(self, aemet_client, madrid_coords):
|
|
||||||
"""Test caching behavior and performance improvement"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# First request (cold)
|
|
||||||
start_time = datetime.now()
|
|
||||||
result1 = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
first_call_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
# Second request (potentially cached)
|
|
||||||
start_time = datetime.now()
|
|
||||||
result2 = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
second_call_time = (datetime.now() - start_time).total_seconds() * 1000
|
|
||||||
|
|
||||||
assert result1 is not None, "First call should succeed"
|
|
||||||
assert result2 is not None, "Second call should succeed"
|
|
||||||
|
|
||||||
# Both should return valid data
|
|
||||||
assert 'temperature' in result1, "First result should have temperature"
|
|
||||||
assert 'temperature' in result2, "Second result should have temperature"
|
|
||||||
|
|
||||||
print(f"✅ Caching test - First call: {first_call_time:.0f}ms, Second call: {second_call_time:.0f}ms")
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETIntegrationScenarios:
|
|
||||||
"""Test realistic integration scenarios"""
|
|
||||||
|
|
||||||
async def test_daily_weather_workflow(self, aemet_client, madrid_coords):
|
|
||||||
"""Test a complete daily weather workflow"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Simulate a daily weather check workflow
|
|
||||||
workflow_results = {}
|
|
||||||
|
|
||||||
# Step 1: Get current conditions
|
|
||||||
current = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
workflow_results['current'] = current
|
|
||||||
assert current is not None, "Should get current weather"
|
|
||||||
|
|
||||||
# Step 2: Get today's forecast
|
|
||||||
forecast = await aemet_client.get_forecast(lat, lon, 1)
|
|
||||||
workflow_results['forecast'] = forecast
|
|
||||||
assert len(forecast) == 1, "Should get today's forecast"
|
|
||||||
|
|
||||||
# Step 3: Get week ahead forecast
|
|
||||||
week_forecast = await aemet_client.get_forecast(lat, lon, 7)
|
|
||||||
workflow_results['week_forecast'] = week_forecast
|
|
||||||
assert len(week_forecast) == 7, "Should get 7-day forecast"
|
|
||||||
|
|
||||||
# Step 4: Get last week's actual weather for comparison
|
|
||||||
end_date = datetime.now() - timedelta(days=1)
|
|
||||||
start_date = end_date - timedelta(days=7)
|
|
||||||
historical = await aemet_client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
workflow_results['historical'] = historical
|
|
||||||
assert isinstance(historical, list), "Should get historical data"
|
|
||||||
|
|
||||||
# Validate workflow consistency
|
|
||||||
all_sources = set()
|
|
||||||
if current: all_sources.add(current['source'])
|
|
||||||
if forecast: all_sources.add(forecast[0]['source'])
|
|
||||||
if week_forecast: all_sources.add(week_forecast[0]['source'])
|
|
||||||
if historical: all_sources.update([h['source'] for h in historical])
|
|
||||||
|
|
||||||
print(f"✅ Daily workflow test - Sources used: {', '.join(all_sources)}")
|
|
||||||
|
|
||||||
return workflow_results
|
|
||||||
|
|
||||||
async def test_weather_alerting_scenario(self, aemet_client, madrid_coords):
|
|
||||||
"""Test weather alerting scenario"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get forecast for potential alerts
|
|
||||||
forecast = await aemet_client.get_forecast(lat, lon, 3)
|
|
||||||
|
|
||||||
alerts = []
|
|
||||||
for day in forecast:
|
|
||||||
# Check for extreme temperatures
|
|
||||||
if day['temperature'] > 35:
|
|
||||||
alerts.append(f"High temperature alert: {day['temperature']}°C on {day['forecast_date'].date()}")
|
|
||||||
elif day['temperature'] < -5:
|
|
||||||
alerts.append(f"Low temperature alert: {day['temperature']}°C on {day['forecast_date'].date()}")
|
|
||||||
|
|
||||||
# Check for high precipitation
|
|
||||||
if day['precipitation'] > 20:
|
|
||||||
alerts.append(f"Heavy rain alert: {day['precipitation']}mm on {day['forecast_date'].date()}")
|
|
||||||
|
|
||||||
# Alerts should be properly formatted
|
|
||||||
for alert in alerts:
|
|
||||||
assert isinstance(alert, str), "Alert should be string"
|
|
||||||
assert "alert" in alert.lower(), "Alert should contain 'alert'"
|
|
||||||
|
|
||||||
print(f"✅ Weather alerting test - {len(alerts)} alerts generated")
|
|
||||||
|
|
||||||
return alerts
|
|
||||||
|
|
||||||
async def test_historical_analysis_scenario(self, aemet_client, madrid_coords):
|
|
||||||
"""Test historical weather analysis scenario"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get historical data for analysis
|
|
||||||
end_date = datetime.now()
|
|
||||||
start_date = end_date - timedelta(days=30)
|
|
||||||
|
|
||||||
historical = await aemet_client.get_historical_weather(lat, lon, start_date, end_date)
|
|
||||||
|
|
||||||
if historical:
|
|
||||||
# Calculate statistics
|
|
||||||
temperatures = [h['temperature'] for h in historical if h['temperature'] is not None]
|
|
||||||
precipitations = [h['precipitation'] for h in historical if h['precipitation'] is not None]
|
|
||||||
|
|
||||||
if temperatures:
|
|
||||||
avg_temp = sum(temperatures) / len(temperatures)
|
|
||||||
max_temp = max(temperatures)
|
|
||||||
min_temp = min(temperatures)
|
|
||||||
|
|
||||||
# Validate statistics
|
|
||||||
assert min_temp <= avg_temp <= max_temp, "Temperature statistics should be logical"
|
|
||||||
assert -20 <= min_temp <= 50, "Min temperature should be reasonable"
|
|
||||||
assert -20 <= max_temp <= 50, "Max temperature should be reasonable"
|
|
||||||
|
|
||||||
if precipitations:
|
|
||||||
total_precip = sum(precipitations)
|
|
||||||
rainy_days = len([p for p in precipitations if p > 0.1])
|
|
||||||
|
|
||||||
# Validate precipitation statistics
|
|
||||||
assert total_precip >= 0, "Total precipitation should be non-negative"
|
|
||||||
assert 0 <= rainy_days <= len(precipitations), "Rainy days should be reasonable"
|
|
||||||
|
|
||||||
print(f"✅ Historical analysis test - {len(historical)} records analyzed")
|
|
||||||
|
|
||||||
return {
|
|
||||||
'record_count': len(historical),
|
|
||||||
'avg_temp': avg_temp if temperatures else None,
|
|
||||||
'temp_range': (min_temp, max_temp) if temperatures else None,
|
|
||||||
'total_precip': total_precip if precipitations else None,
|
|
||||||
'rainy_days': rainy_days if precipitations else None
|
|
||||||
}
|
|
||||||
|
|
||||||
return {}
|
|
||||||
|
|
||||||
|
|
||||||
class TestAEMETRegressionTests:
|
|
||||||
"""Regression tests for previously fixed issues"""
|
|
||||||
|
|
||||||
async def test_timezone_handling_regression(self, aemet_client, madrid_coords):
|
|
||||||
"""Regression test for timezone handling issues"""
|
|
||||||
lat, lon = madrid_coords
|
|
||||||
|
|
||||||
# Get current weather and forecast
|
|
||||||
current = await aemet_client.get_current_weather(lat, lon)
|
|
||||||
forecast = await aemet_client.get_forecast(lat, lon, 2)
|
|
||||||
|
|
||||||
if current:
|
|
||||||
# Current weather date should be recent (within last hour)
|
|
||||||
now = datetime.now()
|
|
||||||
time_diff = abs((now - current['date']).total_seconds())
|
|
||||||
assert time_diff < 3600, "Current weather timestamp should be recent"
|
|
||||||
|
|
||||||
if forecast:
|
|
||||||
# Forecast dates should be in the future
|
|
||||||
now = datetime.now().date()
|
|
||||||
for day in forecast:
|
|
||||||
forecast_date = day['forecast_date'].date()
|
|
||||||
assert forecast_date >= now, f"Forecast date {forecast_date} should be today or future"
|
|
||||||
|
|
||||||
async def test_data_type_conversion_regression(self, weather_parser):
|
|
||||||
"""Regression test for data type conversion issues"""
|
|
||||||
# Test cases that previously caused issues
|
|
||||||
test_cases = [
|
|
||||||
("25.5", 25.5), # String to float
|
|
||||||
(25, 25.0), # Int to float
|
|
||||||
("", None), # Empty string
|
|
||||||
("invalid", None), # Invalid string
|
|
||||||
(None, None), # None input
|
|
||||||
]
|
|
||||||
|
|
||||||
for input_val, expected in test_cases:
|
|
||||||
result = weather_parser.safe_float(input_val, None)
|
|
||||||
if expected is None:
|
|
||||||
assert result is None, f"Expected None for input {input_val}, got {result}"
|
|
||||||
else:
|
|
||||||
assert result == expected, f"Expected {expected} for input {input_val}, got {result}"
|
|
||||||
|
|
||||||
def test_empty_data_handling_regression(self, weather_parser):
|
|
||||||
"""Regression test for empty data handling"""
|
|
||||||
# Empty lists and dictionaries should be handled gracefully
|
|
||||||
empty_data_cases = [
|
|
||||||
[],
|
|
||||||
[{}],
|
|
||||||
[{"invalid": "data"}],
|
|
||||||
None,
|
|
||||||
]
|
|
||||||
|
|
||||||
for empty_data in empty_data_cases:
|
|
||||||
result = weather_parser.parse_historical_data(empty_data if empty_data is not None else [])
|
|
||||||
assert isinstance(result, list), f"Should return list for empty data: {empty_data}"
|
|
||||||
# May be empty or have some synthetic data, but should not crash
|
|
||||||
|
|
||||||
|
|
||||||
# ================================================================
|
|
||||||
# STANDALONE TEST RUNNER FOR EDGE CASES
|
|
||||||
# ================================================================
|
|
||||||
|
|
||||||
async def run_edge_case_tests():
|
|
||||||
"""Run edge case tests manually"""
|
|
||||||
print("="*60)
|
|
||||||
print("AEMET EDGE CASE TESTS")
|
|
||||||
print("="*60)
|
|
||||||
|
|
||||||
client = AEMETClient()
|
|
||||||
parser = WeatherDataParser()
|
|
||||||
generator = SyntheticWeatherGenerator()
|
|
||||||
|
|
||||||
madrid_coords = (40.4168, -3.7038)
|
|
||||||
|
|
||||||
print(f"\n1. Testing extreme coordinates...")
|
|
||||||
extreme_result = await client.get_current_weather(90, 180)
|
|
||||||
print(f" Extreme coords result: {extreme_result['source']} source")
|
|
||||||
|
|
||||||
print(f"\n2. Testing parser edge cases...")
|
|
||||||
parser_tests = [
|
|
||||||
parser.safe_float(None, 10.0),
|
|
||||||
parser.safe_float("invalid", 5.0),
|
|
||||||
parser.extract_temperature_value([]),
|
|
||||||
]
|
|
||||||
print(f" Parser edge cases passed: {len(parser_tests)}")
|
|
||||||
|
|
||||||
print(f"\n3. Testing synthetic generator extremes...")
|
|
||||||
large_forecast = generator.generate_forecast_sync(100)
|
|
||||||
print(f" Generated {len(large_forecast)} forecast days")
|
|
||||||
|
|
||||||
print(f"\n4. Testing concurrent requests...")
|
|
||||||
tasks = [client.get_current_weather(*madrid_coords) for _ in range(5)]
|
|
||||||
concurrent_results = await asyncio.gather(*tasks, return_exceptions=True)
|
|
||||||
successful = len([r for r in concurrent_results if isinstance(r, dict)])
|
|
||||||
print(f" Concurrent requests: {successful}/5 successful")
|
|
||||||
|
|
||||||
print(f"\n✅ Edge case tests completed!")
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
asyncio.run(run_edge_case_tests())
|
|
||||||
34
services/external/Dockerfile
vendored
Normal file
@@ -0,0 +1,34 @@
# services/external/Dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    g++ \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements and install Python dependencies
COPY services/external/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy shared modules first
COPY shared/ /app/shared/

# Copy application code
COPY services/external/app/ /app/app/

# Set Python path to include shared modules
ENV PYTHONPATH=/app

# Expose port
EXPOSE 8000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
    CMD python -c "import requests; requests.get('http://localhost:8000/health', timeout=5)" || exit 1

# Run the application
CMD ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
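Note: the Dockerfile above copies `shared/` and `services/external/requirements.txt`, so it is assumed to be built from the repository root, e.g. `docker build -f services/external/Dockerfile -t bakery-external .` (the image tag is illustrative).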
1
services/external/app/__init__.py
vendored
Normal file
@@ -0,0 +1 @@
# services/external/app/__init__.py
1
services/external/app/api/__init__.py
vendored
Normal file
@@ -0,0 +1 @@
# services/external/app/api/__init__.py
@@ -1,6 +1,4 @@
|
|||||||
# ================================================================
|
# services/external/app/api/traffic.py
|
||||||
# services/data/app/api/traffic.py - FIXED VERSION
|
|
||||||
# ================================================================
|
|
||||||
"""Traffic data API endpoints with improved error handling"""
|
"""Traffic data API endpoints with improved error handling"""
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, HTTPException, Query, Path
|
from fastapi import APIRouter, Depends, HTTPException, Query, Path
|
||||||
@@ -12,10 +10,11 @@ from sqlalchemy.ext.asyncio import AsyncSession
|
|||||||
|
|
||||||
from app.core.database import get_db
|
from app.core.database import get_db
|
||||||
from app.services.traffic_service import TrafficService
|
from app.services.traffic_service import TrafficService
|
||||||
from app.services.messaging import data_publisher, publish_traffic_updated
|
from app.services.messaging import publish_traffic_updated
|
||||||
from app.schemas.external import (
|
from app.schemas.traffic import (
|
||||||
TrafficDataResponse,
|
TrafficDataResponse,
|
||||||
HistoricalTrafficRequest
|
HistoricalTrafficRequest,
|
||||||
|
TrafficForecastRequest
|
||||||
)
|
)
|
||||||
|
|
||||||
from shared.auth.decorators import (
|
from shared.auth.decorators import (
|
||||||
@@ -86,7 +85,7 @@ async def get_historical_traffic(
|
|||||||
raise HTTPException(status_code=400, detail="Date range cannot exceed 90 days")
|
raise HTTPException(status_code=400, detail="Date range cannot exceed 90 days")
|
||||||
|
|
||||||
historical_data = await traffic_service.get_historical_traffic(
|
historical_data = await traffic_service.get_historical_traffic(
|
||||||
request.latitude, request.longitude, request.start_date, request.end_date, db
|
request.latitude, request.longitude, request.start_date, request.end_date, str(tenant_id)
|
||||||
)
|
)
|
||||||
|
|
||||||
# Publish event (with error handling)
|
# Publish event (with error handling)
|
||||||
@@ -112,58 +111,74 @@ async def get_historical_traffic(
|
|||||||
logger.error("Unexpected error in historical traffic API", error=str(e))
|
logger.error("Unexpected error in historical traffic API", error=str(e))
|
||||||
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
||||||
|
|
||||||
@router.post("/tenants/{tenant_id}/traffic/stored")
|
@router.post("/tenants/{tenant_id}/traffic/forecast")
|
||||||
async def get_stored_traffic_for_training(
|
async def get_traffic_forecast(
|
||||||
request: HistoricalTrafficRequest,
|
request: TrafficForecastRequest,
|
||||||
db: AsyncSession = Depends(get_db),
|
|
||||||
tenant_id: UUID = Path(..., description="Tenant ID"),
|
tenant_id: UUID = Path(..., description="Tenant ID"),
|
||||||
current_user: Dict[str, Any] = Depends(get_current_user_dep),
|
current_user: Dict[str, Any] = Depends(get_current_user_dep),
|
||||||
):
|
):
|
||||||
"""Get stored traffic data specifically for training/re-training purposes"""
|
"""Get traffic forecast for location"""
|
||||||
try:
|
try:
|
||||||
# Validate date range
|
logger.debug("API: Getting traffic forecast",
|
||||||
if request.end_date <= request.start_date:
|
lat=request.latitude, lon=request.longitude, hours=request.hours)
|
||||||
raise HTTPException(status_code=400, detail="End date must be after start date")
|
|
||||||
|
|
||||||
# Allow longer date ranges for training (up to 3 years)
|
# For now, return mock forecast data since we don't have a real traffic forecast service
|
||||||
if (request.end_date - request.start_date).days > 1095:
|
# In a real implementation, this would call a traffic forecasting service
|
||||||
raise HTTPException(status_code=400, detail="Date range cannot exceed 3 years for training data")
|
|
||||||
|
|
||||||
logger.info("Retrieving stored traffic data for training",
|
# Generate mock forecast data for the requested hours
|
||||||
tenant_id=str(tenant_id),
|
forecast_data = []
|
||||||
location=f"{request.latitude},{request.longitude}",
|
from datetime import datetime, timedelta
|
||||||
date_range=f"{request.start_date} to {request.end_date}")
|
|
||||||
|
|
||||||
# Use the dedicated method for training data retrieval
|
base_time = datetime.utcnow()
|
||||||
stored_data = await traffic_service.get_stored_traffic_for_training(
|
for hour in range(request.hours):
|
||||||
request.latitude, request.longitude, request.start_date, request.end_date, db
|
forecast_time = base_time + timedelta(hours=hour)
|
||||||
)
|
|
||||||
|
|
||||||
# Log retrieval for audit purposes
|
# Mock traffic pattern (higher during rush hours)
|
||||||
logger.info("Stored traffic data retrieved for training",
|
hour_of_day = forecast_time.hour
|
||||||
records_count=len(stored_data),
|
if 7 <= hour_of_day <= 9 or 17 <= hour_of_day <= 19: # Rush hours
|
||||||
tenant_id=str(tenant_id),
|
traffic_volume = 120
|
||||||
purpose="model_training")
|
pedestrian_count = 80
|
||||||
|
congestion_level = "high"
|
||||||
|
average_speed = 15
|
||||||
|
elif 22 <= hour_of_day or hour_of_day <= 6: # Night hours
|
||||||
|
traffic_volume = 20
|
||||||
|
pedestrian_count = 10
|
||||||
|
congestion_level = "low"
|
||||||
|
average_speed = 50
|
||||||
|
else: # Regular hours
|
||||||
|
traffic_volume = 60
|
||||||
|
pedestrian_count = 40
|
||||||
|
congestion_level = "medium"
|
||||||
|
average_speed = 35
|
||||||
|
|
||||||
# Publish event for monitoring
|
# Use consistent TrafficDataResponse format
|
||||||
|
forecast_data.append({
|
||||||
|
"date": forecast_time.isoformat(),
|
||||||
|
"traffic_volume": traffic_volume,
|
||||||
|
"pedestrian_count": pedestrian_count,
|
||||||
|
"congestion_level": congestion_level,
|
||||||
|
"average_speed": average_speed,
|
||||||
|
"source": "madrid_opendata_forecast"
|
||||||
|
})
|
||||||
|
|
||||||
|
# Publish event (with error handling)
|
||||||
try:
|
try:
|
||||||
await publish_traffic_updated({
|
await publish_traffic_updated({
|
||||||
"type": "stored_data_retrieved_for_training",
|
"type": "forecast_requested",
|
||||||
"latitude": request.latitude,
|
"latitude": request.latitude,
|
||||||
"longitude": request.longitude,
|
"longitude": request.longitude,
|
||||||
"start_date": request.start_date.isoformat(),
|
"hours": request.hours,
|
||||||
"end_date": request.end_date.isoformat(),
|
|
||||||
"records_count": len(stored_data),
|
|
||||||
"tenant_id": str(tenant_id),
|
|
||||||
"timestamp": datetime.utcnow().isoformat()
|
"timestamp": datetime.utcnow().isoformat()
|
||||||
})
|
})
|
||||||
except Exception as pub_error:
|
except Exception as pub_error:
|
||||||
logger.warning("Failed to publish stored traffic retrieval event", error=str(pub_error))
|
logger.warning("Failed to publish traffic forecast event", error=str(pub_error))
|
||||||
|
# Continue processing
|
||||||
|
|
||||||
return stored_data
|
logger.debug("Successfully returning traffic forecast", records=len(forecast_data))
|
||||||
|
return forecast_data
|
||||||
|
|
||||||
except HTTPException:
|
except HTTPException:
|
||||||
raise
|
raise
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.error("Unexpected error in stored traffic retrieval API", error=str(e))
|
logger.error("Unexpected error in traffic forecast API", error=str(e))
|
||||||
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
||||||
@@ -1,5 +1,7 @@
|
|||||||
# services/data/app/api/weather.py - UPDATED WITH UNIFIED AUTH
|
# services/external/app/api/weather.py
|
||||||
"""Weather data API endpoints with unified authentication"""
|
"""
|
||||||
|
Weather API Endpoints
|
||||||
|
"""
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks, Path
|
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks, Path
|
||||||
from typing import List, Optional, Dict, Any
|
from typing import List, Optional, Dict, Any
|
||||||
@@ -7,18 +9,15 @@ from datetime import datetime, date
|
|||||||
import structlog
|
import structlog
|
||||||
from uuid import UUID
|
from uuid import UUID
|
||||||
|
|
||||||
from app.schemas.external import (
|
from app.schemas.weather import (
|
||||||
WeatherDataResponse,
|
WeatherDataResponse,
|
||||||
WeatherForecastResponse,
|
WeatherForecastResponse,
|
||||||
WeatherForecastRequest
|
WeatherForecastRequest,
|
||||||
|
HistoricalWeatherRequest
|
||||||
)
|
)
|
||||||
from app.services.weather_service import WeatherService
|
from app.services.weather_service import WeatherService
|
||||||
from app.services.messaging import publish_weather_updated
|
from app.services.messaging import publish_weather_updated
|
||||||
|
|
||||||
from app.schemas.external import (
|
|
||||||
HistoricalWeatherRequest
|
|
||||||
)
|
|
||||||
|
|
||||||
# Import unified authentication from shared library
|
# Import unified authentication from shared library
|
||||||
from shared.auth.decorators import (
|
from shared.auth.decorators import (
|
||||||
get_current_user_dep,
|
get_current_user_dep,
|
||||||
@@ -73,6 +72,49 @@ async def get_current_weather(
|
|||||||
logger.error("Failed to get current weather", error=str(e))
|
logger.error("Failed to get current weather", error=str(e))
|
||||||
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
||||||
|
|
||||||
|
@router.post("/tenants/{tenant_id}/weather/historical")
|
||||||
|
async def get_historical_weather(
|
||||||
|
request: HistoricalWeatherRequest,
|
||||||
|
db: AsyncSession = Depends(get_db),
|
||||||
|
tenant_id: UUID = Path(..., description="Tenant ID"),
|
||||||
|
current_user: Dict[str, Any] = Depends(get_current_user_dep),
|
||||||
|
):
|
||||||
|
"""Get historical weather data with date range in payload"""
|
||||||
|
try:
|
||||||
|
# Validate date range
|
||||||
|
if request.end_date <= request.start_date:
|
||||||
|
raise HTTPException(status_code=400, detail="End date must be after start date")
|
||||||
|
|
||||||
|
if (request.end_date - request.start_date).days > 1000:
|
||||||
|
raise HTTPException(status_code=400, detail="Date range cannot exceed 90 days")
|
||||||
|
|
||||||
|
historical_data = await weather_service.get_historical_weather(
|
||||||
|
request.latitude, request.longitude, request.start_date, request.end_date)
|
||||||
|
|
||||||
|
# Publish event (with error handling)
|
||||||
|
try:
|
||||||
|
await publish_weather_updated({
|
||||||
|
"type": "historical_requested",
|
||||||
|
"latitude": request.latitude,
|
||||||
|
"longitude": request.longitude,
|
||||||
|
"start_date": request.start_date.isoformat(),
|
||||||
|
"end_date": request.end_date.isoformat(),
|
||||||
|
"records_count": len(historical_data),
|
||||||
|
"timestamp": datetime.utcnow().isoformat()
|
||||||
|
})
|
||||||
|
except Exception as pub_error:
|
||||||
|
logger.warning("Failed to publish historical weather event", error=str(pub_error))
|
||||||
|
# Continue processing
|
||||||
|
|
||||||
|
return historical_data
|
||||||
|
|
||||||
|
except HTTPException:
|
||||||
|
raise
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Unexpected error in historical weather API", error=str(e))
|
||||||
|
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
|
||||||
|
|
||||||
|
|
||||||
@router.post("/tenants/{tenant_id}/weather/forecast", response_model=List[WeatherForecastResponse])
|
@router.post("/tenants/{tenant_id}/weather/forecast", response_model=List[WeatherForecastResponse])
|
||||||
async def get_weather_forecast(
|
async def get_weather_forecast(
|
||||||
request: WeatherForecastRequest,
|
request: WeatherForecastRequest,
|
||||||
@@ -113,86 +155,3 @@ async def get_weather_forecast(
     except Exception as e:
         logger.error("Failed to get weather forecast", error=str(e))
         raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
-
-@router.post("/tenants/{tenant_id}/weather/historical")
-async def get_historical_weather(
-    request: HistoricalWeatherRequest,
-    db: AsyncSession = Depends(get_db),
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-):
-    """Get historical weather data with date range in payload"""
-    try:
-        # Validate date range
-        if request.end_date <= request.start_date:
-            raise HTTPException(status_code=400, detail="End date must be after start date")
-
-        if (request.end_date - request.start_date).days > 1000:
-            raise HTTPException(status_code=400, detail="Date range cannot exceed 90 days")
-
-        historical_data = await weather_service.get_historical_weather(
-            request.latitude, request.longitude, request.start_date, request.end_date, db
-        )
-
-        # Publish event (with error handling)
-        try:
-            await publish_weather_updated({
-                "type": "historical_requested",
-                "latitude": request.latitude,
-                "longitude": request.longitude,
-                "start_date": request.start_date.isoformat(),
-                "end_date": request.end_date.isoformat(),
-                "records_count": len(historical_data),
-                "timestamp": datetime.utcnow().isoformat()
-            })
-        except Exception as pub_error:
-            logger.warning("Failed to publish historical weather event", error=str(pub_error))
-            # Continue processing
-
-        return historical_data
-
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error("Unexpected error in historical weather API", error=str(e))
-        raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
-
-@router.post("/tenants/{tenant_id}weather/sync")
-async def sync_weather_data(
-    background_tasks: BackgroundTasks,
-    force: bool = Query(False, description="Force sync even if recently synced"),
-    tenant_id: UUID = Path(..., description="Tenant ID"),
-    current_user: Dict[str, Any] = Depends(get_current_user_dep),
-):
-    """Manually trigger weather data synchronization"""
-    try:
-        logger.info("Weather sync requested",
-                    tenant_id=tenant_id,
-                    user_id=current_user["user_id"],
-                    force=force)
-
-        # Check if user has permission to sync (could be admin only)
-        if current_user.get("role") not in ["admin", "manager"]:
-            raise HTTPException(
-                status_code=403,
-                detail="Insufficient permissions to sync weather data"
-            )
-
-        # Schedule background sync
-        background_tasks.add_task(
-            weather_service.sync_weather_data,
-            tenant_id=tenant_id,
-            force=force
-        )
-
-        return {
-            "message": "Weather sync initiated",
-            "status": "processing",
-            "initiated_by": current_user["user_id"]
-        }
-
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error("Failed to initiate weather sync", error=str(e))
-        raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
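For reference, a minimal client-side sketch of exercising the historical endpoint added in this file. The route and the HistoricalWeatherRequest fields come from the diff above; the gateway URL, tenant ID and bearer token are placeholders and not part of this commit:

```python
# Hedged sketch: calling POST /tenants/{tenant_id}/weather/historical.
# Base URL, tenant ID and token are illustrative placeholders.
import asyncio
import httpx

async def fetch_historical_weather():
    payload = {
        "latitude": 40.4168,                 # Madrid city centre
        "longitude": -3.7038,
        "start_date": "2024-01-01T00:00:00Z",
        "end_date": "2024-01-31T00:00:00Z",
    }
    tenant_id = "00000000-0000-0000-0000-000000000000"  # placeholder tenant
    url = f"http://localhost:8000/api/v1/tenants/{tenant_id}/weather/historical"
    async with httpx.AsyncClient() as client:
        resp = await client.post(url, json=payload,
                                 headers={"Authorization": "Bearer <token>"})
        resp.raise_for_status()
        return resp.json()

if __name__ == "__main__":
    print(asyncio.run(fetch_historical_weather()))
```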
services/external/app/core/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
+# services/external/app/core/__init__.py
services/external/app/core/config.py
@@ -1,30 +1,26 @@
-# ================================================================
-# DATA SERVICE CONFIGURATION
-# services/data/app/core/config.py
-# ================================================================
-
-"""
-Data service configuration
-External data integration and management
-"""
+# services/external/app/core/config.py
 
 from shared.config.base import BaseServiceSettings
 import os
+from pydantic import Field
 
 class DataSettings(BaseServiceSettings):
     """Data service specific settings"""
 
     # Service Identity
-    APP_NAME: str = "Data Service"
-    SERVICE_NAME: str = "data-service"
-    DESCRIPTION: str = "External data integration and management service"
+    SERVICE_NAME: str = "external-service"
+    VERSION: str = "1.0.0"
+    APP_NAME: str = "Bakery External Data Service"
+    DESCRIPTION: str = "External data collection service for weather and traffic data"
 
-    # Database Configuration
-    DATABASE_URL: str = os.getenv("DATA_DATABASE_URL",
-        "postgresql+asyncpg://data_user:data_pass123@data-db:5432/data_db")
+    # API Configuration
+    API_V1_STR: str = "/api/v1"
 
-    # Redis Database (dedicated for external data cache)
-    REDIS_DB: int = 3
+    # Override database URL to use EXTERNAL_DATABASE_URL
+    DATABASE_URL: str = Field(
+        default="postgresql+asyncpg://external_user:external_pass123@external-db:5432/external_db",
+        env="EXTERNAL_DATABASE_URL"
+    )
 
     # External API Configuration
     AEMET_API_KEY: str = os.getenv("AEMET_API_KEY", "")
services/external/app/core/database.py (new file, 81 lines)
@@ -0,0 +1,81 @@
|
|||||||
|
# services/external/app/core/database.py
|
||||||
|
"""
|
||||||
|
External Service Database Configuration using shared database manager
|
||||||
|
"""
|
||||||
|
|
||||||
|
import structlog
|
||||||
|
from contextlib import asynccontextmanager
|
||||||
|
from typing import AsyncGenerator
|
||||||
|
|
||||||
|
from app.core.config import settings
|
||||||
|
from shared.database.base import DatabaseManager, Base
|
||||||
|
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
|
||||||
|
# Create database manager instance
|
||||||
|
database_manager = DatabaseManager(
|
||||||
|
database_url=settings.DATABASE_URL,
|
||||||
|
service_name="external-service"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def get_db():
|
||||||
|
"""
|
||||||
|
Database dependency for FastAPI - using shared database manager
|
||||||
|
"""
|
||||||
|
async for session in database_manager.get_db():
|
||||||
|
yield session
|
||||||
|
|
||||||
|
|
||||||
|
async def init_db():
|
||||||
|
"""Initialize database tables using shared database manager"""
|
||||||
|
try:
|
||||||
|
logger.info("Initializing External Service database...")
|
||||||
|
|
||||||
|
# Import all models to ensure they're registered
|
||||||
|
from app.models import weather, traffic # noqa: F401
|
||||||
|
|
||||||
|
# Create all tables using database manager
|
||||||
|
await database_manager.create_tables(Base.metadata)
|
||||||
|
|
||||||
|
logger.info("External Service database initialized successfully")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Failed to initialize database", error=str(e))
|
||||||
|
raise
|
||||||
|
|
||||||
|
|
||||||
|
async def close_db():
|
||||||
|
"""Close database connections using shared database manager"""
|
||||||
|
try:
|
||||||
|
await database_manager.close_connections()
|
||||||
|
logger.info("Database connections closed")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Error closing database connections", error=str(e))
|
||||||
|
|
||||||
|
|
||||||
|
@asynccontextmanager
|
||||||
|
async def get_db_transaction():
|
||||||
|
"""
|
||||||
|
Context manager for database transactions using shared database manager
|
||||||
|
"""
|
||||||
|
async with database_manager.get_session() as session:
|
||||||
|
try:
|
||||||
|
async with session.begin():
|
||||||
|
yield session
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Transaction error", error=str(e))
|
||||||
|
raise
|
||||||
|
|
||||||
|
|
||||||
|
@asynccontextmanager
|
||||||
|
async def get_background_session():
|
||||||
|
"""
|
||||||
|
Context manager for background tasks using shared database manager
|
||||||
|
"""
|
||||||
|
async with database_manager.get_background_session() as session:
|
||||||
|
yield session
|
||||||
|
|
||||||
|
|
||||||
|
async def health_check():
|
||||||
|
"""Database health check using shared database manager"""
|
||||||
|
return await database_manager.health_check()
|
||||||
@@ -16,13 +16,6 @@ from ..clients.madrid_client import MadridTrafficAPIClient
 from ..processors.madrid_processor import MadridTrafficDataProcessor
 from ..processors.madrid_business_logic import MadridTrafficAnalyzer
 from ..models.madrid_models import TrafficRecord, CongestionLevel
-from app.core.performance import (
-    rate_limit,
-    async_cache,
-    monitor_performance,
-    global_performance_monitor
-)
-
 
 class MadridTrafficClient(BaseTrafficClient, BaseAPIClient):
     """
@@ -57,9 +50,6 @@ class MadridTrafficClient(BaseTrafficClient, BaseAPIClient):
         return (self.MADRID_BOUNDS['lat_min'] <= latitude <= self.MADRID_BOUNDS['lat_max'] and
                 self.MADRID_BOUNDS['lon_min'] <= longitude <= self.MADRID_BOUNDS['lon_max'])
 
-    @rate_limit(calls=30, period=60)
-    @async_cache(ttl=300)
-    @monitor_performance(monitor=global_performance_monitor)
     async def get_current_traffic(self, latitude: float, longitude: float) -> Optional[Dict[str, Any]]:
         """Get current traffic data with enhanced pedestrian inference"""
         try:
@@ -98,8 +88,6 @@ class MadridTrafficClient(BaseTrafficClient, BaseAPIClient):
             self.logger.error("Error getting current traffic", error=str(e))
             return None
 
-    @rate_limit(calls=10, period=60)
-    @monitor_performance(monitor=global_performance_monitor)
     async def get_historical_traffic(self, latitude: float, longitude: float,
                                      start_date: datetime, end_date: datetime) -> List[Dict[str, Any]]:
         """Get historical traffic data with pedestrian enhancement"""
@@ -271,11 +259,24 @@ class MadridTrafficClient(BaseTrafficClient, BaseAPIClient):
                 zip_content, zip_url, latitude, longitude, nearest_points
             )
 
-            # Filter by date range
-            filtered_records = [
-                record for record in month_records
-                if start_date <= record.get('date', datetime.min.replace(tzinfo=timezone.utc)) <= end_date
-            ]
+            # Filter by date range - ensure timezone consistency
+            # Make sure start_date and end_date have timezone info for comparison
+            start_tz = start_date if start_date.tzinfo else start_date.replace(tzinfo=timezone.utc)
+            end_tz = end_date if end_date.tzinfo else end_date.replace(tzinfo=timezone.utc)
+
+            filtered_records = []
+            for record in month_records:
+                record_date = record.get('date')
+                if not record_date:
+                    continue
+
+                # Ensure record date has timezone info
+                if not record_date.tzinfo:
+                    record_date = record_date.replace(tzinfo=timezone.utc)
+
+                # Now compare with consistent timezone info
+                if start_tz <= record_date <= end_tz:
+                    filtered_records.append(record)
+
             historical_records.extend(filtered_records)
 
@@ -54,19 +54,6 @@ class BaseAPIClient:
             logger.error("Unexpected error", error=str(e), url=url)
             return None
 
-    async def get(self, url: str, headers: Optional[Dict] = None, timeout: Optional[int] = None) -> httpx.Response:
-        """
-        Public GET method for direct HTTP requests
-        Returns the raw httpx Response object for maximum flexibility
-        """
-        request_headers = headers or {}
-        request_timeout = httpx.Timeout(timeout if timeout else 30.0)
-
-        async with httpx.AsyncClient(timeout=request_timeout, follow_redirects=True) as client:
-            response = await client.get(url, headers=request_headers)
-            response.raise_for_status()
-            return response
-
     async def _fetch_url_directly(self, url: str, headers: Optional[Dict] = None) -> Optional[Dict[str, Any]]:
         """Fetch data directly from a full URL (for AEMET datos URLs)"""
         try:
@@ -138,7 +125,7 @@ class BaseAPIClient:
             logger.error("Unexpected error", error=str(e), url=url)
             return None
 
-    async def get(self, url: str, headers: Optional[Dict] = None, timeout: Optional[int] = None) -> httpx.Response:
+    async def get_direct(self, url: str, headers: Optional[Dict] = None, timeout: Optional[int] = None) -> httpx.Response:
         """
         Public GET method for direct HTTP requests
         Returns the raw httpx Response object for maximum flexibility
@@ -17,7 +17,7 @@ from ..base_client import BaseAPIClient
 class MadridTrafficAPIClient(BaseAPIClient):
     """Pure HTTP client for Madrid traffic APIs"""
 
-    TRAFFIC_ENDPOINT = "https://datos.madrid.es/egob/catalogo/202468-10-intensidad-trafico.xml"
+    TRAFFIC_ENDPOINT = "https://informo.madrid.es/informo/tmadrid/pm.xml"
     MEASUREMENT_POINTS_URL = "https://datos.madrid.es/egob/catalogo/202468-263-intensidad-trafico.csv"
 
     def __init__(self):
@@ -46,12 +46,16 @@ class MadridTrafficAPIClient(BaseAPIClient):
         base_url = "https://datos.madrid.es/egob/catalogo/208627"
 
         # URL numbering pattern (this may need adjustment based on actual URLs)
+        # Note: Historical data is only available for past periods, not current/future
         if year == 2023:
             url_number = 116 + (month - 1)  # 116-127 for 2023
         elif year == 2024:
             url_number = 128 + (month - 1)  # 128-139 for 2024
+        elif year == 2025:
+            # For 2025, use the continuing numbering from 2024
+            url_number = 140 + (month - 1)  # Starting from 140 for January 2025
         else:
-            url_number = 116  # Fallback
+            url_number = 116  # Fallback to 2023 data
 
         return f"{base_url}-{url_number}-transporte-ptomedida-historico.zip"
 
@@ -69,7 +73,7 @@ class MadridTrafficAPIClient(BaseAPIClient):
             'Referer': 'https://datos.madrid.es/'
         }
 
-        response = await self.get(endpoint, headers=headers, timeout=30)
+        response = await self.get_direct(endpoint, headers=headers, timeout=30)
 
         if not response or response.status_code != 200:
             self.logger.warning("Failed to fetch XML data",
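A small sketch of the catalogue-numbering rule this hunk encodes, pulled out as a standalone helper for clarity. The offsets 116/128/140 and the ZIP name pattern come from the diff; the helper name is illustrative and not part of the commit:

```python
# Hedged sketch of the Madrid open-data ZIP numbering used above.
BASE_URL = "https://datos.madrid.es/egob/catalogo/208627"

def historical_zip_url(year: int, month: int) -> str:
    # 2023 starts at 116, 2024 at 128, 2025 assumed to continue at 140.
    start_numbers = {2023: 116, 2024: 128, 2025: 140}
    url_number = start_numbers[year] + (month - 1) if year in start_numbers else 116  # fallback to 2023 data
    return f"{BASE_URL}-{url_number}-transporte-ptomedida-historico.zip"

# Example: historical_zip_url(2024, 3) ends in "-130-transporte-ptomedida-historico.zip"
```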
services/external/app/main.py (new file, 186 lines)
@@ -0,0 +1,186 @@
|
|||||||
|
# services/external/app/main.py
|
||||||
|
"""
|
||||||
|
External Service Main Application
|
||||||
|
"""
|
||||||
|
|
||||||
|
import structlog
|
||||||
|
from contextlib import asynccontextmanager
|
||||||
|
from fastapi import FastAPI, Request
|
||||||
|
from fastapi.middleware.cors import CORSMiddleware
|
||||||
|
from fastapi.responses import JSONResponse
|
||||||
|
|
||||||
|
from app.core.config import settings
|
||||||
|
from app.core.database import init_db, close_db
|
||||||
|
from shared.monitoring import setup_logging, HealthChecker
|
||||||
|
from shared.monitoring.metrics import setup_metrics_early
|
||||||
|
|
||||||
|
# Setup logging first
|
||||||
|
setup_logging("external-service", settings.LOG_LEVEL)
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
|
||||||
|
# Global variables for lifespan access
|
||||||
|
metrics_collector = None
|
||||||
|
health_checker = None
|
||||||
|
|
||||||
|
# Create FastAPI app FIRST
|
||||||
|
app = FastAPI(
|
||||||
|
title="Bakery External Data Service",
|
||||||
|
description="External data collection service for weather, traffic, and events data",
|
||||||
|
version="1.0.0"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Setup metrics BEFORE any middleware and BEFORE lifespan
|
||||||
|
metrics_collector = setup_metrics_early(app, "external-service")
|
||||||
|
|
||||||
|
@asynccontextmanager
|
||||||
|
async def lifespan(app: FastAPI):
|
||||||
|
"""Application lifespan events"""
|
||||||
|
global health_checker
|
||||||
|
|
||||||
|
# Startup
|
||||||
|
logger.info("Starting External Service...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Initialize database
|
||||||
|
await init_db()
|
||||||
|
logger.info("Database initialized")
|
||||||
|
|
||||||
|
# Register custom metrics
|
||||||
|
metrics_collector.register_counter("weather_api_calls_total", "Total weather API calls")
|
||||||
|
metrics_collector.register_counter("weather_api_success_total", "Successful weather API calls")
|
||||||
|
metrics_collector.register_counter("weather_api_failures_total", "Failed weather API calls")
|
||||||
|
|
||||||
|
metrics_collector.register_counter("traffic_api_calls_total", "Total traffic API calls")
|
||||||
|
metrics_collector.register_counter("traffic_api_success_total", "Successful traffic API calls")
|
||||||
|
metrics_collector.register_counter("traffic_api_failures_total", "Failed traffic API calls")
|
||||||
|
|
||||||
|
metrics_collector.register_counter("data_collection_jobs_total", "Data collection jobs")
|
||||||
|
metrics_collector.register_counter("data_records_stored_total", "Data records stored")
|
||||||
|
metrics_collector.register_counter("data_quality_issues_total", "Data quality issues detected")
|
||||||
|
|
||||||
|
metrics_collector.register_histogram("weather_api_duration_seconds", "Weather API call duration")
|
||||||
|
metrics_collector.register_histogram("traffic_api_duration_seconds", "Traffic API call duration")
|
||||||
|
metrics_collector.register_histogram("data_collection_duration_seconds", "Data collection job duration")
|
||||||
|
metrics_collector.register_histogram("data_processing_duration_seconds", "Data processing duration")
|
||||||
|
|
||||||
|
# Setup health checker
|
||||||
|
health_checker = HealthChecker("external-service")
|
||||||
|
|
||||||
|
# Add database health check
|
||||||
|
async def check_database():
|
||||||
|
try:
|
||||||
|
from app.core.database import get_db
|
||||||
|
from sqlalchemy import text
|
||||||
|
async for db in get_db():
|
||||||
|
await db.execute(text("SELECT 1"))
|
||||||
|
return True
|
||||||
|
except Exception as e:
|
||||||
|
return f"Database error: {e}"
|
||||||
|
|
||||||
|
# Add external API health checks
|
||||||
|
async def check_weather_api():
|
||||||
|
try:
|
||||||
|
# Simple connectivity check
|
||||||
|
if settings.AEMET_API_KEY:
|
||||||
|
return True
|
||||||
|
else:
|
||||||
|
return "AEMET API key not configured"
|
||||||
|
except Exception as e:
|
||||||
|
return f"Weather API error: {e}"
|
||||||
|
|
||||||
|
async def check_traffic_api():
|
||||||
|
try:
|
||||||
|
# Simple connectivity check
|
||||||
|
if settings.MADRID_OPENDATA_API_KEY:
|
||||||
|
return True
|
||||||
|
else:
|
||||||
|
return "Madrid Open Data API key not configured"
|
||||||
|
except Exception as e:
|
||||||
|
return f"Traffic API error: {e}"
|
||||||
|
|
||||||
|
health_checker.add_check("database", check_database, timeout=5.0, critical=True)
|
||||||
|
health_checker.add_check("weather_api", check_weather_api, timeout=10.0, critical=False)
|
||||||
|
health_checker.add_check("traffic_api", check_traffic_api, timeout=10.0, critical=False)
|
||||||
|
|
||||||
|
# Store health checker in app state
|
||||||
|
app.state.health_checker = health_checker
|
||||||
|
|
||||||
|
logger.info("External Service started successfully")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to start External Service: {e}")
|
||||||
|
raise
|
||||||
|
|
||||||
|
yield
|
||||||
|
|
||||||
|
# Shutdown
|
||||||
|
logger.info("Shutting down External Service...")
|
||||||
|
await close_db()
|
||||||
|
|
||||||
|
# Set lifespan AFTER metrics setup
|
||||||
|
app.router.lifespan_context = lifespan
|
||||||
|
|
||||||
|
# CORS middleware (added after metrics setup)
|
||||||
|
app.add_middleware(
|
||||||
|
CORSMiddleware,
|
||||||
|
allow_origins=settings.CORS_ORIGINS,
|
||||||
|
allow_credentials=True,
|
||||||
|
allow_methods=["*"],
|
||||||
|
allow_headers=["*"],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Include routers
|
||||||
|
from app.api.weather import router as weather_router
|
||||||
|
from app.api.traffic import router as traffic_router
|
||||||
|
app.include_router(weather_router, prefix="/api/v1", tags=["weather"])
|
||||||
|
app.include_router(traffic_router, prefix="/api/v1", tags=["traffic"])
|
||||||
|
|
||||||
|
# Health check endpoint
|
||||||
|
@app.get("/health")
|
||||||
|
async def health_check():
|
||||||
|
"""Comprehensive health check endpoint"""
|
||||||
|
if health_checker:
|
||||||
|
return await health_checker.check_health()
|
||||||
|
else:
|
||||||
|
return {
|
||||||
|
"service": "external-service",
|
||||||
|
"status": "healthy",
|
||||||
|
"version": "1.0.0"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Root endpoint
|
||||||
|
@app.get("/")
|
||||||
|
async def root():
|
||||||
|
"""Root endpoint"""
|
||||||
|
return {
|
||||||
|
"service": "External Data Service",
|
||||||
|
"version": "1.0.0",
|
||||||
|
"status": "running",
|
||||||
|
"endpoints": {
|
||||||
|
"health": "/health",
|
||||||
|
"docs": "/docs",
|
||||||
|
"weather": "/api/v1/weather",
|
||||||
|
"traffic": "/api/v1/traffic",
|
||||||
|
"jobs": "/api/v1/jobs"
|
||||||
|
},
|
||||||
|
"data_sources": {
|
||||||
|
"weather": "AEMET (Spanish Weather Service)",
|
||||||
|
"traffic": "Madrid Open Data Portal",
|
||||||
|
"coverage": "Madrid, Spain"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
# Exception handlers
|
||||||
|
@app.exception_handler(Exception)
|
||||||
|
async def global_exception_handler(request: Request, exc: Exception):
|
||||||
|
"""Global exception handler with metrics"""
|
||||||
|
logger.error(f"Unhandled exception: {exc}", exc_info=True)
|
||||||
|
|
||||||
|
# Record error metric if available
|
||||||
|
if metrics_collector:
|
||||||
|
metrics_collector.increment_counter("errors_total", labels={"type": "unhandled"})
|
||||||
|
|
||||||
|
return JSONResponse(
|
||||||
|
status_code=500,
|
||||||
|
content={"detail": "Internal server error"}
|
||||||
|
)
|
||||||
services/external/app/models/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
+# services/external/app/models/__init__.py
@@ -3,8 +3,8 @@
 # ================================================================
 """Weather data models"""
 
-from sqlalchemy import Column, String, DateTime, Float, Integer, Text, Index
-from sqlalchemy.dialects.postgresql import UUID
+from sqlalchemy import Column, String, DateTime, Float, Integer, Text, Index, Boolean
+from sqlalchemy.dialects.postgresql import UUID, JSON
 import uuid
 from datetime import datetime, timezone
 
@@ -15,15 +15,36 @@ class WeatherData(Base):
 
     id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
     location_id = Column(String(100), nullable=False, index=True)
+    city = Column(String(50), nullable=False)
+    station_name = Column(String(200), nullable=True)
+    latitude = Column(Float, nullable=True)
+    longitude = Column(Float, nullable=True)
     date = Column(DateTime(timezone=True), nullable=False, index=True)
+    forecast_date = Column(DateTime(timezone=True), nullable=True)
     temperature = Column(Float, nullable=True)  # Celsius
+    temperature_min = Column(Float, nullable=True)
+    temperature_max = Column(Float, nullable=True)
+    feels_like = Column(Float, nullable=True)
     precipitation = Column(Float, nullable=True)  # mm
+    precipitation_probability = Column(Float, nullable=True)
     humidity = Column(Float, nullable=True)  # percentage
     wind_speed = Column(Float, nullable=True)  # km/h
+    wind_direction = Column(Float, nullable=True)
+    wind_gust = Column(Float, nullable=True)
     pressure = Column(Float, nullable=True)  # hPa
+    visibility = Column(Float, nullable=True)
+    uv_index = Column(Float, nullable=True)
+    cloud_cover = Column(Float, nullable=True)
+    condition = Column(String(100), nullable=True)
     description = Column(String(200), nullable=True)
+    weather_code = Column(String(20), nullable=True)
     source = Column(String(50), nullable=False, default="aemet")
-    raw_data = Column(Text, nullable=True)
+    data_type = Column(String(20), nullable=False)
+    is_forecast = Column(Boolean, nullable=True)
+    data_quality_score = Column(Float, nullable=True)
+    raw_data = Column(JSON, nullable=True)
+    processed_data = Column(JSON, nullable=True)
+    tenant_id = Column(UUID(as_uuid=True), nullable=True, index=True)
     created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
     updated_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
 
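A brief usage sketch of the widened model. Only the column names come from this diff; the values, the `data_type` string and the session handling are illustrative assumptions:

```python
# Illustrative only: constructing a WeatherData row with the new columns.
from datetime import datetime, timezone
from app.models.weather import WeatherData  # import path as used by the weather repository in this commit

record = WeatherData(
    location_id="40.4168,-3.7038",
    city="madrid",                      # new non-nullable column
    date=datetime.now(timezone.utc),
    temperature=21.5,
    data_type="observation",            # new non-nullable column; value is an assumption
    is_forecast=False,
    raw_data={"station": "example"},    # raw_data is now a JSON column, so a dict can be stored directly
)
# session.add(record); await session.commit()  # inside an AsyncSession context
```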
services/external/app/repositories/traffic_repository.py (new file, 191 lines)
@@ -0,0 +1,191 @@
|
|||||||
|
# ================================================================
|
||||||
|
# services/data/app/repositories/traffic_repository.py
|
||||||
|
# ================================================================
|
||||||
|
"""
|
||||||
|
Traffic Repository - Enhanced for multiple cities with comprehensive data access patterns
|
||||||
|
Follows existing repository architecture while adding city-specific functionality
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Optional, List, Dict, Any, Type, Tuple
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
from sqlalchemy import select, and_, or_, func, desc, asc, text, update, delete
|
||||||
|
from sqlalchemy.orm import selectinload
|
||||||
|
from datetime import datetime, timezone, timedelta
|
||||||
|
import structlog
|
||||||
|
|
||||||
|
from app.models.traffic import TrafficData
|
||||||
|
from app.schemas.traffic import TrafficDataCreate, TrafficDataResponse
|
||||||
|
from shared.database.exceptions import DatabaseError, ValidationError
|
||||||
|
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
|
||||||
|
|
||||||
|
class TrafficRepository:
|
||||||
|
"""
|
||||||
|
Enhanced repository for traffic data operations across multiple cities
|
||||||
|
Provides city-aware queries and advanced traffic analytics
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, session: AsyncSession):
|
||||||
|
self.session = session
|
||||||
|
self.model = TrafficData
|
||||||
|
|
||||||
|
# ================================================================
|
||||||
|
# CORE TRAFFIC DATA OPERATIONS
|
||||||
|
# ================================================================
|
||||||
|
|
||||||
|
async def get_by_location_and_date_range(
|
||||||
|
self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
start_date: datetime,
|
||||||
|
end_date: datetime,
|
||||||
|
tenant_id: Optional[str] = None
|
||||||
|
) -> List[TrafficData]:
|
||||||
|
"""Get traffic data by location and date range"""
|
||||||
|
try:
|
||||||
|
location_id = f"{latitude:.4f},{longitude:.4f}"
|
||||||
|
|
||||||
|
# Build base query
|
||||||
|
query = select(self.model).where(self.model.location_id == location_id)
|
||||||
|
|
||||||
|
# Add tenant filter if specified
|
||||||
|
if tenant_id:
|
||||||
|
query = query.where(self.model.tenant_id == tenant_id)
|
||||||
|
|
||||||
|
# Add date range filters
|
||||||
|
if start_date:
|
||||||
|
query = query.where(self.model.date >= start_date)
|
||||||
|
|
||||||
|
if end_date:
|
||||||
|
query = query.where(self.model.date <= end_date)
|
||||||
|
|
||||||
|
# Order by date
|
||||||
|
query = query.order_by(self.model.date)
|
||||||
|
|
||||||
|
result = await self.session.execute(query)
|
||||||
|
return result.scalars().all()
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Failed to get traffic data by location and date range",
|
||||||
|
latitude=latitude, longitude=longitude,
|
||||||
|
error=str(e))
|
||||||
|
raise DatabaseError(f"Failed to get traffic data: {str(e)}")
|
||||||
|
|
||||||
|
async def store_traffic_data_batch(
|
||||||
|
self,
|
||||||
|
traffic_data_list: List[Dict[str, Any]],
|
||||||
|
location_id: str,
|
||||||
|
tenant_id: Optional[str] = None
|
||||||
|
) -> int:
|
||||||
|
"""Store a batch of traffic data records with enhanced validation and duplicate handling."""
|
||||||
|
stored_count = 0
|
||||||
|
try:
|
||||||
|
if not traffic_data_list:
|
||||||
|
return 0
|
||||||
|
|
||||||
|
# Check for existing records to avoid duplicates
|
||||||
|
dates = [data.get('date') for data in traffic_data_list if data.get('date')]
|
||||||
|
existing_dates = set()
|
||||||
|
if dates:
|
||||||
|
existing_stmt = select(TrafficData.date).where(
|
||||||
|
and_(
|
||||||
|
TrafficData.location_id == location_id,
|
||||||
|
TrafficData.date.in_(dates)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
result = await self.session.execute(existing_stmt)
|
||||||
|
existing_dates = {row[0] for row in result.fetchall()}
|
||||||
|
logger.debug(f"Found {len(existing_dates)} existing records for location {location_id}")
|
||||||
|
|
||||||
|
batch_records = []
|
||||||
|
for data in traffic_data_list:
|
||||||
|
record_date = data.get('date')
|
||||||
|
if not record_date or record_date in existing_dates:
|
||||||
|
continue # Skip duplicates
|
||||||
|
|
||||||
|
# Validate data before preparing for insertion
|
||||||
|
if self._validate_traffic_data(data):
|
||||||
|
batch_records.append({
|
||||||
|
'location_id': location_id,
|
||||||
|
'city': data.get('city', 'madrid'), # Default to madrid for historical data
|
||||||
|
'tenant_id': tenant_id, # Include tenant_id in batch insert
|
||||||
|
'date': record_date,
|
||||||
|
'traffic_volume': data.get('traffic_volume'),
|
||||||
|
'pedestrian_count': data.get('pedestrian_count'),
|
||||||
|
'congestion_level': data.get('congestion_level'),
|
||||||
|
'average_speed': data.get('average_speed'),
|
||||||
|
'source': data.get('source', 'unknown'),
|
||||||
|
'raw_data': str(data)
|
||||||
|
})
|
||||||
|
|
||||||
|
if batch_records:
|
||||||
|
# Use bulk insert for performance
|
||||||
|
await self.session.execute(
|
||||||
|
TrafficData.__table__.insert(),
|
||||||
|
batch_records
|
||||||
|
)
|
||||||
|
await self.session.commit()
|
||||||
|
stored_count = len(batch_records)
|
||||||
|
logger.info(f"Successfully stored {stored_count} traffic records for location {location_id}")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Failed to store traffic data batch",
|
||||||
|
error=str(e), location_id=location_id)
|
||||||
|
await self.session.rollback()
|
||||||
|
raise DatabaseError(f"Batch store failed: {str(e)}")
|
||||||
|
|
||||||
|
return stored_count
|
||||||
|
|
||||||
|
def _validate_traffic_data(self, data: Dict[str, Any]) -> bool:
|
||||||
|
"""Validate traffic data before storage"""
|
||||||
|
required_fields = ['date']
|
||||||
|
|
||||||
|
# Check required fields
|
||||||
|
for field in required_fields:
|
||||||
|
if not data.get(field):
|
||||||
|
return False
|
||||||
|
|
||||||
|
# Validate data types and ranges
|
||||||
|
traffic_volume = data.get('traffic_volume')
|
||||||
|
if traffic_volume is not None and (traffic_volume < 0 or traffic_volume > 10000):
|
||||||
|
return False
|
||||||
|
|
||||||
|
pedestrian_count = data.get('pedestrian_count')
|
||||||
|
if pedestrian_count is not None and (pedestrian_count < 0 or pedestrian_count > 10000):
|
||||||
|
return False
|
||||||
|
|
||||||
|
average_speed = data.get('average_speed')
|
||||||
|
if average_speed is not None and (average_speed < 0 or average_speed > 200):
|
||||||
|
return False
|
||||||
|
|
||||||
|
congestion_level = data.get('congestion_level')
|
||||||
|
if congestion_level and congestion_level not in ['low', 'medium', 'high', 'blocked']:
|
||||||
|
return False
|
||||||
|
|
||||||
|
return True
|
||||||
|
|
||||||
|
async def get_historical_traffic_for_training(self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
start_date: datetime,
|
||||||
|
end_date: datetime) -> List[TrafficData]:
|
||||||
|
"""Retrieve stored traffic data for training ML models."""
|
||||||
|
try:
|
||||||
|
location_id = f"{latitude:.4f},{longitude:.4f}"
|
||||||
|
|
||||||
|
stmt = select(TrafficData).where(
|
||||||
|
and_(
|
||||||
|
TrafficData.location_id == location_id,
|
||||||
|
TrafficData.date >= start_date,
|
||||||
|
TrafficData.date <= end_date
|
||||||
|
)
|
||||||
|
).order_by(TrafficData.date)
|
||||||
|
|
||||||
|
result = await self.session.execute(stmt)
|
||||||
|
return result.scalars().all()
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Failed to retrieve traffic data for training",
|
||||||
|
error=str(e), location_id=location_id)
|
||||||
|
raise DatabaseError(f"Training data retrieval failed: {str(e)}")
|
||||||
services/external/app/repositories/weather_repository.py (new file, 138 lines)
@@ -0,0 +1,138 @@
|
|||||||
|
# services/external/app/repositories/weather_repository.py
|
||||||
|
|
||||||
|
from typing import List, Dict, Any, Optional
|
||||||
|
from datetime import datetime
|
||||||
|
from sqlalchemy import select, and_
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
import structlog
|
||||||
|
import json
|
||||||
|
|
||||||
|
from app.models.weather import WeatherData
|
||||||
|
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
|
||||||
|
class WeatherRepository:
|
||||||
|
"""
|
||||||
|
Repository for weather data operations, adapted for WeatherService.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, session: AsyncSession):
|
||||||
|
self.session = session
|
||||||
|
|
||||||
|
async def get_historical_weather(self,
|
||||||
|
location_id: str,
|
||||||
|
start_date: datetime,
|
||||||
|
end_date: datetime) -> List[WeatherData]:
|
||||||
|
"""
|
||||||
|
Retrieves historical weather data for a specific location and date range.
|
||||||
|
This method directly supports the data retrieval logic in WeatherService.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
stmt = select(WeatherData).where(
|
||||||
|
and_(
|
||||||
|
WeatherData.location_id == location_id,
|
||||||
|
WeatherData.date >= start_date,
|
||||||
|
WeatherData.date <= end_date
|
||||||
|
)
|
||||||
|
).order_by(WeatherData.date)
|
||||||
|
|
||||||
|
result = await self.session.execute(stmt)
|
||||||
|
records = result.scalars().all()
|
||||||
|
logger.debug(f"Retrieved {len(records)} historical records for location {location_id}")
|
||||||
|
return list(records)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(
|
||||||
|
"Failed to get historical weather from repository",
|
||||||
|
error=str(e),
|
||||||
|
location_id=location_id
|
||||||
|
)
|
||||||
|
raise
|
||||||
|
|
||||||
|
def _serialize_json_fields(self, data: Dict[str, Any]) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Serialize JSON fields (raw_data, processed_data) to ensure proper JSON storage
|
||||||
|
"""
|
||||||
|
serialized = data.copy()
|
||||||
|
|
||||||
|
# Serialize raw_data if present
|
||||||
|
if 'raw_data' in serialized and serialized['raw_data'] is not None:
|
||||||
|
if not isinstance(serialized['raw_data'], str):
|
||||||
|
try:
|
||||||
|
# Convert datetime objects to strings for JSON serialization
|
||||||
|
raw_data = serialized['raw_data']
|
||||||
|
if isinstance(raw_data, dict):
|
||||||
|
# Handle datetime objects in the dict
|
||||||
|
json_safe_data = {}
|
||||||
|
for k, v in raw_data.items():
|
||||||
|
if hasattr(v, 'isoformat'): # datetime-like object
|
||||||
|
json_safe_data[k] = v.isoformat()
|
||||||
|
else:
|
||||||
|
json_safe_data[k] = v
|
||||||
|
serialized['raw_data'] = json_safe_data
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Could not serialize raw_data, storing as string: {e}")
|
||||||
|
serialized['raw_data'] = str(raw_data)
|
||||||
|
|
||||||
|
# Serialize processed_data if present
|
||||||
|
if 'processed_data' in serialized and serialized['processed_data'] is not None:
|
||||||
|
if not isinstance(serialized['processed_data'], str):
|
||||||
|
try:
|
||||||
|
processed_data = serialized['processed_data']
|
||||||
|
if isinstance(processed_data, dict):
|
||||||
|
json_safe_data = {}
|
||||||
|
for k, v in processed_data.items():
|
||||||
|
if hasattr(v, 'isoformat'): # datetime-like object
|
||||||
|
json_safe_data[k] = v.isoformat()
|
||||||
|
else:
|
||||||
|
json_safe_data[k] = v
|
||||||
|
serialized['processed_data'] = json_safe_data
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Could not serialize processed_data, storing as string: {e}")
|
||||||
|
serialized['processed_data'] = str(processed_data)
|
||||||
|
|
||||||
|
return serialized
|
||||||
|
|
||||||
|
async def bulk_create_weather_data(self, weather_records: List[Dict[str, Any]]) -> None:
|
||||||
|
"""
|
||||||
|
Bulk inserts new weather records into the database.
|
||||||
|
Used by WeatherService after fetching new historical data from an external API.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
if not weather_records:
|
||||||
|
return
|
||||||
|
|
||||||
|
# Serialize JSON fields before creating model instances
|
||||||
|
serialized_records = [self._serialize_json_fields(data) for data in weather_records]
|
||||||
|
records = [WeatherData(**data) for data in serialized_records]
|
||||||
|
self.session.add_all(records)
|
||||||
|
await self.session.commit()
|
||||||
|
logger.info(f"Successfully bulk inserted {len(records)} weather records")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
await self.session.rollback()
|
||||||
|
logger.error(
|
||||||
|
"Failed to bulk create weather records",
|
||||||
|
error=str(e),
|
||||||
|
count=len(weather_records)
|
||||||
|
)
|
||||||
|
raise
|
||||||
|
|
||||||
|
async def create_weather_data(self, data: Dict[str, Any]) -> WeatherData:
|
||||||
|
"""
|
||||||
|
Creates a single new weather data record.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Serialize JSON fields before creating model instance
|
||||||
|
serialized_data = self._serialize_json_fields(data)
|
||||||
|
new_record = WeatherData(**serialized_data)
|
||||||
|
self.session.add(new_record)
|
||||||
|
await self.session.commit()
|
||||||
|
await self.session.refresh(new_record)
|
||||||
|
logger.info(f"Created new weather record with ID {new_record.id}")
|
||||||
|
return new_record
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
await self.session.rollback()
|
||||||
|
logger.error("Failed to create single weather record", error=str(e))
|
||||||
|
raise
|
||||||
services/external/app/schemas/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
+# services/external/app/schemas/__init__.py
services/external/app/schemas/traffic.py
@@ -1,7 +1,7 @@
-# ================================================================
-# services/data/app/schemas/traffic.py
-# ================================================================
-"""Traffic data schemas"""
+# services/external/app/schemas/traffic.py
+"""
+Traffic Service Pydantic Schemas
+"""
 
 from pydantic import BaseModel, Field, field_validator
 from datetime import datetime
@@ -70,3 +70,31 @@ class TrafficAnalytics(BaseModel):
     peak_pedestrian_hour: Optional[int] = None
     congestion_distribution: dict = Field(default_factory=dict)
     avg_speed: Optional[float] = None
+
+class TrafficDataResponse(BaseModel):
+    date: datetime
+    traffic_volume: Optional[int]
+    pedestrian_count: Optional[int]
+    congestion_level: Optional[str]
+    average_speed: Optional[float]
+    source: str
+
+class LocationRequest(BaseModel):
+    latitude: float
+    longitude: float
+    address: Optional[str] = None
+
+class DateRangeRequest(BaseModel):
+    start_date: datetime
+    end_date: datetime
+
+class HistoricalTrafficRequest(BaseModel):
+    latitude: float
+    longitude: float
+    start_date: datetime
+    end_date: datetime
+
+class TrafficForecastRequest(BaseModel):
+    latitude: float
+    longitude: float
+    hours: int = 24
services/external/app/schemas/weather.py
@@ -1,6 +1,4 @@
-# ================================================================
-# services/data/app/schemas/weather.py
-# ================================================================
+# services/external/app/schemas/weather.py
 """Weather data schemas"""
 
 from pydantic import BaseModel, Field, field_validator
@@ -121,3 +119,43 @@ class WeatherAnalytics(BaseModel):
     weather_conditions: dict = Field(default_factory=dict)
     rainy_days: int = 0
     sunny_days: int = 0
+
+class WeatherDataResponse(BaseModel):
+    date: datetime
+    temperature: Optional[float]
+    precipitation: Optional[float]
+    humidity: Optional[float]
+    wind_speed: Optional[float]
+    pressure: Optional[float]
+    description: Optional[str]
+    source: str
+
+class WeatherForecastResponse(BaseModel):
+    forecast_date: datetime
+    generated_at: datetime
+    temperature: Optional[float]
+    precipitation: Optional[float]
+    humidity: Optional[float]
+    wind_speed: Optional[float]
+    description: Optional[str]
+    source: str
+
+class LocationRequest(BaseModel):
+    latitude: float
+    longitude: float
+    address: Optional[str] = None
+
+class DateRangeRequest(BaseModel):
+    start_date: datetime
+    end_date: datetime
+
+class HistoricalWeatherRequest(BaseModel):
+    latitude: float
+    longitude: float
+    start_date: datetime
+    end_date: datetime
+
+class WeatherForecastRequest(BaseModel):
+    latitude: float
+    longitude: float
+    days: int
services/external/app/services/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
+# services/external/app/services/__init__.py
services/external/app/services/messaging.py (new file, 63 lines)
@@ -0,0 +1,63 @@
|
|||||||
|
# services/external/app/services/messaging.py
|
||||||
|
"""
|
||||||
|
External Service Messaging - Event Publishing using shared messaging infrastructure
|
||||||
|
"""
|
||||||
|
|
||||||
|
from shared.messaging.rabbitmq import RabbitMQClient
|
||||||
|
from app.core.config import settings
|
||||||
|
import structlog
|
||||||
|
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
|
||||||
|
# Single global instance
|
||||||
|
data_publisher = RabbitMQClient(settings.RABBITMQ_URL, "data-service")
|
||||||
|
|
||||||
|
async def setup_messaging():
|
||||||
|
"""Initialize messaging for data service"""
|
||||||
|
try:
|
||||||
|
success = await data_publisher.connect()
|
||||||
|
if success:
|
||||||
|
logger.info("Data service messaging initialized")
|
||||||
|
else:
|
||||||
|
logger.warning("Data service messaging failed to initialize")
|
||||||
|
return success
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning("Failed to setup messaging", error=str(e))
|
||||||
|
return False
|
||||||
|
|
||||||
|
async def cleanup_messaging():
|
||||||
|
"""Cleanup messaging for data service"""
|
||||||
|
try:
|
||||||
|
await data_publisher.disconnect()
|
||||||
|
logger.info("Data service messaging cleaned up")
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning("Error during messaging cleanup", error=str(e))
|
||||||
|
|
||||||
|
async def publish_weather_updated(data: dict) -> bool:
|
||||||
|
"""Publish weather updated event"""
|
||||||
|
try:
|
||||||
|
return await data_publisher.publish_data_event("weather.updated", data)
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning("Failed to publish weather updated event", error=str(e))
|
||||||
|
return False
|
||||||
|
|
||||||
|
async def publish_traffic_updated(data: dict) -> bool:
|
||||||
|
"""Publish traffic updated event"""
|
||||||
|
try:
|
||||||
|
return await data_publisher.publish_data_event("traffic.updated", data)
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning("Failed to publish traffic updated event", error=str(e))
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
# Health check for messaging
|
||||||
|
async def check_messaging_health() -> dict:
|
||||||
|
"""Check messaging system health"""
|
||||||
|
try:
|
||||||
|
if data_publisher.connected:
|
||||||
|
return {"status": "healthy", "service": "rabbitmq", "connected": True}
|
||||||
|
else:
|
||||||
|
return {"status": "unhealthy", "service": "rabbitmq", "connected": False, "error": "Not connected"}
|
||||||
|
except Exception as e:
|
||||||
|
return {"status": "unhealthy", "service": "rabbitmq", "connected": False, "error": str(e)}
|
||||||
services/external/app/services/traffic_service.py (new file, 298 lines)
@@ -0,0 +1,298 @@
|
|||||||
|
# ================================================================
|
||||||
|
# services/data/app/services/traffic_service.py
|
||||||
|
# ================================================================
|
||||||
|
"""
|
||||||
|
Abstracted Traffic Service - Universal interface for traffic data across multiple cities
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import Dict, List, Any, Optional, Tuple
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
import structlog
|
||||||
|
|
||||||
|
from app.external.apis.traffic import UniversalTrafficClient
|
||||||
|
from app.models.traffic import TrafficData
|
||||||
|
from app.repositories.traffic_repository import TrafficRepository
|
||||||
|
|
||||||
|
logger = structlog.get_logger()
|
||||||
|
from app.core.database import database_manager
|
||||||
|
|
||||||
|
class TrafficService:
|
||||||
|
"""
|
||||||
|
Abstracted traffic service providing unified interface for traffic data
|
||||||
|
Routes requests to appropriate city-specific clients automatically
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.universal_client = UniversalTrafficClient()
|
||||||
|
self.database_manager = database_manager
|
||||||
|
|
||||||
|
async def get_current_traffic(
|
||||||
|
self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
tenant_id: Optional[str] = None
|
||||||
|
) -> Optional[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Get current traffic data for any supported location
|
||||||
|
|
||||||
|
Args:
|
||||||
|
latitude: Query location latitude
|
||||||
|
longitude: Query location longitude
|
||||||
|
tenant_id: Optional tenant identifier for logging/analytics
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dict with current traffic data or None if not available
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
logger.info("Getting current traffic data",
|
||||||
|
lat=latitude, lon=longitude, tenant_id=tenant_id)
|
||||||
|
|
||||||
|
# Delegate to universal client
|
||||||
|
traffic_data = await self.universal_client.get_current_traffic(latitude, longitude)
|
||||||
|
|
||||||
|
if traffic_data:
|
||||||
|
# Add service metadata
|
||||||
|
traffic_data['service_metadata'] = {
|
||||||
|
'request_timestamp': datetime.now().isoformat(),
|
||||||
|
'tenant_id': tenant_id,
|
||||||
|
'service_version': '2.0',
|
||||||
|
'query_location': {'latitude': latitude, 'longitude': longitude}
|
||||||
|
}
|
||||||
|
|
||||||
|
logger.info("Successfully retrieved current traffic data",
|
||||||
|
lat=latitude, lon=longitude,
|
||||||
|
source=traffic_data.get('source', 'unknown'))
|
||||||
|
|
||||||
|
return traffic_data
|
||||||
|
else:
|
||||||
|
logger.warning("No current traffic data available",
|
||||||
|
lat=latitude, lon=longitude)
|
||||||
|
return None
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Error getting current traffic data",
|
||||||
|
lat=latitude, lon=longitude, error=str(e))
|
||||||
|
return None
|
||||||
|
|
||||||
|
async def get_historical_traffic(
|
||||||
|
self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
start_date: datetime,
|
||||||
|
end_date: datetime,
|
||||||
|
tenant_id: Optional[str] = None
|
||||||
|
) -> List[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Get historical traffic data for any supported location with database storage
|
||||||
|
|
||||||
|
Args:
|
||||||
|
latitude: Query location latitude
|
||||||
|
longitude: Query location longitude
|
||||||
|
start_date: Start date for historical data
|
||||||
|
end_date: End date for historical data
|
||||||
|
tenant_id: Optional tenant identifier
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of historical traffic data dictionaries
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
logger.info("Getting historical traffic data",
|
||||||
|
lat=latitude, lon=longitude,
|
||||||
|
start=start_date, end=end_date, tenant_id=tenant_id)
|
||||||
|
|
||||||
|
# Validate date range
|
||||||
|
if start_date >= end_date:
|
||||||
|
logger.warning("Invalid date range", start=start_date, end=end_date)
|
||||||
|
return []
|
||||||
|
|
||||||
|
location_id = f"{latitude:.4f},{longitude:.4f}"
|
||||||
|
|
||||||
|
async with self.database_manager.get_session() as session:
|
||||||
|
traffic_repo = TrafficRepository(session)
|
||||||
|
# Check database first using the repository
|
||||||
|
db_records = await traffic_repo.get_by_location_and_date_range(
|
||||||
|
latitude, longitude, start_date, end_date, tenant_id
|
||||||
|
)
|
||||||
|
|
||||||
|
if db_records:
|
||||||
|
logger.info("Historical traffic data found in database",
|
||||||
|
count=len(db_records))
|
||||||
|
return [self._convert_db_record_to_dict(record) for record in db_records]
|
||||||
|
|
||||||
|
# Delegate to universal client if not in DB
|
||||||
|
traffic_data = await self.universal_client.get_historical_traffic(
|
||||||
|
latitude, longitude, start_date, end_date
|
||||||
|
)
|
||||||
|
|
||||||
|
if traffic_data:
|
||||||
|
# Add service metadata to each record
|
||||||
|
for record in traffic_data:
|
||||||
|
record['service_metadata'] = {
|
||||||
|
'request_timestamp': datetime.now().isoformat(),
|
||||||
|
'tenant_id': tenant_id,
|
||||||
|
'service_version': '2.0',
|
||||||
|
'query_location': {'latitude': latitude, 'longitude': longitude},
|
||||||
|
'date_range': {
|
||||||
|
'start': start_date.isoformat(),
|
||||||
|
'end': end_date.isoformat()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async with self.database_manager.get_session() as session:
|
||||||
|
traffic_repo = TrafficRepository(session)
|
||||||
|
# Store in database using the repository
|
||||||
|
stored_count = await traffic_repo.store_traffic_data_batch(
|
||||||
|
traffic_data, location_id, tenant_id
|
||||||
|
)
|
||||||
|
logger.info("Traffic data stored for re-training",
|
||||||
|
fetched=len(traffic_data), stored=stored_count,
|
||||||
|
location=location_id)
|
||||||
|
|
||||||
|
logger.info("Successfully retrieved historical traffic data",
|
||||||
|
lat=latitude, lon=longitude, records=len(traffic_data))
|
||||||
|
|
||||||
|
return traffic_data
|
||||||
|
else:
|
||||||
|
logger.info("No historical traffic data available",
|
||||||
|
lat=latitude, lon=longitude)
|
||||||
|
return []
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Error getting historical traffic data",
|
||||||
|
lat=latitude, lon=longitude, error=str(e))
|
||||||
|
return []
|
||||||
|
|
||||||
|
def _convert_db_record_to_dict(self, record: TrafficData) -> Dict[str, Any]:
|
||||||
|
"""Convert database record to dictionary format"""
|
||||||
|
return {
|
||||||
|
'date': record.date,
|
||||||
|
'traffic_volume': record.traffic_volume,
|
||||||
|
'pedestrian_count': record.pedestrian_count,
|
||||||
|
'congestion_level': record.congestion_level,
|
||||||
|
'average_speed': record.average_speed,
|
||||||
|
'source': record.source,
|
||||||
|
'location_id': record.location_id,
|
||||||
|
'raw_data': record.raw_data
|
||||||
|
}
|
||||||
|
|
||||||
|
async def get_traffic_events(
|
||||||
|
self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
radius_km: float = 5.0,
|
||||||
|
tenant_id: Optional[str] = None
|
||||||
|
) -> List[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Get traffic events and incidents for any supported location
|
||||||
|
|
||||||
|
Args:
|
||||||
|
latitude: Query location latitude
|
||||||
|
longitude: Query location longitude
|
||||||
|
radius_km: Search radius in kilometers
|
||||||
|
tenant_id: Optional tenant identifier
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of traffic events
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
logger.info("Getting traffic events",
|
||||||
|
lat=latitude, lon=longitude, radius=radius_km, tenant_id=tenant_id)
|
||||||
|
|
||||||
|
# Delegate to universal client
|
||||||
|
events = await self.universal_client.get_events(latitude, longitude, radius_km)
|
||||||
|
|
||||||
|
# Add metadata to events
|
||||||
|
for event in events:
|
||||||
|
event['service_metadata'] = {
|
||||||
|
'request_timestamp': datetime.now().isoformat(),
|
||||||
|
'tenant_id': tenant_id,
|
||||||
|
'service_version': '2.0',
|
||||||
|
'query_location': {'latitude': latitude, 'longitude': longitude},
|
||||||
|
'search_radius_km': radius_km
|
||||||
|
}
|
||||||
|
|
||||||
|
logger.info("Retrieved traffic events",
|
||||||
|
lat=latitude, lon=longitude, events=len(events))
|
||||||
|
|
||||||
|
return events
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Error getting traffic events",
|
||||||
|
lat=latitude, lon=longitude, error=str(e))
|
||||||
|
return []
|
||||||
|
|
||||||
|
def get_location_info(self, latitude: float, longitude: float) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Get information about traffic data availability for location
|
||||||
|
|
||||||
|
Args:
|
||||||
|
latitude: Query location latitude
|
||||||
|
longitude: Query location longitude
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dict with location support information
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
info = self.universal_client.get_location_info(latitude, longitude)
|
||||||
|
|
||||||
|
# Add service layer information
|
||||||
|
info['service_layer'] = {
|
||||||
|
'version': '2.0',
|
||||||
|
'abstraction_level': 'universal',
|
||||||
|
'supported_operations': [
|
||||||
|
'current_traffic',
|
||||||
|
'historical_traffic',
|
||||||
|
'traffic_events',
|
||||||
|
'bulk_requests'
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
return info
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Error getting location info",
|
||||||
|
lat=latitude, lon=longitude, error=str(e))
|
||||||
|
return {
|
||||||
|
'supported': False,
|
||||||
|
'error': str(e),
|
||||||
|
'service_layer': {'version': '2.0'}
|
||||||
|
}
|
||||||
|
|
||||||
|
async def get_stored_traffic_for_training(self,
|
||||||
|
latitude: float,
|
||||||
|
longitude: float,
|
||||||
|
start_date: datetime,
|
||||||
|
end_date: datetime) -> List[Dict[str, Any]]:
|
||||||
|
"""Retrieve stored traffic data specifically for training purposes"""
|
||||||
|
try:
|
||||||
|
async with self.database_manager.get_session() as session:
|
||||||
|
traffic_repo = TrafficRepository(session)
|
||||||
|
records = await traffic_repo.get_historical_traffic_for_training(
|
||||||
|
latitude, longitude, start_date, end_date
|
||||||
|
)
|
||||||
|
|
||||||
|
# Convert to training format
|
||||||
|
training_data = []
|
||||||
|
for record in records:
|
||||||
|
training_data.append({
|
||||||
|
'date': record.date,
|
||||||
|
'traffic_volume': record.traffic_volume,
|
||||||
|
'pedestrian_count': record.pedestrian_count,
|
||||||
|
'congestion_level': record.congestion_level,
|
||||||
|
'average_speed': record.average_speed,
|
||||||
|
'location_id': record.location_id,
|
||||||
|
'source': record.source,
|
||||||
|
'measurement_point_id': record.raw_data # Contains additional metadata
|
||||||
|
})
|
||||||
|
|
||||||
|
logger.info(f"Retrieved {len(training_data)} traffic records for training",
|
||||||
|
location_id=f"{latitude:.4f},{longitude:.4f}", start=start_date, end=end_date)
|
||||||
|
|
||||||
|
return training_data
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error("Failed to retrieve traffic data for training",
|
||||||
|
error=str(e), location_id=f"{latitude:.4f},{longitude:.4f}")
|
||||||
|
return []
|
||||||
@@ -1,24 +1,25 @@
|
|||||||
# ================================================================
|
# services/data/app/services/weather_service.py - REVISED VERSION
|
||||||
# services/data/app/services/weather_service.py - FIXED VERSION
|
|
||||||
# ================================================================
|
"""Weather data service with repository pattern"""
|
||||||
"""Weather data service with improved error handling"""
|
|
||||||
|
|
||||||
from typing import List, Dict, Any, Optional
|
from typing import List, Dict, Any, Optional
|
||||||
from datetime import datetime, timedelta
|
from datetime import datetime, timedelta
|
||||||
from sqlalchemy.ext.asyncio import AsyncSession
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
from sqlalchemy import select, and_
|
|
||||||
import structlog
|
import structlog
|
||||||
|
|
||||||
from app.models.weather import WeatherData, WeatherForecast
|
from app.models.weather import WeatherData, WeatherForecast
|
||||||
from app.external.aemet import AEMETClient
|
from app.external.aemet import AEMETClient
|
||||||
from app.schemas.external import WeatherDataResponse, WeatherForecastResponse
|
from app.schemas.weather import WeatherDataResponse, WeatherForecastResponse
|
||||||
|
from app.repositories.weather_repository import WeatherRepository
|
||||||
|
|
||||||
logger = structlog.get_logger()
|
logger = structlog.get_logger()
|
||||||
|
from app.core.database import database_manager
|
||||||
|
|
||||||
class WeatherService:
|
class WeatherService:
|
||||||
|
|
||||||
def __init__(self):
|
def __init__(self):
|
||||||
self.aemet_client = AEMETClient()
|
self.aemet_client = AEMETClient()
|
||||||
|
self.database_manager = database_manager
|
||||||
|
|
||||||
async def get_current_weather(self, latitude: float, longitude: float) -> Optional[WeatherDataResponse]:
|
async def get_current_weather(self, latitude: float, longitude: float) -> Optional[WeatherDataResponse]:
|
||||||
"""Get current weather for location"""
|
"""Get current weather for location"""
|
||||||
@@ -82,26 +83,23 @@ class WeatherService:
|
|||||||
latitude: float,
|
latitude: float,
|
||||||
longitude: float,
|
longitude: float,
|
||||||
start_date: datetime,
|
start_date: datetime,
|
||||||
end_date: datetime,
|
end_date: datetime) -> List[WeatherDataResponse]:
|
||||||
db: AsyncSession) -> List[WeatherDataResponse]:
|
|
||||||
"""Get historical weather data"""
|
"""Get historical weather data"""
|
||||||
try:
|
try:
|
||||||
logger.debug("Getting historical weather",
|
logger.debug("Getting historical weather",
|
||||||
lat=latitude, lon=longitude,
|
lat=latitude, lon=longitude,
|
||||||
start=start_date, end=end_date)
|
start=start_date, end=end_date)
|
||||||
|
|
||||||
# First check database
|
|
||||||
location_id = f"{latitude:.4f},{longitude:.4f}"
|
location_id = f"{latitude:.4f},{longitude:.4f}"
|
||||||
stmt = select(WeatherData).where(
|
|
||||||
and_(
|
|
||||||
WeatherData.location_id == location_id,
|
|
||||||
WeatherData.date >= start_date,
|
|
||||||
WeatherData.date <= end_date
|
|
||||||
)
|
|
||||||
).order_by(WeatherData.date)
|
|
||||||
|
|
||||||
result = await db.execute(stmt)
|
async with self.database_manager.get_session() as session:
|
||||||
db_records = result.scalars().all()
|
weather_repository = WeatherRepository(session)
|
||||||
|
# Use the repository to get data from the database
|
||||||
|
db_records = await weather_repository.get_historical_weather(
|
||||||
|
location_id,
|
||||||
|
start_date,
|
||||||
|
end_date
|
||||||
|
)
|
||||||
|
|
||||||
if db_records:
|
if db_records:
|
||||||
logger.debug("Historical data found in database", count=len(db_records))
|
logger.debug("Historical data found in database", count=len(db_records))
|
||||||
@@ -123,28 +121,28 @@ class WeatherService:
|
|||||||
)
|
)
|
||||||
|
|
||||||
if weather_data:
|
if weather_data:
|
||||||
# Store in database for future use
|
# Use the repository to store the new data
|
||||||
try:
|
records_to_store = [{
|
||||||
for data in weather_data:
|
"location_id": location_id,
|
||||||
weather_record = WeatherData(
|
"city": "Madrid", # Default city for AEMET data
|
||||||
location_id=location_id,
|
"date": data.get('date', datetime.now()),
|
||||||
date=data.get('date', datetime.now()),
|
"temperature": data.get('temperature'),
|
||||||
temperature=data.get('temperature'),
|
"precipitation": data.get('precipitation'),
|
||||||
precipitation=data.get('precipitation'),
|
"humidity": data.get('humidity'),
|
||||||
humidity=data.get('humidity'),
|
"wind_speed": data.get('wind_speed'),
|
||||||
wind_speed=data.get('wind_speed'),
|
"pressure": data.get('pressure'),
|
||||||
pressure=data.get('pressure'),
|
"description": data.get('description'),
|
||||||
description=data.get('description'),
|
"source": "aemet",
|
||||||
source="aemet",
|
"data_type": "historical",
|
||||||
raw_data=str(data)
|
"raw_data": data, # Pass as dict, not string
|
||||||
)
|
"tenant_id": None
|
||||||
db.add(weather_record)
|
} for data in weather_data]
|
||||||
|
|
||||||
await db.commit()
|
async with self.database_manager.get_session() as session:
|
||||||
logger.debug("Historical data stored in database", count=len(weather_data))
|
weather_repository = WeatherRepository(session)
|
||||||
except Exception as db_error:
|
await weather_repository.bulk_create_weather_data(records_to_store)
|
||||||
logger.warning("Failed to store historical data in database", error=str(db_error))
|
|
||||||
await db.rollback()
|
logger.debug("Historical data stored in database", count=len(weather_data))
|
||||||
|
|
||||||
return [WeatherDataResponse(**item) for item in weather_data]
|
return [WeatherDataResponse(**item) for item in weather_data]
|
||||||
else:
|
else:
|
||||||
19
services/external/pytest.ini
vendored
Normal file
19
services/external/pytest.ini
vendored
Normal file
@@ -0,0 +1,19 @@
|
|||||||
|
[tool:pytest]
|
||||||
|
testpaths = tests
|
||||||
|
asyncio_mode = auto
|
||||||
|
python_files = test_*.py
|
||||||
|
python_classes = Test*
|
||||||
|
python_functions = test_*
|
||||||
|
addopts =
|
||||||
|
-v
|
||||||
|
--tb=short
|
||||||
|
--strict-markers
|
||||||
|
--disable-warnings
|
||||||
|
--cov=app
|
||||||
|
--cov-report=term-missing
|
||||||
|
--cov-report=html:htmlcov
|
||||||
|
markers =
|
||||||
|
unit: Unit tests
|
||||||
|
integration: Integration tests
|
||||||
|
slow: Slow running tests
|
||||||
|
external: Tests requiring external services
|
||||||
56
services/external/requirements.txt
vendored
Normal file
56
services/external/requirements.txt
vendored
Normal file
@@ -0,0 +1,56 @@
|
|||||||
|
# services/external/requirements.txt
|
||||||
|
# FastAPI and web framework
|
||||||
|
fastapi==0.104.1
|
||||||
|
uvicorn[standard]==0.24.0
|
||||||
|
|
||||||
|
# Database
|
||||||
|
sqlalchemy==2.0.23
|
||||||
|
psycopg2-binary==2.9.9
|
||||||
|
asyncpg==0.29.0
|
||||||
|
aiosqlite==0.19.0
|
||||||
|
alembic==1.12.1
|
||||||
|
|
||||||
|
# HTTP clients for external APIs
|
||||||
|
httpx==0.25.2
|
||||||
|
aiofiles==23.2.0
|
||||||
|
requests==2.31.0
|
||||||
|
|
||||||
|
# Data processing and time series
|
||||||
|
pandas==2.1.3
|
||||||
|
numpy==1.25.2
|
||||||
|
|
||||||
|
# Validation and serialization
|
||||||
|
pydantic==2.5.0
|
||||||
|
pydantic-settings==2.0.3
|
||||||
|
|
||||||
|
# Authentication and security
|
||||||
|
python-jose[cryptography]==3.3.0
|
||||||
|
|
||||||
|
# Logging and monitoring
|
||||||
|
structlog==23.2.0
|
||||||
|
prometheus-client==0.19.0
|
||||||
|
|
||||||
|
# Message queues
|
||||||
|
aio-pika==9.3.1
|
||||||
|
|
||||||
|
# Background job processing
|
||||||
|
redis==5.0.1
|
||||||
|
|
||||||
|
# Date and time handling
|
||||||
|
pytz==2023.3
|
||||||
|
python-dateutil==2.8.2
|
||||||
|
|
||||||
|
# XML parsing (for some APIs)
|
||||||
|
lxml==4.9.3
|
||||||
|
|
||||||
|
# Geospatial processing
|
||||||
|
pyproj==3.6.1
|
||||||
|
|
||||||
|
# Note: pytest and testing dependencies are in tests/requirements.txt
|
||||||
|
|
||||||
|
# Development
|
||||||
|
python-multipart==0.0.6
|
||||||
|
|
||||||
|
# External API specific
|
||||||
|
beautifulsoup4==4.12.2 # For web scraping if needed
|
||||||
|
xmltodict==0.13.0 # For XML API responses
|
||||||
1
services/external/shared/shared
vendored
Symbolic link
1
services/external/shared/shared
vendored
Symbolic link
@@ -0,0 +1 @@
|
|||||||
|
/Users/urtzialfaro/Documents/bakery-ia/shared
|
||||||
314
services/external/tests/conftest.py
vendored
Normal file
314
services/external/tests/conftest.py
vendored
Normal file
@@ -0,0 +1,314 @@
|
|||||||
|
# services/external/tests/conftest.py
|
||||||
|
"""
|
||||||
|
Pytest configuration and fixtures for External Service tests
|
||||||
|
"""
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import asyncio
|
||||||
|
from datetime import datetime, timezone
|
||||||
|
from typing import AsyncGenerator
|
||||||
|
from uuid import uuid4, UUID
|
||||||
|
|
||||||
|
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||||
|
from sqlalchemy.pool import StaticPool
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
|
||||||
|
from app.main import app
|
||||||
|
from app.core.config import settings
|
||||||
|
from app.core.database import Base, get_db
|
||||||
|
from app.models.weather import WeatherData, WeatherStation
|
||||||
|
from app.models.traffic import TrafficData, TrafficMeasurementPoint
|
||||||
|
|
||||||
|
|
||||||
|
# Test database configuration
|
||||||
|
TEST_DATABASE_URL = "sqlite+aiosqlite:///:memory:"
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture(scope="session")
|
||||||
|
def event_loop():
|
||||||
|
"""Create event loop for the test session"""
|
||||||
|
loop = asyncio.new_event_loop()
|
||||||
|
yield loop
|
||||||
|
loop.close()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
async def test_engine():
|
||||||
|
"""Create test database engine"""
|
||||||
|
engine = create_async_engine(
|
||||||
|
TEST_DATABASE_URL,
|
||||||
|
poolclass=StaticPool,
|
||||||
|
connect_args={"check_same_thread": False}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create tables
|
||||||
|
async with engine.begin() as conn:
|
||||||
|
await conn.run_sync(Base.metadata.create_all)
|
||||||
|
|
||||||
|
yield engine
|
||||||
|
|
||||||
|
await engine.dispose()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
async def test_db_session(test_engine) -> AsyncGenerator[AsyncSession, None]:
|
||||||
|
"""Create test database session"""
|
||||||
|
async_session = async_sessionmaker(
|
||||||
|
test_engine, class_=AsyncSession, expire_on_commit=False
|
||||||
|
)
|
||||||
|
|
||||||
|
async with async_session() as session:
|
||||||
|
yield session
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def test_client():
|
||||||
|
"""Create test client"""
|
||||||
|
return TestClient(app)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
async def override_get_db(test_db_session):
|
||||||
|
"""Override get_db dependency for testing"""
|
||||||
|
async def _override_get_db():
|
||||||
|
yield test_db_session
|
||||||
|
|
||||||
|
app.dependency_overrides[get_db] = _override_get_db
|
||||||
|
yield
|
||||||
|
app.dependency_overrides.clear()
|
||||||
|
|
||||||
|
|
||||||
|
# Test data fixtures
|
||||||
|
@pytest.fixture
|
||||||
|
def sample_tenant_id() -> UUID:
|
||||||
|
"""Sample tenant ID for testing"""
|
||||||
|
return uuid4()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def sample_weather_data() -> dict:
|
||||||
|
"""Sample weather data for testing"""
|
||||||
|
return {
|
||||||
|
"city": "madrid",
|
||||||
|
"location_id": "40.4168,-3.7038",
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"temperature": 18.5,
|
||||||
|
"humidity": 65.0,
|
||||||
|
"pressure": 1013.2,
|
||||||
|
"wind_speed": 10.2,
|
||||||
|
"condition": "partly_cloudy",
|
||||||
|
"description": "Parcialmente nublado",
|
||||||
|
"source": "aemet",
|
||||||
|
"data_type": "current",
|
||||||
|
"is_forecast": False,
|
||||||
|
"data_quality_score": 95.0
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def sample_traffic_data() -> dict:
|
||||||
|
"""Sample traffic data for testing"""
|
||||||
|
return {
|
||||||
|
"city": "madrid",
|
||||||
|
"location_id": "PM_M30_001",
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"measurement_point_id": "PM_M30_001",
|
||||||
|
"measurement_point_name": "M-30 Norte - Nudo Norte",
|
||||||
|
"measurement_point_type": "M30",
|
||||||
|
"traffic_volume": 850,
|
||||||
|
"average_speed": 65.2,
|
||||||
|
"congestion_level": "medium",
|
||||||
|
"occupation_percentage": 45.8,
|
||||||
|
"latitude": 40.4501,
|
||||||
|
"longitude": -3.6919,
|
||||||
|
"district": "Chamartín",
|
||||||
|
"source": "madrid_opendata",
|
||||||
|
"data_quality_score": 92.0,
|
||||||
|
"is_synthetic": False
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def sample_weather_forecast() -> list[dict]:
|
||||||
|
"""Sample weather forecast data"""
|
||||||
|
base_date = datetime.now(timezone.utc)
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
"city": "madrid",
|
||||||
|
"location_id": "40.4168,-3.7038",
|
||||||
|
"date": base_date,
|
||||||
|
"forecast_date": base_date,
|
||||||
|
"temperature": 20.0,
|
||||||
|
"temperature_min": 15.0,
|
||||||
|
"temperature_max": 25.0,
|
||||||
|
"precipitation": 0.0,
|
||||||
|
"humidity": 60.0,
|
||||||
|
"wind_speed": 12.0,
|
||||||
|
"condition": "sunny",
|
||||||
|
"description": "Soleado",
|
||||||
|
"source": "aemet",
|
||||||
|
"data_type": "forecast",
|
||||||
|
"is_forecast": True,
|
||||||
|
"data_quality_score": 85.0
|
||||||
|
}
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
async def populated_weather_db(test_db_session: AsyncSession, sample_weather_data: dict):
|
||||||
|
"""Database populated with weather test data"""
|
||||||
|
weather_record = WeatherData(**sample_weather_data)
|
||||||
|
test_db_session.add(weather_record)
|
||||||
|
await test_db_session.commit()
|
||||||
|
yield test_db_session
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
async def populated_traffic_db(test_db_session: AsyncSession, sample_traffic_data: dict):
|
||||||
|
"""Database populated with traffic test data"""
|
||||||
|
traffic_record = TrafficData(**sample_traffic_data)
|
||||||
|
test_db_session.add(traffic_record)
|
||||||
|
await test_db_session.commit()
|
||||||
|
yield test_db_session
|
||||||
|
|
||||||
|
|
||||||
|
# Mock external API fixtures
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_aemet_response():
|
||||||
|
"""Mock AEMET API response"""
|
||||||
|
return {
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"temperature": 18.5,
|
||||||
|
"humidity": 65.0,
|
||||||
|
"pressure": 1013.2,
|
||||||
|
"wind_speed": 10.2,
|
||||||
|
"description": "Parcialmente nublado",
|
||||||
|
"source": "aemet"
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_madrid_traffic_xml():
|
||||||
|
"""Mock Madrid Open Data traffic XML"""
|
||||||
|
return """<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<pms>
|
||||||
|
<pm codigo="PM_M30_001" nombre="M-30 Norte - Nudo Norte">
|
||||||
|
<intensidad>850</intensidad>
|
||||||
|
<ocupacion>45</ocupacion>
|
||||||
|
<velocidad>65</velocidad>
|
||||||
|
<fechahora>2024-01-15T10:30:00</fechahora>
|
||||||
|
</pm>
|
||||||
|
<pm codigo="PM_URB_002" nombre="Gran Vía - Plaza España">
|
||||||
|
<intensidad>320</intensidad>
|
||||||
|
<ocupacion>78</ocupacion>
|
||||||
|
<velocidad>25</velocidad>
|
||||||
|
<fechahora>2024-01-15T10:30:00</fechahora>
|
||||||
|
</pm>
|
||||||
|
</pms>"""
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_messaging():
|
||||||
|
"""Mock messaging service"""
|
||||||
|
class MockMessaging:
|
||||||
|
def __init__(self):
|
||||||
|
self.published_events = []
|
||||||
|
|
||||||
|
async def publish_weather_updated(self, data):
|
||||||
|
self.published_events.append(("weather_updated", data))
|
||||||
|
return True
|
||||||
|
|
||||||
|
async def publish_traffic_updated(self, data):
|
||||||
|
self.published_events.append(("traffic_updated", data))
|
||||||
|
return True
|
||||||
|
|
||||||
|
async def publish_collection_job_started(self, data):
|
||||||
|
self.published_events.append(("job_started", data))
|
||||||
|
return True
|
||||||
|
|
||||||
|
async def publish_collection_job_completed(self, data):
|
||||||
|
self.published_events.append(("job_completed", data))
|
||||||
|
return True
|
||||||
|
|
||||||
|
return MockMessaging()
|
||||||
|
|
||||||
|
|
||||||
|
# Mock external clients
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_aemet_client():
|
||||||
|
"""Mock AEMET client"""
|
||||||
|
class MockAEMETClient:
|
||||||
|
async def get_current_weather(self, lat, lon):
|
||||||
|
return {
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"temperature": 18.5,
|
||||||
|
"humidity": 65.0,
|
||||||
|
"pressure": 1013.2,
|
||||||
|
"wind_speed": 10.2,
|
||||||
|
"description": "Parcialmente nublado",
|
||||||
|
"source": "aemet"
|
||||||
|
}
|
||||||
|
|
||||||
|
async def get_forecast(self, lat, lon, days):
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
"forecast_date": datetime.now(timezone.utc),
|
||||||
|
"temperature": 20.0,
|
||||||
|
"temperature_min": 15.0,
|
||||||
|
"temperature_max": 25.0,
|
||||||
|
"precipitation": 0.0,
|
||||||
|
"humidity": 60.0,
|
||||||
|
"wind_speed": 12.0,
|
||||||
|
"description": "Soleado",
|
||||||
|
"source": "aemet"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
|
||||||
|
return MockAEMETClient()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_madrid_client():
|
||||||
|
"""Mock Madrid traffic client"""
|
||||||
|
class MockMadridClient:
|
||||||
|
async def fetch_current_traffic_xml(self):
|
||||||
|
return """<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<pms>
|
||||||
|
<pm codigo="PM_TEST_001" nombre="Test Point">
|
||||||
|
<intensidad>500</intensidad>
|
||||||
|
<ocupacion>50</ocupacion>
|
||||||
|
<velocidad>50</velocidad>
|
||||||
|
<fechahora>2024-01-15T10:30:00</fechahora>
|
||||||
|
</pm>
|
||||||
|
</pms>"""
|
||||||
|
|
||||||
|
return MockMadridClient()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_madrid_processor():
|
||||||
|
"""Mock Madrid traffic processor"""
|
||||||
|
class MockMadridProcessor:
|
||||||
|
async def process_current_traffic_xml(self, xml_content):
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
"city": "madrid",
|
||||||
|
"location_id": "PM_TEST_001",
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"measurement_point_id": "PM_TEST_001",
|
||||||
|
"measurement_point_name": "Test Point",
|
||||||
|
"measurement_point_type": "TEST",
|
||||||
|
"traffic_volume": 500,
|
||||||
|
"average_speed": 50.0,
|
||||||
|
"congestion_level": "medium",
|
||||||
|
"occupation_percentage": 50.0,
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"district": "Centro",
|
||||||
|
"source": "madrid_opendata",
|
||||||
|
"data_quality_score": 90.0,
|
||||||
|
"is_synthetic": False
|
||||||
|
}
|
||||||
|
]
|
||||||
|
|
||||||
|
return MockMadridProcessor()
|
||||||
9
services/external/tests/requirements.txt
vendored
Normal file
9
services/external/tests/requirements.txt
vendored
Normal file
@@ -0,0 +1,9 @@
|
|||||||
|
# Testing dependencies for External Service
|
||||||
|
pytest==7.4.3
|
||||||
|
pytest-asyncio==0.21.1
|
||||||
|
pytest-mock==3.12.0
|
||||||
|
httpx==0.25.2
|
||||||
|
fastapi[all]==0.104.1
|
||||||
|
sqlalchemy[asyncio]==2.0.23
|
||||||
|
aiosqlite==0.19.0
|
||||||
|
coverage==7.3.2
|
||||||
393
services/external/tests/unit/test_repositories.py
vendored
Normal file
393
services/external/tests/unit/test_repositories.py
vendored
Normal file
@@ -0,0 +1,393 @@
|
|||||||
|
# services/external/tests/unit/test_repositories.py
|
||||||
|
"""
|
||||||
|
Unit tests for External Service Repositories
|
||||||
|
"""
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from datetime import datetime, timezone, timedelta
|
||||||
|
from uuid import uuid4
|
||||||
|
|
||||||
|
from app.repositories.weather_repository import WeatherRepository
|
||||||
|
from app.repositories.traffic_repository import TrafficRepository
|
||||||
|
from app.models.weather import WeatherData, WeatherStation, WeatherDataJob
|
||||||
|
from app.models.traffic import TrafficData, TrafficMeasurementPoint, TrafficDataJob
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
class TestWeatherRepository:
|
||||||
|
"""Test Weather Repository operations"""
|
||||||
|
|
||||||
|
async def test_create_weather_data(self, test_db_session, sample_weather_data):
|
||||||
|
"""Test creating weather data"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
record = await repository.create_weather_data(sample_weather_data)
|
||||||
|
|
||||||
|
assert record is not None
|
||||||
|
assert record.id is not None
|
||||||
|
assert record.city == sample_weather_data["city"]
|
||||||
|
assert record.temperature == sample_weather_data["temperature"]
|
||||||
|
|
||||||
|
async def test_get_current_weather(self, populated_weather_db, sample_weather_data):
|
||||||
|
"""Test getting current weather data"""
|
||||||
|
repository = WeatherRepository(populated_weather_db)
|
||||||
|
|
||||||
|
result = await repository.get_current_weather("madrid")
|
||||||
|
|
||||||
|
assert result is not None
|
||||||
|
assert result.city == "madrid"
|
||||||
|
assert result.temperature == sample_weather_data["temperature"]
|
||||||
|
|
||||||
|
async def test_get_weather_forecast(self, test_db_session, sample_weather_forecast):
|
||||||
|
"""Test getting weather forecast"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create forecast data
|
||||||
|
for forecast_item in sample_weather_forecast:
|
||||||
|
await repository.create_weather_data(forecast_item)
|
||||||
|
|
||||||
|
result = await repository.get_weather_forecast("madrid", 7)
|
||||||
|
|
||||||
|
assert len(result) == 1
|
||||||
|
assert result[0].is_forecast is True
|
||||||
|
|
||||||
|
async def test_get_historical_weather(self, test_db_session, sample_weather_data):
|
||||||
|
"""Test getting historical weather data"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create historical data
|
||||||
|
historical_data = sample_weather_data.copy()
|
||||||
|
historical_data["date"] = datetime.now(timezone.utc) - timedelta(days=1)
|
||||||
|
await repository.create_weather_data(historical_data)
|
||||||
|
|
||||||
|
start_date = datetime.now(timezone.utc) - timedelta(days=2)
|
||||||
|
end_date = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
result = await repository.get_historical_weather("madrid", start_date, end_date)
|
||||||
|
|
||||||
|
assert len(result) >= 1
|
||||||
|
|
||||||
|
async def test_create_weather_station(self, test_db_session):
|
||||||
|
"""Test creating weather station"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
station_data = {
|
||||||
|
"station_id": "TEST_001",
|
||||||
|
"name": "Test Station",
|
||||||
|
"city": "madrid",
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"altitude": 650.0,
|
||||||
|
"is_active": True
|
||||||
|
}
|
||||||
|
|
||||||
|
station = await repository.create_weather_station(station_data)
|
||||||
|
|
||||||
|
assert station is not None
|
||||||
|
assert station.station_id == "TEST_001"
|
||||||
|
assert station.name == "Test Station"
|
||||||
|
|
||||||
|
async def test_get_weather_stations(self, test_db_session):
|
||||||
|
"""Test getting weather stations"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create test station
|
||||||
|
station_data = {
|
||||||
|
"station_id": "TEST_001",
|
||||||
|
"name": "Test Station",
|
||||||
|
"city": "madrid",
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"is_active": True
|
||||||
|
}
|
||||||
|
await repository.create_weather_station(station_data)
|
||||||
|
|
||||||
|
stations = await repository.get_weather_stations("madrid")
|
||||||
|
|
||||||
|
assert len(stations) == 1
|
||||||
|
assert stations[0].station_id == "TEST_001"
|
||||||
|
|
||||||
|
async def test_create_weather_job(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test creating weather data collection job"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
job_data = {
|
||||||
|
"job_type": "current",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "pending",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
|
||||||
|
job = await repository.create_weather_job(job_data)
|
||||||
|
|
||||||
|
assert job is not None
|
||||||
|
assert job.job_type == "current"
|
||||||
|
assert job.status == "pending"
|
||||||
|
|
||||||
|
async def test_update_weather_job(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test updating weather job"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create job first
|
||||||
|
job_data = {
|
||||||
|
"job_type": "current",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "pending",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
job = await repository.create_weather_job(job_data)
|
||||||
|
|
||||||
|
# Update job
|
||||||
|
update_data = {
|
||||||
|
"status": "completed",
|
||||||
|
"completed_at": datetime.utcnow(),
|
||||||
|
"success_count": 1
|
||||||
|
}
|
||||||
|
|
||||||
|
success = await repository.update_weather_job(job.id, update_data)
|
||||||
|
|
||||||
|
assert success is True
|
||||||
|
|
||||||
|
async def test_get_weather_jobs(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test getting weather jobs"""
|
||||||
|
repository = WeatherRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create test job
|
||||||
|
job_data = {
|
||||||
|
"job_type": "forecast",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "completed",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
await repository.create_weather_job(job_data)
|
||||||
|
|
||||||
|
jobs = await repository.get_weather_jobs()
|
||||||
|
|
||||||
|
assert len(jobs) >= 1
|
||||||
|
assert any(job.job_type == "forecast" for job in jobs)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
class TestTrafficRepository:
|
||||||
|
"""Test Traffic Repository operations"""
|
||||||
|
|
||||||
|
async def test_create_traffic_data(self, test_db_session, sample_traffic_data):
|
||||||
|
"""Test creating traffic data"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Convert sample data to list for bulk create
|
||||||
|
traffic_list = [sample_traffic_data]
|
||||||
|
|
||||||
|
count = await repository.bulk_create_traffic_data(traffic_list)
|
||||||
|
|
||||||
|
assert count == 1
|
||||||
|
|
||||||
|
async def test_get_current_traffic(self, populated_traffic_db, sample_traffic_data):
|
||||||
|
"""Test getting current traffic data"""
|
||||||
|
repository = TrafficRepository(populated_traffic_db)
|
||||||
|
|
||||||
|
result = await repository.get_current_traffic("madrid")
|
||||||
|
|
||||||
|
assert len(result) >= 1
|
||||||
|
assert result[0].city == "madrid"
|
||||||
|
|
||||||
|
async def test_get_current_traffic_with_filters(self, populated_traffic_db):
|
||||||
|
"""Test getting current traffic with filters"""
|
||||||
|
repository = TrafficRepository(populated_traffic_db)
|
||||||
|
|
||||||
|
result = await repository.get_current_traffic("madrid", district="Chamartín")
|
||||||
|
|
||||||
|
# Should return results based on filter
|
||||||
|
assert isinstance(result, list)
|
||||||
|
|
||||||
|
async def test_get_historical_traffic(self, test_db_session, sample_traffic_data):
|
||||||
|
"""Test getting historical traffic data"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create historical data
|
||||||
|
historical_data = sample_traffic_data.copy()
|
||||||
|
historical_data["date"] = datetime.now(timezone.utc) - timedelta(days=1)
|
||||||
|
await repository.bulk_create_traffic_data([historical_data])
|
||||||
|
|
||||||
|
start_date = datetime.now(timezone.utc) - timedelta(days=2)
|
||||||
|
end_date = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
result = await repository.get_historical_traffic("madrid", start_date, end_date)
|
||||||
|
|
||||||
|
assert len(result) >= 1
|
||||||
|
|
||||||
|
async def test_create_measurement_point(self, test_db_session):
|
||||||
|
"""Test creating traffic measurement point"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
point_data = {
|
||||||
|
"point_id": "TEST_POINT_001",
|
||||||
|
"name": "Test Measurement Point",
|
||||||
|
"city": "madrid",
|
||||||
|
"point_type": "TEST",
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"district": "Centro",
|
||||||
|
"road_name": "Test Road",
|
||||||
|
"is_active": True
|
||||||
|
}
|
||||||
|
|
||||||
|
point = await repository.create_measurement_point(point_data)
|
||||||
|
|
||||||
|
assert point is not None
|
||||||
|
assert point.point_id == "TEST_POINT_001"
|
||||||
|
assert point.name == "Test Measurement Point"
|
||||||
|
|
||||||
|
async def test_get_measurement_points(self, test_db_session):
|
||||||
|
"""Test getting measurement points"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create test point
|
||||||
|
point_data = {
|
||||||
|
"point_id": "TEST_POINT_001",
|
||||||
|
"name": "Test Point",
|
||||||
|
"city": "madrid",
|
||||||
|
"point_type": "TEST",
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"is_active": True
|
||||||
|
}
|
||||||
|
await repository.create_measurement_point(point_data)
|
||||||
|
|
||||||
|
points = await repository.get_measurement_points("madrid")
|
||||||
|
|
||||||
|
assert len(points) == 1
|
||||||
|
assert points[0].point_id == "TEST_POINT_001"
|
||||||
|
|
||||||
|
async def test_get_measurement_points_with_filters(self, test_db_session):
|
||||||
|
"""Test getting measurement points with filters"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create test points with different types
|
||||||
|
for i, point_type in enumerate(["M30", "URB", "TEST"]):
|
||||||
|
point_data = {
|
||||||
|
"point_id": f"TEST_POINT_{i:03d}",
|
||||||
|
"name": f"Test Point {i}",
|
||||||
|
"city": "madrid",
|
||||||
|
"point_type": point_type,
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"is_active": True
|
||||||
|
}
|
||||||
|
await repository.create_measurement_point(point_data)
|
||||||
|
|
||||||
|
# Filter by type
|
||||||
|
points = await repository.get_measurement_points("madrid", road_type="M30")
|
||||||
|
|
||||||
|
assert len(points) == 1
|
||||||
|
assert points[0].point_type == "M30"
|
||||||
|
|
||||||
|
async def test_get_traffic_analytics(self, populated_traffic_db):
|
||||||
|
"""Test getting traffic analytics"""
|
||||||
|
repository = TrafficRepository(populated_traffic_db)
|
||||||
|
|
||||||
|
analytics = await repository.get_traffic_analytics("madrid")
|
||||||
|
|
||||||
|
assert isinstance(analytics, dict)
|
||||||
|
assert "total_measurements" in analytics
|
||||||
|
assert "average_volume" in analytics
|
||||||
|
|
||||||
|
async def test_create_traffic_job(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test creating traffic collection job"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
job_data = {
|
||||||
|
"job_type": "current",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "pending",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
|
||||||
|
job = await repository.create_traffic_job(job_data)
|
||||||
|
|
||||||
|
assert job is not None
|
||||||
|
assert job.job_type == "current"
|
||||||
|
assert job.status == "pending"
|
||||||
|
|
||||||
|
async def test_update_traffic_job(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test updating traffic job"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create job first
|
||||||
|
job_data = {
|
||||||
|
"job_type": "current",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "pending",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
job = await repository.create_traffic_job(job_data)
|
||||||
|
|
||||||
|
# Update job
|
||||||
|
update_data = {
|
||||||
|
"status": "completed",
|
||||||
|
"completed_at": datetime.utcnow(),
|
||||||
|
"success_count": 10
|
||||||
|
}
|
||||||
|
|
||||||
|
success = await repository.update_traffic_job(job.id, update_data)
|
||||||
|
|
||||||
|
assert success is True
|
||||||
|
|
||||||
|
async def test_get_traffic_jobs(self, test_db_session, sample_tenant_id):
|
||||||
|
"""Test getting traffic jobs"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create test job
|
||||||
|
job_data = {
|
||||||
|
"job_type": "historical",
|
||||||
|
"city": "madrid",
|
||||||
|
"status": "completed",
|
||||||
|
"scheduled_at": datetime.utcnow(),
|
||||||
|
"tenant_id": sample_tenant_id
|
||||||
|
}
|
||||||
|
await repository.create_traffic_job(job_data)
|
||||||
|
|
||||||
|
jobs = await repository.get_traffic_jobs()
|
||||||
|
|
||||||
|
assert len(jobs) >= 1
|
||||||
|
assert any(job.job_type == "historical" for job in jobs)
|
||||||
|
|
||||||
|
async def test_bulk_create_performance(self, test_db_session):
|
||||||
|
"""Test bulk create performance"""
|
||||||
|
repository = TrafficRepository(test_db_session)
|
||||||
|
|
||||||
|
# Create large dataset
|
||||||
|
bulk_data = []
|
||||||
|
for i in range(100):
|
||||||
|
data = {
|
||||||
|
"city": "madrid",
|
||||||
|
"location_id": f"PM_TEST_{i:03d}",
|
||||||
|
"date": datetime.now(timezone.utc),
|
||||||
|
"measurement_point_id": f"PM_TEST_{i:03d}",
|
||||||
|
"measurement_point_name": f"Test Point {i}",
|
||||||
|
"measurement_point_type": "TEST",
|
||||||
|
"traffic_volume": 100 + i,
|
||||||
|
"average_speed": 50.0,
|
||||||
|
"congestion_level": "medium",
|
||||||
|
"occupation_percentage": 50.0,
|
||||||
|
"latitude": 40.4168,
|
||||||
|
"longitude": -3.7038,
|
||||||
|
"source": "test"
|
||||||
|
}
|
||||||
|
bulk_data.append(data)
|
||||||
|
|
||||||
|
import time
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
count = await repository.bulk_create_traffic_data(bulk_data)
|
||||||
|
|
||||||
|
end_time = time.time()
|
||||||
|
execution_time = end_time - start_time
|
||||||
|
|
||||||
|
assert count == 100
|
||||||
|
assert execution_time < 3.0 # Should complete in under 3 seconds
|
||||||
445
services/external/tests/unit/test_services.py
vendored
Normal file
445
services/external/tests/unit/test_services.py
vendored
Normal file
@@ -0,0 +1,445 @@
|
|||||||
|
# services/external/tests/unit/test_services.py
|
||||||
|
"""
|
||||||
|
Unit tests for External Service Services
|
||||||
|
"""
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from datetime import datetime, timezone, timedelta
|
||||||
|
from unittest.mock import AsyncMock, patch
|
||||||
|
from uuid import uuid4
|
||||||
|
|
||||||
|
from app.services.weather_service import WeatherService
|
||||||
|
from app.services.traffic_service import TrafficService
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
class TestWeatherService:
|
||||||
|
"""Test Weather Service business logic"""
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def weather_service(self):
|
||||||
|
"""Create weather service instance"""
|
||||||
|
return WeatherService()
|
||||||
|
|
||||||
|
async def test_get_current_weather_from_cache(self, weather_service):
|
||||||
|
"""Test getting current weather from cache"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_weather = AsyncMock()
|
||||||
|
mock_weather.date = datetime.now(timezone.utc) - timedelta(minutes=30) # Fresh data
|
||||||
|
mock_weather.to_dict.return_value = {"temperature": 18.5, "city": "madrid"}
|
||||||
|
mock_repository.get_current_weather.return_value = mock_weather
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
result = await weather_service.get_current_weather("madrid")
|
||||||
|
|
||||||
|
assert result is not None
|
||||||
|
assert result["temperature"] == 18.5
|
||||||
|
assert result["city"] == "madrid"
|
||||||
|
|
||||||
|
async def test_get_current_weather_fetch_from_api(self, weather_service, mock_aemet_response):
|
||||||
|
"""Test getting current weather from API when cache is stale"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
# No cached data or stale data
|
||||||
|
mock_repository.get_current_weather.return_value = None
|
||||||
|
mock_stored = AsyncMock()
|
||||||
|
mock_stored.to_dict.return_value = {"temperature": 20.0}
|
||||||
|
mock_repository.create_weather_data.return_value = mock_stored
|
||||||
|
|
||||||
|
# Mock AEMET client
|
||||||
|
mock_client = AsyncMock()
|
||||||
|
mock_client.get_current_weather.return_value = mock_aemet_response
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
weather_service.aemet_client = mock_client
|
||||||
|
|
||||||
|
result = await weather_service.get_current_weather("madrid")
|
||||||
|
|
||||||
|
assert result is not None
|
||||||
|
assert result["temperature"] == 20.0
|
||||||
|
mock_client.get_current_weather.assert_called_once()
|
||||||
|
|
||||||
|
async def test_get_weather_forecast_from_cache(self, weather_service):
|
||||||
|
"""Test getting weather forecast from cache"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_forecast = [AsyncMock(), AsyncMock()]
|
||||||
|
for item in mock_forecast:
|
||||||
|
item.created_at = datetime.now(timezone.utc) - timedelta(hours=1) # Fresh
|
||||||
|
item.to_dict.return_value = {"temperature": 22.0}
|
||||||
|
mock_repository.get_weather_forecast.return_value = mock_forecast
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
result = await weather_service.get_weather_forecast("madrid", 7)
|
||||||
|
|
||||||
|
assert len(result) == 2
|
||||||
|
assert all(item["temperature"] == 22.0 for item in result)
|
||||||
|
|
||||||
|
async def test_get_weather_forecast_fetch_from_api(self, weather_service):
|
||||||
|
"""Test getting weather forecast from API when cache is stale"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
# No cached data
|
||||||
|
mock_repository.get_weather_forecast.return_value = []
|
||||||
|
mock_stored = AsyncMock()
|
||||||
|
mock_stored.to_dict.return_value = {"temperature": 25.0}
|
||||||
|
mock_repository.create_weather_data.return_value = mock_stored
|
||||||
|
|
||||||
|
# Mock AEMET client
|
||||||
|
mock_client = AsyncMock()
|
||||||
|
mock_client.get_forecast.return_value = [
|
||||||
|
{"forecast_date": datetime.now(), "temperature": 25.0}
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
weather_service.aemet_client = mock_client
|
||||||
|
|
||||||
|
result = await weather_service.get_weather_forecast("madrid", 7)
|
||||||
|
|
||||||
|
assert len(result) == 1
|
||||||
|
assert result[0]["temperature"] == 25.0
|
||||||
|
mock_client.get_forecast.assert_called_once()
|
||||||
|
|
||||||
|
async def test_get_historical_weather(self, weather_service, sample_tenant_id):
|
||||||
|
"""Test getting historical weather data"""
|
||||||
|
start_date = datetime.now(timezone.utc) - timedelta(days=7)
|
||||||
|
end_date = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_historical = [AsyncMock(), AsyncMock()]
|
||||||
|
for item in mock_historical:
|
||||||
|
item.to_dict.return_value = {"temperature": 18.0}
|
||||||
|
mock_repository.get_historical_weather.return_value = mock_historical
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
result = await weather_service.get_historical_weather(
|
||||||
|
"madrid", start_date, end_date, sample_tenant_id
|
||||||
|
)
|
||||||
|
|
||||||
|
assert len(result) == 2
|
||||||
|
assert all(item["temperature"] == 18.0 for item in result)
|
||||||
|
|
||||||
|
async def test_get_weather_stations(self, weather_service):
|
||||||
|
"""Test getting weather stations"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_stations = [AsyncMock()]
|
||||||
|
mock_stations[0].to_dict.return_value = {"station_id": "TEST_001"}
|
||||||
|
mock_repository.get_weather_stations.return_value = mock_stations
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
result = await weather_service.get_weather_stations("madrid")
|
||||||
|
|
||||||
|
assert len(result) == 1
|
||||||
|
assert result[0]["station_id"] == "TEST_001"
|
||||||
|
|
||||||
|
async def test_trigger_weather_collection(self, weather_service, sample_tenant_id):
|
||||||
|
"""Test triggering weather data collection"""
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_job = AsyncMock()
|
||||||
|
mock_job.id = uuid4()
|
||||||
|
mock_job.to_dict.return_value = {"id": str(mock_job.id), "status": "pending"}
|
||||||
|
mock_repository.create_weather_job.return_value = mock_job
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
result = await weather_service.trigger_weather_collection(
|
||||||
|
"madrid", "current", sample_tenant_id
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result["status"] == "pending"
|
||||||
|
mock_repository.create_weather_job.assert_called_once()
|
||||||
|
|
||||||
|
async def test_process_weather_collection_job(self, weather_service):
|
||||||
|
"""Test processing weather collection job"""
|
||||||
|
job_id = uuid4()
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
|
||||||
|
# Mock job
|
||||||
|
mock_job = AsyncMock()
|
||||||
|
mock_job.id = job_id
|
||||||
|
mock_job.job_type = "current"
|
||||||
|
mock_job.city = "madrid"
|
||||||
|
|
||||||
|
mock_repository.get_weather_jobs.return_value = [mock_job]
|
||||||
|
mock_repository.update_weather_job.return_value = True
|
||||||
|
|
||||||
|
# Mock updated job after completion
|
||||||
|
mock_updated_job = AsyncMock()
|
||||||
|
mock_updated_job.to_dict.return_value = {"id": str(job_id), "status": "completed"}
|
||||||
|
|
||||||
|
# Mock methods for different calls
|
||||||
|
def mock_get_jobs_side_effect():
|
||||||
|
return [mock_updated_job] # Return completed job
|
||||||
|
|
||||||
|
mock_repository.get_weather_jobs.side_effect = [
|
||||||
|
[mock_job], # First call returns pending job
|
||||||
|
[mock_updated_job] # Second call returns completed job
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch('app.services.weather_service.WeatherRepository', return_value=mock_repository):
|
||||||
|
with patch.object(weather_service, '_collect_current_weather', return_value=1):
|
||||||
|
result = await weather_service.process_weather_collection_job(job_id)
|
||||||
|
|
||||||
|
assert result["status"] == "completed"
|
||||||
|
|
||||||
|
async def test_map_weather_condition(self, weather_service):
|
||||||
|
"""Test weather condition mapping"""
|
||||||
|
test_cases = [
|
||||||
|
("Soleado", "clear"),
|
||||||
|
("Nublado", "cloudy"),
|
||||||
|
("Parcialmente nublado", "partly_cloudy"),
|
||||||
|
("Lluvioso", "rainy"),
|
||||||
|
("Nevando", "snowy"),
|
||||||
|
("Tormenta", "stormy"),
|
||||||
|
("Desconocido", "unknown")
|
||||||
|
]
|
||||||
|
|
||||||
|
for description, expected in test_cases:
|
||||||
|
result = weather_service._map_weather_condition(description)
|
||||||
|
assert result == expected
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
class TestTrafficService:
|
||||||
|
"""Test Traffic Service business logic"""
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def traffic_service(self):
|
||||||
|
"""Create traffic service instance"""
|
||||||
|
return TrafficService()
|
||||||
|
|
||||||
|
async def test_get_current_traffic_from_cache(self, traffic_service):
|
||||||
|
"""Test getting current traffic from cache"""
|
||||||
|
with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_traffic = [AsyncMock()]
|
||||||
|
mock_traffic[0].date = datetime.now(timezone.utc) - timedelta(minutes=5) # Fresh
|
||||||
|
mock_traffic[0].to_dict.return_value = {"traffic_volume": 850}
|
||||||
|
mock_repository.get_current_traffic.return_value = mock_traffic
|
||||||
|
|
||||||
|
with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
|
||||||
|
result = await traffic_service.get_current_traffic("madrid")
|
||||||
|
|
||||||
|
assert len(result) == 1
|
||||||
|
assert result[0]["traffic_volume"] == 850
|
||||||
|
|
||||||
|
async def test_get_current_traffic_fetch_from_api(self, traffic_service, mock_madrid_traffic_xml):
|
||||||
|
"""Test getting current traffic from API when cache is stale"""
|
||||||
|
with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
# No cached data
|
||||||
|
mock_repository.get_current_traffic.return_value = []
|
||||||
|
mock_repository.bulk_create_traffic_data.return_value = 2
|
||||||
|
|
||||||
|
# Mock clients
|
||||||
|
mock_client = AsyncMock()
|
||||||
|
mock_client.fetch_current_traffic_xml.return_value = mock_madrid_traffic_xml
|
||||||
|
|
||||||
|
mock_processor = AsyncMock()
|
||||||
|
mock_processor.process_current_traffic_xml.return_value = [
|
||||||
|
{"traffic_volume": 850, "measurement_point_id": "PM_M30_001"},
|
||||||
|
{"traffic_volume": 320, "measurement_point_id": "PM_URB_002"}
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
|
||||||
|
traffic_service.madrid_client = mock_client
|
||||||
|
traffic_service.madrid_processor = mock_processor
|
||||||
|
|
||||||
|
result = await traffic_service.get_current_traffic("madrid")
|
||||||
|
|
||||||
|
assert len(result) == 2
|
||||||
|
assert result[0]["traffic_volume"] == 850
|
||||||
|
mock_client.fetch_current_traffic_xml.assert_called_once()
|
||||||
|
|
||||||
|
async def test_get_historical_traffic(self, traffic_service, sample_tenant_id):
|
||||||
|
"""Test getting historical traffic data"""
|
||||||
|
start_date = datetime.now(timezone.utc) - timedelta(days=7)
|
||||||
|
end_date = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_historical = [AsyncMock(), AsyncMock()]
|
||||||
|
for item in mock_historical:
|
||||||
|
item.to_dict.return_value = {"traffic_volume": 500}
|
||||||
|
mock_repository.get_historical_traffic.return_value = mock_historical
|
||||||
|
|
||||||
|
with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
|
||||||
|
result = await traffic_service.get_historical_traffic(
|
||||||
|
"madrid", start_date, end_date, tenant_id=sample_tenant_id
|
||||||
|
)
|
||||||
|
|
||||||
|
assert len(result) == 2
|
||||||
|
assert all(item["traffic_volume"] == 500 for item in result)
|
||||||
|
|
||||||
|
async def test_get_measurement_points(self, traffic_service):
|
||||||
|
"""Test getting measurement points"""
|
||||||
|
with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
|
||||||
|
mock_db = AsyncMock()
|
||||||
|
mock_get_db.return_value.__aenter__.return_value = mock_db
|
||||||
|
|
||||||
|
mock_repository = AsyncMock()
|
||||||
|
mock_points = [AsyncMock()]
|
||||||
|
mock_points[0].to_dict.return_value = {"point_id": "PM_TEST_001"}
|
||||||
|
mock_repository.get_measurement_points.return_value = mock_points
|
||||||
|
|
||||||
|
with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
|
||||||
|
result = await traffic_service.get_measurement_points("madrid")
|
||||||
|
|
||||||
|
assert len(result) == 1
|
||||||
|
assert result[0]["point_id"] == "PM_TEST_001"

    async def test_get_traffic_analytics(self, traffic_service):
        """Test getting traffic analytics"""
        start_date = datetime.now(timezone.utc) - timedelta(days=30)
        end_date = datetime.now(timezone.utc)

        with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
            mock_db = AsyncMock()
            mock_get_db.return_value.__aenter__.return_value = mock_db

            mock_repository = AsyncMock()
            mock_analytics = {
                "total_measurements": 1000,
                "average_volume": 650.5,
                "peak_hour": "08:00"
            }
            mock_repository.get_traffic_analytics.return_value = mock_analytics

            with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
                result = await traffic_service.get_traffic_analytics(
                    "madrid", start_date, end_date
                )

                assert result["total_measurements"] == 1000
                assert result["average_volume"] == 650.5
                assert "generated_at" in result

    async def test_trigger_traffic_collection(self, traffic_service, sample_tenant_id):
        """Test triggering traffic data collection"""
        with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
            mock_db = AsyncMock()
            mock_get_db.return_value.__aenter__.return_value = mock_db

            mock_repository = AsyncMock()
            mock_job = AsyncMock()
            mock_job.id = uuid4()
            mock_job.to_dict.return_value = {"id": str(mock_job.id), "status": "pending"}
            mock_repository.create_traffic_job.return_value = mock_job

            with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
                result = await traffic_service.trigger_traffic_collection(
                    "madrid", "current", user_id=sample_tenant_id
                )

                assert result["status"] == "pending"
                mock_repository.create_traffic_job.assert_called_once()
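Triggering is asserted to do nothing more than enqueue a job that starts out `pending`. A minimal sketch of that step follows; the `create_traffic_job` keyword arguments are assumptions for illustration, since the test only verifies that the call happens and that the returned dict carries the pending status:

```python
from typing import Any, Dict
from uuid import UUID


async def trigger_traffic_collection_sketch(
    repository, city: str, job_type: str, user_id: UUID
) -> Dict[str, Any]:
    # Collection runs asynchronously, so triggering only records a pending job.
    job = await repository.create_traffic_job(
        city=city, job_type=job_type, requested_by=user_id, status="pending"
    )
    return job.to_dict()
```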

    async def test_process_traffic_collection_job(self, traffic_service):
        """Test processing traffic collection job"""
        job_id = uuid4()

        with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
            mock_db = AsyncMock()
            mock_get_db.return_value.__aenter__.return_value = mock_db

            mock_repository = AsyncMock()

            # Mock job
            mock_job = AsyncMock()
            mock_job.id = job_id
            mock_job.job_type = "current"
            mock_job.city = "madrid"
            mock_job.location_pattern = None

            mock_repository.get_traffic_jobs.return_value = [mock_job]
            mock_repository.update_traffic_job.return_value = True

            # Mock updated job after completion
            mock_updated_job = AsyncMock()
            mock_updated_job.to_dict.return_value = {"id": str(job_id), "status": "completed"}

            mock_repository.get_traffic_jobs.side_effect = [
                [mock_job],         # First call returns the pending job
                [mock_updated_job]  # Second call returns the completed job
            ]

            with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
                with patch.object(traffic_service, '_collect_current_traffic', return_value=125):
                    result = await traffic_service.process_traffic_collection_job(job_id)

                    assert result["status"] == "completed"
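The two-element `side_effect` implies the job processor reads the job twice: once to pick up the pending job and once after completion to return the refreshed record, with the collection work dispatched by `job_type` in between. A hedged sketch of that loop body, assuming a collector lookup and an `update_traffic_job` signature that the test does not confirm:

```python
from typing import Any, Awaitable, Callable, Dict, Mapping, Optional
from uuid import UUID


async def process_traffic_collection_job_sketch(
    repository,
    collectors: Mapping[str, Callable[[str, Optional[str]], Awaitable[int]]],
    job_id: UUID,
) -> Dict[str, Any]:
    # Load the pending job (first get_traffic_jobs call in the test's side_effect).
    job = (await repository.get_traffic_jobs(job_id=job_id))[0]

    # Dispatch by job type; "current" maps to the patched _collect_current_traffic,
    # which the test makes return 125 stored records.
    records_stored = await collectors[job.job_type](job.city, job.location_pattern)

    # Persist the outcome, then return the refreshed job (second get_traffic_jobs call).
    await repository.update_traffic_job(job.id, status="completed", records_stored=records_stored)
    updated = (await repository.get_traffic_jobs(job_id=job_id))[0]
    return updated.to_dict()
```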

    async def test_is_traffic_data_fresh(self, traffic_service):
        """Test traffic data freshness check"""
        # Fresh data (5 minutes old)
        fresh_data = [AsyncMock()]
        fresh_data[0].date = datetime.utcnow() - timedelta(minutes=5)

        result = traffic_service._is_traffic_data_fresh(fresh_data)
        assert result is True

        # Stale data (15 minutes old)
        stale_data = [AsyncMock()]
        stale_data[0].date = datetime.utcnow() - timedelta(minutes=15)

        result = traffic_service._is_traffic_data_fresh(stale_data)
        assert result is False

        # Empty data
        result = traffic_service._is_traffic_data_fresh([])
        assert result is False
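The three cases pin down the freshness predicate: 5-minute-old data is fresh, 15-minute-old data is stale, and an empty list is stale. A minimal sketch consistent with those expectations, assuming a 10-minute cutoff and naive UTC timestamps (neither is stated by the test itself):

```python
from datetime import datetime, timedelta
from typing import Sequence


def is_traffic_data_fresh_sketch(
    traffic_data: Sequence, max_age: timedelta = timedelta(minutes=10)
) -> bool:
    # An empty result set is never fresh, so the caller falls back to the API.
    if not traffic_data:
        return False
    # Judge freshness from the newest record's naive UTC timestamp; the
    # 10-minute cutoff is an assumption that fits the 5-minute-fresh /
    # 15-minute-stale expectations asserted above.
    newest = max(record.date for record in traffic_data)
    return datetime.utcnow() - newest < max_age
```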

    async def test_collect_current_traffic(self, traffic_service):
        """Test current traffic collection"""
        with patch('app.services.traffic_service.get_db_transaction') as mock_get_db:
            mock_db = AsyncMock()
            mock_get_db.return_value.__aenter__.return_value = mock_db

            mock_repository = AsyncMock()
            mock_repository.bulk_create_traffic_data.return_value = 10

            with patch('app.services.traffic_service.TrafficRepository', return_value=mock_repository):
                with patch.object(traffic_service, '_fetch_current_traffic_from_api', return_value=[{} for _ in range(10)]):
                    result = await traffic_service._collect_current_traffic("madrid", None)

                    assert result == 10
Some files were not shown because too many files have changed in this diff.