SALES FORECAST AI
DATA LATENCY & FRAGILITY
"C-Level executives were making decisions based on 60-day-old data. The previous process involved manual extraction from ERPs into fragile Excel spreadsheets—a 2-month cycle prone to human error and manipulation. The goal was to eliminate the 'Spreadsheet Fatigue' and provide a single source of truth."
DECOUPLED PREDICTION PIPELINE
[ CLIENT ERP ]
      |
      v
[ NORMALIZATION WORKER ]   (Python ETL)
      |
      v
[ AWS RDS (Unified Data) ] <--> [ ML ENGINE (ECS) ]
      |                         (Reads History, Writes Predictions)
      v
[ PYTHON BFF ]
      |
      v
[ REACT FRONTEND ]
The architecture is designed for read performance. The ML Engine runs as an isolated service on ECS, ingesting historical data and persisting predictions back into RDS. The Python BFF serves this pre-computed data, so dashboards never wait on model inference at request time.
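The read path is simple enough to sketch. Below is a minimal illustration only, assuming a FastAPI BFF and a hypothetical sales_forecasts table in RDS; the table, columns, and connection string are placeholders rather than the project's real schema.

# Illustrative sketch only: table, columns and connection string are assumptions.
from fastapi import FastAPI
from sqlalchemy import create_engine, text

app = FastAPI()
engine = create_engine("postgresql+psycopg2://bff_readonly@rds-host/unified_data")

@app.get("/forecasts/{region}")
def get_forecasts(region: str):
    """Serve pre-computed predictions; no model inference happens at request time."""
    query = text(
        "SELECT period, predicted_revenue, lower_bound, upper_bound "
        "FROM sales_forecasts WHERE region = :region ORDER BY period"
    )
    with engine.connect() as conn:
        rows = conn.execute(query, {"region": region}).mappings().all()
    return {"region": region, "forecasts": [dict(r) for r in rows]}

Returning lower and upper bounds next to the point forecast is also one way the confidence intervals described below could be fed to the React frontend.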
NATURAL LANGUAGE INTERFACE
To democratize data access for non-technical directors, I developed a Text-to-SQL layer using LLMs. This allows executives to query complex datasets using plain Portuguese.
1. User asks a question in natural language
2. LLM maps the question onto the Schema Metadata
3. System executes safe, read-only SQL
// USER_INPUT
"Qual foi o faturamento da região Sul em Janeiro?"
// SYSTEM_PROCESS
analyzing_schema...
mapping_entities [region, date, revenue]...
// GENERATED_SQL
SELECT sum(total_revenue)
FROM sales_transactions
WHERE region = 'SUL'
AND month = '01'
AND year = '2025';
// OUTPUT
R$ 353.220,00
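A condensed sketch of how such a layer can be wired, assuming the OpenAI Python client; the prompt, model name, schema snippet, and guard below are illustrative assumptions, not the project's actual implementation.

# Illustrative sketch: prompt, model and schema snippet are assumptions.
import re
from openai import OpenAI

SCHEMA_METADATA = "sales_transactions(region TEXT, month TEXT, year TEXT, total_revenue NUMERIC)"
client = OpenAI()

def question_to_sql(question: str) -> str:
    """Ask the LLM for a single SELECT statement grounded in the schema metadata."""
    prompt = (
        "Translate the business question (asked in Portuguese) into one read-only SELECT statement.\n"
        f"Schema: {SCHEMA_METADATA}\n"
        f"Question: {question}\nSQL:"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

def is_safe_select(sql: str) -> bool:
    """Accept a single read-only SELECT; reject anything that could mutate data."""
    lowered = sql.lower().strip().rstrip(";")
    forbidden = r"\b(insert|update|delete|drop|alter|create|grant|truncate)\b"
    return lowered.startswith("select") and ";" not in lowered and not re.search(forbidden, lowered)

In production the generated query would also run under a read-only database role, so a guard like this is defense in depth rather than the only safeguard.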
TRUST-BASED VISUALIZATION
As UX Lead, I mentored the product design team to prioritize data legibility over aesthetics. We implemented clear confidence intervals and regional heatmaps so that directors could trust the forecasts enough to act on them.
Eliminated the need for a dedicated data extraction squad by automating the entire ETL pipeline.
Moved from semi-annual reports to real-time dashboards, enabling agile goal setting for all branches.
Standardized data model allowed instant integration with new client ERPs without code changes.
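A declarative field mapping is one common way to achieve that kind of no-code onboarding. The sketch below is a guess at the shape of such a mapping, with invented vendor keys and field names, not the project's actual configuration.

# Hypothetical sketch: vendor keys and field names are invented for illustration.
ERP_FIELD_MAPS = {
    "erp_vendor_a": {"vlr_total": "total_revenue", "uf_regiao": "region", "dt_venda": "sale_date"},
    # Onboarding another ERP means adding a mapping entry, not writing new code.
    "erp_vendor_b": {"amount": "total_revenue", "branch_region": "region", "sold_at": "sale_date"},
}

def normalize(raw_row: dict, erp: str) -> dict:
    """Translate one raw ERP row into the unified data model stored in RDS."""
    return {unified: raw_row[source] for source, unified in ERP_FIELD_MAPS[erp].items()}

# normalize({"amount": 1200, "branch_region": "SUL", "sold_at": "2025-01-15"}, "erp_vendor_b")
# -> {"total_revenue": 1200, "region": "SUL", "sale_date": "2025-01-15"}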