A complete end-to-end demonstration of a governed, real-time data product for 3PL logistics operations
This repository contains a production-ready reference implementation demonstrating how to build a modern data product on Microsoft Fabric with:
- ✅ Real-Time Data Product with Fabric Real-Time Intelligence (Eventhouse)
- ✅ OneLake Security - Centralized Row-Level Security across all 6 Fabric engines
- ✅ Microsoft Purview Unified Catalog - Complete data governance and quality monitoring
- ✅ GraphQL API via Azure API Management with OAuth2 authentication
- ✅ Live Demo Application - Visual interface showing partner-specific data filtering
- ✅ SAP IDoc Simulator - Generate realistic 3PL logistics data
# 1. Clone the repository
git clone https://github.com/flthibau/Fabric-SAP-Idocs.git
cd Fabric-SAP-Idocs
# 2. Launch the demo application
cd demo-app
.\start-demo.ps1
# 3. Get an access token (example with FedEx carrier)
.\get-token.example.ps1 -ServicePrincipal fedex
# 4. Open http://localhost:8000 and paste the token

Full Setup Guide: See `demo-app/QUICKSTART.md`
A manufacturing company outsources logistics to external partners (carriers, warehouses, customers) and needs to expose real-time operational data via API while ensuring each partner sees only their own data.
- 3 Partner Types: Carriers (e.g., FedEx), Warehouse Partners (e.g., WH-EAST), Customers (e.g., ACME Corp)
- 5 Data Entities: Orders, Shipments, Deliveries, Warehouse Movements, Invoices
- Security Requirement: Partners must only see data they're authorized to access
Real-Time Data Product powered by:
- Microsoft Fabric Real-Time Intelligence (Eventhouse) for sub-second streaming
- OneLake Security for centralized Row-Level Security across 6 engines
- Microsoft Purview for data governance and quality monitoring
- GraphQL API exposed through Azure API Management

Complete Business Case: `demo-app/BUSINESS_SCENARIO.md`
SAP ERP System
      ↓
Azure Event Hubs (idoc-events)
      ↓
┌──────────────────────────────────────────────────────
│  MICROSOFT FABRIC REAL-TIME INTELLIGENCE
│    Eventhouse (KQL Database)
│      - Sub-second ingestion
│      - Streaming transformations
│      - Real-time analytics
└──────────────────────────────────────────────────────
      ↓
┌──────────────────────────────────────────────────────
│  MICROSOFT FABRIC LAKEHOUSE
│    OneLake Storage (Delta Lake)
│      - Bronze: Raw IDocs
│      - Silver: Normalized tables
│      - Gold: Business views (materialized)
└──────────────────────────────────────────────────────
      ↓
┌──────────────────────────────────────────────────────
│  ONELAKE SECURITY LAYER (Centralized RLS)
│    ✓ Real-Time Intelligence (KQL)
│    ✓ Data Engineering (Spark)
│    ✓ Data Warehouse (SQL)
│    ✓ Power BI (Direct Lake)
│    ✓ GraphQL API (THIS PROJECT)
│    ✓ OneLake API
└──────────────────────────────────────────────────────
      ↓
Fabric GraphQL API (partner_logistics_api)
      ↓
Azure API Management (apim-3pl-flt)
  - OAuth2 validation
  - CORS policy
  - Rate limiting
      ↓
┌──────────────────────────────────────────────────────
│  MICROSOFT PURVIEW UNIFIED CATALOG
│    - Data Product registration
│    - Data quality monitoring
│    - Lineage tracking
│    - Business glossary
└──────────────────────────────────────────────────────
      ↓
Partner Applications
  - FedEx Carrier Portal
  - Warehouse WH-EAST Dashboard
  - ACME Corp Customer Portal
Technical Architecture: `demo-app/API_TECHNICAL_SETUP.md`
This project demonstrates how to build a governed, real-time data product on Microsoft Fabric with enterprise-grade security and quality controls. It showcases the complete journey from SAP IDoc ingestion to partner API consumption.
Live demonstration of Row-Level Security in action
- Visual Interface: 4-tab application showing partner-specific data filtering
- OAuth2 Authentication: Service Principal token acquisition scripts
- Documentation:
  - `BUSINESS_SCENARIO.md` - Professional business case (LinkedIn-ready)
  - `API_TECHNICAL_SETUP.md` - Complete technical guide
  - `QUICKSTART.md` - Get started in 5 minutes
Technologies: HTML5, JavaScript, Python HTTP Server
Generate realistic 3PL logistics data
- 5 IDoc Types: ORDERS, SHPMNT, DESADV, WHSCON, INVOIC
- Configurable Scenarios: Warehouse count, customer count, carrier count
- Azure Event Hubs Integration: Direct ingestion to Fabric Eventstream
- Documentation: Complete setup and usage guide
Technologies: Python 3.11+, Azure SDK, YAML configuration
cd simulator
python main.py --count 100  # Generate 100 IDocs
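The simulator handles Event Hubs publishing for you, but as a rough sketch of what that integration involves, sending a single IDoc-style payload to the `idoc-events` hub with the `azure-eventhub` SDK could look like the following. The connection string and payload fields are illustrative placeholders, not the simulator's actual code.

```python
import json
from datetime import datetime, timezone

from azure.eventhub import EventHubProducerClient, EventData

# Placeholder values - replace with your Event Hubs namespace connection string.
EVENT_HUB_CONNECTION_STR = "<event-hub-namespace-connection-string>"
EVENT_HUB_NAME = "idoc-events"


def send_sample_idoc() -> None:
    """Send one SHPMNT-style IDoc payload to the Event Hub feeding the Eventstream."""
    idoc = {
        "idoc_type": "SHPMNT",              # illustrative field names
        "shipment_id": "SHP-000123",
        "carrier_id": "FEDEX",
        "weight_kg": 42.5,
        "ship_date": datetime.now(timezone.utc).isoformat(),
    }
    producer = EventHubProducerClient.from_connection_string(
        EVENT_HUB_CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(idoc)))
        producer.send_batch(batch)


if __name__ == "__main__":
    send_sample_idoc()
```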
Microsoft Fabric workspace setup and data transformations

- Eventstream: Real-Time Intelligence ingestion configuration
- Data Engineering: Spark notebooks for Bronze/Silver/Gold layers
- Warehouse: SQL schemas and materialized views
- OneLake Security: Row-Level Security configuration guides
Key Files:
- `warehouse/security/ONELAKE_RLS_CONFIGURATION_GUIDE.md` - Complete RLS setup
- `data-engineering/notebooks/gold_layer_orders_summary.py` - Gold layer transformations
GraphQL and APIM configuration
- GraphQL Schema: Partner-filtered data access (`partner-api.graphql`)
- APIM Policies:
  - CORS configuration
  - OAuth2 validation
  - Rate limiting
  - GraphQL passthrough
- PowerShell Scripts: Service Principal setup, APIM deployment, testing
Endpoints:
- GraphQL: `https://apim-3pl-flt.azure-api.net/graphql`
- REST (auto-generated): `/rest/shipments`, `/rest/orders`, etc.
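As an illustration of consuming the API, the following Python sketch posts a GraphQL query through the APIM gateway with a partner bearer token. The query shape and field names are assumptions for demonstration, not the published schema; see `api/GRAPHQL_QUERIES_REFERENCE.md` for the actual queries.

```python
import requests

GRAPHQL_URL = "https://apim-3pl-flt.azure-api.net/graphql"


def query_shipments(access_token: str) -> dict:
    """Run a partner-scoped shipments query; OneLake RLS limits results to the caller's data."""
    # Illustrative query - field names are assumptions, not the real schema.
    query = """
    query {
      shipments(first: 10) {
        items {
          shipment_id
          carrier_id
          ship_date
          weight_kg
        }
      }
    }
    """
    response = requests.post(
        GRAPHQL_URL,
        json={"query": query},
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```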
Microsoft Purview integration
- Data Product Registration: Purview catalog integration
- Data Quality Rules: Automated quality monitoring
- KQL Queries: Quality validation dashboards
- Python Scripts: Quality rule deployment
Technologies: Microsoft Purview, KQL, Python
Infrastructure as Code templates
- Bicep Templates: Azure resource deployment
- PowerShell Scripts: APIM setup, Service Principal creation, REST API deployment
- Configuration Files: Resource definitions and policies
| Layer | Technology | Purpose |
|---|---|---|
| Ingestion | Azure Event Hubs | SAP IDoc streaming |
| Real-Time Processing | Fabric Real-Time Intelligence (Eventhouse) | Sub-second analytics |
| Storage | OneLake (Delta Lake) | Unified data lake |
| Transformation | Fabric Data Engineering (Spark) | ETL pipelines |
| Analytics | Fabric Data Warehouse (SQL) | T-SQL queries |
| API | Fabric GraphQL + Azure APIM | Data product exposure |
| Security | OneLake Security + Azure AD | Centralized RLS |
| Governance | Microsoft Purview | Data catalog & quality |
| BI | Power BI Direct Lake | Real-time dashboards |
Storage-Layer Row-Level Security enforced across all 6 Fabric engines:
-- Example RLS rule applied at OneLake storage layer
CREATE FUNCTION dbo.PartnerSecurityPredicate(@partner_id NVARCHAR(50))
RETURNS TABLE
WITH SCHEMABINDING
AS RETURN (
SELECT 1 AS AccessGranted
WHERE @partner_id = CAST(SESSION_CONTEXT(N'PartnerID') AS NVARCHAR(50))
)

Benefits:
- ✅ Centralized: One RLS definition, enforced everywhere
- ✅ Multi-Engine: Works across KQL, Spark, SQL, Power BI, GraphQL, OneLake API
- ✅ Identity-Aware: Leverages Azure AD Service Principal claims
- ✅ Impossible to Bypass: Enforced at the storage layer, not the application layer
- Partner Application → acquires an OAuth2 token from Azure AD
- Token Claims → include the Service Principal ObjectId
- APIM Gateway → validates the token and extracts claims
- GraphQL API → sets session context with the partner identity
- OneLake Security → filters data based on RLS rules
- Partner Receives → only authorized data
Complete Security Guide: `fabric/warehouse/security/ONELAKE_RLS_CONFIGURATION_GUIDE.md`
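For reference, here is a hedged Python sketch of the first step of the flow above: acquiring an OAuth2 token for a partner Service Principal with the client-credentials grant via MSAL. The tenant, client, and scope values are placeholders (the scope shown is an assumption about the target Fabric resource); the repository's `get-token.ps1` PowerShell script remains the reference implementation.

```python
import msal

# Placeholder values - substitute your tenant and the partner Service Principal.
TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<partner-service-principal-client-id>"      # e.g. the FedEx SP
CLIENT_SECRET = "<partner-service-principal-secret>"
# Assumed scope for the Fabric/Power BI resource; adjust to match your APIM/GraphQL audience.
SCOPE = ["https://analysis.windows.net/powerbi/api/.default"]


def get_partner_token() -> str:
    """Return a bearer token carrying the partner Service Principal's identity claims."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    result = app.acquire_token_for_client(scopes=SCOPE)
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "Token acquisition failed"))
    return result["access_token"]
```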
Data Product Registration:
- Product Name: `SAP-3PL-Logistics-Real-Time-Product`
- Domain: Logistics & Supply Chain
- Owner: Data Product Team
- SLA: < 5 minutes latency, 99.9% availability
Data Quality Monitoring (6 dimensions):
- Completeness: Required fields populated
- Accuracy: Valid reference data
- Consistency: Cross-entity relationships maintained
- Timeliness: Data freshness SLA compliance
- Validity: Format and range validations
- Uniqueness: No duplicate key violations
Automated Quality Checks:
// Example quality check running in Purview
idoc_shipments_gold
| summarize
TotalRows = count(),
MissingCarrier = countif(isempty(carrier_id)),
FutureDates = countif(ship_date > now())
| extend
CompletenessScore = 100.0 * (1 - todouble(MissingCarrier) / TotalRows),
ValidityScore = 100.0 * (1 - todouble(FutureDates) / TotalRows)

Governance Setup: `governance/PURVIEW_DATA_QUALITY_SETUP.md`
Sub-Second Streaming Analytics:
- Ingestion latency: < 1 second
- Query performance: Sub-second for aggregations
- Retention: Configurable (hot/cold tiers)
Example KQL Queries:
// Real-time shipment tracking
idoc_shipments_raw
| where ingestion_time() > ago(5m)
| where carrier_id == "FEDEX"
| summarize
ShipmentCount = count(),
TotalWeight = sum(weight_kg)
by bin(ship_date, 1h)
| render timechart

Use Cases:
- Live operational dashboards
- Real-time alerting
- Streaming anomaly detection
- Interactive exploration
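To run queries like the one above from code rather than the Eventhouse portal, a minimal sketch using the `azure-kusto-data` Python package could look like this. The cluster URI is a placeholder, and Azure CLI authentication is only one of several supported options.

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder query URI of the Eventhouse KQL database.
CLUSTER_URI = "https://<your-eventhouse-query-uri>.kusto.fabric.microsoft.com"
DATABASE = "kql-3pl-logistics"

# Same shipment-tracking query as above, minus the client-side `render` directive.
QUERY = """
idoc_shipments_raw
| where ingestion_time() > ago(5m)
| where carrier_id == "FEDEX"
| summarize ShipmentCount = count(), TotalWeight = sum(weight_kg) by bin(ship_date, 1h)
"""


def run_query() -> None:
    kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER_URI)
    client = KustoClient(kcsb)
    response = client.execute(DATABASE, QUERY)
    for row in response.primary_results[0]:
        print(row["ship_date"], row["ShipmentCount"], row["TotalWeight"])


if __name__ == "__main__":
    run_query()
```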
- Azure Subscription with Microsoft Fabric enabled
- Azure AD tenant with permission to create Service Principals
- Fabric Workspace with appropriate permissions
- PowerShell 7+ or Azure CLI
- Python 3.11+ (for simulator)
cd infrastructure/bicep
# Deploy Event Hub
az deployment group create \
--resource-group rg-fabric-sap-idocs \
--template-file event-hub.bicep
# Deploy APIM (if not existing)
az deployment group create \
--resource-group rg-fabric-sap-idocs \
--template-file apim.bicep

cd api/scripts
# Create 3 Service Principals (FedEx, Warehouse, ACME)
.\create-partner-apps.ps1
# Grant Fabric workspace access
.\grant-sp-workspace-access.ps1

Eventstream:
- Create Eventstream: `idoc-ingestion-stream`
- Source: Azure Event Hubs (`eh-idoc-flt8076/idoc-events`)
- Destination: Eventhouse `kql-3pl-logistics`
Lakehouse:
- Create Lakehouse: `lakehouse_3pl`
- Run Bronze/Silver/Gold transformation notebooks
- Create materialized views
OneLake Security:
-- Run in Fabric Warehouse
-- See fabric/warehouse/security/ONELAKE_RLS_CONFIGURATION_GUIDE.md
CREATE SECURITY POLICY PartnerAccessPolicy
ADD FILTER PREDICATE dbo.PartnerSecurityPredicate(partner_id)
ON gold.orders, gold.shipments, gold.invoices
WITH (STATE = ON);

cd fabric/scripts
# Enable GraphQL on Lakehouse
.\enable-graphql-api.ps1
# Deploy API definition
.\deploy-graphql-api.ps1

cd api/scripts
# Deploy APIM policies (CORS, OAuth, etc.)
.\configure-and-test-apim.ps1
# Test REST API endpoints
.\test-rest-apis.ps1

cd demo-app
# Copy example file and add your secrets
Copy-Item get-token.example.ps1 get-token.ps1
# Edit get-token.ps1 with real Service Principal secrets
# Start demo server
.\start-demo.ps1
# In another terminal, get a token
.\get-token.ps1 -ServicePrincipal fedex
# Open http://localhost:8000 and paste the token

cd governance/purview
# Register data product in Purview
python create_data_quality_rules.py
# Deploy quality monitoring dashboard
# Upload data_quality_monitoring_dashboard.kql to Purview

Detailed Setup Guides: See the individual README files in each folder
# 1. Generate test IDocs
cd simulator
python main.py --count 50
# 2. Verify Eventstream ingestion
# Check Fabric Eventstream monitoring
# 3. Query Eventhouse (Real-Time Intelligence)
# Run KQL query in Eventhouse portal
# 4. Test GraphQL API
cd ../api/scripts
.\test-graphql-rls.ps1
# 5. Test REST API via APIM
.\test-rest-apis.ps1
# 6. Verify RLS filtering
.\test-fedex-only.ps1   # Should only see FedEx shipments

// Run in Eventhouse or Purview
idoc_shipments_gold
| extend QualityCheck = case(
isempty(carrier_id), "Missing Carrier",
isempty(tracking_number), "Missing Tracking",
weight_kg <= 0, "Invalid Weight",
"OK"
)
| summarize count() by QualityCheck

- `demo-app/BUSINESS_SCENARIO.md` - LinkedIn-ready business case
- `demo-app/API_TECHNICAL_SETUP.md` - Technical architecture deep-dive
- `PROJECT_STRUCTURE.md` - Repository organization
- `demo-app/QUICKSTART.md` - Get the demo running in 5 minutes
- `simulator/README.md` - IDoc simulator setup
- `fabric/GRAPHQL_DEPLOYMENT_GUIDE.md` - GraphQL API deployment
- `fabric/warehouse/security/ONELAKE_RLS_CONFIGURATION_GUIDE.md` - OneLake Security setup
- `api/GRAPHQL_QUERIES_REFERENCE.md` - GraphQL schema and examples
- `api/APIM_CONFIGURATION.md` - APIM policies and configuration
- `governance/PURVIEW_DATA_QUALITY_SETUP.md` - Purview integration
- `docs/governance-guide.md` - Data governance best practices
- Sub-second latency from SAP to API using Eventhouse
- Streaming analytics with KQL for operational insights
- Hot path for live dashboards and alerting
- Single RLS definition enforced across 6 Fabric engines
- Storage-layer security impossible to bypass
- Identity-aware filtering via Azure AD integration
- Purview Unified Catalog for data product registration
- Automated quality monitoring with 6 quality dimensions
- Full lineage tracking from SAP to API
- GraphQL-first API for flexible data access
- APIM gateway for enterprise-grade API management
- OAuth2 authentication with Service Principal claims
- Live application showing real RLS filtering
- Complete documentation for LinkedIn sharing
- Infrastructure as Code for repeatable deployment
Contributions are welcome! Please follow these guidelines:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is provided as-is for educational and demonstration purposes.
Florent Thibault
Microsoft - Data & AI Specialist
Contact: [Your Contact Info]
LinkedIn: [Your LinkedIn]
- Microsoft Fabric Team - Real-Time Intelligence capabilities
- Azure APIM Team - GraphQL support
- Microsoft Purview Team - Data governance platform
- Community Contributors - Testing and feedback
- v1.0.0 (October 2025)
  - ✅ Complete demo application with RLS
  - ✅ Real-Time Intelligence integration
  - ✅ OneLake Security implementation
  - ✅ Purview Unified Catalog integration
  - ✅ GraphQL API via APIM
  - ✅ SAP IDoc simulator
  - ✅ Comprehensive documentation
⭐ If you find this project useful, please star the repository!

Repository: https://github.com/flthibau/Fabric-SAP-Idocs