A Streamlit-based dashboard for orchestrating data synchronization between Azure Data Manager for Energy (ADME) instances through Microsoft Fabric Lakehouses.
```
┌───────────────────────────────────────────────────────────────────────┐
│                       ADME-Fabric Control Plane                       │
├───────────────────────────────────────────────────────────────────────┤
│                                                                       │
│  ┌───────────────────────────┐       ┌───────────────────────────┐    │
│  │    SOURCE ENVIRONMENT     │       │    TARGET ENVIRONMENT     │    │
│  │                           │       │                           │    │
│  │  ┌─────────────────────┐  │       │  ┌─────────────────────┐  │    │
│  │  │     Source ADME     │  │       │  │     Target ADME     │  │    │
│  │  │    (Azure Data      │  │       │  │    (Azure Data      │  │    │
│  │  │    Manager for      │  │       │  │    Manager for      │  │    │
│  │  │    Energy)          │  │       │  │    Energy)          │  │    │
│  │  └──────────┬──────────┘  │       │  └──────────▲──────────┘  │    │
│  │             │ Search &    │       │             │ Storage     │    │
│  │             │ Storage API │       │             │ API PUT     │    │
│  │             ▼             │       │             │             │    │
│  │  ┌─────────────────────┐  │ Sync  │  ┌─────────────────────┐  │    │
│  │  │  Fabric Lakehouse   ├──┼───────┼─▶│  Fabric Lakehouse   │  │    │
│  │  │   (Delta Tables)    │  │ (EDS) │  │   (Delta Tables)    │  │    │
│  │  └─────────────────────┘  │       │  └─────────────────────┘  │    │
│  │                           │       │                           │    │
│  └───────────────────────────┘       └───────────────────────────┘    │
│                                                                       │
│  ┌─────────────────────────────────────────────────────────────────┐  │
│  │                    CONTROL PLANE (This App)                     │  │
│  │                                                                 │  │
│  │  ┌────────────┐   ┌────────────┐   ┌────────────────────────┐   │  │
│  │  │ Streamlit  │──▶│ SQLite /   │   │ Fabric REST API        │   │  │
│  │  │ Dashboard  │   │ Azure SQL  │   │ (Notebook Triggers)    │   │  │
│  │  │            │   │ Database   │   │                        │   │  │
│  │  └────────────┘   └────────────┘   └────────────────────────┘   │  │
│  │                                                                 │  │
│  └─────────────────────────────────────────────────────────────────┘  │
│                                                                       │
└───────────────────────────────────────────────────────────────────────┘
```
Data Flow:

1. Export: Source ADME → Source Lakehouse (via Export Notebook)
2. Sync: Source Lakehouse → Target Lakehouse (via External Data Share)
3. Ingest: Target Lakehouse → Target ADME (via Ingest Notebook)
View Interactive Mermaid Diagrams:
```mermaid
flowchart TB
    subgraph SourceEnv["Source Environment"]
        ADME_S[("Source ADME<br/>Azure Data Manager<br/>for Energy")]
        subgraph FabricS["Source Fabric Workspace"]
            LH_S[("Source Lakehouse<br/>Delta Tables")]
            NB_Export["Export Notebook<br/>(ADME → Fabric)"]
        end
    end
    subgraph TargetEnv["Target Environment"]
        subgraph FabricT["Target Fabric Workspace"]
            LH_T[("Target Lakehouse<br/>Delta Tables")]
            NB_Ingest["Ingest Notebook<br/>(Fabric → ADME)"]
        end
        ADME_T[("Target ADME<br/>Azure Data Manager<br/>for Energy")]
    end
    subgraph ControlPlane["Control Plane Dashboard"]
        UI["Streamlit UI"]
        DB[("SQLite<br/>Jobs & Connections")]
        API["Fabric REST API<br/>Notebook Trigger"]
    end
    ADME_S -->|"1. Search & Storage API"| NB_Export
    NB_Export -->|"2. Write Delta"| LH_S
    LH_S -->|"3. EDS Sync"| LH_T
    LH_T -->|"4. Read Delta"| NB_Ingest
    NB_Ingest -->|"5. Storage API PUT"| ADME_T
    UI -->|"Configure & Monitor"| DB
    UI -->|"Trigger Jobs"| API
    API -->|"Execute"| NB_Export
    API -->|"Execute"| NB_Ingest
```
```mermaid
sequenceDiagram
    participant User
    participant Dashboard as Control Plane
    participant SrcADME as Source ADME
    participant SrcFabric as Source Fabric
    participant TgtFabric as Target Fabric
    participant TgtADME as Target ADME

    User->>Dashboard: Configure Sync Job
    User->>Dashboard: Click "Sync Now"

    rect rgb(240, 248, 255)
        Note over Dashboard,SrcFabric: Phase 1: Export
        Dashboard->>SrcFabric: Trigger Export Notebook
        SrcFabric->>SrcADME: Search API (get record IDs)
        SrcADME-->>SrcFabric: Record IDs + Cursor
        SrcFabric->>SrcADME: Storage API (batch records)
        SrcADME-->>SrcFabric: Full Record Data
        SrcFabric->>SrcFabric: Write to Delta Table
        SrcFabric-->>Dashboard: Export Complete (N records)
    end

    rect rgb(255, 248, 240)
        Note over SrcFabric,TgtFabric: Phase 2: Fabric Sync (EDS)
        Dashboard->>SrcFabric: Trigger Sync Notebook
        SrcFabric->>TgtFabric: External Data Share
        TgtFabric-->>Dashboard: Sync Complete
    end

    rect rgb(240, 255, 240)
        Note over TgtFabric,TgtADME: Phase 3: Ingest
        Dashboard->>TgtFabric: Trigger Ingest Notebook
        TgtFabric->>TgtFabric: Read from Delta Table
        TgtFabric->>TgtFabric: Transform Records
        TgtFabric->>TgtADME: Storage API PUT (batch)
        TgtADME-->>TgtFabric: Ingest Response
        TgtFabric-->>Dashboard: Ingest Complete (N records)
    end

    Dashboard-->>User: Job Complete ✅
```
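The Phase 1 export loop above relies on cursor pagination against the Search API. A minimal sketch of that loop, with `fetch_page` standing in for an authenticated `POST /api/search/v2/query` call (the function name and exact response keys here are illustrative assumptions):

```python
from typing import Callable, Iterator, Optional


def iter_search_results(fetch_page: Callable[[dict], dict],
                        kind: str, limit: int = 1000) -> Iterator[dict]:
    """Yield records from a cursor-paginated OSDU-style search.

    fetch_page receives a query body and returns a response dict with
    "results" (a list of records) and, while more pages remain, a "cursor".
    """
    cursor: Optional[str] = None
    while True:
        body = {"kind": kind, "limit": limit}
        if cursor:
            body["cursor"] = cursor  # resume where the last page ended
        page = fetch_page(body)
        yield from page.get("results", [])
        cursor = page.get("cursor")
        if not cursor:  # no cursor means the last page was reached
            break
```

In the real export notebook, `fetch_page` would issue the HTTP request with a bearer token and the `data-partition-id` header; the generator shape keeps memory flat while batching records into Delta writes.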
- Connection Management: Configure ADME-Fabric environment pairs with encrypted secrets
- Sync Wizard: Multi-step wizard with cascading dropdowns for kind/partition selection
- Real-time Dashboard: Monitor active jobs with live logs and progress tracking
- Job History: View past jobs, clone configurations, re-run with identical settings
- Error Browser: Analyze errors with categorization and full stack traces
- Comprehensive Logging: Full API request/response tracking for debugging
- Mock Mode: Develop and test without real ADME/Fabric connections
| Component | Technology | Version |
|---|---|---|
| Frontend/Backend | Streamlit | 1.40.0 |
| Database | SQLite + SQLAlchemy | 2.0.25 |
| Authentication | MSAL | 1.26.0 |
| HTTP Client | Requests + urllib3 | 2.31.0 |
| Encryption | Fernet (cryptography) | 41.0.7 |
| Validation | Pydantic | 2.5.3 |
- Python 3.11+ (Download)
- Git (Download)
- Azure Service Principal with permissions to:
- Azure Data Manager for Energy (ADME)
- Microsoft Fabric workspace
- Microsoft Fabric workspace with a Lakehouse
```bash
git clone https://github.com/mhadaily/ADME-FABRIC-ControlPlane.git
cd ADME-FABRIC-ControlPlane
```

Windows (PowerShell):

```powershell
python -m venv .venv
.\.venv\Scripts\Activate.ps1
```

Linux/macOS:

```bash
python3 -m venv .venv
source .venv/bin/activate
```

Install dependencies:

```bash
pip install -r requirements.txt
```

```bash
# Copy the example environment file
copy .env.example .env     # Windows
# cp .env.example .env     # Linux/macOS
```

Generate an encryption key:

```bash
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
```

Edit `.env` and add your encryption key:

```
ENCRYPTION_KEY=your-generated-key-here
LOG_LEVEL=INFO
MOCK_MODE=false
```

Start the app:

```bash
streamlit run app.py
```

The application will open at http://localhost:8501.
Enable mock mode to develop without real ADME/Fabric connections:
```
# .env
MOCK_MODE=true
```

Mock mode provides:
- Simulated ADME kinds and partitions
- Fake record counts
- Delayed responses to simulate network latency
- No actual API calls
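A common way to wire such a switch is a small factory that returns a mock client whenever `MOCK_MODE` is set. This is a sketch; the class and factory names here are illustrative and may differ from the repo's actual client code:

```python
import os
import random
import time


class MockAdmeClient:
    """Stand-in for the real ADME client: canned data, no network calls."""

    def list_partitions(self):
        time.sleep(0.05)  # simulate a little network latency
        return ["opendes", "demo-partition"]

    def count_records(self, kind: str) -> int:
        return random.randint(100, 5000)  # fake record counts


def make_adme_client(real_factory):
    """Return a mock client when MOCK_MODE=true, otherwise build the real one."""
    if os.getenv("MOCK_MODE", "false").lower() == "true":
        return MockAdmeClient()
    return real_factory()
```

Because both clients expose the same methods, the rest of the app never needs to know which one it received.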
```
# .env
LOG_LEVEL=DEBUG
DEBUG=true
```

```
adme-fabric-control-plane/
├── app.py                        # Streamlit entry point
├── requirements.txt              # Python dependencies
├── Dockerfile                    # Container configuration
├── .env.example                  # Environment template
├── LICENSE                       # MIT License
├── CONTRIBUTING.md               # Contribution guidelines
│
├── config/
│   ├── settings.py               # Pydantic settings with defaults
│   └── logging_config.py         # Logging configuration
│
├── database/
│   ├── models.py                 # SQLAlchemy models
│   └── db.py                     # Database session management
│
├── clients/
│   ├── base_client.py            # HTTP client with retry logic
│   ├── adme_client.py            # ADME API client
│   └── fabric_client.py          # Fabric API client
│
├── services/
│   ├── auth_service.py           # Token management & caching
│   ├── connection_service.py     # Environment CRUD operations
│   ├── job_service.py            # Job tracking & management
│   ├── sync_service.py           # 3-phase sync orchestration
│   ├── scheduler_service.py      # Job scheduling
│   └── error_service.py          # Centralized error handling
│
├── components/
│   ├── connection_form.py        # Environment configuration form
│   ├── sync_wizard.py            # Multi-step sync wizard
│   ├── query_builder.py          # Lucene query builder
│   ├── job_card.py               # Job status card
│   ├── log_viewer.py             # Real-time log viewer
│   └── error_display.py          # Error banners & details
│
├── pages/                        # Streamlit pages (emoji prefixes omitted here)
│   ├── 1_Connections.py          # Connection management
│   ├── 2_Data_Product_Definitions.py  # Data product definitions
│   ├── 3_Publish.py              # Publish data products
│   ├── 4_Consume.py              # Consume data products
│   ├── 5_Dashboard.py            # Real-time monitoring
│   ├── 6_History.py              # Job history
│   └── 8_Scheduled_Jobs.py       # Scheduled jobs
│
├── utils/
│   ├── exceptions.py             # Custom exception classes
│   ├── validators.py             # Input validation
│   ├── encryption.py             # Fernet encryption helpers
│   ├── keyvault.py               # Azure Key Vault integration
│   └── secrets_manager.py        # Secrets management
│
├── notebooks/                    # Fabric notebook templates
│   ├── README.md                 # Deployment instructions
│   ├── adme_export_notebook_v2.py  # ADME → Fabric export
│   └── adme_ingest_notebook_v2.py  # Fabric → ADME ingest
│
├── docs/                         # Additional documentation
│   ├── COMPLETE_WORKFLOW_GUIDE.md
│   ├── SECRETS_MANAGEMENT.md
│   └── FABRIC_API_VERIFICATION.md
│
└── tests/                        # Test suite
    ├── conftest.py
    ├── test_encryption.py
    ├── test_fabric_api.py
    └── test_validators.py
```
The project uses:

- Black for code formatting
- MyPy for type checking
- Pytest for testing

```bash
# Format code
black .

# Type check
mypy .

# Run tests
pytest
```

To add a new page:

1. Create a file in `pages/` with an emoji prefix (e.g., `6_⚙️_Settings.py`)
2. Add the page config at the top:

   ```python
   import streamlit as st

   st.set_page_config(page_title="Settings", page_icon="⚙️", layout="wide")
   ```

3. Streamlit auto-discovers pages from the `pages/` folder

Run the test suite:

```bash
pytest
```

With coverage:

```bash
pytest --cov=. --cov-report=html
# Open htmlcov/index.html to view the report
```

Run subsets:

```bash
# Unit tests only
pytest tests/unit/

# Integration tests
pytest tests/integration/

# Specific test file
pytest tests/unit/test_validators.py
```
- **Connections Page**
  - Add new environment
  - Test ADME connection
  - Test Fabric connection
  - Edit existing environment
  - Delete environment
- **Sync Page**
  - Select source environment
  - Select data partition
  - Select kind (record type)
  - Configure filters
  - Select target environment
  - Execute sync job
- **Dashboard**
  - View active jobs
  - View job progress
  - Cancel running job
  - View logs
- **History**
  - View completed jobs
  - Clone job configuration
  - Re-run job
  - Export job logs
```bash
docker build -t adme-fabric-control-plane .
```

```bash
docker run -d \
  --name adme-control-plane \
  -p 8501:8501 \
  -e ENCRYPTION_KEY=your-fernet-key \
  -e LOG_LEVEL=INFO \
  -v $(pwd)/data:/app/data \
  adme-fabric-control-plane
```

Create `docker-compose.yml`:

```yaml
version: '3.8'
services:
  control-plane:
    build: .
    ports:
      - '8501:8501'
    environment:
      - ENCRYPTION_KEY=${ENCRYPTION_KEY}
      - LOG_LEVEL=INFO
      - MOCK_MODE=false
    volumes:
      - ./data:/app/data
    restart: unless-stopped
```

Run:

```bash
docker-compose up -d
```

The easiest way to deploy to Azure Container Apps is using the included deployment script:
```powershell
# Deploy to Azure (creates all resources automatically)
./deploy.ps1

# Or with custom parameters
./deploy.ps1 -ResourceGroup "my-rg" -Location "eastus" -AppName "my-control-plane"
```

The script automatically:
- Creates Azure Container Registry
- Builds and pushes the Docker image
- Creates Azure Container Apps environment
- Configures managed identity for Key Vault access
- Sets up Azure Key Vault for secrets storage
- Deploys the application
Container deployment:

1. Push to Azure Container Registry:

   ```bash
   az acr login --name <your-acr>
   docker tag adme-fabric-control-plane <your-acr>.azurecr.io/adme-control-plane:latest
   docker push <your-acr>.azurecr.io/adme-control-plane:latest
   ```

2. Create App Service:

   ```bash
   az webapp create \
     --resource-group <rg-name> \
     --plan <plan-name> \
     --name <app-name> \
     --deployment-container-image-name <your-acr>.azurecr.io/adme-control-plane:latest
   ```

3. Configure Environment:

   ```bash
   az webapp config appsettings set \
     --resource-group <rg-name> \
     --name <app-name> \
     --settings \
       ENCRYPTION_KEY="your-fernet-key" \
       LOG_LEVEL="INFO" \
       WEBSITES_PORT="8501"
   ```

Code-based deployment:

1. Create App Service (Python 3.11):

   ```bash
   az webapp create \
     --resource-group <rg-name> \
     --plan <plan-name> \
     --name <app-name> \
     --runtime "PYTHON:3.11"
   ```

2. Configure Startup Command:

   ```bash
   az webapp config set \
     --resource-group <rg-name> \
     --name <app-name> \
     --startup-file "streamlit run app.py --server.port 8000 --server.address 0.0.0.0 --server.headless true"
   ```

3. Deploy Code:

   ```bash
   az webapp deployment source config-local-git \
     --resource-group <rg-name> \
     --name <app-name>

   git remote add azure <git-url-from-output>
   git push azure main
   ```
| Variable | Description | Required |
|---|---|---|
| `ENCRYPTION_KEY` | Fernet encryption key | ✅ Yes |
| `DATABASE_PATH` | SQLite path (use persistent storage) | ✅ Yes |
| `LOG_LEVEL` | `INFO` or `WARNING` for production | No |
| `MOCK_MODE` | Must be `false` | No |
| `WEBSITES_PORT` | Set to `8501` for Azure | ✅ Yes (Azure) |
For production, mount Azure Files for SQLite persistence:

```bash
az webapp config storage-account add \
  --resource-group <rg-name> \
  --name <app-name> \
  --custom-id data \
  --storage-type AzureFiles \
  --share-name <share-name> \
  --account-name <storage-account> \
  --mount-path /app/data \
  --access-key <access-key>
```

To deploy the notebook templates:

1. Navigate to your Fabric workspace
2. Create a new Notebook for each:
   - `adme_export_notebook.py` → Export Notebook
   - `fabric_sync_notebook.py` → Sync Notebook
   - `adme_ingest_notebook.py` → Ingest Notebook
3. Copy the content from `notebooks/` into each Fabric notebook
4. Update the environment configuration in your Control Plane with the notebook IDs
The notebooks accept parameters via the Fabric REST API when triggered from the Control Plane:
| Notebook | Key Parameters |
|---|---|
| Export | `dataPartitionId`, `admeServerUrl`, `searchApiKind`, credentials |
| Sync | `source_workspace_id`, `target_workspace_id`, `sync_mode` |
| Ingest | `osduEndpoint`, `dataPartitionId`, `batch_size`, credentials |
See notebooks/README.md for detailed deployment instructions.
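As a rough illustration, triggering a parameterized notebook run from Python can go through the Fabric job-scheduler endpoint. The URL shape below follows the public Fabric REST API, but treat the payload details (especially the typed-parameter format, and the assumption that every value is a string) as something to verify against the Fabric docs:

```python
import json
import urllib.request


def build_notebook_job(parameters: dict) -> dict:
    """Payload for jobType=RunNotebook; Fabric expects typed parameter values."""
    return {"executionData": {"parameters": {
        name: {"value": value, "type": "string"}
        for name, value in parameters.items()
    }}}


def trigger_notebook(token: str, workspace_id: str, notebook_id: str,
                     parameters: dict) -> str:
    """Start a notebook run and return the Location URL used to poll its status."""
    url = (f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
           f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook")
    req = urllib.request.Request(
        url, method="POST",
        data=json.dumps(build_notebook_job(parameters)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]  # poll this URL for job status
```

The call returns `202 Accepted`, so job completion is determined by polling the returned `Location` URL rather than by the response body.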
- All secrets (client secrets) are encrypted using Fernet symmetric encryption
- The encryption key is stored as an environment variable (never in code)
- Secrets are decrypted only when needed for API calls
```python
from cryptography.fernet import Fernet
print(Fernet.generate_key().decode())
```

Security best practices:

- Never commit `.env` files
- Use Azure Key Vault for production secrets
- Rotate encryption keys periodically
- Enable HTTPS at the load balancer/App Service level
- Restrict network access to the App Service
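Put together, the encrypt-on-store / decrypt-on-use flow amounts to roughly the following sketch (the helper names are illustrative; see `utils/encryption.py` for the actual implementation):

```python
import os

from cryptography.fernet import Fernet


def _fernet() -> Fernet:
    # The key is read from the environment at call time, never hard-coded.
    return Fernet(os.environ["ENCRYPTION_KEY"].encode())


def encrypt_secret(plaintext: str) -> str:
    """Encrypt a client secret before writing it to the database."""
    return _fernet().encrypt(plaintext.encode()).decode()


def decrypt_secret(token: str) -> str:
    """Decrypt only at the moment an API call actually needs the secret."""
    return _fernet().decrypt(token.encode()).decode()
```

Fernet tokens are versioned and authenticated, so tampering with a stored ciphertext raises `InvalidToken` on decrypt rather than silently yielding garbage.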
"ENCRYPTION_KEY not set"
Make sure .env file exists and contains ENCRYPTION_KEY
Database locked error
SQLite doesn't support concurrent writes. This usually resolves itself.
For production, consider using PostgreSQL.
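If lock errors persist in local deployments, enabling SQLite's WAL journal mode and a busy timeout usually helps. This is general SQLite tuning, not necessarily what `database/db.py` already does:

```python
import sqlite3


def connect(path: str) -> sqlite3.Connection:
    conn = sqlite3.connect(path, timeout=30)    # wait up to 30 s on a locked DB
    conn.execute("PRAGMA journal_mode=WAL;")    # readers no longer block the writer
    conn.execute("PRAGMA busy_timeout=30000;")  # retry internally instead of failing fast
    return conn
```

WAL allows concurrent readers alongside a single writer, which covers most Streamlit dashboard workloads; multiple concurrent writers still need a server database such as PostgreSQL.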
"Connection refused" to ADME
1. Check ADME server URL (must be HTTPS)
2. Verify Service Principal has permissions
3. Check firewall rules
Streamlit "AxiosError"
Usually a browser cache issue. Clear cache or try incognito mode.
View application logs:

```bash
# Local development
streamlit run app.py 2>&1 | tee app.log

# Docker
docker logs adme-control-plane

# Azure App Service
az webapp log tail --name <app-name> --resource-group <rg-name>
```

This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
- Azure Data Manager for Energy Documentation
- Microsoft Fabric Documentation
- Microsoft Fabric Notebook Public API
- Streamlit Documentation
- OSDU Data Platform
1. **Create Data Product**
   - User selects source environment, partition, kinds, and query.
   - APIs:
     - `GET /api/partition/v1/partitions` (ADME): List partitions
     - `POST /api/search/v2/query` (ADME): Aggregate kinds with counts
     - `POST /api/search/v2/query` (ADME): Preview/filter records
   - Result: The data product definition is saved.

2. **Publish Data Product**
   - User triggers export/publish for a data product.
   - APIs:
     - `POST /notebooks/{export_notebook_id}/execute` (Fabric): Start export
     - `GET /notebooks/{export_notebook_id}/status` (Fabric): Poll for export completion
     - `POST /notebooks/{sync_notebook_id}/execute` (Fabric): (Optional) Sync
     - `POST /notebooks/{ingest_notebook_id}/execute` (Fabric): Start ingest
     - `GET /notebooks/{ingest_notebook_id}/status` (Fabric): Poll for ingest completion
   - Result: Data is exported from ADME, processed in Fabric, and ready for sharing.

3. **Consume Data Product**
   - User selects a published data product to ingest into their own ADME.
   - APIs:
     - `GET /api/data-products` (Fabric/Custom): List data products
     - `POST /api/consume` (Fabric/Custom): Trigger ingestion
   - Result: The data product is ingested into the consumer's ADME environment.
```mermaid
flowchart TD
    A[User: Create Data Product] -->|Selects Source, Partition, Kind| B[ADME API: /partitions, /query]
    B -->|Save Config| C[Control Plane: Save Data Product]
    C -->|User Publishes| D[Fabric API: Execute Export Notebook]
    D -->|Exported Data| E[Fabric Lakehouse]
    E -->|Execute Ingest Notebook| F[Fabric API: Ingest to Target ADME]
    F -->|Data Ready| G[Data Product Published]
    G -->|User Consumes| H[Fabric API: List/Consume Data Product]
    H -->|Ingested| I[Target ADME]
```
1. **Create Data Product**: The user defines a data product by selecting the source ADME, partition, and kinds. The app calls ADME APIs to list partitions and kinds, and saves the configuration.
2. **Publish Data Product**: When the user publishes, the app triggers Fabric notebooks via the Fabric API to export data from ADME, process it in Fabric, and prepare it for sharing.
3. **Consume Data Product**: Users will be able to browse and subscribe to published data products, triggering ingestion into their own ADME environment.
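Stripped of logging, polling, and cancellation, the publish flow reduces to a small sequential driver. This sketch uses stand-in phase callables rather than the real `sync_service.py` code, and simply shows the stop-on-first-failure behavior:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple


@dataclass
class SyncJob:
    job_id: str
    status: str = "pending"
    phase_results: Dict[str, int] = field(default_factory=dict)


def run_sync(job: SyncJob, phases: List[Tuple[str, Callable[[], int]]]) -> SyncJob:
    """Run export -> sync -> ingest in order; stop at the first failing phase.

    Each phase is a (name, callable) pair whose callable returns a record count.
    """
    job.status = "running"
    for name, run_phase in phases:
        try:
            job.phase_results[name] = run_phase()
        except Exception:
            job.status = f"failed:{name}"  # later phases are never attempted
            return job
    job.status = "completed"
    return job
```

The real orchestration wraps each phase callable around a Fabric notebook trigger plus a status-polling loop, but the control flow is the same: a failed export never triggers an ingest.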
