
Architecture Overview

How SUPi's platform is structured and how data flows through it.

SUPi follows a modular, microservices architecture designed for industrial environments where reliability, latency, and data sovereignty are critical.

Platform Layers

┌─────────────────────────────────────────────────┐
│               Presentation Layer                │
│   Dashboards  │  Alerts  │  Reports  │  API     │
├─────────────────────────────────────────────────┤
│               Intelligence Layer                │
│  Predictive   │  Anomaly    │  Process          │
│  Maintenance  │  Detection  │  Optimization     │
├─────────────────────────────────────────────────┤
│                 Modeling Layer                  │
│  Digital Twins  │  ML Models  │  Physics Engine │
├─────────────────────────────────────────────────┤
│                   Data Layer                    │
│  Ingestion  │  Storage  │  Processing  │  MLOps │
├─────────────────────────────────────────────────┤
│               Integration Layer                 │
│  SCADA  │  OPC-UA  │  MQTT  │  ERP  │ Historian │
└─────────────────────────────────────────────────┘

Data Flow

  1. Ingestion — Sensor data streams in via OPC-UA, MQTT, or REST connectors at configurable intervals (typically 1–10 seconds)
  2. Processing — Raw signals are cleaned, normalized, and enriched with asset metadata
  3. Modeling — Digital twins consume the processed data to update real-time simulations
  4. Intelligence — ML models run inference against the digital twin state to produce predictions
  5. Presentation — Results surface as dashboard updates, alerts, or API responses
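The five stages above can be sketched as a chain of functions. This is an illustrative sketch only: the `Reading` type, the metadata table, and the `process`/`update_twin`/`infer` names are hypothetical, not SUPi's actual API, and the "model" is a stand-in threshold check.

```python
from dataclasses import dataclass

# Hypothetical types and function names for illustration; not SUPi's real API.

@dataclass
class Reading:
    asset_id: str
    signal: str
    value: float  # raw sensor value as ingested (step 1)

# Illustrative asset metadata used for enrichment.
ASSET_METADATA = {"pump-01": {"site": "plant-a", "max_rpm": 3000.0}}

def process(reading: Reading) -> dict:
    """Step 2: clean and normalize the raw signal, enrich with asset metadata."""
    meta = ASSET_METADATA.get(reading.asset_id, {})
    normalized = reading.value / meta.get("max_rpm", 1.0)  # scale to [0, 1]
    return {"asset": reading.asset_id, "signal": reading.signal,
            "value": normalized, **meta}

def update_twin(twin_state: dict, sample: dict) -> dict:
    """Step 3: the digital twin consumes processed data to refresh its state."""
    twin_state[sample["signal"]] = sample["value"]
    return twin_state

def infer(twin_state: dict) -> dict:
    """Step 4: run inference against the twin state (threshold as a stand-in
    for a real ML model)."""
    return {"anomaly": twin_state.get("rpm", 0.0) > 0.95}

# Step 1 (ingestion) is simulated with a literal reading; step 5 (presentation)
# would surface `result` as a dashboard update, alert, or API response.
reading = Reading(asset_id="pump-01", signal="rpm", value=2940.0)
result = infer(update_twin({}, process(reading)))
print(result)  # {'anomaly': True}
```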

Deployment Models

SUPi supports three deployment topologies:

  • Edge — Lightweight inference at the plant level, model training in the cloud
  • On-Premise — Full platform running within your data center
  • Hybrid — Edge inference + cloud-based training with federated learning
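The split of responsibilities across the three topologies can be summarized in code. This is a hypothetical sketch: the `DeploymentPlan` type and its field names are invented for illustration and do not reflect SUPi's actual configuration schema.

```python
from dataclasses import dataclass

# Hypothetical summary of the three topologies; not SUPi's real config schema.

@dataclass(frozen=True)
class DeploymentPlan:
    inference: str   # where model inference runs
    training: str    # where model training runs
    federated: bool  # whether training aggregates updates from edge sites

TOPOLOGIES = {
    "edge":       DeploymentPlan(inference="edge", training="cloud",
                                 federated=False),
    "on-premise": DeploymentPlan(inference="data-center", training="data-center",
                                 federated=False),
    "hybrid":     DeploymentPlan(inference="edge", training="cloud",
                                 federated=True),
}

plan = TOPOLOGIES["hybrid"]
print(plan.inference, plan.training, plan.federated)  # edge cloud True
```

The hybrid row makes the distinction from plain edge deployment explicit: both run inference at the plant, but only hybrid feeds edge updates back into cloud training via federated learning.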