Your graph is now alive. Think of it as a mini-Google Knowledge Graph for your warehouse. Unlike SQL (tables) or NoSQL (documents), ADT uses a graph query language that looks like SQL but adds operators such as `RELATED` and `IS_OF_MODEL`.

**Query 1: Find all sensors in the Receiving Zone**

```bash
az dt twin query --dt-name adt-warehouse-<unique> \
  --query-command "SELECT sensor FROM digitaltwins zone JOIN sensor RELATED zone.hasSensor WHERE zone.\$dtId = 'ZoneReceiving'"
```

Result: Returns `TempSensor-Rcv`.

**Query 2: Traverse two hops (Warehouse → Zone → Sensor)**

```bash
az dt twin query --dt-name adt-warehouse-<unique> \
  --query-command "SELECT sensor, zone FROM digitaltwins wh JOIN zone RELATED wh.contains JOIN sensor RELATED zone.hasSensor WHERE wh.\$dtId = 'WarehouseMain'"
```

This is powerful. In a real app, this query would run in milliseconds, even across 100,000+ nodes.

## Step 6: Simulate Telemetry and Compute Changes

Here's where it gets truly hands-on. Azure Digital Twins itself does not ingest telemetry directly. Instead, you use Azure Functions or IoT Hub to route data in.
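If you prefer the Python SDK over the CLI, the same query strings can be passed to `DigitalTwinsClient.query_twins`. A minimal sketch of building those strings safely — the relationship names (`hasSensor`, `contains`) and twin IDs are the ones assumed from the examples above, and the backslash-escaping of single quotes follows the ADT query quoting convention:

```python
def sensors_in_zone(zone_id: str) -> str:
    """Build the one-hop query (Query 1), escaping single quotes in the twin id."""
    safe = zone_id.replace("'", "\\'")
    return (
        "SELECT sensor FROM digitaltwins zone "
        "JOIN sensor RELATED zone.hasSensor "
        f"WHERE zone.$dtId = '{safe}'"
    )

def sensors_in_warehouse(warehouse_id: str) -> str:
    """Build the two-hop query (Query 2): Warehouse -> Zone -> Sensor."""
    safe = warehouse_id.replace("'", "\\'")
    return (
        "SELECT sensor, zone FROM digitaltwins wh "
        "JOIN zone RELATED wh.contains "
        "JOIN sensor RELATED zone.hasSensor "
        f"WHERE wh.$dtId = '{safe}'"
    )

# With azure-digitaltwins-core, these strings go straight into
# client.query_twins(sensors_in_zone("ZoneReceiving")).
```

Building the string in one place keeps the relationship names consistent as your model evolves.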
We'll simulate a temperature spike and compute a "high temperature" alert using an Azure Function.

**Architecture Pattern:** IoT Device → IoT Hub → Azure Function → ADT (patch property)

**The Function Logic (Python example):**

```python
import json
import logging

import azure.functions as func
from azure.digitaltwins.core import DigitalTwinsClient
from azure.identity import DefaultAzureCredential

ADT_URL = "https://adt-warehouse-<unique>.api.eus.digitaltwins.azure.net"

def main(event: func.EventHubEvent):
    # Parse telemetry from IoT device
    telemetry = json.loads(event.get_body().decode('utf-8'))
    sensor_id = telemetry['sensorId']
    new_temp = telemetry['temperature']

    # Patch the twin; the path must match a Property defined in your model
    client = DigitalTwinsClient(ADT_URL, DefaultAzureCredential())
    patch = [{"op": "replace", "path": "/temperature", "value": new_temp}]
    client.update_digital_twin(sensor_id, patch)

    if new_temp > 30:
        logging.warning("High temperature alert: %s reads %s", sensor_id, new_temp)
```
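The parsing and threshold logic in the Function is plain Python, so you can exercise it locally before wiring up IoT Hub. A sketch under stated assumptions — the 30° threshold and the `sensorId`/`temperature` field names are carried over from the example payload, and this helper is not part of the Azure Functions runtime:

```python
import json

HIGH_TEMP_THRESHOLD = 30.0  # assumed threshold for the "high temperature" alert

def evaluate_telemetry(body: bytes) -> dict:
    """Parse a raw event body and decide whether it should raise an alert."""
    telemetry = json.loads(body.decode("utf-8"))
    return {
        "sensorId": telemetry["sensorId"],
        "temperature": telemetry["temperature"],
        "highTempAlert": telemetry["temperature"] > HIGH_TEMP_THRESHOLD,
    }

# Simulate a spike:
event_body = json.dumps({"sensorId": "TempSensor-Rcv", "temperature": 42.5}).encode()
print(evaluate_telemetry(event_body))
```

Keeping the decision logic in a pure function like this also makes the Function trivially unit-testable.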
```json
{
  "@id": "dtmi:handsOn:Sensor;1",
  "@type": "Interface",
  "@context": "dtmi:dtdl:context;2",
  "displayName": "Sensor",
  "contents": [
    {
      "@type": "Telemetry",
      "name": "value",
      "schema": "double"
    },
    {
      "@type": "Property",
      "name": "sensorType",
      "schema": {
        "@type": "Enum",
        "valueSchema": "string",
        "enumValues": [
          { "name": "Temperature", "enumValue": "temp" },
          { "name": "Humidity", "enumValue": "humi" },
          { "name": "Motion", "enumValue": "motion" }
        ]
      }
    }
  ]
}
```
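Before uploading, it helps to sanity-check a model locally, since a malformed document only fails later at `az dt model create`. A minimal local check — this covers only a few obvious DTDL rules (valid JSON, `Interface` type, `dtmi:...;version` ID, presence of `@context`), not full DTDL validation:

```python
import json

def basic_dtdl_check(doc: str) -> list:
    """Return a list of obvious problems with a DTDL interface; empty means OK."""
    problems = []
    try:
        model = json.loads(doc)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]
    if model.get("@type") != "Interface":
        problems.append("@type must be 'Interface'")
    dtmi = model.get("@id", "")
    if not (dtmi.startswith("dtmi:") and ";" in dtmi):
        problems.append("@id must look like dtmi:...;<version>")
    if "@context" not in model:
        problems.append("missing @context (e.g. dtmi:dtdl:context;2)")
    return problems
```

Run it over each file in `models/` before the upload step; an empty list means the basics are in place.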
Azure Digital Twins isn't just a database: it's a living, queryable model of your physical world. Go build something intelligent.
```bash
# Upload models
az dt model create --dt-name adt-warehouse-<unique> \
  --models models/*.json

# Verify
az dt model list --dt-name adt-warehouse-<unique> -o table
```
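A broken file in `models/` will make the whole upload fail, so a small pre-flight script that checks every model parses as JSON can save a round trip. A sketch, assuming the same `models/*.json` layout used above:

```python
import glob
import json

def find_invalid_models(pattern: str = "models/*.json") -> list:
    """Return (path, error) pairs for model files that are not valid JSON."""
    bad = []
    for path in sorted(glob.glob(pattern)):
        try:
            with open(path, encoding="utf-8") as f:
                json.load(f)
        except (json.JSONDecodeError, OSError) as e:
            bad.append((path, str(e)))
    return bad

# Print any problems before running `az dt model create`
for path, err in find_invalid_models():
    print(f"{path}: {err}")
```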
Azure Digital Twins (ADT) gives that data **context**. It knows that **Sensor 47** belongs to **Room 312**, which is on the **North Wing of Floor 3**, and that room contains a **Server Rack**. If the temperature rises, ADT understands the **impact**.
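That "context" is essentially a traversable graph of containment relationships. A toy illustration in plain Python dicts — the twin names mirror the example above, and this is a conceptual sketch, not the ADT API:

```python
# Parent -> children containment, mirroring the Sensor 47 example
contains = {
    "Floor3": ["NorthWing"],
    "NorthWing": ["Room312"],
    "Room312": ["Sensor47", "ServerRack1"],
}

def descendants(node: str) -> set:
    """Everything reachable below a node: the 'impact zone' of a change."""
    found = set()
    stack = [node]
    while stack:
        for child in contains.get(stack.pop(), []):
            if child not in found:
                found.add(child)
                stack.append(child)
    return found

def impacted_by(sensor: str) -> set:
    """Twins sharing a room with the sensor: if it overheats, they feel it."""
    rooms = [room for room, kids in contains.items() if sensor in kids]
    return {t for room in rooms for t in descendants(room)} - {sensor}

print(impacted_by("Sensor47"))  # → {'ServerRack1'}
```

ADT's query language (`JOIN ... RELATED ...`) performs exactly this kind of traversal, but at scale and server-side.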
```python
# Connect to ADT
from azure.digitaltwins.core import DigitalTwinsClient
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
service_client = DigitalTwinsClient(
    "https://adt-warehouse-<unique>.api.eus.digitaltwins.azure.net", credential
)
```