Algorithmic Failures Have No Home
- James W.

Your building's HVAC AI model starts behaving erratically.
Energy consumption rises. Comfort complaints increase. Equipment runtime extends.
You contact the vendor. They investigate. They find a "model degradation issue" in the training data.
Your contract says: "Vendor shall continuously optimize building performance."
It says nothing about what happens when the model fails.
Is it a breach? An SLA violation? An act of God? Your contract doesn't specify.
The vendor's interpretation: "We're not liable. The contract only says we'll optimize. A model failure is a technical issue, not a breach."
You hire a lawyer. The investigation costs more than the damages. You accept the degradation and move on.
This is the liability vacuum: when AI models fail, contracts are silent. Vendors face no accountability.
Solution: establish explicit Service Level Agreements (SLAs) for algorithmic performance:
- Energy consumption SLA: "The model shall keep energy consumption within 105% of the contractual baseline."
- Accuracy SLA: "Predictive alerts shall achieve at least 80% precision."
- Failure SLA: "If the model misses any SLA for more than 14 consecutive days, the vendor reimburses the excess costs."
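The three SLAs above translate directly into a monthly compliance check. A minimal sketch, assuming hypothetical report fields and the post's illustrative thresholds (105% of baseline, 80% precision, 14-day degradation window); none of these names come from a real vendor contract:

```python
from dataclasses import dataclass

@dataclass
class MonthlyReport:
    consumption_kwh: float   # measured energy use this period
    baseline_kwh: float      # contractual baseline for the same period
    alert_precision: float   # fraction of predictive alerts that proved correct
    days_degraded: int       # consecutive days any SLA was missed

def sla_violations(r: MonthlyReport) -> list:
    """Return the list of SLA clauses the report violates."""
    violations = []
    if r.consumption_kwh > 1.05 * r.baseline_kwh:  # energy SLA: within 105% of baseline
        violations.append("energy")
    if r.alert_precision < 0.80:                   # accuracy SLA: 80% precision floor
        violations.append("accuracy")
    if r.days_degraded > 14:                       # failure SLA: >14 days triggers reimbursement
        violations.append("reimbursement_due")
    return violations

report = MonthlyReport(consumption_kwh=118.0, baseline_kwh=100.0,
                       alert_precision=0.72, days_degraded=20)
print(sla_violations(report))  # → ['energy', 'accuracy', 'reimbursement_due']
```

The point is that each clause becomes a testable condition: either the vendor's monthly report passes, or a specific remedy is owed.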
Algorithms should carry explicit liability, just like equipment.