Koala
AI Act Governance Assistant
Koala is built for compliance officers who need real answers under Regulation (EU) 2024/1689. It links your AI system catalog to the exact articles that apply, so you can respond to audit and legal requests with citations, not guesswork.
Scope is EU AI Act only. Omnibus updates are treated as amendments, never as a separate regime.
Most AI compliance tools answer in abstractions. Koala starts with a system description and ends with a clear list of duties you can actually act on.
The public demo is scripted for speed. The full product runs locally with citations, ingestion, and RAG.
Short, opinionated, and designed to create an audit trail.
Describe the AI system in plain language. If you cannot explain it clearly, you cannot defend it.
Koala maps that description to the AI Act and returns the specific duties that apply to your role.
Copy the citations into your governance notes and move the system toward compliance with confidence.
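The three steps above can be sketched in a few lines. This is an illustrative toy, not Koala's engine: the real product retrieves duties via RAG over the Act's text, and the keyword map and duty strings here are hypothetical.

```python
# Hypothetical keyword-to-duty map; the real product retrieves duties from
# the text of Regulation (EU) 2024/1689, not a hardcoded table.
DUTY_MAP = {
    "biometric": ("Annex III", "high-risk: full obligations apply"),
    "chatbot": ("Article 50", "transparency: disclose AI interaction"),
}

def map_duties(description: str) -> list[tuple[str, str]]:
    """Step 2: match a plain-language system description to citable duties."""
    text = description.lower()
    return [(cite, duty) for kw, (cite, duty) in DUTY_MAP.items() if kw in text]

# Step 1: describe the system in plain language.
# Step 3: copy the resulting citations into your governance notes.
for cite, duty in map_duties("Customer-support chatbot with biometric login"):
    print(f"[{cite}] {duty}")
```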
See the full product flow on a dedicated page. The live engine runs locally, so the web demo is scripted.
The demo includes Chat, Catalog, KPIs, AI Setup, and Help. It is a guided simulation so anyone can try the workflow, but real ingestion and retrieval require local deployment.
Koala forces a decision: which risk tier does the system fall into, starting with Annex III? Everything else follows from that.
Prohibited: if it manipulates or exploits people, it is banned under Article 5. Stop here.
High-risk: if it appears in Annex III, you must treat it like a regulated product with full obligations.
Limited risk: transparency still applies. Users must know when they are interacting with AI.
Minimal risk: no extra duties, but you still need to justify why you classify it as minimal.
General-purpose AI: model-level duties apply even if you are not the final deployer.
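The decision flow above reduces to a short classifier. A minimal sketch, assuming the system description has already been distilled into boolean flags; the flag names and tier labels are hypothetical, not Koala's API.

```python
def classify_risk_tier(manipulates_or_exploits: bool,
                       in_annex_iii: bool,
                       interacts_with_users: bool,
                       is_gp_model: bool) -> list[str]:
    """Return the applicable AI Act tiers/duties for a described system."""
    if manipulates_or_exploits:
        # Banned outright under Article 5: stop here, nothing else applies.
        return ["prohibited (Article 5)"]
    duties = []
    if in_annex_iii:
        duties.append("high-risk (full obligations)")
    if interacts_with_users:
        duties.append("transparency duties")
    if is_gp_model:
        # Model-level duties apply even if you are not the final deployer.
        duties.append("GPAI model-level duties")
    if not duties:
        duties.append("minimal risk (justify the classification)")
    return duties

print(classify_risk_tier(False, True, True, False))
```

Note that the tiers are not mutually exclusive below the prohibition line: a high-risk system that talks to users carries transparency duties too.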
Run Koala locally for full RAG, PDF ingestion, and AI system analysis. The GitHub README includes setup steps and sample data. The full app is served at /app in local mode.