🐨

Koala

AI Act Governance Assistant

EU AI Act · In force · Indexed: COM(2025) 836 proposal · Demo is simulated

Turn your AI system list into obligations you can defend.

Koala is built for compliance officers who need real answers under Regulation (EU) 2024/1689. It links your AI system catalog to the exact articles that apply, so you can respond to audit and legal requests with citations, not guesswork.

Scope is EU AI Act only. Omnibus updates are treated as amendments, never as a separate regime.

Why Koala exists

Most AI compliance tools answer in abstractions. Koala starts with a system description and ends with a clear list of duties you can actually act on.

The public demo is scripted for speed. The full product runs locally with citations, ingestion, and RAG.

Catalog-first workflow · Citations over summaries · Built for compliance officers

A workflow that matches how compliance teams operate

Short, opinionated, and designed to create an audit trail.

  1. Start with the catalog

    Describe the AI system in plain language. If you cannot explain it clearly, you cannot defend it.

  2. Ask obligation questions

    Koala maps that description to the AI Act and returns the specific duties that apply to your role.

  3. Document and act

    Copy the citations into your governance notes and move the system toward compliance with confidence.

Interactive demo

See the full product flow on a dedicated page. The live engine runs locally, so the web demo is scripted.

Full-page walkthrough

The demo includes Chat, Catalog, KPIs, AI Setup, and Help. It is a guided simulation so anyone can try the workflow, but real ingestion and retrieval require local deployment.

Launch the demo

Risk levels, without the fog

Koala forces a decision: is it in Annex III, or not? Everything else follows from that.

Prohibited

If it manipulates or exploits people, it is banned under Article 5. Stop here.

High Risk

If it appears in Annex III, you must treat it like a regulated product with full obligations.

Limited Risk

Transparency still applies. Users must know when they are interacting with AI.

Minimal Risk

No extra duties, but you still need to justify why you classify it as minimal.

General Purpose

Model-level duties apply even if you are not the final deployer.
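The decision order above can be sketched as a small classifier. This is a hypothetical illustration of the triage logic, not Koala's actual engine; the `SystemProfile` fields and labels are assumptions made for the example:

```python
from dataclasses import dataclass


@dataclass
class SystemProfile:
    # Hypothetical flags a compliance officer would record per AI system.
    manipulates_or_exploits: bool   # Article 5 prohibited practices
    in_annex_iii: bool              # listed high-risk use case
    interacts_with_humans: bool     # transparency trigger
    is_general_purpose_model: bool  # GPAI duties at model level


def classify(p: SystemProfile) -> list[str]:
    """Return risk labels in the order the checks apply."""
    # Article 5 comes first: a prohibited practice ends the analysis.
    if p.manipulates_or_exploits:
        return ["Prohibited (Article 5) - stop here"]

    labels = []
    # The Annex III question decides high vs. limited/minimal risk.
    if p.in_annex_iii:
        labels.append("High Risk (Annex III obligations)")
    elif p.interacts_with_humans:
        labels.append("Limited Risk (transparency duties)")
    else:
        labels.append("Minimal Risk (document the justification)")

    # GPAI duties attach at the model level, on top of the use-case tier.
    if p.is_general_purpose_model:
        labels.append("General Purpose (model-level duties)")
    return labels
```

Note that the general-purpose label stacks on top of a use-case tier rather than replacing it, mirroring the point above that model-level duties apply even when you are not the final deployer.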

Want the real product experience?

Run Koala locally for full RAG, PDF ingestion, and AI system analysis. The GitHub README includes setup steps and sample data. The full app is served at /app in local mode.

Open the repository