AI literacy training aligned to the EU AI Act

The EU AI Act introduces new legal responsibilities for organisations using AI, including a requirement to train staff in safe, transparent and accountable AI use.

Help your teams use AI responsibly and stay ahead of regulation

Narratology’s AI literacy training helps organisations meet these expectations. We provide plain-language, scenario-based workshops that equip staff with the skills and awareness they need to use generative AI tools like ChatGPT ethically, effectively, and within legal bounds.

Whether you’re in compliance, HR, IT, or operational delivery — if your teams touch AI, they need this literacy.

What AI literacy does the EU AI Act require?

The EU AI Act was adopted in 2024, and its AI literacy obligation applies from 2 February 2025. The Act includes a duty to ensure human oversight, transparency and appropriate training, and it emphasises the need to train people according to:

  • Their technical knowledge
  • Their experience, education and training
  • How and why the AI is used

Organisations will probably need to combine several types and providers of AI literacy training to make sure they comply with these requirements.

Article 4 of the Act puts it this way:

“Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.”

What the training covers

We tailor our workshops to your organisation’s sector and use case.

Depending on your needs, we may cover:

  • How generative AI tools work, and where they fail
  • Legal and ethical risks of AI: bias, hallucination, misinformation
  • Prompt design and evaluation
  • Protecting sensitive data and upholding GDPR
  • Human-centred design, explainability, and responsible use
  • Organisational risk: how to create “minimum viable guardrails” across teams

This AI literacy training is not just about tool use. It builds organisational literacy that aligns with law, values, risk management and accountability.

Who AI literacy training is for

Organisations should train all teams in AI literacy, in a way that’s relevant to them. Depending on your needs, we may offer training to:

  • Content and UX teams
  • Corporate legal and ethics officers
  • Cross-functional innovation or governance teams
  • Leadership teams developing AI policies or guardrails

It’s suitable both for organisations operating in the EU and for UK-based teams wanting to align with best practice or meet vendor obligations under EU law.

How we deliver AI literacy training

We often deliver this training as part of a broader training initiative, either in person or online.

You’ll receive:

  • A summary report with key learning outcomes and risk themes
  • Optional attendance certificates
  • Course evaluations from delegates

Outcomes

Under the EU AI Act, organisations are expected to show proactive steps towards safe AI adoption. We can help you to:

  • Document staff training as part of your compliance effort
  • Reduce risk and reputational exposure
  • Build internal clarity around what’s allowed — and what’s not
  • Empower your people without overwhelming them

Interested in AI literacy training for your organisation?

Let’s talk about what you need and what would work best for your team.

Contact us to book a session or ask a question. Or read about basic AI literacy training for non-tech teams.