
Ask most L&D teams what they’re doing with AI in L&D, and they’ll list the tools.

Ask what’s actually changed in how people learn. You’ll get silence.

That’s the gap. It’s wider than most organisations admit.

AI in L&D has real potential. But most teams layer new tech on broken learning design. The result? Faster delivery of content that still doesn’t change behaviour.

The tools aren’t the problem. The questions are.

Why most rollouts underdeliver: an AI in L&D perspective

Most L&D teams believe in AI in L&D. But believing in something and designing for it are different things.

Here’s what we see constantly. Organisations start with the right intentions. Then deadlines hit. Stakeholder pressure builds. And the things that would make the learning work get stripped out first.

The result? Content that looks good in a demo. Passes a review. Gets ignored the moment it goes live.

However, some organisations get it right. They protect what matters. Psychological safety. Space to think. A clear link to the real business problem.

So ask yourself: does your current process protect those things? Or does it quietly design them out?


What the evidence actually says

The evidence on AI in L&D is not complicated. But it does contradict how most organisations commission learning.

Behaviour change needs three things. Relevance to the person’s real role. Space to practise safely. Feedback that is timely and specific.

Most corporate programmes deliver none of these well.

But here’s the thing. The problem isn’t awareness. Every Head of L&D we talk to knows this already. The problem is most design processes were never built to deliver it.

As a result, teams end up measuring what’s easy. Completion rates. Satisfaction scores. Time on platform. These feel safe. They’re also poor indicators of whether anything actually changed.

The fix? Rewrite what success means. Not what gets ticked. What people do differently after the programme ends. Are your success criteria built around that?


The three things that actually work

When AI in L&D works well, you don’t need the dashboard to tell you. You can see it.

Managers have different conversations. Teams handle new situations with more confidence. Problems surface earlier, because people feel safe enough to raise them.

That’s the outcome worth designing for. Not completion. Not satisfaction scores. Actual behaviour change, visible in the work itself.

So what does it take? In our experience, three things make the real difference.

First: relevance. Not to a job description. To the challenges the person faces today. Second: space to practise. Not just knowledge transfer. Third: connection to real work. Not a training bubble nobody returns to.

Instead of asking how to cover the content, ask how to change what people do on Monday morning. That question changes the brief entirely.

How we approach this differently

Here’s a simple way to reframe your approach to AI in L&D.

Stop starting with content. Start with the business problem.

Before writing a single objective, ask the business leader one question: what does this person need to do differently? Not know. Do.

That question surfaces two things quickly. First, the performance gap is usually smaller and more specific than the original request. Second, it often isn’t a training problem at all. It’s process. It’s management. It’s culture.

No amount of content fixes those things.

Because of this, the best L&D teams act like consultants, not order-takers. They push back on briefs. They ask hard questions. They refuse to build content they know won’t change anything.

That takes confidence. It builds credibility. And it comes from delivering work that actually works. Start there.



What the industry is saying 

The conversation in L&D is shifting. Here’s what we’re tracking right now:

→ AI in 2026: The Turning Point for Learning & Development: AI isn’t replacing L&D, it’s reshaping it. From adaptive learning platforms to workflow-embedded AI agents.
→ AI in L&D: A 2025 Report by Donald H Taylor: AI is transforming L&D. Or is it?
→ How AI Will Reshape L&D and HR in 2026: As AI continues to transform L&D and HR, the organisations that will thrive in 2026 will be those that embrace AI.

The fundamentals haven’t changed. But the pace has. Organisations that haven’t started asking better questions are already falling behind.


Three things to do now

If any of this resonates, here are three practical moves worth making:

→ Rewrite your success metrics. Replace completion rates with a behaviour you can observe in the workplace. That shift alone changes how you design everything.
→ Rewrite your next brief. Before commissioning content, ask: what do we need people to do differently? Not know. Then design backwards from that.
→ Have a harder conversation with your stakeholders. Push back on generic content requests. Ask about the business problem underneath them. That’s where the real work starts.

At mindboost, we help L&D teams build learning that changes how people work, not just what content they can access. If you want to explore what that looks like for your organisation, let’s talk.

Caleb Foster Founder
After more than 20 years of experience in operational excellence across the hospitality and digital learning sectors, Caleb wanted to rid the world of dull ‘click next’ elearning and solve the epidemic of uninspiring digital learning. Mindboost began back in 2016, when Caleb saw a huge opportunity to create better-quality digital learning content that connects with learners emotionally and encourages a desire to learn more. Caleb realised that learning providers, when presented with a client request, often lacked a true understanding of an organisation’s culture and inner workings. So the Mindboost team gets under the cover of an organisation’s performance need and ultimately looks to connect with learners emotionally. When learners are connected emotionally, they tend to start believing in a change; that belief generates a feeling and makes a greater impact within the organisation than simply conveying information.