I’m tired of the noise.

My LinkedIn feed has become a parade of posts about how AI is “revolutionizing” instructional design. How it’s “amplifying” L&D. How anyone who doesn’t jump on board is getting left behind.

So I keep asking: Where are the results?

Not the AI-generated infographics. Not the creepy talking-head avatars. Not the gimmicks. Where are the real examples of AI improving how people actually learn and perform?

I’ve been in life sciences training for over 20 years. I’ve watched technology hype cycles come and go. And while I use AI in my own work and believe it’s a genuinely powerful tool, I’m worried we’re headed somewhere familiar: chasing efficiency while sacrificing effectiveness.

The Problem Is How We’re Using It

Here’s what I’m seeing.

Organizations are mandating “AI-first” without understanding what that means in practice. Leaders who’ve never built a learning experience are treating AI as a magic button for cutting costs. Instructional designers are being told to produce more, faster. Quality takes the hit.

The result? What some are calling “AI slop.” Content that’s technically complete but hollow. Training that checks boxes but doesn’t change behavior. Courses that look polished but don’t prepare anyone to actually do the work.

In most industries, that’s a waste of time and money.

In healthcare and medical devices, it’s a patient safety problem.

Where AI Helps and Where It Doesn’t

I want to be clear: I’m not against AI. I use it regularly.

There are tasks where AI enhances my team’s work. Drafting content that we then shape and refine. Generating assessment questions that we validate and improve. Speeding up localization. Turning dense SME input into starting points that are actually readable. Handling repetitive production tasks so we can spend time on strategy instead.

There are also tasks where AI falls short. Understanding what a specific audience of learners actually needs. Designing for behavior change rather than information transfer. Navigating the regulatory and clinical realities of medical device training. Knowing what to leave out because it will confuse more than clarify. Making judgment calls about risk. Building relationships with SMEs and stakeholders that surface insights you can’t get any other way.

The pattern: AI does production work well. Humans remain essential for judgment, strategy, and design.

The organizations getting real value from AI aren’t replacing instructional designers. They’re freeing designers from tedious tasks so they can focus on solving performance problems.

A Practical Approach to AI Integration

How should organizations actually think about this? Based on what I’ve seen work and what I’ve seen fail, here’s what I’d suggest:

Start with the problem you’re trying to solve.

Before asking “How can we use AI here?” ask “What outcome do we need?” If you can’t name the business result or the behavior change you’re after, AI won’t help. You’ll just produce the wrong thing faster.

Fix your foundation before you accelerate.

If your current training is mediocre, AI will help you produce mediocre training at scale. Get clear on what good looks like first. Then use AI to get there more efficiently.

Keep humans in the critical loops.

AI can draft. Humans must validate. In regulated industries, when patient safety is on the line, there’s no shortcut around expert review. Build it into your process from the start.

Measure what actually matters.

Completion rates aren’t outcomes. Time-to-develop isn’t effectiveness. If you’re claiming AI improves your training, measure knowledge retention, skill transfer, confidence, and, where possible, performance on the job.

Invest in your people.

The instructional designers who know how to direct AI, who have the critical thinking and design skills to make AI outputs useful, are more valuable now than before. Don’t cut the people who make AI work.

What Comes Next

I understand the pressure. Budgets are tight. Timelines are aggressive. Everyone’s doing more with less.

But handing the keys to AI and hoping for the best isn’t the answer. The answer is being honest about where AI helps and where it doesn’t, and making sure skilled humans are guiding the process at every step.

The companies that will do well aren’t chasing every AI gimmick. They’re using AI as a tool in support of clear goals, with experienced designers who know how to use it.

Human expertise directing AI capability. That’s the partnership that actually works.

If you’re trying to figure out how to bring AI into your training work without losing the quality your learners and patients need, I’d like to hear how it’s going. This is the challenge we’re working through with our clients right now. There’s no formula, but there is a thoughtful path forward.

It starts with being honest about what AI can and can’t do.

Next Steps:

If you’re feeling the pressure to “do more with AI” but aren’t sure how to maintain the quality your learners need, let’s talk. We work alongside internal teams to develop training and help organizations figure out where AI fits and where it doesn’t.