A leader reviewing our most recent research (we’re looking at you, Ben Sieke!) made an observation that stuck: many of the challenges surfacing in the data aren’t new. Strategy, measurement, governance, proving value — L&D has been circling these for a decade or more.
That’s not a criticism. It’s a pattern worth examining.
Some of what our industry faces is genuinely new. AI readiness, workforce anxiety about disruption, the tension between efficiency and human connection — these pressures have arrived quickly. But much of what’s holding L&D back is inherited. Problems we’ve worked around instead of through.
This piece explores that distinction. What’s an emerging challenge, and what’s a persistent one we keep not solving? More importantly: why does it matter now?
The inherited problems and why we keep circling
The gaps in our research aren’t revelations. They’ve been on conference agendas and consultant slide decks for years. The question worth asking isn’t really what the gaps are. It’s why they persist.
We’re rewarded for delivery, not design.
L&D professionals are trained to help. A request comes in; we solve it. Build the course. Launch the program. That responsiveness is a strength, until it becomes a pattern of shipping before we’ve defined what success looks like.
As one practitioner put it in our recent Leader Talk webinar with Training Industry: “Development is wanted for new training at our company, but time is not dedicated to it. They are one-off projects, so it’s difficult to emphasize training when we don’t have time to develop it — even though it’s recognized [that] it’s needed.”
The time pressure is real. Another described the math: “20% of my time goes to compliance training. 60% goes to training for new products and systems outside my control. That leaves 20% for strategic work.” When the urgent crowds out the important, strategy becomes something we’ll get to later.
The middle was comfortable enough.
For years, L&D could operate between reactive and strategic without serious consequence. Programs shipped and stakeholders were satisfied enough. We reported completions and satisfaction, leadership accepted them, and the conversation never evolved.
That pattern still holds. When we polled L&D leaders in the webinar, roughly 60% said their organization primarily measures completions, attendance, and sentiment data. Not because they don’t want to do more, but because the infrastructure, permission, and time to measure differently haven’t materialized. In other words, the pain of staying in the middle was tolerable. So we stayed.
Authority never matched expectation.
L&D connects to every function. We’re asked to support transformation, enable culture change, upskill for the future. But the budget, headcount, and decision-making power rarely match that scope. We’re positioned as a support function and then asked why we’re not operating strategically. The mismatch makes progress hard even when intent is there.
The work to fix it is unglamorous.
Documenting a strategy. Building governance structures. Creating measurement systems that tie to business outcomes. None of this produces a deliverable you can demo. It requires long conversations with stakeholders who may not understand why it matters. It’s foundational, invisible, and easy to defer.
When we asked webinar participants how mature their learning strategy was compared to peers, 57% said “I have no bleeping idea.” Another 34% said “I kind of know.” Only 9% were confident. That’s not a knowledge gap — it’s a symptom of work that hasn’t been prioritized.
These aren’t excuses. They’re explanations. And they matter because the pressure no longer tolerates “the middle.”
The emerging pressures
Not everything is inherited. Some of what L&D faces now is genuinely different in speed, scale, or stakes.
AI arrived before we were ready.
Our Learning Strategy Benchmarking Data Report (January 2026) showed AI strategy as the lowest-scoring item across 30 questions. But this isn’t really a technology story. It’s an infrastructure story. Most L&D functions have platforms. What they don’t have is the data architecture, governance clarity, or measurement discipline to use AI strategically. The technology showed up. The foundation it needs to sit on didn’t.
The confusion on the ground is palpable. When we asked practitioners about AI’s role in their organizations during the webinar, responses ranged from “not even sure where to begin” to “I’ve been told by IT we can use it, then we can’t, then we can, then we can’t.” Others described “layering it on more and more, but no direction on when or how to use it.”
Organizations are investing. People are trying. But without strategy, governance, or clear guidelines, the result is fragmentation rather than transformation.
Workers feel the disruption — and they’re looking around.
Our Learning is Human ebook, drawing on Lighthouse Research data, found that a majority of workers are worried about the future of work. Younger workers, those with the longest careers ahead, are even more concerned. They know development is on them. But knowing isn’t the same as being equipped. The gap between “I’m responsible for my growth” and “I have what I need to grow” keeps widening.
Connection became scarce while content became infinite.
Generic content is everywhere — bought, built, or generated. What’s harder to find is learning wrapped in context. Connected to how your people, in your organization, need to apply it. AI can produce material at scale. It can’t navigate your culture, build trust over time, or know which conversations to handle carefully. The organizations flooding their systems with more content are often making the signal-to-noise problem worse.
Speed is compressing timelines.
The gap between “this is coming” and “this is here” is shrinking. L&D used to have runway to figure things out. Now the business expects answers faster — about AI policy, reskilling priorities, what’s safe to use and what isn’t. The luxury of waiting until we have a plan is disappearing.
Why the old problems block the new opportunities
Here’s the uncomfortable connection: the emerging pressures don’t exist separately from the inherited problems. They sit on top of them.
AI readiness isn’t low because L&D lacks interest in AI. It’s low because the preconditions were never built. You can’t personalize learning with AI if you can’t use the data you already collect. You can’t demonstrate AI’s value if you’ve never demonstrated traditional learning’s value. You can’t govern AI responsibly if you never established governance for anything else.
This showed up clearly in the Leader Talk session. When we asked practitioners to rate their ability to tie learning initiatives to business KPIs, zero respondents selected “fully mature.” The majority described their practice as inconsistent or still in early development. One participant captured the measurement challenge: “I try to tell stories with data, but it’s hard to find direct correlations between the training that happened and the resulting impact.”
Another put it more bluntly: “Completions don’t change business behaviors.”
The persistent problems aren’t just old business. They’re actively blocking the new opportunities. The organizations that skipped foundational work are discovering they can’t leapfrog to innovation.
The opportunity hidden in the pattern
There’s something useful in recognizing that the persistent problems are, in fact, persistent: it means there’s a wealth of practice to draw from.
We don’t have to invent solutions for strategy. Measurement frameworks exist. Governance models are well-documented. The field has been writing and speaking about this for years. The problem was never that we didn’t know what to do. It was that we didn’t have enough pressure to do it.
Now we do.
Our talk last week surfaced examples of what works when the pieces come together. One participant described finally getting senior leadership to champion L&D: “Having a senior leader champion L&D is hard but extremely valuable. It took three years of building trust to land in that position.” Another shared: “Our training went SO well. The leaders in attendance took it seriously because THEIR leaders were present and participating.”
These aren’t complicated interventions. They’re the result of strategy, alignment, and patience — the foundational work finally paying off.
AI’s arrival isn’t just a challenge to react to. It’s a forcing function. The organizations that finally formalize strategy, build outcome measurement, and establish governance will be the ones positioned to lead what comes next.
The question isn’t whether L&D can adapt to AI. It’s whether we’ll use this moment to finish what we started.
From here
- Take the Learning Strategy Scorecard to see where you stand
- Read the full Learning Strategy Benchmarking Data Report
- Explore Learning is Human on why connection drives better outcomes
- See L&D’s Guide to AI Adoption for practical frameworks