Case Study

How 3M validated a global L&D function with the Strategy Scorecard


“We always think the foundation needs to be fixed. Now I know we're strong. We just need to power it.”

— Karen Ganitsky

We caught up recently with Karen Ganitsky, SIBG Global Sales Training Leader at 3M, to talk about how her global team had been operating, what the past year had taught her about leading L&D from the business side, and where the function was heading next. Karen took on the role in late 2023, when the team was being rebuilt as a global organisation inside 3M’s Safety and Industrial Business Group (SIBG). She came to L&D from the business — customer service, supply chain, operations, Lean Six Sigma, business development — and inherited a team with strong delivery and a set of unanswered strategic questions.

The track record was strong. A 2025 sales bootcamp had landed across the global organisation, and the year-over-year numbers showed it. Salesforce activity capture rose from roughly 600,000 logged actions in 2024 to over a million in 2025 — a 64% lift in a behaviour the training never explicitly asked for. Programmes were running. Leaders were asking for more. What Karen wanted next was a structured way to step back from delivery and test whether the global strategy was directionally right.

3M Learning Strategy

About 3M and SIBG

3M is a science company. The Safety and Industrial Business Group sits inside it, with sales operations across North America, EMEA, Latin America, and Asia. Karen’s team of roughly 20 people, including six divisional trainers who came from the field, supports three sales programmes: SPEED for end-user sales, ACE for sales to distribution, and DRIVE for sales managers. The team is global, distributed across multiple time zones, and supports a sales force inside a science-led culture where data is the default language for any new investment, learning included.

The challenge

A function rebuilt from scratch

In 2022, 3M dismantled most of its L&D infrastructure across business groups. When Karen took on the global SIBG sales training role, she had one month to scope what was needed, build a budget case, and present it to leadership. She came in from the business and partnered closely with the experienced L&D team she inherited, and together they put forward a tiered investment proposal. Leadership approved the highest tier and doubled the budget. The vote of confidence set Karen up to deliver, and she moved quickly to make good on it.

No external benchmark

For the next twelve months, Karen drew on the discipline of an experienced team, her own project management background, and a sharp instinct for where the function needed to go. She read widely across the industry — articles, reports, conference write-ups — but none of it added up to a credible answer to the question that mattered most:

As we were building the strategy for the global team, my biggest question was, how do I know I'm doing the right thing?

3M is a data-driven company, so inside that culture, Karen needed something that could turn her judgement into evidence the business would recognise.


The approach

Strategy on a Page (SOAP), then the Scorecard 

Before the team took the Learning Strategy Scorecard, Karen had already built her function’s strategy on a page using WeLearn’s SOAP template. The framework captured four priorities, the initiatives underneath each one, and the line connecting them back to the business. It replaced the bespoke decks she had been creating to explain L&D’s priorities in stakeholder meetings.

“It’s avoided having to create a million different PowerPoints. It’s one slide and everyone gets it.”

— Karen Ganitsky

By the time the team came to the Scorecard, the strategy already existed — on a page. The Scorecard’s job was to test it through the eyes of the people running it.

Some L&D leaders run the diagnostic first to find out where they stand, then build the strategy off the back of what it surfaces. Karen ran it the other way around. Both sequences are valid. Which one fits depends on whether the bigger gap is direction or alignment. When direction is already set and the open question is whether the team is reading from the same page, running the diagnostic second is the more useful move.

Taking the Scorecard as a global team

Karen took the Scorecard herself first, then sent it to her wider team. Twelve of seventeen responded, a 71% response rate. Each person worked through 30 statements covering the six dimensions of L&D maturity the Scorecard measures: alignment, governance, technology, content, measurement, and culture. The instructions were straightforward. If a question didn’t apply or wasn’t clear, that was useful information in its own right.

What the diagnostic surfaced

The data returned a specific picture. SIBG’s L&D function was directionally aligned with business priorities, with strategic intent visible in select initiatives. Team members saw content as relevant. Culture treated learning as legitimate. For a young function, the foundations were strong, and the function was tracking ahead of where most organisations sit in WeLearn’s wider benchmark.

The gaps existed where they typically sit, in the connective tissue. Governance, measurement, and the visibility of decision-making cadence were maturing more slowly than alignment. The shape was familiar: alignment ahead of governance is a common pattern, and the work to be done was now named.

Where the team’s perceptions diverged

The most useful signal in the data was how differently team members experienced their own function.

L&D teams often agree on the value of learning while experiencing the system around it differently. One person sees the strategy clearly. Another reads the same situation as ambiguous. What looks like disagreement is usually uneven access to the same information. In a globally distributed team operating across time zones, where information cascades through different regional cultures and leadership structures, the gaps sharpen further. They also show up in the language itself. Strategy means a five-year direction in one conversation and a one-year execution plan in another. The same word does different work depending on who’s using it.

For Karen’s team, surfacing the pattern was the breakthrough. Nobody disagreed about what L&D was for. The next stretch of work was about tightening the language so the team and its stakeholders were using it the same way.

Governance, narrowed down

The biggest reframe came on governance. Karen had been treating it as a 3M-wide problem: too big to tackle, too political, with decision rights spread across HR, expertise delivery, marketing, and her own team. She’d been deferring it because she didn’t know where to start.

The Scorecard gave her the way in.

It was an epiphany. Wait, why am I over-engineering this? I don't have to figure this out for 3M. I can figure it out for my team and then keep expanding the circle.

It’s a simple reframe with a big practical effect. Governance for the team Karen owns. Then SIBG. Then the wider organisation. A piece of work that had felt impossible to start now had a workable first step.

A clearer line on what L&D measures

A second reframe came out of the engagement, this time on what L&D actually measures. The WeLearn team put it to Karen plainly: L&D measures the effectiveness of the programme. The leader measures the effectiveness of the people. Those are two different jobs, and confusing them is what often puts training on the wrong side of a difficult conversation — the kind where a leader expects a 40-hour course to produce someone fully proficient and blames training when it doesn’t.

The same logic shaped how Karen now describes her function to the business. She moved from “we make people better” to “we turn potential into performance,” language that ties directly to the performance enablement priorities her business was already focused on.

“When we talk about what we do as performance, it gives it relevancy to the business. Oh, okay, you’re going to help my team perform better.”

— Karen Ganitsky

The roadmap WeLearn delivered

The engagement included a maturity report with an executive summary and a six-week alignment roadmap, followed by a working session for Karen’s team built around the results. The roadmap covered governance calibration, measurement calibration, technology and visibility alignment, process and portfolio audit, culture activation, and reassessment. The takeaway was clear: the foundation is strong, and the work ahead is to stabilise the system that powers it.

For a function under three years old, that line shifted the centre of gravity from proving the foundations to building on them.


Impact

An L&D function ahead of the game

The engagement paid off quickly. Shortly after the walk-through with WeLearn, Karen had a strategic conversation with her senior leader about how the function would sustain its momentum beyond the current programme cycle. She walked into it with the maturity report and the team’s plan for closing the gaps already in hand.

“She [the senior leader] didn’t even tell me what she needed. We were already on it. Being able to show her this data was incredibly powerful.”

— Karen Ganitsky

The big shift here was structural. Karen now had a way to walk into any leadership conversation with the function’s direction, progress, and next priorities laid out in one place.

L&D as strategic business partner 

Before the 2024 bootcamp, some leaders inside the business didn’t know the L&D function existed. A year later, with the SOAP and the Scorecard results in hand, Karen has weekly check-ins with senior leaders she previously had no access to.

Today, L&D is completely a partner in achieving the growth numbers they need to hit.

What's next

Closing the governance and definition gaps

The next stretch of work brings Karen’s three regional leaders together for a working session built directly on the diagnostic. On the agenda: a decision matrix for what L&D does, what it doesn’t do, and how it prioritises requests; a documented strategy for the next five years; and definitional alignment so the team uses the same language internally and with the business.

A wider remit for L&D inside SIBG

The longer-horizon ambition Karen has named is a wider role for L&D inside the business: the team that sustains change after the rollout phase ends. 3M is moving through significant change, and Karen sees learning as positioned to carry the work that follows the announcement, where the habits and capabilities behind a change actually take root. The partnership with sales technology stays central, because without that data partnership, behaviour change isn’t traceable.

“It’s very hard in L&D to track behaviour if you don’t have a good partnership with the different organisations.”

— Karen Ganitsky

Retaking the Scorecard in 12–24 months

Karen plans to retake the Scorecard against the same baseline. That will give her a defensible data point for her next leadership conversation and a structured way to measure progress against the gaps the team is closing now.

I want to take this again and see how much we've moved up the maturity scale.

Advice for learning leaders

Karen’s advice for L&D leaders working to move their function out of cost-centre framing is direct. 

“Meet with different stakeholders. Find a real pain point, not just a gap. A gap seems light. A real pain point that L&D can help close. That’s the breakthrough.”

— Karen Ganitsky

See where your global L&D function stands

The Scorecard is built for teams who want a clearer read on their function from the inside. WeLearn analyses team responses, places the function on the maturity ladder against a benchmark of L&D organisations, and works through the results with you to set the priorities that matter most. Book a Scorecard team engagement →

