
Design thinking for training and development helps L&D teams build programs employees actually use by starting with empathy, defining real problems and iterating based on feedback. 

Since Sharon Boller and Laura Fletcher published their foundational framework in 2020, the workplace has changed dramatically. Hybrid teams, AI tools and accelerated skills gaps have transformed how L&D professionals apply these principles in practice.

The framework includes five phases:

  • Empathize: Understand learner needs through research and data
  • Define: Frame the real problem, not just symptoms
  • Ideate: Generate multiple solution concepts
  • Prototype: Create testable versions quickly
  • Test: Validate with real learners and iterate

This guide shows how to apply design thinking in today’s workplace, with modern tools that make each phase faster and more informed by actual employee behavior.

Design learning experiences that employees actually use.

Explore how D2L Brightspace supports empathy-driven training with analytics, rapid prototyping and continuous feedback.

Explore Brightspace for business

What Is Design Thinking?

Design thinking is a human-centered problem-solving approach that originated at design firm IDEO and Stanford’s d.school in the 1990s. The methodology focuses on understanding users deeply, defining problems clearly, generating multiple solutions, prototyping quickly and testing with real people before full implementation.

In learning and development, design thinking shifts the focus from building courses based on what stakeholders request to designing experiences grounded in how employees actually learn and work. Instead of starting with “we need training on X,” L&D teams start with “what challenges are employees facing and what support would help them perform better?”

The approach gained significant traction in corporate L&D following Sharon Boller and Laura Fletcher’s 2020 book, which adapted the traditional design thinking process specifically for training and development contexts. Their framework emphasized balancing three forces: learner wants and needs, business requirements and organizational constraints.

What’s Changed Since the First Book Came Out

When Boller and Fletcher published their book in June 2020, L&D teams were frantically converting in-person workshops to virtual delivery while wondering if this was temporary.

It wasn’t. By 2021, the Great Resignation was underway. Seven in 10 HR leaders now describe the current business climate as “increasingly challenging,” according to McKinsey’s 2025 research. The pressure on L&D shifted from “deliver training” to “prove training drives retention and business outcomes.”

Meanwhile, AI went from experimental to mainstream. By 2025, employees were adopting AI three times faster than their leaders realized. Hybrid teams made scheduling everyone for the same session logistically impossible. Skills gaps accelerated beyond what annual training plans could address.

The design thinking framework that emphasized empathy, iteration and learner-centered design became more relevant. However, the tools evolved dramatically. LMS platforms now capture behavioral data revealing what employees actually do. AI tools generate prototype content in hours instead of weeks. Continuous feedback loops are technically feasible at scale.

The core principles haven’t changed. What’s different is the pace, the tools and the complexity of the environments where learning needs to happen.

Why Design Thinking Still Matters in L&D

When Sharon Boller and Laura Fletcher published their foundational book on design thinking for training and development in 2020, they gave L&D leaders a structured way to move beyond assumption-based course creation. Five years later, that human-centered approach has become more urgent.

Today’s workplace looks fundamentally different. Hybrid work has fragmented how people learn. Skills gaps widen faster than traditional training can close them. Employees expect the same personalization they get from consumer apps.

We’ve found that empathy in 2026 looks different than it did in 2020. L&D teams are combining qualitative insight with behavioral data from learning platforms, tracking where engagement drops and continuously testing assumptions against real usage patterns.

The original framework remains powerful: get perspective, refine the problem, ideate and prototype, iterate and implement. When traditional top-down course design fails to improve onboarding speed or compliance completion, design thinking forces better questions. Where are new hires struggling in week three? What prevents managers from completing ethics training? Where in the workflow do employees need support they’re not getting?

The challenge now is applying design thinking with 2026 tools and constraints. That means knowing how to align training to business goals while accounting for hybrid teams, AI-assisted content creation and learning that happens in the flow of work.

Empathize: Going Beyond Interviews

The empathy phase gave L&D professionals something critical in 2020: a structured process to include learner voice before building solutions. Instead of designing courses based solely on what subject matter experts said or executives requested, teams could conduct empathy interviews, build personas and create experience maps that captured how learning actually happened in the flow of work.

That foundation remains essential. In our experience, the teams that skip empathy work still end up rebuilding programs after launch because they missed critical context about how employees work, what constraints they face, or where existing solutions already fail them.

What’s evolved is the data available to inform and validate empathy work. L&D teams in 2026 are layering qualitative insights from interviews with quantitative signals from learning platforms. Platforms like Brightspace offer learning analytics that show where learners consistently drop off, which modules they return to repeatedly as job aids and how completion patterns differ between remote and on-site employees.

We’ve found this combination catches disconnects that interviews alone miss. A manager might tell you new hires need more product training, but platform data shows they’re actually rewatching the CRM basics module five times while barely touching advanced product features. That insight changes your problem statement entirely.

Upgrading empathy work with behavioral data

Modern empathy combines traditional qualitative methods with platform signals. Start with learner interviews and experience mapping to understand workflow context and pain points. Then validate those insights against LMS behavioral data like completion rates, time-on-task and content interaction patterns.

Look for mismatches between what people say and what they do. When survey responses indicate satisfaction but analytics show high abandonment rates, that’s your signal to dig deeper in follow-up conversations. The goal isn’t to replace human insight with data – it’s to ask better questions. “We noticed you bookmarked this resource and returned to it three times last month. What made it more useful than the formal training?” tends to surface more actionable feedback than “how was the course?”
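For teams with an analyst on hand, this say/do mismatch can also be checked programmatically. The minimal Python sketch below flags modules where reported satisfaction is high but observed abandonment is also high; the field names, thresholds and sample records are illustrative assumptions, not a specific LMS export format.

```python
# Hypothetical sketch: flag modules where survey responses and behavioral
# data disagree. Field names and thresholds are illustrative assumptions.

def find_mismatches(modules, satisfaction_min=4.0, abandonment_max=0.30):
    """Return names of modules where learners report high satisfaction
    (1-5 scale) yet behavioral data shows high abandonment."""
    flagged = []
    for m in modules:
        says_good = m["avg_satisfaction"] >= satisfaction_min
        acts_bad = m["abandonment_rate"] > abandonment_max
        if says_good and acts_bad:
            flagged.append(m["name"])
    return flagged

# Sample data standing in for combined survey + LMS analytics records
modules = [
    {"name": "CRM Basics", "avg_satisfaction": 4.5, "abandonment_rate": 0.12},
    {"name": "Advanced Product Features", "avg_satisfaction": 4.2, "abandonment_rate": 0.55},
    {"name": "Ethics Refresher", "avg_satisfaction": 3.1, "abandonment_rate": 0.40},
]

print(find_mismatches(modules))  # ['Advanced Product Features']
```

Modules this flags are candidates for the follow-up conversations described above, not conclusions in themselves.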

Tools like Brightspace also reveal patterns across distributed teams that interviews might miss, like when mobile completion rates spike during specific shifts or when certain cohorts consistently skip optional modules that turn out to be critical for job performance.

Define: Framing the Right Learning Problem

The define phase in Boller and Fletcher’s framework gave L&D teams tools like problem statement templates and strategy blueprints to uncover what stakeholders were actually trying to solve. When a VP says “we need sales training,” this phase helps you discover whether the actual problem is knowledge gaps, misaligned incentives, broken tools, or unclear processes.

Diagram: the Define phase in practice. A stakeholder request for sales training is analyzed using empathy work and LMS data, revealing that new hires struggle with product knowledge and reframing the problem around just-in-time troubleshooting resources instead of full product training.

That reframing work remains critical. However, the inputs available for problem definition have expanded since 2020.

Upgrading problem definition with behavioral signals:

  • Start with stakeholder interviews to understand the business challenge and hypothesized causes
  • Layer in platform analytics to see how employees actually behave versus what stakeholders assume
  • Look for contradictions between reported problems and observed patterns

For example, if managers say “new hires aren’t retaining product knowledge” but your LMS data shows they’re rewatching the same three troubleshooting videos repeatedly while ignoring product overview modules, that signals a workflow support gap rather than a retention issue.

We’ve also found that hybrid work requires more specificity in problem framing. “Employees struggle with collaboration tools” needs to distinguish between remote employees lacking structured onboarding, on-site employees reverting to old habits, or managers not modeling new tools in meetings. Each represents a different friction point requiring a different solution.

The strongest problem statements specify the performance gap, the context where it appears and the constraints that matter. “Customer service reps in our APAC region escalate routine billing questions to supervisors at 2x the rate of other regions, despite completing the same training” gives you a clear problem to solve.

Ideate: From Workshops to Cross-Functional Design

The ideation phase brings diverse perspectives into solution brainstorming before committing to a specific approach. In 2020, this typically meant gathering stakeholders in a room for half-day workshops. In 2026, the pace and structure look different.

How ideation has evolved:

  • Shorter, asynchronous sprints using Miro or Mural replace all-day workshops
  • Focused 45-minute virtual sessions target specific solution aspects rather than broad questions
  • Multiple rapid sessions with different participant mixes generate more diverse input than single large workshops

We’ve seen this approach work well for onboarding redesign. Instead of asking “how should we onboard new hires,” teams break ideation into targeted questions like “how might we help remote employees build relationships in their first week” or “what information needs personal delivery from managers versus self-serve access.”

The hybrid workplace has also expanded who participates in ideation. Research on design thinking and teamwork shows that ideation and prototyping phases have the largest positive effects on collaboration, creativity and reflection. However, the same research found negative results when brainstorming is overused without structure. We’ve found this particularly true in virtual settings where unstructured ideation sessions drift without clear outcomes.

Leading teams now include IT for technical constraints, operations for workflow realities and actual learners experiencing the problem firsthand. A compliance training redesign benefits more from including frontline employees who rush through modules, managers who track completion and legal teams who define requirements than from a room of instructional designers brainstorming alone.

Structured ideation for distributed teams

Virtual ideation requires tighter facilitation than in-person workshops. We recommend time-boxing each activity (10 minutes for individual brainstorming, 15 minutes for group building) and using visual templates in tools like Miro that guide participants through specific prompts rather than open-ended “share your ideas.”

One pattern that works well: have participants silently add ideas to a shared board for 5-7 minutes, then spend the remaining time clustering and building on each other’s concepts. This prevents dominant voices from steering the conversation too early and gives remote participants equal input regardless of communication style.

Prototype: Faster, Smarter Testing with AI

Rapid prototyping was central to the original framework—creating rough versions of solutions to test with learners before full development. The goal was catching design flaws early when they’re cheap to fix rather than after launch when they’re expensive.

That principle hasn’t changed. What’s changed is how quickly L&D teams can move from concept to testable prototype.

How AI accelerates prototyping:

  • Generate draft learning modules from outline to testable content in hours instead of weeks
  • Create multiple content variations quickly to test different approaches with learner groups
  • Produce realistic mockups that feel closer to finished products, generating more specific feedback

AI learning platforms like D2L Lumi allow instructional designers to input learning objectives and source materials, then generate structured course content that can be refined based on feedback. McKinsey’s research on workplace learning emphasizes that agile experimentation now replaces rigid planning as the most effective way to roll out L&D innovations with lower risk. We’ve found the better approach is using AI to create multiple rough prototypes quickly, then investing time testing each with actual learners rather than moving straight to development with a single polished version.

Test: Usability and Continuous Learning Loops

The test phase in the original framework emphasized piloting programs with small learner groups, gathering feedback and refining before full rollout. This caught usability issues, unclear instructions and content gaps that designers missed.

That validation step remains essential. However, the tools available for testing have made continuous feedback loops more feasible at scale.

Modern testing layers multiple feedback mechanisms:

  • Pre-launch usability testing with small learner cohorts to catch friction points
  • Post-launch behavioral tracking through LMS analytics showing actual usage patterns
  • Periodic pulse surveys asking learners what’s working and what’s not
  • Regular program reviews comparing learning outcomes against business metrics

Platforms like Brightspace capture engagement data, completion patterns and performance indicators that make it easier to spot issues after launch. When a module shows consistent drop-off at the same point across multiple cohorts, that’s a signal to investigate and iterate rather than waiting for annual reviews. Moreover, the definition of “testing” has expanded beyond measuring learning outcomes to measuring workflow integration.
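The cross-cohort drop-off signal described above can be sketched in a few lines. This hypothetical Python example assumes per-cohort counts of learners reaching each module step; the data shape and threshold are assumptions for illustration, not a real Brightspace export.

```python
# Illustrative sketch: find a module step where completion drops sharply
# for every cohort. Data shape and threshold are assumptions.

def drop_off_step(cohort_counts, threshold=0.5):
    """Given per-cohort lists of learners reaching each step, return the
    first step index where every cohort retains less than `threshold` of
    the learners from the previous step, or None if no such step exists."""
    n_steps = len(cohort_counts[0])
    for step in range(1, n_steps):
        if all(counts[step] < threshold * counts[step - 1]
               for counts in cohort_counts):
            return step
    return None

cohorts = [
    [100, 92, 40, 35],  # cohort A: sharp drop at step 2
    [80, 75, 30, 28],   # cohort B: same pattern
]

print(drop_off_step(cohorts))  # 2
```

A step flagged across every cohort points at the content itself rather than one group’s circumstances, which is the cue to investigate and iterate.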

Research on workplace learning emphasizes that continuous learning ecosystems work better than episodic training delivery. Consequently, testing now asks whether employees can find and apply learning resources at the moment of need. Instead of only tracking course completion rates, teams are monitoring which resources employees bookmark and return to, what search terms they use to find content and where they switch between devices. These signals reveal whether your learning design actually supports the workflow.

We’ve found that framing programs as ongoing iterations, with refinement based on real usage data built into the training program evaluation process, builds stakeholder confidence through visible improvements.

Modern design thinking requires modern tools.

See how D2L’s AI-powered platform helps L&D teams prototype faster and iterate based on real learner behavior.

Book a demo

Applying Design Thinking Today: A 2026 Remix

The five-phase framework from Boller and Fletcher still provides the structure L&D teams need. However, treating these as recurring loops rather than linear steps better matches how modern learning programs actually develop.

Putting it into practice:

Start with empathy work that combines learner interviews with behavioral signals from your learning platform. Use Brightspace analytics to identify engagement drops, then layer qualitative research to understand the context.

Frame problems as experience gaps rather than knowledge deficits. “Employees can’t find process guidance at the moment they need it” leads to different solutions than “employees don’t understand our new process.”

Run ideation as focused virtual sprints with the people closest to the problem – employees experiencing the friction, managers seeing performance gaps, teams handling workflow constraints.

Use AI tools like D2L Lumi to prototype multiple approaches quickly, then test each with small learner groups. Embed testing as an ongoing process by tracking which content employees actually use and where they get stuck.

Frame learning solutions as training strategies that evolve based on employee needs rather than finished programs. Design entire ecosystems – including just-in-time resources, peer learning and performance support – rather than isolated training events.

Frequently Asked Questions About Design Thinking for Training and Development

What Is Design Thinking for Training and Development?

Design thinking for training and development is a human-centered problem-solving approach that helps L&D teams understand learner needs before building solutions. The methodology includes five phases: empathize with learners, define the real problem, ideate potential solutions, prototype concepts quickly and test with real users. Rather than starting with “we need a course on X,” design thinking starts with understanding the challenges employees face and what support would help them perform better in their actual work context.

How Can Design Thinking Improve Corporate Training Programs?

Design thinking improves corporate training by ensuring programs address real learner needs rather than assumed ones. The empathy phase reveals where employees actually struggle, the define phase frames problems as experience gaps rather than knowledge deficits and the prototype and test phases catch design flaws before full rollout. We’ve found that this approach leads to higher employee engagement, better knowledge retention and stronger connections between learning outcomes and business performance. The iterative nature also means programs evolve based on feedback rather than staying static after launch.

What Are the Five Phases of Design Thinking in L&D?

The five phases are: empathize (understand learner perspectives through interviews, observation and behavioral data), define (frame the real problem you’re solving), ideate (generate multiple solution concepts through cross-functional collaboration), prototype (create testable versions quickly using tools like AI learning platforms) and test (validate with real learners and iterate based on feedback). These phases work best as recurring loops rather than linear steps, allowing L&D teams to cycle back through empathy and problem definition as they learn more about what employees need.

How Does Design Thinking Differ From Instructional Design?

Design thinking and instructional design serve different but complementary purposes. Instructional design focuses on how to teach specific content effectively – analyzing tasks, designing learning sequences and selecting appropriate delivery methods. Design thinking focuses on whether you’re solving the right problem in the first place. We’ve found that combining both approaches works well: use design thinking upfront to ensure you’re addressing real learner needs and business challenges, then apply instructional design principles to build effective learning experiences. The main difference is design thinking emphasizes discovery and iteration while traditional instructional design often assumes the learning need is already clearly defined.

How Is AI Used in Design Thinking for Learning and Development?

AI learning platforms like D2L Lumi accelerate the prototyping phase by generating draft learning modules from outlines and source materials in hours instead of weeks. This allows L&D teams to create multiple content variations quickly and test different approaches with learner groups. AI also supports the empathy phase by analyzing LMS behavioral data to reveal patterns in how employees interact with learning content. However, AI doesn’t replace the human-centered aspects of design thinking – it speeds up content creation so teams can invest more time in learner research, usability testing and iteration based on feedback.

What Role Does Empathy Play in Workplace Learning Design?

Empathy in L&D means understanding learner contexts, challenges and workflows before designing training. This includes conducting interviews to uncover what employees actually need, observing how they perform tasks in real work environments and analyzing behavioral data to see where current training falls short. Employee-centered training that starts with empathy typically achieves higher completion rates and better on-the-job application because it addresses real friction points rather than assumed knowledge gaps. Modern empathy work combines qualitative methods like learner feedback with quantitative signals from learning platforms.

How Does Design Thinking Support Hybrid and Remote Learning Programs?

Design thinking helps L&D teams understand the specific challenges hybrid and remote learners face – like finding time for synchronous sessions across time zones, accessing resources from different devices, or feeling disconnected from in-office colleagues. The empathy phase reveals these context-specific needs, while the ideate and prototype phases generate solutions like asynchronous learning paths, mobile-first content design, or peer learning channels that work across distributed teams. Remote learning design benefits particularly from the test phase, where teams can validate whether their solutions actually work in fragmented work environments before full deployment.

How Can L&D Teams Test and Improve Learning Programs With Design Thinking?

L&D teams can test programs through usability testing sessions with small learner groups before launch, then continue testing after implementation by tracking behavioral data in their LMS. Pilot learning programs with representative employee cohorts to catch friction points early. Monitor which content employees actually use versus what they’re required to complete, where they consistently get stuck and what resources they return to repeatedly. These feedback loops inform ongoing improvements rather than waiting for annual redesign cycles. Modern learning platforms make continuous testing feasible by capturing engagement patterns and completion data in real time.

Is Design Thinking Effective for Compliance and Onboarding Training?

Design thinking applications work particularly well for compliance training and onboarding because these programs often suffer from low engagement despite being mandatory. The empathy phase reveals why employees rush through compliance modules or what new hires actually struggle with during onboarding. The ideate phase generates alternatives to traditional lecture-based compliance content, like scenario-based learning that mirrors real ethical dilemmas employees face. For onboarding experience design, testing with recent hires shows where information overload happens or which resources they actually reference after their first week, allowing teams to create more targeted support.

Can Design Thinking Help Improve Learner Retention and ROI?

Design thinking improves learner engagement and L&D ROI by ensuring training addresses real performance gaps that matter to the business. When programs are designed based on actual employee needs and tested with real users, completion rates typically increase and on-the-job application improves. The framework also helps L&D teams frame their work in business terms by connecting learning initiatives to specific performance metrics during the define phase. Training strategy optimization through continuous iteration means programs stay relevant as business needs evolve rather than becoming outdated shortly after launch.
