Providing learning is an important part of the work associations do. In many cases, it can tie back into an association’s mission to represent the profession and support the members. In others, it can be part of a broader business strategy to diversify how, where and when associations generate revenue. Whatever the motivation, learning is fundamental.
But developing a learning-focused culture isn’t an overnight project. Doing it right and keeping up the momentum takes time, practice and persistence.
We talked with Josh Goldman, director of consulting services at Tagoras, about what associations are doing to accelerate learning and some of the takeaways, trends and pitfalls people should be aware of.
What should associations be thinking about when it comes to trends? Which ones are going to make the biggest splashes, and what pitfalls do they need to watch out for?
Before I was able to sit in the consulting seat and see how different organizations respond and react, I might’ve been the one who was presenting on a stage somewhere and saying, “Oh, micro-learning is the newest trend. Oh, cohort-based learning is the newest trend. Oh, AI is the newest trend.”
What I’ve realized is the new trend that associations should be looking at isn’t this monolithic thing. The trend really needs to be adapted and personalized to individual groups and segments within the membership. I can talk about some of the things we’ve found with organizations, but that’s what I like to refer to as the pitfall of trends. Someone latches onto the newest, shiniest thing in learning, and that’s the trend they have to do, whereas that may not actually be what a segment of your customer base is looking for.
Can you give us an example?
There’s a good use case I’ve experienced, which really goes back to knowing and understanding your customer. This organization was interested in learning what was attracting their early career professionals because they were seeing the retirement cliff coming, like a lot of organizations are. We were sitting in the boardroom with the advisory committee, and they wanted networking: “We need to do more networking. We need to do local networking in states and chapters.”
But when we collected the information and did the analysis and segmentation, networking was almost dead last on the list of what early career professionals were looking for. And when we reviewed the results with the board and staff, they said, “Oh, this may make sense because those early career professionals are so busy getting onboarded to the industry.” All their preferences were self-paced: on my own time, let me do it myself. That circles back to the beginning of our conversation, because micro-learning ended up being the trend we recommended for that segment of the industry.
To me, it comes back to having to segment, personalize and really know your customer at the end of the day.
Where does that put us when it comes to AI?
AI will absolutely revolutionize the way learning businesses do their work and serve their markets and customers. In the association world, AI is still relatively in its infancy, though it’s obviously not in the corporate or commercial spaces. We’re lagging a bit in the AI space.
ASAE has a Drivers of Change research project, and AI was on that list of things associations should be paying attention to. That was released over five years ago. Maybe people weren’t adapting, or didn’t think it was going to happen as fast as it did despite the research, but they’re just now starting to pay attention to it.
So how, then, are we going to play catch-up? How are we going to use it?
For learning businesses, it really creates opportunity, right? When we think about mature organizations, they anticipate what kind of capacity they need to meet future demand and plan for it in a meaningful way. So there’s an opportunity for learning businesses to use AI for good, not evil.
You mentioned capacity. As we look at associations today, do we have the staff and resources to be successful with modern learning?
In our experience working with clients, it’s a mix of competency and capacity. What we’re seeing now, especially post-COVID-19, is that people are coming out of that headspace where they had to survive and get the job done. Now, they’re realizing how much inefficiency there is in their infrastructure, processes and people.
So, it’s a mix. Some associations are understaffed for the work they’re trying to do, and some haven’t quite paid enough attention to efficiencies about how their people and technology support each other to achieve their mission, goals and outcomes.
We had a client recently who was not quite aware of how much duct tape was happening behind the scenes for the customer experience to be good. We presented findings showing that the staff was doing an amazing job keeping it together with manual processes the board didn’t know existed. The board committed to a pretty significant financial investment to improve that technology so that humans can do what humans are good at and technology can do what technology is good at.
In what ways do shifting customer and member expectations come into play? How are associations responding to them?
It’s a bit of a “you pull on a thread and, all of a sudden, you’ve unraveled the entire shirt” question. I think it’s a couple of things, and it actually ties into some of the trends we’re seeing. There’s the changing nature of customer expectations, which puts increased pressure on associations and learning businesses, whether that’s higher-quality content, a more seamless user experience or self-service expectations. We’re seeing new strategies to meet those expectations become a priority.
The motivations and drivers that people have to engage with or purchase from an organization, pre-COVID-19 versus post-COVID-19, are very distinct. We’re seeing trends changing in a very short period of time.
We used to see value-based and barrier-based decisions. If it’s value-based, maybe the program is high quality, it’s delivered by a respected subject-matter expert, it solves a problem, it helps someone network with peers or it’s in a great location. If it’s barrier-based, it’s, “Oh, it’s too expensive,” or “I didn’t know about it.”
But now, we’re seeing some different kinds of expectations and decision-making criteria. We are seeing that the No. 1 barrier to participating in programs, whether they’re in person or online, is time. Only an average of 3% to 10% of respondents across various associations are saying that it’s too expensive or doesn’t have value or isn’t relevant to them. Those are very tiny barriers. Almost exclusively, we’re seeing a very high percentage for whom the issue is time. You really have to make sure that what you’re creating and delivering targets those potential customers’ specific needs so they will say, “I have to make time for this.”
I think you can maintain and survive being a mass generalist and trying to serve everyone, but you’re not going to thrive based on these changing expectations.
Then the other interesting trend we’re seeing that’s climbing more to the top of the list—in terms of what’s important when registering for an educational program—is whether it’s been shown to have an impact. And I think that correlates to the fact that I only have so much time to give, so if I’m going to give my time, energy, money or talent, it better have an impact.
And I think that market-facing learning businesses are not great at showing impact. They look at satisfaction, but they haven’t implemented more of a mature impact analysis mechanism to actually try to evaluate whether what they’re doing has an impact on the learner at the end of the day.
There are simple things you can do to evaluate the impact on the person, right? You should be looking at feedback beyond satisfaction scores so you can figure out how to improve programs and performance: Has your behavior changed because of participation in these learning programs? The lowest-hanging fruit we recommend is the post-survey. Three weeks or three months afterward, you send a survey to the individuals who participated in the learning program and ask questions about application: Are you more efficient at what you’re doing? Is it easier to do what you’re doing?
When it comes to competition, what are associations seeing?
The answer reminds me of an interesting conversation we had the other day with one of our clients about competition. They were adamant that they were really the only ones in the space who do what they do. And you know, I’m a skeptic, especially about competition. But in large part, that was actually true. They really did fill a very large space in a very niche area.
But what we realized when we dug a little deeper was that they were competing more with themselves. They offered so much value to the market, so many emails, resources, research reports, white papers, events and webinars, that when we looked at awareness, members largely didn’t know all that value was on offer. That opened my eyes to look at other associations we work with and ask, “Are we potentially our own worst enemy in the marketplace?”
A potential solution we’ve been seeing is the coordination of value or content under one executive domain. ASAE just created a chief program officer role. The American Bar Association has a chief product officer. The idea is to coordinate all those content channels under one executive sponsor so you can have a coherent strategy as opposed to decentralized value centers.
How should associations be tracking their progress toward these goals?
I think there is a little bit of trepidation around business intelligence and business analytics. We tend to overcomplicate things, but it’s a learning business leader’s responsibility to understand the program’s impact on the business overall. Executives, especially, should understand the impact on the learning business, the membership business and the publication business, and how these things are intertwined to deliver value.
That’s a “value add” for both the learning business and the customer on the other end who’s saying, “Wow, they really know me; they understand me and they know what I’m interested in, so yeah, I’m going to put my trust and brand loyalty with that organization.”
Can you tell us more about the Learning Business Maturity Model™ and how it’s used?
This is a model that Jeff Cobb and Celisa Steele, co-founders and managing directors of Tagoras, put together. They wanted to create a model that could encourage learning businesses to think about how they do their business. It has four domains focused on the context of a learning business: Leadership, Strategy, Capacity, and Portfolio and Marketing.
How effective a learning business is places it into one of four stages. The first stage is static. You’re not doing many new things. You’re not really evaluating your market. I’ve described it before as SALY: same as last year.
Then you progress up to stage two, where your organization is starting to experiment with innovation. Maybe you’ve looked at your portfolio and thought, “Well, we need to create some new programs for an emerging audience.” Your leadership has buy-in and understands the value for the organization, but you’re still being largely reactive. You’re not getting out in front of your customers, your members or your learners to really understand their situation and help them respond or adapt via learning.
And then you move into more of a stage three organization. You become a much more proactive organization that is out front, understands what’s going on in the business, and develops strategies and programs to exceed members’ needs.
And then you move into a stage four organization that’s considered innovative. You’re really leading in the marketplace with learning initiatives and programs. This stage is a bit aspirational because stage four organizations will never be happy with where they are. They’re not going to change for change’s sake, but they’re always looking at the market for ways to improve the value of their learning programs and services.
We recommend using the maturity model in three different ways.
The easiest is to just grab it and use it as a point of discussion and dialogue with your learning staff. Sit down around a table, look at the model and ask, “Where do you think we are? How do you think we are doing in our strategy? What are our capacity concerns? Have we looked at our portfolio? How well do we think it matches the market? Have we evaluated our marketing and the ways we’re communicating with our learners and customers?”
A more sophisticated or advanced use of the maturity model is in the diagnostic phase. We have a self-assessment you can download that asks questions to help you quantitatively define where you are. You could do that assessment yourself as the business leader, but we recommend the entire learning team complete it. Then you sit down and tally those results. It creates a couple of different value points—a common language, a common understanding and a framework you don’t have to create yourself—on how to evaluate what the business is doing to drive learning value.
The third thing you can do with that information is decide to act. You’ve discussed. You’ve diagnosed. You know where you are. Now, you can begin to prioritize how to move your strategy forward. For example, “We’re a reactive organization and the No. 1 thing we should do is X.” And then you generate agreement and consensus and apply resources to move the needle in ways that you think are going to be effective.
As part of this evaluation, does any of it look at revenue generation and making learning a sustainable line of business?
As we know, other parts of the business model are eroding, and organizations are having to rethink revenue models. If you want to grow or increase your services to meet a need, you often have to figure out how you’re going to find the revenue to do that. Would you end up increasing membership dues? Would you begin to look for other non-dues revenue sources?
There are several models associations are experimenting with, but I think we are at an inflection point where associations are concerned about their business model and are beginning to make real efforts and investments to revise and revamp it.
An important part of what Tagoras does with clients is to consider the revenue impact and the business of learning. It is a business and should be treated like a business, but we have to respect the mission as well. And so if the mission is for it to be truly a part of membership, what can we do to think about other ways to generate revenue and increase the value of the education, programs and services that they’re offering?
Lastly, as we look ahead at the future of learning for associations, what role does (or should) technology play?
About half of our work at Tagoras is related to learning technology, advisory services and selection programs. How Tagoras actually got started is that Jeff and Celisa sold a learning technology company they had founded, and then they wanted to think about what was next. They realized that so many customers came to them describing a technology need or feature requirements but really couldn’t talk about how the technology supported their strategy, their people and their processes.
We really advise people not to lead with technology. If you’re rethinking your business, that needs to be the third thing you think about. You really need to understand your strategy. You really need to understand your market. And then you need to go find a good partner that’s going to support you. We like to say that your learning technology should work for you; you should not work for it. But you can’t get to that nirvana until you talk about your strategy and understand your market.
That keeps you from going down that rabbit hole of shiny features that the sales or account manager wants to show you on the platform. It gives you a consistent point of reference as you’re making decisions, so you can pull those criteria back out whenever you’re deciding who moves from the first demo round to a potential second one. When you get to the very end, you’re saying, “Here’s who we want to partner with; let’s check that against what our real goals and objectives were. Were we distracted along the way, or are we still true to the core of why we’re trying to implement this technology to begin with?”
I think the only other thing I’m seeing a little bit more of now than in the past is siloed approaches to technology selection, whether that’s the education team, the meetings team or even the IT team going out on its own. When you’re adopting learning tech, you need to consider the entire ecosystem and have stakeholders from your major value centers involved in the process. They need to be involved because you may select an excellent technology that supports the learning team, only to find out that marketing and communications say, “That will not work for the way we do marketing and communications in this organization.” You really need to have that cross-functional, enterprise mindset.
Haley Wilson is a Content Marketing Manager at D2L, specializing in the corporate learning space. She holds an Honours Bachelor of Arts degree from the University of Guelph as well as a Master of Arts focused in history from Wilfrid Laurier University.