
What Support Leaders Get Wrong About Agent Training and How to Fix It

Guest

Tony Won, Head of Customer Experience at TruPlay Games

Summary

Tony Won has worked in the video game industry for over a decade, doing everything from UI/UX design and music composition to leading customer support and experience teams (including building a few from scratch).

After leading support and CX teams at Gosu Group, Riot Games, Epic Games, and Telus International (Games unit) for the last 7 years, Tony is now Head of Customer Experience at TruPlay Games. In this week’s episode, Tony shares what he thinks support leaders often get wrong about agent training and how to fix it.

What support leaders get wrong about agent training

In Tony’s opinion, too many support leaders approach agent training with a traditional “teach to the test” mentality. Training isn’t adapted or updated to fit the way humans actually learn (e.g. hour-long trainings, where a trainer talks at the agents like it’s a 100-level college lecture, aren’t nearly as effective as step-by-step, interactive trainings).

Second, agents (aka the target audience) often aren't involved or consulted in the development of training materials. Support orgs talk a big game about personalizing experiences for customers, but many neglect to apply this same concept to internal training.

And third, KPI culture can kill creativity when it comes to training. For example, Tony said he often sees support leaders use test completion or pass/fail percentages as benchmarks for training. But those metrics don't capture whether an agent truly understands the material and changes their behavior.

Five ways to fix agent training

1. Determine your Kirkpatrick level 3 (behavior) and 4 (results) goals ahead of time. Then commit resources to long-term observation to see if the training hit those goals. Are agents still using that new tool or process several months later? Did the training change their behavior?

2. Try multiple approaches to accommodate different learning styles. Try shorter training sessions and different types of learning materials, and incorporate time for more participation and hands-on practice. Most importantly, ask your agents how they like to learn.

3. Involve your agents in the training design process. They know better than anyone how the training will be applied in their day-to-day work. And if you have agents with expertise in what you’re wanting to teach, ask them to create training content.

4. Ditch the unhelpful KPIs. Bad KPIs drive poor behaviors, and this is especially true when it comes to training. Instead of using “easy” KPIs like number of tests completed or percentage of tests passed, work with your data team to measure long-term behavior change and the effect on customer experience.

5. Don’t create training in isolation. Make sure your trainers have the time and resources to understand how your team’s policies, procedures, and processes affect agent behavior. And give them a seat at the table when discussing those systems. Because it doesn’t matter how effective your training is designed to be if agents are being measured on conflicting goals.

Watch or listen to Tony’s full episode above to learn more! And don’t forget to rate Beyond the Queue on Apple Podcasts. ⭐

See the episode’s transcript

Meredith Metsker: Hey everyone, welcome back to Beyond the Queue. Today, I'm excited to welcome Tony Won. He's the head of customer experience at TruPlay Games. Tony, thank you so much for joining me today.

Tony Won: Thanks for having me, Meredith.

Meredith Metsker: So, you've had a really cool career in customer support and experience, particularly in the video game industry. And I saw that you've even designed some games, you've done music composition for them. So, you are a man of many talents.

Tony Won: That's really nice of you to say. At startups, you don't have a choice. You just have to do what needs to be done. And that was me cutting my teeth on a few fun and new things back in the day.

Meredith Metsker: Nice. So, today, I want to talk to you about an important topic for all support leaders and one that I know you have dealt with. And that is agent training. So, I know you've said before that the training concept is broken in many customer service organizations. So, let's dig into that a little bit. To get us started, can you just tell me what you mean when you say that agent training is broken?

Tony Won: Sure. I think at the core of all of that, one of the pieces of evidence you can clearly see is the whole bias towards training for testing versus what I would call training for learning, right. And I think that's really at the heart of it. A lot of people are getting into classrooms and taking modules or whatever. But it's all to pass some quiz or a test. In my view, it doesn't really quite get at the heart of continuous adult learning, which is a very different thing.

Meredith Metsker: Okay. So, in your experience, what have you seen support leaders getting wrong about this?

Tony Won: There's a couple of different ways it expresses itself. It's probably mostly reducing the complexity of learning to testing because it's easier to do. Let's just face it, it's much easier to teach people to a test because you have all of these ways to get some numbers on the board, right. And the deeper part of it that concerns me is that it might not be a conscious thing. But I think a lot of people aren't thinking very deeply about the learning process, or just about people actually learning.

And so, you see it in things like, for example, agents usually aren't included in generating any of the training materials that they have or the knowledge that they work with. They usually don't get to regularly interact with that knowledge. It's usually done by a separate group of people. You see that there is almost zero evidence in most operations I've seen of long-term evaluation, or, in training terms, the Kirkpatrick level three and level four evaluations.

And I think you see it in little ways too, where people tend to miss out on thinking about how they're going to integrate practical concepts like tribal knowledge, or trying to understand how people are working around your current rules and your current systems. Because, let's face it, if we were running any customer support operation, most likely your agents start to have workarounds for what you've built. And it's like, "Well, what are those workarounds? Why do they feel like they have to do that?" You don't really see a lot of investigation into how to incorporate those things.

Most of the time it's just like, "Oh, don't have tribal knowledge, or don't use those additional resources." It's like, "Okay, we understand, but there are practical problems around knowledge and understanding that those workarounds are solving for, and your current offerings don't." So, taking a look at that, I don't usually see that either.

Meredith Metsker: Okay. So, what do you think support leaders, and agents too, what are they missing? What are they missing out on when support orgs approach training this way?

Tony Won: I think they're missing out on a lot of really practical pieces of knowledge that would help everyone do their jobs better. I would say the overall outcome that you're missing is a much improved work environment around learning on the job, around feeling like you're equipped to do your job properly, and feeling like you're working to make work a better place, and that the systems are all changing according to reality, versus feeling like you're running up against a bunch of walls all the time at work.

So, all of that connects into things like agent satisfaction, attrition, how happy people are in your org. And at the end of the day, a lot of those things do plug into a better customer experience for the people who are on the receiving end of the interactions that the agents participate in. Right?

Meredith Metsker: Yeah, and that's the end goal, right?

Tony Won: Absolutely. Yeah, absolutely.

Meredith Metsker: So, with that in mind, why do you think that support leaders are, or have been, getting this concept of agent training wrong? Is it just that we're still learning what makes a good learning environment? Because I know in school growing up, I had to learn to a test often.

Tony Won: Right, right. I think that is spot-on. I think there have been pockets of people in education who have been raising concerns about the educational model and all of these really important topics for learning at any age, but that thinking hasn't made it into our typical work context, which is the root of the problem. So, I would say that that's certainly part of it. Like clinging to old models that maybe were really good for one thing, but blanket applying them to other things. So, at the root, I would say it's an old problem we have in business and in business processes.

A lot of times, business borrows from manufacturing and takes very simple mechanical principles. And then tries to incorrectly apply them, I would say, to complex human behaviors. So, you have something that's a very step-by-step process. And yeah, you put this here and you screw this in and make sure that it functions properly and on to the next stage. But human behavior isn't like that.

We all know that, right? So, the way it ends up working out at most companies, I think, is that a lot of times you get very rigid best practices in customer experience and customer service. And these are blindly applied to every human being in your organization, despite the very clear fact that if you have a conversation with somebody, we all acknowledge, "Oh, yeah."

And if you have children, think about your own children: "Oh, yeah, everyone's different, people learn differently. We need personal attention." Right? We talk about all these things. Even in a customer service context, it's like, "Oh, yeah, you need to personalize your responses to other people, because they're people." But then, when it comes to the internal context of training, it's like, "No, we don't need to personalize anything. You're just a cog in a machine..." That's how people end up feeling, right? So, you get that.

I know we've had a conversation before where I'll mention the word. It's unintentional, I think, many times, but it is dehumanization, right? Agents aren't people anymore. They're not individuals anymore with individual needs and different ways of learning. And together with what I was sharing, I think there's the application of incorrect methods depending on the context, right?

So, something very typical might be, "Oh, we're going to solve this problem we have with a 15-minute training or one-hour training, and we're just going to deploy it to 50,000 people globally who are in all these different countries and have all these different contexts." And it's going to fundamentally change the way they think about X, right? And it's like, "Wow, if that really worked, you should seriously quit your job and go create world peace because that's a better use of that skill." Right? But we all know that it actually doesn't work.

And so, we see that problem there. And then, I think another reason that we tend to get this wrong as a community is KPI culture, right. Sometimes, the KPIs are aimed in the wrong direction, and they're incorrectly applied. So, in the very specific training context, a lot of times I've seen test completion or pass/fail percentages be the benchmark. And I'm like, "Well, that's really limiting." Or, a survey result, like, "Oh, did you like the training?" Right?

And that's not deep enough to get at, if we're really interested, whether or not our agents are going to learn new things and change behaviors, which is a lot of what people are looking for over time when it comes to interacting with their customers.

Meredith Metsker: Yeah, that makes sense, especially when you said we want to offer a personalized experience to our customers. So, why wouldn't we model that internally?

Tony Won: Yeah, right.

Meredith Metsker: Okay. So now onto the actionable part here. How can support leaders fix the agent training process?

Tony Won: It's really tough. And I think there's no silver bullet to it. But there's a couple of general tips that I would give, maybe just four for this conversation. So, one, whenever you're tempted to use training as a tool to solve a problem, you should state very clearly up front what your Kirkpatrick level threes and fours are going to be. What are those goals that you have for this training initiative? Because then, those are hopefully going to force you to devote resources to longer-term observation.

So, Kirkpatrick three and four is really looking at not just how people feel about the training or how they test on it, but what are the changed behaviors you see after your training initiative is complete? And what are the results of those things? Right. So, a lot of places don't devote resources towards that. They just stop at level two. They do a typical assessment, and nobody is checking, "Hey, three months down the line, are you using that new tool? Are you using that new process? Does it make sense to you?

Is it second nature to you, the different approach that we shared with you in terms of how to talk to this customer or how to defuse a situation? How is that going? Is it just part of your fabric now? Or is it something that you just forgot?" And you're still continuing in your previous habits. So, it's about devoting resources towards consistent, long-term observation and coaching, and really taking more of a long-term view of learning.

The second thing I would say is looking at... We were talking about how people learn differently. And so, not just designing one training for one thing, but even if you have one topic, designing a number of different trainings and different ways to practice, which is a component that I feel is missing a lot. So, typically, I'll see a lot of trainers do a classroom training where they'll be talking at the agents quite a bit. And then, there'll be a test or something like that.

But what I've challenged a lot of professionals to do, and they know this already, is to try a couple of different approaches and different types of material, but also to flip the classroom every once in a while: have much shorter lecture sessions, and have people actually go practice the skills that you're trying to build up, and really focus in on those for a period of time until a new behavior, a new practice, becomes second nature. Right?

So, a lot more practical parts of training, and participation and engagement, versus just sitting around like you're back in high school listening to your teacher drone on. So, for sure, stuff like that. Think about self-service learning paths and programs, and all these different ways in which you can supply both material and different ways of learning to your people.

Third, in connection to what we talked about before, like why people get it wrong: examine your KPIs, and if you have bad ones, get rid of them, right? A lot of times people are looking for, I would say, illegitimate ROI on training. From a trainer's perspective, they're like, "Well, my boss is pressuring me to prove the value of what it is I'm doing." And sometimes they just grab for a KPI that isn't very helpful or might just make them look ... Test completion is really easy, right? Because it's like, "Oh, look, everyone took this test, everyone passed it, I did my job."

Or sometimes people try to reach even further in an illegitimate way. They haven't built up the data connections between how they design their training initiative and what result they're trying to improve. So, let's say you're trying to improve CSAT, or NPS, or CES, or productivity, whatever KPI you're trying to move. And they didn't spend the time really proving out their thesis. And they just illegitimately grab for KPIs, and that confuses things as well.

And you have people trying to claim credit for something that maybe someone in operations did, or that maybe has nothing to do with anybody, and everyone's just really happy that a new product came out. And so, they're sending you all these glowing reviews, right? And it has nothing to do with the training initiative or anything. But you'll see bad KPIs drive poor behaviors at work, because that's how you're measuring people's success.

And then, fourth, I would say, don't create training in isolation. It's really, really tempting. And a lot of people have training as this completely separate thing. But more than that, it's not just that trainers are typically organized separately from operations. It's also that they don't typically consider all of the systems that are operative in your operation today, right, that curb behavior. So, I would say trainers really have to be empowered with the resources and time to understand what all of these different systems affecting agent behavior are.

Because if you put any bonus or malus system in place, it's going to curb people's behavior. So, what are your policies, your procedures, your processes? Because the things that get you rewarded or get you punished at the end of the day are going to speak so much more loudly than a nice training program, even if it's very well put together. Because from the agent perspective, it's like, "That's all nice and well. But at the end of the day, this is how you evaluate me?" Right? And it gets back to that very understandable, primary thing.

It's like, "Well, if this is what you're measuring me on, this is the behavior you're going to get." And I think training needs to take a look at that and have a voice at the table when it comes to making those decisions. Someone should be able to say, "Hey, it doesn't matter what we do here, as long as you have this policy that penalizes them for doing this." Everyone's going to be super nervous about spending that extra minute, even if it's just a minute, to carefully listen to the customer's problems, or whatever it may be, right?

Meredith Metsker: Okay. So, I'm curious what are some policies that in your opinion would work better as far as measuring the success of training?

Tony Won: If we're talking... Like if I had a policy for measuring the success of training, I would say you should look very carefully, going back to the behavioral piece, and not be so tied to something quantifiable in some cases, right? It's very, very hard, especially the larger your teams get, to be able to pull out what is affecting what at any one time, to isolate efforts and what they actually do to the organization.

So, I would say, be less about those particular numbers and be more about positive behaviors that you want to see exhibited in your teams, right. Like if you look at, for example, some of the places where people say they received the best customer experience, it's not that they don't look at the numbers, but there's much less emphasis on the numbers and much more emphasis on big ideas. This is how we would like you to treat our customers. This is how we want our customers to feel as they interact with you.

So, instead of aiming for a certain number, like a call handling number or something like that, like, "Oh, you must resolve all of your calls in 15 minutes," or whatever, they'll say things like, "Did you do everything that you could to make life easier for the customer?" So, it's much more about these overarching ideas that will pull people's behaviors towards, in general, more positive customer outcomes.

Meredith Metsker: Okay. So, for support leaders who are maybe in the early stages of adapting the way that agent training is done and viewed, how can they go about tying those different policies to the bottom line? Because I'm assuming, they still have to report out, they still have to convince the C-suite that this is... the new agent training is effective. So, I'm curious what your thoughts are on that.

Tony Won: Yeah. Reporting the effectiveness of training. There's a lot of data to sift through to get to those answers, right? In general, one of the tips that I like to drill into my younger colleagues in the organization is to really work with your data teams to find out what the legitimate connection is between your customer service teams' performance and customer outcomes, right? So, in some cases, you're looking for loyal customers, for example.

And because you have loyal customers, they have repeat business, right? And then, what specifically is great customer experience, and service recovery from a bad experience with your company, et cetera, worth to the business? So, there's a lot of specific data and math that people will have to crunch through, but you want to be looking for those explicit connections between what's happening with your customers in your company's ecosystem. What happens when they experience a service failure or product failure?

Because usually right after that, they hit your team, right? What happened when they hit your team? And then, depending on whether that's a great experience or a poor experience, and specifically for what types of problems, what then is the effect on, let's say, retention for a customer, or spend for a customer? These are all different depending on the context, but you want to find those connections to the bottom line.

And so, when you're looking at training, you're saying something like, "We have done a lot of study. We know these sets of behaviors really help to, on average, improve customer outcomes, whether that's CES, NPS, whatever it is that you're using. And those have a positive effect on these numbers over here that are important to the business." Right? So, really understand the entire customer journey with your company, not just the very narrow, "Oh, this is the customer service experience. And we got a survey, and that's it."

But no, no, no, no. What was their experience before they got to you? And what was their experience afterwards? And what is their continuing experience, hopefully with your company? And I would say that those numbers are really important to find and to work out. And so, when you think about, let's say, if you're a trainer, you think about some of those attitude or behavioral changes that you want to make, you can find legitimate connections.

And it might not always be, again, super direct, and there might be a bunch of other factors. It's best to be honest about that. But I think if you do evaluations at all the levels of Kirkpatrick and you can make that business connection, that provides at least a robust picture that you're trying to pay attention to learning in the organization, how that's helping, and what's happening as a result. You might not always be able to have a one-for-one connection.

But if you have all these pieces of information, you can tell a story about the overall general positive effects, right? And you can look at even additional things like attrition. People who are intrinsically motivated and have that feeling satisfied, and a huge part of that is mastery or continuous learning, will be more satisfied. You'll have less turnover. And we all know turnover is expensive for many reasons. So, understand how to calculate all of those costs in an operation as well, because it's not immediately obvious in many cases that high turnover rates in call centers are expensive to the company.

Because a lot of people don't really spend the time calculating, "Oh, the loss of all the tribal knowledge for someone who's been here for two years. Oh, you have to retrain somebody. Oh, the lead time for hiring a new person." And then, you have to ramp them up, and all this other stuff. A lot of people aren't used to calculating those.

And so, in addition to what I already shared, look at those as well to make sure that you're proving your point and helping training be much more functional and practical for the org.

Meredith Metsker: Yeah, that makes a lot of sense. I wanted to go back to point number two. You were talking about adjusting teaching and training methods to account for people's different learning types, which I love, especially because I have ADHD, so I definitely learn differently from maybe more neurotypical people. So, I'm curious, what are some ways and methods that support leaders can adjust their training style?

Tony Won: I think it's about having the expertise within your organization. Something I like to do is hire people who aren't from the specific context that you're in. A good example of this is, I've had friends do this, and I've hired somebody myself who has a Ph.D. in clinical psychology, for example. Or, you hire a behavioral economist who has worked in, I don't know, hospitality, but is working in banking right now, right? They have a completely different context. But they have a very helpful and deep skill set, real expertise.

And because they're not constrained by the typical best practices in the industry that we're in, they have no problem asking the question, "Why would you do it this way?" Right? Or, "How about we try something new?" Or, "That doesn't seem right to me." They don't have the baggage of, "I've been doing this for 20 years and this is the way that we always do it. And I've given so many speeches. And I've been a guest on Meredith's show." And all this other stuff. They don't have that baggage, and sometimes, frankly, that ego, right?

They feel a little out of place. They're like, "Um." Maybe they feel like they have some imposter syndrome, like, "I've never really done this before. I have this other expertise." But I think those slightly unorthodox hires, people who have very useful, deep skills in adjacent areas, can really help out, right? That's one. The second would be a practical example. Usually, training is built top-down. Some executive gets angry about something, or a client gets angry about something. They want something fixed.

And then, the training team gets put together, and they start building this thing. And then, they deploy it, right? Something that I've done quite a bit in the past, and encouraged a lot of my teammates to do, is like, "Well, hey, hold on a second. You have a need, you have an idea. Why don't you get a bunch of people from operations, the agents themselves, and instead of just this top-down thing, you go bottom-up as well and you meet in the middle."

And so, you have the strategic concerns and the business concerns that are coming from various people. But then, you also have the very valuable information of the population that you're actually aiming to convince that, "Hey, this is a good thing to do. Or, this is a new practice that you should adopt." And having them be part of the creative process for training, they can say, "No, that won't work. We actually do it this way, and this is why we do it." And the trainers are like, "Oh, we've never thought of that before."

They're the ones actually doing the work. And they have so much valuable information to offer up to trainers and curriculum designers and instructional designers, whoever is building the material, that I think a lot of people would do well to consider. And then, there's also just the practical, nitty-gritty stuff where you're like, "Hey, don't just think of classroom training." Right? Think about the age group of the people that you're trying to train. Cool.

How do they best consume content? When they want to learn something themselves, where do they go? They go to YouTube. It's like, cool, why don't we do some YouTube videos? There are all of these different types of media and different ways of communicating that training teams can take better advantage of than, let's say, the historic tendencies of the discipline.

Meredith Metsker: Yeah, I love that. I know for me, the hour-long lecture in the lecture hall from college 100-level classes didn't work as well as the more step-by-step, hands-on... like, I learn a concept, I go apply it. I learn the next concept, I go apply it. That definitely works better for me.

Tony Won: Right, right. That's cool.

Meredith Metsker: And your comment about making sure that training is both top-down and bottom-up, it reminds me of point number four, where you said that all the right people have to be in the room. You can't just make this training in a vacuum.

Tony Won: Yeah, that's absolutely correct. It's really vital, because we're talking about people, right? And it's so funny, because we're in customer service. So, it's like, "Oh, hey, have a conversation with the customer, ask them for their opinion. We're going to take all of your feedback and make our company better." But we don't want to do that internally, for each other. It's so weird to me.

It's like, "Wait, just wait a minute, these people are valuable. They talk to customers all day long, like day in and day out. We should probably talk to them." Right?

Meredith Metsker: Yeah, that's probably a better way to get buy-in.

Tony Won: Yeah, it's funny to me that there's so little of that sometimes... It's on the verge of comical, right? It's laughing as self-defense, because it's too sad otherwise.

Meredith Metsker: It's like one of those things where it's funny, but it's not.

Tony Won: Yeah, yeah. Yeah.

Meredith Metsker: So, I'm curious. How have you improved agent training within your teams? Can you give me some real-world examples and success stories?

Tony Won: Sure, yeah. Coming from gaming specifically, I just borrowed a lot of ideas from the general industry, I would say. One of them, for example, would be the idea of user-generated content in gaming. We were just talking about this, right? Instead of just top-down, it's bottom-up. So, a lot of times, your agents can create some of the best practical short trainings that other people find useful. Think about the concept of tribal knowledge, for example.

You could ask anyone on the floor, "Dude, if you had a problem, who would you talk to on the floor about this?" "Oh, Samantha over there, she knows everything about that. Everyone goes to Samantha. If you have that tech problem, she's the one who helps you." And a lot of times, people will be like, "Oh, well, you know what Samantha does." But it's not just about what Samantha is doing. It's like, "No, no, no, no, just get Samantha to create some stuff." Because she probably already has it in a folder and is just sick and tired of telling people the same thing. She's like, "Just read this document."

So, it's about borrowing that concept of user-generated content and really trying to find practical, fast ways of involving your agents in the creation of training or just simple knowledge materials, and not outlawing that or making people hide it from you. In practical terms, some people use the KCS methodology. Having software that supports that methodology in real time is really, really important.

If you can build or find a KB out there that helps your agents and empowers them to contribute, change information, and add their knowledge to the knowledge base as quickly as possible, that's really cool. And make sure they're also involved in delivering training. If some of them happen to be really good practical trainers, which a lot of them probably are, just empower them to give trainings to their teammates.

So, maybe there's a couple of seniors on your team, and maybe you have some program where you can get certified and be like, "Yeah, I'm a certified senior agent for these issues. You're on my team, I could totally teach you about that, just ping me, not a problem." Instead of having to register for a course, or wait three weeks or whatever until the training comes out, or search this ridiculously large database where everything's very sterile and difficult to find, and the knowledge might be outdated or whatever.

The other one I would say, along with that, is the peer-to-peer concept. So, you have those tools, you have people involved in generating the training. But I would say also, training each other, sharing knowledge and information and helping each other out on the job, is so natural to any workplace, right? So, instead of pooh-poohing that natural function, find ways to integrate it into how your teams actually function.

So, if you have a team of 15, for example, how can you help people understand how to get help from each other and incorporate that tribal knowledge into a much faster way for people to ramp up and learn on the job, versus always having to go through something that's super structured where you have to register and all this stuff. It's like, "No, you have X number of teammates designated to this stuff, and they can help each other." Right? And it's just this natural function. And how are you going to capture that training and learning?

Because then, the question changes, right? It's not how do you stop that. It's, well, how do you facilitate it, and how do you capture that data so that you can get a better understanding of what's actually happening between those people? Because there are all these little small ways in which we learn from each other constantly as we're working together. And that's often disregarded or not well understood. And I think teams suffer for it, too.

They go, "Oh, training other people is not my job." And it's really easy just to brush it away. But instead of waiting for that course that's a week later, or never finding it, it's like, "No, you could just ask Jim, and Jim will take five minutes and show you what's up, then help assign the next 10 of those interactions to you, and check up with you again afterwards."

There are just all these interesting, more casual and organic ways of learning that I think are possible. So, it's less of a strict process and more of how do you help encourage the natural processes that happen. Because those are the most useful and practical, right?

Meredith Metsker: Right. Yeah. I suppose that probably ties in with point number three you made earlier about not being overly reliant on KPIs or not using the wrong KPIs. If people are so worried about constantly meeting these goals, then they're probably not going to want to make the time to help their peers out.

Tony Won: Yeah, of course not. If you're restricted and you've got a certain number to hit, even if you hit it regularly without a lot of stress, just the fact that it's there is driving that stress, like we talked about earlier, right? The unfortunate side consequence of that particular KPI is that it's not in my interest to help my teammates in a given cycle. What environment are we creating for people where it's literally not in your best interest to help your teammates get better, right? I don't know.

Meredith Metsker: So, I suppose as a support leader, as you try to facilitate this more open environment for more casual training, that's a combination of making sure you have the right KPIs and that you're maybe flexible with them. But also, I imagine, hiring the right people with the right mentality.

Tony Won: Yeah, yeah, that's spot-on. Hiring is so important. I just joined a startup not too long ago, and it's all I'm doing these days. And it's just so important to look at the attitude of the people that you're hiring. A lot of people may literally have the skills and competency to do the job well. But there's so much to be said about people who are really striving for excellence in everything that they do, even if it's just a transitory job, right? Like, "Oh, I know I'm not going to do this forever."

I always tell people, "Just be honest with me, that's totally understandable and cool." For example, this might be an entry-level position you're applying for. You're only really going to be in it for a year or two while you finish college or whatever. That's completely fine, just so long as the trade-off is you get a job and you can learn a bunch of skills that are going to be helpful for your next step in your career. But in exchange for me really putting energy and effort into helping you grow and do really well, please come with that winning attitude of putting your all into it.

You're not going to be deterred by, maybe, the office environment or seemingly the way things are, and you're going to push for change and try to make a positive impact every day. And I think that is a very fair trade, and it's completely fine just to be upfront about that stuff. But yeah, hiring for attitude is extremely important in making this work, right? Versus if you're just going to come to work and be like, "Oh, I'm just going to do whatever is on the sheet."

I think there are deeper, fundamental issues with motivation there that I would love to talk about before we even get to any of the things we're talking about today, right? I think there's a problem there. And it's sad for me to encounter people like that.

Meredith Metsker: Yeah, for sure. I'm curious, for you as a support leader throughout your career, has there been a time or two where you've come in as the leader, you've seen that the training process was not ideal, you made some changes, and then saw some results from it? And just curious if you've done that, how it worked, and then what your results were?

Tony Won: Sure. One example I'll give is... It's a very long story, though, so I'll just try to hit some of the summary points. Training was done in a very typical and traditional manner that we've highlighted a few times during this chat, and I came in to help people do this bottom-up thing a little bit, or do training a little bit differently. So, one example was implementing a whole different channel.

Traditionally, if we had taken the historic approach, people who were separate from the operations would have come in and created this whole thing, massaged it like crazy, polished it, and then deployed it out to the operations. And it might have had various levels of success, right? But instead, if you had to summarize the breakdown of the work, it was more like 10% management/top-down and 90% bottom-up. And the result of that, at the end of the day, was the fastest adoption of that particular chat channel and also the fastest speed to excellence, which was measured by hitting 90%-plus satisfaction ratings from customers on that channel.

So, a number of different offices had already deployed that channel. But by the time it was our turn, we took that approach, and the credit really goes, again, to the 90% of the work that was bottom-up: the people in the actual operation doing the work, helping figure out, "Hey, how do we roll out this training, this new way of interacting with customers, and this new tool, better?" The credit really goes to them, right? They did a fantastic job and got stellar results, the best across a number of different teams and a number of different global locations.

The other one would be a similar story, and this one was much harder to do. There was an initiative around trying to move away from a typical call-center focus on, let's say, more traditional KPIs, and toward these big behavioral buckets that we wanted people to really understand. And so, it was much less about rigid processes, and much more about, these are the general ways we want you to approach these problem sets, and you can make whatever decision you think you need to, right?

So, instead of only best practices, where you must do it this way and there's no other answer: here are some areas where we know what the best practices are, and here are other areas in which it's not about best practice, it's about good practice. So, here are some general approaches that we know to be good and we understand, but the actual literal outcome can vary depending on your judgment, and you're completely free to make that judgment and make that decision on the spot for the customer. And so, that was a lot harder. The initial proposal was rejected by the agents themselves. And so, we came back.

We were working across teams, and I said, "Well, how about this instead? How about we come back and we join forces? We're going to send over a couple of people from the actual operation to work together with folks over at HQ to figure out how we can make this a little bit more effective for the agents." And that was a very long process, but at the end of the day, the results we were looking at were a long tail of improved customer satisfaction surveys. It wasn't an overnight thing. It definitely took time. But, yeah.

And I would say that there's still some challenges in that area, because again, you have some conflicting systems that still exist within the team. But overall, it was fairly positive for the people who went through the program. Yeah, so, those are two examples.

Meredith Metsker: Awesome. Thanks for walking me through that. I think that's probably a good place to start wrapping us up here. But before I ask you my final question, is there anything else on our topic that you would like to add that we haven't covered yet?

Tony Won: No, no, I think we've had a really good conversation on it.

Meredith Metsker: Agreed. Okay. So, the last big broad question here. But generally speaking, what advice do you have for up and coming support leaders?

Tony Won: There are five I typically give if you're just starting out. One is that it's very tempting, because people will have a best practice, again, like the seemingly right answer for everything. But if you oversimplify your problems, you're going to get the human outcomes wrong, right? So, try not to oversimplify things. Dealing with people and behavior is very, very complex. But at the same time, try not to overcomplicate things either. Don't over-engineer; instead, just focus on, I would say, the adjacent shifts that you think the teams can make.

Respect how hard good leadership is, and depending on your size, make sure that you're moving at the appropriate speed. Two, I would say, pay attention to how all of your systems, again, your policies and procedures, et cetera, and your KPIs are putting pressures on your team, and what behaviors that's causing. If you're in charge, just remember that the dysfunctions you see are most likely at least partly your fault. So, pay attention to those. Pay attention to what you get mad at, et cetera.

Three, I would say, keep your discussion focused on the outcomes that you want, and really try to avoid micromanaging your people. It never helps. I'm sure you've seen plenty of operations out there where they come with 10 million different rules, and then no one ends up following any of them. Or they just give up and go, "Fine, I'm going to follow every single one of these rules," and your customer experience is going to be garbage, right? So, just avoid that.

Four, I would say, it seems painfully obvious, but people forget it a lot: your customers are the most important people to your business. Fight for them. Make sure you represent their voices in your company. Don't back down. You probably spend the most time in your company talking to customers, maybe besides the CEO, so you really should represent them well within your company and stand up for their interests. That's your job. Don't lose sight of that.

And then, lastly, make sure that you understand your connection to the top-line business outcomes as soon as you can, like we said earlier, right, making those business connections through data. Learn some basic statistics, basic business principles, accounting; learn how to handle data. All of that's going to be super-duper important to you, because if there's one thing we have tons of in customer service, it's customer data.

It's just everywhere, but a lot of it isn't put to great use. And I would say there are massive opportunities for anyone paying attention to help the business by actually understanding all that data and feedback, and getting it back to your product teams.

Meredith Metsker: Awesome. I love that. Some good advice. All right. Well, that's all the questions I've got for you, Tony. Thank you very much again for taking the time to talk with me today.

Tony Won: Thanks, Meredith. I really appreciate it, and it's been fun. And it's always an honor to speak to people who are thoughtful like yourself and have questions. So thank you for having me.

Meredith Metsker: If anyone listening or watching wants to learn more from you, or maybe contact you, what's a good way for them to do that?

Tony Won: Cool. Yeah, it's totally easy to ping me on LinkedIn. Just remember that the LinkedIn inbox is not designed well.

Meredith Metsker: Oh, yes.

Tony Won: So, that's gross. But you can also go to my blog at moarcustomers.com. It's moar, spelled M-O-A-R, then customers.com. You can find my email and contact info there, so you can just email me personally. Sometimes I do get busy, and it might take me a few days to respond, but I usually try to respond as quickly as possible. So, hit me up on either of those two channels, and I'll be happy to have a conversation.