In Part 3 of our series, Bill speaks with Kevin about choosing outside coaches and advisors, and where technology and AI fit into the set of priorities for HR leaders.
Bill Glenn: How do you choose and how do you decide when to use an outside coach or an advisor for your senior leaders? What does exceeding your expectations look like? What’s the value proposition that you want to see that separates average from good, from great?
Kevin Cox: I look at an outside advisor the way I think most people look at a personal trainer in the gym, and by that, I mean most of us know what we’re supposed to do to live a healthy life.
Probably most of us don’t do it. And one of the things a personal trainer can do for you is not only work on your technique, so you reach your goals without hurting yourself or doing the exercises the wrong way. But the other thing they can do is make you feel more accountable to your own goals and your own plans, and they’ll let you really have it if that’s not the case.
When I think about executive coaching, executive advice, strategic advisors, I really think about it the same way. Now if we had a son or a daughter who had some natural talent in tennis or dance or golf or whatever, and they were really naturally good and we had the resources to do it, most of us would go get a personal coach or a trainer to help that son or daughter be all that they could be, right? To take advantage of their natural talents and build on them. So why would we not do that for executives? Not remedial coaching when something’s broken, but maximizing-potential kind of coaching. How can you take your natural given gifts and talents and make those better, stronger, faster than you can make them on your own?
That’s to me a use case for coaching. I think using coaches for broken things or broken people is a less noble cause. I think we ought to be able to do most of that internally and on our own. But I think maximizing-potential coaching is a terrific use case for third-party coaching.
When I think about the difference between good and great coaches, and I’ve been around many, I think of three things. The first one is that great coaches really do appreciate the context of the engagement. I love it when a coach comes in with a playbook or a protocol, just like a personal trainer does. I actually think that’s essential, versus a total freeform blank sheet of paper.
However, I think the coach needs to adapt that playbook to the context of that executive and what’s going on in the business right now. So, the best ones I’ve seen, they do that really well. They take their playbook, and they adjust it. They might overweight something, underweight something, reorder something because of context.
Number two, just like a personal trainer, they are tough and direct. They are not afraid to go right between the eyes and say, you know, you’re resisting this. The 360s are crystal clear that this needs to happen. Your leader is crystal clear on this, and you are rationalizing, or you are kidding yourself, or you are avoiding it. But I’m going to tell you, your organization thinks you can be terrific. This is in the way, and we need to fix it. Or: you can be so much better at this, so let’s double down on it and see how much we can do in terms of your strategic ability to elevate, or your ability to multitask, or the way you balance direction with respect, or whatever that might be. So tough and direct is the second thing.
The third one is they have to be connected to the leadership of the company. I do not buy, “Hey Bill, I’m your coach, this is just going to be between you and me. We’re not going to talk to your leader about this because this is about a trusted relationship. I’m here for you.”
I think that’s a cheap way out, in that it guarantees no sustainability, and it guarantees that you’re not really connecting that manager or executive into their organization. Great coaches find a way to middle that. They don’t betray the trust of the person they’re working with, but they say, “We need to contract on how we talk to your leadership about this. Let’s you and I agree on how to do that. So, what I say to them, you are going to know about, but there are some things that we can keep private between us.”
So that’s good to great for me, those three things.
Bill Glenn: And thematically, what insights on the organization would be critical for the coach to bring back to the CHRO or the head of talent?
Kevin Cox: That’s a bonus. That’s related to that first point I was making about context. So, you were exactly right. If you want to take the meta view of this whole thing, you might be thinking about one coaching engagement, but that coach sees the organization in a way that management either might not see or might not want to see. So, there are definitely context indicators that are useful there.
Bill Glenn: You touch on technology in the model, and maybe you can expand a little bit or just explain what your thinking is there. And then also, where does AI fit into your set of priorities, and what’s the use of AI?
Kevin Cox: Well, in the model, we talk about it. And this is one of the changes, Bill, as you kind of pointed out, one of the differences between chapter two and chapter one. I wasn’t trying to get at AI through this necessarily, but I was trying to acknowledge the fact that most of us are working in a more digitally oriented ecosystem right now.
Most of us are doing more things through mobile phones and Workday and other systems of record that are more digitally oriented than they used to be. We need to acknowledge that. Most of us are dealing with shared services or some sort of operational environment that we ought to continue to make more accurate, more efficient, and lower cost.
I was just trying to dignify the fact that that’s the ship we’re all on right now, no matter what company you’re in or what industry you’re part of. And I just feel like we need HR to acknowledge that that’s not the problem of the IT organization. That’s something we need to manage, and we need to understand it better than some of us do. So that was that.
AI is in the early innings, and I don’t know where it will land. I watch this space a lot. I read something the other day that I thought was really profound: AI furthers human knowledge but not human understanding. It currently has the ability to further knowledge because it’s taking massive amounts of data that a human being simply cannot process, organizing and structuring it, and bringing that to you. That’s knowledge.
At the moment, AI hasn’t figured out how to make sense of that, and that’s what understanding means. So it’s just a bunch of data, a bunch of information, a bunch of correlations, and that’s worth something. But the reason I’m not quite sure where it’ll go in HR, and I don’t want to come across like a Luddite here because something meaningful will happen from this, is that we live in a world of understanding.
We are trying to interpret data and have it make sense. Judgment is a large part of what we do. Take interviewing, right? People have been interviewing for hundreds of years, and still it’s only a little bit more predictive than a coin flip. We know that. Is AI going to meaningfully change that? I’m not sure. It could; it may not. So, to me it goes in the top right-hand corner of the model. There’s something called the trend watch list, which is like a parking lot for subjects like AI. We just keep studying it. We keep thinking about it, and at some point, that could form the basis of a chapter three of this model, when it gets large enough or significant enough that we promote it and incorporate it into the work.
It’s too early for that to happen, but it’s a very important space to watch. That’s at least how I think about AI.
Bill Glenn: Yeah, for now, I reject the idea that AI is useful for making the decision itself, but clearly there’s some element of the data it feeds us that can make judgment or evaluation more exacting.
Kevin Cox: IBM came out with Watson Health a few years ago. They spent an enormous amount of money, and you’d have to look long and hard to find more talented engineers than you could find at a company like IBM. And to me, that point about furthering knowledge but not necessarily understanding, that’s a good example of it. They digested everything that had been written in journals and elsewhere about, for example, cancer, and how to think about the diagnosis and treatment of cancer.
And you would think that that exercise would’ve resulted in very, very meaningful improvements in cancer outcomes. It didn’t, or at least it hasn’t yet. And any oncologist you talk to would say, “Well, there’s a judgment and a pattern-recognition element here, based on my experience, that is non-zero.” It’s meaningful. Some think it accounts for a hundred percent of success and outcomes, some think 20, but it’s some percentage. I think you could think about talent the exact same way. Love the information. It might rule out some really bad practices and point us in the direction of better practices. But people who are looking for instant coffee, you know, push this button and here’s the outcome? I don’t believe that’s the most likely case.