Rethinking What Commercial Banks Measure

August 20, 2018 Jim Young

Commercial banks already measure plenty of things in their efforts to optimize performance. But are they measuring the right things? And, perhaps just as important, are they ignoring the right things?

Dallas Wells and Jim Young explore those questions in this week's episode of the Purposeful Banker. 


Helpful Links

Immeasurably Important

The Default Settings That Can Help Your Commercial Bank

Why Benchmarking Is a Bad Idea

Podcast Transcription

Jim Young: Hi and welcome to The Purposeful Banker, the podcast brought to you by PrecisionLender, where we discuss the big topics on the minds of today's best bankers. I'm your host, Jim Young, director of communications at PrecisionLender, and I'm joined again today by Dallas Wells, our EVP for international operations. Today we're going to talk about what you measure at your bank, what you choose to ignore, and why sometimes the latter may be just as important as the former.
 
We're basing a lot of our discussion on the article "Immeasurably Important" by Morgan Housel, partner at the investment firm Collaborative Fund. We'll include that piece in our show links. Let's start off with the opening anecdote from the piece. It's about Robert McNamara, the former Ford whiz kid who went on to become Secretary of Defense, and how he measured US success in the Vietnam War. Quickly, what's the moral of that story?
 
Dallas Wells: Yeah, so McNamara had great success at Ford, then took the job as Secretary of Defense and tried to apply the same principles, which were basically: measure everything, and use that data to guide your decision-making, find efficiencies, and gauge your success. He applied those principles to the Vietnam War, measuring things like the number of weapons captured, the number of enemy soldiers captured, and the number of soldier deaths, and comparing those figures for the Viet Cong versus the United States. By all of those metrics, there was no way to argue that we weren't winning the war, and winning by orders of magnitude.
 
The anecdotal number they use is 10-to-one: 10 North Vietnamese soldiers killed for every one American soldier. McNamara just could not figure out why the war was being experienced so negatively, and portrayed so negatively, here in the US. The point that was made to him was that Americans don't care about the 10. They care about the one. Some things are just immeasurably important. That was the title of the article and the moral of that story: some things are way more important than others, and they will tell the entire story, no matter what the rest of your data might say. It's really important to get those right.
Jim Young: Yeah. It's interesting. Just earlier I was talking with our CEO, Carl Ryden, and he was telling me the story of when IBM switched from their green-screen monitors to color monitors. He said they tried to do an ROI case for it, and they couldn't really come up with anything other than "people are just going to like this a whole lot better." That turned out to be right, but if they'd gone strictly by the measurements, the measurements would have told them, "Don't bother doing this."
 
Dallas Wells: And Apple did the same thing with the graphical interface and the mouse. New things like that are different, and they're really, really hard to measure, but also really, really important.
 
Jim Young: McNamara's errors are clear in hindsight, but how does this apply to banking? We're not out there trying to win hearts and minds, or are we?
 
Dallas Wells: Well, it's interesting you phrase it that way. I think maybe in some ways, banks are, or maybe they should be. Let's start with how this applies. Every bank that we come across, every bank in the world, is trying to make sense of, make use of, and make value out of their data, these big data piles that everybody's sitting on. There are new technologies and new approaches that are unlocking that data and giving banks the ability to measure things they could never measure before. Their reaction to that is, well, if measurement is good, then more measurement is better, right? The number of reports and tables and charts and dashboards that we see in most banks has just exploded.
 
There's now a whole lot of noise out there, and buried in that noise is some really insightful, really important stuff. What banks are not yet having much success figuring out is how to sort through all that noise, how to get rid of the stuff that's not important and find the things that really move the needle. That's where we get to the hearts and minds thing: it's a numbers-based business, and it is very data-driven, but when you get right down to it, you are selling, yes, a commodity product, but it is a financial product.
 
Almost like that 10-to-one that McNamara dealt with, there's an outsized importance to it, right? The mortgage on their house, their personal checking account, the cash flow of a business they've started and grown with their blood, sweat, and tears: these things carry an emotional attachment. Mistakes that banks make are really compounded, and the things they do well are seen in a much warmer way than you would expect when it's all math-based. Banks face, I think, a very similar situation here.
 
Jim Young: You kind of touched on it: everybody, our company included, is working on these systems to tap into that data and, like you said, to provide measurement or insights in areas we haven't had before. Does this point out a flaw in the system, though? I guess I'm trying to figure out ... On the one hand, it feels like we're saying it's important to figure out which are the right numbers to focus on, but at the same time, going back to the McNamara example, it feels like we're saying sometimes there aren't going to be numbers attached to things that are important.
 
Dallas Wells: A couple things there. One, why are you doing AI and machine learning and the deep quant stuff? There has to be a why behind going through that effort. For too many banks, the why is efficiency and cost savings. They view those things as potential replacements for human labor. What are the jobs people are doing that we could write an algorithm to do, or that an API could do just as well, maybe more accurately, and more cheaply? I don't think that's the real value. That's not how we're trying to design the value in our software, and that's not where we see the real potential.
 
The potential instead is to keep that human in the loop, and use the AI and the machine learning to augment them. As Carl Ryden puts it, we're not trying to build robots. We're building Iron Man suits. It's a way to amplify the human who's still fully in control, because humans are pretty good at the art part of this, at figuring out which things are important. You have relationship managers out there who can say, "My customer's going to hate this. I'm going to lose my customer because of this thing." It can be something as small as a five basis point difference in rate, but it's the difference between 4.97 and 5.02.
 
The human can understand that flipping to a five handle has an outsized emotional importance that really matters, whereas the software is just going to do the math. Yes, the tech can get smart about those things too, and eventually maybe we get there. But it's not today, it's not tomorrow, and it's not right around the bend. That human judgment is still really important, and we have to keep humans in the loop. It's about the goals, and it's about how you design these systems. Yes, you'll get efficiencies, but I don't think you should start designing them by looking for ways to cut people out of the middle.
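The "five handle" point is the kind of rule a pricing screen could at least flag, even if weighing its emotional importance still takes a human. Here's a minimal sketch of that idea; the function name and logic are illustrative assumptions, not PrecisionLender functionality.

```python
import math

def crosses_handle(current_rate: float, proposed_rate: float) -> bool:
    """Return True when a rate change crosses a whole-number 'handle'
    (e.g., 4.97% -> 5.02% crosses the 5% mark), which customers often
    perceive as a bigger change than the raw math suggests."""
    lo, hi = sorted((current_rate, proposed_rate))
    # A handle is crossed when the two rates fall on opposite sides of an integer.
    return math.floor(hi) > math.floor(lo)

# A five basis point move that crosses a handle gets flagged for a human look;
# a same-size move that stays under the handle is "just math."
print(crosses_handle(4.97, 5.02))  # True
print(crosses_handle(4.90, 4.95))  # False
```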
 
Jim Young: Yeah, but going back to another analogy we've used before, the Moneyball aspect of this, it is a balancing act, right? In this article, and in a lot of our discussions and our content, we're advocating for that: "Hey, you've got these predictive tools, but you've got to marry them with human judgment." That augmented intelligence, or centaur, or whatever analogy we use nowadays says, "Don't forget about the power of human judgment." In this case, the numbers say we're killing them 10-to-one, but our guys on the ground tell us their morale is a lot higher than ours. Going back to our old days, in baseball it was the opposite: my eye test tells me this guy looks like a better player, but the numbers actually tell me otherwise. You can't skew too far one way or the other, it seems to me.
 
Dallas Wells: Right. Yeah, I think that's exactly right.
 
Jim Young: Yeah, but then we get into the whole matter of ... I've got to tell you, by the end of this article I was thinking, "Maybe we should just scrap the machines and all go with our gut feeling," because it started talking about how the very act of trying to measure something can affect what you're measuring. I think Heisenberg is involved with this, but we've certainly seen it with incentive plans at banks. You measure people on how much they grow, and by doing that, you alter their behavior to really, really grow things in a certain way. How do you factor that part into it?
 
Dallas Wells: Yeah, that's a tricky one. It's one we see banks wrestle with. Heck, we're wrestling with it right now as we talk about measuring our own sales activity. What you really want to measure is outcomes, of course, but you also want an idea of what things cause those outcomes. Well, you can say, "These specific types of meetings lead to sales," but if you incent those meetings, guess what you're going to get? A whole bunch of meetings. Those are the kinds of things you have to be aware of as you design and measure these things. Again, I think where banks have some work to do, and one of the things we spend a lot of time talking about, is that banks are good at presenting the data. They measure these metrics, and sometimes they'll build incentive plans around them, but that's where it stops.
 
You actually have to close that loop: look at the metric you showed or the incentive you gave, and ask what behavior it changed. That's something data science is pretty good at that banks have yet to fully adopt. Here's the thing we put on the screen in front of our banker. What did they decide to do because of it, and what was the outcome? You have to have good outcome labels, and you have to capture what the behavior was, and then you start to link those things together. It takes doing it at scale. It takes doing it with some discipline. The very largest banks are getting smart about this, and we see a few that are starting to do it. It's a very powerful tool when you take all that data and all that insight, tie it to behavior, and measure those outcomes.
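A rough sketch of what that closed loop could look like in code: record what was shown, what the banker did, and what the outcome was, then compare results when an insight was acted on versus ignored. The field names and structure here are illustrative assumptions, not a description of any bank's or vendor's actual system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class CoachingEvent:
    insight_shown: str   # what was put on the screen in front of the banker
    action_taken: bool   # did the banker act on it?
    outcome_won: bool    # outcome label, e.g., did the deal close as targeted?

def win_rates(events: list[CoachingEvent]) -> dict[str, dict[str, float]]:
    """For each insight, compare outcomes when it was acted on versus ignored."""
    buckets = defaultdict(lambda: {"acted_on": [], "ignored": []})
    for e in events:
        key = "acted_on" if e.action_taken else "ignored"
        buckets[e.insight_shown][key].append(e.outcome_won)
    return {
        insight: {k: sum(v) / len(v) for k, v in groups.items() if v}
        for insight, groups in buckets.items()
    }

# Example: two showings of the same insight, one acted on, one ignored.
events = [
    CoachingEvent("Suggest adding a deposit account", True, True),
    CoachingEvent("Suggest adding a deposit account", False, False),
]
print(win_rates(events))
# {'Suggest adding a deposit account': {'acted_on': 1.0, 'ignored': 0.0}}
```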
 
That's the full loop. That's where you actually start moving things in the right direction, and where you can adjust course when they go in the wrong direction. That's the thing that's going to be hard for smaller banks to replicate. We've talked a lot about scale and M&A and why it's difficult for small banks, and it has typically come back to the fact that they just don't have enough scale to cover all the overhead burden and the regulatory costs. One of the things we've started thinking and talking about is that maybe this aspect is actually more important: it's the data scale, more so than the physical infrastructure scale, that's going to be the real difference.
 
When you have a large, trillion-plus-dollar global bank with thousands or millions of interactions, and they can do that sort of measurement (what did we incent, what did we show, what did we measure, what was the behavior, what was the outcome?), they can steer that thing really fast, and all of a sudden they're the Amazon of banking, showing customers exactly what they want to see when they want to see it. Meanwhile, you're back trying to do it by gut feel. So to your fear that maybe we should just stop measuring: I think those that do it that way are doomed to be left behind. Eventually, they're going to have to figure out a way to shortcut that.
 
Jim Young: I feel like you might have answered this already, but I'm going to go ahead and ask it. If it has been answered, you can just go with "asked and answered," and I'll do the show wrap, and we'll move on.
 
I love this one line toward the end, where he says, "Gathering information is a science. Filtering out noise is an art." As we've talked about, there's the mining of data for insight. If you're looking at that process, you mine the data for insights, and then you deliver those insights to the people at the bank who need them most, at the moment they need them most. Where does that filtering happen exactly? You've talked about it a little bit from a behavioral data aspect, but is that where the filter happens? Or is there a filter above that, one that says, "Here's what determines what we're going to look at"?
 
Dallas Wells: That's a really good question. We actually spend a lot of time on this, and I'll get a little bit into the functionality of our software to explain why. We have a bot named Andi, and Andi essentially delivers coaching for relationship managers at a bank. But we have physical constraints, and we also have human attention constraints.
 
Andi will pop up on the screen with coaching insights for a banker, but there might be ... If you're talking about a complex transaction, with multiple millions of dollars, multiple credit facilities, deposit accounts, fee-based income, an existing relationship, related entities, all kinds of stuff going on, there could be literally thousands of things we want to tell that banker right now. Andi has a lot of things to say and is going to be bursting to say all of them.
 
We have to figure out how to filter through that and say, "All right, we've got realistically maybe seven to 10 things that we can tell that banker." More than that, and it's either not going to fit on the screen, or they're going to stop reading, whichever comes first. The starting point has to be, "Which thing is most impactful? Which thing is realistic?" We could always say to every banker, "Hey, raise the interest rate on this loan by 3.5%, and voila, your deal looks awesome." But is that realistic? Can that realistically happen? You have to weigh the odds of it actually working against the impact, so there's a little bit of a formula there. Some of this is art, some of this is science.
 
As for your question about where the filtering happens, I think there's a human element to it just to get started. Figure out which metrics really move the needle in whatever you're doing. What are the two or three most important things? Start there. Then every time you want to add something, you have to do the arithmetic of: is this worth adding? Is it more important than one of the three or four things we've already got on there? Just know that there is limited capacity for the things on which we can make decisions. It goes back to what we talked about with incentive plans, right? If you have 25 metrics that you use to calculate bonus payments to your loan officers, you might as well have none, because it's too complicated.
 
The same principles apply here. You can use some human judgment, but really the formula you're going after is how realistic it is times the impact. That's your biggest expected value. Some of that can be done with tech, some of it with people, but that's the process. Where it happens, that's where this gets really hard, because it's got to happen at scale. You have to design it, and then you have to use technology to leverage it. That's what all these banks are going through right now. When they say digital transformation, that's essentially what they're talking about. They're saying, "How can I tell my front-line people the things they need to know, in a way they can consume, with the noise filtered out, and how can I do that at scale?" Welcome to the multi-billion-dollar problem. Those who solve it are going to do very well for themselves.
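A minimal sketch of that "realistic times impact" filter, assuming each candidate insight already carries an estimated likelihood of being acted on and an estimated impact. The scoring and the cap of roughly seven to ten items come straight from the conversation; the data shape and names are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Insight:
    message: str
    likelihood: float  # odds the banker can realistically act on it (0 to 1)
    impact: float      # estimated effect on the deal if they do

def top_insights(candidates: list[Insight], limit: int = 7) -> list[Insight]:
    """Rank by expected value (likelihood * impact) and keep only what fits
    on the screen and within the banker's attention span."""
    ranked = sorted(candidates, key=lambda i: i.likelihood * i.impact, reverse=True)
    return ranked[:limit]

# Example: a realistic small win outranks an unrealistic big one.
candidates = [
    Insight("Raise the rate 350 bps", likelihood=0.01, impact=100.0),
    Insight("Add a deposit account to the relationship", likelihood=0.6, impact=20.0),
]
print([i.message for i in top_insights(candidates)])
# ['Add a deposit account to the relationship', 'Raise the rate 350 bps']
```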
 
Jim Young: Yeah, absolutely. All right, great. Well, that'll do it for this week's show. A reminder, if you want to listen to more of our podcast, check out more of our content, you can visit our resource page at precisionlender.com, or you can head over to our home page to learn more about the company behind the content. Finally, if you like what you've been hearing, make sure to subscribe to the feed in iTunes, SoundCloud, Google Play, or Stitcher. We love to get ratings and feedback on any of those platforms. Until next time, this has been Jim Young and Dallas Wells, and you've been listening to The Purposeful Banker.
 


About the Author

Jim Young

Jim Young, Director of Content at PrecisionLender, is an award-winning writer with experience in a range of positions in media and marketing, from reporter to website editor to content marketer. Throughout his career Jim has focused on the story: how to find it, how to understand it, and how best to share it with others. At PrecisionLender, he manages the many ways in which the company shares its philosophy on banking and the power of relationships. Jim graduated Phi Beta Kappa from Duke University and holds a master's degree in journalism from Columbia University.
