In this podcast, Motley Fool host Deidre Woollard talks about the art of failing with Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School, and author of Right Kind of Wrong.

They discuss:

  • The complex failure at Boeing.
  • What to do after something goes wrong.
  • The problem with "move fast and break things."

To catch full episodes of all The Motley Fool's free podcasts, check out our podcast center. To get started investing, check out our quick-start guide to investing in stocks. A full transcript follows the video.

This video was recorded on Feb. 18, 2024.

Amy Edmondson: Because that's too scattershot. I think fail fast, fail often can imply that it means just try everything and eventually something will work. No, it's much more of an iterative process. It's a thoughtful process. It's like you try something that honestly you believe it might work, or why waste your time?

Ricky Mulvey: I'm Ricky Mulvey and that's Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School. She is the author of Right Kind of Wrong: The Science of Failing Well. My colleague, Deidre Woollard, caught up with Edmondson to talk about the complex failure at Boeing, what companies and you can do after something goes wrong, and how investors can apply a smart failure strategy.

Deidre Woollard: I love this book. I was expecting to feel a little better, and I think I do feel a little better about some of my own failures, and I want to get right into that, because when we fail, we want to cover it up. As I've studied businesses, and I have a little love of business failures, nearly every big dramatic one I've ever seen comes from someone noticing something and making the decision to not expose it. How do we get better at failure, especially in the business world?

Amy Edmondson: Wow, well, music to my ears, because that is something I have noticed as well: most of the big failures that I've studied can be traced back to somebody, usually someone who is genuinely expert at a relevant aspect of the phenomenon, feeling unable to voice a concern at a crucial time, or in some cases voicing a concern but really just not being heard for various reasons. Those are at least theoretically, practically preventable when we're at our best. When I say we, I mean we as human beings, but more importantly, we in terms of the organizations that we create and lead. I think the goal is to lead an organization where no one ever believes that they're not supposed to speak up with a relevant concern, because catching something in time is worth untold economic and sometimes human safety value. That's got to be the idea. Then you said, how do we do better at failing well? I realize I love the subtitle, but every time I hear it said out loud, it sounds like feeling well, which feels like a mini failure. It's not feeling well, it's failing well. But how do we do it? Well, I think we really do have to do something that's quite hard, which is to create organizations where people are honest, and agile, and straightforward, and willing to tell the truth in a timely way, even in the face of real doubt and uncertainty.

Deidre Woollard: Part of that is this idea of what you call intelligent failure, and that intelligent failure is actually what leads to success. I love this part in the book where you talk about Eli Lilly, they're hosting failure parties, and they want different teams to learn from small failures, so getting comfortable with that idea of these little failures. How do companies bring that failure out?

Amy Edmondson: You're absolutely right. Let's go right to intelligent failures, because the art or the science of failing well means you should be engaging in intelligent failures, and I'll clarify what that means in just a moment, and you should be doing your part individually and organizationally to prevent basic and complex failures as often as possible. What's an intelligent failure? It's a failure that is in new territory; we really don't yet have the knowledge we need to ensure success. We, like it or not, have to experiment; we have to do something risky, something uncertain. It's in new territory, and it's in pursuit of a goal, whether that goal is a new drug, as at Eli Lilly, or a life partner, or you name it. We're trying to make progress on something we care about. Number 3, you've done your homework; you've done enough background work to know what is known, what isn't known, what might work, what's worth trying next. Then finally, the failure is only intelligent if it's no bigger than it has to be, meaning you didn't waste or use resources that were larger than necessary to get to the next step, to get the knowledge that we need. You said something just a minute or two ago about size, that we should have these small failures. In truth, intelligent failures are generally small failures. But the distinction between intelligent failures and the others isn't just size, it isn't even largely size, it's more about type. Is this new territory? Is it in pursuit of a goal? Have we done our homework? Is it as small as it can be and still get the knowledge? Eli Lilly, though, one of the failures I write about, was the failure of a clinical trial of a new drug called Alimta. This is some years ago, and everything had gone well in the laboratory, in the Phase 1 trials for safety, to show the drug was safe for people.

Then this Phase 2 trial is about to show, with a large enough sample of people, no bigger than it has to be, that it works, that it has efficacy, and, alas, it didn't work. It was a cancer drug, and it failed to show the improvement in health that they had hoped, so that's a failure, so have a failure party. Now, why would you have a failure party? For several reasons, but one is to celebrate the hard work that got us that far. There was literally no way to prevent that failure; the only way to learn whether it worked was to do a trial. They did a trial, no bigger than it had to be, and it didn't work. Number 2, when you celebrate things like that, other people tend to show up. That helps prevent the terrible waste of someone else engaging in the same failure a second time, which is never intelligent. We need to share the knowledge about our failures within the organization so that we can help our colleagues not replicate them. Number 3, it allows people to call something a failure in a timely way. There's always this temptation: it's not going well, everybody knows it's not going well, but I don't really want to admit it, so I'll just push it, I'll try harder, and, magical thinking, maybe something good will happen. It's never a good use of time and resources. Then if you do your homework, if you do it right, and you say, we failed, that's disappointing.

Amy Edmondson: Let's understand why. In this case, the Alimta story, the physician in charge of the trials looked into the data closely, as you would, and discovered that in fact some patients in the trial did very well. They were really helped by the drug, but others showed no impact. Wait a minute. Why did some do so well and others didn't? Let's look into that. What he discovered was that the patients who didn't do well had a folic acid deficiency. That's a simple B vitamin. Not all failures end up with such a happy story. But what that means is all they had to do was add folic acid to the drug, and then it worked for everybody. That ended up becoming a very successful drug, both clinically and economically, for the company. I'm not saying that every time you take the time to analyze a failure, you will magically pull a multi-million-dollar product out of a hat, or a billion-dollar product out of a hat. But I am saying if you don't take the time to analyze the failure, then you have no hope of gaining that extra possibility of success.

Deidre Woollard: I liked what you said there about magical thinking, because I think that plays into failure a lot: we just think it's going to get better.

Amy Edmondson: We'll just try harder. What is the definition of insanity? It's doing the same thing over and over and expecting a different result. It's kind of magical thinking.

Deidre Woollard: So you talked about intelligent failure. Let's go to the other side, the basic failure. The basic failure, as you describe it in the book, can be anything small. We have tons of basic failures every day, but sometimes those basic failures can lead to something really tragic. So when you're doing the same thing over and over, how do you avoid some of the basic failures?

Amy Edmondson: You try not to do the same thing over and over. A basic failure is a failure caused by a single cause, usually human error. I forgot to plug my cellphone in, and then the battery died, and then I missed the meeting. It's a basic failure, and it's completely my fault. Not fault in a bad way; I was just distracted and I forgot to plug it in. Or, worse, a much more blameworthy basic failure is: I was texting while driving and I got into a car accident. I didn't do that, but [laughs] that happens. The cause of that failure, that car accident, couldn't be more simple; it couldn't be more preventable. But again, I'm not saying it's blameworthy per se. We always want to understand the whole story before we start assigning blame. But it's not the kind of failure we celebrate or have a party for. It's the kind of failure we do our very best to avoid. We recognize that people make mistakes, but especially in organizations, we want to find ways to make it easier for people to do the right thing and harder to do the wrong thing.

Deidre Woollard: That's interesting, because you talked about not doing the same thing over and over, but then in some situations you have to. So talk about checklists and things like that.

Amy Edmondson: Yes. I think we should do the same thing over and over when it works to get the result we want, but we don't want to do the same thing over and over when it is clearly not working for us. But yes, checklists are a marvelous tool for helping us do things that really, truly do need to be done the right way, whether that's an airplane taking off or a cake recipe. You want to use the protocol. You want to use the checklist.

Deidre Woollard: Absolutely, and in part of your book you talk so much about psychological safety, which is allowing mistakes to happen and be addressed, even these basic failures where we should know better. You've got some examples in the book about how businesses can build that psychological safety. It seems like it's an easy fix, but it's one that I don't see a lot of companies doing.

Amy Edmondson: First I'll define psychological safety as a belief that the environment is safe for taking interpersonal risks, like speaking up about a mistake. That's one. Another interpersonal risk would be asking for help when you're in over your head, and that too can help avoid many a failure, or pointing out that your colleague is about to do something dangerous. That can feel very hard to do, both at work and at home. Yet it's really crucial to feel that that's OK around here, that's what we do around here, because we care about each other, about the product, what have you. Psychological safety describes this sort of environment where you just believe that you won't be rejected, humiliated, or punished for speaking up, particularly with respect to failures, so that you are invited and feel invited to be a failure preventer and an intelligent failure producer. I'll be a basic failure preventer and I will be an intelligent failure producer, but I need to have psychological safety. One of the ways that I think good organizations try to create psychological safety, first and most importantly, is by calling attention to uncertainty, calling attention to the nature of the work, calling attention to the reality of human error or the reality of system complexity. If you're a leader of a team or an organization, you say things like, you know, we've never done a project like this before. Things are definitely going to go wrong on the way to, I hope, our magnificent success. We need to hear from you. You're issuing that invitation that is logical and rational to say, this is the kind of project, or this is the kind of organization, where you are expected to speak up because of what's at stake. Then you put policies in place, like one that I write about called blame free reporting, which is not the same as blame free action. It doesn't say anything goes, do whatever you want and you'll never be blamed for it.
No, it says when you report something that's out of whack, or that you don't think is right, or that you don't understand, you'll never be blamed for the act of reporting. The act of reporting is always valued around here. Now, if we do our research and we get into something that went wrong, and we discover that someone showed up for work drunk, that's a blameworthy act. They'll be held to some kind of standard for that, so there'll be some kind of consequences for that. But there'll never be negative consequences for reporting an error, or reporting a problem, or reporting a deviation.

Deidre Woollard: That's really important I think for businesses, especially even things you talked about in the book about assembly lines in factories and being able to stop the line at any time and not be worried you're costing the company money or something like that.

Amy Edmondson: You asked, why is it so hard? I think it's because many managers equate blame free reporting, or the idea of reporting, with a lax, anything-goes environment. They haven't recognized the distinction between people behaving in a problematic way and people being willing to speak up honestly about what they see, and those are very different phenomena.

Deidre Woollard: Another phenomenon that has happened with Silicon Valley is this whole idea of fail fast, fail better, just keep failing. It almost seems like the opposite of what you're talking about, which is really analyzing every failure as something to learn from. It's just: if you just keep failing, you'll eventually get there. That doesn't always work, does it?

Amy Edmondson: No. That's too scattershot. I think fail fast, fail often can imply that it means just try everything and eventually something will work. No, it's much more of an iterative process. It's a thoughtful process. It's like you try something that honestly you believe it might work, or why waste your time? You believe it might work, whether it's starting a new company or trying to invent something or design something; you earnestly believe it could work, and here's why. Guess what? You were wrong, and it's not your fault for being wrong. It's new territory; nobody's ever been here before. Now your job is to figure out why it was wrong, quickly, and what that implies for what to try next. So it's a scientific process. It's a thoughtful process. I don't mind the rhetoric fail fast, fail often, but it needs to be clarified a little. It needs to be sharpened so that people understand it's not that scattershot process, it's that very thoughtful learning process. The more we try and learn, the smarter we get, and our next experiment is a little better, and the one after that's better still.

Deidre Woollard: It's also not about tossing the whole thing out when we fail. Part of that after process is figuring out what can be saved.

Amy Edmondson: Yeah. You've got this failure, you've invested in it, you might as well get your money's worth. Figure out what that failure taught you, what new information you have that you didn't have before, and how you put it to work.

Deidre Woollard: You also talk in the book about complex failure. I found this part particularly challenging, because people want to figure out: here's the one thing we did that's wrong, so we just don't do that again. But there are often a lot of things that go into it. So when I'm looking at businesses from an investor standpoint, and I see something wrong with a company and maybe can't figure it out, I keep looking for that one thing they're doing wrong, but it doesn't always go that way, does it?

Amy Edmondson: No. Some failures really are the perfect storm, just the unpredictable breakdown caused by a handful of factors coming together in just the wrong way, in an unpredictable way. Let's say you started a new business in February of 2020, and it was something that involved customer service in the real world. Your business was very likely not to get off the ground very fast because of the timing. It's something you could not have seen coming: a complex failure related to a global pandemic, for instance. Other times, with complex failures, you own some of the causal responsibility, but it's still not that simple. The Boeing 737 MAX failures, both the first two very visible, tragic crashes in 2018 and 2019 and the more recent challenges, are the product of a handful of factors. Some of them were self-imposed by Boeing management and the board; some of them were external, like the unexpected announcement of a new product by Airbus that led them to hurry the development of a new plane to compete with it. Some of it's external, some of it's internal, but they put themselves in a position, unbeknownst to them, really, where they dramatically increased the chances of complex failures; they weren't thoughtful enough to see it coming.

Deidre Woollard: That's a fascinating example, because we tend to just zero in on the idea of the bolts: well, they just have to fix the bolts. It sounds like what you're saying is there are a lot of things they need to fix in order to make sure that this type of thing doesn't happen again.

Amy Edmondson: Because you really have to ask yourself: it's the why behind the why. Why would the bolts not have been put in properly? What are the conditions that led that to happen? Then you get to such things as capacity problems in the plant, excess hurry, customers in a rush, a design that's more complicated than is optimal, and on and on it goes. You have to look at each of these factors and then step back and say, how do we design a system that is less vulnerable to this kind of breakdown?

Deidre Woollard: Well, the other factor there too is that people want to fix the immediate problem without fixing the long-term problem. So it sounds to me like, with Boeing, they're trying to focus on just this one piece of the plane that's having this issue, but what you're talking about is much larger issues at the company that could lead to something else happening again. I think that happens in a lot of companies: they fix the immediate, but they don't necessarily fix the larger issues.

Amy Edmondson: That's right. They fail to check whether this problem is the problem or whether it's a symptom of a larger problem.

Deidre Woollard: Interesting. Well, you have this line in the book that I found really interesting, about playing to win versus playing not to lose. As an investor, I know I'm probably going to lose sometimes, and the more risks I take, the more smaller companies I get excited about, the more likely it is I might lose. How do you adjust your mindset when thinking about that?

Amy Edmondson: I think of investing as a classic context where you get good at this, because you understand that if you are taking on some highly risky but potentially very profitable investments, you will want to balance some of those out with safer investments that are less likely to give you a huge return but are also less likely to fail. So I think you're intuitively following a smart failure strategy in that context, when you're good at it. But there is always pressure, a force that leads you to want to play it safe, because none of us likes to fail. In any field, not just investing, if we give in to that instinct to just want to play it safe, to only do things that we're 95% sure are going to succeed, we may see a fair amount of success, but we don't know how far below the success we could have had it will be. You can think about that in athletics or in a job search. If you take a job that you know you can get, and it's not a stretch, then maybe you will get it, but maybe you could have gone out for that dream job, and maybe you would have gotten that too.

Ricky Mulvey: As always, people on the program may have interest in the stocks they talk about, and the Motley Fool may have formal recommendations for or against, so don't buy or sell anything based solely on what you hear. We will be off tomorrow for Presidents Day, and we will be back on Tuesday. Thanks for listening. I'm Ricky Mulvey. We'll see you then.