Rotman Visiting Experts

Fail smarter: What the research tells us about failure and growth

Episode Summary

Can you get better at failure? Amy Edmondson joined Rotman Visiting Experts to explore the good kind of failures, the kind we need to avoid, and what we can learn from it all.

Episode Notes

Can you get better at failure? Amy Edmondson joined Rotman Visiting Experts to explore the good kind of failures, the kind we need to avoid, and what we can learn from it all.

Episode Transcription

Brett Hendrie: It was 1876 when Milton Hershey launched a confectionery business in Philadelphia. The sweets maker had great recipes, but he only lasted six years before the soaring price of sugar forced him into bankruptcy. In 1883, he tried again in New York, this time focusing on milk-based caramels. His sweets were a hit, but the local cost of milk created cash-flow problems. Three years later, his second business shuttered. An undaunted Hershey moved to Lancaster, Pennsylvania, where regional milk and sugar were abundant. Now, backed by a decade of experience and lessons learned, Hershey's business finally took off and became a global sensation. And the Hershey brand lives on today.

Failure, even minor failure, can sting, but we can also learn from the process. And there might even be a way for us to fail more intelligently. So how can we learn to navigate failure better?

Welcome to Visiting Experts, a Rotman School podcast bringing you backstage conversations on business and society with the influential scholars, thinkers and leaders featured in our acclaimed Speaker Series. I'm your host, Brett Hendrie. 

I'm joined today by professor Amy Edmondson to speak about her new book Right Kind of Wrong: The Science of Failing Well. Amy is a professor of leadership and management at the Harvard Business School. She's an award-winning researcher celebrated for her work on the concept of psychological safety in teams within organizations. She's a best-selling author with a half-dozen books to her name, and in 2021, she was named the world's most influential management thinker by Thinkers50. We are thrilled to have her with us here today. Welcome back to Rotman, Amy.

Amy Edmondson: Thank you so much.

BH: Congratulations on the book, which I truly love. I found it full of so many great insights and so accessible. I wanted to begin our conversation by asking you about the social context from which we understand and explore failure today. It seems to me that we face diverging messages. On one hand, we have a Silicon Valley-informed approach of "move fast and break things." And on the other hand, we have an approach, often seen on social media, that only celebrates success. So how can we better understand how we see failure in our culture today?

AE: Brett, I think you're absolutely right. We are barraged with conflicting messages, and I think that's quite paralyzing. At times you hear the message, "Oh, we're supposed to celebrate failure." But then you also hear the message that only success counts. And worse than the messages about individual success from, say, social media are the contexts, like health care and aviation, where managers are thinking, "I'm sorry, not in the real world where I work. Failure is not an option."

And so these conflicting messages are very real. And they essentially reflect the lack of a clear framework, the lack of adequate terminology to allow us to distinguish between the truly good kind of failures that we must learn to welcome, because they are on the path to success, and the preventable kinds of failures that, in fact, we should work hard to prevent. And the Hershey story is just perfect. I am sorry and sad that I didn't have that story in the book, because it's magnificent. What Hershey did, and his failures, were clearly on the path to success. And yet, I'm sure it didn't always feel that way at the time.

BH: I want to talk about the frameworks that you mentioned, and we'll get to them in a moment. But for the managers and leaders in organizations who are afraid to fail, why have we become so conditioned to be concerned about what will happen to us if we fail?

AE: We have learned that the stakes feel very high when we have this belief, whether it's conscious or just below the surface, that "If I fail, I'll die. If I fail, my career will be over." Which is, of course, absolutely not true, so long as that failure doesn't look like an egregious act of sabotage of some kind. That's another story. That's not what I'm talking about. But the reality is, we live and work in a volatile, uncertain, complex, ambiguous world. Things will go wrong. And it's okay that they went wrong, so long as you learn from them. We try not to have the same exact failure twice.

BH: I think it's natural for all of us to overestimate the impact of a failure and to think, "I've made a small mistake; this person will think poorly of me; I'll have a negative reputation with my boss." But generally, the repercussions are much less than what we think they might be.

AE: Yes, and there's so much pain and emotional suffering that comes from postponing telling someone about [the failure], or hoping it will go away, when, in fact, far more often than not, when you speak up, when you acknowledge it, when you bring it forward, it's not that bad.

BH: On the flip side, in terms of what we've been seeing with social media, what have your observations been in terms of its impact on how we think about success and failure?

AE: Social media has the impact of amplifying things, right? It amplifies the good, it amplifies the bad. It makes the territory far more fraught. There are so many places where you can stumble. It can feel like the stakes are very high indeed.

BH: I know that on platforms like LinkedIn or Facebook, where people are giving you an upvote on something, you're looking for that affirmation, and it's challenging and difficult for people to be open and brave enough to share "here's something that didn't work" and to see what the response will be.

AE: And the response to that? Maybe because of what I study, I see plenty of stories of people saying, "Here's something that didn't go well," or "Here's something I'm struggling with." And those get far more likes and upvotes. I think that people imagine it will be terrible if they are suddenly seen as a fallible human being.

BH: Well, it's certainly an interesting context, one that's shifting all the time, and one that we didn't have 20 or 30 years ago. I want to talk about the schema that you present in your book in terms of the different types of failure: there are intelligent failures, basic failures, and complex failures. But before you explain them to us, can you help us understand why you thought it would be useful to break them into different categories?

AE: This to me is the way to help us sort out the happy talk about failure from the lived experience of “I'm sorry, I don't want to have failure in my life.” 

Deep down, we think failure is bad. There are plenty of cultural messages, plenty of messages from within organizations. It's not easy for people to sort the messaging out on their own. Maybe a better way to put it: it's not easy to know what to think and how to act without a better taxonomy. And indeed, there are failures we should work hard to avoid. Both basic and complex failures are theoretically avoidable. And even as we work hard to avoid them, some will slip through. That's okay.

BH: The first type of failure you talk about in your book is the intelligent failure, and you say this is the best type. Help us understand: what is an intelligent failure, and why should we be motivated to be open to them?

AE: An intelligent failure is an undesired result of a novel foray into new territory. And that could be new territory for the world or new territory for you. If you're a little kid, and you've never ridden a bicycle before, and you get on that bicycle trying to learn how to ride it, that's new territory for you. And there is absolutely no way you will learn to ride that bike without some failures along the way. The reason we need intelligent failures in our lives is that they are the only way to learn and grow. They're the only way to progress, to achieve, to experience adventure, to form relationships with people.

BH: What distinguishes an intelligent failure like that, from one where there might be ramifications that are genuinely bad?

AE: Well, an intelligent failure, as I said, is in new territory. It's in pursuit of a goal. It is hypothesis-driven: you have good reason to believe this might work. You're trying to advance, and this seems to be the next step. And then finally, and importantly, intelligent failures are as small as possible. You don't want to take excessive risks.

BH: The risk component seems so critical to me; it's all about risk. And so for managers and leaders at work who are trying to build an environment where it's okay for their teams to have intelligent failures, what are the ingredients that they need to be focused on?

AE: Well, to begin with, you have to think about it this way: how important is innovation in my industry, in my company, for our future success? Because there are settings where the answer to that question is "very, profoundly important." And then there are settings where… not as much.

Now, to the degree that innovation matters, you need to have people taking smart risks. You need to have people trying new things in pursuit of those new products and services that will bring in the future revenues. And so if you are in an environment where innovation matters, and it's a kind of failure-free place, where you aren't hearing about a lot of failures, that's probably not a good sign. Unfortunately, it will feel good. It always feels good when everything's going well. But it isn't good.

BH: I would imagine if you were in the R&D department of a technology company, that's an environment for taking risks. But if you're a pilot working for an international airline, that's not one.

AE: The true answer to your question is: be very clear about the context, right? What are the stakes? What are the risks to human safety, to our economic state, or to our reputation? And now that I'm clear about it, I tell everybody else what I'm thinking. If I'm a pilot flying from Toronto to New York, I want to be very clear: we know how to do this, but little things could come up that we've never seen before, and I need to hear from you. So I want you to speak up, so that we don't have a basic failure. We don't have a preventable failure.

BH: Is that speaking up something that is powered by psychological safety? Can you help us understand a bit more: what is psychological safety, and why is it important in those situations?

AE: You can think of psychological safety as a state of low interpersonal risk or low interpersonal threat, so that you feel comfortable being honest, being candid. That's what psychological safety is all about. It's a sense of permission for candor around here. And in hierarchies, which, let's face it, describes most organizations, that isn't automatic. Most people will spontaneously think, "I'm going to just wait and see. I'll speak up if I'm sure that what I have to say will be well received. But if I'm not sure, then I'll just hold back, wait and see, read the tea leaves." And that can, and will, and has been documented to lead to things as catastrophic as airline crashes, and as small as problems just not being caught in a timely way.

BH: And so what would your advice be to leaders to create an environment of psychological safety where people feel free to speak up?

AE: To create psychological safety, remind people of what's at stake: why the project matters; how much genuine uncertainty there is; how much interdependence; how it's going to take all of us being willing to take interpersonal risks, to offer ideas, to offer critiques, to ask for help when you're in over your head. That's necessary for our success. So just name those kinds of things, because it creates that sense of permission. And then, secondarily, be very curious. Lean into inquiry: ask lots of questions, and ask good questions.

BH: Creating that environment seems so important for having intelligent failures, because when you're assessing the risk component, there's judgment involved. And I would imagine it's especially important to be able to connect and dialogue with your colleagues, your superiors, and your stakeholders so that you have the information you need to make informed judgments.

AE: Exactly. I think every manager should want to prevent the kind of failure that happens when someone else on your team had knowledge that you didn't have, and you made the wrong decision as a result. That is theoretically and practically preventable. There will be failures that happen in new territory where literally nobody saw it coming; those you can't prevent, you can only try to keep them as small as possible.

BH: Let's talk about basic failures. Can you help us understand what they are, and what you've seen some organizations or industries do to help prevent them?

AE: So basic failures occur in familiar territory, in territory where we have a formula for how to get the result we want. There's a recipe, and yet a mistake was made. So basic failures have a single cause, usually a human error, or a lapse of attention that leads to a human error. And clearly, they are theoretically preventable. What great organizations do to prevent as many as possible is a variety of practices. These range from good training programs, making sure people have the knowledge and skills they need to get the results you hoped for, to permission to speak up: they work hard to create blameless reporting and psychological safety, so that people do speak up when they need help and speak up when they see something that might be wrong (it might not be, but it might). They do as much failure-proofing as possible, making it easy to do the right thing; we have many examples of that. And, of course, they reward people for speaking up in a timely way, especially when it prevents a failure.

BH: The airline industry seems like a particularly strong example of this. And I know there's research on the benefits of just having a checklist, where the pilot and the copilot go through each item to confirm their systems are functioning before they take off. Implementing that simple checklist really has had a lot of benefits, and it's something that's been carried over into other industries.

AE: Into medicine especially. So aviation has become phenomenally safe compared to the earlier days, and it really comes down to two big factors. One was, they said, "Let's systematize what we can. Let's have checklists. Let's build in as many structures as we can to help us not miss things." And the other is that they really worked hard to change the culture, to make the culture of aviation one in which people would speak up despite the hierarchy.

BH: Basic failures are a little more straightforward: they're things we can hopefully plan for, prevent, and develop systems against so that they don't happen. Help us understand complex failures, and what differentiates them from basic or intelligent failures.

AE: The big distinction between a basic failure and a complex failure is that a complex failure is multi-causal: it is the undesired result of multiple factors that came together in just the wrong way. Any one of those factors on its own would not have caused the failure. Something's a little out of place here, or, you know, just a little different there, and the way they lined up (I like to think of it as the perfect storm) led to a result that nobody wanted. And many complex failures, often in organizational settings, unfold over a long period of time: some decision someone made years ago comes back to haunt us, because we didn't really think about how that decision would interact with our modern situation.

BH: One of the examples in your book that is very vivid and powerful is the Columbia space shuttle tragedy. In that situation, a piece of foam had broken off during launch and punched a hole in the shuttle's wing, and in the analysis they unpacked that this was the cause. But one of the things that was so interesting to me in your book was that you highlighted that an analyst had actually seen this in the video recording of the shuttle launch, and had they caught it earlier, the tragedy might have been averted.

AE: That's right. And this is actually a really good illustration of a complex failure, because at first glance it looks like, okay, the failure was caused by this bit of foam coming off the external fuel tank and hitting the shuttle. And that is true. But there are quite a few factors that contributed to allowing and enabling pieces of foam to fall off and strike the orbiter, and then to its not being caught and corrected in a timely way.

So as you said, an engineer did look at a video and saw something he thought might, but might not, be a foam strike. He did not have the confidence. He spoke up about it, but he wasn't able to get the resources he needed, because he wasn't very convincing, and he didn't have any data to find out for sure whether this had happened. Part of the reason it's a complex failure is the program's increasing focus on production pressure, on the need to have as many flights as possible, that sense of "We've just got to keep this thing flying, or we won't get our funding, we won't get the resources that we need;" and the fact that foam strikes had happened in the past that were little bitty things, causing maintenance repairs but nothing catastrophic. So I could identify five or six things that came together in just the wrong way.

BH: It was cultural, it was systemic. But as you point out, it was also missed warning signs. And the idea of missed warning signs was really interesting to me, because it made me think about the role of intuition, which is something that people with expertise or experience tend to rely on. I was curious: is there a role for intuition in detecting those signals in a process?

AE: Well, intuition can be thought of as not-yet-verbalized insight. You have some experiences that are telling you something's wrong or something's right, or this is a good design, this is a bad design. And they aren't just out of the blue. Intuition comes from expertise; your ability to recognize that there's a pattern here comes from preparation. You've come by it honestly, but you haven't quite found the words or the data to precisely say why you're recognizing something's going on.

BH: It seems like a component of the types of awareness that you say are important to how we can fail well. Can you share those types of awareness with us, and maybe how we can build them for ourselves?

AE: If we think of failing well as essentially the activities of preventing preventable failure, and pursuing smart risks, and often experiencing intelligent failures, then well, what's going to help us do that? 

I identify three kinds of awareness that I think are really important. And they are self-awareness, situational awareness, and system awareness. 

So self-awareness is the ability to pause and think: what might I be layering onto this situation? What might I be thinking that isn't really true? "This is awful, and I'm going to die if I'm late for this meeting," for example. Just learn to pause that unhealthy thinking and say, "Let's be realistic, let's be scientific. Okay, I may be late; I'll be able to make amends, I'll make an apology," and so forth. For me, self-awareness is really about choosing learning over knowing, choosing curiosity over ego, if you will. There's always something to learn in every new situation, something going on that I don't quite understand yet. And to realize that that's going to be a more fun, more productive and more joyful stance.

Situational awareness is pretty simple. Pause to think: what are the stakes here? Are they high from a reputational, financial, economic perspective? Or are they low? And what's the degree of uncertainty, very low or very high? That maps out the kind of failure landscape. If you're in a high-stakes, high-uncertainty context, you want to proceed very carefully, very cautiously. You want to take only very small risks, find out what happens quickly, assess whatever damage there is, and pivot, right? If you're in a very high-uncertainty, low-stakes context, have fun experimenting and see what you can learn. And I have found that people will be excessively vigilant when the stakes are low and sometimes a little incautious when the stakes are high.

BH: I love those. And the self-awareness piece strikes me as perhaps the most important, or at least the one we have the most agency over. And I love that in your book you say we need to be vigorously humble and curious, which seems like such a great map for finding those qualities.

AE: Yes. And we have the capacity to be vigorously humble and curious as human beings. But we often forget to do that, or to be that. As we get older, as we move from childhood through college and university into adulthood, you begin to think, "Oh, I'm supposed to know. I'm supposed to have the answers. I'm not supposed to be curious," and [you begin] to not realize how attractive curiosity is and how attractive humility is.

BH: So speaking of careers, one of my questions for you is how the ability to fail well has helped you in your own career. Obviously, it's part of your work as a researcher, but take us outside the research sphere: are there any examples or stories that come to mind of you failing well, either professionally or personally?

AE: Failing well means preventing preventable failures and embracing and pursuing the intelligent ones. And I would say I've had modest success, but certainly not a perfect track record. I have had, and I write about very few of them, totally preventable, stupid failures that would have been better to prevent. I failed a math exam in my first year of college. I let a boom on a sailboat hit me, knock me out of the boat and give me many stitches in my head, because I failed to pay attention in a dangerous moment on the race course. I'm neither proud of them, nor are they good practice. They were not failing well; they were failing badly.

Yet, as a researcher, and as a person living a life, I've had intelligent failures as well. I've had many research hypotheses that made sense but didn't pan out. You've wasted time and resources, and you don't get a publication out of it. It's too bad, but it's part of the career, so it's okay. You have to remind yourself that it's okay. I certainly had relationships not work out in ways that, I have to say, were intelligent: new territory, of course, since any new relationship is new territory; good reason to believe this could work; certainly in pursuit of a goal, a life partner. I'm not sure they were always as small as possible. But I'm glad that I learned what I learned and ultimately found my husband and soulmate.

BH: Obviously, we can learn from our failures and we should learn from our failures. Can we learn more from them than we can from success?

AE: I think the answer is yes. We can certainly learn from our successes. But, you know, our successes don't always tell us why they worked, right? It might have been luck. It might have been any number of things. It might have been because you worked really hard, and that's good. But failures tend to carry clearer diagnostic information.

BH: Your book is full of so much great insight and advice and things that are so useful to people across sectors and wherever they are in their careers. As we wrap up here, I'm wondering if you have one takeaway that you could share with our audience who are looking to adjust their mindset so that they can fail more intelligently.

AE: Maybe it's every field, whether that's science, or celebrity chefs, or elite athletes, or business success: I think in every field, the most successful people are not people who have failed less often than the rest of us. They are people who have failed more often. You think, "Wait, can that really be true?" Yes, because they've taken more risks. They've learned more, they've pivoted more, they've learned how to navigate uncertainty. They've learned how to dance with their chosen field in ways that allow them to progress despite their fallibility. They've developed failure muscles that have allowed them to succeed.

BH: It's that determination. And going back to the Hershey example: even though he went bankrupt twice, he stuck with it. Amy, congratulations on the book, and thank you so much for being here at Rotman; we really appreciate it. The book, again, is Right Kind of Wrong: The Science of Failing Well. Amy, for people who want to hear more about your work, where can they find you?

AE: Well, you can go to hbs.edu and find my faculty page or amycedmondson.com. And you'll also find me on LinkedIn.

BH: All the great places. Thank you, Amy. 

This has been Rotman Visiting Experts, backstage discussions with world-class thinkers and leaders from our acclaimed Speaker Series. To find out about upcoming speakers and events here at Canada's leading business school, please visit rotman.utoronto.ca/events.

This episode was produced by Megan Haynes, recorded by Dan Mazzotta and edited by Damian Kearns. 

For more innovative thinking, head over to the Rotman Insights Hub, and please subscribe to this podcast on Spotify, Apple or Google Podcasts. Thanks for tuning in.