When robustly tolerable beats precariously optimal
Something is "robustly tolerable" if it performs adequately under a wide range of circumstances. Robustly tolerable things have decent insulation against negative shocks. A car with excellent safety features but a low top speed is robustly tolerable. A fast but dangerous sports car is not.
We often have to pay a price to make something more robustly tolerable. Sometimes we need to trade off performance. If I can only perform an amazing gymnastics routine 10% of the time, it might be better for me to opt for a less amazing routine that I can get right 90% of the time. Sometimes we need to trade off agility. If a large company develops checks on their decision-making processes over time, this may make their decisions more robustly tolerable but reduce the speed at which they can make those decisions.
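To make the gymnastics tradeoff concrete, here's a toy expected-value calculation. All of the scores and success rates below are invented for illustration, not real gymnastics scoring; the point is just that a high ceiling with a low hit rate can lose, in expectation, to a lower ceiling with a high hit rate.

```python
# Toy expected-value comparison for the gymnastics example.
# All numbers are invented for illustration.

def expected_score(p_success, score_if_hit, score_if_miss):
    """Expected score of a routine that either lands or falls apart."""
    return p_success * score_if_hit + (1 - p_success) * score_if_miss

# A spectacular routine I land only 10% of the time...
amazing = expected_score(0.10, score_if_hit=15.0, score_if_miss=5.0)
# ...versus a modest routine I land 90% of the time.
modest = expected_score(0.90, score_if_hit=13.0, score_if_miss=5.0)

print(f"amazing routine: {amazing:.1f}")  # 6.0
print(f"modest routine:  {modest:.1f}")   # 12.2
```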
Being robustly tolerable is not a particularly valuable trait when the expected costs of failure are low, but it's an extremely valuable one when they are high. The higher-impact something is (the more widely a technology is used, or the more important a piece of infrastructure is, for example), the more we want it to be robustly tolerable. When a lot is on the line, we're more likely to opt for a product that is worse most of the time but has fewer critical failures.
What are examples where the expected costs of failure are high? It's clearly very bad if an entire country is suddenly governed poorly, and the costs of total failures of governance have historically been enormous. This is why being robustly tolerable is a very desirable feature of large-scale governance structures. If your country is already functioning adequately with democratic elections, term limits, a careful legislative process, and checks on power (several branches of government, an independent judiciary, a free press, laws against corruption, and so on), then it seems less likely to suddenly be plunged into an authoritarian dictatorship or to experience political catastrophes like hyperinflation or famine.
I think we can undervalue the property of being robustly tolerable. When we see something that is robustly tolerable, sometimes all we see is a thing that could clearly perform better. (The car could go faster, the decision-making process could be less burdensome, etc.) We don't take into account the fact that, even if the thing never behaves optimally, it's also less likely to do something terrible. How well something functions is often in plain sight. But the downside risk isn't visible most of the time, so it's easy to forget to ask how robust its performance is. Overlooking robustness could be especially harmful if the only way to improve something's performance involves making it less robust.
For example, if a candidate we dislike gets elected, it can be tempting to blame the democratic process that allowed it to happen. Some even claim it would be better to have less democracy than to let people elect such bad representatives. But the very same democratic process often limits the power of that individual and lets people vote them out. A benevolent dictatorship may seem surprisingly alluring in bad times, but any political system that enables a benevolent dictatorship also puts you at much greater risk of a malevolent one. (As an aside, I find it a bit odd when people's reaction to "bad decisions by the electorate" is to give up on democracy rather than, say, trying to build a more educated and informed electorate.)
Actions and plans can also vary in how robustly tolerable they are. Risk-taking behaviors like starting a company are generally less robustly tolerable, while lower-variance plans, like getting a medical degree, are more so. In line with what I noted in a previous post, we should generally favor robustly tolerable actions and plans when the expected cost of failure is high, and more fragile but high-yield ones when the expected cost of failure is low, as in the sketch below.
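Here's a minimal sketch of that decision rule. All of the probabilities, payoffs, and costs are invented for illustration; the point is only that as the cost of failure grows, the robust plan overtakes the fragile, high-yield one.

```python
# A toy sketch of the decision rule above: prefer the fragile,
# high-yield plan when failure is cheap, and the robust plan when
# failure is expensive. All numbers are invented for illustration.

def expected_value(p_success, payoff, failure_cost):
    return p_success * payoff - (1 - p_success) * failure_cost

fragile = dict(p_success=0.2, payoff=150.0)  # e.g. starting a company
robust = dict(p_success=0.9, payoff=30.0)    # e.g. a steadier career path

for failure_cost in (0.0, 10.0, 100.0):
    ev_fragile = expected_value(failure_cost=failure_cost, **fragile)
    ev_robust = expected_value(failure_cost=failure_cost, **robust)
    better = "fragile" if ev_fragile > ev_robust else "robust"
    print(f"failure cost {failure_cost:5.1f}: fragile={ev_fragile:6.1f}, "
          f"robust={ev_robust:5.1f} -> prefer {better}")
```

With these numbers, the fragile plan wins only when failure costs nothing; by the time failure costs 10, the robust plan is already preferred, and the gap widens from there.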
Being robustly tolerable is not always a virtue worth having more of. We can tip the balance too far in favor of robustness, sacrificing too much performance or agility to achieve it. If we do, we can find ourselves in a robust mediocrity that is difficult to get out of. (You may believe that some of the examples I give above are robustly mediocre rather than robustly tolerable.)
But if something is robustly tolerable, then the worst-case scenarios are less likely and less harmful. This is a valuable trait to have in domains where the cost of failure is high. It's also a trait that's easy to overlook if we focus exclusively on how well something is performing in the here and now, and forget to consider how well it performs in the worst case.