The life and work of John Boyd is remarkable, and it’s taught me many things. From Boyd I learnt what it takes to create change in a political, bureaucratic institution like the Pentagon. I learnt how to manufacture, survive and thrive amongst conflict. I learnt about the sacrifice and commitment required to construct revolutionary ideas and theories. I learnt that the work matters more than the recognition that comes from it.
These things have saved me an immeasurable number of years and an inestimable amount of pain. But this morning, as I flicked back through Robert Coram’s exquisite biography of Boyd, I came across something else I’d read, but not retained.
In the middle of his career, Boyd was travelling around giving briefings related to the development of the F-X, a new fighter plane. At one point he went to Europe, and whilst there he briefed a four-star general:
“…the general mused on how this new aircraft would require intensive pilot training. The general then boasted about the safety record of fighter pilots under his command and told how he had had no training accidents for several years.” Boyd’s response, in essence: if nobody had been hurt in years, the training wasn’t demanding enough.
Boyd’s advice seems reckless. He’s basically saying that it’s okay if some pilots die during training. But if we look past the apparent recklessness, all Boyd is saying is that to find and expand the limits, we have to overstep them. And that applies to all areas where we wish to operate at the boundaries of the possible.
This mindset—loss as an inevitable consequence of boundary testing—can also be applied to other, lower-risk crafts and disciplines.
In basketball, it’s better to commit a turnover whilst experimenting with a new play in training than it is to make the same mistake in the playoffs.
In online marketing, thanks to split testing, you can try a bold, risky new approach on only a small subset of your potential customers.
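The mechanics of that containment are simple: deterministically assign each user to a bucket so that only a small, stable fraction ever sees the risky variant. Here is a minimal sketch; the function name, the 5% fraction, and the user-id scheme are illustrative assumptions, not any particular testing platform's API.

```python
import hashlib

def variant(user_id: str, risky_fraction: float = 0.05) -> str:
    """Deterministically assign a user to the 'risky' or 'control' variant.

    Hashing the user id yields a stable, roughly uniform bucket in [0, 1],
    so only about risky_fraction of users ever see the bold new approach,
    and each user sees the same variant on every visit.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "risky" if bucket < risky_fraction else "control"

# Roughly 5% of these hypothetical users land in the risky bucket.
assignments = [variant(f"user-{i}") for i in range(10_000)]
```

If the bold approach fails, it fails for a twentieth of your audience, and the lesson costs you a twentieth of the downside.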
It applies to the workflows and systems of individuals and organisations too. Think about it. If a system is on the verge of breaking down, you have two options: you can either scale back the inputs to a manageable level, or you can modify the system to handle the increased load and work at a greater scale. If you want the system to get better and stronger, you’ll generally choose to modify it. But to know how and why to modify the system, you need to approach the limits. You need to achieve that state of stress.
If your systems aren’t continually coming to a state of near-brokenness, then they’re not being pushed to the limits of their capacity. And if they’re not being pushed to the limits—and occasionally collapsing—how can they get better? How do you know where they’re most likely to break and why?
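In software, this idea has a literal form: stress testing, where you deliberately increase load until the system collapses, because the point of collapse is the information you want. A minimal sketch, assuming a hypothetical `toy_system` callable that raises when overloaded:

```python
def find_breaking_point(system, max_load: int = 1_000_000) -> int:
    """Double the load until the system fails; return the last load it survived.

    `system` is any callable that raises an exception when overloaded.
    The goal is not to avoid the failure but to provoke it in a
    controlled setting, so you learn where the limit actually is.
    """
    load, last_ok = 1, 0
    while load <= max_load:
        try:
            system(load)
            last_ok = load
            load *= 2
        except Exception:
            break  # the breakdown is the data we came for
    return last_ok

# A toy "system" that collapses beyond 10,000 units of load.
def toy_system(load: int) -> None:
    if load > 10_000:
        raise RuntimeError("overloaded")

limit = find_breaking_point(toy_system)  # last surviving doubling: 8192
```

The failure happens in a sandbox, on a toy workload, at a time of your choosing: the upside of the breakdown without the production outage.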
Boyd’s observation is accurate. For systems and skills to develop, you have to let them break down; sometimes you have to force them to. You learn more from a system’s failure than you do from its success. But the trick is to get the lesson from failure without paying the full cost associated with it, to keep the upside of failure whilst shielding yourself from the accompanying downside.