The Building That Almost Fell

There is something quietly terrifying about the fact that we trust the world to hold.

We assume bridges will carry us. Elevators will lift us. Buildings will stand. We walk into skyscrapers not because we’ve done the math ourselves, but because someone—somewhere—has. And we trust they got it right.

In the 1970s, a skyscraper rose in Manhattan with an unusual design. The Citicorp Center stood on four columns—but not at the corners. Each column was placed at the center of one of the building’s sides, so the tower could be lifted into the air and cantilevered over a small church that refused to move. The building bent around that church. And the engineering bent around the building.

To make it work, the weight of the structure had to be funneled through massive diagonal steel braces—every floor transferring its burden down and inward to those misaligned columns. It was, by all accounts, a technical marvel. The structural engineer, William LeMessurier, believed it would stand. A tuned mass damper—a 400-ton counterweight at the top—was added to keep it steady in the wind. On paper, the system worked.

But the wind does not follow paper.

It was a student—an undergraduate—who first asked the question. What about wind hitting the corners? Not the flat faces, but the sharp edges. Quartering winds. The kind that twist and torque, that don’t announce themselves neatly.

And that question sent LeMessurier back to his numbers. Back to the design.

What he found was worse than a miscalculation. It was a compound failure: the original wind analysis hadn’t accounted for the more dangerous wind directions. And during construction, a decision had been made to use bolts instead of welds to connect the building’s crucial joints—a cost-saving measure, approved and passed through, because it looked harmless at the time.

Those bolted joints, under quartering wind pressure, could fail. The building could fall. Not under rare conditions. Under common ones. A strong enough summer storm, an ordinary hurricane brushing New York—and the structure might not hold.
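
A rough way to see why the corner wind matters, as a simplified back-of-the-envelope sketch rather than LeMessurier’s actual analysis: take a member that helps resist overturning about both of the tower’s axes, and assume, crudely, that a diagonal wind delivers about the same total force as a face-on wind. Wind square against one face works that member through one axis only, producing some force F. A quartering wind of the same strength pushes on two faces at once; each component contributes F/√2, and in that member the two contributions add:

$$
F_{\text{quartering}} \approx \frac{F}{\sqrt{2}} + \frac{F}{\sqrt{2}} = \sqrt{2}\,F \approx 1.4\,F
$$

Roughly forty percent more force, from nothing more exotic than geometry. A joint sized for the face-on case can be quietly undersized for the diagonal one.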

At that point, LeMessurier had a choice.

He could stay silent.

He could manage the risk on paper, hope for calm seasons, pray no one asked further questions.

Or he could act—and face public shame, lawsuits, financial ruin.

He chose to act. And he was lucky. The people at Citicorp responded quickly, and quietly. They didn’t call a press conference. They didn’t warn the public. Instead, they worked in secret. Each night, after workers left, welders entered the building and reinforced the flawed joints with steel plates. An emergency generator was installed to keep the damper operational even during a blackout. The crisis passed. No one was harmed. The public never knew.

It would be almost twenty years before the story surfaced. And even now, the engineering firms involved say very little. Their websites present the building as an icon, not a near-tragedy. The official version is still a version of success.

But is it?

What does it say about a system that required an outsider—a student—to ask the right question? What does it say that so many experts missed the flaw, or assumed someone else had checked it? What does it say that the building only survived because one man—one—chose not to look away?

This wasn’t a lone error. It was a pattern. And it’s a pattern we haven’t broken.

We tend to overestimate the systems we create. We assume that complexity must equal correctness. That the people at the top have seen what needs seeing. But in truth, systems are just stories we all agree to trust. And sometimes those stories are wrong.

We also punish error so fiercely—so immediately—that admitting one becomes an existential risk. If LeMessurier had been publicly shamed, stripped of his license, sued into the ground, what would the next engineer do? The incentives still point toward silence.

So we ask ourselves: could something like this be prevented next time? Could we build in systematic reviews, post-mortems, reflective audits—not just after failure, but as a routine part of design?

Maybe. But then again, maybe not.

Maybe the time, the cost, the complexity—maybe all of that would pile up faster than the culture can tolerate. Maybe shareholders wouldn’t wait. Maybe the next deadline would already be looming. Maybe the questions would be asked too late.

And so we return to the uneasy truth: this was a lucky catch. A student asked a question. An engineer listened. A company acted. Quietly. That’s not a system. That’s a miracle.

And we cannot build a future on miracles.

So what would it look like—not just to fix the next flaw, but to expect one? To make peace with the fact that even our best work needs to be questioned? That the wind will always come, and that our confidence in the face of it must be tempered by humility?

Because we don’t lack brilliance. We don’t lack technology.

What we lack is the habit of looking back.

We move from project to project, platform to platform, campaign to campaign—rarely stopping to ask what we’ve missed, what’s shifting beneath us, what needs to be rethought before it fails. And when we do stop, we call it “costly.” We call it “inefficient.” We call it “paralysis by analysis.”

But maybe what looks like slowness is actually survival.

And maybe the question is not whether we can afford to reflect, but whether we can afford not to.

Because innovation, for all its beauty—for all its breakthroughs, its efficiencies, its shining structures—does not come free. It carries with it new forms of fragility. New failure modes we haven’t yet learned how to see. The faster we build, the more we risk building atop assumptions we haven’t even named.

No one in that room, signing off on bolts instead of welds, believed they were gambling with lives. They believed they were making the system work. They were saving money, following procedure, doing what had always been done. And that’s the danger. The biggest threats don’t always look like recklessness. Sometimes they look like optimization.

So what do we do with that?

Do we try to slow innovation down? Can we? Would any nation or institution willingly pull back, knowing others won’t? Is our pursuit of progress so entangled with our survival that to pause feels like falling behind?

That’s the question that won’t leave us. Not whether we can build better systems—but whether we are capable, as a society, of stepping out of the race long enough to see what the race is doing to us.

And if we can’t... what then?

Because the wind will come. It always does. And when it does, it doesn’t care what we meant to do. Only what we actually built.