Welcome to Bookshelf, a new series featuring books written by Googlers. For our first installment, Senior Staff Designer Cliff Kuang shares an adapted selection from User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play. Written with Robert Fabricant and described by the New York Times as “a tour de force,” the book is a must-read for anyone working in UX today.

On March 28, 1979, the United States came within thirty minutes of unthinkable doom, after a near-meltdown of Reactor Two at the Three Mile Island nuclear power plant in Middletown, PA. As investigators later discovered, the 150 tons of uranium within the reactor core had hit an astounding 4300°F. At just 700°F hotter, it would have melted through the eight-inch steel reactor containment shell, boring through the earth until it hit bedrock beneath the Susquehanna River and blowing geysers of radioactive steam into the air just 90 miles from Philadelphia.

At the time, bureaucratic poobahs blamed the near-miss on a favorite scapegoat that we still reach for when something catastrophic happens, whether it’s a crash in a self-driving car or a botched missile warning in Hawaii: user error. But the fact is that TMI wasn’t a user error; the people operating the reactor never had a chance. TMI was a failure of design, and one that never should have happened.

The problems began with the reactor’s control panel, which, after a clog in the plant’s cooling system, lit up in a storm of warning lights and klaxons. There were literally hundreds of alarms and lights going off at once. The operators duly rushed to turn them off, trying to figure out the problem. A fog of confusion—what is happening?!—had already begun to descend. The panel itself gave no clear sense of how the plant actually worked, clustering bits of information in meaningless ways.
For example, the panel indicating reactor leaks sat next to the one announcing elevator problems; investigators later found that a red light could mean fourteen different things, some bad, some good. Faced with that rat’s nest of conflicting signals, none of the operators could discern the cause and effect of all the ministrations they attempted to address a once-minor clog.

But maybe the most egregious design error was even more fundamental. There was a release valve at the top of the reactor cooling system. In the control room, that valve was so important that it got its own light and its own switch. As it turns out, the light was wired merely to the switch, not the valve. It could relay only whether someone had flipped the switch—not whether the valve had actually closed. Put another way, that single, idiotic light could mark only the intent to close the valve, not the actual closing of it. So when the operators went to check, the light simply lied. And because of that single piece of misbegotten feedback, the valve stayed open for hours as water boiled away from the reactor core, sending the temperature inside on its terrifying path toward 5000°F.

What happened at TMI caused a sad cascade of events. Together with the wildly misinformed disaster movie The China Syndrome, the accident at TMI helped end political support for nuclear power in America—which experts still point out is one of the safest, most reliable sources of clean energy. So as we look back at the clean energy that might have been, it’s safe to say that TMI was the biggest design failure in US history. It’s also the most instructive. The failures at TMI teach us volumes about what good design actually means: If the feedback in an app or gadget isn’t just right, you’ll never understand how it works. But more than just apps or gadgets, it’s the feedback mechanisms all around us that teach us how the world works. Those mechanisms show us what’s important.
They give us agency when the world would otherwise be just meaningless lights, lying to us about what they do.