Facts won't always convince people you're right - a former flight director at NASA shares how he learned to argue better in the wake of a disaster
• He learned this firsthand as he investigated the causes of the fatal Columbia accident, which killed seven astronauts.
• Hill said appealing to data did little to restore trust among team members.
The Space Shuttle Columbia disintegrated upon re-entering the Earth's atmosphere on February 1, 2003, killing all seven crew members on board.
"'If I had been good enough, I could've stood up and said something. Why wasn't I good enough to say what needed to be said?' Most of us felt that way," he told Business Insider.
Hill, the author of "Leadership from the Mission Control Room to the Boardroom: A Guide to Unleashing Team Performance," worked on 24 different space shuttle and ISS missions as a flight director over the course of his career. He was also appointed to lead the investigation into the 2003 Columbia disaster.
Hill and his team needed to pinpoint what had gone wrong and identify tools, operating techniques, and methods of detecting and repairing shuttle damage to prevent future accidents. They ultimately determined that the catastrophe was caused by a piece of foam that broke off during launch and damaged the orbiter's wing.
Hill said higher-ups wanted to fly again within six months. Ultimately, while the engineering solutions were largely designed within six months, it took two years to manufacture the necessary components and return to space.
Over the course of the investigation, Hill said people throughout the community began coming forward with their ideas on how to keep crews safe in space. Many were devastated by the loss of seven colleagues. As a result, most were more emotionally invested in their suggestions than usual, Hill said.
"Frequently they would say, 'If you don't adopt this answer, then you don't care about solving this problem,'" Hill said.
Over time, Hill said, the "rumors in the hallway" intensified, and he was painted as acting in bad faith.
"People said I was in cahoots with the most senior NASA management," Hill told Business Insider. According to the rumors, "all we wanted to do is fly again, and it didn't really matter if what my team was coming forward with would work or if we'd get the next astronaut killed. As long as we're able to fly again."
He said one colleague even shared such concerns with the media.
"To then have not just anybody - but astronauts - say, 'Well, he's not taking my recommendation, so he doesn't care if he kills the next person like me,'" he said. "It was soul-crushing."
To combat the growing distrust, Hill said he tried to double down on the data to back up his proposals. He said the strategy was a "mission control" approach, as he was used to working in a fast-paced environment that valued data and logic. He figured he could apply the same tactic to convince the doubters.
"The effect that it had?" he said. "It just pissed off the people that didn't agree with me. Because they weren't listening. They knew what they thought and they knew that I was doing something in bad faith. The more pissed off they got, the more I would focus on, 'Well here's the data.'"
Hill said he won most of these data-based arguments, and his solutions were ultimately implemented. He was even assigned the role of lead flight director.
But the intense division took its toll. To this day, he estimates 70% of the people he clashed with over the course of the investigation probably wouldn't even make eye contact with him in an elevator.
"The whole community inside NASA completely came apart," he said. "It was kind of like what we're seeing in national politics today. The country is so divided."
Like today's politically divided climate, he said the situation at NASA was hampered by a lack of people listening to those they disagreed with.
"You don't really care what their opinion is, you don't really care what their argument is or why they differ from you," he said. "All you know is, 'Here's what I believe, stop saying what you're saying, start saying what I believe.' That's exactly what was happening inside our community."
The toxic environment nearly prompted Hill to quit the organization. Instead, he said colleagues convinced him to stick around following the investigation and take a temporary management role.
During a management course, Hill said he came to some realizations on how to better deal with disunity. Instead of preaching data to skeptics, he said the best strategy would be to bridge the gap by seeking out and convincing individuals the hardcore skeptics trust.
"The fundamental problem we had wasn't going to be solved just by doubling down on data," he said. "There are other discussions we need to be having to get everybody back on the same page. You can't continue to pound on the table yelling, 'Damn it, read the data! What's the matter with you people?'"