Building safety in at the design level only goes so far. You can add all the physical safeguards you want, but if you don’t have supportive procedures alongside them, well… someone’s going to do something stupid.
Take those VTOL arming plugs, for example. Because the plugs complete the electrical circuit for the VTOL motors, the idea is that you do not touch them until everything else is ready for those propellers to spin. This is true whether you’re about to launch the bird for a flight or just doing some ground testing.
But that order of operations is procedure. If someone wanted to be real spicy, they could absolutely plug in the batteries, flip the arming plugs over, and arm the VTOL system before the aircraft is even taken to the launch point.
That doesn’t happen, though, because the people who developed the aircraft and procedures were deliberate in preventing it.
They ensured the pre-flight procedure clearly communicated not to touch the arming plugs until the absolute last step. They enforced that understanding among the first operators. And they drilled it into the heads of every single student they instructed (including myself). They backed up a physical safety feature with a procedural, and even cultural, one.
Speaking more broadly, prioritizing safety in operations goes beyond design features. It shows up in things like cross-checks: one person will go through the full checklist for their area of responsibility, and then swap with another person to have it verified.
As an example, there will often be one person operating the computer that serves as the aircraft's ground control station (i.e., the flight control system). Another person, the crew chief, does the full pre-flight checkout of the aircraft itself. Then the two cross-check: the ground station operator verifies the aircraft's pre-flight checkout, and the crew chief reviews the ground station setup.
This isn’t for a lack of trust. It’s the opposite: it’s telling your team member that you trust them enough to catch your mistakes. Getting another pair of eyes on everything helps find errors and gives everyone involved a dose of confidence.
And then there’s something that might be a bit more contentious: being mindful of whether you’re actually helping someone versus just disrupting their work.
For one flight, one team member was established as the pilot in command (PIC). They had set up the flight plan at the ground control station, as briefed to the whole team.
At one point, they stepped away. Another team member came in and changed altitudes for some of the waypoints, without telling anyone.
This could have been very bad. Thankfully, the PIC caught the changes before launching the bird. But even if those changes were intended as a fix, the real issue was the lack of communication.
If there was a concern with the flight plan, it should have been a conversation between the PIC and the other team member, and the entire flight team should have been informed of any needed updates.
And considering it’s the pilot in command who makes the final decisions and holds ultimate responsibility for the aircraft, changes absolutely should not have been made to their flight plan without their knowledge.
There it is again: communication. A lack of communication contributed to the loss of an X-31. And it’s robust, continuous, honest communication that keeps UAV operations, and the personnel involved, safe.