#1: Can Autonomy Lose Its Wings?
“Procedures are often written after [car] accidents, not before” – Sidney Dekker
At face value, this is ridiculous. After all, you wouldn’t send your Model 3 back for additional training if it crashed into a telephone pole. Tesla would be held accountable. But who is accountable when a Collaborative Combat Aircraft (CCA) misbehaves?
If we treat a CCA like an MQ-9 or RQ-4, the pilot in command, likely a fifth-generation fighter pilot, is to blame. If we treat it like an air-to-air missile, perhaps the manufacturer or the test team is left holding the bag. Before we shake our fists at the Wright Brothers for making this possible, consider this:
Human versus autonomous action is largely a distinction without a difference in this case. Barring a hardware or software malfunction, a system’s actions are simply a model’s outputs to its inputs. A human chose the model’s structure, the training data, and, most importantly, when it was good enough to pass test and be used in anger.
Will CCA make human control of moral decisions in warfare obsolete? Almost certainly not. Will it force us to think harder about who owns life-and-death decisions? Yes.
If you would like to see more, please subscribe.
