In 2018, an autonomous Uber vehicle struck and killed a pedestrian. Although autonomous, the vehicle was not driverless: the onboard artificial intelligence (AI) had handed off control to a safety driver at the last second. That phrase, "handed off," is at the heart of a recently launched National Highway Traffic Safety Administration investigation into Tesla. Like many socio-technical systems, semi- and fully autonomous vehicles are designed so that decision-making can be handed off from a machine to a human operator. This is worrisome because decades of human factors research have shown that people perform poorly under exactly these conditions. When cast in the role of handoff recipient, humans suffer from a combination of four linked problems: complacency, inattention, skill atrophy, and automation bias (i.e., over-trust in the machine).