We review the theoretical foundation for the need for human factors science. Over the past 2.8 million years, humans and tools have co-evolved. In the last century, however, technology has been introduced at a rate that exceeds the pace of human evolution. The proliferation of computers and, more recently, robots introduces new cognitive demands, as the human is required to act as a monitor rather than a direct controller. The use of robots and artificial intelligence is only expected to increase, and the present COVID-19 pandemic may prove catalytic in this regard. One way to improve overall system performance is to ‘adapt the human to the machine’ via task procedures, operator training, and operator selection: in essence, a Procrustean mandate. Using classic research examples, we demonstrate that Procrustean methods can improve performance only to a limited extent. For a viable future, therefore, technology must adapt to the human, which underwrites the necessity of human factors science.

Practitioner Summary: Various research articles have reported that the science of Human Factors is of vital importance in improving human-machine systems. However, what is lacking is a fundamental historical account of why Human Factors is important. This article provides such a foundation, using arguments ranging from pre-history to post-COVID.
- allocation of functions
- general ergonomics
- individual differences
- learning and skill acquisition
- robotics and cybernetics