What the Optus Outage Teaches Us About Human Factors and the Systems We Build

Behind every network failure lies a human one. Here’s why designing for human factors is critical to building resilient systems.

When Optus’s network failed earlier this month, hundreds of Australians couldn’t reach 000, resulting in at least three deaths. While the immediate cause was a misconfigured firewall, the deeper story was a breakdown in the relationship between humans and technology. Engineers trusted dashboards that didn’t monitor emergency call routing. Escalation pathways failed. Information moved slowly, filtered and softened by hierarchy. These events offer a lens to explore how human behaviour, cognition, and system design combine to create or prevent crises. This article looks at the human factors that shaped the Optus outage and what they reveal about the systems we build.

The idea that crises emerge from the interaction between people and systems sits at the heart of human factors, a field that branches from industrial and organisational psychology. Human factors studies how people interact with technology, environments, and processes, and how design can either support or undermine human performance. The discipline originated in aviation and military contexts, where it became clear that many disasters weren’t caused by faulty machines but by systems that didn’t fit the way humans think and act under pressure. Despite its importance in safety-critical industries, human factors thinking remains underused in telecommunications, a sector built on complex, interconnected technologies that depend on both automation and human oversight.



The Optus outage shows why this field matters. Human factors research reveals that most major failures are not purely technical but socio-technical: the result of how human behaviour, organisational culture, and system design interact under stress. One important human factors concept relevant to the Optus outage is automation bias: the tendency to over-trust technology and to assume that if the system isn’t flagging a problem, there must not be one. This feeds the belief that the absence of alerts is confirmation of safety. At Optus, silence from the monitoring tools bred confidence. People stopped asking questions because the machine gave them no reason to. This reflects a design flaw more than a human one. When systems don’t communicate uncertainty, people learn to stop looking for it.

This cognitive challenge is magnified by a concept known as normalisation of deviance, where small departures from best practice gradually become routine. When shortcuts don’t cause visible harm, people learn that rules are flexible. Over time, informal shortcuts (“this firewall update is low-risk, we’ve done it before”) become standard practice. The social reinforcement loop (“nothing went wrong last time”) overrides formal risk frameworks. As Abdullah et al. (2023) put it, “routine shortcuts that appear harmless” erode standards and create systemic vulnerability. This cultural drift explains why many crises appear sudden: the system has been quietly rewiring itself through everyday adaptation.

The final factor to explore is the set of social and cultural dynamics within Optus’s own organisation. Communication barriers and rigid hierarchies slowed escalation (a pattern often described in social psychology as diffusion of responsibility, where everyone assumes someone else will act). In high-reliability industries like aviation or healthcare, this is countered with flatter decision pathways and psychological safety, so that anyone can raise a flag. The telecommunications sector, by contrast, still relies on legacy hierarchies that filter information just when speed and clarity are most needed.

Taken together, these human and organisational factors reveal that Optus’s outage was not an isolated failure but a predictable outcome of socio-technical drift. The lesson extends far beyond telecommunications. As automation and AI spread, organisations are increasingly handing perception, analysis, and decision-making to machines, often without redesigning the human loops around them. Abdullah et al. (2023) warn that “automation should not remove humans from the loop, but redesign the loop to fit human cognition.” Without that redesign, complexity will continue to outpace comprehension.

For leaders, this shift demands a new kind of strategic workforce planning that doesn’t just forecast skills, but designs systems of work. Every new technology changes how humans perceive, prioritise, and act. Planning for the future of work therefore means planning for the interaction between people and technology: how they make sense of information, share responsibility, and recover when things go wrong. The most resilient organisations will be those that design their workforce and systems as one integrated ecosystem where humans are supported by the tools they use.

By Georgia Marasco



Further reading

Abdullah, H. A., Alenezi, A., Alabdulatif, A., & Alshamrani, A. (2023). Human factor: A critical weak point in the information security of an organization’s Internet of Things. Sensors, 23(12), 5553. https://doi.org/10.3390/s23125553

Neumann, W. P., Winkelhaus, S., Grosse, E. H., & Glock, C. H. (2021). Industry 4.0 and the human factor: A systems framework and analysis methodology for successful development. International Journal of Production Economics, 233, 107992. https://doi.org/10.1016/j.ijpe.2020.107992

Salvendy, G. (Ed.). (2012). Handbook of human factors and ergonomics (4th ed.). John Wiley & Sons.


