
Safeguarding Critical Infrastructure – Integrating Human Factors in Cyber-Security

By: Dr Eylem Thron | Tags: Human Factors & Ergonomics

The ongoing digital transformation across critical infrastructure sectors, such as railways, offers numerous advantages but also introduces new and complex cyber-security challenges. It creates an unfamiliar operational environment characterised by increased networking over the internet. While this connectivity delivers operational benefits, it also exposes critical systems to risks that were not possible with traditional closed systems, including unauthorised access, interference and data leaks.


However, system developers and operators often prioritise technical measures and training to manage and control these risks, neglecting the diverse roles and behaviours of the human operators interacting with the system. While technical solutions play a vital role in cyber-security, relying solely on them is not sufficient. A Human Factors (HF) approach is essential to effectively mitigate cyber-security risks, because it recognises that human behaviour and cognition are critical factors in the design of secure systems. In a recent journal paper, we emphasised the importance of HF in cyber-security for railways, focusing particularly on signallers: frontline operators who face increased risks due to automation and interconnectedness, and whose work demands a comprehensive understanding of HF in cyber-security.

Advances in automation and connectivity escalate cyber threats to critical infrastructure operations such as railways, particularly threats arising from human error. Research indicates that approximately 90% of cyber failures can be attributed to human error, even among individuals who are well trained and motivated. Despite the implementation of technical solutions and organisational policies, human error remains a prevalent and significant challenge in cyber-security. As critical infrastructure embraces the Internet of Things (IoT), cloud computing and Artificial Intelligence (AI), the attack surface grows rapidly and threats arrive from multiple directions. Effective cyber-security strategies must therefore adopt an HF approach, considering human capabilities, behaviours, motivations and vulnerabilities, so that security measures cannot be easily circumvented by human error.


System developers and operators often prioritise technical solutions such as passwords, firewalls and adherence to security standards. However, usability issues, time pressures, organisational ambiguity, inadequate training and a lack of risk awareness all contribute to human error and pose significant challenges to cyber-security efforts. In safety-critical environments, where decisions are time-sensitive and workload is high, the risk of human error escalates. It is greatest in degraded or emergency modes, where equipment faults can amplify risks and lead to serious operational or organisational consequences.

Human Factors encompasses many aspects, including Human Machine Interface (HMI) usability, human error analysis, organisational processes and policies, training adequacy, and clarity over responsibilities and communication. HF methodologies, such as cognitive task analysis and workload analysis, help identify sources of human error and potential security risks.

Common sources of human errors stem from various factors, including:

  • Usability issues: Poorly designed user interfaces contribute to user (operator) errors.

  • Pressure: Time constraints and organisational demands lead to high workload and increase the likelihood of mistakes.

  • Weak operational organisation: Lack of clarity regarding responsibilities, policies and processes creates confusion.

  • Inadequate training: Insufficient training leaves users ill-equipped to handle cyber-security challenges effectively.

An HF approach addresses these challenges by focusing on understanding human tasks, goals, behaviour and cognition to design systems that are much more resilient to human error. HF professionals can provide interventions, such as:

  • Improving user interface design to enhance usability and reduce errors.

  • Developing comprehensive training programmes to increase user awareness and proficiency in cyber-security protocols.

  • Enhancing organisational policies and processes to clarify responsibilities and promote a culture of cyber-security awareness.

  • Providing support during incident reporting and response to improve resilience to cyber threats.

Addressing these factors increasingly requires collaboration between HF professionals, security engineers and safety experts. Collaborative analysis of user tasks and vulnerabilities allows for a comprehensive approach to the cyber-security triad (confidentiality, integrity and availability), bridging the gap between technical solutions and HF considerations.

Thus, as organisations navigate the complex landscape of cyber-security in the digital age, integrating an HF approach is indispensable. By acknowledging the inherent fallibility of human actors, recognising the importance of human behaviour and cognition in cyber-security, and addressing operator needs and limitations, organisations can develop more robust and resilient systems that safeguard critical infrastructure against evolving threats. Embracing an HF approach is not merely a matter of enhancing technical resilience; it is a fundamental step towards safeguarding the integrity, confidentiality and availability of critical systems in an ever-evolving cyber landscape.

Written by:


Dr Eylem Thron
Principal Human Factors Consultant

Eylem is a Chartered Ergonomist with a PhD in Engineering and an MSc in Human Factors, and has over 10 years' experience across aerospace, defence and rail. Her focus is on user-centred design and accessibility, and she has worked on high-profile projects such as Crossrail and Thameslink.