
Human-Centred Secure-by-Design: Building Resilience in Rail and Aviation

By: Dr Eylem Thron, Mima | Tags: Human Factors & Ergonomics, Control Centre Design

Security in critical infrastructure is still too often treated as a purely technical problem. Firewalls, intrusion detection systems, patching regimes, and compliance frameworks tend to dominate the conversation. Yet across rail and aviation, operational experience consistently tells a different story. When cyber incidents disrupt services, the causes rarely sit in technology alone. They emerge from the interaction between people, digital systems, procedures, organisational boundaries, and the pressures of real-world operations.

These systems are among the most complex socio-technical environments in operation today. They combine long-established infrastructure with modern digital platforms, involve multiple operators and suppliers, and depend on continuous coordination across safety, security, engineering, and operations. In this context, cyber risk cannot be meaningfully understood - or managed - without considering how people actually work within and across these environments.

[Image: Futuristic airport terminal scene with silhouetted travellers and airplanes, overlaid with digital interface graphics showing maps, charts, data points and location indicators.]

Human Factors and Cyber Resilience

From a human factors perspective, people are not the weakest link in cyber resilience. They are the adaptive layer that keeps complex, safety-critical systems functioning when conditions deviate from the plan. What is often labelled as “human error” is more accurately a signal that systems were not designed to support real work as it happens.

In control rooms and operations centres, staff constantly adjust, prioritise, work around obstacles, and compensate to keep services running safely. Cyber resilience depends on that adaptability - not on attempts to design it out of the system. When people appear to be the problem, it is often because design decisions have failed to account for how work is actually done under operational pressure.

What Secure-by-Design Means in Practice

This is why secure-by-design must be much more than a late-stage compliance exercise. At its core, secure-by-design means building security into systems from the outset and not bolting it on once technology is deployed. For non-technical audiences, this can be thought of as asking early, practical questions such as: How will people notice something is wrong? How will they respond under pressure? What happens when systems degrade or behave unexpectedly?

Meaningful resilience is shaped early, when design decisions lock in assumptions about human behaviour, automation, recovery, and degraded operations. Choices made during system design, architecture, procurement and integration determine whether people will be supported during a cyber disruption - or whether they will be forced to improvise. If systems only function when everything goes right, they will struggle when it matters most.

Secure-by-design only works when systems are designed for real people, not ideal behaviour. Operations rarely unfold exactly as procedures describe. Staff work across organisational and contractual boundaries, manage competing priorities, and operate systems that evolve over decades. Designing for security therefore means designing for handovers, uncertainty, time pressure, and imperfect information. It also means making recovery paths visible and usable, rather than assuming that technical controls alone will prevent failure.

In these environments, new digital systems are typically introduced into existing, long-lived infrastructure shaped by decades of operational and safety decisions. Secure-by-design in this context is less about technical perfection and more about managing trade-offs transparently. Human factors approaches help identify where new security controls may introduce cognitive load, slow decision-making or conflict with operational priorities - risks that are easy to miss if design is assessed solely against technical requirements.

Cyber resilience depends on adaptability - not on attempts to design it out of the system. When people appear to be the problem, it is often because design decisions have failed to account for how work is actually done under operational pressure.

Dr Eylem Thron, Principal Human Factors Consultant, Mima

AI, Automation and Changing Human Roles

These themes are increasingly reflected in industry discussions across both sectors. At the RSSB Human Factors in Rail conference, recent conversations - including those focused on AI and human factors - highlighted how intelligent systems change human roles rather than remove them, reinforcing the need to treat cyber risk as a socio-technical issue. Similarly, discussions on airport operations at various events have highlighted how cyber risk intersects with operational decision-making, organisational interfaces, and frontline performance.

AI and automation are often presented as solutions to growing cyber threats, promising faster detection and more consistent responses. From a human factors standpoint, the key question is not whether these technologies are useful, but how they reshape human work. Poorly designed automation can introduce new risks such as over-reliance, loss of situational awareness, automation bias and confusion during handover between human and machine control. In safety-critical environments, these effects often remain hidden until abnormal or degraded conditions arise.

Secure-by-design therefore requires explicit consideration of how humans and intelligent systems will work together throughout the system lifecycle. Good design supports calibrated trust, clear system intent and meaningful human oversight, rather than assuming automation will remove risk. It also recognises that human operators frequently act as the last line of defence, detecting subtle anomalies and making judgement calls that no algorithm was designed to anticipate.

[Image: Large, sunlit train station concourse with ticket gates in the foreground and people walking and waiting near platforms in the background.]

Operational Resilience Under Pressure

Operational resilience is ultimately tested during disruption. In cyber incidents affecting these systems, what tends to break first is rarely a single technical component. More often, it is coordination: unclear roles, overloaded staff, brittle procedures, and gaps between safety, security and operations. Interfaces between organisations - including suppliers, maintainers and service providers - can become particular points of vulnerability, especially when responsibilities for cyber response are fragmented.

Business continuity and disaster recovery plans may exist, but unless they are designed around how people actually work under real operational pressures, they risk remaining theoretical. From a human factors perspective, resilient behaviour includes anticipation, improvisation, communication and learning. Supporting these behaviours requires more than documentation; it requires training that goes beyond awareness and compliance, and exercises that expose teams to realistic, cross-disciplinary scenarios.

Regulation, Assurance and Learning

Regulation and assurance play a critical role, but they can unintentionally reinforce a tick-box mindset if not carefully applied. There is a growing opportunity for regulators and organisations to encourage secure-by-design thinking by focusing less on static compliance and more on how systems perform under realistic operational conditions.

Aligning safety, security and data-protection requirements around human behaviour - rather than treating them as separate silos - can help organisations address cyber risk more holistically and effectively. Scenario-based assurance, early cross-disciplinary engagement and learning from operational feedback all support this improved alignment.

Designing for Human Adaptability

Across rail and aviation, people remain the last line of defence when technology fails or behaves unexpectedly. Strengthening cyber resilience therefore does not mean removing humans from the system. It means recognising their role as problem-solvers and designing systems that enable, rather than constrain, that capability.

A human-centred approach to secure-by-design provides a practical pathway to building infrastructure that can absorb disruption, adapt under pressure and continue to operate safely.

In complex infrastructure, resilience is not designed into technology alone - it is designed into the relationship between people, systems and the conditions they operate in.

Written by:


Dr Eylem Thron
Principal Human Factors Consultant

Eylem is a Chartered Ergonomist with a PhD in Engineering and an MSc in Human Factors, and has over 10 years' experience across aerospace, defence and rail. Her focus is on user-centred design and accessibility, and she has worked on high-profile projects such as Crossrail and Thameslink.