AI in rail – harnessing AI today for the railways of tomorrow
21 January 2025
Ashurst started the UK rail AI conversation in earnest on 24 October 2024 with a group of 20 industry leaders, including from the DfT and RSSB. The lively discussion addressed the unique opportunities presented by AI, as well as the risks it poses to the rail industry. It took place against the backdrop of the DfT's forthcoming AI strategy and the EU AI Act entering into force.
The UK has no rail-specific AI regulations at present. However:
The EU AI Act provides a comprehensive regulatory framework governing how AI systems in the EU are deployed and used.
An AI system is defined in the EU AI Act as a machine-based system designed to operate with varying levels of autonomy, that may exhibit adaptiveness after deployment, and that infers from the input it receives how to generate outputs (such as predictions, content, recommendations or decisions) that can influence physical or virtual environments.
The risk categories for AI Systems and related obligations are based on risk to health, safety and fundamental legal rights of individuals. High-risk systems are subject to stringent compliance requirements, including conformity assessments, quality management systems and post-market monitoring.
AI systems for rail are likely to be high risk where they relate to safety and where the risk is significant enough to warrant a third-party conformity assessment under the Interoperability Directive.
Read more in our Global AI Regulation Guide.
The backdrop to the UK's regulatory approach to rail AI is the Government's AI "innovation over regulation" White Paper, which outlines five key principles for regulating AI. The Government's aim is to foster innovation while ensuring AI is developed and deployed responsibly, in a way that prioritises public trust and safety. Sector regulators (rather than the UK Government) are expected to decide where and how to regulate AI use in their sectors by applying these five principles:
- safety, security and robustness;
- appropriate transparency and explainability;
- fairness;
- accountability and governance; and
- contestability and redress.
The DfT has identified three priorities to bear in mind when approaching AI in rail in particular, and in transport systems more generally. Its forthcoming Transport AI strategy will be viewed in the context of these three priorities, and the DfT is keen to gather insights from the rail industry to inform its approach to AI.
There are already some excellent examples of AI being used in the rail sector. These largely focus on non-safety-critical functions and tend to be advisory, with a human firmly in the loop (at present) – for example, predictive maintenance for track, trains and equipment.
In a rail industry that is generally very risk-averse, operators, suppliers and regulators are all likely to need a better understanding of AI-related risks and challenges before AI is adopted in safety-critical situations.
A clear regulatory direction could therefore assist earlier AI adoption and innovation in the rail sector.
Businesses navigating the AI landscape should:
Rail AI regulation could:
The UK's hands-off, decentralised regulatory stance could be affected by the EU's comprehensive AI framework because (i) the EU AI Act's standards will filter into the UK via UK businesses supplying to Europe; and (ii) the EU's comprehensive approach may serve as a useful example of best practice in managing rail AI risk. The question is whether UK regulators will see fit to follow the EU's example.
As a predictive tool, AI accuracy and explainability are vital. AI generated results often require human interpretation, and models can sometimes fail to take account of extreme scenarios. If the objectives behind AI regulation are clearly articulated and understood, it is more likely that opportunities will be grasped and related risks managed effectively. In this way, more comprehensive AI regulation could in fact facilitate innovation.
Understanding the purpose, system function, utility and significance of AI systems, while anticipating the impact of potential failures or unintended results, is critical to AI assurance. The ability to anticipate the behaviour of AI systems over time is essential to ensuring the integrity and reliability of AI in rail.
The impact of existing regulations on rail AI is interesting and complex. The Provision and Use of Work Equipment Regulations, the Supply of Machinery (Safety) Regulations and the Construction (Design and Management) Regulations (among others) intersect with rail-specific regulations such as the Railways (Interoperability) Regulations (RIR) and the Railways and Other Guided Transport Systems (Safety) Regulations (ROGS), and already apply to the introduction of AI systems. The question is what more, if anything, is required for regulating rail AI now and in the future.
The self-evolving nature of AI systems cannot be ignored. Robust internal governance frameworks must be established to ensure ongoing compliance with ethical and legal standards, and to manage evolving risks (to the extent these can be predicted and future-proofed).
Rail AI has the potential to deliver significant efficiencies and value, but may also bring increased risk. Proactive risk management involves identifying and mapping AI-related risks from an early stage, improving decision-making, enabling more effective AI adoption and streamlining compliance efforts.
A proactive approach reassures investors and stakeholders that potential risks are being continually and diligently identified and mitigated; stakeholder confidence is essential for investment in rail AI. The resulting focus on transparency and accountability also builds community and customer trust.
Managing AI risks in the rail sector requires a comprehensive approach across the various risk domains, including regulatory, legal, business continuity, cybersecurity and safety. A robust AI governance framework will enable rail entities to use AI responsibly, enhance their decision-making and realise the full value of their AI investments.
This publication is a joint publication from Ashurst LLP and Ashurst Risk Advisory LLP, which are part of the Ashurst Group.
The Ashurst Group comprises Ashurst LLP, Ashurst Australia and their respective affiliates (including independent local partnerships, companies or other entities) which are authorised to use the name "Ashurst" or describe themselves as being affiliated with Ashurst. Some members of the Ashurst Group are limited liability entities.
Ashurst Risk Advisory LLP is a limited liability partnership registered in England and Wales under number OC442883 and is part of the Ashurst Group. Ashurst Risk Advisory LLP services do not constitute legal services or legal advice, and are not provided by qualified legal practitioners acting in that capacity. Ashurst Risk Advisory LLP is not regulated by the Solicitors Regulation Authority of England and Wales. The laws and regulations which govern the provision of legal services in other jurisdictions do not apply to the provision of risk advisory services.
For more information about the Ashurst Group, which Ashurst Group entity operates in a particular country and the services offered, please visit www.ashurst.com.
This material is current as at 21 January 2025 but does not take into account any developments after that date. It is not intended to be a comprehensive review of all developments in the law or in practice, or to cover all aspects of those referred to, and does not constitute professional advice. The information provided is general in nature, and does not take into account and is not intended to apply to any specific issues or circumstances. Readers should take independent advice. No part of this publication may be reproduced by any process without prior written permission from Ashurst. While we use reasonable skill and care in the preparation of this material, we accept no liability for use of and reliance upon it by any person.