Patient safety through cybersecurity: preventing harm from digital threats – podcast transcript

Host: Rudi Hennessy
Guests: Chris Day and Robyn Dennis

Transcript

Rudi Hennessy: Hello and welcome to our Digital Clinical Safety Podcast, brought to you by NHS England. My name’s Rudi Hennessy and I’m in the digital clinical informatics safety team at NHS England. Today I’m delighted to be joined by Chris Day and Robyn Dennis. So please welcome Chris and Robyn.

Chris Day: Hello.

Robyn Dennis: Hello.

Rudi Hennessy: So we’re very excited for you to be able to share with us some insight into the role of a cybersecurity CSO and Tiger Team Leader.

So we can’t wait to hear a bit more about you, your experiences, any top tips that you can pass on. So just to get things started, Chris, can you first tell us a bit about your journey of how you became a cyber clinical safety officer and what led you to where you are today?

Chris Day: Yeah, so it’s been a bit of a long journey, a little bit twisted, with a lot of changes of direction. I’m a physio by background and still practise on weekends now. But I moved from being a rapid response physio in the community to a clinical informatics manager a number of years ago, within an acute trust, and put lots of different products in, from a PACS replacement to an EPR, as well as other digital implementations. I then moved to NHS Digital a couple of years ago. I’m a clinical safety officer as well, in terms of my training. And I moved into cyber operations when I moved across to the national team, which has since moved across to NHS England, where we are now. So yeah, quite a different journey to a lot of other people.

Rudi Hennessy: Yeah, lots for you to draw on in your current role. So, in terms of the role that you’re doing now, what does that look like and how does it differ from a cyber security officer?

Chris Day: Yeah. So Robyn is on the call, and she will tell you a bit more about the cyber security officer and that type of role shortly. But in terms of the clinical element of my specific role, it’s to look at the impacts that cyber security is potentially preventing or causing: the true impact on the clinical front line, as well as the potential patient harm that might be caused as a result.

Rudi Hennessy: Great to have you on, Chris. So hiya, Robyn. As a cyber tiger team lead (that was a mouthful), how does your role interact with the clinical safety officer role that Chris has just explained?

Robyn Dennis: Thanks, Rudi. Nice to be here with you both today, by the way. So, yeah, tiger team lead. It’s a bit of an unusual job title. Essentially, the purpose of my role is to offer agile, specialised cyber resource, using broad cyber expertise to help unpick thorny problems and support specific initiatives and objectives, precisely when it counts, across cyber ops. That means I work really closely with our clinical team in cyber ops, which is headed up by Chris. And by extension I think I’ve started to think clinically almost by default, because Chris has brainwashed me. But I also find it helps me to think about the patients and our frontline staff as I work to try and ensure that security is seen as a benefit rather than a blocker.

So yeah, on a day-to-day basis I could be working across multiple projects; I find myself flitting between things like a bit of a butterfly. Quite a lot of those are alongside our clinical colleagues, to help find solutions to those complex issues or to use a bit of creativity for innovation.

Rudi Hennessy: Yeah, it’s vital that we have those kinds of technical minds and clinical minds working together and collaborating. You’ll both know from the role that I do, we look at clinical risk management on a daily basis. Chris, from your perspective, where does clinical risk management come in when we’re talking about cyber security?

Chris Day: Yeah. So you’ve just highlighted one of the key points there: you have to have that MDT, multidisciplinary, engagement between the different individuals of an organisation, especially when you’re implementing major digital products, and get the assurance right that a product is safe by design, which in essence comes on the back of secure by design. So looking at clinical risk management, there’s a need for assurance on the security and the safety of a product: for example, that a threat actor is unable to get into it.

Because they might actually compromise the data, or change it, to ultimately cause patient harm. There are also mitigations that need to be looked at as part of the risk appetite when deciding what the requirements are. An example of that might be the level of authentication security versus the actual risk, and then the clinical impact of putting that security in. So you may have, for instance, remote access solutions that need multi-factor authentication. You then might have something on site, in a completely isolated room in radiology, for instance, where actually you need different requirements. So it’s looking at the potential impact of that security requirement compared to what the actual need is. And that ultimately will give you your clinical risk management assurance, as part of the documentation you complete.
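
To make that idea concrete, here is a minimal sketch, in Python, of matching authentication requirements to access context, as Chris describes. The rules, factor names and data structure are hypothetical illustrations, not a real NHS policy or product.

```python
# Hypothetical sketch: tailoring authentication requirements to access
# context, so the security control matches the actual risk. Illustrative only.
from dataclasses import dataclass

@dataclass
class AccessContext:
    remote: bool               # connecting from outside the trust network?
    physically_isolated: bool  # e.g. a locked, isolated radiology room

def required_factors(ctx: AccessContext) -> list[str]:
    """Return the authentication factors this context would demand."""
    if ctx.remote:
        # Remote access solutions need multi-factor authentication.
        return ["password", "one_time_code"]
    if ctx.physically_isolated:
        # Physical controls already mitigate some risk, so the
        # authentication requirement can legitimately differ.
        return ["password"]
    # Default for on-site, shared clinical areas.
    return ["password", "smartcard"]

print(required_factors(AccessContext(remote=True, physically_isolated=False)))
# ['password', 'one_time_code']
```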

Rudi Hennessy: Yeah, from what you’re saying there, Chris, it’s very much that security and safety go hand in hand. And as you’ll know, the audience for this podcast is those working in digital clinical safety, including clinical safety officers. So just for them, could you give a bit more information on any digital clinical safety crossover, and the activities that you might do?

Chris Day: Yeah. So, as digital maturity develops across the NHS, we now have a number of organisations that have achieved HIMSS level seven for digital maturity. This is brilliant in terms of the transparency of documentation and patients’ data, so that we can ultimately provide a better form of patient care and keep people healthy. The issue is that as you become more transparent, the data becomes more available to everyone, which includes threat actors. So in essence, it’s key, as you were saying, to work hand in hand to ensure that the safety and health of patients improves, but with that security and assurance on the back of it. Your clinical safety officers are fundamental to this. In terms of your DCB standards, DCB0129 for products being developed and DCB0160 for implementation, it’s key that cyber security is fundamentally considered there. One thing that helps with that is looking at things like your DTAC, which has your DCBs in the clinical arm, but then looks at your Data Security and Protection Toolkit in one of the technical arms, covering some IG elements; a big part of that is cyber security. And coming back to that engagement, it’s working with those cyber security specialists to get assurance that that’s met. Then hopefully together we can deploy products that are safe and fit for the future.

I don’t know, Robyn, whether you wanted to say something on that as well.

Robyn Dennis: Yeah, thanks, Chris. I was just going to reinforce your points around that. I think it’s absolutely vital that cyber and information security aspects are woven into our digital solutions from the start, from design, to reduce the potential for disruption that could otherwise impact on clinical safety, and therefore to prevent patient harm. And as many of us have probably seen in the press through several high profile cyber incidents, the impact of a cyber event on digital solutions can absolutely impact on clinical safety. Having those robust security controls in place means that you’re not seen as the low-hanging fruit by threat actors, so the opportunists amongst them will just move on to easier targets and leave your organisation safe. And in the health and care sector, where the delivery of services is often time critical and people’s lives and wellbeing are at stake, the importance of incorporating cyber security into the digital estate really can’t be overstated.

Rudi Hennessy: Yes, from what you’re both saying, it’s very much a lifecycle approach, which is exactly what we do from a digital clinical safety standards perspective: working throughout the lifecycle of a product, for example. Just leading on from that, what are the current cyber challenges as you see them, and how would they be managed?

Robyn Dennis: Shall I kick off with this one? I think many of us may have heard the term, but ransomware is the key cyber threat. Ransomware is where attackers use our data against us. They will either encrypt that data and leave it so it’s not accessible to us, which obviously causes issues, or they might take that data, threaten to publish it, and hold us to ransom to basically get it back. So ransomware is a key cyber threat, and with the commercialisation of threat groups, keeping up to date in that area is really challenging. Some of the threat groups that carry out ransomware attacks have huge budgets and lots of capabilities, and they’re constantly developing new ways to cause pain. And, as Chris I think has already mentioned, the downtime can be really extensive, so taking steps that aim to prevent that is key to keeping your critical functions delivering as intended. Aside from financially motivated cybercrime, I think AI is also increasing the risk to a degree.

I think the use of AI to create phishing emails now means that some of the signs we might have used to spot dodgy emails, like poor spelling and grammar, aren’t as prominent now. There’s also something called AI coding, which is basically using AI to develop applications, and it’s very likely that’s also being used to develop malware: things like viruses and worms. Additionally, the use of AI in an unapproved state could introduce risks to data and lead to a poor security posture. Large language models know how to code, but they’ve learned from both good and bad examples that are publicly available, so they don’t necessarily know how to code securely.
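
As one illustration of the kind of insecure pattern a model may have learned from public code, here is a minimal Python sketch contrasting a SQL injection flaw with the safer parameterised equivalent. The table, identifiers and values are dummy data invented for the example.

```python
# Minimal sketch: the classic insecure pattern (string-built SQL) next to
# the safer parameterised version. All data here is invented dummy data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (record_id TEXT, name TEXT)")
conn.execute("INSERT INTO patients VALUES ('0001', 'Mrs P')")

user_input = "0001' OR '1'='1"  # attacker-controlled value

# Insecure: string formatting lets the input rewrite the query (SQL injection).
insecure = f"SELECT name FROM patients WHERE record_id = '{user_input}'"
print(conn.execute(insecure).fetchall())  # [('Mrs P',)] - every row leaks

# Secure: a parameterised query treats the input purely as data.
secure = "SELECT name FROM patients WHERE record_id = ?"
print(conn.execute(secure, (user_input,)).fetchall())  # [] - no match
```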

But from a data security standpoint, we need to be conscious that free AI services often aren’t entirely free: the things we put into them could be used to teach the AI, which means essentially you could be paying with data. A couple of other things I just want to touch on are technical debt, which is the failure to manage a system’s product lifecycle, and the way competing priorities for funds and the limitations of available funding introduce additional concerns. That leads to things like unsupported systems and configurations being maintained, and weak encryption protocols remaining in use, which again just increases risk in this space.
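
On that last point, one concrete step is retiring weak protocols in the services you control. A small sketch, assuming a Python client and a reachable host, of refusing to negotiate anything below TLS 1.2:

```python
# Sketch: enforce a TLS 1.2 floor so legacy SSLv3 / TLS 1.0 / TLS 1.1
# connections are refused rather than silently accepted.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # weak protocol versions rejected

# example.org stands in for any service you operate or connect to.
with socket.create_connection(("example.org", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.org") as tls:
        print("Negotiated:", tls.version())  # e.g. 'TLSv1.3'
```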

And finally, I wanted to talk a little bit about cultural change. That calls back to the need to help others understand why cyber is really important. We mustn’t just force security on people; that means people are going to see it as a pain, a blocker, an additional challenge for them to negotiate in their day. Whereas we want to tell people the why and the benefits, and work collaboratively to ensure that the most appropriate solutions are implemented. That means listening to all voices to find common ground.

Rudi Hennessy: Yeah, those insights are really valuable; obviously, without knowing the challenges, we can’t fully come up with the solutions, so it’s great to hear you talk through those. So, just in terms of incidents, how do cyber security incidents cause detrimental impact to clinical safety? Very much coming back to clinical safety and patients. I don’t know if that’s one for you, Chris.

Chris Day: Yep. So just to give some examples, the first one to touch on is what Robyn alluded to earlier about ransomware. If someone locks you out of one of your systems or your network, it’s that consideration of what you’re actually going to do if that happens. And a lot of the time, as times change and threat actors mature and develop, it’s not a case of if it’s going to happen, it’s when. So people need to be prepared for these eventualities. As an example, a lot of acute trusts have huge electronic patient records that hold significant amounts of patient data, which can be very sensitive.

Now, if that becomes unavailable because a threat actor has actually locked the system, then on a day-to-day basis you may be able to manage for a day or two on paper, for instance. But as we’ve seen in attacks over the last couple of years, these incidents don’t last one or two days; they can last six months, or a year. And in terms of data repatriation at the end of it, it’s not just about containing the incident. It’s how you’re going to run your business while the system is unavailable, but also how long it’s going to take to rebuild it and get the system back up and running with the original data you need to put back into it, if that’s even available.

So there’s a big emphasis now on the business continuity plans that we need to put in place, which are very much an operational requirement. That needs the entire business, as an essential function, to come together to ensure that the plan works, that it’s adequate, and that it’s tested on a basis the risk assessors deem appropriate, so that we can make sure these things work.

Another example is something like an electronic prescribing and medication administration system. If a threat actor actually gets into your system and you don’t recognise that they’re there, because your system isn’t alerting, they could start manipulating your data, because they’re trying to cause, for instance, political unrest. Then you may actually be treating patients incorrectly with different types of medication. For instance, if there’s a certain level of insulin you need to give an individual, but you give them three times the dosage because the system says so, what’s actually going to happen to that patient? Obviously significant distress, and it could ultimately lead to death.

You’ve also got things such as data exfiltration. There are instances of something called double extortion, where a threat actor locks your system out but then subsequently steals all the data out of the system as well, for potential financial gain, for instance by putting it on the black market. That is very much an information governance issue, in terms of how you would deal with that data and the different regulatory bodies you would have to notify. But ultimately the clinical safety side needs assurance that if that happens with that sensitive data, for one, who do you need to tell in terms of the individuals impacted? And how are you actually going to support them? If you’ve got patients with anxiety, how are you going to limit the impact on them if, for instance, you told them that their data had vanished or been published somewhere? So there are huge repercussions here in terms of patient harm that we need to consider as clinical safety officers when products are being put in place, along with the security requirements around them.

Rudi Hennessy: Thanks, Chris. For those listening, I’m sure it brings it to life when you talk through those examples, and it emphasises why it’s so important to acknowledge that cyber and safety go hand in hand, really, when you walk through your proactive and reactive measures within clinical risk management. So, with that in mind, what are the most common causes and hazards that you see?

Robyn Dennis: So I think in terms of the most common issues, the kind of things we see are weak or recycled passwords in combination with a lack of multi-factor authentication. Weak or recycled passwords are those which are either very easily cracked, or used in multiple different places, or follow an obvious pattern: you might have January1 and then January2, etc. Those kinds of things mean attackers can get into accounts, as what we call initial access. That initial access method is particularly problematic for remote access systems, which by design can be reached from anywhere, basically from any device, often globally, while linking into an organisation’s internal network or hosting platform. And once somebody is in, they can do all sorts of untold damage. Tied into that, we see cases where identity and access management gaps contribute: things like stale or old accounts, or old methods of remote access that have been forgotten about and not closed down.

That can include third-party access. So you might have had engineers supporting particular assets or software who are no longer needed, but those routes in still exist. And that’s particularly relevant if you’ve got accounts that are privileged, those accounts with additional administrative capabilities, because they give people a lot more access to do more damage. So that’s where you use things like the principle of least privilege, limiting access right down to what’s absolutely necessary, and link that with things like just-in-time access and just-enough access, to limit the risks around those kinds of things.
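
A small sketch of one way to hunt for the stale accounts Robyn mentions, assuming a hypothetical CSV export of accounts with username, last_login and is_privileged columns; the file name and layout are invented for the example.

```python
# Hypothetical sketch: flag accounts with no login for 90+ days, surfacing
# privileged ones first for review. The CSV export format is assumed.
import csv
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=90)
today = datetime(2024, 6, 1)  # fixed "today" so the example is reproducible

with open("accounts_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        last_login = datetime.fromisoformat(row["last_login"])
        if today - last_login > STALE_AFTER:
            tag = " [PRIVILEGED - review first]" if row["is_privileged"] == "true" else ""
            print(f"Stale account: {row['username']} (last login {row['last_login']}){tag}")
```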

Another area we see is insufficient management and misconfiguration of systems. Patching and security updates are released in response to weaknesses being found in a solution; if the patches are applied, the gaps in security are closed. If the patches aren’t applied in time, then potentially those gaps remain and a threat actor can use them. And sometimes we’ll see multiple vulnerabilities being used together to increase the impact.
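
The underlying check behind patch management is simple version arithmetic, sketched below with the packaging library; the product versions and advisory are hypothetical.

```python
# Toy sketch: is the deployed version older than the version that fixed
# a known vulnerability? Versions and advisory are invented for illustration.
from packaging.version import Version  # pip install packaging

deployed = Version("2.4.1")
fixed_in = Version("2.4.7")  # per a hypothetical vendor advisory

if deployed < fixed_in:
    print(f"Vulnerable: patch from {deployed} to {fixed_in} or later")
else:
    print("Up to date for this advisory")
```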

And another common issue we see more and more is supplier compromise. Organisations often choose to commission digital services from a variety of providers; that’s the norm now. But if those suppliers don’t have the right security controls, they become another surface through which attackers can gain access and cause harm. And I think that comes back to lifecycle management, which we’ve talked about already: it highlights the importance of supplier risk management, but also why we need to consider cyber security throughout a product’s lifecycle.

Chris, I don’t know if you’ve got more to add on that.

Chris Day: Yeah. I think there’s a key point here that needs to be touched on, which is that in a lot of product implementations you have a clinical safety officer who is put into a programme of work to, in essence, implement a specific product. Now, due to the capacity of trusts and different organisations, that’s normally for a set period of time until the product is put in, at which point the clinical safety officer can potentially be moved on to a different programme of work due to competing pressures and requirements. With regards to the product lifecycle, as you’ve said, Robyn, we need to look at the end-to-end process: when we initially implement the product, we give it assurance.

But what actually happens in five years’ time, when different pieces of the software start to become out of date, or become unsupported because the operating platform is no longer supported by the provider? You might be on a 10 or 15 year contract, and these types of things need considering when you’re actually implementing, so that the longevity of the product until its end of life is reviewed and considered. Additionally, to touch on a point from earlier, AI is becoming more and more prominent, and this emphasises even further that we need end-to-end product lifecycle assurance, because you may have changes within the product as the AI develops and as it integrates with the systems you’re utilising it for.

And then there’s the end of the product’s life, when you need to decommission something and change over to a different system or stop that service. What actually happens to that data? Do you have the security around it for its ongoing safety? Is it going to be stored in the right place? Does it need moving somewhere, and is that even possible? It might be that it needs migrating to a different system, in which case how are we going to do that securely? So there are lots and lots of different things across, for instance, a potential 15-year lifespan, and it all needs considering at business case level, when you’re actually putting that together to potentially implement a product.

Rudi Hennessy: Yeah, some really interesting points there. And definitely with regards to that lifecycle approach, that’s absolutely what we recommend, and obviously you’ve got the standards that talk about that lifecycle approach as well, from design and development all the way through to decommissioning, and resourcing is part of that, isn’t it? Also really important, Robyn, you pulled out quite a few common issues and potential causes. We do that for a reason: within clinical risk management, we walk through a process so that we can be proactive and reactive in mitigating these causes or issues that may well come about. So what do you see as the key things an organisation or an individual can actually do to prevent these cyber security incidents?

Robyn Dennis: Yeah, thanks, Rudi. I think there are quite a few options here, but one area that’s really worth calling out is staff education and awareness, because that’s really important: everyone has a role to play. I’m not sure if others have heard the term, but I’ve used it before: cyber is a team sport. So helping colleagues to do the right things is a really key way to help prevent cyber security incidents. Ideally, that’s not just an annual e-learning course; you can also think about where cyber security education can be most beneficial. Phishing emails are definitely one area. I mentioned earlier that the guidance used to be around checking for poor spelling and grammar; AI is changing that now. So what are the other signs we can use? Spoiler alert: urgency is often one of the most common tactics.

But also, how do we ensure that cyber is deeply embedded throughout the organisation? As Chris just mentioned around building it into the business case, how do we make sure cyber is embedded in your digital-related projects and procurement, so that it’s a default action to consider cyber, not an exception? There are lots of aspects of cyber that can impact us both at work and in our personal lives, and helping staff to be aware of those, and of the types of attacks that could reach them in any capacity of their lives, really helps them to protect themselves and the organisation. That helps to build the consistent behaviours that build the strong defences we all need to stay out of harm’s way. Linked to that is communication. Communicating the why is really important to ensure that people understand cyber, and to bring it to life. One example: why do we have multi-factor authentication? It’s to reduce the likelihood of the bad guys getting into our accounts; instead of only needing a password, now they need the password plus something else.

And statistics show that they won’t have that something else 99% of the time, and that stops them in their tracks; it keeps our systems and our data safe, keeps us safe at home, keeps us safe at work. Staff often find it difficult to understand why someone else would want access to their work account. I often see things like memes on social media where people are saying, why would somebody be trying to get into my work account anyway? If they want to get in and do my work, that’s absolutely fine. But maybe we can help them to understand and recognise that, to a financially motivated cybercriminal or a nation state funded attacker, the information in the systems they can access is really interesting and really valuable.
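
That “something else” is often a time-based one-time code from an authenticator app. A minimal sketch of the idea using the third-party pyotp library; the enrolment and login flow around it is deliberately simplified.

```python
# Minimal TOTP sketch of "password plus something else".
# Requires the third-party pyotp library: pip install pyotp
import pyotp

secret = pyotp.random_base32()  # enrolled once into the user's authenticator app
totp = pyotp.TOTP(secret)

submitted_code = totp.now()  # stand-in for the six-digit code the user types in

# At login, the server verifies the code as the second factor; a stolen
# password alone no longer gets an attacker in.
if totp.verify(submitted_code, valid_window=1):  # tolerate one 30s step of drift
    print("Second factor accepted")
else:
    print("Second factor rejected")
```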

We all have a duty of care to protect the sensitive information that we hold about our patients, our staff and everyone else. And I think the final thing I’ll call out is one really key element. We’ve talked about being prepared already, around business continuity and so on: preparation for an incident is really key. Don’t make the mistake of thinking it will never happen to you. As Chris has already said, in the case of cyber it’s not an if, it’s a when; assume it will happen and be prepared. Have your response plans ready, both in terms of technically restoring systems and maintaining the delivery of essential services while that goes on. Have your communications ready drafted and your decision makers agreed, and test that those plans and arrangements work ahead of time, so that you’ve got confidence in the process and staff are familiar with the steps to take.

Rudi Hennessy: Thanks, Robyn. Yes, some really interesting insights into what individuals, and obviously organisations, can actually do. It sounds very much like it’s a culture: organisations need to create that culture of cyber safety. We talk about a culture of digital clinical safety in my role, and it’s not just one person’s job, is it? So, with that as well, from a patient safety perspective, what can be done to prevent these incidents leading to harm?

Chris Day: Yeah. So I think Robyn’s touched on a lot of the points, and it does feel repetitive sometimes, but it really does come down to ensuring that your business continuity plans are actually in place, that they’re tested, and that they actually work. Sometimes we’ve found incidents where we’ve needed to change those business continuity plans, and yes, we’ve done that, and yes, lessons are learnt. But if that was done proactively, prior to the incident, it would put a lot less pressure on the staff dealing with the incident, and it would mean the response was more successful, with less disruption. There’s also the disaster recovery side of this: if you are, or rather when you are, attacked, how are you actually going to get this data back and available for your clinicians? Because if you don’t have it available, then if, for instance, Mrs P comes into A&E and you don’t have her past medical history, how do you know how to effectively treat her?

How do you know what medication she’s sensitive to? It starts to build a picture of where things can become very harmful if we don’t have access to that data, which ultimately wasn’t the same when we were on paper, when you could say, I’ve got medical notes that can be pulled from an archive, rather than a locked-down system that you can’t touch.

One of the other parts of this, which again links with that cultural side, is the engagement between different members of staff. When an incident does happen, it’s getting assurance that the different types of staff you have all engage with each other appropriately, with the level of communication needed. You may have your digital teams become aware of an incident, but have they told a responsible clinician for that system that this is happening, so that if containment activities are required, the appropriate risk assessment can be made and decisions put in place, with patient safety as the ultimate consideration?

You’ve also got your operational staff within that. If, for instance, you need to switch off a system, is there something you fundamentally need to do before you switch it off? You might need to do something to prepare your business continuity plan; if it relies on paper, for instance, you might need to print more forms off so that you’re actually ready for that happening. Hopefully people are in that position, but they potentially wouldn’t be if this wasn’t tested and the right things weren’t in place. So I would say engagement and your business continuity are probably the two key things there.

Rudi Hennessy: Yeah, absolutely. Very much multidisciplinary working together; I totally agree. So, you’ve both added so much value in terms of tips and pointers with regard to cyber safety. If I could just ask both of you: is there a key takeaway message that you’d like listeners to leave the podcast with?

Chris Day: If I go first, Rudi. Sorry for pushing in, Robyn; that’s normal. So what I would say as my key tip is: cyber is a business issue, not an IT issue. When we consider the impacts of a cyber attack, it impacts the operation of essential functions, and fundamentally patients are at the end of that story. So we need to ensure that the appropriate business individuals are involved, to ultimately prevent that patient harm from occurring. I don’t know what else you want to add, Robyn?

Robyn Dennis: Yeah, thanks, Chris. And yeah, I’m used to you jumping in first. I think the thing I’d like to say is that there are no stupid questions in cyber. It doesn’t need to be overly technical or something to shy away from; it’s not meant to be mysterious or scary or confusing, but I feel like sometimes it can come across that way. It is relevant, and it is absolutely part of our lives with our increasing reliance on digital solutions. So I’d definitely encourage everyone to find a cyber specialist, grab one, or check out the podcast resources for some practical tips and guidance. We’ll make sure there’s some good signposting in there to get you to some helpful people; it might even include me and Chris. But yeah, that’s my tip. Thank you.

Rudi Hennessy: Safely grab a cyber specialist. Thank you both. So, just leading on from that, are there any specific books, websites, or just general resources that you can share for clinical safety officers and those working in digital clinical safety?

Chris Day: Yep. So, as I brought attention to earlier, we have the DTAC, the digital technology assessment criteria. That has five different pillars, and we should be using it when we’re actually implementing products to give that rounded approach; we have the clinical side with the DCBs. But I think a main pointer for individuals who want to know more about cyber, and the type of questions that we ask, is the Data Security and Protection Toolkit, which will highlight the different things we’re asking. Now, I appreciate that’s more a set of questions for a cyber security professional, but it gives you insight into what they’re looking at, and fundamentally what assurance that gives you as part of your clinical safety assessment. So I would say that’s probably the best one as a starting point.

Rudi Hennessy: That’s great. Thanks, Chris. So yeah, that’s been a wonderful conversation, with so many tips and pointers for those listening. Thank you so much for your time, Chris and Robyn, and thanks to everyone joining the podcast. I personally understand a lot more than when we first had this conversation, and I think very much that cyber and digital clinical safety need to go hand in hand. Absolutely. So I hope listeners found it as informative as I did, and hopefully individuals can take some key points away to their own organisations. If those listening want to find out more, there’ll be links and useful resources in the show notes. Thanks for listening.

Publication reference: PRN01487