News

Redesigning the Cancer Patient Experience Survey – Dan Wellings and Yoryos Lyratzopoulos

Today sees the publication of hospital trust and CCG-level data from the Cancer Patient Experience Survey (CPES). These findings, along with the national findings published on 7 June, provide important baselines from which to measure the successful delivery of the national cancer strategy. Dan Wellings and Yoryos Lyratzopoulos, who led the redesign of this year’s survey, explain why the changes are so important:

The Cancer Patient Experience Survey has run since 2010 and has proven to be a powerful way of gathering data and generating insight into the care experiences of people with cancer in England.

This year 71,000 patients took part – a response rate of 65% – making it by far the largest survey of cancer patients anywhere in the world.

Following on from the national results published last month, today sees the publication of local data, allowing providers and local commissioners to see how they are doing and where they should improve.

Achieving world class cancer outcomes: A strategy for England 2015-2020 was published in July 2015, and improving patient experience is central to its success. We needed to ensure that CPES would allow us to track how well we are delivering these aims.

To do this, a number of changes were made to the survey and we want to explain what has been done, why it’s been done and what it means when interpreting the findings.

There is always a tension between keeping the data comparable over time and ensuring that the survey keeps up with what matters to patients and practitioners, so we didn’t want to make any changes without first understanding the views of the people who fill in the survey and the users of the data.

Over a ten-month period, we talked to a wide range of people to get their views on the survey, whether it asked the right questions and how it could be improved. We worked with patients, Clinical Commissioning Groups, NHS providers and charities. There were a number of themes that emerged from these conversations:

  • the questionnaire was too long and some questions too complex.
  • some questions needed re-testing with patients to ensure they still captured what mattered.
  • there had been changes to the care pathway (i.e. the different services patients experience during care and treatment) which should be considered.
  • data must be presented as clearly as possible.
  • the addition of an online response mode would make it easier for patients to respond, particularly younger ones.

We took this feedback to the Cancer Patient Experience Advisory Group and worked with members and the survey providers to redesign both the questionnaire and how we present the data, testing various drafts with users.

We have redesigned the questionnaire so it is shorter, clearer and better meets users’ needs. We conducted statistical analysis to ensure we were continuing to measure what matters. Dropping outdated questions made space to include new ones suggested by patients and users of the data.

Meanwhile, we have kept a core of key survey questions from previous years that allow us to look at changes in the experience of cancer patients over time, although some care needs to be taken, as subtle changes (e.g. in the survey period, or the addition of an online response mode) need to be accounted for. This extensive review will allow us to produce data to track progress in the coming years.

At the same time we have improved the reporting of data from the survey, adding the new tier of local-level data with Clinical Commissioning Group reports being produced for the first time alongside the trust report. The data is published today.

We report local performance in a way that gives a better understanding of how a service’s performance compares with what would be expected for its patient case mix – in respect of age, sex, ethnicity, deprivation and cancer site. The adjusted estimates create a ‘level playing field’.
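The post does not spell out the adjustment model used, but the idea of comparing observed performance with what the case mix would predict can be illustrated with a minimal sketch of indirect standardisation. All providers, strata and scores below are invented for illustration, and the stratum here collapses the five case-mix factors into a single key:

```python
# Hypothetical sketch of case-mix adjustment via indirect standardisation.
# The actual CPES adjustment model is not described in this post;
# providers, strata and scores below are invented for illustration.
from collections import defaultdict

# Each response: (provider, stratum, score), where the stratum combines
# age band, sex, ethnicity, deprivation quintile and cancer site.
responses = [
    ("Trust A", ("65-74", "F", "White", 3, "Breast"), 9),
    ("Trust A", ("16-24", "M", "Asian", 1, "Leukaemia"), 6),
    ("Trust B", ("65-74", "F", "White", 3, "Breast"), 7),
    ("Trust B", ("16-24", "M", "Asian", 1, "Leukaemia"), 6),
]

# National mean score within each case-mix stratum.
by_stratum = defaultdict(list)
for _, stratum, score in responses:
    by_stratum[stratum].append(score)
stratum_mean = {s: sum(v) / len(v) for s, v in by_stratum.items()}
national_mean = sum(s for _, _, s in responses) / len(responses)

def adjusted_score(provider):
    """Observed total vs. case-mix-expected total, rescaled to the national mean."""
    observed = sum(s for p, _, s in responses if p == provider)
    expected = sum(stratum_mean[st] for p, st, _ in responses if p == provider)
    return national_mean * (observed / expected)

for trust in ("Trust A", "Trust B"):
    print(trust, round(adjusted_score(trust), 2))
```

A provider that scores exactly as its case mix predicts lands on the national mean; scoring above or below prediction moves it above or below that benchmark, which is what makes providers with very different patient populations comparable.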

We now identify hospitals or CCGs whose performance stands out – as either positive or negative outliers – using a more robust statistical methodology, one also employed by the CQC.
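The post does not specify the exact outlier method, but one common approach in this kind of surveillance is a funnel-plot test: flag providers whose rate falls outside control limits around the national rate, with limits that widen for smaller providers. The sketch below assumes a binomial model and approximately 99.8% (3 standard error) limits; the providers, counts and thresholds are illustrative only:

```python
# Hypothetical sketch of funnel-plot style outlier flagging.
# The exact CPES/CQC methodology is not given in this post;
# the national rate, providers and counts below are invented.
import math

NATIONAL_RATE = 0.80  # illustrative national proportion of positive responses

providers = {
    "Trust A": (410, 500),  # (positive responses, respondents)
    "Trust B": (350, 500),
    "CCG C":   (180, 200),
}

def flag(positive, n, p=NATIONAL_RATE, z=3.0):
    """Flag a provider outside ~99.8% binomial control limits around rate p."""
    se = math.sqrt(p * (1 - p) / n)  # standard error shrinks as n grows
    rate = positive / n
    if rate > p + z * se:
        return "positive outlier"
    if rate < p - z * se:
        return "negative outlier"
    return "within limits"

for name, (pos, n) in providers.items():
    print(name, flag(pos, n))
```

Because the limits widen for small denominators, a small CCG needs a larger deviation from the national rate than a big trust before it is flagged, which is what makes the method robust against chance variation.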

The changes we have made have been based on feedback from patients and users and that is how it should be. As a result, hospital trusts and CCGs will be able to better use the feedback they get from the survey to help them improve services and measure how well they are achieving this over time.

Thank you to everyone who helped us with this work and, in particular, to all the patients who responded. It is too easy to forget when looking at the overall figures that 71,000 people took the time to tell us their views. It is up to us to listen to them.


Dan Wellings leads NHS England’s Insight and Feedback Team, which oversees how the NHS collects experience and outcome data from patients, including national surveys such as CPES. Prior to joining NHS England, Dan was Head of Public Health Research at Ipsos MORI, working in the Social Research Institute. Dan has a Masters in Public Health from the London School of Hygiene and Tropical Medicine and is a Senior Associate of the Nuffield Trust.

Yoryos (Georgios) Lyratzopoulos is Reader in Cancer Epidemiology at UCL, and Cancer Research UK Advanced Clinician Scientist Fellow. Beyond studying variation in cancer diagnosis and other outcomes, he has a substantive research interest in population studies of cancer patient experience. He has acted as an academic adviser to NHS England for the CPES surveys.
