The new Congenital Heart Disease review: 4th update
Thank you for your continued feedback. We are still working on some of the questions you have raised previously. For example, you told us you wanted more clarity about the scope of the review (what’s in and what’s out?), and you asked about the governance (who makes the decisions? who provides advice, and how is it joined up to the decision making?). We will provide clear responses to these questions, with explanation, in a future blog.
Some of the points which have been raised since then, in the blog and elsewhere, are:
- Is it a requirement of the new review that the number of surgical units must be reduced (e.g. from 10 to 7)?
- What does NHS England do about patient safety concerns raised during this process?
- Why was the British Heart Foundation invited to a meeting of local charities?
The number of units: Bill McCarthy’s advice in the recent meeting with local charities – and elsewhere when this has been discussed – was that “there is no number”. We may – at the end of the process – have determined that we want a specific number, but that is not where we start from. Instead, we are commissioning a national service against national standards for the whole population. It must be consistently high quality and it must be sustainable. That requires us to look at the latest data and best projections, to use the evidence and to make judgements. And of course, we have been asked to look at adults’ and children’s services together, which is a big change from the last review. For all these reasons, our work does not begin with a target number of units.
Patient safety concerns: As we continue our discussions with stakeholders during this review, from time to time an issue may be raised with us about patient safety at one or more units. It is important that we have a clear and consistent approach to handling these concerns, so we will always:
- inform the NHS England “domain lead” (Dr Mike Durkin) – a very senior official with lead responsibility in NHS England for patient safety
- pass any safety concerns on to NHS England’s medical director in the appropriate region (London; North; Midlands & East; and South). The medical director is well placed to …
- consider the issue with the Care Quality Commission (CQC), who have legal powers to assure essential levels of safety and quality. CQC hosts the government’s new “Chief Inspector of Hospitals” and works with NHS England locally to undertake “quality surveillance”
British Heart Foundation: NHS England wants to take the opportunity presented by this new review to do some things differently. In my first blog (28 June), I mentioned that we were talking to three charities (Involve, National Voices, and the Centre for Public Scrutiny), to help “ensure that everyone’s voice is heard and that we work together constructively”. We’ve now also asked the British Heart Foundation (BHF) to work with us as an independent but informed party. Because they have a broader focus than congenital heart disease alone, we’d like them to be involved in the new review as a critical friend, to help us to engage effectively, and to act as an honest broker as we build mutual trust in the process. We asked them along to the local charities meeting on 7 August so they could hear first-hand the local as well as the national perspective. If BHF, NHS England or other stakeholders don’t think this approach is working, we’ll reflect on that.
Patients, families and their representatives
We scheduled a meeting of local charities and patient groups for 7 August. A few people told us they could not make this meeting because of the school holidays and other pressures on their time as volunteers and carers. We decided to go ahead with the meeting anyway, because most of those we invited could attend and we did not want to delay any further this important meeting. But we do recognise the difficulty of finding times and dates that are convenient for everyone who wants to participate and we will give this careful thought as part of our “stakeholder engagement” plan. I will say more about this plan in a future blog. The enclosed note sets out the main points from the 7 August discussion.
Clinicians and their organisations
NHS England is responsible for all specialised commissioning, including services for congenital heart disease (CHD). It has established clinical reference groups (CRGs) across the full range of specialised services, to provide clinical advice. The CRGs are made up of clinicians, commissioners, Public Health experts and patients and carers who use the relevant specialised services. Together the members form a group which can advise with authority and expertise about a particular area of specialised healthcare. One of the CRGs is specific to congenital heart disease and is chaired by Dr Graham Stuart, a consultant cardiologist at University Hospitals Bristol. More information about the group can be found here: Clinical Reference Group E05 Congenital Heart Services. At its meeting on 30 July, the group received an update on the new CHD review. You can read about the relevant points made in discussion here.
The meeting note refers to work on clinical standards, which is being undertaken by a Clinical Implementation Advisory Group (CIAG) chaired by Professor Deirdre Kelly, Professor of Paediatric Hepatology at Birmingham Children’s Hospital. This work (as well as work on developing congenital heart networks) began some months ago, and NHS England has agreed that it should still be supported, asking Professor Kelly and CIAG to complete it. We think it is very important to maintain momentum and not lose the progress which has been made so far. When the work is complete we will be able to consider its implementation as part of the new CHD review. Completing this work will benefit the new CHD review and allow us to make faster progress than we otherwise might; but of course it does not decide the outcome of the new review.
NHS England and other partners
On 29 July, the CHD sub-group of the Board of NHS England met to review progress of the work to date. The enclosed note sets out the main points from the discussion.
On 31 July, NHS England’s Chair, Professor Sir Malcolm Grant, wrote to the Rt. Hon Jeremy Hunt MP, Secretary of State for Health, providing a short update on the new review of congenital heart disease (CHD) services. The letter follows NHS England’s Board meeting held in public on 18 July; it describes the challenge facing NHS England in improving congenital heart disease services and outlines early thinking on the way forward. In line with NHS England’s commitment to transparency, a video recording of the Board’s discussion is also available.
I am to publish a new CHD blog every fortnight, usually on a Friday, so you can expect future publication dates to be 23 August, 6 September, 20 September, and so on. However I am on leave from 12-16 August so my next blog (due 23 August) might be a little bit delayed.
Just hope the new review process is not a farce nor a fait accompli. Enough money has been spent on what is now primarily a wasted cause, and further monies of up to £0.5 million per year will go on management and administrative purposes to support the latest review.
Hope there are no pre-conceived ideas. You talk about not knowing the number [of centres], but the decision to have fewer surgical centres seems to have been made; the question is simply ‘where’!
Please respect the wishes of parents and don’t treat them with disdain. The NHS is a fabulous organisation, but systematic failings and political whims should not win the battle for certain geographical areas, as there is a war to be won against CHD!
I would just like to make my view clear on the following quote from the notes,
“attendees noted that deaths of children with CHD, and other very poor outcomes, were often not a direct consequence of the surgery, but due to a complex series of factors”
Whilst a number of factors might cause some of these deaths, I am not convinced of anything, and I do believe that in any death or maiming they should look at all factors and not try to automatically blame the after care. Surgery is the one point where witnesses are limited to a very few people, and therefore it should be scrutinised the most, in my opinion. We all know a medical team will stick together; a whistle blower is very rare in these circumstances, because it is very rare that the team would not be doing their utmost to get it right. But mistakes happen, and so do cover ups. I think the very nature of the surgery makes it easy for cover ups to happen, as in “not all can be saved” — and we all know this is true, but we also know that too many children have died very unexpectedly. In our case our child never woke up from surgery: how can this not be surgery related? I am not convinced and don’t think I ever will be. Yes, after care was shocking, but the damage was already done; finding out just how is the hard part.
Interested in the point raised about clinicians leaving.
Take for instance Southampton; their top surgeon left last year. They were rated 2nd best in the country under the old review – obviously his figures were included in this rating – will they now be reassessed to give a true result of where they stand now?
The rankings that that was based on had nothing to do with outcome measures; the centres were scored against various organisational standards by a panel led by Sir Ian Kennedy:
In terms of mortality data, Southampton does well in both the short and long terms, but not second in the country:
You quote NICOR data. But this is incorrect and quite simply doesn’t add up. Deaths are missing from the funnel plot; I wonder what these poor families feel 🙁
When will the revised, updated data be released? After all, we are using – relying on – these figures when looking at whether a unit is safe to do a procedure. Take, for example, CAVSD repair.
Look with your eyes, do they add up?!?!
All the above figures are irrelevant. Kennedy’s figures were judged not fit for purpose by the courts and the IRP. Not all procedures are counted or shown in official data.
The important words to note in the official figures are “average” and “partial”. With the average method, a hospital could do exactly the same case mix two years running with the same results, yet be above the average line one year and below it the next, because the average depends on the results of the other hospitals. It would appear at first glance that the hospital’s results had got better or worse, when actually only the other hospitals’ results had changed. With the partial risk-adjusted figures, the word “partial” is equally important.
What is important is the actual number of adults and children who lose their lives to CHD each year. This has fallen over the last ten years and shows that the service currently provided is extremely good and that significant improvements have been made. There is a point that every service will reach where it is unable to improve the numbers further. Just changing the way the numbers are calculated can fool some into believing a service is actually better or worse, but not all.
We have excellent children’s heart surgery services, and the important thing is to build on what we currently have and consider how to make it better, not to rip apart and destroy what has taken years to build.
@Don’t believe everything put in front of you.
I’m not sure what you’re referring to exactly. The actual funnel plots look fine. Or are you referring to the difference between the plots and the body of the text or the CCAD database? I did notice there were some discrepancies between their most recent funnel plots, before they took them down, and the raw data, but I don’t know whether this is down to risk adjustment.
I think you’re somewhat overstating your case when you say that the panel scores were found to be “unfit for purpose” (see here and the referenced paras here and here), or else the claim needs qualifying. I actually agree that they came to have a disproportionate influence on the final outcome, given the weightings, which I think was broadly the IRP’s point as well.
In terms of the outcomes data, obviously someone has to be above average, and a centre can be on one side of the average line one year/triennium and the other the next purely through random variation, and there seems to be some of that within the data. However, there’s also at least one case of what looks to be genuine movement over time (whether because of improvements in risk adjustment or actual progress over time, I don’t know). In addition, there are some centres that are consistently on one side or the other across what amounts to twelve years of data, and that’s worth taking notice of.
Going to your second point, I’d argue that the only rational way to judge these units is in relation to each other; otherwise you either have to set an arbitrary number of “acceptable” deaths, or else set it at zero, which has never been, and probably never will be, attained. If one centre isn’t keeping up with the pace of improvement in others, that still results in children dying who would have lived had they gone elsewhere, which I think is unacceptable. I think a centre that is (relatively) underperforming, either in terms of specific procedures or generally, ought to either reduce the complexity of the procedures it’s offering or, if it is found desirable to concentrate units, that should be a strong deciding factor.
As my name would suggest, I don’t have a dog in this arena, and those links were originally intended as a point of information in response to the original post, but those are my unqualified thoughts.