Data quality improvement

The Data Services team is responsible for ensuring the quality of the data used by analysts working in the NHS, and undertakes monthly checks on the quality of the data we hold. Data Services routinely reviews and updates its data quality checks and procedures to ensure they remain robust and in line with any changes to national policy.

Through reporting, consistent procedures and analytical expertise, we aim to ensure that high-quality data is available for NHS analysts to:

  • Improve patient care.
  • Support population health management.
  • Support commissioning decisions and policy developments.
  • Create, support and improve patient-centric analyses.
  • Support and improve dashboard development.
  • Improve analysis more generally.

Consistent, timely and accurate data improves patient care and decision making at both local and national level. By giving decision makers and clinicians rapid access to assured, good-quality information, Data Services contributes to improving patients' experience of NHS services.

Why is data quality important to the NHS?

High-quality data is important to the NHS because it can lead to improvements in patient care and patient safety. Quality data supports better services and decision making, and makes it possible to identify trends and patterns, draw comparisons, predict future events and outcomes, and evaluate services.

To monitor the quality of data, we ask ourselves the following questions (a sketch of how checks like these might be automated follows the list):

  • Consistency: How does our current data look compared to the historic picture? Are there unexplained peaks or troughs that need to be investigated?
  • Accuracy: Does the data in our system fit what we expect to see? Do data items match with historic submissions?
  • Timeliness: Was data submitted in line with deadlines? Are any data providers persistently late in their submissions?
  • Efficiency: Is data processed and made available in a timely fashion? Are DQ checks performed as smoothly and efficiently as possible while still maintaining high standards?
  • Validity: Are submitted data items valid according to national definitions? Is field population of mandatory data items at expected levels?
  • Completeness: Do we have significant gaps in our data? Has a provider missed a submission? Has a provider removed a portion of its historic data?
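
The detail of these checks depends on the dataset and the national standard it is submitted against, but the general pattern is straightforward. As a purely illustrative sketch (not the team's actual tooling), the Python snippet below shows how some of these questions, covering completeness, validity, timeliness and consistency, might be expressed as automated checks over a monthly submission; the column names, code lists and thresholds are hypothetical.

```python
import pandas as pd

# Hypothetical monthly submission: one row per record from a data provider.
submission = pd.DataFrame({
    "provider_code": ["RA1", "RA1", "RB2", "RB2", "RB2"],
    "nhs_number":    ["0000000001", None, "0000000003", "0000000004", "0000000005"],
    "gender":        ["1", "2", "9", "X", "2"],   # hypothetical national codes: 1, 2, 9
    "submitted_on":  pd.to_datetime(["2024-05-10"] * 2 + ["2024-05-20"] * 3),
})
deadline = pd.Timestamp("2024-05-15")

# Completeness: what proportion of each mandatory field is actually populated?
completeness = submission[["nhs_number", "gender"]].notna().mean()

# Validity: do submitted values match the national code list?
valid_gender_codes = {"1", "2", "9"}
invalid_gender_count = (~submission["gender"].isin(valid_gender_codes)).sum()

# Timeliness: which providers submitted after the deadline?
late_providers = (
    submission.groupby("provider_code")["submitted_on"].max()
    .loc[lambda s: s > deadline]
    .index.tolist()
)

# Consistency: does this month's record count look plausible against last month's?
previous_month_counts = pd.Series({"RA1": 2, "RB2": 3})   # hypothetical history
current_counts = submission["provider_code"].value_counts()
pct_change = ((current_counts - previous_month_counts) / previous_month_counts) * 100

print(completeness, invalid_gender_count, late_providers, pct_change, sep="\n")
```

In practice, checks along these lines would run against the full datasets we hold, and anything flagged, such as a late provider or an unexplained drop in record counts, would be investigated before the data is made available to analysts.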