A 6-step guide to Data Quality Assessments (DQAs)

Monitoring experts responsible for data management and reporting are increasingly questioned about data validity, reliability, integrity, and timeliness. In short, people want to know whether they can trust the data. In order to answer these questions, a Data Quality Assessment (DQA) is carried out. A DQA can either be carried out internally (by the project M&E team), or a donor agency can engage an external DQA expert to conduct the assessment.

A DQA is a systematic process to assess the strengths and weaknesses of a dataset and to inform users of the ‘health’ of the data. The assessment mainly focuses on the following aspects of data:

| Data aspect | Key question |
| --- | --- |
| Validity | Does the data clearly and adequately represent the intended result? Would an expert third party agree that the indicator is a valid measure for the stated result? |
| Reliability | Are the indicator definition and data collection and analysis processes clear, and are these consistently applied over time? |
| Integrity | Do the data collection, analysis and reporting processes have clear mechanisms in place to reduce manipulation? |
| Timeliness | Is the data sufficient, timely and current (recent) to influence management decision-making? |

A DQA is a multi-stage process, with each stage having its own activities and deliverables. The following sections detail the six steps involved in carrying out a DQA.


This guide is also available in French and Spanish

Step 1: Selection of indicators

Since a DQA is a time-consuming and resource-intensive process, experts advise selecting a minimum number of indicators. Ideally, no more than three indicators should be selected in one mission/DQA assignment, using the following criteria:

  • indicators of high importance, such as ‘the number of jobs created’
  • indicators reporting high progress over time (or those with high targets)
  • indicators that have not previously undergone a DQA
  • indicators with suspected data quality issues (or unusual progress)
  • indicators whose data quality was rated ‘poor’ in a previous DQA

Step 2: Review of available documents/datasets and preparation for the field phase

In the second step, the DQA expert should review the previous DQA reports (if any) to understand the data collection and data management system as well as the findings and recommendations. Moreover, any available reports, such as narrative progress reports, are also reviewed. In the case of an external DQA, the expert must also review data sets supplied by the project/organisation. The expert may also request (or obtain) the project M&E plan or guidelines to understand the M&E system. The information may be used to develop a DQA matrix which includes the key questions, sub-questions, data sources for the DQA questions, and tools and methods to be used to answer those questions.

Step 3: Review/assessment of data collection and management system

Once the preparatory phase is over, the DQA expert arranges meetings with the relevant project staff (including the M&E team) to understand the data collection and management system. The focus should be on:

  • checking the M&E Plan (if available)
  • reviewing indicator meta-data (or indicator reference sheets)
  • assessing the adequacy of methods and tools
  • understanding the data flow process, roles and responsibilities (background/experience) of the team responsible for data collection and data management
  • understanding the tools and mechanisms in place to ensure the integrity of data

The expert may request supporting documents to triangulate the details given by the team in response to the above items.

Step 4: Review of data collection and management system implementation/operationalization

During this stage, the expert should focus on the following questions:

  • Has the data been collected and managed in conformity with the data collection system design?
  • Is the data collected and analysed on a sufficiently timely basis to influence management decision-making?
  • Are adequate data-checking procedures being conducted (excluding field-level verification and validation)?
  • Has the data been analysed and reported in conformity with the data collection system design?

The above questions are answered by reviewing the actual data and analysis. Supporting documents are consulted, and the system/database is checked to ensure that the data collection and management system is in conformity with the data collection system design.
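Part of the data-checking review above can be automated when the expert has access to the raw dataset. The following is a minimal sketch, assuming records with hypothetical `id`, `value` and `reported_on` fields; the 90-day staleness threshold is an arbitrary illustrative choice, not a prescribed DQA rule.

```python
# Illustrative sketch of automated data checks: missing values,
# duplicate IDs, and stale reporting dates (timeliness).
# Field names and the 90-day threshold are assumptions.
from datetime import date

def check_records(records, today, max_age_days=90):
    """Return a list of (record id, issue description) tuples."""
    issues = []
    seen = set()
    for rec in records:
        if rec.get("value") is None:
            issues.append((rec["id"], "missing value"))
        if rec["id"] in seen:
            issues.append((rec["id"], "duplicate id"))
        seen.add(rec["id"])
        # Timeliness: is the data recent enough to inform decisions?
        if (today - rec["reported_on"]).days > max_age_days:
            issues.append((rec["id"], "stale report"))
    return issues

records = [
    {"id": "A1", "value": 120, "reported_on": date(2024, 5, 1)},
    {"id": "A2", "value": None, "reported_on": date(2024, 5, 1)},
    {"id": "A1", "value": 120, "reported_on": date(2024, 1, 2)},
]
print(check_records(records, today=date(2024, 6, 1)))
# → [('A2', 'missing value'), ('A1', 'duplicate id'), ('A1', 'stale report')]
```

Checks like these complement, but do not replace, the expert's manual review of the database against the documented system design.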

Step 5: Verification and validation of data

At this stage, the expert carries out a verification exercise to validate the reported data. This is done by selecting a sample of the reported data and verifying it against supporting documents, as well as through physical verification.
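Drawing the verification sample reproducibly makes the exercise easier to retrace and defend. A sketch using a fixed random seed follows; the sample size, seed, and record identifiers are all illustrative assumptions.

```python
# Illustrative sketch: draw a reproducible random sample of reported
# records for verification, then compare reported against source values.
import random

def draw_sample(record_ids, sample_size, seed=42):
    """Fixed seed so the project team can retrace exactly which
    records were selected for verification."""
    rng = random.Random(seed)
    return sorted(rng.sample(record_ids, min(sample_size, len(record_ids))))

def verify(sampled, reported, source):
    """Return the IDs whose reported value disagrees with the
    value found in the supporting (source) documents."""
    return [rid for rid in sampled if reported.get(rid) != source.get(rid)]

ids = [f"R{i:03d}" for i in range(1, 101)]
sample = draw_sample(ids, sample_size=10)
print(sample)  # the same 10 IDs on every run, thanks to the fixed seed
```

Any discrepancies surfaced this way would then be followed up through the physical verification described above before being written up in the report.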

Step 6: Compilation of the DQA report

Once the review and field phases are over, the DQA expert produces a report. Ideally, a DQA report should include the following:

  1. Executive summary
  2. Background / Introduction of the project
  3. Indicators selected for the DQA:
     a. Process and methodology followed for the DQA
     b. Key findings (separately per indicator)
     c. Data flow (steps)
     d. Data Management System Design
     e. Implementation/Operationalisation of the Data Management System Design
  4. Data Verification/validation
  5. Scores and overall rating (per indicator)
  6. Recommendations (separately for each indicator)
  7. Conclusion
  8. Annexes

In summary, the DQA process involves selecting indicators, reviewing available documents and datasets, assessing the data collection and management system, reviewing its implementation/operationalisation, verifying and validating the data, and compiling a DQA report. The report serves as a tool to identify weaknesses and strengths in the data collection and management system and to make recommendations for improvement.

The ActivityInfo team would like to thank the Education partner Maheed Ullah Fazli Wahid for this article. Maheed is a high-profile M&E expert with demonstrated experience in designing and managing M&E systems for multi-billion-dollar programmes focusing on humanitarian and development interventions. Over the past few years, Maheed has participated in more than 30 DQAs. Currently, he is the Senior M&E System Manager for the EU Facility for Refugees in Turkey (FRiT), a programme consisting of over 100 projects covering projects in sectors such as Education, Health, Livelihoods, Cash Distribution, Protection, Municipal Infrastructure, and Migration Management.