Patient-Centered Outcomes Research Institute

This project has results

Developing Standards for Improving Measurement and Reporting of Data Quality in Health Research


Results Summary

What was the research about?

Many healthcare systems use electronic health records. Researchers use data from these records in their studies. Some records have missing or incorrect data. When this happens, people might not be able to trust a study’s results. The research team wanted to:

  • Create guidance to judge whether data that a study used were high quality
  • Find new ways to display the quality of data
  • Learn why researchers don’t always report the quality of data that they used in studies

What were the results?

The research team developed guidance to help people judge whether data in research studies were high quality. The guidance included ways to report quality. High-quality data are complete, believable, and reliable.

The research team found that the guidance helped researchers from six large healthcare systems judge and report data quality from electronic health records.

The study members created new ways to show data quality in pictures or graphs.

The research team found that cost, time, and lack of guidance were the primary reasons that researchers did not report on data quality.

Who was in the study?

About 100 people joined the study. The study members included healthcare workers, patient advocates, and policy makers. They also included project managers, people who work with healthcare data, and researchers. All the people in the study were interested in the quality of data that research studies use.

What did the research team do?

The research team held two in-person meetings and monthly online meetings for the study members. The team then used information from these meetings to write guidance about how to measure data quality. The team made sure that all study members agreed on the guidance.

The team used a website to ask other people interested in data quality for feedback on the new guidance. The team tested the guidance using electronic health records from six large healthcare systems.

The research team asked study members for ideas about pictures and graphs that could show data quality. The research team also surveyed other researchers to find out what kept them from reporting on data quality.

What were the limits of the study?

The study included about 100 people interested in the quality of data used in research studies. Other people may have different ideas about looking at data quality.

People who took part in the study were interested in the use of electronic health records for research. The study didn’t include people who use other types of research data, such as data from science laboratories or from social media. People who use other types of data may have different ideas about reporting on data quality.

How can people use the results?

Having common guidance about measuring and reporting the quality of research data can help people understand whether data that studies used are high quality and trustworthy. Figuring out why researchers don’t report the quality of their data may lead to new ideas about how to better share the quality of data with everyone.

Professional Abstract

Objective

To create standards for evaluating and reporting data quality in electronic health records by

  • Developing data-user-driven recommendations for evaluating and reporting data quality
  • Defining and assessing a common model for storing data-quality measures
  • Developing data-quality reports and visuals tailored to data users
  • Exploring technical, professional, and policy barriers to increasing data-quality transparency

Study Design

Design: Empirical analysis
Data Sources and Data Sets

Qualitative data: transcripts from meetings with data users, including patients, patient advocates, healthcare policy makers, informatics professionals, statisticians, and clinical investigators

Quantitative data: 6 large health-system datasets representing 11,026 data-quality checks
Analytic Approach

Data collection: online webinars, face-to-face meetings, online surveys, and data-quality checks of health-system datasets

Data analysis: qualitative content analysis of meeting transcripts, iterative consensus development for terminology, data-quality-check analysis, descriptive statistics, analysis of variance, and exploratory factor analysis for survey results on barriers to reporting data-quality findings

Outcomes

Primary: standardized recommendations for evaluating data quality

Secondary: evaluation of data-quality checks from data networks, prototypes for storing and reporting data-quality results, and description of barriers to performing data-quality assessment and reporting findings

Patient-centered outcomes research relies on the increasing availability of operational patient-specific electronic data sources, including electronic health records. Because these data sources are typically developed for purposes other than research, challenges arise when attempting to analyze and report the data. Data-quality issues prevalent in electronic health records include missing, inaccurate, and inconsistent values.

The researchers used personal contacts with healthcare and research organizations to recruit 92 participants for two study groups. One group included patients, patient advocates, and healthcare policy makers. The second group included informatics professionals, statisticians, and clinical investigators. Study participants drafted data-quality terms, categories, and definitions during face-to-face workshops, monthly webinars, and 10 presentations at professional meetings.

The researchers also collected input from another 138 data-quality researchers on drafts of the data-quality terms, categories, and definitions through an online wiki. The researchers analyzed transcripts of the meetings using iterative thematic analysis to determine consensus-based data-quality standards and reporting metrics.

The researchers recruited project leaders at six large health systems to perform data-quality checks on datasets using the data-quality standards generated by the study groups. The research team held workshops for study participants to generate data codes for a common data model and to explore effective ways to display data for data users.

The researchers also distributed an anonymous online survey to 141 data users to assess professional and personal barriers to data-quality reporting.

Results

Based on feedback from the participant groups, the research team wrote a set of 20 recommendations for data-quality-reporting standards.

The team identified three major categories of data quality: conformance, or agreement of values with technical specifications; completeness, or the extent to which data are present or absent; and plausibility, or the extent to which data are believable or correct based on the technical specifications. The study groups further divided the categories into two contexts: verification, or checking that data conform to internal constraints or expectations; and validation, or checking that data conform to external constraints or expectations.
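The three categories above can be sketched as simple programmatic checks. The sketch below is only an illustration of the framework, not the study's software; the field names, value ranges, and records are invented for the example.

```python
# Illustrative data-quality checks for the three harmonized categories:
# conformance, completeness, and plausibility. All fields and thresholds
# are hypothetical examples, not the study's actual specifications.

def check_conformance(record):
    """Conformance: do values agree with technical specifications?
    Here: patient_id must be a non-empty string of digits."""
    pid = record.get("patient_id", "")
    return isinstance(pid, str) and pid.isdigit()

def check_completeness(record, required=("patient_id", "birth_year", "systolic_bp")):
    """Completeness: are expected values present?"""
    return all(record.get(field) is not None for field in required)

def check_plausibility(record):
    """Plausibility: are values believable? (illustrative range check)"""
    bp = record.get("systolic_bp")
    return bp is None or 50 <= bp <= 300

def run_checks(records):
    """Count how many records pass each category of check."""
    results = {"conformance": 0, "completeness": 0, "plausibility": 0}
    for r in records:
        results["conformance"] += check_conformance(r)
        results["completeness"] += check_completeness(r)
        results["plausibility"] += check_plausibility(r)
    return results

records = [
    {"patient_id": "1001", "birth_year": 1970, "systolic_bp": 120},
    {"patient_id": "1002", "birth_year": 1985, "systolic_bp": 900},  # implausible value
    {"patient_id": "", "birth_year": None, "systolic_bp": 118},      # nonconformant, incomplete
]
summary = run_checks(records)
print(summary)
```

Each check can be run in either context: against internal expectations (verification) or against an external reference standard (validation).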

Study participants from the large health systems validated these categories with agreement in 11,023 of 11,026 data-quality checks.

The researchers generated a prototype data-quality common model, which provides a way to store the data-quality summary statistics independent of the data source. The researchers developed new models for data visualization using data-quality summary statistics from the common model. The visualizations offer ways to quickly identify data-quality features of large datasets for use by both informatics specialists and clinical investigators.
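The idea of a source-independent common model for data-quality results can be sketched as a small record type. The schema below is an illustrative assumption, not the study's actual prototype; the pass counts echo the agreement figures reported above.

```python
# Hypothetical sketch of a "data-quality common model": a container for
# data-quality summary statistics that is independent of the data source.
# Field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class DataQualityResult:
    source: str        # which health system produced the data
    variable: str      # the data element checked
    category: str      # conformance | completeness | plausibility
    context: str       # verification | validation
    checks_run: int
    checks_passed: int

    @property
    def pass_rate(self) -> float:
        return self.checks_passed / self.checks_run if self.checks_run else 0.0

result = DataQualityResult(
    source="site_A", variable="systolic_bp",
    category="plausibility", context="verification",
    checks_run=11026, checks_passed=11023,
)
print(round(result.pass_rate, 4))
```

Because the record carries only summary statistics, results from different health systems can be pooled or compared without sharing the underlying patient data.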

Applying factor analysis to data from the online survey revealed three individual barriers to data-quality assessment and reporting: personal consequences, reporting-process problems, and lack of resources. The analysis also revealed two organizational barriers: environmental conditions and typical practices.
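Exploratory factor analysis of the kind applied to the survey can be illustrated with synthetic data. Everything below is invented to show the technique only: six hypothetical survey items driven by two latent factors, recovered with scikit-learn's FactorAnalysis.

```python
# Synthetic illustration of exploratory factor analysis: generate survey-like
# responses from two latent factors, then fit a factor model to recover them.
# The items, loadings, and data are made up; only the technique matches the text.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# 141 respondents (the survey's sample size), 6 Likert-style items
latent = rng.normal(size=(141, 2))
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.0],
                     [0.0, 1.0], [0.1, 0.9], [0.0, 0.8]])
responses = latent @ loadings.T + rng.normal(scale=0.3, size=(141, 6))

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)
print(fa.components_.shape)  # one row of item loadings per extracted factor
```

In practice, items that load strongly on the same factor (here, the first three vs. the last three) are interpreted as reflecting a shared underlying barrier.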

Limitations

Although the data-quality and transparency standards reflect community engagement and consensus from interested and knowledgeable participants, the generated standards may not represent the concerns of all data users. Other approaches for evaluating data quality, such as implementation of the Delphi method, may yield alternative standards.

The data owners and users in this study represented communities that use electronic health records and administrative records. These results may not be applicable to users of genomic, biologic, and social media data.

Conclusions and Relevance

Based on multiple rounds of feedback from patients, researchers, and policy makers, the study team created a set of standards to guide assessment and reporting of data quality for electronic health records. The study’s data-quality standards, data-storage model, and data-reporting visuals may help researchers conduct analyses and report results more consistently and transparently, facilitating improved interpretation and comparison of study results among data users.

Future Research Needs

Future research could solicit additional input on the data-quality standards from individuals from other relevant communities. Additional studies could involve users of genomic, biologic, and social media data in the development of additional data-quality standards.

Future research could also examine the implementation of the recommendations and measure the costs and impact of such implementation.

Final Research Report

View this project's final research report.

Journal Articles

  • A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data (EGEMS)
  • Transparent reporting of data quality in distributed data networks (EGEMS)
  • Patients, consumers, and caregivers: the original data stewards (EGEMS)


Peer-Review Summary

Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.

The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments. 

Reviewers’ comments and the investigator’s changes in response included the following:

  • The awardee provided more information about the Data Quality Collaborative (DQC) and its work in identifying key data quality recommendations.
  • Based on reviewer recommendations, the awardee highlighted key study results involving harmonized data quality terms and recommendations by reorganizing the report by the three distinct categories of study findings.
  • The reviewers requested that the investigator clarify the description of the factor analyses completed on the survey data, including replacing a more-technical table with a more-intuitive figure.
  • The awardee added a discussion of the relevance and potential impact of this study on patient-centered outcomes research. The results improved the ability to assess the data quality of a specific dataset.

Conflict of Interest Disclosures

View the COI disclosure form.

Project Details

Principal Investigator: Michael G. Kahn, MD, PhD
Project Status: Completed; PCORI Public and Professional Abstracts, and Final Research Report Posted
Project Title: Building PCOR Value and Integrity with Data Quality and Transparency Standards
Board Approval Date: September 2013
Project End Date: July 2017
Organization: University of Colorado Denver
Year Awarded: 2013
State: Colorado
Year Completed: 2018
Project Type: Research Project
Funding Announcement: Improving Methods for Conducting Patient-Centered Outcomes Research
Project Budget: $1,100,530
DOI (Digital Object Identifier): 10.25302/3.2018.ME.13035581
Study Registration Information: HSRP20143573
Page Last Updated: September 5, 2019
