Final Research Report


Peer-Review Summary

Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.

The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments. 

Peer reviewers commented and the researchers made changes or provided responses. Those comments and responses included the following:

  • The reviewers asked for clarity regarding the goals of the project. They noted that although the report alludes to how providers use their time in a clinic visit and to understanding the contents of a clinic visit, these concepts are not addressed further in the report. The researchers explained that they offered time use and visit content as illustrations of questions that the methods developed in this study could be used to address. The researchers revised their conclusions to describe more directly how the study results can inform providers about how to use their time and gain a greater understanding of visit content.
  • Reviewers asked why, if a goal of the study was to evaluate emotions during the clinic visit, the coders used transcripts of the visits rather than audio recordings. The researchers explained that they did not have access to the audio recordings, and they confirmed that the transcripts also provided important emotional information.
  • The reviewers appreciated the extent of community involvement in this study and the fact that patients could envision the future use of natural language processing as a feedback mechanism that empowers patients to advocate for their own health care. The reviewers suggested that the researchers consider having patients review the visit transcripts to evaluate how well these methods captured the content and emotional valence of the doctor-patient interaction. The researchers noted that they did involve patients in informal meetings where the content of the clinic visit was discussed but had not made this patient review a formal part of the process. The researchers agreed that this would be an interesting and useful addition to their study methods in the future.
  • The reviewers asked the researchers to provide estimates of what level of accuracy would indicate that the natural language processing methods are good enough to assess the content and emotional valence of doctor-patient interactions. The researchers agreed that this might be helpful but stated that a measure of accuracy would not necessarily show whether a method was good enough. They gave an example from the study results: their method achieved no more than 62 percent accuracy for topic classification, yet it matched human predictions quite well, including making similar errors.

Conflict of Interest Disclosures

Project Information

Principal Investigators: Zac E. Imel, PhD; Ming Tai-Seale, PhD, MPH
Organization: University of Utah
Award Amount: $712,232
DOI: 10.25302/08.2021.ME.160234167
Project Title: Development of Computational Methods for Evaluating Doctor-Patient Communication

Key Dates

December 2016 to July 2021

Study Registration Information

Tags

Has Results
Award Type
State
Last updated: March 4, 2022