Results Summary and Professional Abstract
Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.
The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments.
The peer reviewers commented, and the researchers made changes or provided responses. Those comments and responses included the following:
- The reviewers asked for clarification on which of the project aims were successfully addressed and which could not be addressed because of the difficulty of enrolling and retaining physicians in the study. The researchers added text to clarify that aim 2 could not be fully addressed because of low uptake of the intervention among providers and a high dropout rate. The reviewers emphasized that these limitations made the results inconclusive rather than indicating that the Studying Communication in Oncologist-Patient Encounters (SCOPE) program was ineffective.
- The reviewers asked for more detail on the similarities and differences between the original SCOPE program and the enhanced SCOPE program. The researchers explained that both the original and enhanced SCOPE programs offered tailored feedback for providers, but the enhanced program also offered subjective feedback to providers from patient reviewers.
- The reviewers noted that the original SCOPE program also seemed to be acceptable to physicians, given the greater uptake of the intervention in the prior study, and that it still improved communication. The researchers argued that the greater uptake in the previous study was not due to the original SCOPE program being more acceptable but to problems with physicians collecting the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey. The researchers explained in the report that, because of changes in training requirements, physicians were no longer mandated to give the CAHPS survey to their patients, and many found the burden of completing the surveys to outweigh the benefit. This led to lower physician completion rates in this study.
- The reviewers asked about the communication skills of the small number of providers who completed the study and whether there was any evidence that the providers who participated were more empathic communicators than those who did not, which would indicate volunteer bias. The researchers said the baseline rate of empathic responses in this study, 16 percent, was lower than the 28 percent rate in their prior study, so volunteer bias did not appear to be present here.
- The reviewers wondered why the study had such a low retention rate. The researchers said the oncologists who finished the study did so because of their interest in the topic and their belief that the experience would improve their practice. The oncologists also received a small amount of money, board certification, and continuing education credits. The primary barriers to participation were collecting recordings and obtaining patient consent.
Conflict of Interest Disclosures
View the COI disclosure form.
Training and Education Interventions
^James Tulsky was affiliated with Duke University when this project was funded.