Results Summary

What was the research about?

People who have cancer often feel scared, anxious, or depressed. They may have a hard time asking questions or listening to what their doctor is saying. When doctors respond to patients with understanding, or empathy, patients may be better able to follow treatment plans.

In this study, the research team created a training program called SCOPE. The program focuses on increasing doctors’ skills in responding to patients’ feelings. The team wanted to learn whether, compared with a standard training program, SCOPE improved

  • How satisfied patients were with their doctor’s communication
  • How often cancer doctors showed empathy for their patients’ feelings

What were the results?

Satisfaction with doctor communication didn’t differ between patients who saw doctors in SCOPE and patients who saw doctors in the standard training program. A month after the program, doctors in SCOPE showed empathy for their patients three times as often as doctors in the standard program.

Who was in the study?

The study started with 148 cancer doctors from across the United States. But only 27 doctors finished the study. Of these, 63 percent were white, 22 percent were Asian, and 15 percent were other races. The average age was 53, and 85 percent were men.

What did the research team do?

The research team assigned cancer doctors by chance to SCOPE or a standard training program. Before starting their programs, each doctor audio-recorded clinic visits with four patients. The doctors also gave surveys to five patients. Surveys asked about the patients’ satisfaction with their doctor’s communication.

Doctors in SCOPE watched online training videos that taught communication skills and how to respond to patients’ feelings. Doctors received patient survey results and feedback on their clinic visit recordings from the research team and a patient advisory group. The feedback pointed out times doctors used skills from the videos; it also gave advice on things doctors could improve. After finishing the video lessons, each doctor recorded two more clinic visits. Doctors got a second round of feedback on these recordings.

Doctors in the standard training program received patient survey results but didn’t get feedback on their videos. After completing the training, the doctors set goals for improving their communication with patients.

A month after completing their programs, all doctors recorded four more clinic visits. They also collected surveys from five new patients. The research team compared these recordings and surveys with those from the start of the study. The doctors submitted a total of 455 patient surveys and 254 clinic visit recordings.

An advisory group of patient advocates gave input throughout the study.

What were the limits of the study?

Because only 27 doctors finished the study, the research team can’t say for sure how well SCOPE works. Before the study, cancer doctors had to take communication training to stay certified. During the study, such training became voluntary, which may have reduced how many doctors finished the study.

Future research could explore ways to add communication training to existing programs that improve cancer care quality.

How can people use the results?

Hospitals and health clinics could use these results when considering ways to help cancer doctors improve their communication skills.

Final Research Report

View this project's final research report.

More About This Research

Peer-Review Summary

Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.

The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments. 

Peer reviewers commented and the researchers made changes or provided responses. Those comments and responses included the following:

  • The reviewers asked for clarification on which of the project aims were successfully addressed and which could not be addressed because of the difficulty of enrolling and retaining physicians in the study. The researchers added text to clarify that aim 2 could not be fully addressed because of the low uptake of the intervention among providers and the high dropout rate. The reviewers emphasized that this made the results inconclusive rather than showing that the Studying Communication in Oncologist-Patient Encounters (SCOPE) program was ineffective.
  • The reviewers asked for more detail on the similarities and differences between the original SCOPE program and the enhanced SCOPE program. The researchers explained that both programs offered tailored feedback for providers, but the enhanced program also offered providers subjective feedback from patient reviewers.
  • The reviewers noted that the original SCOPE program also seemed to be acceptable to physicians, as evidenced by the greater uptake of the intervention in the prior study, and that it still improved communication. The researchers argued that the greater uptake in the previous study was not due to the original SCOPE program being more acceptable but to problems with physicians collecting the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey. The researchers explained in the report that because of changes in training requirements, physicians were no longer required to give the CAHPS survey to their patients, and many found the burden of collecting the surveys to outweigh the benefit. This led to lower physician completion rates in this study.
  • The reviewers asked about the communication skills of the small number of providers who completed the study and whether there was any evidence that the providers who participated were more empathic communicators than those who didn’t, which would indicate a volunteer bias. The researchers said the baseline rate of empathic responses in this study, 16 percent, was lower than the 28 percent seen in their prior study, so volunteer bias did not seem to be present.
  • The reviewers wondered why the study had such a low retention rate. The researchers said the oncologists who finished the study did so because of their interest in the topic and their belief that the experience would improve their practice. The oncologists also received a small payment, credit toward board certification, and continuing education credits. The main barriers to participation were collecting recordings and obtaining patient consent.

Conflict of Interest Disclosures

View the COI disclosure form.

Project Information

Principal Investigator: James Tulsky, MD^
Organization: Dana-Farber Cancer Institute
Total Award Amount: $1,768,800
DOI: 10.25302/08.2020.CDR091501IC

Key Dates

Project Duration: 50 months
Project Start: September 2014
Project End: March 2020

Study Registration Information

^James Tulsky was affiliated with Duke University when this project was funded.

Last updated: October 20, 2021