Results Summary
What was the research about?
People who have cancer often feel scared, anxious, or depressed. They may have a hard time asking questions or listening to what their doctor is saying. When doctors respond to patients with understanding, or empathy, patients may be more able to follow treatment plans.
In this study, the research team created a training program called SCOPE. The program focuses on increasing doctors’ skills in responding to patients’ feelings. The team wanted to learn whether, compared with a standard training program, SCOPE improved
- How satisfied patients were with their doctor’s communication
- How often cancer doctors showed empathy for their patients’ feelings
What were the results?
Satisfaction with doctor communication didn’t differ between patients who saw doctors in SCOPE and patients who saw doctors in the standard training program. A month after the program, doctors in SCOPE showed empathy for their patients almost three times as often as doctors in the standard program.
Who was in the study?
The study started with 148 cancer doctors from across the United States. But only 27 doctors finished the study. Of these, 63 percent were white, 22 percent were Asian, and 15 percent were other races. The average age was 53, and 85 percent were men.
What did the research team do?
The research team assigned cancer doctors by chance to SCOPE or a standard training program. Before starting their programs, each doctor audio-recorded clinic visits with four patients. The doctors also gave surveys to five patients. Surveys asked about the patients’ satisfaction with their doctor’s communication.
Doctors in SCOPE watched online training videos that taught communication skills and how to respond to patients’ feelings. Doctors received patient survey results and feedback on their clinic visit recordings from the research team and a patient advisory group. The feedback pointed out times doctors used skills from the videos; it also gave advice on things doctors could improve. After finishing the video lessons, each doctor recorded two more clinic visits. Doctors got a second round of feedback on these recordings.
Doctors in the standard training program received patient survey results but didn’t get feedback on their videos. After completing the training, the doctors set goals for improving their communication with patients.
A month after completing their programs, all doctors recorded four more clinic visits. They also collected surveys from five new patients. The research team compared these recordings and surveys with those from the start of the study. The doctors submitted a total of 455 patient surveys and 254 clinic visit recordings.
An advisory group of patient advocates gave input throughout the study.
What were the limits of the study?
Because only 27 doctors finished the study, the research team can’t say for sure how well SCOPE works. Before the study, cancer doctors had to take communication training to stay certified. During the study, such training became voluntary, which may have reduced how many doctors finished the study.
Future research could explore ways to add communication training to existing programs that improve cancer care quality.
How can people use the results?
Hospitals and health clinics could use these results when considering ways to help cancer doctors improve their communication skills.
Professional Abstract
Objective
To compare the effectiveness of an online, interactive communication training program for oncologists, versus standard communication training, in increasing patient satisfaction with oncologist communication
Study Design
| Design Element | Description |
|---|---|
| Design | Randomized controlled trial |
| Population | 148 ABIM-certified oncologists |
| Interventions/Comparators | SCOPE online communication training program; standard ABIM communication skills practice improvement module |
| Outcomes | Primary: patient satisfaction with oncologist communication. Secondary: oncologist responses to opportunities to show empathy |
| Timeframe | 1-month follow-up for primary outcome |
In this randomized controlled trial, the research team evaluated the effectiveness of the Studying Communication in Oncologist-Patient Encounters (SCOPE) program in increasing patient satisfaction with oncologist communication and the frequency of oncologist responses to opportunities to show empathy.
The research team randomly assigned oncologists to SCOPE or a standard communication improvement program. Before beginning their programs, all oncologists collected satisfaction surveys from five patients and audio-recorded clinic visits with four different patients.
Oncologists assigned to SCOPE received results from their patients’ satisfaction surveys and tailored feedback on their visit recordings. They then completed the five SCOPE modules: Principles of Effective Communication, Recognizing Empathetic Opportunities, Responding to Empathetic Opportunities, Conveying Prognosis, and Answering Difficult Questions. Each SCOPE module taught communication skills using video demonstrations. The modules also played audio clips from each oncologist’s clinic visits to show when they used the skills and to offer advice on alternate approaches when doctors used negative behaviors or missed opportunities to show empathy. Patient advocates from the study’s advisory board also provided feedback on the recordings. After completing the video lessons, oncologists recorded two more clinic visits and received another round of tailored feedback.
Oncologists in the control arm received satisfaction survey results and completed the American Board of Internal Medicine (ABIM) communication skills practice improvement module. This program did not provide feedback. Next, the oncologists identified areas for self-improvement and created an action plan for achieving their goals in those areas.
One month after completing their programs, all oncologists recorded four clinic visits and collected satisfaction surveys from a new sample of five patients. The oncologists submitted a total of 455 satisfaction surveys and 254 clinic visit recordings.
The study initially included 148 ABIM-certified oncologists, but only 27 completed the study. Of these, 63% were white, 22% were Asian, and 15% were other races. The average age was 53, and 85% were male.
A patient advisory council provided input throughout the study.
Results
Patient satisfaction with oncologist communication did not differ between the SCOPE and control groups. After one month, oncologists in SCOPE responded to a greater proportion of opportunities to show empathy, relative to baseline, than oncologists in the control group (odds ratio = 2.75; 95% confidence interval [CI]: 1.10, 7.01).
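For readers unfamiliar with how an odds ratio and its confidence interval are calculated, the sketch below works through the standard arithmetic on hypothetical counts. The counts are illustrative only; the report gives just the summary estimate (2.75; 95% CI: 1.10, 7.01), not the underlying data.

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): empathic responses
# vs. missed opportunities, by study arm.
scope_yes, scope_no = 33, 67        # SCOPE arm: responded / missed
control_yes, control_no = 15, 85    # control arm: responded / missed

# Odds ratio: ratio of the odds of an empathic response in each arm
odds_ratio = (scope_yes / scope_no) / (control_yes / control_no)

# 95% Wald confidence interval, computed on the log-odds scale
se = math.sqrt(1/scope_yes + 1/scope_no + 1/control_yes + 1/control_no)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A wide interval like the study's (1.10 to 7.01) reflects the small number of completing oncologists; the interval excludes 1, which is why the empathy difference is reported as a finding.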
Limitations
The study did not reach its sample target of 100 oncologists completing the study, which limited the research team’s ability to detect differences between the SCOPE and control groups. Before the study, practice improvement modules were a required part of oncologists’ recertification process. During the study, such training became voluntary, which may have reduced study participation.
Conclusions and Relevance
The study did not find evidence that SCOPE increased patient satisfaction with oncologist communication. However, among oncologists who completed the program, SCOPE improved responses to opportunities to show empathy.
Future Research Needs
Future research could explore ways to integrate communication training into existing oncology quality improvement programs.
Peer-Review Summary
Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.
The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments.
Peer reviewers commented and the researchers made changes or provided responses. Those comments and responses included the following:
- The reviewers asked for clarification on which of the project aims were successfully addressed and which could not be addressed because of the difficulty of enrolling and retaining physicians in the study. The researchers added text to clarify that aim 2 could not be fully addressed because of the low uptake of the intervention among providers and the high dropout rate. The reviewers emphasized that this made the results inconclusive rather than showing that the Studying Communication in Oncologist-Patient Encounters (SCOPE) program was ineffective.
- The reviewers asked for more detail on the similarities and differences between the original SCOPE program and the enhanced SCOPE program. The researchers explained that both the original and enhanced SCOPE programs offered tailored feedback for providers, but the enhanced program also offered subjective feedback to providers from patient reviewers.
- The reviewers noted that the original SCOPE program also seemed to be acceptable to physicians, as evident from the greater uptake of the intervention in the prior study, and still improved communication. The researchers argued that the greater uptake in the previous study was due not to the original SCOPE program being more acceptable but to problems with physicians collecting the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey. The researchers explained in the report that because of changes in training requirements, physicians were no longer mandated to provide the CAHPS survey to their patients, and many found the burden of completing the surveys to outweigh the benefit. This led to lower physician completion rates in this study.
- The reviewers asked about the communication skills of the small number of providers who did complete the study and whether there was any evidence that the providers who did participate were more empathic communicators than those who didn’t, indicating a volunteer bias. The researchers said the baseline rate of empathic responses in this study, 16 percent, was lower than in their prior study in which it was 28 percent, so volunteer bias did not seem to be present here.
- The reviewers wondered why the study had such a low retention rate. The researchers said the oncologists who finished the study wanted to do so because of their interest in the topic, and they thought the experience would improve their practice. The oncologists also received a small amount of money, board certification, and continuing education credits. The barriers to participation were primarily collecting recordings and obtaining patient consent.
^James Tulsky was affiliated with Duke University when this project was funded.