Final Research Report


Journal Citations

Peer-Review Summary

Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.

The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments. 

Peer reviewers commented, and the researchers made changes or provided responses. Those comments and responses included the following:

  • The reviewers suggested asking stakeholders about the implications of the report’s findings. The researchers had sought stakeholder feedback on the ideas used in the report and incorporated that feedback into their revisions. They explained that early stakeholder input led them to spend more time reviewing the literature and investigating analysis methods, which they said strengthened the study but took time. Because of these time constraints, the researchers could not obtain stakeholders’ feedback on the study implications before submitting the final report.
  • A reviewer took issue with the researchers’ recommendation that published discrete choice experiments present more methodologic detail, saying this was unlikely given journal space constraints and that the recommendation seemed to criticize most of the existing literature on discrete choice experiments. The researchers adjusted their language to avoid appearing to blame the authors of published studies for leaving out information, but they reiterated that inadequate methodologic reporting is a barrier to assessing study quality and that it is reasonable to expect more methodologic detail in the supplementary material of published discrete choice experiments.
  • The reviewers questioned the validity and generalizability of the simulations in this methods study because the simulations were based on two choice-experiment studies that seemed atypical of choice-experiment studies conducted in the last several years. According to the reviewers, these two studies appeared to differ from the rest of the literature in the number of choice questions asked, the number of study participants, and the types of attributes the choice questions measured. The researchers disagreed that both studies were unusual, pointing out that the second study had a sample size and goals similar to those of many other studies in the literature. However, they did revise their systematic review discussion to include findings on choice tasks, alternatives, and attributes, and they compared the two studies used in their simulations to the studies discussed in the systematic review.
  • Reviewers said the researchers’ recommendations raised serious questions about the allocation of scarce research resources: although the researchers had to accept tradeoffs in their own study design given realistic time and funding constraints, they recommended that other researchers make laborious efforts to fine-tune their model designs. The researchers responded that they and the expert stakeholders they worked with agreed this work points to an important resource tradeoff. They said they had backed away from implying that much of the literature in this area may be fatally flawed, although that may be the case, and they noted that the strategy proposed at the end of the discussion section may help analysts manage resource constraints without sacrificing study quality.

Conflict of Interest Disclosures

Project Information

Principal Investigator: Alan R. Ellis, PhD, MSW
Organization: North Carolina State University
Award Amount: $123,972
DOI: 10.25302/03.2021.ME.160234572
Project Title: Improving Study Design and Reporting for Stated Choice Experiments

Key Dates

December 2016 – November 2020

Study Registration Information

Tags

Has Results
Award Type
State
Last updated: March 4, 2022