This project's final research report is expected to be available by February 2020.
Peer review of PCORI-funded research helps make sure the report presents complete, balanced, and useful information about the research. It also assesses how the project addressed PCORI’s Methodology Standards. During peer review, experts read a draft report of the research and provide comments about the report. These experts may include a scientist focused on the research topic, a specialist in research methods, a patient or caregiver, and a healthcare professional. These reviewers cannot have conflicts of interest with the study.
The peer reviewers point out where the draft report may need revision. For example, they may suggest ways to improve descriptions of the conduct of the study or to clarify the connection between results and conclusions. Sometimes, awardees revise their draft reports twice or more to address all of the reviewers’ comments.
Peer reviewers commented, and the researchers made changes or provided responses. The comments and responses included the following:
- Reviewers said the level of experience of abstractors involved in the study may not reflect the level of expertise of those conducting actual systematic reviews. The reviewers suggested that not enough was done to validate the expertise of the volunteer abstractors. The researchers noted that they summarized the backgrounds of the trial participants in Tables 2 and 3. They pointed out that 90 percent of participants had previously abstracted data from 10 or more studies and that all had received some form of training in systematic reviews. The researchers also noted that nearly all participants described themselves as “somewhat or moderately experienced” or “very experienced.” For these reasons, the researchers did not change the report’s discussion of abstractor expertise.
- Reviewers pointed out that abstractors are normally well informed in the areas in which they prepare reviews and suggested that volunteers in this study were not likely to be as motivated as coauthors of a review. The researchers agreed that the motivation of data abstractors in the trial may differ from that of those preparing real-world systematic reviews, but they noted that participating in a research study could also increase motivation rather than decrease it. They added that the error rates and time reported in this study were consistent with those reported in other studies. Thus, the researchers did not feel there was enough evidence to support assumptions about abstractor motivation in either direction.
- Reviewers suggested that it may have been better to recruit data abstractors who were planning to work on reviews for publication. The researchers said they chose their research design with the goal of recruiting a large number of participants who were relatively representative of data abstractors. The researchers said they would encourage future research that considers alternate designs.
- Reviewers noted that aim 1 lacked a theoretical framework for technology adoption that could have guided study design and led to exploration of a wider range of outcome measures. The researchers responded that aim 1 focused only on software development and usability; although additional work on promoting adoption of the technology would be worthwhile, they said it was beyond the scope of this project.
Conflict of Interest Disclosures
The Conflict of Interest Disclosures for this project will be posted here soon.