Patient-Centered Outcomes Research Institute

Pilot Project: Measuring Care in Psychiatric Hospitals from Patient and Staff Points of View

This project has results available

Public Abstract


PCORI funded the Pilot Projects to explore how to conduct and use patient-centered outcomes research in ways that can better serve patients and the healthcare community.

Background

There is little research on how to produce the best treatment results in psychiatric hospitals. At the time of this study, little was known about how patients feel about the care they receive in mental health units and which experiences are important to them and their recovery.

Project Purpose

The purpose of this study was to develop and test a way to measure safety and patient-centered care in mental health care. The researchers developed a measure called the Combined Assessment of Psychiatric Environments (CAPE). The CAPE consists of two surveys: one for patients and one for nurses. The researchers created two surveys because both patients and nurses need to have a positive environment to achieve good care. The researchers did several tests to make sure the patient and nurse surveys were measuring the same issues.

Research Methods

To develop the CAPE, the researchers started by looking at studies on patient and nurse experiences. They also looked at what individuals who had been hospitalized said in earlier focus groups and what nurses working in mental health units said during past individual interviews. Then the researchers made a list of important experiences patients have while getting treatment in mental health units. They asked 30 former patients and 30 nurses to rate the importance of each experience, using a four-point scale. The researchers then used the experiences that the patients and nurses had rated as most important to create the two surveys. Expert panels consisting of eight patients and eight nurses reviewed the surveys. The expert panels looked at the substance of the questions, how the questions were grouped together, and whether any important concepts were missing. The researchers used the feedback to revise the surveys. Finally, the researchers did interviews to learn whether people understood the questions in the survey the way the researchers thought they would. Ten patients and 10 nurses were asked what questions meant, why they thought questions were or were not important, and if the questions made sense together. Researchers used these answers to improve the CAPE surveys.

After creating the surveys, the researchers did a pilot test. The pilot test was a way of seeing if the surveys worked with a small group of patients and nurses before asking larger groups to use them. The research team tested the CAPE surveys with 150 patients and 113 nurses. Patients needed to be at least 21 years old and speak English. They also had to have been in the mental health unit for at least four days and be planning to go home the next day. Nursing staff needed to be registered nurses or mental health counselors who worked at least half-time in a mental health unit.

The pilot test took place in six mental health units with an average of 20 beds each. These facilities included three mental health units in community hospitals, two mental health units in medical centers, and one psychiatric hospital. Patients in these mental health units were people who had a mental illness and were considered to be dangerous to themselves or others, or unable to take care of themselves.

In the pilot test, patients and nurses completed their respective versions of the CAPE survey. Patients also completed the 16-question Perceptions of Care survey, which asks about patient satisfaction. Nurses and mental health workers also completed the 31-question Practice Environment Scale of the Nursing Work Index, which asks about the nursing work environment.

The research team looked for themes in the answers to the survey questions. The team also used statistical tests to see whether people answered similar questions the same way.

Findings

Patients’ answers to the CAPE survey focused on two themes about their treatment:

  • Were staff capable?
  • Was treatment effective?

Staff answers to the CAPE survey showed five aspects of experiences on the mental health unit:

  • a sense of being effective
  • whether there were enough resources and staff
  • management involvement with care
  • how well staff worked together
  • how much independence the staff have on the unit

For the most part, patients and staff answered similar types of questions in the same ways.

Limitations

There were some unexplained differences between patients’ and nurses’ answers to survey questions. Patients’ answers suggested that they believed they frequently had all of the experiences the study asked about, regardless of what mental health unit they were in. In contrast, staff answers about topics such as whether their unit was safe were different, depending on the unit in which they worked. This could be because the survey’s questions need to be worded differently or because patients are likely to answer survey questions about treatment experiences in a positive way.

Importance of Findings

The CAPE patient and staff surveys appear to work well together and to reflect what patients and staff experience. The CAPE survey questions seem to show what patients believe are important aspects of treatment. Until recently, patients treated in mental health units were not given surveys about their experiences in obtaining mental health care. The CAPE surveys contribute a way to measure what matters to patients when they obtain mental health care.

Sharing the Results

The research team has published on the CAPE and has presented the study at professional meetings (see below). Other researchers have begun using the CAPE to test the relationship between quality of care and trust in the mental health system.

Future Research

The research team is creating a shorter version of the CAPE. In addition, future research will determine whether the CAPE produces different results with patients who are in restraints than with those who are not.

Technical Abstract


PCORI funded the Pilot Projects to explore how to conduct and use patient-centered outcomes research in ways that can better serve patients and the healthcare community.

Background

There is little research on best practices or clinical outcomes of psychiatric inpatient treatment, and few general-use tools exist to assess the quality of inpatient psychiatric settings from the patients’ perspective. Specifically, it is important to understand which experiences are important to patients and their recovery.

Project Purpose

The purpose of this study was to develop and conduct initial testing on a measure to evaluate safety and person-centered care in inpatient psychiatric settings. The measure is predicated on the idea that if optimal care is to be achieved, then all major stakeholders (patients and staff) need to experience a positive environment.

Study Design

This study employed an instrument development design to create the patient and staff nurse versions of the Combined Assessment of Psychiatric Environments (CAPE) (phase one) and to test dimensions and items of both versions (phase two). In preliminary work, dimensions for the CAPE patient and staff nurse versions were derived from the qualitative literature on patient and staff nurse experiences, and items were generated from focus groups with individuals who had been hospitalized and from interviews with staff nurses working in psychiatric units. Phase one of the project included event judgment, expert panel review, and cognitive interviewing.

  • Event Judgment: From the previously conducted staff nurse interviews and consumer focus groups, a list of items describing important experiences of inpatient psychiatric treatment was generated and, informed by existing literature on the psychiatric inpatient and staff nurse experience, organized into five dimensions. Following item reduction, 30 former inpatients and 30 nursing staff were asked to judge the importance of the nominated events to establish which were most valued. Judgments were made on each statement by rating the importance of events on a four-point scale. From these data, potential items for both versions of the CAPE measure were generated.
  • Expert Panel Review: Items nominated as most important and relevant during event judgment were organized into CAPE patient and staff nurse pilot versions. These pilot versions were then reviewed by an expert panel of eight peer specialists (patient version) and eight expert staff nurses (staff nurse version). Each member of the expert panel was mailed the pilot version and asked to evaluate the dimensions and items from the perspectives of patients (peer specialist panel) and psychiatric nurses (expert nurses panel). Items were evaluated based on their importance, relevance, and fit within dimensions. Participants were also asked to respond to an open-ended question about whether any important quality dimensions of the psychiatric inpatient setting were missing.
  • Cognitive Interviewing: The purpose of the cognitive interviews was to determine whether items were understandable, answerable, and evoked the responses anticipated by the investigators. Following the expert panel review, 25 items were selected for beta versions of the surveys. From the group of participants in the event judgment procedure, 10 participants in each group (staff and consumer) were asked to elaborate on their understanding of these items to ascertain how they interpreted items, their rationale for the importance of items, and whether the items fit within their dimensions. Researchers used these data to refine the items in both versions of the CAPE measure.
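
As an illustration of the event-judgment step, the sketch below ranks hypothetical candidate items by their mean importance rating on the four-point scale and retains only those that clear a cutoff. The item wordings, ratings, and the 3.0 threshold are invented for the example; they are not taken from the study.

```python
# Hypothetical event-judgment data: each candidate item is rated for
# importance on a four-point scale (1 = not important, 4 = very important).
ratings = {
    "Staff listened to my concerns": [4, 4, 3, 4],
    "The unit felt safe": [4, 3, 4, 4],
    "Meals arrived on time": [2, 1, 2, 2],
}

def retain_items(ratings, cutoff=3.0):
    """Keep items whose mean importance rating meets the cutoff."""
    means = {item: sum(r) / len(r) for item, r in ratings.items()}
    return {item: m for item, m in means.items() if m >= cutoff}

retained = retain_items(ratings)
# "Meals arrived on time" (mean 1.75) falls below the cutoff and is dropped.
```

In the actual study, 30 former inpatients and 30 nursing staff supplied the ratings, and retained events became candidate items for the two pilot instruments.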

This summary will focus on phase two of the study, testing the pilot instruments.

Participants, Interventions, Settings, and Outcomes

Because several staff in one unit did not submit completed forms, the research team ended with a sample of 113 staff and 150 patients. The inclusion criteria for patients required that they be (1) at least 21 years old, (2) in the unit for at least four days, (3) within 24 hours of discharge, and (4) English speakers. Nursing staff were either registered nurses or mental health counselors who worked at least half time in the study unit. Patients and staff members for the study were recruited in collaboration with the managers of the units.

The pilot patient and staff nurse versions of the CAPE were administered to convenience samples in six inpatient psychiatric units with an average bed size of 20. These facilities included three psychiatric units in community hospitals, two psychiatric units in medical centers, and a unit in a free-standing psychiatric hospital. All are designed to provide brief psychiatric inpatient treatment to patients who, because of mental illness, were deemed dangerous to themselves or others or unable to care for themselves.

Data Sources

To test the pilot versions of the CAPE, patients and staff nurses completed their respective versions of the pilot instrument. In addition, patients completed the Perceptions of Care (POC) survey, a 16-item, clinical-care-oriented, self-report satisfaction rating scale. Nurses and mental health workers completed the Practice Environment Scale of the Nursing Work Index (PES-NWI), a 31-item instrument used to measure the quality of the nursing work environment.

Data Analysis

Psychometric evaluation of the patient and staff versions of the CAPE was conducted in successive phases. First, internal consistency of the two forms (patient and staff nurse) was assessed by estimating Cronbach’s alpha, and test-retest reliability was estimated using 20 percent of the former inpatient and nurse samples. Convergent validity was assessed against the PES-NWI (staff nurses) and the POC survey (patients). Dimensionality was explored using confirmatory factor analysis (specifically, structural equation modeling) to examine whether the five quality dimensions described in the proposal were reflected in the empirical data.
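
The internal-consistency step can be sketched as follows. This is a generic computation of Cronbach’s alpha on a hypothetical respondent-by-item score matrix, not the study’s actual analysis code or data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Four hypothetical respondents answering three items on a 1-4 scale.
responses = np.array([
    [2, 3, 3],
    [3, 3, 4],
    [1, 2, 2],
    [4, 4, 3],
], dtype=float)
alpha = cronbach_alpha(responses)  # ≈ 0.86 for this toy matrix
```

Test-retest reliability, by contrast, correlates the same respondents’ scores across two administrations of the instrument, so it requires a second measurement occasion rather than a single score matrix.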

Findings

The overall alpha for the tools was .91 (staff nurses) and .91 (patients). The subscales ranged from .654 to .804 (patients) and .535 to .846 (staff nurses), but particular items had a negative impact on the subscale alphas. Thus, the tools had good internal consistency, but the researchers had questions about several of the subscales for both patients and staff nurses. Test-retest reliability coefficients were .82 (staff nurses) and .76 (patients), indicating adequate stability. Correlations between the PES-NWI (staff nurses) and the POC survey (patients) confirmed convergent validity, and correlations between the subscales of the instruments were in the expected direction. The tool measures two distinct dimensions for the patient version and five dimensions for the staff nurse version. For patients, the dimensions were:

  • Consumer perceptions of staff competence
  • Consumer perceptions of treatment efficacy

For staff nurses, dimensions were:

  • A sense of personal competence or their own effectiveness
  • Staff perception of whether there were adequate resources and staffing on the unit
  • Management involvement with care
  • How well staff in the unit worked together as a team
  • The amount of autonomous control the staff have in the unit

Limitations

The patient group tended to rate almost all items high, meaning they believed they frequently experienced the events depicted in the study items. Thus, although the staff saw differences in dimensions of their experiences in a particular unit, such as their view of the safety of the unit, these differences were not apparent in the patient data. It could be that the questions on the tool need to be reworded to capture differences, or it could be that the tool follows the known pattern of patients rating treatment experiences relatively positively.

Conclusions

The CAPE patient and staff nurse versions appear to be instruments with good internal consistency and internal reliability as well as convergent validity. There appear to be underlying dimensions that inform what patients believe are important aspects of treatment. There is a growing interest in patient-reported outcome measures and patient-centered care. Until recently, patients treated in inpatient psychiatric units were excluded from standard hospital measures (e.g., Press-Ganey).

Future Research

Future research will include hospitals using the CAPE pre- and post-culture change and testing to determine whether the CAPE is sensitive to restraint use. The researchers are also in the process of creating a shorter version of the CAPE.

Project Details

Principal Investigator
Kathleen R. Delaney, PhD, PMH-NP
Other Principal Investigator
Mary E. Johnson, PhD, PMH-CNS
Project Status
Completed; Results posted
Project Title
CAPE: Patient-Centered Quality Assessment of Psychiatric Inpatient Environments
Project Start Date
June 2012
Project End Date
December 2013
Organization
Rush University Medical Center/Rush College of Nursing
Year Awarded
2012
State
Illinois
Project Budget
$304,786
Study Registration Information
HSRP20133127

More on This Project

Delaney, K.R., Johnson, M.E., Fogg, L., "Development and Testing of the Combined Assessment of Psychiatric Environments: A Patient-Centered Quality Measure for Inpatient Psychiatric Treatment," Journal of the American Psychiatric Nurses Association (May 2015).

Page Last Updated: 
February 16, 2017


© 2011-2017 Patient-Centered Outcomes Research Institute. All Rights Reserved.
