Date of Award

7-2022

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Behavioral Analysis

First Advisor

David A. Wilder

Second Advisor

Patrick D. Converse

Third Advisor

Rachael E. Ferguson

Fourth Advisor

Robert A. Taylor

Abstract

The Performance Diagnostic Checklist–Human Services (PDC-HS) is an informant-based assessment tool used by organizational behavior management (OBM) practitioners in human service settings. An updated version, the PDC-HS 1.1, has recently been released. The present study measured the validity and reliability of the PDC-HS 1.1 by analyzing answers obtained while watching a video of a simulated interview between a consultant and a supervisor. Three video vignettes were created, each depicting a performance concern in one or more areas of the PDC-HS 1.1. Twenty-one participants watched all vignettes and completed the PDC-HS 1.1 based on the answers provided. Validity was measured by calculating the percentage of participants who correctly identified the area(s) of the PDC-HS 1.1 responsible for the performance concern presented in each vignette. Participants repeated the assessment about two weeks later to assess test-retest reliability. Interrater reliability was measured by pairing participants randomly and comparing their scores. In addition, an intervention selection component was included to assess whether participants selected an intervention corresponding to the indicated domain. Results demonstrate that about 90% of participants correctly identified the indicated area of the PDC-HS 1.1 and 79% selected an appropriate intervention. Test-retest and interrater reliability scores were above 85%, demonstrating that the tool is generally reliable. The results provide support for the use of informant-based assessments in human service settings and suggest that participants with relatively little experience in behavior analysis can conduct assessment interviews accurately and reliably.
