Date of Award

August 2022

Degree Type

Dissertation

Degree Name

Doctor of Philosophy

Department

Health Sciences

First Advisor

Roger O Smith

Committee Members

Virginia C Stoffel, Mark V Johnston, Inga Wang, Jaclyn Schwartz

Keywords

Accessibility, Accessibility measurement, Community accessibility, Inclusive environments, Public building accessibility

Abstract

The accessibility of public buildings is of societal importance for all individuals, especially people with disabilities (PWD). Despite the efforts that have been made at the societal and community levels, PWD are still prevented from participating in the community by inaccessible environments. A major limitation in current practice regarding public building accessibility is the lack of a commonly accepted, comprehensive, and metrically sound tool to objectively measure the accessibility of public buildings. AccessTools is a newly developed assessment tool designed to identify, document, and objectively measure the complete accessibility of different building elements. This dissertation investigates the interrater reliability (IRR) of the AccessTools assessment and reports on three studies. The first study evaluates the IRR of AccessTools in on-site assessments and examines the effects of training on the IRR of the tool. The second study investigates the IRR of AccessTools in video-simulated assessments and examines the effects of training, raters' educational level, collection site, and students' performance on a knowledge quiz on the IRR of the tool. The third study investigates the effects of the branching system in AccessTools on IRR. A total of 573 participants were recruited from students taking assistive technology-related courses at multiple universities in the US over two academic years. The study involved completing a self-paced online training program on several topics surrounding the accessibility of public buildings and performing accessibility evaluations of restaurants using the AccessTools assessment. Participants were asked to complete a knowledge quiz at baseline and after completing each of the training and evaluation tasks. A cross-over design was implemented to study the effect of the training program.
In the first academic year, the students performed on-site accessibility evaluations, while video-simulated evaluations were implemented in the second academic year due to the COVID-19 pandemic. Gwet's AC1 agreement coefficients were compared across the different rater groups, collection sites, and restaurants. The AC1 reliability coefficients for both studies were of 'Moderate' strength when averaged across the restaurants (AC1 = 0.504 for Study 1 and 0.531 for Study 2). Interestingly, the results from the two studies showed that training had a negative effect on the IRR of the tool, and that the undergraduate raters achieved higher AC1 coefficients than their graduate peers. Agreement coefficients differed across the collection sites. The participant group with higher knowledge-quiz scores had significantly higher AC1 agreement coefficients across all three assessed restaurants. The branching study revealed that different branching levels had mixed effects on AC1 agreement coefficients. A simulation study was conducted to investigate the effects of answering higher root-level questions in the tool's branching system. Overall, the AccessTools assessment achieved at least 'Moderate' IRR across the two evaluation mediums. Several factors need to be considered to improve the IRR of the tool. A more comprehensive tool-specific training module is needed. Another factor is changing the scoring system of the higher root-level questions to a dichotomous system, while maintaining the trichotomous system for the lower root-level questions. Additionally, the findings of this study and others suggest that video-simulated evaluations could improve the training of novice raters before they measure community building accessibility.
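For readers unfamiliar with the statistic, the Gwet's AC1 coefficient used above can be sketched for the two-rater case as follows. This is an illustrative implementation only, not the dissertation's analysis code, and the example ratings are hypothetical.

```python
# Sketch of Gwet's AC1 for two raters scoring the same items.
# AC1 = (pa - pe) / (1 - pe), where pa is observed agreement and
# pe is AC1's chance-agreement term based on category prevalences.
from collections import Counter

def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 agreement coefficient for two raters (q >= 2 categories)."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)

    # Observed agreement: fraction of items both raters scored identically.
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Mean classification proportion pi_k per category across both raters.
    counts = Counter(ratings_a) + Counter(ratings_b)
    q = len(counts)  # number of categories observed
    pi = {k: c / (2 * n) for k, c in counts.items()}

    # Chance agreement for AC1: (1 / (q - 1)) * sum_k pi_k * (1 - pi_k).
    pe = sum(p * (1 - p) for p in pi.values()) / (q - 1)

    return (pa - pe) / (1 - pe)

# Hypothetical binary ratings (1 = accessible, 0 = not accessible):
print(gwet_ac1([1, 1, 0, 1], [1, 1, 0, 0]))  # -> 0.529... (9/17)
```

Unlike Cohen's kappa, AC1's chance term stays small when one category dominates, which makes it better behaved for highly prevalent responses; this property is one reason it is often preferred for accessibility-style checklists.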
