Dissemination & Implementation Science
Implementing a Progress and Practice Monitoring Dashboard to Enhance Clinical Co-Management
Puanani J. Hee, Ph.D.
Clinical Data Director
Hawai'i State Child and Adolescent Mental Health Division
Lihue, Hawaii, United States
Tristan J. Maesaka, M.A.
Graduate Student
University of Hawai‘i at Manoa
Honolulu, Hawaii, United States
Kristy Bowen, M.A.
Graduate Assistant
University of Hawai‘i at Manoa
Honolulu, Hawaii, United States
Ashlyn W. W. A. Wong, B.A. (she/her/hers)
Graduate Student
University of Hawai‘i at Mānoa
Honolulu, Hawaii, United States
Brad Nakamura, Ph.D.
Professor and Director
University of Hawaii at Manoa
Honolulu, Hawaii, United States
Max Sender, M.S.
Data Analyst
Hawai‘i State Department of Health
Honolulu, Hawaii, United States
David Jackson, Ph.D.
Research & Evaluation Specialist
University of Hawaii & Hawaii Child and Adolescent Mental Health Division
Honolulu, Hawaii, United States
Scott K. Shimabukuro, ABPP, Ph.D.
Practice Development Specialist
Hawaii Department of Health
Honolulu, Hawaii, United States
Keli Acquaro, M.A.
Administrator
Child and Adolescent Mental Health Division, Department of Health Hawaii
Honolulu, Hawaii, United States
Trina E. Orimoto, Ph.D.
Dissemination and Implementation Specialist
Hawai‘i Department of Health
Honolulu, Hawaii, United States
In the Hawai‘i youth public mental health system, a clinical co-management model serves youth and family clients through case management, clinical oversight provided by a psychologist or psychiatrist, and therapeutic services delivered by community-based contracted agencies. Such delivery models carry several benefits, and formal structures help ensure that clinical co-management occurs, including routine treatment planning meetings, required documentation, and clearly defined roles for treatment team members. Remaining challenges include maintaining consistent team-based utilization management and review of youth practice and progress clinical dashboards to drive decision-making (Chorpita & Daleiden, 2018; Nakamura et al., 2012). To support team-based, data-driven decisions and co-management of care, we developed and implemented a user-informed progress and practice dashboard designed to expedite identification of significant concerns warranting treatment decisions and team communication. This implementation study investigates the dashboard’s (1) acceptability, feasibility, and appropriateness; (2) monthly utilization patterns; and (3) facilitators of and barriers to use.
Sixty-one clinical staff members serving approximately 2,000 youth statewide (average age 13.7 years, 48% female, 56% multiethnic) across eight regional centers completed the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017). Clinical staff were primarily female (92%), ranged in age from 30 to 65, and approximately half (52%) held a master’s degree. Initial findings suggest that the vast majority of respondents (94-97%) rated the report as highly acceptable, feasible, and appropriate. Further, over the one-year period since report implementation, at least 80% of clinicians statewide, and at least one individual per region, used the report each month. Respondents reported facilitators of and barriers to using the clinical reporting tool via two open-ended items. Three graduate students used the Consolidated Framework for Implementation Research 2.0 Coding System (CFIR; Damschroder et al., 2022) to develop coding instructions and subsequently code the responses for themes. Interrater reliability was moderate for facilitators (κ = 0.44, p < .01; 52.5% agreement) and substantial for barriers (κ = 0.70, p < .01; 74.2% agreement). Common facilitator themes included innovation design, innovation complexity, motivation, and work infrastructure; for example, “being able to access information about my cases at a glance in one place.” Common barrier themes included capability, opportunity, and innovation recipients; for instance, “not understanding where all the functions are.” These initial findings suggest that systems wishing to implement similar progress and practice monitoring dashboards should consider using centralized data reporting tools, provide training, and build dashboard use into expectations and workflows.