Use of a Secure Web-Based Data Management Platform to Track Resident Operative Performance and Program Educational Quality Over Time

Author Department

Surgery

Document Type

Article, Peer-reviewed

Publication Date

6-2020

Abstract

Objective: In surgery residency programs, Accreditation Council for Graduate Medical Education-mandated performance assessment can include assessment in the operating room to demonstrate that necessary quality and autonomy goals are achieved by the conclusion of training. For the past 3 years, our institution has used the Ottawa Surgical Competency Operating Room Evaluation (O-SCORE) instrument to assess and track operative skills. Evaluation is accomplished in near real time using a secure web-based platform for data management and analytics (Firefly). When the platform's case-logging function is accessed, the O-SCORE instrument is simultaneously delivered to faculty members for rapid completion, improving both the quality and the timeliness of feedback. We sought to demonstrate the platform's utility in detecting changes in operative performance over time in response to focused educational interventions, based on stored case-log and O-SCORE data.

Design: Stored resident performance assessments for the most frequently performed laparoscopic procedures (cholecystectomy, appendectomy, inguinal hernia repair, ventral hernia repair) were examined for 3 successive academic years (2016-2019). During this time, 4 of 36 residents had received program-assigned supplemental simulation training to improve laparoscopic skills. O-SCORE data for these residents were extracted separately from peer data, which were used for comparison. Assigned training consisted of a range of videoscopic and virtual reality skills drills with performance objectives. O-SCORE responses were converted to integers, and autonomy scores for items pertaining to technical skill were compared before and after educational interventions (Student's t-tests). These scores were also compared to aggregate scores in the nonintervention group. Bayesian-modeled learning curves were used to characterize patterns of improvement over time.
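For illustration, the following is a minimal sketch in Python (not the authors' analysis code) of the comparison described above. It assumes the O-SCORE autonomy responses have already been mapped to integers on a 1-5 scale; the score values and the use of an unpaired two-sample test are illustrative assumptions.

# Illustrative sketch only: pre- vs post-intervention comparison of
# integer-converted O-SCORE autonomy scores using Student's t-test.
# The score values below are made up; SciPy's ttest_ind performs the test.
from scipy import stats

pre_scores = [2, 3, 2, 2, 3, 2, 2, 3]    # hypothetical autonomy scores before supplemental training
post_scores = [3, 4, 3, 4, 3, 4, 4, 3]   # hypothetical scores during/after the intervention period

t_stat, p_value = stats.ttest_ind(pre_scores, post_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")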

Setting: University of Massachusetts Medical School-Baystate Surgery Residency and Baystate Medical Center

Participants: General surgery residents (n = 36)

Results: During the period of review, 3325 resident cases were identified that met the case-type criteria. As expected, overall autonomy increased with the number of cases performed. The 4 residents who had been assigned supplemental training (6-18 months) had preintervention score averages lower than that of the nonintervention group (2.25 ± 0.43 vs 3.57 ± 1.02; p < 0.0001). During the respective intervention periods, all 4 residents improved their autonomy scores (increase to 3.40 ± 0.61; p < 0.0001). Similar improvements were observed for the tissue handling, instrument handling, bimanual dexterity, visuospatial skill, and operative efficiency component skills. Postintervention scores were not significantly different from scores for the nonintervention group. Bayesian-modeled learning curves showed a similar pattern of postintervention performance improvement.
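As a sketch of how a Bayesian-modeled learning curve might be computed (an illustrative reconstruction, not the authors' model): the probability of an independently performed case, assumed here to correspond to an autonomy score of 4 or higher, is modeled as a logistic function of case number, and the posterior over the curve's intercept and slope is evaluated on a coarse grid with a flat prior.

# Illustrative sketch only: a Bayesian-modeled learning curve via grid approximation.
# Each case contributes a binary outcome (assumed: autonomy score >= 4), and
# P(success) is a logistic function of case number with a flat prior over
# the intercept and slope.
import numpy as np

def bayesian_learning_curve(case_index, success, grid_size=101):
    """Return the posterior-mean probability of success at each case index."""
    case_index = np.asarray(case_index, dtype=float)
    success = np.asarray(success, dtype=float)

    # Coarse grids for the logistic intercept (a) and slope (b).
    a_grid = np.linspace(-5.0, 5.0, grid_size)
    b_grid = np.linspace(-0.5, 0.5, grid_size)
    A, B = np.meshgrid(a_grid, b_grid, indexing="ij")

    # Bernoulli log-likelihood of the observed outcomes under each (a, b) pair.
    logit = A[..., None] + B[..., None] * case_index            # shape (grid, grid, n_cases)
    p = np.clip(1.0 / (1.0 + np.exp(-logit)), 1e-12, 1 - 1e-12)
    loglik = np.sum(success * np.log(p) + (1 - success) * np.log(1 - p), axis=-1)

    # Flat prior on the grid, so the posterior is the normalized likelihood.
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    # Posterior-mean learning curve: average the fitted curves over the posterior.
    return np.tensordot(post, p, axes=([0, 1], [0, 1]))

# Synthetic example: 80 cases with performance that improves with experience.
rng = np.random.default_rng(0)
cases = np.arange(1, 81)
true_p = 1.0 / (1.0 + np.exp(-(-2.0 + 0.06 * cases)))
outcomes = rng.binomial(1, true_p)
curve = bayesian_learning_curve(cases, outcomes)
print(curve[0], curve[-1])   # estimated P(success) for the first and last case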

Conclusions: The data management platform proved to be an effective tool for tracking responses to supplemental training deemed necessary to close defined skills gaps in laparoscopic surgery. This was evident in both individual and aggregated data. We were gratified that, at the conclusion of the supplemental training, O-SCORE results for the intervention group were no different from those seen in the nonintervention group.

PMID

32600891
