Development and Validation of a Test for Competence in Evidence-Based Medicine

Author Department

Geriatric Medicine; Medicine

Document Type

Article, Peer-reviewed

Publication Date

12-2019

Abstract

BACKGROUND:

Medical educators need valid, reliable, and efficient tools to assess evidence-based medicine (EBM) knowledge and skills. Available EBM assessment tools either do not assess skills or are laborious to grade.

OBJECTIVE:

To validate a multiple-choice-based EBM test, the Resident EBM Skills Evaluation Tool (RESET).

DESIGN:

Cross-sectional study.

PARTICIPANTS:

A total of 304 medicine residents from five training programs and 33 EBM experts made up the validation cohort.

MAIN MEASURES:

Internal reliability, item difficulty, and item discrimination were assessed. Construct validity was assessed by comparing the mean total scores of residents with those of experts. Experts were also asked to rate the importance of each test item to assess content validity.

KEY RESULTS:

Experts had higher total scores than residents (35.6 vs. 29.4, P < 0.001) and also scored significantly higher on 11 of 18 items. Cronbach's alpha was 0.6 (acceptable), and no item had a low item-total correlation. Item difficulty ranged from 7% to 86%. All items were deemed "important" by more than 50% of experts.
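The reported statistics (Cronbach's alpha, item-total correlation, item difficulty) follow standard psychometric formulas. As an illustrative sketch only, not the authors' analysis code, the Python below computes these quantities for a hypothetical respondents-by-items matrix of dichotomously scored answers; the variable names and the simulated data are assumptions.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

def corrected_item_total_correlations(scores: np.ndarray) -> np.ndarray:
    """Correlation of each item with the total score of the remaining items."""
    totals = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, i], totals - scores[:, i])[0, 1]
        for i in range(scores.shape[1])
    ])

# Hypothetical data: random 0/1 responses for 304 residents on 18 items
# (for illustration only; real item responses would come from the RESET).
rng = np.random.default_rng(0)
simulated = rng.integers(0, 2, size=(304, 18)).astype(float)
print("Cronbach's alpha:", round(cronbach_alpha(simulated), 2))
print("Item difficulty (proportion correct):", simulated.mean(axis=0).round(2))
print("Item-total correlations:", corrected_item_total_correlations(simulated).round(2))
```

With real responses, item difficulty is simply the proportion of respondents answering each item correctly, and a low corrected item-total correlation flags an item that does not discriminate well.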

CONCLUSIONS:

The RESET is a reliable and valid instrument for assessing competence in EBM. It is easy to administer and grade and could be used to guide and assess interventions in EBM education.

PMID

31848856
