Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy. / Dagnaes-Hansen, Julia; Mahmood, Oria; Bube, Sarah; Bjerrum, Flemming; Subhi, Yousif; Rohrsted, Malene; Konge, Lars.

In: Journal of Surgical Education, Vol. 75, No. 3, 2018, p. 671-677.

Harvard

Dagnaes-Hansen, J, Mahmood, O, Bube, S, Bjerrum, F, Subhi, Y, Rohrsted, M & Konge, L 2018, 'Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy', Journal of Surgical Education, vol. 75, no. 3, pp. 671-677. https://doi.org/10.1016/j.jsurg.2017.10.005

APA

Dagnaes-Hansen, J., Mahmood, O., Bube, S., Bjerrum, F., Subhi, Y., Rohrsted, M., & Konge, L. (2018). Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy. Journal of Surgical Education, 75(3), 671-677. https://doi.org/10.1016/j.jsurg.2017.10.005

Vancouver

Dagnaes-Hansen J, Mahmood O, Bube S, Bjerrum F, Subhi Y, Rohrsted M et al. Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy. Journal of Surgical Education. 2018;75(3):671-677. https://doi.org/10.1016/j.jsurg.2017.10.005

Author

Dagnaes-Hansen, Julia ; Mahmood, Oria ; Bube, Sarah ; Bjerrum, Flemming ; Subhi, Yousif ; Rohrsted, Malene ; Konge, Lars. / Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy. In: Journal of Surgical Education. 2018 ; Vol. 75, No. 3. pp. 671-677.

Bibtex

@article{85901e72309049d78cb95edc690d8658,
title = "Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy",
abstract = "Objective: Direct observation in assessment of clinical skills is prone to bias, demands the observer to be present at a certain location at a specific time, and is time-consuming. Video-based assessment could remove the risk of bias, increase flexibility, and reduce the time spent on assessment. This study investigated if video-based assessment was a reliable tool for cystoscopy and if direct observers were prone to bias compared with video-raters. Design: This study was a blinded observational trial. Twenty medical students and 9 urologists were recorded during 2 cystoscopies and rated by a direct observer and subsequently by 2 blinded video-raters on a global rating scale (GRS) for cystoscopy. Both intrarater and interrater reliability were explored. Furthermore, direct observer bias was explored by a paired samples t-test. Results: Intrarater reliability calculated by Pearson's r was 0.86. Interrater reliability was 0.74 for single measure and 0.85 for average measures. A hawk-dove effect was seen between the 2 raters. Direct observer bias was detected when comparing direct observer scores to the assessment by an independent video-rater (p < 0.001). Conclusion: This study found that video-based assessment was a reliable tool for cystoscopy with 2 video-raters. There was a significant bias when comparing direct observation with blinded video-based assessment.",
keywords = "cystoscopy, interrater variability, rater-based assessment, surgical education, video recording",
author = "Julia Dagnaes-Hansen and Oria Mahmood and Sarah Bube and Flemming Bjerrum and Yousif Subhi and Malene Rohrsted and Lars Konge",
year = "2018",
doi = "10.1016/j.jsurg.2017.10.005",
language = "English",
volume = "75",
pages = "671--677",
journal = "Journal of Surgical Education",
issn = "1931-7204",
publisher = "Elsevier",
number = "3",
}

RIS

TY - JOUR

T1 - Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy

AU - Dagnaes-Hansen, Julia

AU - Mahmood, Oria

AU - Bube, Sarah

AU - Bjerrum, Flemming

AU - Subhi, Yousif

AU - Rohrsted, Malene

AU - Konge, Lars

PY - 2018

Y1 - 2018

N2 - Objective: Direct observation in assessment of clinical skills is prone to bias, demands the observer to be present at a certain location at a specific time, and is time-consuming. Video-based assessment could remove the risk of bias, increase flexibility, and reduce the time spent on assessment. This study investigated if video-based assessment was a reliable tool for cystoscopy and if direct observers were prone to bias compared with video-raters. Design: This study was a blinded observational trial. Twenty medical students and 9 urologists were recorded during 2 cystoscopies and rated by a direct observer and subsequently by 2 blinded video-raters on a global rating scale (GRS) for cystoscopy. Both intrarater and interrater reliability were explored. Furthermore, direct observer bias was explored by a paired samples t-test. Results: Intrarater reliability calculated by Pearson's r was 0.86. Interrater reliability was 0.74 for single measure and 0.85 for average measures. A hawk-dove effect was seen between the 2 raters. Direct observer bias was detected when comparing direct observer scores to the assessment by an independent video-rater (p < 0.001). Conclusion: This study found that video-based assessment was a reliable tool for cystoscopy with 2 video-raters. There was a significant bias when comparing direct observation with blinded video-based assessment.

AB - Objective: Direct observation in assessment of clinical skills is prone to bias, demands the observer to be present at a certain location at a specific time, and is time-consuming. Video-based assessment could remove the risk of bias, increase flexibility, and reduce the time spent on assessment. This study investigated if video-based assessment was a reliable tool for cystoscopy and if direct observers were prone to bias compared with video-raters. Design: This study was a blinded observational trial. Twenty medical students and 9 urologists were recorded during 2 cystoscopies and rated by a direct observer and subsequently by 2 blinded video-raters on a global rating scale (GRS) for cystoscopy. Both intrarater and interrater reliability were explored. Furthermore, direct observer bias was explored by a paired samples t-test. Results: Intrarater reliability calculated by Pearson's r was 0.86. Interrater reliability was 0.74 for single measure and 0.85 for average measures. A hawk-dove effect was seen between the 2 raters. Direct observer bias was detected when comparing direct observer scores to the assessment by an independent video-rater (p < 0.001). Conclusion: This study found that video-based assessment was a reliable tool for cystoscopy with 2 video-raters. There was a significant bias when comparing direct observation with blinded video-based assessment.

KW - cystoscopy

KW - interrater variability

KW - rater-based assessment

KW - surgical education

KW - video recording

U2 - 10.1016/j.jsurg.2017.10.005

DO - 10.1016/j.jsurg.2017.10.005

M3 - Journal article

C2 - 29102559

AN - SCOPUS:85033471752

VL - 75

SP - 671

EP - 677

JO - Journal of Surgical Education

JF - Journal of Surgical Education

SN - 1931-7204

IS - 3

ER -
