ORIGINAL ARTICLE
Year : 2020  |  Volume : 6  |  Issue : 2  |  Page : 116-120

Traditional interviewing with Visual Analog Scale predicts emergency medicine resident performance and does not correlate to the standardized video interview: A prospective cohort and cross-sectional study


Department of Emergency Medicine, St. Luke's University Health Network, Bethlehem, PA, USA

Date of Submission: 06-Apr-2020
Date of Acceptance: 20-Apr-2020
Date of Web Publication: 29-Jun-2020

Correspondence Address:
Dr. Rebecca Jeanmonod
St. Luke's University Health Network, 801 Ostrum Street, Bethlehem, PA 18015
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/IJAM.IJAM_29_20

Abstract


Study Objective: We evaluate the validity of our internal resident applicant interview scoring system against actual resident performance after at least 2 years of training. We also compare our internal scoring system with the Standardized Video Interview (SVI) scores obtained through the Electronic Residency Application Service.
Materials and Methods: The first phase of our study was a before-and-after cohort of six consecutive classes from a single emergency medicine residency program. Faculty members were blinded to each resident's interview score before starting residency and asked to assess their current performance on the same scoring system. The second phase of the study was a prospective cohort of 124 emergency medicine residency candidates interviewing during the 2017–2018 cycle.
Results: Fifty-one residents at the postgraduate year 2 level or higher had scoring data available from their interviews and participated in the before-and-after phase of this study. Their mean interview score before starting their residencies was 69.2 on a 100-mm Visual Analog Scale (VAS), with a range of 38.5–94.3. Their performance VAS score after at least 2 years of training had a mean of 69.7 (standard deviation: 15.8), with a range of 13.2–90.1. Using Wilcoxon signed-rank testing for repeated measures, there was no significant difference (P = 0.95). Only four residents' VAS scores dropped more than 2 cm. The second phase included a cohort of 124 total applicants from the 2017–2018 cycle. Applicant VAS scores ranged from 5 to 91.7 mm, with a mean of 60 mm. Their SVI scores ranged from 13 to 27, with an average of 19.4. The values had a weakly negative relationship, with a correlation coefficient of −0.1.
Conclusions: Traditional interviews are a relatively accurate predictor of individual resident performance. There is no correlation between traditional interviews and SVI scores. Although the SVI was initiated to help demonstrate an applicant's interpersonal and communication skills, a face-to-face conversation is irreplaceable.
The following core competency is addressed in this article: Systems-based practice.

Keywords: Resident evaluation, resident interview, resident selection


How to cite this article:
Moon A, Yang A, Lorenzo C, Morley K, Melanson S, Jeanmonod R. Traditional interviewing with Visual Analog Scale predicts emergency medicine resident performance and does not correlate to the standardized video interview: A prospective cohort and cross-sectional study. Int J Acad Med 2020;6:116-20

How to cite this URL:
Moon A, Yang A, Lorenzo C, Morley K, Melanson S, Jeanmonod R. Traditional interviewing with Visual Analog Scale predicts emergency medicine resident performance and does not correlate to the standardized video interview: A prospective cohort and cross-sectional study. Int J Acad Med [serial online] 2020 [cited 2020 Sep 21];6:116-20. Available from: http://www.ijam-web.org/text.asp?2020/6/2/116/287950




Introduction


Emergency medicine has undoubtedly become one of the more competitive fields in medicine in recent years. As a specialty, it requires a high level of medical knowledge, the ability to adapt to changing circumstances, skill at multitasking and prioritization, and strong interpersonal communication and rapport-building skills.

During the application process, each program makes every attempt to recruit the most qualified residents to train as future professionals. Traditionally, measures of medical knowledge and test-taking ability, such as board examinations, course grades, and class rank, have featured prominently in ranking students for residency positions. These measures do not adequately assess other core skill sets, such as professionalism, communication skills, and behavior. Letters of recommendation bridge this gap to some degree with subjective personal commentary on student performance. Even so, according to a 2016 survey by the Association of American Medical Colleges, program directors were least satisfied with current tools for evaluating student characteristics such as integrity, reliability, and professionalism.[1]

Emergency medicine attempted to address this issue in the 2017 residency application cycle by being the first specialty to implement the Standardized Video Interview (SVI). The purpose of the SVI is to provide objective information about an applicant's knowledge of professional behaviors and interpersonal and communication skills.[2] The interview consists of six questions, either behavioral or situational, each graded on a scale of 1–5, for a maximum of 30 points. This score is reported as a numerical value to aid in applicant selection. As with any new endeavor, the utility and impact of this project have yet to be determined.
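As described above, an SVI total is simply the sum of six question ratings, each from 1 to 5. A minimal sketch in Python (the function name and example ratings are illustrative; the actual AAMC scoring rubric is not reproduced here):

```python
def svi_total(ratings):
    """Sum six question ratings (1-5 each) into an SVI total (6-30)."""
    if len(ratings) != 6 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected six ratings, each between 1 and 5")
    return sum(ratings)

# Hypothetical ratings for one applicant
print(svi_total([3, 4, 3, 2, 4, 3]))  # 19, near this cohort's average of 19.4
```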

Objectives

In our training program, we have been utilizing a 100-mm Visual Analog Scale (VAS) scoring system, developed by the resident applicant interview team, to rank applicants for 8 years. Our objectives were twofold. First, we assessed the validity of our VAS by comparing it with actual resident performance as rated by faculty blinded to the initial VAS. Second, we compared our 100-mm VAS with the SVI scores obtained through the Electronic Residency Application Service (ERAS).


Materials and Methods


Study design

This study had two phases. The first phase was a before-and-after cohort study of six consecutive residency classes (eight residents per class) from a single emergency medicine residency program. The second phase of this study was a prospective cohort of 124 emergency medicine residency candidates interviewing at the same program during the 2017–2018 cycle. The study was reviewed by the institutional review board (IRB) and found to be exempt. It is consistent with STROBE guidelines.

Study setting and population

The study was conducted at a community-based Level I trauma center that hosts an emergency medicine residency program. The core faculty at the study institution is composed of nine individuals, all of whom are board certified in emergency medicine. A single faculty member left the study institution during the before-and-after cohort phase and was replaced by a new faculty member. All other faculty members were involved in all phases of the study.

The residents participating in the before-and-after cohort phase of this study included individuals from the Northeast, South, West, and Midwest United States. Approximately 35% were DOs, and 30% were women. Interns were excluded because we felt that a single year was inadequate to assess resident performance; many residents come into their own during their 2nd year of training.

The students interviewing at the study institution for residency positions in the 2017–2018 cycle of the Accreditation Council for Graduate Medical Education match were included. The residency program typically receives about 800 applications for 8 residency positions and interviews 120–130 applicants per cycle. The interviewees are almost exclusively graduates of American medical schools and represent both allopathic and osteopathic medical programs.

Study protocol and measurements

During applicant interviews, each applicant is rated on a 10-cm VAS by three interviewing faculty members. The VAS has no internal numerical designations but provides the narrative anchors “bottom of rank list” and “top of rank list” at the 0- and 10-cm positions, respectively. Faculty members are instructed to score each applicant by placing a single vertical hash mark on the scale to correspond to where they would place the applicant on a rank list. The mean of the three VAS scores is recorded as the applicant's interview score. All core faculty members participate in interviewing applicants and are familiar with the VAS. Faculty are not blinded to applicant academic performance or letters of recommendation during interview scoring, but there is no standardization as to how those measures are applied during the interview. These parameters are taken into account separately, both during selection of interview candidates and in formulation of the rank list.
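The interview-score computation described above reduces to averaging the three faculty VAS marks. A sketch in Python (the function name and example marks are ours, not the study's):

```python
def interview_score(vas_marks_mm):
    """Return an applicant's interview score: the mean of the three
    faculty VAS marks, each measured in mm on the 0-100 scale."""
    if len(vas_marks_mm) != 3:
        raise ValueError("each applicant is scored by three faculty members")
    if not all(0 <= m <= 100 for m in vas_marks_mm):
        raise ValueError("VAS marks must lie on the 100-mm scale")
    return sum(vas_marks_mm) / 3

# Three hypothetical faculty marks, in mm
print(interview_score([62.0, 71.5, 68.5]))  # 67.33...
```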

For the before-and-after phase of this study, researchers recorded the interview VAS scores of applicants who ultimately matched at the study institution over the course of 6 consecutive years. This value represented the “before” measurement for each resident. All current core faculty members (n = 10) were then asked to complete an identical VAS for residents who had completed 2 or more years of training. This value represented the “after” measurement for each resident. Core faculty members were blinded to each resident's interview VAS score. A single faculty position turned over during the study period; the remaining nine core faculty members had been core faculty throughout. We determined a priori that a 2-cm difference would represent a clinically relevant difference in resident performance.

For the prospective cohort phase of this study, applicants' interview scores were compiled as described above and compared to the Standardized Video Interview score provided by ERAS.

Data analysis

For the before-and-after cohort phase of this study, each resident's VAS scores from his/her interview and after at least 2 years of training were compared using the Wilcoxon signed-rank test for repeated measures. In addition, the pooled before and after scores were compared using the Mann–Whitney U-test.

For the prospective cohort phase of this study, applicants' VAS scores were compared to the SVI scores provided by ERAS using a correlation coefficient.
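A minimal sketch of these comparisons using SciPy, with synthetic data standing in for the study's scores (the paper does not specify which correlation coefficient was used; Pearson is assumed here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic paired VAS scores (mm): at interview vs. after >= 2 years of training
before = rng.normal(69, 13, size=51).clip(0, 100)
after = (before + rng.normal(0, 10, size=51)).clip(0, 100)

# Paired comparison over time: Wilcoxon signed-rank test
w_stat, w_p = stats.wilcoxon(before, after)

# Pooled (unpaired) comparison: Mann-Whitney U test
u_stat, u_p = stats.mannwhitneyu(before, after)

# VAS vs. synthetic SVI scores (this cohort reported a range of 13-27)
svi = rng.integers(13, 28, size=51)  # upper endpoint is exclusive
r, r_p = stats.pearsonr(before, svi)

print(f"Wilcoxon P = {w_p:.2f}; Mann-Whitney P = {u_p:.2f}; r = {r:.2f}")
```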


Results


Fifty-one residents at the postgraduate year 2 level or higher had VAS data available from their interviews and participated in the before-and-after phase of this study. Their mean interview VAS score before starting their residencies was 69.2 (standard deviation: 12.7, normally distributed), with a range of 38.5–94.3. Their performance VAS score after at least 2 years of training had a median of 74.2 (interquartile range: 63.2–79.1), with a range of 13.2–90.1. There was no statistically significant difference in VAS between these two groups (P = 1, Mann–Whitney). Using the Wilcoxon signed-rank test for repeated measures, there was no statistically significant difference over time (P = 0.95).

Looking descriptively at individual data points, 39 residents had less than a 2-cm change in their VAS from their preresidency interview to their VAS after at least 2 years of training. Among residents whose VAS scores increased, the mean increase was 13.3 mm (range: 1.6–41.3). Among residents whose VAS scores decreased, the mean decrease was 16.4 mm (range: 1.4–65.8). Only four residents had VAS scores that decreased by more than 2 cm [Figure 1].
Figure 1: Graph depicting the overall change in resident Visual Analog Scale score from their initial entrance interview into residency to their Visual Analog Scale after at least 2 years of training, as assessed by three core faculty members



One hundred and twenty-four total applicants from the 2017–2018 cycle completed the prospective cohort phase of this study. Their VAS scores ranged from 5 to 91.7 mm, with a mean of 60 mm (P = 0.001 compared with the VAS scores of matched residents; both data sets were normally distributed). The applicants' SVI scores ranged from 13 to 27, with an average of 19.4. The correlation between the SVI and our VAS was weakly negative, with a correlation coefficient of −0.1 (P = 0.27). The applicant with our lowest VAS score of 5 mm had an SVI score of 23, and the applicant with our highest VAS score of 91.7 mm had an SVI score of 20.


Discussion


We found no correlation between traditional interviews and SVI scores. Traditional interviewing, however, remains a valuable tool in resident selection and a reliable predictor of resident performance.

From the points of view of both a residency program and an applicant, finding the right match in an emergency medicine training program is an imperfect and expensive endeavor. On average, applicants apply to 42 programs and interview at 13.7, at a cost of $5000–$8000 related almost entirely to transportation and lodging (without consideration of intangible costs, such as time).[3],[4] Emergency medicine programs spend, on average, $210,649.04 each year on selecting and interviewing candidates.[4] Of this total, about 20% is spent on reviewing and selecting applicants, 73% on conducting interviews, and the remainder on social events related to the interview process.[4]

Asynchronous standardized video interviewing is appealing because it creates objective data on important yet difficult-to-measure qualities, such as professionalism and interpersonal skills, potentially saving programs the cost of interviewing candidates who are not suitable. In addition, if shown to be a reliable predictor of resident performance, asynchronous standardized interviewing could replace traditional interviewing, saving applicants the cost and hassle of traveling to a large number of programs for the opportunity to match.

Unfortunately, the SVI as currently implemented does not appear to be the solution to the rapidly growing expense of the residency interview season. Although it may decrease the likelihood of an applicant receiving a traditional interview invitation, it has not been shown to correlate well with faculty and patient ratings of communication and professional skills in actual clinical practice.[5],[6] Other studies have found that the SVI has a small positive correlation with traditional interview impressions and with performance on standardized letters of recommendation, but these studies were not designed to assess clinical performance.[7],[8],[9] In our study, we found a small negative correlation between the SVI and our interview assessments, adding to the body of literature suggesting that the SVI is not a helpful tool.

In contrast, our study suggests that interviewing an applicant in an unstructured format provides a reliable indicator of future performance as a resident. Our interview scoring does not attempt to parse out “professionalism,” “communication skills,” “core values,” “fit,” or any other specific categories, but merely asks for a global assessment on a 10-cm VAS representing a theoretical rank order list. Although at face value this would appear to be a very imprecise tool, the global VAS assessment resulted in very few negative surprises in residency performance: fewer than 8% of residents dropped more than 2 cm after 2 years of training, regardless of their initial score.

One can speculate as to why the SVI has been unhelpful and why traditional interviewing remains a mainstay of resident selection. The SVI divides its assessment into situational and behavioral components, and all interview questions have been thoroughly vetted; at face value, this would appear to be adequate. However, questioning an applicant about communication and professionalism asynchronously does not equate to actually communicating with the applicant. During an interview, the interviewer has the opportunity to witness the applicant's communication skills firsthand and to interact with him or her in a professional way. This allows direct experience of not only the applicant's verbal communication but also his or her nonverbal communication skills. Nonverbal communication plays an important role in patient satisfaction, rapport building, and interpersonal relationships and is a critical component of empathy.[10],[11] The SVI can assess these critical nonverbal communication skills only indirectly. These skills, which are difficult to measure and not necessarily categorizable using traditional metrics, are likely captured by a global VAS.

It is appealing to standardize the interview process to produce measurable and reproducible outcomes for “soft” skills such as communication and professionalism. This would lend credibility to resident applicant selection and make the process more straightforward for applicants. Current technology, however, is inadequate to replace face-to-face interviewing. Furthermore, the SVI, although initiated with good intentions, does not act as a useful supplement to the traditional process.

Limitations

This study has a number of limitations. For the before-and-after cohort, faculty members were asked to complete a VAS with the narrative anchors “top of rank list” and “bottom of rank list” after residents had completed a minimum of 2 years of training. Therefore, some residents (n = 8) had not finished their training at the time of the study. It is possible that some of these were “late bloomers” who received a lower VAS than they might have had they completed all 3 years of training. In addition, in assessing residents who had already graduated, it is conceivable that faculty may have forgotten information and details that might have caused them to assess those residents either more or less favorably. Although it is conceivable that faculty may have recalled the VAS they assigned a given resident at the prematch interview, this is exceedingly unlikely, given the number of interviews performed per season and the number of residents trained. In addition, the study was performed at a single residency training site and may not be generalizable to other training facilities.

The prospective cohort phase comparing the VAS and SVI used only a single application cycle. This small number of applicants may not accurately represent all medical students in the nation applying to emergency medicine residencies. In addition, in assigning a VAS, our faculty members are not blinded to applicant test scores or letters of recommendation, which are not taken into consideration in scoring the SVI; this may have contributed to the lack of correlation between the two tools. Finally, the SVI, which will no longer be used, has only a short history, and we were therefore unable to use it as a predictor of actual resident performance, as very few of our residents completed an SVI.


Conclusions


Our data indicate that there is no correlation between traditional interviews and SVI scores, but that traditional interviewing is a valuable tool in resident selection. Traditional interviewing with a global VAS is a reliable predictor of resident performance, with most residents remaining within 2 cm of their initial assessment after 2 or more years of training.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

Research quality and ethics statement

The authors of this manuscript declare that this scientific work complies with reporting and quality, formatting, and reproducibility guidelines set forth by the EQUATOR Network. The authors also attest that this clinical investigation was determined to not require IRB/Ethics Committee review, and the corresponding protocol/approval number is not applicable.



 
References

1. Dunleavy D, Geiger T, Overton R, Prescott J. Results of the 2016 Program Directors Survey: Current Practices in Residency Selection. Association of American Medical Colleges; 2016. Available from: https://store.aamc.org/results-of-the-2016-program-directors-survey.html. [Last accessed on 2020 Aug 06].
2. The AAMC Standardized Video Interview: Essentials for the ERAS 2020 Season. Association of American Medical Colleges; 2019. Available from: https://students-residents.aamc.org/applying-residency/applying-residencies-eras/aamc-standardized-video-interview/. [Last accessed on 2020 Aug 06].
3. Blackshaw AM, Watson SC, Bush JS. The cost and burden of the residency match in emergency medicine. West J Emerg Med 2017;18:169-73.
4. van Dermark JT, Wald DA, Corker JR, Reid DG. Financial implications of the emergency medicine interview process. AEM Educ Train 2017;1:60-9.
5. Husain A, Li I, Ardolic B, Bond MC, Shoenberger J, Shah KH, et al. The standardized video interview: How does it affect the likelihood to invite for a residency interview? AEM Educ Train 2019;3:226-32.
6. Hall MM, Lewis JJ, Joseph JW, Ketterer AR, Rosen CL, Dubosh NM. Standardized video interview scores correlate poorly with faculty and patient ratings. West J Emerg Med 2019;21:145-8.
7. Chung AS, Shah KH, Bond M, Ardolic B, Husain A, Li I, et al. How well does the standardized video interview score correlate with traditional interview performance? West J Emerg Med 2019;20:726-30.
8. Hopson LR, Dorfsman ML, Branzetti J, Gisondi MA, Hart D, Jordan J, et al. Comparison of the standardized video interview and interview assessments of professionalism and interpersonal communication skills in emergency medicine. AEM Educ Train 2019;3:259-68.
9. Hopson LR, Regan L, Bond MC, Branzetti J, Samuels EA, Naemi B, et al. The AAMC standardized video interview and the electronic standardized letter of evaluation in emergency medicine: A comparison of performance characteristics. Acad Med 2019;94:1513-21.
10. Vogel D, Meyer M, Harendza S. Verbal and non-verbal communication skills including empathy during history taking of undergraduate medical students. BMC Med Educ 2018;18:157.
11. Kraft-Todd GT, Reinero DA, Kelley JM, Heberlein AS, Baer L, Riess H. Empathic nonverbal behavior increases ratings of both warmth and competence in a medical context. PLoS One 2017;12:e0177758.

