CONFERENCE ABSTRACTS AND REPORTS
Year : 2020  |  Volume : 6  |  Issue : 3  |  Page : 234-277

The 2020 St. Luke's University Health Network Annual Research Symposium: Event highlights and scientific abstracts


1 Department of Anesthesiology, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA
2 Department of Research & Innovation, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA
3 Department of Graduate Medical Education, Data Management and Outcomes Assessment, St. Luke's University Health Network – Richard A. Anderson Campus, Easton, Pennsylvania, USA

Date of Submission: 25-Aug-2020
Date of Acceptance: 03-Sep-2020
Date of Web Publication: 26-Sep-2020

Correspondence Address:
Dr. Stanislaw P Stawicki
Department of Research & Innovation, St. Luke's University Health Network, Bethlehem, Pennsylvania
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/IJAM.IJAM_115_20


How to cite this article:
Pellegrino AN, Birk R, Kaur P, Stawicki SP. The 2020 St. Luke's University Health Network Annual Research Symposium: Event highlights and scientific abstracts. Int J Acad Med 2020;6:234-77

How to cite this URL:
Pellegrino AN, Birk R, Kaur P, Stawicki SP. The 2020 St. Luke's University Health Network Annual Research Symposium: Event highlights and scientific abstracts. Int J Acad Med [serial online] 2020 [cited 2020 Oct 27];6:234-77. Available from: https://www.ijam-web.org/text.asp?2020/6/3/234/296138




  Background Information and Event Highlights


The Annual St. Luke's University Health Network (SLUHN) Research Symposium was established in 1992 to showcase research and quality improvement projects by residents, fellows, and other trainees. The event is organized by a multi-departmental planning committee, with collaboration and consultation provided by graduate medical education (GME) leadership, medical school leadership, and residency and fellowship faculty. Residents, fellows, and students submit an application for a podium (8-min) or quick shot (5-min) presentation, along with an accompanying abstract describing their project or case report.

This year's event featured the largest number of podium and quick shot presentations in SLUHN's 28-year Research Symposium history. Before the event, each submitted project was assessed by at least two independent judges for overall scientific quality (60% of the score). This was followed by a live audience vote for the best presentation (40% of the score). Based on this methodology, prizes were awarded to the top three podium presenters and to the best quick shot presenter. Since 2018, students from the Temple/St. Luke's Medical School have been invited to participate in the Research Symposium, and this year's event was the first to feature a scientific competition prize for the best presentation by a medical student.
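The 60/40 weighting described above can be illustrated with a brief sketch. This is a hypothetical helper written only to show the arithmetic of combining judges' scores with the audience vote; it is not the scoring software actually used at the event, and the 0-100 scales are assumed purely for illustration.

```python
def composite_score(judge_scores, audience_share):
    """Hypothetical illustration of the symposium's 60/40 weighting.

    judge_scores: scores from the (>=2) independent judges, assumed 0-100.
    audience_share: fraction of the live audience vote won, 0.0-1.0.
    """
    judge_component = sum(judge_scores) / len(judge_scores)  # mean judge score
    audience_component = audience_share * 100                # rescale to 0-100
    return 0.60 * judge_component + 0.40 * audience_component


# Example: two judges score a project 85 and 90, and the presenter
# receives 30% of the live audience vote.
print(round(composite_score([85, 90], 0.30), 1))  # 64.5
```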

The 2020 Research Symposium winners were as follows:

  1. Podium presentations


    1. First place – Amanda Gifford, MD (General Surgery Residency), “Disproportionately affected and underinsured: Trauma and violence in young African American and Latino men extend beyond major urban centers.”
    2. Second place – Farrah Harmouch, MD (Internal Medicine Residency, Bethlehem), “Association of metabolic syndrome with diverticulosis and internal hemorrhoids in geriatric patients ages 75 years and older.”
    3. Third place – Kyle Dammann, MD (General Surgery Residency), “The use of liposomal bupivacaine in transversus abdominis plane blocks for postoperative pain control.”


  2. Quick shot presentation


    1. First place – Meagan Corrigan, DO (Emergency Medicine Residency, Bethlehem), “Lidocaine for treatment of acute pain syndromes.”


  3. Medical student presentation


    1. First place (jointly) – Jessica Fleischer and Rachel Pallay (Temple/St. Luke's Medical School), “Risk factors associated with poor outcomes in younger COVID-19 hospitalized patients.”


As in the previous 4 years, the 2020 Research Symposium included Keynote Speakers. This year's invitees were Prof. Manish Garg, MD, Residency Program Director from the Departments of Emergency Medicine, Weill Cornell Medicine, Columbia University College of Physicians and Surgeons, New York, NY; and Dr. Shikha Kapil, MD, from the Departments of Critical Care and Emergency Medicine, MedStar Washington Hospital Center and MedStar Georgetown University Hospital, Washington, DC. The joint Keynote presentation discussed the importance of research and evidence-based approaches in clinical management of patients with coronavirus disease 2019 (COVID-19). Guest speakers described their institutional experiences with the ongoing pandemic, provided valuable insights and perspectives on effective teaching during this challenging time, and emphasized the importance of critical thinking when translating evidence-based literature into bedside practice. The Keynote Address concluded the morning session of the Research Symposium.

The afternoon session of the event included presentations from various departments that directly and indirectly support research and scholarly activity at SLUHN. This highly informative session featured content from the following areas: Clinical Trials, the Institutional Review Board, GME Data Management and Outcomes Assessment, Temple/St. Luke's Medical School, Information Technology (IT)/St. Luke's Technology Ventures, Knowledge Management, Nursing/Evidence-Based Practice, Physical Therapy, Quality and Safety, Narrative Medicine, and other specialties, departments, and topics. Furthermore, due to the ongoing COVID-19 pandemic and the associated restrictions on large gatherings, the 2020 Research Symposium took place entirely online. With the help of the St. Luke's IT Media Team, we were able to conduct the largest and most complex virtual event in the Network's history.


  Abstract Number 1


Appropriateness and Efficiency of Diagnostic Imaging Orders Recommended by Physical Therapists

K. G. Patrick 1, S. M. Kareha 1

1 St. Luke's Orthopaedic Physical Therapy Residency, Allentown, PA, USA

Introduction: Spine-related pain is a major problem worldwide, resulting in high medical expenditures and frequent disability. Physical therapists are equipped with the knowledge and skills to make appropriate referrals for diagnostic imaging, which can reduce cost and improve outcomes by increasing efficiency in care delivery.

Methods: Consecutive patients who consulted with a physical therapist first within the Comprehensive Spine Program at SLUHN were included in this retrospective analysis. The appropriateness of the order recommendation by the physical therapist was determined by chart review and compared to the gold standard of the American College of Radiology (ACR) Appropriateness Criteria ®. To determine the efficiency of physical therapist ordering, the number of physician visits and days between physical therapist recommendation and physician order placement were analyzed.

Results: Physical therapists placed imaging order recommendations in 15 out of 1164 cases. Physical therapists ordered the appropriate diagnostic imaging modality 94% of the time. The median number of physician visits between physical therapist recommendation and formal ordering was 1 visit (range 1–3). The median number of days between physical therapist recommendation and formal ordering by the physician was 15 days (range 1–104).

Conclusion: The results of this study demonstrate that given the authority to order diagnostic imaging, physical therapists do not over-utilize diagnostic imaging and order in adherence to the ACR Appropriateness Criteria ®. Furthermore, the authorization for physical therapists to order diagnostic imaging would improve the efficiency of care for those patients who need further diagnostic testing.


  Abstract Number 2


Enhancing Providers' Cultural Competency of the Lesbian, Gay, Bisexual, Transgender, Queer, or Intersex Population: A Mixed-Method Intervention Study

R. H. Markson 1, N. Defenbaugh 1, P. Kaur 1, A. Rhoads 1

1 Family Medicine Residency, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Sexual and gender minorities (SGMs) experience tremendous social and health inequities compared to the general population. This topic is even more relevant given the likely impact of COVID-19 on SGM patients.[1],[2],[3],[4] Furthermore, SGM individuals such as members of the lesbian, gay, bisexual, transgender, queer, or intersex (LGBTQI) community are at a disproportionate risk of many health conditions.[2],[3],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14] For instance, transgender individuals are three times as likely to contract HIV.[6] Many LGBTQI patients avoid seeking medical care due to the barriers they face in healthcare, such as issues with insurance, fear of discrimination, and lack of provider knowledge and cultural competence about LGBTQI issues.[1],[8],[9],[11] Medical education plays an essential role in ensuring inclusive healthcare; however, many medical training programs fail to provide adequate education regarding how to effectively treat members of the LGBTQI community. A recent study suggests that the mean time spent on this topic in medical school is only 5 h.[10] In this study, we provided a Medical Education Grand Rounds (MEGR) presentation entitled, “Foundations of Creating a Safe Space for LGBTQI in Healthcare” to St. Luke's healthcare providers. The aim was to demonstrate, through quantitative and qualitative analysis, the ability of MEGR to enhance awareness and attitudes, increase knowledge, and inspire behavioral change toward more inclusive healthcare for our LGBTQI community.

Methods: This mixed-method study used quantitative and qualitative approaches. The study participants included St. Luke's employees who attended the education session, whether in person or remotely. Participants were given pre- and post-surveys reflecting some of the themes and topics in the presentation [See Tables]. The survey used was a shorter, modified version of a validated 32-question survey.[12] The presurvey included 12 questions assessing providers' knowledge (2 questions), attitudes (4 questions), and practice behaviors (6 questions) in regard to the LGBTQI population. The prelecture survey also included eight demographic questions. The postlecture survey included the same 12 questions as the presurvey with an additional open-ended question, “How do you envision this session will impact your future patient care?” Surveys were collected using HIPAA-compliant data collection software, REDCap (Vanderbilt University, Nashville, TN). All statistical analyses were conducted using IBM SPSS for Windows Version 18 (IBM Corp., Armonk, NY). Wilcoxon signed-rank tests were performed to determine the effect of the LGBTQI presentation on the knowledge, attitudes, and behaviors (KAB) of the participants. A P < 0.05 was considered statistically significant. For the qualitative analysis of the open-ended question, we conducted content analysis. Two of the authors performed a multi-stage inductive analysis that included open coding, axial coding, and inter-rater reliability checks to consolidate themes. Using a deductive approach, the final coding scheme applied to the data was based on the KAB features identified in Tamargo et al.'s study.[12]
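To illustrate the paired pre-/post-survey comparison described above, the following minimal sketch runs a Wilcoxon signed-rank test in Python with SciPy rather than the SPSS workflow the authors used; the 5-point Likert responses below are hypothetical, not study data.

```python
from scipy.stats import wilcoxon

# Hypothetical paired responses to one Likert item (1-5), same raters pre and post.
pre  = [2, 3, 3, 2, 4, 3, 2, 3, 4, 2, 3, 2]
post = [4, 4, 4, 3, 5, 4, 3, 4, 5, 3, 4, 4]

# Wilcoxon signed-rank test on the paired differences; any zero differences
# would be dropped under the 'wilcox' zero-difference handling. P < 0.05 would
# be treated as significant, mirroring the threshold stated in the abstract.
statistic, p_value = wilcoxon(pre, post, zero_method="wilcox")
print(f"W = {statistic:.1f}, p = {p_value:.3f}")
```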

Results: A total of 96 participants completed the surveys. Missing values were not included in the analysis. Most participants were medical residents (72.3%), followed by medical students and fellows. For gender, 55.3% were female and 44.7% male; for sexual orientation, 92.4% were heterosexual, with small numbers identifying as sexual minorities (e.g., lesbian = 3.16%). There were differences among demographic groups; however, none were statistically significant. We saw statistically significant changes in the distribution of answers for both knowledge questions (P < 0.01 for both); two of the four attitudes questions (P = 0.02 for both); and four of the six behavior questions (P < 0.01 for all four) [Table 1]. Postsession, 94% felt comfortable treating the LGBTQI population compared to 86% prior [Table 2]. Among the participants (n = 44) who responded to the qualitative question, content analysis revealed that the majority (n = 38) saw an impact of the presentation on their future practice (group labeled “change”). The minority (n = 8) saw either no impact or no change (group labeled “unchanged”). The change group revealed three major themes that reflected the survey question categories (KAB) [Table 3]. Attitudes had four subthemes and behavior had two subthemes, with none for knowledge (only one comment was made). The unchanged group was predominantly made up of surgical specialties, and only two themes emerged, attitudes and behavior, with one subtheme revealed for each [Table 4].

Conclusion: Our study demonstrates that educating healthcare providers improves cultural competency by increasing knowledge and awareness and by inspiring change toward more inclusive practices for treating our LGBTQI patient population. It has implications for requiring and integrating LGBTQI education into medical education curricula; 84% of participants supported mandatory education at St. Luke's. Certain subspecialties might benefit from a tailored curriculum, as they did not see the same relevance to their practice as other specialties. Future research should focus on establishing differences in LGBTQI competency among medical specialties, and education sessions should be modified accordingly. Since we have yet to conduct a 6-month follow-up survey, future research should also assess whether the behavioral changes endorsed in survey responses were put into practice and maintained, as well as their impact on patient care.










  References


  1. Ayhan CH, Bilgin H, Uluman OT, Sukut O, Yilmaz S, Buzlu S. A systematic review of the discrimination against sexual and gender minority in health care settings. Int J Health Serv 2020;50:44-61.
  2. Blosnich JR, Farmer GW, Lee JG, Silenzio VM, Bowen DJ. Health inequalities among sexual minority adults: Evidence from ten U.S. States, 2010. Am J Prev Med 2014;46:337-49.
  3. Hu SS, Neff L, Agaku IT, Cox S, Day HR, Holder-Hayes E, et al. Tobacco product use among adults - United States, 2013-2014. MMWR Morb Mortal Wkly Rep 2016;65:685-91.
  4. COVID-19: Experts Highlight LGBTI Discrimination, Antisemitism; 17 April, 2020. Available from: https://news.un.org/en/story/2020/04/1062042. (Last accessed on 2020 May 27).
  5. Durso LE, Gates GJ. Serving Our Youth: Findings from a National Survey of Service Providers Working with Lesbian, Gay, Bisexual, and Transgender Youth Who Are Homeless or At Risk of Becoming Homeless. Los Angeles, CA: University of California; 2012.
  6. CDC. HIV and Transgender People. Available from: https://www.cdc.gov/hiv/group/gender/transgender/index.html. (Last updated on 2019 Nov 09; Last accessed on 2020 Feb 09).
  7. CDC. HIV and Gay and Bisexual Men. Available from: https://www.cdc.gov/hiv/group/msm/index.html#. (Last updated on 2019 Nov 12; Last accessed on 2020 Feb 09).
  8. Institute of Medicine. The Health of Lesbian, Gay, Bisexual, and Transgender (LGBT) People: Building a Foundation for Better Understanding. Washington, DC: National Academies Press; 2011.
  9. James S, Herman J, Rankin S, Keisling M, Mottet L, Anafi MA. Executive Summary of the Report of the 2015 U.S. Transgender Survey. Washington, DC: National Center for Transgender Equality; 2016.
  10. Obedin-Maliver J, Goldsmith ES, Stewart L, White W, Tran E, Brenman S, et al. Lesbian, gay, bisexual, and transgender-related content in undergraduate medical education. JAMA 2011;306:971-7.
  11. Safer JD, Coleman E, Feldman J, Garofalo R, Hembree W, Radix A, et al. Barriers to health care for transgender individuals. Curr Opin Endocrinol Diabetes Obes 2016;23:168-71.
  12. Tamargo CL, Quinn GP, Sanchez JA, Schabath MB. Cancer and the LGBTQ population: Quantitative and qualitative results from an oncology providers' survey on knowledge, attitude, and practice behaviors. J Clin Med 2017;6:93.
  13. The Lives and Livelihoods of Many in the LGBTQ Community Are At Risk Amidst COVID-19 Crisis. Available from: https://assets2.hrc.org/files/assets/resources/COVID19-IssueBrief-032020-FINAL.pdf?_ga=2.35731746.341300536.1590630749-601141570.1588686223. (Last accessed on 2020 May 27).
  14. Valdiserri RO, Holtgrave DR, Poteat TC, Beyrer C. Unraveling health disparities among sexual and gender minorities: A commentary on the persistent impact of stigma. J Homosex 2019;66:571-89.



  Abstract Number 3


Blunt Aortic Dissection and Bilateral Internal Carotid Dissection in the Setting of Polytrauma

A. Shanker 1, A. L. Gifford 1, C. Bendas 1, R. Castillo 1

1 General Surgery Residency, Temple/St. Luke's Medical School, Bethlehem, PA, USA

Introduction: Traumatic mechanisms account for a relatively small percentage of reported vascular dissections and are associated with high morbidity and mortality. In cases of carotid artery dissection, trauma accounts for only approximately 4% of identifiable causes. Blunt cerebrovascular injuries involving a single internal carotid artery affect roughly 0.68%–0.86% of all trauma patients. Acute aortic dissection occurs at a frequency of about 3.5–6.0 per 100,000 patient-years, with only a small fraction attributable to traumatic causes. This report presents a rare case in which concomitant type B aortic dissection, bilateral internal carotid dissection, and multisystem injuries were sustained after blunt trauma. Treatment of these vascular injuries in the setting of severe polytrauma requires careful balancing of antiplatelet and anticoagulation therapy against the risks posed by solid organ injury, as well as multi-disciplinary collaboration.

CARE Statement: The authors of this manuscript declare that this scientific work complies with reporting quality, formatting, and reproducibility guidelines set forth by the EQUATOR Network. The authors also attest that appropriate patient consent was obtained prior to the publication of this abstract.

Case Scenario: A 67-year-old male was brought to a level I trauma center as an unrestrained driver involved in a single-vehicle collision with a tree. He was intubated in the field by emergency medical services (EMS) after extrication, for airway protection in the setting of decreased mental status. Advanced trauma life support protocol was followed. After the primary survey, a focused assessment with sonography for trauma (FAST) examination was positive in the left upper quadrant, and a chest X-ray revealed a widened mediastinum. The patient was hemodynamically stable [Table 1] and transported for computed tomography (CT) of his head, neck, chest, abdomen, and pelvis [Table 2]. Laboratory results and imaging revealed that the patient had sustained significant multisystem trauma with acute blood loss anemia [Table 2] and [Table 3]. His vascular injuries included a partially thrombosed dissection of the aorta extending from the level of the left subclavian artery to just proximal to the iliac bifurcation. Hypoenhancement of the right kidney secondary to dissection was also noted. Formal CT angiography (CTA) was performed on hospital day 1. The descending aortic dissection had an enlarging intramural hematoma with thrombosis of the false lumen to the inferior mesenteric artery, which was not flow limiting. A stable, known ascending aortic aneurysm was visualized. Furthermore, a contained brachiocephalic artery transection, local dissection of the distal cervical right internal carotid artery, and dissection of the left internal carotid artery into the cavernous portion were visualized. A multi-disciplinary approach including trauma, cardiothoracic, vascular, and neurosurgical teams was utilized due to the extent of injuries. The aortic dissection was managed conservatively with targeted blood pressure and heart rate control. The bilateral carotid artery dissections were managed with the initiation of antiplatelet therapy once the splenic injury was embolized. However, repeat imaging on hospital day 2 demonstrated new cortical infarcts in the occipital lobe. At this point, full anticoagulation was started.

Conclusion: The patient went on to have a lengthy intensive care unit (ICU) course requiring a pacemaker, tracheostomy, and gastrostomy tube placement. Repeat CTA of the head, neck, chest, abdomen, and pelvis on hospital day 7 was stable. He was eventually discharged to long-term rehabilitation following his ICU stay. He was placed on warfarin for the treatment of his carotid dissections and subsequent stroke. In this case, a multi-disciplinary approach was essential in formulating a treatment plan that weighed the patient's stroke risk against the bleeding risk inherent to polytrauma patients with vascular injuries.






  Abstract Number 4


Role of Intranasal Calcitonin in Charcot Neuroarthropathy

Y. Cha 1, J. C. McGovern 1, B. Bernstein 1

1 Podiatry Residency Program, Bethlehem, PA, USA

Introduction: Charcot neuroarthropathy (CNA) is a complex condition with heterogeneous treatment options; consequently, there is no definitive treatment algorithm. The standard of care for this process has been offloading and decreased or non-weight-bearing status. In addition to the neurovascular and neurotraumatic theories, increased osteoclastic activity has been proposed as a contributing factor in the destruction of bone associated with Charcot. Bisphosphonates have been shown to have a potential positive effect on bone turnover in patients with CNA; however, they are contraindicated in patients with kidney disease and may decrease bone remodeling. Intranasal salmon calcitonin has been considered for use in the osteopenia associated with CNA, due to its potential effect on the receptor activator of NF-kB ligand (RANK-L)/osteoprotegerin pathway. It has been proposed that an unregulated inflammatory process in patients with CNA leads to an increase in RANK-L.

Methods: A retrospective clinical study was conducted to evaluate the effect of intranasal salmon calcitonin as an adjunctive treatment for CNA. All patients were immobilized and offloaded per the standard of care. The temperature difference between the affected and unaffected lower extremities was then measured with an infrared thermometer at each office visit. Our results were then compared to those of Bem et al.'s study.[1] A total of 13 patients who reached a <2°C temperature difference were included in our study; we compared the time to maximum temperature decrease and overall temperature decrease rates to those in Bem et al. The sample size required for Bem et al.'s study was 32 patients, which gave a power of 80% to detect a difference of 15% between the intervention and control groups with a two-sided α of 0.05.

Results: The average time it took for our patients to reach a <2°C temperature difference was 103.7 days, compared to 90 days in Bem et al.'s study.[1] Thus far, subjects in our study had a temperature difference of 1.6°C ± 0.4°C, whereas subjects in Bem et al.'s study had a temperature difference of 1.5°C ± 0.5°C. The initial skin temperature difference was 4.1°C ± 1.4°C in our study and 3.6°C ± 0.8°C in Bem et al.'s study. Utilizing the same statistical assumptions as Bem et al. (80% power to detect a 15% difference), the sample size required for our study would be 22. Therefore, we are in the process of enrolling additional subjects beyond the 13 currently included.
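The sample-size reasoning cited above (two-sided α = 0.05, 80% power, 15% absolute between-group difference) can be sketched briefly in Python with statsmodels. The baseline proportion below is an assumption made only to make the calculation concrete; it is not a figure reported by either study, and the resulting n will vary with that assumption.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_control = 0.50              # assumed control-group proportion (illustrative only)
p_treated = p_control - 0.15  # 15% absolute difference to be detected

# Cohen's h effect size for two proportions, then solve for n per group.
effect = proportion_effectsize(p_control, p_treated)
n_per_group = NormalIndPower().solve_power(effect_size=effect,
                                           alpha=0.05,
                                           power=0.80,
                                           alternative="two-sided")
print(f"Approximately {n_per_group:.0f} subjects per group")
```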

Conclusion: Intranasal salmon calcitonin has been investigated for use in the osteopenia associated with CNA. Comparing our observations to previously published data, subjects had similar temperature differences at 90 days (1.6°C ± 0.4°C vs. 1.5°C ± 0.5°C, our study and Bem et al.'s study, respectively), although subjects in our study had a greater starting temperature difference.[1] Intranasal salmon calcitonin may be an effective adjuvant modality in preventing progression of the disease, especially in those with renal disease, although larger clinical trials will be needed to assess its role in acute CNA.




  Reference


  1. Bem R, Jirkovská A, Fejfarová V, Skibová J, Jude EB. Intranasal calcitonin in the treatment of acute Charcot neuroosteoarthropathy: A randomized controlled trial. Diabetes Care 2006;29:1392-4.



  Abstract Number 5


The Use of Liposomal Bupivacaine in Transversus Abdominis Plane Blocks for Postoperative Pain Control

K. Dammann 1, A. L. Gifford 1, R. Fontem 1, A. Ng Pellegrino 1

1 General Surgery Residency, Bethlehem, PA, USA

Introduction: Standard morphine derivatives used to treat postoperative pain are associated with delayed return of bowel function, increased length of stay, and unnecessary economic burden on the healthcare system. Ultrasound-guided transversus abdominis plane (TAP) blocks provide a means of nonnarcotic postoperative pain control when an epidural is not feasible, or when complicated laparoscopic procedures are converted to open surgery in the operating room (OR). Exparel® is a long-acting liposomal bupivacaine with a duration of up to 72 h that has been FDA approved for TAP block use. Here, we examined the effect of Exparel® TAP blocks on postoperative pain and outcomes in patients undergoing abdominal surgery.

Methods: This nonrandomized study was institutional review board (IRB) approved (SLUHN IRB# 2016-02). Patients aged >18 years undergoing open abdominal surgery with an American Society of Anesthesiologists (ASA) physical status of 1–3 were included; those with allergies to local anesthetics, advanced liver failure, pregnancy, or dementia were excluded. Consent was obtained before the procedure, and Exparel® was mixed with 0.25% bupivacaine and normal saline. The TAP block was performed under ultrasound guidance in the OR or in the postanesthesia care unit (PACU) using a Stimuplex® needle (B. Braun Medical, Allentown, PA) via a hydrodissection technique, ensuring 30 cc to each side of the TAP block. Pain scores were collected in the PACU and on postoperative days (POD) 0–5. Based on α = 0.05 and a power of 0.80, we required 20 Exparel® and 20 non-Exparel® patients to detect a 30% change in pain scores.

Results: Fifty-two patients underwent open abdominal surgery followed by Exparel® TAP blocks (n = 26) or standard opioid therapy (n = 26) [Figure 1]a. Fifty-two percent were male, and the mean age was 58 ± 17 years. With Exparel® treatment, fewer than 50% of patients required opioid morphine equivalents (OME) in the 61–90 range for the treatment of severe postoperative pain [Figure 1]b. Exparel® was also associated with decreased length of stay (5 ± 2 vs. 9 ± 7 days) [Figure 1]c, reduced incidence of ileus (3% vs. 27%), reduced nausea and vomiting (8% vs. 42%), fewer readmissions (4% vs. 12%), and fewer postoperative complications (23% vs. 54%) [Figure 1]d. Overall, the effect of Exparel® on OME was most profound on POD 2–5, ranging from a 41%–43% decrease in OME on POD 2–3 to a 68%–73% decrease on POD 4–5 [Figure 2]. Exparel® reduced the incidence of severe postoperative pain scores by 50% or greater from POD 0 to POD 4 [Figure 3]. Exparel® reduced pain scores by 75% on POD 2 and 50% on POD 3–4 in acute care surgery patients [Figure 4]a and by 40% on POD 2 in colorectal surgery patients [Figure 4]b and [Figure 4]c.

Conclusion: Exparel® TAP blocks improved postoperative outcomes at our institution by reducing OME and acute pain scores, resulting in reduced length of stay and fewer complications. In the future, randomized controlled trials should investigate whether TAP blocks are similarly effective on a larger scale.


  Abstract Number 6


Effect of Oral Vancomycin Prophylaxis on Clostridioides difficile Infection Recurrence in Patients Receiving Antibiotic Therapy

S. Gluhov 1, S. C. Depcinski 1, A. M. P. Kratz 1

1 Pharmacy Residency, Bethlehem, PA, USA

Introduction: Clostridioides difficile infection (CDI) is one of the most common healthcare-associated infections and is plagued by a high rate of recurrence. National guidelines do not routinely recommend oral vancomycin prophylaxis (OVP); however, limited retrospective data demonstrate an inconsistent but potential benefit of OVP for preventing recurrent CDI in patients with a history of CDI who are started on systemic antibiotics. The objective of this research was to assess the efficacy of OVP use in this setting at St. Luke's University Health Network (SLUHN).

Methods: This study retrospectively evaluated patients admitted to a SLUHN hospital between June 1, 2018, and September 30, 2019, with a medical history of CDI who were initiated on systemic antibiotics during the admission, with or without OVP. Patients were excluded if they were started on OVP more than 24 h after initiating systemic antibiotic therapy, were transferred to an outside facility before systemic antibiotic completion, expired within 48 h of admission, were duplicates, or had their OVP discontinued before completion of systemic antibiotic therapy. The primary objective was to evaluate CDI recurrence from initiation of systemic antibiotics until 30 days after their discontinuation. Categorical data were analyzed using the Chi-square or Fisher's exact tests; parametric and nonparametric continuous data were analyzed using the Student's t-test and the Mann–Whitney U-test, respectively. A P < 0.05 was used to determine statistical significance.
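The categorical and continuous comparisons described above can be illustrated with a short Python sketch using SciPy in place of the authors' statistical software; the 2x2 counts and duration values below are hypothetical and are not data from this study.

```python
import numpy as np
from scipy.stats import fisher_exact, mannwhitneyu

# Hypothetical 2x2 table: CDI recurrence (yes/no) by group (OVP vs. control).
# Fisher's exact test is the usual choice when expected cell counts are small;
# chi2_contingency would serve for larger tables.
table = np.array([[3, 51],    # OVP group:     3 recurrences, 51 without
                  [5, 43]])   # control group: 5 recurrences, 43 without
odds_ratio, p_cat = fisher_exact(table)
print(f"Fisher's exact p = {p_cat:.2f}")

# Hypothetical nonparametric comparison of antibiotic duration (days).
ovp_days     = [6, 7, 5, 8, 6, 9, 7]
control_days = [6, 6, 5, 7, 6, 8, 6]
u_stat, p_cont = mannwhitneyu(ovp_days, control_days)
print(f"Mann-Whitney U p = {p_cont:.2f}")
```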

Results: A total of 489 patients were screened, and 102 were included in the primary analysis. Of these, 54 patients were in the OVP group and 48 patients were in the control group. There was no significant difference between the two groups with regard to baseline demographics, level of care, in-hospital proton pump inhibitor (PPI)/histamine-2 receptor antagonist (H2RA) or probiotic use, total number of prior CDI episodes, or time since the last CDI. Type of systemic antibiotic exposure was similar, except that significantly more patients in the OVP group received third- and fourth-generation cephalosporins (83.3% vs. 47.9%, P < 0.01). Mean systemic antibiotic duration was statistically, but not clinically, significantly longer in the OVP group (6.24 vs. 6.12 days, P = 0.036). The primary outcome of CDI recurrence from the initiation of systemic antibiotics until 30 days after their completion occurred in only one patient in the OVP group and none in the control group (P = 0.34). CDI recurrence at 90 days from the initiation of systemic antibiotics was also similar between the OVP and control groups (3.7% vs. 0%, P = 0.53). No differences were seen in length of stay, 30-day mortality, vancomycin-resistant Enterococcus (VRE) infection/colonization, or hospital readmission.

Conclusion: Despite the high-risk nature of the patients included, the use of OVP at SLUHN was not associated with a lower incidence of CDI recurrence or improved outcomes. This was likely due to higher rates of third- and fourth-generation cephalosporin use in the OVP group and the very low baseline level of recurrent CDI seen at SLUHN compared to other studies (0% vs. 20%–30%). Given the inconsistency in results reported across various retrospective and limited prospective trials, larger prospective studies are needed to further determine the role of OVP in reducing the risk of CDI recurrence in patients receiving systemic antibiotic therapy.










  Abstract Number 7


Efficacy of Intravenous Lidocaine Infusion for Renal Colic versus Nonrenal Colic Pain in the Emergency Department

T. S. Bartol 1, J. T. Binstead 1, J. D. Miller 1, L. Koons 1

1 Pharmacy Residency, Bethlehem, PA, USA

Introduction: Studies have shown that intravenous (IV) lidocaine infusions are an effective analgesic for renal colic pain. However, their efficacy for other pain conditions is less conclusive. The objective of this study was to assess whether IV lidocaine can be used as an effective analgesic in the emergency department (ED) for nonrenal colic pain.

Methods: A retrospective chart review was completed that included patients treated at one of the ten hospital EDs within a community health network who received IV lidocaine for pain during a 26-month period. Pregnant patients and those who received IV lidocaine for a nonpain condition were excluded. Patients were classified into two groups: those diagnosed with renal colic pain and those with nonrenal colic pain. The primary outcome was need for rescue analgesia after the IV lidocaine infusion, and the secondary end points included 30-day ED revisit for the same diagnosis, oral morphine equivalents administered, opioid(s) prescribed at discharge, adverse effects, and change in pain score. Statistical analyses performed included the Chi-square or Fisher's exact tests for categorical data, and continuous data were interpreted using the Student's t-test or Mann–Whitney U-test, as appropriate.

Results: The study included 61 patients – 21 patients in the renal colic group and 40 patients in the nonrenal colic group. Baseline characteristics were similar between groups. The need for rescue analgesia after lidocaine infusion did not differ significantly between the renal colic and nonrenal colic groups (28.6% vs. 40.0%; P = 0.55). The use of opioids as a rescue analgesia also did not differ significantly between the groups (9.5% vs. 17.5%; P = 0.48). A change in pain score of ≥20% (57.1% vs. 45.0%; P = 0.53) and incidence of adverse events (9.5% vs. 20.0%; P = 0.47) were not significantly different between the groups. More patients in the renal colic group received opioid prescriptions at discharge than the nonrenal colic group (42.9% vs. 12.5%; P = 0.02).

Conclusion: Previous studies suggest that IV lidocaine is effective in relieving renal colic pain, with less conclusive results for nonrenal colic pain. Based on the results of this study, an IV lidocaine infusion may be a safe and effective analgesic option for nonrenal colic pain. Larger, prospective studies are warranted to confirm a definitive benefit in nonrenal colic patients.


  Abstract Number 8


Impact of Vancomycin with or without Antistaphylococcal Beta Lactam for Empiric Treatment of Intensive Care Unit Patients with MSSA Bacteremia

M. Lipski 1, S. C. Depcinski 1, A. N. Kester 1

1 Pharmacy Residency, Bethlehem, PA, USA

Introduction: Staphylococcus aureus is the leading cause of Gram-positive bacteremia in the United States and is associated with significant morbidity and mortality, with critically ill patients at the highest risk. Patients with a Charlson comorbidity index score ≥2 and a critical care unit admission are at even greater risk, with an estimated twofold increase in mortality. The high incidence of methicillin-resistant S. aureus warrants empiric treatment with vancomycin; however, vancomycin has been shown to be inferior to an antistaphylococcal beta lactam (ASBL), such as cefazolin or nafcillin, for the definitive treatment of methicillin-susceptible S. aureus (MSSA) bacteremia. Despite this, it is unknown whether ASBL use earlier in therapy impacts clinical outcomes. The purpose of this study was to determine if the addition of an ASBL to empiric vancomycin treatment can improve clinical outcomes in critically ill adults with MSSA bacteremia.

Methods: This retrospective cohort was submitted to the St. Luke's University Health Network (SLUHN) Institutional Review Board and approved as exempt. Critically ill adults with MSSA bacteremia who were admitted to a SLUHN Hospital between April 2018 and September 2019, empirically treated with vancomycin alone or vancomycin plus an ASBL, and definitively treated with an ASBL were included in this study. Patients who expired within 48 h of initial MSSA blood culture or presence of a polymicrobial bacteremia were excluded. The primary composite end point of clinical failure was assessed and defined as 90-day all-cause mortality, persistence of bacteremia at 3-7 days, or recurrence of MSSA bacteremia within 90 days. The following secondary outcomes were assessed: individual composite end points, time to blood culture clearance, hospital length of stay, critical care length of stay, acute kidney injury within 7 days, and Clostridioides difficile infection within 30 days. Chi-square test, Fisher's exact test, Student's t-test, and Mann–Whitney U-test were used to compare the treatment groups (significance set at a P < 0.05).

Results: A total of 29 patients were included for analysis (vancomycin plus ASBL combination, n = 19; vancomycin monotherapy, n = 10). Empiric treatment with combination therapy was not associated with a difference in the rate of clinical failure compared to monotherapy (57.9% vs. 60.0%, P = 1.00). Ninety-day mortality was not significantly lower with combination therapy compared to monotherapy (36.8% vs. 60.0%, P = 0.24). The rate of persistent bacteremia was not significantly higher in the vancomycin plus ASBL group compared to the vancomycin-alone group (26.7% vs. 14.3%, P = 0.64). There were no cases of recurrent MSSA bacteremia in either group.

Conclusion: In critically ill adults with MSSA bacteremia definitively treated with an ASBL, the early addition of an ASBL to empiric vancomycin did not result in significant improvement in the rate of clinical failure compared to vancomycin alone. However, due to the limited number of patients evaluated and multiple differences in baseline characteristics between the two groups, this topic warrants further research with large, multicenter cohort studies or randomized controlled trials.


  Abstract Number 9


Prehospital End-Tidal Carbon Dioxide Measurement as an Early Marker for Transfusion Requirement in Trauma Patients

N. Akers 1, B. R. Wilson 1, R. K. Jeanmonod 1

1 Emergency Medicine Residency, Bethlehem, PA, USA

Introduction: Below-normal end-tidal carbon dioxide (ETCO2) measurements are associated with worse outcomes in sepsis and trauma patients compared to patients with normal ETCO2. We sought to determine whether ETCO2 can be used in the prehospital setting to predict transfusion requirement, operative hemorrhage control, or mortality in the first 24 h after admission for trauma.

Methods: This is a retrospective cohort study at a suburban, academic level 1 trauma center. Patients were sequentially identified as prehospital trauma alerts from a single emergency medical services (EMS) system, which requires, per policy, ETCO2 measurement for all trauma patients. One year of prehospital data were collected and paired with hospital trauma registry data. ETCO2 values were compared between patients who required transfusion, underwent operative blood loss control, or died and those who did not.

Results: Two hundred and thirty-five trauma patients were transported via the study EMS system, of which 105 (44.7%) had documented ETCO2 values. The patients' mean age was 60 (standard deviation 24) years with 59 (56.2%) males [Table 1]. Three patients were intubated prehospital, and seven were intubated in the trauma bay. The mean prehospital ETCO2 for those who needed transfusion, surgery, or died (n = 11) was 25.7 (9.1) compared to 30.6 (7.8) for those who did not (P = 0.049). The optimal cutoff for our population was ETCO2 ≤27. Characteristics of the transfused group are shown in [Table 2]. The receiver operating characteristic curve is shown in [Figure 1].
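As a brief illustration of how a receiver operating characteristic (ROC) curve and an optimal cutoff such as the one reported above can be derived, the following Python sketch uses scikit-learn on hypothetical ETCO2 values; it is not the study's registry data, and Youden's J is shown as one common way to select a cutoff (the abstract does not state which criterion was used).

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical data: 1 = required transfusion/operative hemorrhage control/died.
outcome = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0])
etco2   = np.array([22, 25, 27, 33, 30, 35, 26, 24, 31, 36, 28, 29, 34, 32])

# Lower ETCO2 predicts the adverse outcome, so score with the negated value.
fpr, tpr, thresholds = roc_curve(outcome, -etco2)
auc = roc_auc_score(outcome, -etco2)

# Youden's J statistic (sensitivity + specificity - 1) selects the cutoff.
youden_j = tpr - fpr
best_threshold = thresholds[np.argmax(youden_j)]
print(f"AUC = {auc:.2f}, optimal cutoff: ETCO2 <= {-best_threshold:.0f}")
```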

Conclusion: Below normal ETCO2 values were associated with increased need for transfusion, operative intervention, and death. Further study is warranted to determine if ETCO2 outperforms other predictors of severe trauma.


  Abstract Number 10


Lidocaine for the Treatment of Acute Pain Syndromes

M. Corrigan 1, J. T. Binstead 1, L. Koons 1, K. W. Miller 1

1 Emergency Medicine Residency, Bethlehem, PA, USA

Introduction: Lidocaine is a medication more commonly used in cardiology and as a local anesthetic; however, recent journal articles suggest that lidocaine is also effective for systemic pain relief. Lidocaine is classified as an amide local anesthetic and exerts its effects by inhibiting impulse conduction in neurons, blocking voltage-gated sodium channels peripherally and centrally.[1] Lidocaine also has potent anti-inflammatory properties, and infusion of lidocaine is associated with decreased circulating pro-inflammatory cytokines.[2] Side effects of lidocaine toxicity range widely from metallic taste to central nervous system responses, including agitation, confusion, and seizure. However, infusion is generally considered safe at 0.5–2 mg/kg/h, and patients should be monitored by staff during infusion.[3] The first articles reporting analgesia from lidocaine described burn, oncological pain, and postoperative patients.[1],[2] More recently, separate papers have discussed lidocaine as a treatment for certain complaints in the emergency department, such as renal colic, migraine, acute limb ischemia, and chronic pain.[1],[2],[4] There is support for using lidocaine in the setting of opiate-refractory pain, including bowel obstruction and traumatic ankle injury, with a goal of reducing the use of opiate medications. This highlights an additional consideration for lidocaine in systemic pain relief: its use as an opioid-sparing pain medication.

Methods: This is a retrospective chart review of 44 patients who received intravenous lidocaine while in the emergency department for various painful complaints from July to December 2017. A specific electronic medical record order set was designed and followed before administration of this medication. This order set included the medication, dosing options, frequency, time of infusion, route, and nursing instructions. When the order set is selected, the recorded weight is used by default for dosing calculations, and lidocaine was dosed at 1.5 mg/kg. Time of infusion and route are not customizable and are set at 30 min intravenously. Nursing instructions include obtaining vital signs, oxygen saturation, and pain scores before the administration of lidocaine, every 15 min during infusion, and before disposition. Further safety instructions direct nursing staff to notify a physician and stop the infusion at signs of toxicity. Lidocaine toxicity was classified (from minimal to severe) using a pre-defined scale. In addition, if oxygen saturation decreases to 90%, per ordering protocol, nursing staff is to apply supplemental oxygen at 2 L/min, hold the lidocaine infusion, and notify the physician.
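A minimal sketch of the weight-based dosing arithmetic in the order set described above follows. It is a hypothetical illustration only, not the network's actual order-set logic; in particular, it applies no maximum-dose cap, because none is stated in the abstract.

```python
def lidocaine_order(weight_kg, dose_mg_per_kg=1.5, infusion_minutes=30):
    """Return (total dose in mg, infusion rate in mg/min) for the sketch above."""
    total_mg = dose_mg_per_kg * weight_kg
    rate_mg_per_min = total_mg / infusion_minutes
    return total_mg, rate_mg_per_min


# Example: an 80-kg recorded weight yields a 120 mg dose over 30 minutes.
total, rate = lidocaine_order(80.0)
print(f"Total dose: {total:.0f} mg over 30 min ({rate:.1f} mg/min)")
```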

Results: Of the 44 patients included in the study, 29 did not require treatment with a narcotic agent after infusion, while 15 patients received narcotic analgesia after lidocaine infusion. Two patients experienced brief, transient adverse reactions: one had transient hypotension, and one had a sensation of pins and needles with fatigue, which resolved without intervention. The most common complaints treated with lidocaine were flank pain (with and without known stones) and acute and chronic back pain. Other pain syndromes treated with intravenous lidocaine included sickle cell pain crisis, fibromyalgia, migraine, groin pain, abdominal pain, and constipation.

Conclusion: The most common complaints treated with lidocaine were flank pain (with and without known stones) and back pain. Other pain syndromes treated with intravenous lidocaine included sickle cell pain crisis, fibromyalgia, migraine, groin pain, abdominal pain, and constipation. The administration of intravenous lidocaine (per protocol) was a safe and effective adjunct to standard medical therapy in the treatment of acute pain syndromes.








  References


  1. Golzari SE, Soleimanpour H, Mahmoodpoor A, Safari S, Ala A. Lidocaine and pain management in the emergency department: A review article. Anesth Pain Med 2014;4:e15444.
  2. Kandil E, Melikman E, Adinoff B. Lidocaine Infusion: A Promising Therapeutic Approach for Chronic Pain. J Anesth Clin Res 2017;8:697.
  3. Eipe N, Gupta S, Penning J. Intravenous lidocaine for acute pain: An evidence-based clinical update. BJA Education 2016;16:292-8.
  4. Farahmand S, Hamrah H, Arbab M, Sedaghat M, Basir Ghafouri H, Bagheri-Hariri S. Pain management of acute limb trauma patients with intravenous lidocaine in emergency department. Am J Emerg Med 2018;36:1231-5.



  Abstract Number 11


A Unique Presentation of Severe Dysautonomia in Guillain–Barre Syndrome

A. Shaji 1, M. A. Ritenuti 1, D. Raheja 1

1 Neurology Residency Program, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Guillain–Barre syndrome (GBS) is an acute monophasic illness usually provoked by a preceding infection. Approximately 70% of patients present with dysautonomia, which can be severe in up to 20% of cases. Common symptoms include tachycardia, urinary retention, hypertension alternating with hypotension, orthostatic hypotension, bradycardia, arrhythmias, ileus, and anhidrosis. Here, we report a unique presentation of GBS with severe dysautonomia.

Case Scenario: A 76-year-old woman with a history of lumbar degenerative disease (previously treated with a spinal stimulator), diabetes, and hypertension presented with acute-onset, diffuse pain and paresthesias in all four extremities. She reported recent sinusitis and a current urinary tract infection (UTI). She was treated for the UTI and discharged with a diagnosis of diabetic neuropathy. She presented 3 days later with a generalized tonic–clonic seizure in the setting of accelerated hypertension (243/109 mmHg). Computed tomography (CT) of the head was unremarkable, and electroencephalogram revealed diffuse intermixed delta/theta slowing without epileptiform discharges. One week later, she presented again with acute-onset right gaze preference and right-sided weakness (National Institutes of Health Stroke Scale score 22). Rapid improvement of the focal deficits occurred within 20 min, with residual encephalopathy that progressed over the subsequent day into bizarre behaviors and delusions with visual ataxia, agnosia, and hallucinations. CT of the head (CTH) showed a subtle area of decreased attenuation within the right cerebellum. Repeat imaging 9 days later demonstrated bilateral occipital hypodensities. She subsequently developed urinary retention with obstructive hydroureter. Neurologic examination revealed weakness, areflexia, ataxia, and diminished vibration sensation in the distal aspects of all four extremities. She received 5 days of high-dose steroids, with improvement in cognition but persistent extremity symptoms.

Conclusion: Overall, the patient's clinical course is consistent with GBS. We suspect that she developed severe dysautonomia, which led to urinary retention, hyponatremia with SIADH-like features, and accelerated hypertension, ultimately producing posterior reversible encephalopathy syndrome (PRES) manifesting as encephalopathy/delirium, visual disturbances/hallucinations, and seizures. The high-dose pulse intravenous (IV) steroids initially used to empirically treat presumptive encephalitis may have unknowingly treated PRES (based on evidence from published case reports). CTH abnormalities were isolated to the posterior circulation, consistent with changes seen in PRES (the spinal stimulator precluded magnetic resonance imaging). Electromyogram confirmed the diagnosis; she was treated with IV immunoglobulin with gradual improvement.


  Abstract Number 12


Electrosurgical Techniques for Salpingectomies: Is There Any Difference?

D. Tang 1, J. N. Anasti 1

1 Temple/St. Luke's Medical School, Bethlehem, PA, USA

Introduction: Female or male sterilization is the most common form of contraception utilized in the United States. Each year, about 700,000 women undergo tubal ligation, with half performed at the time of delivery and half by an interval laparoscopic procedure. In the past, tubal ligation was accomplished by simply coagulating or mechanically occluding a small portion of the tube. Recently, it has been suggested that the majority of ovarian cancers may begin in the distal portion of the Fallopian tube. This has caused gynecologic surgeons to rethink the current sterilization paradigms, moving from simple occlusion to total removal of the tubes (i.e., bilateral salpingectomy). The rationale is that removal of the entire tube may decrease the incidence of ovarian cancer. Opportunistic salpingectomy has thus been increasingly utilized as a means of female sterilization and prevention of ovarian carcinoma; however, there is limited literature comparing the different techniques employed for this now common gynecologic procedure. We therefore performed a retrospective, cross-sectional chart review comparing the outcomes of electrosurgical modalities commonly utilized in salpingectomies (Kleppinger™ [KS], LigaSure™ [LS], and Enseal™ [ES]) to elucidate any clinical differences.

Methods: Charts of patients who had undergone laparoscopic bilateral salpingectomy during the last 4 years were identified, and basic demographic details (age, body mass index [BMI], gravidity, parity, and medical and surgical history) were gathered. Operative and postoperative notes were reviewed to discern the instrument used for salpingectomy as well as operative times and complications. This particular abstract focuses on differences in operating times, intra- and post-operative complications, and differences in patient pain scores postoperatively. All procedures were performed at a large teaching hospital. Demographic information, complications, and operative times were reviewed from salpingectomies performed with the various electrosurgical techniques. In addition, the recorded pain score (0–10 scale) at 30 min postoperatively and at discharge was reviewed. Individual techniques were compared using one-way ANOVA with Student–Newman–Keuls pairwise comparisons as appropriate. All values were reported as mean ± standard deviation.
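To illustrate the omnibus comparison described above, the following minimal sketch runs a one-way ANOVA across three groups in Python with SciPy; the pain-score vectors are hypothetical, and because SciPy does not provide the Student–Newman–Keuls procedure the authors report, a Tukey HSD post hoc test (available in statsmodels) is only noted as a substitute.

```python
from scipy.stats import f_oneway

# Hypothetical 30-min postoperative pain scores (0-10) by device.
ligasure   = [2, 1, 3, 4, 2, 0, 3, 2]   # LS
enseal     = [5, 3, 6, 4, 2, 5, 4, 3]   # ES
kleppinger = [6, 4, 5, 3, 7, 4, 5, 4]   # KS

# Omnibus one-way ANOVA across the three groups.
f_stat, p_value = f_oneway(ligasure, enseal, kleppinger)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# If p < 0.05, pairwise post hoc testing (e.g., Tukey's HSD via
# statsmodels.stats.multicomp.pairwise_tukeyhsd) would identify which
# device pairs differ.
```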

Results: We reviewed 50 salpingectomies performed with each of the following techniques: LS, ES, and KS. The groups did not differ in age, BMI, parity, or prior procedures. KS had the longest operating room (OR) times (KS 42 ± 12 min, LS 36 ± 11 min, ES 37 ± 10 min, P = 0.04). LS had the lowest 30-min postoperative pain score (LS 2.5 ± 2.8, ES 4.0 ± 3.4, KS 4.6 ± 3.1, P = 0.008). Pain scores at discharge did not differ between procedures.

Conclusion: In salpingectomy surgery, LS was associated with shorter OR times compared to KS and less postoperative pain than ES or KS. The clinical/cost significance of these findings remains to be determined.


  Abstract Number 13


Proximal Splenic Artery Embolization for the Management of a Geriatric Trauma Patient with High-Grade Splenic Injury

D. M. Tang 1, J. R. Gacke 1, C. Bendas 1,2

1 Temple/St. Luke's Medical School, Bethlehem, PA, USA; 2 Division of Surgical Critical Care, St. Luke's Regional Level I Trauma Center, Bethlehem, PA, USA

Introduction: Splenic artery embolization (SAE) remains controversial in the management of patients aged >55 years with high-grade splenic injury. Nonsuperiority of proximal versus distal SAE techniques further complicates this decision. Carefully selected patients in this population may be candidates for SAE as opposed to operative management (OM), although data supporting this claim are limited to retrospective reviews of few cases and are disputed. We discuss a case of blunt abdominal trauma secondary to a fall, treated with proximal SAE. An 84-year-old, hemodynamically stable (HDS) female patient with abdominal pain and inability to ambulate was admitted with multiple fractures and high-grade splenic injury. A nonoperative approach was requested by the patient, prompting successful treatment via SAE of the main splenic artery. The current literature guiding the use of SAE in elderly patients with high-grade splenic injury is insufficient. Careful selection of appropriate patients for nonoperative management (NOM) in this demographic may abrogate the risks associated with OM. The spleen remains the most commonly injured intra-abdominal organ in blunt abdominal trauma, mandating expeditious management to avoid potentially fatal hemorrhage. The decision to pursue OM versus NOM in splenic injury is often dictated by several parameters, including, but not limited to, hemodynamic status, medical comorbidities, injury severity score, and grade of splenic injury.[1] The latter often favors HDS patients who are younger with lower-grade (e.g., I–II) splenic injury [Table 1]. Justifications for nonoperative approaches to management include salvage of functional splenic tissue and avoidance of the risks associated with surgical intervention, such as infection and postsplenectomy sepsis. This case report details the management of a high-grade splenic injury in an HDS geriatric patient who did not wish to pursue surgical intervention and was successfully managed by proximal SAE.

CARE Statement: SAE provides an alternative to OM in patients who have failed observational approaches to care; however, the indications for embolization remain controversial in certain demographic groups. In patients aged >55 years with higher-grade splenic injuries (e.g., III–V), the current literature demonstrates varying efficacy of embolization with respect to splenic salvage and unsuccessful NOM, with failure rates approaching 30% in patients aged >75 years.[1] Moreover, the decision as to whether to employ proximal (i.e., main splenic artery) or distal (i.e., selective branches of the splenic artery) embolization poses additional challenges. The current literature does not demonstrate clear advantages of one method over the other, although some reports suggest that distal techniques may result in higher infection rates.

Case Scenario: An 84-year-old woman presented to the emergency department with complaints of generalized pain in her abdomen, pelvis, neck, and lower back following a fall down eight steps the night prior. The patient returned to bed following the fall and awoke the next morning unable to ambulate. Of note, the patient had a history of recurrent mechanical falls and had been widowed 1 week before arrival. Additional medical history was significant for rheumatoid arthritis, hypothyroidism, osteoporosis, stress/compression fractures of T8, T9, and L4, and fixation kyphoplasty of L2–L3. The primary trauma survey revealed an intact airway, symmetric breath sounds, intact pulses, and a Glasgow Coma Scale score of 15. Initial hemodynamic parameters showed a blood pressure of 120/56 mmHg, a pulse of 83 bpm, and respirations of 18 per minute. The secondary survey revealed tenderness in the chest wall, abdomen, and lower back and visible ecchymosis over the neck, left midaxillary region, and lower rib cage. The hemoglobin level was 12.3 g/dl, with a nadir of 10.3 g/dl before embolization. No blood transfusions were administered. Computed tomography (CT) of the chest, abdomen, and pelvis demonstrated multiple left-sided rib fractures, multiple splenic lacerations extending to the splenic hilum, perisplenic hemoperitoneum tracking to the lower abdomen, hematoma of the right obturator internus, and multiple pelvic fractures including the pubic bone, inferior pubic rami, right sacral wing, and left ilium [Figure 1]. Multiple branches of the splenic artery beyond third-order branches demonstrated active extravasation (i.e., blush) on celiac and splenic arteriography [Figure 2]. Nonselective embolization of the main splenic artery was performed in an attempt to minimize the pressure head of the splenic artery and decrease the risk of subsequent splenic infarct. Following embolization of the main splenic artery with two 7-mm Amplatzer 4 plugs, contrast injection demonstrated decreased flow through the plugs [Figure 3].

Conclusion: The HDS geriatric trauma patient with higher-grade splenic and associated injuries continues to be a point of controversy among traumatologists. While plentiful literature exists supporting the efficacy of NOM for lower-grade splenic injuries in those aged <60 years, there is insufficient evidence to support the same level of success in those aged >60 years, with few studies comparing efficacy between these two populations.[1] Further investigation of NOM of splenic injury in the geriatric population is warranted to establish criteria for identifying patients who may benefit from such approaches to care. Despite avoiding the potential morbidity associated with nontherapeutic laparotomy, failure of NOM may result in delayed surgical exploration, which itself carries the risk of worsening shock and death. Psychosocial aspects of patient preference should also be considered when clinically appropriate. Further study is warranted in this population to optimize patient selection guidelines when considering OM versus NOM.










  References Top


  1. Stawicki SP. Trends in nonoperative management of traumatic injuries – A synopsis. Int J Crit Illn Inj Sci 2017;7:38.



  Abstract Number 14 Top


Investigation of the Accuracy and Frequency of Pharmacological Recommendation of Civilian Physical Therapists

K. Barry 1, S. M. Kareha 1

1 Orthopedic Physical Therapy Residency, Bethlehem, PA, USA

Introduction: Over the past 15 years, opioids have been overprescribed, which has led to an increase in the occurrence of opioid addiction. In 2014 alone, there were 47,055 overdose-associated deaths. Since 2000, the death rate from overdoses has increased significantly (i.e., >130%). Given the rising number of deaths, there is a need to reduce the amount of prescription opioids dispensed in the United States. Studies have found that patients who receive care from a physical therapist (PT) have a significantly decreased risk of needing opioid medications. Furthermore, PTs are trained in pharmacology, and PTs in the military have been prescribing medications effectively for years.

Methods: SLUHN has taken steps to reduce healthcare costs through a comprehensive spine program, which allows patients with acute low back pain and neck pain to be triaged to the appropriate healthcare provider, with PT being one of the professions of choice. PTs with direct access licenses have undergone specialized training to participate in the program. When a patient goes through the spine program, the data are stored in the REDCap database. During direct access treatment, if the PT felt that a medication was required, a physician was contacted to prescribe it; these medication recommendation events were the focus of this review.

Results: The comprehensive spine program had 27 recorded patients for whom medication was prescribed. Of those 27 patients, one was excluded because no medication was prescribed throughout the plan of care, and two others were excluded as chronic pain patients. PTs assisted in prescribing medication for the remaining 24 patients who met the criteria of acute spine pain. The PT's recommendation fully matched the physician's prescription in 58.33% of cases and partially matched in 37.5%. There was one non-match (4.17%); however, the PT still recognized that the patient required medication and was able to decrease the time to prescription.
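
A minimal arithmetic sketch of how the match percentages above follow from the underlying counts (the counts of 14 full, 9 partial, and 1 non-match are back-calculated from the reported percentages and used purely for illustration):

    # Illustrative only: counts back-calculated from the reported percentages (24 included patients).
    full_match, partial_match, non_match = 14, 9, 1
    total = full_match + partial_match + non_match   # 24 patients with acute spine pain

    for label, count in [("Full match", full_match),
                         ("Partial match", partial_match),
                         ("Non-match", non_match)]:
        print(f"{label}: {count}/{total} = {100 * count / total:.2f}%")

    # "Recognition accuracy": in every included case the PT recognized that medication was needed
    recognized = total
    print(f"Recognition of medication need: {100 * recognized / total:.0f}%")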

Conclusion: PTs were able to identify with high accuracy (i.e., 100%) when a prescription medication was required. This, in turn, may help reduce opioid consumption and, therefore, prescription-related deaths in the U.S.


  Abstract Number 15 Top


Disproportionately Affected and Underinsured: Trauma and Violence in Young African-American and Latino Men Extend Beyond Major Urban Centers

A. L. Gifford 1, R. Castillo 1, N. J. Stewart 1, J. Cipolla 1, P. G. Thomas 1, B. A. Hoey 1, S. P. Stawicki 1

1 Department of General Surgery

Introduction: Studies from large urban trauma centers show that African-American men (AAM) and Latino men (LM) are disproportionately affected by the current violence epidemic. The study goal was to evaluate whether similar disparities exist outside of densely populated urban areas. Our Level I Trauma Center's (L1TC) catchment area consists of small/medium-sized cities and surrounding nonurban areas. We hypothesized that AAM and LM are both underinsured and disproportionately affected by penetrating trauma.

Methods: This study was a retrospective analysis of our L1TC registry between January 2010 and December 2016. Variables queried included patient demographics, all-cause mortality, length of stay, discharge destination, injury severity score, trauma mechanism, and insurance score (number of health insurance policies per patient on admission). Group comparisons were performed using the Chi-square test for categorical data and ANOVA or Kruskal–Wallis testing for normally and nonnormally distributed continuous data, respectively. Statistical significance was set at α = 0.01.
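
The group-comparison logic described above can be sketched as follows; this is a minimal, hypothetical illustration in Python, and the column names, normality screen, and data layout are assumptions rather than the study's actual analysis code:

    import pandas as pd
    from scipy import stats

    # 'df' is assumed to have columns 'group' (AAM/LM/CM), 'penetrating' (0/1),
    # and 'insurance_score' (continuous), one row per patient.
    def compare_groups(df, alpha=0.01):
        # Categorical outcome: Chi-square test on a group-by-mechanism contingency table
        table = pd.crosstab(df["group"], df["penetrating"])
        _, p_cat, _, _ = stats.chi2_contingency(table)

        # Continuous outcome: ANOVA if every group looks normal, Kruskal-Wallis otherwise
        samples = [g["insurance_score"].dropna() for _, g in df.groupby("group")]
        normal = all(stats.shapiro(s)[1] > 0.05 for s in samples if len(s) >= 3)
        _, p_cont = stats.f_oneway(*samples) if normal else stats.kruskal(*samples)

        return {"mechanism_p": p_cat, "mechanism_sig": p_cat < alpha,
                "insurance_p": p_cont, "insurance_sig": p_cont < alpha}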

Results: A total of 12,082 patients were analyzed. Of those, 10,364 were identified as Caucasian, 938 as multiracial, 672 as African-American, and 108 as Asian. A total of 1326 (10.98%) were identified as Hispanic/Latino. Compared to Caucasian men (CM), AAM and LM were disproportionately affected by penetrating trauma (P < 0.01). Both AAM and LM were underinsured when compared to CM (P < 0.01). Similar but less pronounced trends were seen for analogous comparisons for female patients [Table 1].

Conclusion: Our L1TC data show that disparities for African-American and Latino trauma patients exist beyond major urban areas. In addition to ongoing efforts to ensure socioeconomic equity, aggressive community outreach is required to educate young AAM and LM regarding trauma/injury prevention.




  Abstract Number 16 Top


Diverse Presentation in Rare Creutzfeldt-Jakob Disease – Case Report

A. Mendez 1, D. Raheja 1, A. C. W. Lasker 1

1 Neurology Residency Program, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Creutzfeldt-Jakob disease (CJD) is a rare and fatal disorder of rapid neurodegeneration, with an estimated yearly incidence of one in 1 million worldwide and approximately 350 cases in the United States. CJD is suspected in cases of rapidly progressive dementia, especially when accompanied by myoclonus, ataxia, and/or visual disturbance. Diagnosis is supported by periodic sharp waves on electroencephalogram (EEG), a positive 14-3-3 CSF assay, hyperintensity in the caudate nucleus/putamen and/or in at least two cortical regions on diffusion-weighted (DWI) or FLAIR magnetic resonance imaging, and a positive real-time quaking-induced conversion (RT-QuIC) assay. Here, we present two cases of CJD to exemplify varied presentations and the different challenges encountered during the clinical workup.

CARE Statement: CJD is a rare neurodegenerative disorder that affects approximately one in 1 million people worldwide each year and about 350 people in the United States. At SLUHN, we had the opportunity to care for two patients with this devastating disease. This case report was written to reflect on the varied presentations of CJD. Our hope is to bring awareness to the SLUHN community so that these patients can be rapidly identified and quality of life improved for them and their loved ones.

Case Scenario: The first case involves a 62-year-old male who initially presented with 3 months of fatigue, sleep difficulties, progressive dysarthria, diplopia, and balance disturbance. Examination showed mild dysarthria, diplopia with lateral gaze, and normal strength with ataxic finger-to-nose and heel-to-shin testing. Initial workup, including MRI, lumbar puncture, electromyogram with repetitive nerve stimulation, and paraneoplastic panel, was inconclusive. Given his oculobulbar symptoms and negative laboratory studies, seronegative myasthenia gravis was considered, and he was empirically treated with plasmapheresis and pyridostigmine, which mildly improved his vision. Progressive symptoms with hoarseness, visual hallucinations, and truncal/limb ataxia led to a repeat workup. MRI of the brain then revealed abnormal restricted diffusion and T2 prolongation in the right corpus striatum, minimal findings in the left caudate and bilateral thalami, and sparing of the internal capsule. EEG showed diffuse generalized slowing. The CSF 14-3-3 assay was positive; combined with a positive RT-QuIC, this confers an approximately 98% probability of CJD. The second patient is a 66-year-old female who presented to the emergency department with 2 weeks of right hand tremors, weakness, slowing of speech, and dizziness. Examination revealed high blood pressure (200 mmHg systolic), effortful speech with apraxia, and tremor of the right upper and lower extremities with varying amplitude/frequency and normal strength/tone. The initial brain MRI was reported as normal. EEG revealed bilateral paracentral periodic epileptiform discharges with intermittent bursts of rhythmic activity as the patient became more encephalopathic. She was empirically treated with steroids, plasmapheresis, and antiepileptic drugs without improvement. Repeat MRI revealed evidence of CJD, including bright cortical ribboning on DWI, and CSF was positive for 14-3-3.

Conclusion: Similar workups were initially inconclusive for both patients, who passed away within months of symptom onset, consistent with the usual progression of this devastating disease. These cases highlight varied clinical presentations and demonstrate the importance of considering a CJD workup in patients with rapidly progressive cognitive decline in order to maximize quality of life.


  Abstract Number 17 Top


Efficacy of an Enhanced Recovery after Surgery (ERAS) Protocol for Orthopedic Spinal Fusion Procedures

A. Malige 1, G. Sokunbi 1

1 Orthopaedic Surgery Residency, Bethlehem, PA, USA

Introduction: Spine surgeons often turn to spinal fusions as a last resort for patients with back and lower extremity symptoms that are not well controlled by a combination of physical therapy, a comprehensive pain regimen, epidural steroid injections, and other alternative modalities. Regardless of whether patients have acute or chronic conditions causing these symptoms, they often rely on opioid medications to help with their pain control. Narcotic medications are still the mainstay for postoperative pain regimens, and patients may endure significant side effects associated with these medications, including nausea, vomiting, constipation, and ileus, leading to an increased length of hospital stay. More recently, enhanced recovery after surgery (ERAS) protocols have been utilized across the nation for various surgeries. In 2018, the orthopedics, anesthesia, perioperative nursing, and pharmacy departments worked together to construct and implement an ERAS protocol for orthopedic spine procedures, with the goal of helping to reduce opioid use in the postoperative period. This project highlights our ERAS protocol specifically for multilevel cervical and lumbar spinal fusions and the effect it had on changing opioid consumption in the postoperative period.

Methods: A retrospective chart review was performed on all patients undergoing spinal surgery in 2016. All patients older than 18 years of age undergoing primary spinal fusions were included. Patients were excluded if they underwent revision surgeries or spinal surgeries that did not include a fusion. We reviewed a group of 70 patients from 2016, before our ERAS protocol was created, and 70 patient charts from 2019–2020, after our ERAS protocol for multilevel spine fusion was in place. Components of the ERAS protocol include primary utilization of nonnarcotic medications during the preoperative period (acetaminophen and gabapentin), the intraoperative period (ketamine and dexmedetomidine), and the postoperative period (acetaminophen, gabapentin, and muscle relaxants), with the goal of minimizing the use of narcotic medications postoperatively. For each patient, length of stay, readmissions, day out of bed, and the amount of opioid used (type, dose, and frequency) from the immediate recovery period (postanesthesia care unit stay) until the day of discharge were ascertained. Using an oral morphine equivalent (OME) conversion chart [Table 1], we calculated each patient's daily average OME requirement (i.e., how much opioid was used per day in the postoperative period). Outcomes of patients having surgery after the ERAS protocol was implemented were compared with those of patients treated before implementation using descriptive statistics and independent t-tests. For all analyses, P < 0.05 denotes statistical significance (IBM SPSS Version 23 Statistics for Windows, Armonk, NY: IBM Corp.).
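
A minimal sketch of the daily average OME calculation described above; the conversion factors and example doses are common published oral-morphine-equivalent values used purely for illustration and are not taken from the study's conversion chart [Table 1]:

    # Hypothetical OME conversion factors (mg of drug -> mg oral morphine equivalent);
    # the study used its own conversion chart [Table 1], which is not reproduced here.
    OME_FACTORS = {"morphine": 1.0, "oxycodone": 1.5, "hydrocodone": 1.0,
                   "hydromorphone": 4.0, "tramadol": 0.1}

    def daily_average_ome(doses, postop_days):
        """doses: list of (drug_name, dose_mg) given from PACU arrival until discharge."""
        total_ome = sum(OME_FACTORS[drug] * dose_mg for drug, dose_mg in doses)
        return total_ome / postop_days

    # Example: 3 postoperative days of oxycodone 5 mg q6h plus two 1 mg hydromorphone doses
    example_doses = [("oxycodone", 5)] * 12 + [("hydromorphone", 1)] * 2
    print(daily_average_ome(example_doses, postop_days=3))   # ~32.7 OME/day, i.e., the 0-40 range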

Results: Overall, 129 patients were included in our cohort: 63 in the ERAS group and 66 in the non-ERAS group. Fifty patients underwent cervical procedures (23 ERAS and 27 non-ERAS), and 79 underwent lumbar procedures (40 ERAS and 39 non-ERAS). Forty-seven patients underwent anterior fusions (21 ERAS and 26 non-ERAS), and 78 underwent posterior fusions (38 ERAS and 40 non-ERAS). Compliance with the majority, if not all, of the orthopedic spinal fusion ERAS protocol measures was high (>90%). The ERAS group had a higher percentage of patients with low postoperative narcotic usage (67% in the 0–40 OME range) compared to the non-ERAS group (23% in the 0–40 range). The average OME requirement was significantly lower in the ERAS group than in the non-ERAS group (41.94 vs. 79.90 OME, P < 0.01). However, hospital length of stay (4.10 vs. 3.61 days, P = 0.15), postoperative day out of bed as pain permitted (0.54 vs. 0.39 days, P = 0.18), and readmission rate (18.9% vs. 9.1%, P = 0.25) were statistically similar between the ERAS and non-ERAS groups. Key study outcomes are shown in [Figure 1] and [Figure 2].

Conclusion: The ERAS protocol significantly decreases narcotic usage in patients undergoing spinal fusions. The effectiveness of this protocol, together with the high rates of patient and provider compliance, suggests that it should not only continue to be used in spine fusion surgeries but should also be implemented in other orthopedic procedures associated with high levels of postoperative pain.




  Abstract Number 18 Top


YouTube™ Education Improves Patient Understanding of Management of Bleeding on Dual-Antiplatelet Therapy

M. Krinock 2, M. N. Katz 2, K. Shah 1, V. Yellapu 1, J. Shirani 2

1 Internal Medicine Residency and 2 Cardiology Fellowship, University Hospital Campus, Bethlehem, PA

Introduction: Since the advent of percutaneous coronary intervention (PCI), the mortality rate of acute myocardial infarction has significantly decreased. Despite advancements in angioplasty and coronary stent technology, compliance with dual-antiplatelet therapy (DAPT) remains critical. Although physicians routinely educate patients on the importance of DAPT after PCI, it is unclear if bedside patient education is retained in the long term. Standard print patient education pamphlets are routinely used upon patient discharge, but their effectiveness has not been systematically studied. For certain patients, supplementation of the handouts with YouTube™ educational videos may prove beneficial. We aimed to evaluate the impact of supplemental contemporary bedside patient education via a YouTube™ video on patient understanding of their condition and the importance of compliance with DAPT after PCI.

Methods: Patients who underwent PCI within the past 24 h were screened via an electronic health record system. The objectives of the study were explained to the patients, and informed consent was obtained. A four-question preintervention test was administered [Table 1], and an educational YouTube™ video on an iPad or laptop computer was provided. Patients then answered the same four questions as the postintervention test. Inclusion criteria were patients aged ≥18 years, having undergone PCI in the last 24 h, and having been prescribed DAPT. Exclusion criteria were non-English speaking and inability to provide informed consent.

Results: A total of 18 patients were enrolled. The average age of the participants was 63 years, and seven (38.9%) were women. Two (11.1%) patients had prior DAPT use, 8 (44.4%) had a medical history of diabetes, and 10 (55.6%) were former or active smokers. The average pretest score was 71.75%, while the average posttest score was 97%. The most common question answered incorrectly concerned nuisance (non-life-threatening) bleeding while on DAPT. Of those who initially answered this question incorrectly, all answered it correctly on the follow-up postintervention test. Compared to pretest scores, the average posttest score increased by 25.25 percentage points. There was a statistically significant improvement in postintervention test scores as measured by a two-tailed Wilcoxon signed-rank test (P < 0.05).
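
A minimal sketch of the paired pre/post comparison using a two-tailed Wilcoxon signed-rank test; the score vectors are placeholders rather than the study data:

    from scipy import stats

    # Placeholder per-patient test scores (%); the study had 18 paired observations.
    pre  = [75, 50, 75, 100, 50, 75, 75, 50, 100, 75, 50, 75, 75, 100, 50, 75, 75, 50]
    post = [100, 100, 100, 100, 75, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 75]

    stat, p = stats.wilcoxon(pre, post, alternative="two-sided")
    print(f"Mean pre = {sum(pre) / len(pre):.2f}%, mean post = {sum(post) / len(post):.2f}%, P = {p:.4f}")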







Conclusion: Standardized patient education via YouTube™ videos is feasible and improves patient understanding of their overall disease process and of the importance of medication compliance with DAPT after PCI.


  Abstract Number 19 Top


Risk Factors for Complications and Return to the Emergency Department after Interscalene Block using Exparel® for Shoulder Surgery

A. Malige 1, S. T. Yeazell 1, G. F. Carolan 1

1 Orthopaedic Surgery Residency, University Hospital Campus, Bethlehem, PA

Introduction: Exparel ® has recently gained favor for use in interscalene regional blocks for shoulder surgery. While effective for pain relief, it has adverse effects that can lead to postoperative emergency department (ED) visits. This study aims to identify patient risk factors associated with complications leading to ED return after interscalene blocks using Exparel ® before shoulder surgery.

Methods: A retrospective chart review was performed for all patients undergoing shoulder surgery with an Exparel ® interscalene block over an 8-month period. For each patient, demographic information, comorbidities, type of block, postoperative complications, ED return visits, and readmissions were recorded. A five-factor modified frailty index and Charlson comorbidity index (CCI) score were calculated. Univariate and multivariate logistic regressions were conducted to identify risk factors associated with increased complications and return to the ED.
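
A minimal, hypothetical sketch of the univariate-then-multivariate logistic regression workflow described above; the variable names and the P < 0.05 screening threshold are assumptions for illustration and may differ from the actual model specification:

    import numpy as np
    import statsmodels.formula.api as smf

    # 'df' is assumed to contain: 'major_complication' (0/1), 'age', 'asa', 'cci', 'mfi5', 'bmi'.
    def screen_and_model(df, outcome="major_complication",
                         candidates=("age", "asa", "cci", "mfi5", "bmi")):
        # Univariate screen: fit one predictor at a time
        univariate_p = {}
        for var in candidates:
            fit = smf.logit(f"{outcome} ~ {var}", data=df).fit(disp=0)
            univariate_p[var] = fit.pvalues[var]
        # Multivariate model: carry forward predictors significant on the univariate screen
        keep = [v for v, p in univariate_p.items() if p < 0.05] or list(candidates)
        multi = smf.logit(f"{outcome} ~ {' + '.join(keep)}", data=df).fit(disp=0)
        return univariate_p, multi.pvalues, np.exp(multi.params)   # odds ratios from exp(coef)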

Results: Overall, 352 patients were included; most were male, between 51 and 70 years of age, with a body mass index of 25.0–35.0. Fifty-eight patients (16.5%) had postoperative complications related to their Exparel ® interscalene block, including 37 (10.5%) minor complications and 21 (6.0%) major complications that led to return ED visits. Univariate analysis identified ASA score (P = 0.03) as a significant predictor of minor complications. In multivariate logistic regression, ASA score trended toward significance as a risk factor for minor complications (P = 0.096, odds ratio = 1.64). Univariate analysis identified age (P = 0.006), ASA score (P = 0.009), and CCI score (P = 0.002) as significant predictors of major complications. Multivariate logistic regression identified ASA score (P = 0.049, odds ratio = 2.25) as the only significant risk factor for major complications. Key study outcomes are summarized in [Table 1], [Table 2], [Table 3] and [Figure 1], [Figure 2], [Figure 3].

Conclusion: Surgeons and anesthesiologists should strongly consider a patient's ASA score, in addition to their pulmonary and cardiac history, when deciding whether the patient is an appropriate candidate for interscalene regional block using Exparel ® for shoulder surgery.














  Abstract Number 20 Top


A Unique Case of Progressive Spasticity and Profound Brain Atrophy in a Young Female

N. Mufti 1, D. Raheja 1, C. R. Craven 1

1 Neurology Residency Program, Richard A. Anderson Campus, Easton, PA, USA

Introduction: More than 60 types of hereditary spastic paraplegia (HSP) have been identified. These can be classified as pure (predominantly spasticity) and complex (spasticity with additional neurological features) forms and can be inherited as autosomal dominant, autosomal recessive, or, more rarely, X-linked disorders. We present a unique case of progressive spasticity and cortical atrophy in a young woman who was eventually diagnosed with spastic paraplegia type 11 (SPG-11). A 32-year-old previously athletic female presented with a more than 10-year history of cognitive decline and progressive weakness. She was in her usual state of health until age 18, when she first presented with leg pain and difficulty running. Over the next few years, she continued to decline, became unable to ambulate without a walker, and eventually became wheelchair-bound. Similar symptoms started in the upper extremities, though to a lesser degree, along with dysarthria, dysphagia, and some visual hallucinations. She was noted to have mild cognitive impairment, spastic dysarthria, and increased tone in both the lower and upper extremities. Family history was largely negative for neurocognitive disorders. She was initially diagnosed with cerebral palsy.

Case Scenario: Laboratory investigations including CBC, CMP, TSH, B12, ANA, and SPEP were all within normal range. Arylsulfatase A and long-chain fatty acids were unremarkable. Serial brain imaging over the decade revealed progressive cerebral volume loss, predominantly in the frontal lobes, with normal cerebellar volumes, thinning of the corpus callosum, and minimal white matter changes. Electromyogram (EMG) findings were suggestive of chronic lower motor neuron involvement. Because of the progressive frontal lobe atrophy and lower motor neuron findings on EMG, genetic testing with an ALS-FTD panel was sent, which showed two pathogenic mutations in the SPG11 gene, known to cause SPG-11. SPG-11, caused by mutations in SPG11 on chromosome 15q21.1 encoding the protein spatacsin (involved in endosomal trafficking and lysosomal biogenesis), is the most common complex autosomal recessive form of HSP.

Conclusion: Defects in this protein resulting in HSP cause not only spasticity but also Parkinsonism, maculopathy, progressive slow cognitive decline, and peripheral neuropathy. Hallmark brain magnetic resonance imaging findings in patients with SPG11 gene mutations include thinning of the corpus callosum and the so-called "ears of the lynx" sign, as was noted in our patient and present on her initial imaging in 2009. This suggests that imaging markers may be present early in the disease course and may prove useful in future efforts to develop treatments or early diagnostics for HSP.


  Abstract Number 21 Top


The Efficacy of Liposomal Bupivacaine in Interscalene Nerve Blocks for Shoulder Arthroplasty

A. Malige 1, G. F. Carolan 1

1 Orthopaedic Surgery Residency Program, University Hospital Campus, Bethlehem, PA, USA

Introduction: Postoperative pain can have far-reaching effects on patient recovery, functional status, and overall satisfaction. Because of this, surgeons and anesthesiologists have turned to preoperative nerve blocks in hopes of providing pain relief and decreasing opioid usage postoperatively. Interscalene (IS) nerve blocks have become the standard of care to provide pain relief following shoulder procedures due to their effectiveness. Before the introduction of Exparel ®, IS blocks were administered as a single shot or as a catheter with a continuous pain pump using the local anesthetic agent of choice. Unfortunately, these methods provided only a short period of pain control and required significant upkeep, time, and cost, after which an oral pain regimen centered on narcotic medication was often needed for adequate pain control. Recently, the FDA approved liposomal extended-release bupivacaine (Exparel ®). This drug has gained popularity as the anesthetic agent of choice for IS nerve blocks. The liposomal formulation allows for extended delivery of medication (up to 4–5 days) to specific targets while decreasing toxicity. This study explores the effects of IS nerve blocks using Exparel ® after shoulder arthroplasty on postoperative narcotic consumption, hospital stay, and readmission.

Methods: A retrospective chart review was performed of patients undergoing shoulder arthroplasty within our network, either for proximal humerus fracture or arthritic changes. All patients 18 years of age or older undergoing primary or secondary arthroplasty with an ASA of 1–3 were included. Any patients with an ASA of 4 or without adequate documentation in their chart were excluded. On the day of surgery, patients were consented for an IS nerve block for postoperative pain control, and general anesthesia was the main intraoperative anesthetic. The IS block was performed by an anesthesiologist experienced with regional anesthesia techniques and procedures. Patients who received a preoperative ultrasound-guided IS block with plain local anesthesia (administered as a single shot or as a catheter with a continuous pump) were placed into a control group, while patients who received an ultrasound-guided IS block with Exparel ® administered as a single shot were placed into the experimental Exparel ® group. All patients subsequently underwent general anesthesia for the surgical procedure. For each patient, length of stay, readmission, and postoperative opioid requirements were recorded. The difference in outcomes between groups was compared using descriptive statistics and independent t-tests. For all analyses, P < 0.05 denotes statistical significance (IBM SPSS Version 23 Statistics for Windows, Armonk, NY: IBM Corp.).
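
A minimal sketch of the between-group comparison with an independent-samples t-test; the simulated OME arrays are placeholders standing in for each patient's daily average OME:

    import numpy as np
    from scipy import stats

    # Placeholder daily-average OME values per patient; the real analysis would use the
    # measured values for the 181 Exparel and 251 control patients.
    rng_a, rng_b = np.random.default_rng(0), np.random.default_rng(1)
    ome_exparel = rng_a.gamma(shape=2.0, scale=10.0, size=181)   # centered near ~20 OME
    ome_control = rng_b.gamma(shape=2.0, scale=15.0, size=251)   # centered near ~30 OME

    t_stat, p_value = stats.ttest_ind(ome_exparel, ome_control)
    low_use_pct = np.mean(ome_exparel <= 40) * 100   # share of patients in the 0-40 OME range
    print(f"t = {t_stat:.2f}, P = {p_value:.4f}, low-opioid-use (Exparel) = {low_use_pct:.0f}%")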

Results: Overall, 432 patients were included in our study. There were 181 patients in the Exparel ® group (167 primary surgeries and 14 revision surgeries) and 251 patients in the non-Exparel ® control group (231 primary surgeries and 20 revision surgeries). The Exparel ® group (19.8 oral morphine equivalent [OME]) used a significantly lower average daily OME compared to the non-Exparel ® control group (30.0 OME) (P < 0.001). There was no difference in opioid usage between primary and revision surgery patients in the Exparel ® group (19.5 vs. 23.6 OME, P = 0.65) or the non-Exparel ® group (29.2 vs. 39.8 OME, P = 0.15). Furthermore, more patients used a low level of opioids (0–40 OME) postoperatively in the Exparel ® group (86%) compared to the non-Exparel ® group (73%). Finally, the Exparel ® group had a shorter length of stay (1.67 vs. 2.25 days, P = 0.016) but a similar readmission rate (3.9% vs. 4.0%, P = 0.951) compared to the non-Exparel ® group. Key study characteristics and outcomes are presented in [Table 1], [Table 2], [Table 3], [Table 4] and [Figure 1], [Figure 2].

Conclusion: The use of Exparel ® in IS nerve blocks for pain control helps decrease postoperative opioid requirements and length of stay in patients undergoing shoulder arthroplasty. Patients and physicians should strongly consider this adjunct when deciding how best to treat a patient's perioperative pain.

Keywords: Arthroplasty, bupivacaine, Exparel, liposomal, shoulder


  Abstract Number 22 Top


Bone Markers in Charcot Neuroarthropathy

K. Patel 1, B. Bernstein 1, J. C. Stoltzfus 1

1 Podiatric Medicine and Surgery Residency, University Hospital Campus, Bethlehem, PA

Introduction: There have been multiple studies evaluating the effectiveness of pharmaceuticals in the treatment of acute Charcot neuroarthropathy (CNA). The effectiveness of most pharmaceuticals has historically been judged by the levels of certain biomarkers of bone turnover. Previous research has shown that bone turnover markers are increased in the acute phase of CNA. However, biomarker activity in the chronic phase of CNA, and its utility for monitoring treatment response, remains to be established. In this study, we evaluated the relationship of such biomarkers to disease severity. We hypothesized that biomarkers such as bone-specific alkaline phosphatase (BSAP) and the deoxypyridinoline:creatinine ratio (DPD:CRT) would be directly proportional to disease severity, i.e., increased in acute CNA and decreased in chronic CNA. If so, pharmacologic treatment of acute CNA guided by the levels of such biomarkers could potentially be beneficial in clinical settings.

Methods: We retrospectively reviewed 41 patients diagnosed with acute or chronic CNA in our Charcot clinic. Disease severity was determined by the pedal temperature difference between the affected and unaffected limbs using an OMEGA Surface Temperature Scanner, in conjunction with radiographic films and clinical presentation. A temperature difference of 2°C or greater was considered acute CNA, while a difference of less than 2°C was considered chronic CNA. Urine and serum samples were collected at baseline to evaluate BSAP and DPD:CRT levels via immunoassays. Statistical analysis was performed via separate Mann–Whitney rank sum tests to determine between-group differences in BSAP and DPD:CRT in acute versus chronic CNA.
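
A minimal sketch of the Mann–Whitney comparison described above; the biomarker values shown are placeholders, not the measured cohort data:

    from scipy import stats

    # Placeholder biomarker values; the study compared 31 acute and 10 chronic CNA patients.
    bsap_acute   = [11.9, 9.5, 14.2, 3.8, 41.3, 10.1, 18.7]
    bsap_chronic = [15.2, 4.4, 22.8, 40.7, 12.6]

    u_stat, p_value = stats.mannwhitneyu(bsap_acute, bsap_chronic, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_value:.3f}")
    # The same test would be run separately for the DPD:CRT ratio.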

Results: Bone formation marker BSAP (P = 0.46) and bone resorption marker DPD:CRT (P = 0.92) were not significantly different in acute versus chronic CNA. In acute CNA patients (n = 31), the median BSAP was 11.9 (range 3.8–41.3) versus 15.2 (range 4.4–40.7) in chronic CNA patients. In acute CNA patients, the median DPD:CRT was 7.8 (range 3.1–38.4) versus 8.8 (range 3.3–18.6) in chronic CNA patients. Key study outcomes are shown in [Figure 1], [Figure 2], [Figure 3], [Figure 4].

Conclusion: The lack of significant between-group differences in BSAP and DPD:CRT calls into question their reliability for monitoring pharmaceutical effectiveness in acute versus chronic CNA.














  Abstract Number 23 Top


The Borescope: An Adjunct in Sterile Processing Department Quality Assurance

K. C. Kelley 1, J. J. Lukaszczyk 1, T. Bennett 1, R. Castillo 1, B. A. Hoey 1, S. P. Stawicki 1

1 General Surgery Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: The borescope is a semi-rigid fiber-optic device originally developed as an “inspection camera” to look down small spaces and cavities. At our institution, the borescope was introduced as a tool to inspect surgical instruments after our sterile processing department (SPD) discovered that internal lumens of some surgical instruments contained poststerilization biological residue (e.g., skin, bone, blood, and rust) and internal damage not otherwise detected. We hypothesized that the implementation of a borescope-based SPD inspection protocol would result in improved overall process quality and potentially fewer surgical site infections (SSIs).

Methods: We performed an institutional review board-exempt review of our institution's SPD Scorecard reports and all-reported SSIs between January 2018 and June 2019. After the Healthmark flexible scope (Fraser, Michigan) was introduced in our SPD (August 2018), the instrument cleaning process was modified to include these steps: (a) initial decontamination of instruments; (b) initial borescope examination to determine cleaning adequacy; (c) additional cleaning if required (e.g., brushing/air and sterile water flushing); and (d) re-inspection to ensure complete cleaning before final sterilization. Results are presented utilizing descriptive statistics, focusing on comparisons of pre/post-borescope periods.

Results: Preliminary findings after implementation of the borescope (August 2018) show a decrease in the percentage of dirty instrument trays identified by operating room (OR) staff (0.0003% vs. 0.0002%) and a decrease in reported SSIs (5.9/month vs. 3.0/month).

Conclusion: In the 10 months since its implementation, the borescope has shown promise in reducing the incidence of postprocessing soiled instruments at our institution. Identification of soiled instruments in the OR leads to greater resource consumption and potential time delays. Consequently, we propose that the borescope should be utilized more widely in the SPD and that the incidence of refractory instrument residue should be tracked as an important quality measure.










  Abstract Number 24 Top


Implementation of an Enhanced Recovery after Surgery Protocol for Video-Assisted Thoracoscopic Surgery Lobectomy Decreases Perioperative Opioid Use

Z. A. Frenzel 1, R. Fontem 1, A. L. Gifford 1

1 General Surgery Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: With the ongoing nationwide opioid crisis, reduction in opioid use remains a priority for hospitalized patients. The initiation of an enhanced recovery after surgery (ERAS) protocol for thoracic video-assisted thoracoscopic surgery (VATS) procedures is one example. Before initiation of the ERAS protocol, VATS procedures relied mostly on epidural analgesia or opioid patient-controlled analgesia for pain control, which may not reduce opioid requirements. ERAS protocols were introduced in the surgical community in the late 1990s and implemented to reduce the patient's perioperative stress response, ideally yielding improved pain control, a reduced rate of complications, decreased hospital length of stay (LOS), and decreased readmission rates. Beginning in 2016, Exparel ® (liposomal bupivacaine) was used in intercostal nerve blocks to help reduce opioid requirements during the perioperative period. In 2018, the thoracic surgery, anesthesia, perioperative nursing, and pharmacy departments worked together to construct and implement an ERAS protocol for thoracic procedures. This study highlights the use of intraoperative Exparel ® alone or in combination with an ERAS protocol in patients undergoing VATS lobectomies. The primary outcome measure was ERAS protocol efficacy in reducing opioid requirements (calculated as daily average oral morphine equivalent [OME] requirements) [Table 1]. Secondary outcome measures included the effect of these interventions on hospital LOS and 30-day readmission rates.

Methods: All patients undergoing elective VATS lobectomy procedures from January 2016 to January 2020 were included. Patients were consented for Exparel ® and the ERAS protocol during their final preoperative visit. Components of the thoracic ERAS protocol include preoperative patient education, preoperative nonnarcotic pain medications (Tylenol, gabapentin, and Celebrex), use of a preoperative carbohydrate drink to curb fasting, DVT prophylaxis, maintenance of euvolemia and normothermia, regional anesthesia (via intercostal nerve blocks with Exparel ®), antiemetic use, opioid-sparing analgesia (ketamine and magnesium), postoperative nonnarcotic pain medications (Tylenol, gabapentin, and/or NSAIDs), early chest drain removal, avoidance of urinary catheters, and early mobilization after surgery. The ERAS protocol was initiated for patients scheduled for elective VATS lobectomy procedures requiring hospital admission. The ERAS protocol was instituted upon patient presentation to the hospital, and Exparel ® was administered intraoperatively by the operating surgeon. Our baseline data group consists of 20 patients from January 2016 to March 2016, who did not receive Exparel ® intraoperatively and were not part of an ERAS protocol (NoEx/NoEr). From this baseline group, we determined opioid utilization, LOS, and readmission rate. Our study groups included 142 patients from March 2016 to September 2018 who received Exparel ® but were not part of an ERAS protocol (Ex/noEr) and 83 patients from October 2018 to January 2020 who were part of an ERAS protocol and received Exparel ® intraoperatively (Ex/Er). OME requirements, LOS, and readmission rates were also obtained. Data were collected by chart review with surgery and anesthesia staff in EPIC to determine the OME requirements, LOS, and 30-day readmission. Daily average opioid requirements were measured throughout the patient's hospital stay. Compliance with individual ERAS protocol interventions was also investigated.

Results: Two hundred and forty-five patients underwent VATS lobectomy during the study period: 20 without intraoperative Exparel ® or the ERAS protocol (noEx/noEr), 142 with intraoperative Exparel ® but without the ERAS protocol (Ex/noEr), and 83 with intraoperative Exparel ® under the ERAS protocol (Ex/Er). Key study results are listed in [Figure 1], [Figure 2], [Figure 3] and [Table 1], [Table 2], [Table 3]. Eighty-four percent of the patients in the Exparel ®/ERAS group had very low OME requirements (0–40), hence low opioid use [Table 2], compared to 30% of the patients who received neither Exparel ® nor the ERAS protocol. Exparel ® use plus the ERAS protocol resulted in a decreased LOS (3.92 ± 3.1 days) versus Exparel ® use alone (4.12 ± 2.8 days) and no Exparel ®, no ERAS (5.09 ± 2.8 days) [Figure 2]. Exparel ® use plus implementation of the ERAS protocol also resulted in decreased readmission rates compared to Exparel ® use alone or neither intervention, with readmission rates of 2.4%, 5.6%, and 5%, respectively [Figure 3]. However, the average cost per hospital admission was higher with the use of Exparel ® and the ERAS protocol ($21,263.27 ± 12,465) versus Exparel ® use only ($17,420.27 ± 7561) versus no Exparel ®, no ERAS ($18,265 ± 5175) [Figure 4]. While these cost differences may be fiscally relevant, our analysis did not reveal statistical significance. There was >87% compliance in nine out of the 15 measures of the ERAS protocol [Table 3].

Conclusion: Exparel ® intercostal nerve block in addition to the ERAS protocol improves certain postoperative outcome measures, such as pain control (decreased opioid requirements), length of hospital stay, and readmission rates. On the other hand, Exparel ®/ERAS use was associated with increased cost. Randomized controlled studies should be performed to investigate these findings, especially given the small sample size of the nonintervention group. Since the start of the thoracic ERAS protocol for VATS procedures, other surgical services have become interested in starting their own ERAS protocols, to name a few: Colorectal Surgery (colorectal ERAS protocol), Urology (cystectomy ERAS protocol), and Gynecologic Oncology (hysterectomy ERAS protocol).


  Abstract Number 25 Top


Abdominopelvic Computed Tomography Utilization and Age in Emergency Department Patients over Age 65 Presenting with Abdominal Pain

M. L. Grimaldi 1, J. M. Shugars 1, H. A. Stankewicz 1

1 Emergency Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Abdominal pain is one of the most common chief complaints encountered in the emergency department (ED), accounting for approximately 8.8% of all visits in the United States in 2015. Approximately 3%–4% of all ED visits in patients over age 65 years are due to abdominal pain. Multiple studies have demonstrated that the rates of surgical intervention, morbidity, and mortality are substantially higher in older patients with abdominal pain relative to younger patients. Older patients also tend to present with an atypical history, physical examination, and laboratory findings, increasing the value of imaging in the evaluation of these patients. We aimed to determine whether computed tomographic (CT) utilization increases with increasing age in a cohort of ED patients over age 65 years presenting to an ED within the St. Luke's University Health Network (SLUHN) hospital system.

Methods: We conducted a multicenter, retrospective review of patients over age 65 years presenting to an ED within the SLUHN hospital system with the chief complaint of abdominal pain from August 1, 2016, to January 10, 2018. Our study was approved by the institutional review board. The St. Luke's system has an emergency medicine residency at its Bethlehem, Pennsylvania campus but otherwise consists of multiple community hospitals in Pennsylvania and one in New Jersey. Patient charts were abstracted from the electronic medical record (EMR), Epic (Verona, WI, USA), and then analyzed for several key variables: patient age, sex, and whether abdominopelvic CT was performed during the patient's ED course. Patients who left the ED without being seen were excluded from the analysis. ED-to-ED transfers within the St. Luke's system generate two separate encounters in the EMR; the second encounter was excluded from the analysis. Patients were stratified by age into four categories: 65–69, 70–79, 80–89, and 90 and older. The rate of CT utilization in each age group was determined. We then assessed for statistically significant differences in the CT rate between these age groups using a Chi-square test.
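
A minimal sketch of the age-stratified Chi-square comparison; the counts below are hypothetical approximations back-calculated from the reported rates, for illustration only:

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical counts of [CT performed, no CT] per age group (65-69, 70-79, 80-89, >=90);
    # the published analysis used the actual stratified counts from the 1438-patient cohort.
    table = np.array([
        [328, 72],    # 65-69
        [411, 89],    # 70-79
        [322, 65],    # 80-89
        [114, 38],    # >=90
    ])

    chi2, p, dof, expected = chi2_contingency(table)
    rates = table[:, 0] / table.sum(axis=1) * 100
    print("CT rate by age group (%):", np.round(rates, 1), "| P =", round(p, 3))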

Results: We reviewed a total of 1451 patient charts. After initial review, 10 duplicate encounters resulting from ED to ED transfers and three patients who left without being seen were excluded from the analysis. This left 1438 patients who were included in the analysis. The average patient age was 75.6 years, and 61.8% of the patients were female. Overall, 82% of the patients underwent abdominopelvic CT. The rate was 82.1% in patients aged 65–69 years, 82.2% in patients aged 70–79, 83.2% in patients aged 80–89, and 75% in patients aged over 90. There was no statistically significant difference in CT utilization between age groups (P = 0.35).

Conclusion: Two prior studies have examined CT utilization in older patients presenting to an ED. The CT rate was 37% for patients over age 60 presenting with abdominal pain in a 2005 multicenter study conducted in the United States. Another study conducted in 2007 at a single center in the United States found a CT rate of 59% in patients over age 60 presenting with abdominal pain. CT utilization was higher in our cohort, at 82% overall. It is likely that the uniformly high CT rate in our cohort made it less likely that there would be significant variation by age group. We are presently conducting a review of our data to evaluate the diagnostic yield of CT and the impact of CT on patient management in our cohort. As part of this analysis, we will also determine the most common pathologies seen on CT, the rate of surgical intervention, patient disposition, and in-hospital mortality in these patients.
















  Abstract Number 26 Top


Effect of Disposition on Patient Satisfaction in the Emergency Department

H. J. T. Kleiman 1, H. A. Stankewicz 1, R. A. Patterson 1, P. Kaur 1, S. Koo 1

1 Emergency Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Patient satisfaction, as measured by surveys following discharge from the emergency department (ED), is an important metric for ED physicians. However, 8%–20% or more of patients seen in the ED are admitted to the hospital, and ED physicians rarely receive feedback from these patients. ED physicians often spend more time both at the bedside and charting for admitted patients, and we propose that the satisfaction ratings of these patients may be higher than those of discharged patients.

Methods: This single-center, institutional review board-approved study assessed patient satisfaction in an ED staffed by both resident and attending physicians. Satisfaction surveys were presented to the patients directly after disposition was selected and before leaving the ED. Surveys were presented by nonbiased research assistants who were uninvolved in patient care. Patients were asked to rate their level of satisfaction on a Likert scale from 0 (very dissatisfied) to 5 (very satisfied) in the following categories: physician courtesy toward patient, physician listening to patient, physician keeping patient informed, physician's concern for patient comfort, length of patient wait time, treatment of patient pain, and overall experience. Patients were asked to complete separate surveys for the resident and attending involved in their care.

Results: A total of 267 patients completed a total of 525 surveys. Ninety-seven patients (193 surveys) were admitted, and 166 patients (320 surveys) were discharged. The valid percentage of physicians receiving a rating of 5 from patients who were admitted versus discharged was as follows: courtesy (69.1 vs. 63.0), listening (70.5 vs. 60.1), keeping the patient informed (67.7 vs. 60.8), concern for comfort (61.5 vs. 59.9), length of wait time (48.4 vs. 45.0), treatment of pain (68.1 vs. 52.9), and overall satisfaction (68.5 vs. 54.9).

Conclusion: Of the patients who participated in this study, admitted patients more frequently gave physicians the highest satisfaction rating (5) in all categories. It can be surmised from these data that admitted patients are more likely to have a positive experience in the ED than discharged patients. The patient satisfaction ratings of both admitted and discharged patients should therefore be accounted for when assessing an ED physician's performance. This is the first study to show that disposition has a significant effect on patient satisfaction ratings.


  Abstract Number 27 Top


Retrospective Study of Same-Patient Admissions for Consecutive Injury Events: A High-Volume Level I Trauma Center Experience

R. W. Van De Graaf 1, B. A. Hoey 1, R. Castillo 1, S. N. DeTurk 1, P. G. Thomas 1, J. Cipolla 1, J. B. Wilson 1, S. Stawicki 1

1 Department of General Surgery Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Trauma continues to be one of the leading causes of death across all age groups. Despite a significant amount of research dedicated to understanding the anatomical and physiological aspects of injury, there continues to be a paucity of information regarding patterns of same-patient, recurring trauma (SPRT) events over long periods of time. The aim of our study was to determine the incidence and temporal characteristics of SPRT at a busy regional level I trauma center (L1TC).

Methods: We performed a retrospective review of our institution's L1TC registry between January 1998 and February 2019. Records of all patients who had more than one trauma admission during the study period were abstracted and compared to those of patients who had only one trauma encounter. Readmissions related to the immediately preceding injury were excluded. Outcome parameters included patient demographics, hospital length of stay (LOS), mechanism of injury (blunt vs. penetrating), injury severity score (ISS), time between admissions, and number of total SPRT events per patient. Statistical comparisons were made using appropriate parametric and nonparametric tests, with statistical significance set at P < 0.05.
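
A minimal, hypothetical sketch of how recurrent-trauma encounters and the intervals between consecutive admissions can be derived from a registry extract; the column names are assumptions, and the actual registry query may have differed:

    import pandas as pd

    # 'registry' is assumed to hold one row per trauma admission with columns:
    # 'patient_id', 'admit_date' (datetime), 'iss', 'mechanism', 'los_days'.
    def summarize_sprt(registry: pd.DataFrame) -> pd.DataFrame:
        reg = registry.sort_values(["patient_id", "admit_date"]).copy()
        reg["encounter_n"] = reg.groupby("patient_id").cumcount() + 1
        reg["days_since_prior"] = reg.groupby("patient_id")["admit_date"].diff().dt.days
        # Keep only patients with more than one admission (SPRT patients)
        sprt = reg[reg.groupby("patient_id")["patient_id"].transform("size") > 1]
        # Mean interval between consecutive events, by encounter number (2nd, 3rd, ...)
        return (sprt[sprt["encounter_n"] > 1]
                .groupby("encounter_n")["days_since_prior"]
                .mean()
                .to_frame("mean_days_between"))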

Results: Approximately 3.8% of trauma admissions (1281/33,569) were attributed to SPRT events. Compared with single-admission patients, the SPRT group had a similar proportion of women (40.1% vs. 38.9%), a lower ISS (8.47 ± 0.27 vs. 8.83 ± 0.24), an older age (58.7 ± 0.86 vs. 48.1 ± 0.76 years), and a slightly longer LOS (4.06 ± 0.17 vs. 3.96 ± 0.13 days). We noted that the average number of days between consecutive trauma events decreased with each additional encounter [Table 1]. Not unexpectedly, patient age increased within this longitudinal cohort [Table 1]. Of note, patients tended to present with injury mechanisms similar to those of the index admission (blunt injury following blunt injury, 91%; penetrating injury following penetrating injury, 57%) and did not appear to have elevated mortality during consecutive SPRT events (overall mortality, 3%).

Conclusion: Our study shows a crescendo pattern of SPRT episodes within our L1TC long-term registry record. Better understanding of risk factors associated with SPRT will be critical in designing and implementing long-term injury prevention programs. Further research in this important area is warranted.


  Abstract Number 28 Top


Emergency Medicine Resident Procedure Documentation versus Procedure Logs

J. Longenbach 1, H. A. Stankewicz 1

1 Emergency Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Emergency medicine residents are required to complete and log a certain number of procedures throughout their residency in order to graduate. Procedure notes are also required on all patients who undergo certain procedures. Oftentimes, the notes are completed to satisfy the checks that our electronic medical record (EMR) places on note completion, but the process of logging the procedure occurs through a different medium and is neglected due to time constraints. The goal of this study was to determine whether the procedure notes completed by residents or the procedure log records are a more accurate representation of the total procedures completed by each resident. There are 15 core procedures in which each resident is required to be competent by the time they graduate from residency. One way to track this is to have residents log each procedure they complete on a third-party website, New Innovations (www.new-innov.com). Logging procedures on this website is usually of secondary importance to completing clinical notes for the day and is often forgotten. It is possible to pull data from our EMR showing how many of each procedure note each resident has completed, and these numbers can then be compared to the residents' procedure logs on New Innovations. These data could potentially be used to help credential residents in certain procedures should the EMR numbers be higher.

Methods: This was a retrospective study conducted at an emergency medicine residency program at a level 1 trauma center. The research assistant compiled data from the New Innovations website for all 36 residents in the program from July 1, 2017, to June 30, 2018. The procedures evaluated were intubations, central line insertions, chest thoracostomies, procedural sedations, and lumbar punctures. Data were extracted from the EMR for the same residents and the same procedures during the same timeframe. These numbers were then statistically analyzed.
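
A minimal, hypothetical sketch of the per-resident comparison between New Innovations log counts and EMR procedure-note counts; the data layout and column names are assumptions for illustration:

    import pandas as pd

    # 'ni' and 'emr' are each assumed to hold one row per resident-procedure pair with columns
    # 'resident', 'procedure', and 'count' (New Innovations log entries vs. EMR procedure notes).
    def compare_logs(ni: pd.DataFrame, emr: pd.DataFrame) -> pd.DataFrame:
        merged = ni.merge(emr, on=["resident", "procedure"],
                          suffixes=("_ni", "_emr"), how="outer").fillna(0)
        # Positive difference: more entries logged in New Innovations than documented in the EMR
        merged["difference"] = merged["count_ni"] - merged["count_emr"]
        return merged.groupby("procedure")["difference"].agg(["mean", "median", "skew"])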

Results: The majority of residents had more procedures logged in New Innovations than in the EMR. A negative value (green in [Table 1]) indicates that more procedures were logged into the EMR than into New Innovations; a positive value (red) indicates the opposite. Of the 180 data points evaluated, only 15 were negative [Table 1]. Five were a net of zero, leaving 160 comparisons in which residents logged more procedures into New Innovations than into the EMR. Furthermore, in analyzing the mean, median, and mode of these data, only the central venous catheter (CVC) procedure showed a positively skewed frequency distribution [Figure 1]. Intubations, on the other hand, showed the opposite trend.

Conclusion: The only data to show a positive skew came from the CVC results. This suggests that, with more data, a relationship between logged central lines and procedure notes could argue more in favor of the initial hypothesis; however, the positive skew is minimal at best. Ultimately, it appears that, at this time, the third-party website is indeed required for residents to meet their procedure goals by the end of residency. However, if residents were aware that data pulled from the EMR would count toward their graduation requirement, could that be enough to change the balance? Moreover, if there were a section of the procedure note where a "supervisor" or "assistant" resident could be captured, that would address the primary issue mentioned above. Finally, for the 15 data points that did agree with the hypothesis, these could perhaps be added to the respective residents' totals to provide a more accurate account of their completed procedures, which could help them in the future with credentialing.




  Abstract Number 29 Top


A Survey of Wellness in Emergency Medicine Residents: Effects of the COVID-19 Pandemic

J. Paster 1, J. C. Stoltzfus 1, H. A. Stankewicz 1

1 Emergency Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Physician wellness has long been a hot topic among attending and resident physicians alike. Multiple assessments, psychometric tests, mindfulness exercises, and "wellness guides" have been developed and implemented by residency programs throughout the country. Our program had a unique opportunity to compare responses to a wellness questionnaire given to emergency medicine residents a few months before and again after the onset of the COVID-19 pandemic.

Methods: This was an institutional review board-approved survey study conducted at an emergency medicine residency program at a level 1 trauma center in Eastern PA. The initial questionnaire was provided to each resident during Grand Rounds in November 2019 and was filled out anonymously to reduce measurement bias. The survey was partially derived from a previously validated sleep index, the Pittsburgh Sleep Quality Index. The questionnaire was again provided to the same residents on April 15, 2020, via e-mail by a research assistant.

Results: A total of 34 residents participated. There were no significant changes in hours spent sleeping, perceived difficulty in falling asleep at night, sleep disturbances, or exercise frequency. The question, "How difficult is it for you to fall asleep during the day?" demonstrated a decrease in the selection of "very easy," from 29.4% to 14.7%. There was also a noted difference in "level of enthusiasm to get things done." Prepandemic surveying showed that 61.8% of the residents had either "no problem" or "only a very slight problem," with 35.3% answering "somewhat of a problem" or "a very big problem." Postpandemic surveying showed 50% and 50%, respectively.

Conclusion: There was a significant decrease in self-assessed productivity and motivation among emergency medicine residents during the COVID-19 pandemic, when compared to the same questionnaire given before the pandemic. There was also an increase in perceived difficulty in falling asleep during the day. The goal of this study was not to prove that a pandemic is stressful, but rather to show how stress can manifest in resident physicians who are already conditioned to handle an exceedingly stressful environment, albeit early in their training. This is an unprecedented time to be an emergency medicine resident, and we hope that this study contributes to our understanding and prevention of physician burnout.






  Abstract Number 30 Top


Bridging the Gap: The Utilization of Electronic Health Records to Increase Metformin Prescription

M. Krinock 1, T. Chamakkala 1, V. Yellapu 1, R. A. Sluder 1, M. E. Widawski 1, J. T. Hippen 1

1 Internal Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Few diseases have had as significant an impact on health in the United States as diabetes. Currently, one in 10 people in the United States is diagnosed with diabetes, a figure projected to increase to one in five by 2025.[1] Diabetes has long been known as a significant cause of morbidity and mortality, affecting organs from the cardiovascular system to the eyes.[1] Beyond this, diabetes also carries a tremendous price tag, with an estimated $245 billion per year in healthcare costs and lost productivity.[2] In recent years, there have been numerous advances in treatment options for diabetes, but unfortunately, cost precludes many of the patients most affected by the disease from receiving them. High costs can limit patients to only sulfonylureas, pioglitazones, and metformin.[3] Of these, sulfonylureas and pioglitazones have significant side effects that limit their use, in contrast to metformin, which has been a long-term staple of antidiabetic regimens.[3],[4] Despite its effectiveness, low cost, proven effect on A1C, and favorable side effect profile, metformin is under-prescribed, with one study reporting that only 64% of diabetics take metformin.[5] This paucity of metformin prescriptions exposes a gap in care for diabetics, hindering them from lowering their A1C in the most cost-effective manner possible. Given this deficiency, we set out to perform a quality improvement project to increase metformin prescription among residents at an internal medicine clinic.

Methods: This prospective intervention was conducted at a single internal medicine residency clinic. After receiving institutional review board approval, residents were randomly assigned into two groups (A and B). Group A utilized a clinic visit note template that included a "hard stop," which required addressing whether the patient was diabetic and, if so, whether they were prescribed metformin. Group B did not have a "hard stop" in their template. Both groups received weekly reminders via a secure text messaging service that included education on the safety and efficacy of metformin. Other education topics included contraindications to metformin prescribing and dose adjustment recommendations per society guidelines. Pre- and post-intervention data were recorded for each group and tracked via Epic and Tableau™ software using the standard institutional protocol by nonstudy-participating moderators. Tracked data included the number of diabetics, the percentage of diabetics prescribed metformin, and the number and percentage of diabetics who were poorly controlled (PC) (defined as an A1C greater than 8%). The data were subsequently analyzed using SPSS (IBM, Armonk, NY) statistical software, utilizing two-tailed paired t-tests comparing values before intervention implementation and after 4 months to determine statistical significance.
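
A minimal sketch of the pre/post comparison with a two-tailed paired t-test; the values are placeholders, and the pairing unit (the individual resident's panel percentage) is an assumption for illustration:

    from scipy import stats

    # Placeholder values: % of each resident's diabetic panel prescribed metformin at baseline
    # and after 4 months of the "hard stop" intervention (Group A).
    baseline  = [55.0, 60.2, 48.7, 62.5, 57.1, 59.9, 54.3, 64.0]
    follow_up = [61.3, 66.0, 55.2, 66.7, 60.8, 65.1, 60.0, 69.4]

    t_stat, p_value = stats.ttest_rel(baseline, follow_up)
    print(f"Paired t = {t_stat:.2f}, two-tailed P = {p_value:.4f}")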

Results: A total of 476 diabetics were included in the original cohort, with a total of 508 in the final analysis. The average percentage of diabetics prescribed metformin was 57.85% at baseline in Group A ("hard stop" + education group), while in Group B (education only) it was 68.38%. After 4 months of intervention, Group A had 63.52% of diabetics prescribed metformin and Group B had 68.18%. At baseline, 34.1% of the patients in Group A were defined as PC and 34.39% in Group B. At follow-up, Group A had 31.97% PC and Group B had 39.39% [Table 1] and [Table 2]. We identified a statistically significant increase in metformin prescription over the 4-month period in the "hard stop" template Group A (P = 0.02). We did not see the same increase in metformin utilization in Group B, which had no "hard stop" intervention. In Group B, we did see an increase in PC patients over the 4 months (P = 0.011). Additional study outcomes are shown in [Table 3] and [Figure 1], [Figure 2].

Conclusion: As the burden of diabetes increases, the search for low-cost ways to lower this burden remains of continual interest.[3] Despite metformin's minimal side effect profile, reduction in A1C, and low cost, few diabetics receive it, representing a significant gap in health care.[4],[5] In our study, we found that education alone was not enough to bridge this gap: we demonstrated a significant increase in the percentage of diabetics prescribed metformin using a “hard stop” in resident visit note templates versus education alone. Notably, the education group's PC percentage increased, which may be explained by the study taking place over the winter, when A1C values are known to worsen. Through this study, we were able to demonstrate a promising avenue for using electronic health records to bridge this gap and increase metformin prescription.


  References Top


  1. Albright AL, Gregg EW. Preventing type 2 diabetes in communities across the U.S.: The National Diabetes Prevention Program. Am J Prev Med 2013;44:S346-51.
  2. Tabano DC, Anderson ML, Ritzwoller DP, Beck A, Carroll N, Fishman PA, et al. Estimating the impact of diabetes mellitus on worker productivity using self-report, electronic health record and human resource data. J Occup Environ Med 2018;60:e569-74.
  3. Vaughan EM, Rueda JJ, Samson SL, Hyman DJ. Reducing the burden of diabetes treatment: A review of low-cost oral hypoglycemic medications. Curr Diabetes Rev 2020;2: DOI: 10.2174/1573399816666200206112318. [Epub ahead of print]
  4. Pawlyk AC, Giacomini KM, McKeon C, Shuldiner AR, Florez JC. Metformin pharmacogenomics: Current status and future directions. Diabetes 2014;63:2590-9.
  5. Kannan, Arshad, Kumar S. A study on drug utilization of oral hypoglycemic agents in type-2 diabetic patients. Asian J Pharm Clin Res 2011;4:60-4.



  Abstract Number 31 Top


A Unique Neurological Presentation of Granulomatosis with Polyangiitis (Wegener's Granulomatosis) with Polyneuropathy, Myopathy, and Small-Vessel Stroke

A. Shaji 1, S. Devarinti 1, D. Raheja 1, R. Leung 1

1 Neurology Residency Program, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Granulomatosis with polyangiitis (Wegener's granulomatosis) is a necrotizing, granulomatous vasculitis of small- and medium-sized arteries, mainly affecting the respiratory tract and kidneys; however, it can also affect the central and peripheral nervous systems. Here, we report a rare and complicated neurological presentation of granulomatosis with polyangiitis, featuring polyneuropathy, polymyalgia rheumatica (PMR), and small-vessel stroke.

Case Scenario: A 60-year-old woman with a history of peripheral vascular disease, hypertension, and osteoarthritis presented with a 4-month history of progressive neck and back pain and dysesthesias of all four extremities. The initial neurological examination revealed normal motor strength in all muscle groups, acral sensory loss to all modalities, and decreased reflexes, consistent with neuropathy. She had an episode of transient diplopia 6 weeks after presentation, with a negative brain MRI. She was later hospitalized for diplopia, imbalance, bilateral ptosis, and ophthalmoplegia. The examination was significant for splinter hemorrhages on several nail beds and diffuse myofascial tenderness. Neurological examination showed mild ptosis on the right, right-sided internuclear ophthalmoplegia, proximal muscle weakness, acral sensory loss, diffuse hyporeflexia, and bilateral dysmetria. Blood workup showed an elevated erythrocyte sedimentation rate (ESR) of 81 and C-reactive protein (CRP) of 61, a positive rheumatoid factor (RF), and cytoplasmic antineutrophil cytoplasmic antibodies (c-ANCA) at a titer of 1:80, with normal vitamin B1 and B6, mildly low B12 and vitamin D, and negative antinuclear antibody (ANA), Sjogren's antibodies, antiphospholipid antibodies (APLA), acetylcholine receptor (AChR) antibody, muscle-specific kinase (MuSK) antibody, serum protein electrophoresis (SPEP), and urine protein electrophoresis (UPEP). Cerebrospinal fluid was normal. Electromyogram showed polyneuropathy with mixed axonal and demyelinating features. MRI of the brain showed restricted diffusion in the dorsal median midbrain in a V-shaped morphology, as has been reported in Wernekink commissure syndrome involving the inferior paramedian mesencephalic arteries. Computed tomography of the chest showed multiple pulmonary nodules.

Conclusion: Overall, our patient's clinical syndrome is consistent with a small-vessel vasculitis, affecting both central and peripheral nervous systems, resulting in peripheral neuropathy, PMR versus myopathy, and small-vessel stroke. She was treated with high-dose intravenous methylprednisolone followed by oral steroid taper and rituximab with clinical improvement.












  Abstract Number 32 Top


Case Report of Delusional Disorder in an Older Adult Female: Impact of First-Generation Antipsychotic Monotherapy

K. R. Munzenmaier, F. Sholevar, A. Thomas

Psychiatry Residency, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Delusional disorder is an uncommon and treatment-resistant psychiatric disorder in which an individual holds one or more delusions, in the absence of prominent hallucinations or mood symptoms, while maintaining baseline functioning. Of the many forms these delusions take, the persecutory type is the most common. Rigorous studies examining the treatment of delusional disorder are lacking, leading psychiatrists to rely on case reports. These were reviewed in 2006 and again in 2015, citing a number of possibly efficacious treatments including typical and atypical antipsychotics, antidepressants, cognitive behavioral therapy, and even electroconvulsive therapy. Further, multiple comprehensive literature reviews have been unable to identify a best first-line treatment. The most recent Cochrane review confirms this, stating that there is “insufficient evidence to make recommendations for treatments of any type.” Here, we present an interesting case of persistent delusional disorder responsive to monotherapy with a first-generation antipsychotic, fluphenazine.

CARE Statement: Delusional disorder is a rare psychiatric illness, the criteria for which are demonstrated in this memorable and unusual case. Despite displaying some criteria of other, more common disorders such as bipolar disorder or schizophrenia, the patient never met criteria for these illnesses and maintained her baseline functioning apart from the ramifications of her delusion. Further, this case provides insight into possible first-line treatments. Generally, patients with this complex illness are managed with a range of treatments concomitantly, making it very difficult to elucidate the exact cause of patients' improvement. However, in this case, the patient improved dramatically following only one change, the consistent outpatient use of fluphenazine, suggesting that this medication should be considered as a first-line treatment.

Case Scenario: The patient is a single, retired woman in her mid-70s with a long psychiatric history centering around one delusion. For at least 25 years, she has held the belief that she is being persecuted by a man she met in the mid-1960s, who has widely slandered her as promiscuous and subservient, broadcasting her behaviors and spreading these false ideas on the internet. She believes that his actions have been legitimized by a prestigious law firm to the point that many people living in the area hold the same false views of her. She has gone to the police and written to senators and national women's organizations to plead her case and insist on the man's arrest, to no avail. She stated that she even moved across the country and took on an alias in an unfruitful effort to escape his persecution. Moreover, the patient maintained this delusion despite evidence that the man is likely deceased and despite multiple attempts to demonstrate the unlikelihood of her delusion. Although the history is limited by the patient's memory and medical record availability, she was involuntarily hospitalized at least 6 times in a 19-month period, with many more hospitalizations reported before that, including two stays in a state hospital. The patient was usually brought to the emergency room by police who found her harassing strangers and insisting that what they had heard about her was untrue. The patient was largely noncompliant with her medications and psychiatric follow-up during that time due to her belief that she was not mentally ill. She was treated with multiple antipsychotics while hospitalized and eventually had some response to fluphenazine. Although she continues to maintain that she is being stalked by the same man, her compliance with outpatient fluphenazine and psychiatry follow-up has decreased the intrusiveness of her delusion such that she does not fervently bring up the topic as before and has not been brought to the hospital by police since 2017.

Conclusion: This case of an older adult with delusional disorder exemplifies the chronic course of the illness, the pervasiveness of delusions, and the repercussions that can occur. Despite persistence of her persecutory delusion, the patient eventually achieved clinical remission, attributed to consistent use of oral and intramuscular fluphenazine. This contrasts with previous literature stating that the exact source of improvement is impossible to identify because of the range of treatments employed at once. Previous reviews have also suggested that antipsychotic monotherapy is often insufficient and that atypical antipsychotics may be favored over typicals. The presented case, however, demonstrates the durable efficacy of fluphenazine monotherapy. This suggests that typical antipsychotics should be strongly considered in patients experiencing social and legal ramifications as a result of their delusion, to decrease the overall burden of their illness.


  Abstract Number 33 Top


Coronary Artery Fistula in Setting of Cardiomyopathy: To Ligate or Not to Ligate?

K. Shah, I. Taha, M. Krinock, P. Thacker, C. E. Ruggeri

Departments of 1 Internal Medicine Residency and 2 Cardiology Fellowship

Introduction: Coronary artery fistulas (CAFs) are anomalous connections between a coronary artery and other cardiovascular structures.[1],[2],[3] CAFs that connect to right-sided cardiac structures cause left-to-right cardiac shunting, while left-sided connecting fistulas resemble aortic insufficiency in both physiological parameters and cardiac auscultation.[4] CAFs are often clinically benign but have been reported to cause serious complications, including heart failure, myocardial infarction (MI), thrombosis, embolism, rupture, and sudden cardiac death.[1],[2] The gold standard diagnostic test for CAF is coronary angiography. After diagnosis, the decision to intervene is controversial but can be aided by the Konna criteria.[5],[6] These criteria include a left-to-right shunt ratio >30%, right ventricular (RV) ischemia/volume overload, presence of pulmonary hypertension/heart failure, and the presence or history of infective endocarditis or aneurysm formation.[5] The 2018 AHA/ACC guidelines did not give specific indications for surgical intervention but did note a high postoperative MI rate of 11%, adding to the difficulty of the decision.[6] We present a patient with cardiomyopathy of unclear origin who was found to have a CAF, presenting a clinical decision that must take into account several different factors, as well as the patient's own decision on whether to proceed with surgical intervention.

Case Scenario: A 42-year-old female presented after syncope. Her medical history was significant for hypothyroidism from a prior thyroidectomy. Her presentation was originally consistent with vasovagal syncope. However, during the workup, the patient was noted to have a new left bundle branch block. She then underwent echocardiography, which showed an ejection fraction of 30% and a dilated cardiomyopathy, with normal RV size and function and normal pulmonary artery size. Laboratory studies were notable for TSH 219, T4 0.16, total CK 2355, and aldolase 14.2. The patient was also noted to have clinical features consistent with myotonic dystrophy (MD), as well as a positive family history of MD in her brother. Genetic testing for type 2 MD was negative, and testing for type 1 MD was pending. Cardiac catheterization showed normal coronary arteries and a possible left anterior descending (LAD) fistula. A cardiac computed tomography scan showed a small 2-mm vascular fistula extending from the LAD to the pulmonary artery [Figure 1].

Conclusion: No association between MD and CAF has been noted in the literature. While our patient potentially had one of the Konna indications for surgery in that she exhibited heart failure, her fistula was thought to be an incidental finding. She was managed with guideline-directed medical therapy (an angiotensin-converting enzyme inhibitor and a heart failure-approved beta-blocker) instead of ligation of the fistula, given that possible concomitant undiagnosed MD was the most likely cause of her cardiomyopathy and the patient was unwilling to accept the risks of surgery. The patient has remained clinically stable on thyroid replacement therapy with mild improvement of her ejection fraction (35%–40%), and she is currently awaiting genetic test results for her highly suspected MD.


  References Top




  1. Qureshi SA. Coronary arterial fistulas. Orphanet J Rare Dis 2006;1:51.
  2. Minhas AM, Ul Haq E, Awan AA, Khan AA, Qureshi G, Balakrishna P. Coronary-cameral fistula connecting the left anterior descending artery and the first obtuse marginal artery to the left ventricle: A rare finding. Case Rep Cardiol 2017;2017:8071281.
  3. Alammar AK. Coronary artery fistulae discovered during presentation of a patient having heart failure due to severe aortic stenosis. Case Rep Cardiol 2014;2014:213673.
  4. Loukas M, Germain AS, Gabriel A, John A, Tubbs RS, Spicer D. Coronary artery fistula: a review. Cardiovasc Pathol 2015;24:141-8.
  5. Takeuchi N, Takada M, Nishibori Y, Maruyama T. A case report of coronary arteriovenous fistulas with an unruptured coronary artery aneurysm successfully treated by surgery. Case Rep Cardiol 2012;2012:314685.
  6. Stout KK, Daniels CJ, Aboulhosn JA, Bozkurt B, Broberg CS, Colman JM, et al. 2018 AHA/ACC guideline for the management of adults with congenital heart disease: A Report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines. J Am Coll Cardiol 2019;73:e81-192.



  Abstract Number 34 Top


Effect of Perceptions of Wait Time on Physicians' Evaluations in the Emergency Department

A. Suri 1, H. A. Stankewicz 1, H. E. Barnes 1

1 Emergency Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: With the development of the Emergency Department (ED) Patient Experience of Care Survey by the Centers for Medicare and Medicaid Services (CMS), there has been increasing interest in the factors influencing patients' experience of care in the ED.[1] Differing from patient satisfaction, CMS defines patient experience as having a “focus on how patients experienced or perceived key aspects of their care.”[2] Nevertheless, the outcomes of patient experience surveys rely heavily on patients' evaluations of the physicians delivering care. Previous studies have found that the results of overall satisfaction surveys are influenced by factors beyond provider–patient interactions, such as wait time.[3] The relationship between wait time and patient experience is especially important in emergency care settings, where wait times can be lengthy due to ED overcrowding. Perceived wait times have been demonstrated to have a strong impact on overall ratings of ED visits,[4],[5],[6] and longer wait times have been associated with lower overall satisfaction scores.[7] While studies have shown the effect of perceived wait times on overall ED experience evaluations, no studies have investigated the relationship between perceived wait time and patients' evaluations of the providers delivering care.[8] The purpose of this study is to determine whether perceived wait times in a suburban ED have a significant impact on the postcare evaluation of emergency care physicians.

Methods: Patients who visited the ED at St. Luke's University Health Network in Fountain Hill, PA, completed patient experience surveys. Once a disposition was assigned, a research assistant showed the patient a picture of each physician who had cared for them in the ED and asked a series of questions. Surveys were modeled after validated measures developed by Press Ganey Associates (South Bend, Indiana). Physician evaluations covered interpersonal qualities as well as how long the patient perceived waiting for the physician during his or her stay. Physician evaluation scores followed a Likert-style rating from 1 through 5, with 1 indicating “very poor” and 5 indicating “very good.” The perceived length of time waiting for the physician followed the same 1 through 5 rating system. Data were analyzed using the Statistical Package for Social Sciences (SPSS) software version 24.0.0.1 (IBM Corporation, Armonk, NY, USA). To develop a binary outcome variable, physician evaluations were recoded into two categories, with “very poor,” “poor,” and “fair” responses coded as “poor” and “good” and “very good” responses coded as “good.” Similarly, the evaluation of the length of time waited was categorized into a binary independent variable of “poor” and “good.” Univariate analyses provided descriptive statistics for the demographics of the study population. Chi-square tests were conducted to identify significant associations, at an alpha of 0.10, between physician evaluations and categorical predictors. Where applicable, Fisher's exact test was used for variables with expected counts of less than 5. An independent sample t-test was conducted for continuous variables. Finally, binary logistic regression was conducted with physician evaluation as the dichotomous outcome and the length of time until disposition and perceived length of wait as predictors. This process was repeated separately for attending and resident physician evaluations.
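
As a rough illustration of the analytic steps described above, the following Python sketch shows a chi-square test (with a Fisher's exact fallback for small expected counts) on the dichotomized ratings, followed by a binary logistic regression with physician evaluation as the outcome. This is not the study's code; the 2 × 2 table, data frame, and variable names are hypothetical placeholders.

import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

# Hypothetical 2x2 table: rows = perceived wait ("good"/"poor"),
# columns = physician evaluation ("good"/"poor")
table = np.array([[120, 10],
                  [ 25, 11]])
chi2, p, dof, expected = stats.chi2_contingency(table)
if (expected < 5).any():                       # fall back to Fisher's exact test
    _, p = stats.fisher_exact(table)
print(f"Perceived wait vs. physician evaluation: P = {p:.3f}")

# Hypothetical patient-level data for the binary logistic regression
df = pd.DataFrame({
    "eval_good": [1, 1, 0, 1, 0, 1, 1, 0, 0, 1],              # dichotomized outcome
    "wait_good": [1, 1, 0, 1, 0, 1, 0, 0, 1, 1],              # perceived wait rating
    "minutes_to_dispo": [150, 90, 300, 120, 260, 80, 200, 310, 100, 140],
})
X = sm.add_constant(df[["wait_good", "minutes_to_dispo"]])
model = sm.Logit(df["eval_good"], X).fit(disp=0)
print(np.exp(model.params))                    # exponentiated coefficients (odds ratios)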

Results: [Table 1] summarizes the characteristics of survey respondents. In total, 201 and 170 survey responses were included in the analyses of residents and attendings, respectively. Among individuals who evaluated resident physicians, the mean age was 50 years and the average length of time until disposition was 203.9 min. Among individuals who evaluated attending physicians, the mean age was 50.4 years and the average length of time until disposition was 200.9 min. On bivariate analysis, the perceived length of waiting time was found to be significant for attending physicians, but not resident physicians (attendings: P = 0.028; residents: P = 0.303). Length of time until disposition was not significantly different between “good” and “poor” attending physician evaluations (t(4.5) = −0.462, P = 0.665), despite those reporting higher evaluations having higher mean wait times (mean [M] = 201.4, standard deviation [SD] = 128.8) compared to those reporting lower physician scores (M = 182.4, SD = 89.2). Similarly, this variable was not significantly different between “good” (M = 203.9, SD = 124.8) and “poor” (M = 200.5, SD = 150.6) resident evaluations (t(1.014) = −0.032, P = 0.979). [Table 2]a and [Table 2]b summarize the results of the multivariate analyses of attending and resident evaluations, respectively. When accounting for missing variables, 166 surveys were analyzed for attending physicians and 194 for resident physicians.

Conclusion: Results show that attending physicians' evaluations are strongly influenced by a patient's perception of waiting time. Special care should be taken when setting expectations with regard to waiting times in emergency care settings. This relationship was not observed with residents' evaluations, and it is not clear why the factors influencing evaluations differed between attendings and residents. It is possible that the length of wait time differed more among those who rated attendings compared to residents, although this difference was not significant. It is also possible that, from a patient's perspective, attending physicians were rated differently because of their more advanced training. This study analyzed data from a single ED in a larger health network, which limits its generalizability to other settings. In addition, the majority of evaluations in this analysis fell into the “good” category. Further studies would benefit from collecting more responses to increase the likelihood of capturing a wide array of scores.




  References Top


  1. Weinick RM, Becker K, Parast L, Stucky BD, Elliott MN, Mathews M, et al. Emergency department patient experience of care survey: Development and field test. Rand Health Q 2014;4:5.
  2. Centers for Medicare and Medicaid Services. Consumer Assessment of Healthcare Providers and Systems. 13 November, 2019. Available from: www.cms.gov/Research-Statistics-Data-and-Systems/Research/CAHPS. (Last accessed September 2, 2020)
  3. Aaronson EL, Mort E, Sonis JD, Chang Y, White BA. Overall emergency department rating: Identifying the factors that matter most to patient experience. J Healthc Qual 2018;40:367-76.
  4. Davenport PJ, O'Connor SJ, Szychowski JM, Landry AY, Hernandez SR. The relationship between emergency department wait times and inpatient satisfaction. Health Mark Q 2017;34:97-112.
  5. Taylor C, Benger JR. Patient satisfaction in emergency medicine. Emerg Med J 2004;21:528-32.
  6. Thompson DA, Yarnold PR, Williams DR, Adams SL. Effects of actual waiting time, perceived waiting time, information delivery, and expressive quality on patient satisfaction in the emergency department. Ann Emerg Med 1996;28:657-65.
  7. Boudreaux ED, Mandry CV, Wood K. Patient satisfaction data as a quality indicator: A tale of two emergency departments. Acad Emerg Med 2003;10:261-8.
  8. Sharp B, Johnson J, Hamedani AG, Hakes EB, Patterson BW. What are we measuring? Evaluating physician-specific satisfaction scores between emergency departments. West J Emerg Med 2019;20:454-9.



  Abstract Number 35 Top


Nonuremic Calciphylaxis

A. Parameswaran, R. Snyder, M. Chalunkal, R. Garwood1

1 Internal Medicine, Richard A. Anderson Campus, Easton, PA, USA

Introduction: Calciphylaxis, also known as calcific uremic arteriolopathy, is classically found in patients with end-stage renal disease and advanced chronic kidney disease. The condition carries high morbidity and mortality; it causes abnormal deposition of calcium in the vessels, resulting in vascular thrombosis and tissue infarction. “Nonuremic” calciphylaxis is far less common but also carries a high mortality rate, and it is often overlooked as a diagnostic consideration in patients with normally functioning kidneys. We present a complex patient with nonuremic calciphylaxis and significant contributing comorbid illnesses, including antiphospholipid antibody syndrome on anticoagulation with warfarin.

Case Scenario: A 37-year-old Caucasian female with a medical history of antiphospholipid antibody syndrome, diabetes mellitus type 2, depression, and chronic opioid use was admitted to pain management for a recurrent abdominal wound that had recently undergone debridement. The patient had undergone a gastrojejunal-jejunostomy procedure for jejunostomy feeding tube (J-tube) insertion 6 months prior and had been combining her tube feeds with oral intake. Shortly after the procedure, she had undergone surgical wound debridement of the abdomen, lower back, and pelvis without penetration into the retroperitoneum. She had regular follow-up with wound care, and despite daily use of opioids for pain management, her pain was difficult to control. In addition, the patient noted considerable weight loss over the past year, attributed to malabsorption and gastroparesis, which had required the J-tube placement. She had been placed on warfarin for antiphospholipid syndrome, which was later held for wound debridement and switched to Lovenox. Before the wound debridement preceding this admission, a biopsy of the abdominal wound showed subcutaneous medium-sized blood vessels with medial circumferential and near-occlusive calcification and soft tissue calcification consistent with calciphylaxis. Her serum creatinine was slightly elevated at 1.50 from her baseline of 0.9–1.10. In addition, she was noted to have hyperkalemia and hyperphosphatemia, while her INR was 1.34. Once the acute kidney injury and electrolyte disturbances resolved with tube feeds and fluids, the patient was started on sodium thiosulfate infusions after port placement, at the suggestion of nephrology and dermatology. She was discharged after a change of the wound VAC on her abdomen and was instructed to follow up regularly with dermatology, nephrology, and wound care, in addition to palliative care for pain management.

Conclusion: Among the few reported cases of nonuremic calciphylaxis, most occurred in Caucasian women with malignancy, hyperparathyroidism, protein C and S deficiencies, mineral abnormalities, or a variety of other conditions, underscoring the contribution of a multitude of factors to the pathogenesis. We have reason to suspect that antiphospholipid antibody syndrome could also be implicated as a major contributing factor to our patient's presentation. One concern is that the indicated coumadin treatment increased the likelihood of warfarin-induced skin necrosis, leading to calciphylaxis. While the incidence of nonuremic calciphylaxis is low, the high mortality rate highlights the need to better understand the pathophysiology of this rare disorder so that we may improve treatment regimens and reduce the adverse effects of this lethal ailment.






  Abstract Number 36 Top


Anchoring Bias: In This Patient with Diabetes, Is Diabetic Gastroparesis Truly the Cause of Her Symptoms?

M. Tawadros1, R. L. N. Hindosh1, A. Brahmbhatt1, H. B. Liaquat, R. Snyder, J. Sotherland, A. Davis1

1 Internal Medicine Residency, Richard A. Anderson Campus, Easton, PA, USA and Gastroenterology Fellowship, University Campus, Bethlehem, PA, USA

Introduction: Gastroparesis is characterized by delayed gastric emptying in the absence of any mechanical obstruction. Symptoms include nausea, vomiting, early satiety, and abdominal bloating. While diabetes is the most common systemic disease causing gastroparesis, not all gastroparesis in patients with diabetes is caused by diabetes. It is important to be aware of potential biases that can affect decision-making.

Case Scenario: A 45-year-old female with a history of type 1 diabetes, hypothyroidism, and Stage 3 chronic kidney disease due to diabetic nephropathy presented with worsening fatigue, cold intolerance, and palpitations for 4 weeks. In February, her TSH level was markedly elevated at 447 mU/L; her prior TSH was 1.1 mU/L in January 2019. She had been on Synthroid 150 mcg daily for years. The patient repeatedly denied any changes to her medication dosing, formulation, or method of consumption, and she had consistently taken Synthroid 1 h before her other medications with a sip of water. On further questioning, she noted that during those 4 weeks she had increased abdominal distention, bloating, and nausea, and she denied ever having these symptoms before. A gastric emptying study performed at the end of March demonstrated severely depressed gastric emptying, and she was prescribed Reglan. The initial assumption was that she had developed diabetes-induced gastroparesis. While Reglan improved her gastrointestinal (GI) symptoms and she was regularly taking her Synthroid, her fatigue persisted. Other laboratory testing done in March showed AST 98 U/L, ALT 211 U/L, ALP 148 IU/L, total bilirubin 0.5 mg/dL, and Hgb A1C 6.5. Viral hepatitis and celiac panels were negative. Liver ultrasound showed normal size and echotexture and no splenomegaly. The patient's abdominal symptoms improved over the following 3 weeks, and she no longer required Reglan. Repeat liver function tests at the beginning of April showed normalization of liver enzymes. She has not experienced any further GI symptoms.

Conclusion: The initial assumption was that her GI symptoms were due to diabetes-induced gastroparesis. Given our patient's prior medical history, including diabetic nephropathy, this would normally be a reasonable consideration. It was this anchoring bias, however, that initially attributed her GI symptoms to diabetes-induced gastroparesis, and it was the persistent fatigue that prompted further testing. It was later concluded that this patient most likely had a viral illness causing transient gastroparesis and affecting the absorption of Synthroid. Her symptoms abated in parallel with normalization of her liver function and likely resolution of her viral illness, and she has not had any further GI complaints. The teaching points are twofold: first, we need to be aware of our own decision-making biases, as they can affect patient management; second, it is important to consider viral-induced gastroparesis as a potential etiology, especially if the symptoms are more acute and of shorter duration.


  Abstract Number 37 Top


Natural Options for the Treatment of Chronic Pain When Chronic Kidney Disease Is Present

A. Parameswaran, R. Snyder1

1 Internal Medicine Residency, Richard A. Anderson Campus, Easton, PA, USA

Introduction: For individuals with chronic pain, treatment options are often limited, especially if the patient has chronic kidney disease. Commonly used medications, including nonsteroidal anti-inflammatory drugs (NSAIDs) and acetaminophen, are not without risk. The effects of NSAIDs on renal function are well documented and include acute kidney injury, hypertension, edema, and hyperkalemia. Chronic use of acetaminophen increases the risk of developing chronic tubulointerstitial disease secondary to papillary necrosis. There are other possible, effective alternatives with minimal side effects and no reported adverse effects on kidney function, and they should be considered in patients with coexisting chronic pain and chronic kidney disease. We report two patients with chronic kidney disease and chronic pain in whom the use of natural, safe alternatives improved pain and level of functioning with no damaging effects on kidney function.

Case Scenario: We present a 72-year-old male with a history of stage 3 chronic kidney disease and significant osteoarthritis of his left knee. He is able to ambulate without assistance but has chronic pain in that area. Other medical history includes nephrolithiasis and hypertension. His baseline creatinine is approximately 1.6 mg/dL. He was started on turmeric 500 mg daily as well as a topical homeopathic product, Arnica Montana 30X, applied to his left knee twice daily. He reported an improvement in his pain levels and noted being able to walk better within 3 weeks of initiating this regimen. The second patient is a 90-year-old male with a history of hypertension and gout who had been maintained for several years on losartan and allopurinol dosed according to renal function. His baseline creatinine was 1.5 mg/dL. Despite being on these two medications, he still experienced intermittent gout flares. The patient was started on tart cherry extract capsules twice daily, and he has not had a gout flare over the last couple of years.

Conclusion: There are alternative options for people with chronic pain and chronic kidney disease, and these deserve further study. Turmeric has been found to be noninferior to NSAIDs with regard to pain control; however, it does not cause the renal dysfunction associated with NSAIDs, and it may have other beneficial aspects, including cardio- and neuro-protective properties. Arnica Montana is a homeopathic supplement that has been demonstrated in peer-reviewed studies to be an effective pain option for patients with knee and hand osteoarthritis. Tart cherry extract has been demonstrated to reduce the frequency of gout flare-ups as well as the chronic pain induced by gout. Both patients' pain control and level of functioning improved without detrimental effects on renal function. We need to consider incorporating these alternative agents into our armamentarium for pain control in patients both with and without chronic kidney disease. Further research is needed regarding optimal dosing and duration.


  Abstract Number 38 Top


Varicella Zoster Vasculopathy Causing Multifocal Cerebral Infarcts

A. S. Elshaikh1, I. Taha1, C. E. Ruggeri1

1 Internal Medicine Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Multifocal cerebral infarcts are most commonly attributed to cardioembolic phenomena and are seldom seen as a result of varicella zoster virus (VZV) vasculopathy. In this report, we describe a rare cause of multiple cerebral infarcts secondary to VZV vasculopathy.

Case Scenario: A 63-year-old male was brought to the hospital after recurrent falls felt to be due to alcohol abuse. He had recently been discharged from physical rehabilitation, and the past workup was unrevealing for cardiac or metabolic etiologies. Physical examination revealed a confused, malnourished patient with generalized skin abrasions at multiple stages of healing and healed, scabbed vesicular lesions on the buttocks. Cardiovascular examination revealed normal heart sounds with no murmurs or gallop, and the lungs were clear to auscultation bilaterally. Neurological examination revealed a new left facial droop with otherwise intact cranial nerves, left-sided upper and lower limb spastic hemiparesis, and brisk reflexes. Computed tomography (CT) of the head demonstrated profound cerebral and cerebellar atrophy. Magnetic resonance imaging of the brain showed three distinct foci of diffusion restriction in three separate vascular territories, including both middle cerebral artery territories and the left posterior cerebral artery territory, indicative of multifocal infarctions [Figure 1]. tPA was not given as the patient was outside the treatment window. EKG and echocardiogram excluded atrial fibrillation and vegetations/thrombus, respectively, as the etiology. Lumbar puncture showed no pleocytosis or elevated protein; however, it revealed a positive VZV DNA PCR. This prompted CT angiography of the head and neck, which showed no overt inflammation or vascular beading to suggest central nervous system vasculitis. The patient was treated for VZV with a 14-day course of acyclovir and prednisone and was later discharged to a rehabilitation facility. Outpatient follow-up demonstrated improvement in his neurological examination.

Conclusion: Although multifocal infarcts are usually attributed to cardioembolic sources, we learned from our case not to overlook other uncommon causes such as VZV vasculitis. VZV vasculitis responds well to antiviral and steroid therapy.


  Abstract Number 39 Top


Delayed-Onset Seizures Following Self-Inflicted Nail Gun Injury to the Head: Case Report and Review of the Literature

T. Xia1, B. A. Hoey1

1 General Surgery Residency, University Hospital Campus, Bethlehem, PA, USA

Introduction: Nail gun use has increased since the tool's introduction in 1959. Not surprisingly, the incidence of nail gun injuries has increased as well, with approximately 40,000 patients injured per year. The vast majority of these injuries involve the extremities; however, there is a subset of patients who suffer intracranial trauma. Multiple reports suggest that such an injury can lead to permanent neurologic impairment or death. A 2012 review of 41 nail gun head trauma cases suggested that further study is needed to develop a proper diagnostic and treatment algorithm for individuals with nail gun-related head trauma. This case details late-onset posttraumatic seizures in a patient who suffered 28 self-inflicted penetrating head wounds from a nail gun while distraught after accidentally amputating his hand. In addition, we present an updated comprehensive review of the relevant literature to discuss proper diagnosis and management of intracranial nail gun injuries.

CARE Statement: This case details late-onset posttraumatic seizures in a patient who suffered multiple self-inflicted penetrating head wounds from a nail gun. He had no neurological deficits and ultimately was treated conservatively. The patient suffered delayed posttraumatic seizures requiring life-long seizure prophylaxis. A literature search was conducted on PubMed using the phrases nail gun and penetrating head trauma; this is the 65th reported case. All articles were reviewed with attention to patient condition at presentation, treatment strategies, and outcomes. Based on this review, we present the first diagnostic and treatment algorithms for patients with nail gun injuries to the head.

Case Scenario: A 25-year-old male construction worker suffered an accidental amputation of his left hand while using a circular saw and subsequently fired a nail gun 28 times into his head in an attempt to “dull the pain.” He presented to the trauma center with left arm pain and a headache, with numerous nails protruding from his scalp. His GCS was 15. Plastic surgery and neurosurgery were consulted for evaluation. Computed tomographic (CT) scan of his head revealed 24 nails perforating the skull, with subarachnoid blood bilaterally [[Figure 1], CT scout image]. One nail emerged in the interhemispheric fissure and several entered the cerebral cortex. There was no obvious pneumocephalus or intracerebral hematoma. An arteriogram revealed no major vessel injury. He was then taken to the operating room for scalp debridement and hand reimplantation. All extracranial nails were removed; however, no attempt was made to remove the embedded nails due to the risk of intracerebral bleeding and the patient's intact examination. On postoperative day 1, the decision was made to re-amputate the patient's hand due to ischemic changes. The rest of his hospital course was uneventful as he completed a 7-day course of broad-spectrum antibiotics and daily seizure prophylaxis. Fifteen months later, the patient presented to an emergency department following a seizure. Physical and neurological examinations were unremarkable. The patient denied previous seizures but admitted to recently stopping his anticonvulsant. A CT scan showed no new lesions, and an electroencephalogram revealed bihemispheric cortical abnormalities without any focal epileptiform activity. The hospital course was otherwise unremarkable, and the patient was discharged home back on seizure prophylaxis. He has since returned to work and is living independently with no apparent long-term complications.

Conclusion: We report a case of nail gun injury involving 28 self-inflicted penetrating head wounds. The patient had no neurological deficits and was treated conservatively, later suffering a delayed outpatient seizure after stopping his recommended prophylactic antiseizure medication. Diagnostic and treatment algorithms for this unique injury are presented based on this case and a review of the literature.





  Abstract Number 40 Top


Risk Factors Associated with Poor Outcomes in Younger COVID-19-Hospitalized Patients

J. G. Fleischer1, R. A. Pallay1, C. Oshea1, R. Unterborn1, D. Corwin1

1 Temple/St. Luke's Medical School, Bethlehem, PA, USA

Introduction: Over 1,000 patients admitted to the St. Luke's University Health Network (SLUHN) have been afflicted with COVID-19 during the past several months. With about 2 million cases of the novel coronavirus in the United States, the current pandemic has become the focus of immense research efforts and innumerable scientific publications. However, much of the published content focuses on constantly evolving treatment algorithms and descriptive characteristics of patient cohorts and clinical experiences. The goal of this project is to better understand the types of patients impacted and to identify the influence of common comorbidities (body mass index [BMI], smoking status, hypertension, diabetes, coronary artery disease [CAD], congestive heart failure [CHF], chronic obstructive pulmonary disease [COPD], asthma, immunosuppression, and chronic kidney disease [CKD]), particularly in the younger patient population (<50 years old), based on the collective SLUHN experience. By identifying risk factors for poor outcomes in our patient cohort, we aim to facilitate improved clinical management of more severe COVID-19 cases.

Methods / Study Population: The retrospective study cohort included 692 COVID-19-positive patients admitted to SLUHN hospital campuses in Pennsylvania and New Jersey between February 1, 2020 and May 31, 2020. The mean age of the study population was 64.0 years (±16.7). A majority of the patients were male (54.1%), while 45.9% were female. 57.9% of the patients were Caucasian, 17.3% African-American, 1.7% Asian, less than 1% Native-American, and 19.2% reported their race as “Other.” Within the study cohort, 30.2% of the patients were Hispanic or Latino, 66.0% were not Hispanic or Latino, and the remaining 3.8% were reported as Other [Table 1]. The average BMI of the participants was 31.6 (±7.8); 53.9% of patients had a BMI of 30.0 or higher, 26.2% had a BMI between 25 and 29.9, 18.5% had a BMI between 18.6 and 24.9, and 1.4% had a BMI <18.5. 62.7% of the patients had never smoked tobacco, whereas 32.7% reported quitting tobacco use, 4.2% were active tobacco users, and 4.5% declined to answer. Data were extracted from patients' electronic medical records both during their admission and after discharge. Analyses were carried out using SPSS software version 18 (IBM, Armonk, NY), with statistical significance set at P < 0.05. A chi-square test of independence was performed to examine the relationship between known COVID-19 risk factors among COVID-19-infected patients younger than 50 years (n = 148) versus those 50 years and older (n = 544).
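
As an illustration of the group comparison described above, the following Python sketch applies a chi-square test of independence to a single comorbidity. It is not the authors' code; the counts are illustrative values chosen only to approximate the reported hypertension proportions (31.4% of 148 younger versus 66.2% of 544 older patients), not the actual study data.

import numpy as np
from scipy import stats

# Rows: age group (<50 years, >=50 years); columns: comorbidity present / absent
hypertension = np.array([[ 46, 102],    # <50 years: with / without hypertension (illustrative)
                         [360, 184]])   # >=50 years: with / without hypertension (illustrative)
chi2, p, dof, _ = stats.chi2_contingency(hypertension)
print(f"chi-square = {chi2:.1f}, dof = {dof}, P = {p:.2g}")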

Results: Risk factors showing a statistically significant difference between the two study groups (<50 years versus ≥50 years) included BMI, smoking exposure, diabetes, hypertension, heart failure, COPD, and malignancy. There was no difference in asthma or immunosuppression between the two groups. Of note, 93.2% of patients in the younger group were overweight/obese (BMI >25) compared to 76.5% of patients >50 years (P < 0.0001). In addition, 27.7% of younger COVID-19 patients had smoking exposure compared to 40.3% of older patients [Table 2]. 29.7% of younger patients had diabetes compared to 44.7% of older patients (P = 0.00107). Hypertension was identified in 31.4% of the <50 years group compared to 66.2% of older patients (P < 0.00001). 95.9% of younger patients did not have heart failure compared to 83.8% of older patients (P = 0.00014). Moreover, 97.3% of younger patients did not have CAD compared to 80.3% among those >50 years of age (P < 0.00001). 2.3% of younger patients had COPD compared to 10.9% of older patients (P = 0.00087). 2.7% of younger patients had malignancy compared to 11.6% of older patients (P = 0.00120). Finally, 4.7% of younger patients had CKD compared to 23.5% of older patients (P < 0.00001).

Conclusion: The results of this study provide insight into key risk factors that contribute to severe COVID-19 infection in young (<50 years) patients compared to older (50+ years) patients. Hospitalized COVID-19 patients of all ages were more likely to have BMI >25 and hypertension. However, patients under 50 years of age were more likely to be overweight/obese compared to patients 50 years and older. On the other hand, older patients were more likely to have a history of smoking exposure, diabetes, hypertension, heart failure, CAD, COPD, malignancy, and CKD. Overall, this study suggests that BMI is a major risk factor for more severe disease in patients under 50 years of age. In comparison, smoking exposure, diabetes, hypertension, heart failure, CAD, COPD, malignancy, and CKD are risk factors for more severe disease in older patients. These results are important in establishing age-specific clinical prognostic criteria.







Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

Ethical conduct of research

All research projects presented during the St. Luke's University Health Network Annual Research Symposium were verified to have either appropriate Institutional Review Board approvals or exemptions. For case reports, proof of patient consent documentation was required. In all instances, applicable EQUATOR guidelines (see https://www.equator-network.org/reporting-guidelines/) for scientific reporting were followed.






 
