CONFERENCE ABSTRACTS AND REPORTS
Year : 2017  |  Volume : 3  |  Issue : 1  |  Page : 177-193

The 2017 St. Luke's University Health Network Annual Research and Innovation Symposium: Event Highlights and Scientific Abstracts


Department of Research and Innovation, The Research Institute, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA

Date of Web Publication: 7-Jul-2017


Source of Support: None, Conflict of Interest: None


DOI: 10.4103/IJAM.IJAM_57_17


How to cite this article:
Stoltzfus JC. The 2017 St. Luke's University Health Network Annual Research and Innovation Symposium: Event Highlights and Scientific Abstracts. Int J Acad Med 2017;3:177-93

How to cite this URL:
Stoltzfus JC. The 2017 St. Luke's University Health Network Annual Research and Innovation Symposium: Event Highlights and Scientific Abstracts. Int J Acad Med [serial online] 2017 [cited 2021 Jan 16];3:177-93. Available from: https://www.ijam-web.org/text.asp?2017/3/1/177/209857

Background Information and Event Highlights: The Annual St. Luke's University Health Network (SLUHN) Research Symposium was created in 1992 to showcase research and quality improvement projects by residents and fellows. The Research Institute Director is responsible for planning and organizing the event, with collaboration and consultation provided by the Chief Academic Officer, the graduate medical education leadership, residency and fellowship faculty, and the Director of Media Production Services. Residents and fellows submit an application for oral and/or poster presentation along with an accompanying abstract describing their project. Three physician judges are selected to evaluate the presentations for the first- and second-place cash prizes awarded in both the oral and poster presentation categories.

For the second year in a row, the 2017 research symposium for residents and fellows was combined with the Nursing Research and Scholarship Celebration into an institution-wide SLUHN “Research Day” that also included a keynote speaker. This year's distinguished keynote speaker was Dorry Lidor Segev, MD, PhD, Associate Vice-Chair for Research and Professor of Surgery at Johns Hopkins University in Baltimore, Maryland. As an abdominal transplant surgeon, Dr. Segev focuses on minimally invasive live donor surgery and incompatible organ transplantation. His research incorporates complex statistical methodology for mathematical modeling, clinical data simulation, analysis of large healthcare datasets, and outcomes research. Dr. Segev's keynote presentation was entitled “Of Math and Medicine: Changing Practice and Policy with Big Data” and masterfully outlined the use of advanced data analytics and modern statistical approaches to help shape, introduce, and enact important legislative healthcare initiatives. The session set an all-time attendance record for the SLUHN Research & Innovation Day, with nearly 300 participants either viewing the lecture remotely across the Network's seven campuses or attending in person at the Laros Auditorium [Figure 1].



The Nursing Research and Scholarship Celebration featured original research and quality improvement projects by nurses from six of the seven hospital campuses and the Visiting Nurses Associations of SLUHN. The event was coordinated by Peter Deringer, RN, MA, NE-BC, and Joan Snyder, MSN, RN. Presenting nursing scholars discussed projects aimed at enhancing patient safety, optimizing care delivery, and improving quality of care. In addition, the fourth Annual Robin E. Haff Award, given to a nurse showing exceptional interest in nursing research, was presented to Heather Alban, MS, RN, for the project titled “Safely Managing Pain in Older Adults: Implementation of a Geriatric Acute Pain Management Order Set”. The award is sponsored by Dr. Vincent Lucente, an internationally recognized academic urogynecologist and medical researcher. A detailed list of research projects, organized by SLUHN campus/entity of origin, is provided in [Table 1].



Finally, the overall event was extended to include a session on healthcare innovation on the second day. The Innovation Keynote was presented by Matthew Fenty, Director of Innovation, Strategic Partnerships, and Digital Strategy at St. Luke's University Health Network, and focused on various aspects of technological innovation in healthcare, with emphasis on optimizing care quality, improving provider productivity, and providing the best value for patients. In addition, Daniel Foltz, Program Manager of the Enterprise Data Warehouse at St. Luke's, discussed contemporary trends in healthcare data analytics and approaches to “data mining” in the context of clinical research and quality improvement projects [Figure 1]. A brief question-and-answer session followed these presentations.

The following core competencies are addressed in this article: Practice-based learning and improvement, Medical knowledge, Patient care, Systems-based practice.

Keywords: Innovation day, nursing research and scholarship celebration, resident and fellow research competition, St. Luke's University Health Network


Oral Presentation Abstract Number 1


Patients' Thoughts and Perspectives on Pain in an Outpatient Physical Therapy Practice: A Descriptive Study

Nicholas Adriance, Stephen Kareha, Andrue Bergmooser, Jill Stoltzfus, Jeffrey Bays

Physical Therapy Residency Program, St. Luke's University Health Network, Easton, PA, USA

Introduction/Background: Musculoskeletal pain is the most frequent complaint for which people seek medical treatment. Pain is a subjective yet universal human experience and a powerful motivating force that can guide patients' treatment-seeking behaviors. Pain is also complex, combining multiple pain generators and amplifiers, including fear, avoidance, anxiety, stress, beliefs, and depression. Therefore, the purpose of this descriptive survey study was to analyze patients' beliefs, expectations, and misconceptions regarding pain before the initial examination.

Methodology and Statistical Approach: We used a mixed methods approach with a convenience sample of 54 patients presenting for physical therapy consultation with a chief complaint of pain. The patients completed a 16-item survey before examination. The survey included demographic information such as gender, age, chronicity of symptoms, surgical or nonsurgical source, and body part or region involved. The survey also incorporated an 11-point Numeric Pain Rating Scale, as well as several open- and closed-ended questions from established and previously validated outcome measures such as the Fear Avoidance Belief Questionnaire, the Pain Catastrophizing Scale, the Pain Beliefs Questionnaire, and the Pain Neurophysiology Questionnaire. Data were collected from March 2015 to September 2015. The qualitative data were then synthesized using a thematic analysis.

Results: Four main themes emerged from the data: (1) chronicity (63% of patients reported pain lasting 6 months or more); (2) emotional lability (25% of patients responded emotionally when they experienced pain); (3) fear avoidance (55% of patients reported that they thought about pain often or all of the time); and (4) pain catastrophization (64% of patients expressed the belief that higher pain intensity is proportional to the amount of tissue damage).

Discussion and Conclusion: The results of this study are consistent with those of previous research utilizing a biopsychosocial model. This model integrates the interaction among the environmental, psychological, biological, and social components of pain. The model also focuses on patients' limited knowledge regarding pain and the maladaptive beliefs that lead to functional disability. The results suggest a need for management through a biopsychosocial approach, given the flaws and gaps within the current biomedical model, which has not grasped the importance of all-encompassing patient care. In addition, further research is needed regarding the validity of the 16-item survey created for this study.


Oral Presentation Abstract Number 2


Exploring the Relationship between Initial Blood Alcohol Levels and Serum Lab Values in Trauma Patients

Marissa Cohen, Thomas Wojda, Ashley Jordan, Joshua Luster, Holly Stankewicz, Alexander Wallner, Aliaskar Hasani, Samuel Schadt, Stanislaw Stawicki, Philip Salen

Emergency Medicine and General Surgery Residency Programs, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Despite acute alcohol intoxication being relatively common among trauma patients, little is known regarding the relationship between serum alcohol levels and hematologic parameters at the time of initial patient evaluation. The aim of this study was to determine the behavior of complete blood count (CBC) components in the context of increasing serum alcohol levels. We hypothesized that increasing alcohol levels would be associated with greater concentration of blood components, with a resulting increase in all CBC components.

Methodology and Statistical Approach: The institutional registry at our Level I trauma center was queried between August 1998 and June 2015 for all patients in whom blood alcohol content (BAC) was collected. Other data points included patient demographics, injury mechanism/severity information, and basic hematologic parameters. Hematologic data were contrasted across predefined BAC strata (<0.10%, 0.10%–0.15%, 0.15%–0.20%, >0.20%). Statistical comparisons were performed using analysis of covariance with adjustment for patient demographics and injury characteristics. Statistical significance was set at α = 0.05.
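
A minimal sketch of the covariance-adjusted comparison described above, using Python and statsmodels; the data file and column names (hgb, bac_stratum, age, iss) are hypothetical placeholders, not the authors' actual analysis code.

```python
# ANCOVA sketch: compare hemoglobin across BAC strata while adjusting
# for age and injury severity score (ISS). All names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("trauma_registry.csv")  # hypothetical registry export

# BAC stratum enters as a categorical factor; age and ISS as covariates.
model = smf.ols("hgb ~ C(bac_stratum) + age + iss", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # adjusted F-test for the BAC effect
```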

Results: A total of 1218 patients who presented to our institution had formal serum alcohol testing during the study period. Serum alcohol levels were <0.10% in 89% (1086/1218), 0.10%–0.15% in 6.2% (75/1218), 0.15%–0.20% in 2.2% (27/1218), and >0.20% in 2.5% (30/1218). After adjusting for patient age, gender, and injury severity score, there was a significant increase in both hemoglobin and hematocrit with increasing alcohol levels ([Figure 1]a and [Figure 1]b; both P < 0.01). Unexpectedly, we noted a 16% drop in white blood cell (WBC) count with increasing alcohol levels ([Figure 1]c; P < 0.02). Platelet count behavior was more difficult to quantify, with an irregular V-pattern noted as serum alcohol levels increased ([Figure 1]d; P < 0.01).



Discussion and Conclusion: Although our hypothesis regarding increasing hemoglobin and hematocrit levels with increasing serum alcohol levels was confirmed, we were surprised to note a decline in WBC counts with increasing degrees of alcohol intoxication. Platelet count behavior did not follow a predictable pattern in this sample of patients. As the knowledge of alcohol-induced hematologic changes may influence provider perception of trauma patient hematologic homeostasis, our findings are clinically significant. With 5%–16% alcohol-mediated variability in key CBC components, a nontrivial group of patients may have their initial hematologic results misinterpreted in a potentially harmful way.


Oral Presentation Abstract Number 3


Is There an Association between Blood Alcohol Levels and Polysubstance Use in Trauma Patients?

Ashley Jordan, Thomas Wojda, Marissa Cohen, Aliaskar Hasani, Holly Stankewicz, Philip Salen, Stanislaw Stawicki


Introduction/Background: Polysubstance abuse is a major public health problem in the United States. In addition to the negative impact on the health and well-being of substance users, alcohol and/or drug abuse may be associated with a significant trauma burden. The aim of this study was to determine if serum alcohol (EtOH) levels at initial trauma evaluation correlate with the simultaneous presence of other substances of abuse. We hypothesized that polysubstance use would be significantly more common among patients who presented to our trauma center with blood alcohol content (BAC) >0.10%.

Methodology and Statistical Approach: After an Institutional Review Board exemption was granted, we retrospectively reviewed records from our Level I trauma center registry between January 2009 and January 2012. Abstracted data included patient demographics, BAC determinations, all available formal determinations of urine/serum drug screening, injury mechanism and severity (Injury Severity Score [ISS]) information, Glasgow Coma Scale (GCS) assessments, and 30-day mortality. Stratification of BAC was based on the 0.10% cutoff. Statistical comparisons were performed using Chi-square and Fisher's exact tests, with significance set at α = 0.05.
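
A sketch of the categorical comparisons named above (Chi-square and Fisher's exact tests), using SciPy; the 2×2 counts are illustrative placeholders, not the study data.

```python
# Compare polysubstance-use rates between BAC strata. Illustrative counts:
# rows = BAC < 0.10% vs BAC > 0.10%; columns = tox-screen negative/positive.
from scipy.stats import chi2_contingency, fisher_exact

table = [[40, 12],
         [18, 30]]

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)  # preferred for small expected counts
print(f"chi-square P = {p_chi2:.4f}; Fisher exact P = {p_fisher:.4f}")
```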

Results: A total of 488 patient records (76.3% male, mean age 38.7 years) were analyzed. Median GCS was 15 (interquartile range [IQR] 14–15). Median ISS was 9 (IQR 5–17). Median BAC was 0.10% (IQR 0–0.13). There were 284 (58.2%) patients with BAC <0.10% and 204 (41.8%) patients with BAC >0.10%. Of the 245 patients who underwent formal “tox-screen” evaluations, 31 (12.7%) were positive for marijuana, 18 (7.35%) for cocaine, 28 (11.4%) for opioids, and 32 (13.1%) for benzodiazepines. As presented in [Table 1], patients with BAC >0.10% on initial evaluation also had significantly greater polysubstance use (i.e., EtOH plus an additional substance) than patients with BAC <0.10% (53/220 [24.1%] vs. 16/25 [64.0%], P < 0.002). Among polysubstance users, BAC >0.10% was significantly associated with opioid and cocaine use.



Discussion and Conclusion: This study shows that a significant proportion of trauma patients with admission BAC >0.10% present with evidence of polysubstance use. Patients with BAC >0.10% were more likely to test positive for drugs of abuse than patients with BAC <0.10%. Our findings support the need for substance abuse screening in the presence of suspected EtOH intoxication, focusing on identification of at-risk patients, appropriate clinical management, and implementation of early polysubstance abuse intervention strategies.


Oral Presentation Abstract Number 4


Novel Uses for the Trauma and Injury Severity Score: Can it Predict Morbidity and Length of Stay?

W. T. Hillman Terzian, Brian Hoey, William Hoff, Peter Thomas, Thomas Wojda, James Cipolla, Stanislaw Stawicki

General Surgery Residency Program, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: The Trauma and Injury Severity Score (TRISS) was designed as a survival prediction paradigm. Our aim was to determine whether TRISS correlated with morbidity and hospital length of stay (LOS) using data from our Level I Trauma Center Institutional Registry (TCIR). We hypothesized that a lower TRISS probability of survival is associated with increased morbidity and longer LOS.
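
For reference, the standard TRISS formulation (not restated in the abstract) expresses the probability of survival as a logistic function of the Revised Trauma Score (RTS), ISS, and an age index, with coefficients b0–b3 published separately for blunt and penetrating mechanisms:

$$P_s = \frac{1}{1 + e^{-b}}, \qquad b = b_0 + b_1\,\mathrm{RTS} + b_2\,\mathrm{ISS} + b_3\,\mathrm{AgeIndex}$$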

Methodology and Statistical Approach: We performed a review of the TCIR between 1999 and 2015. Of 32,000 charts, 23,205 contained the data required to calculate TRISS probabilities of survival (POS). Results were controlled for demographic factors. We performed univariate analyses to determine relationships between TRISS and mortality, morbidity, and LOS (hospital, step-down, and Intensive Care Unit [ICU]). Analysis of covariance was utilized to determine between-group differences. Receiver operating characteristic curves were constructed, and comparisons of the corresponding areas under the curve (AUC) were performed using the DeLong method. Statistical significance was set at α = 0.05.
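
A minimal sketch of the AUC computation described above, using scikit-learn with random placeholder data; note that the DeLong comparison of correlated AUCs is not available in scikit-learn or SciPy and would require a separate implementation.

```python
# Discrimination of TRISS for mortality, summarized as an AUC.
# Outcomes and scores below are random placeholders, not registry data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
died = rng.integers(0, 2, size=500)   # placeholder 0/1 mortality outcome
triss_ps = rng.random(size=500)       # placeholder TRISS probability of survival

# A lower probability of survival should predict death, so invert the score.
auc = roc_auc_score(died, 1 - triss_ps)
print(f"TRISS AUC = {auc:.2f}")
```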

Results: There were 23,205 patients (60.3% males; 94.9% blunt mechanism; 22.1% required ICU admission). Median age was 45 years (interquartile range, 24–70). Median injury severity score (ISS) was 5, with 3.1% overall mortality. TRISS was highly predictive of mortality (AUC 0.95), outperforming the Glasgow Coma Scale (GCS; AUC 0.83), ISS (AUC 0.80), and age (AUC 0.65). Likewise, TRISS predicted complications (AUC 0.81) better than its subcomponents. Finally, TRISS outperformed GCS and age at predicting ICU admissions, being comparable only to ISS (AUC 0.80 and 0.81, respectively). Lower TRISS POS correlated strongly with LOS [Table 1].



Discussion and Conclusion: Although the utility of TRISS has been questioned in contemporary literature, we found that our TRISS data were useful in prognosticating mortality, morbidity, and hospital LOS. The latter two findings are novel and unique to this study. Given the above, TRISS may be predictive of outcomes other than mortality and should not be abandoned at this time.


Oral Presentation Abstract Number 5


Glycemic Control after Transitioning from Standard Concentration Insulin to Concentrated Insulin in Patients with Uncontrolled Diabetes: A Longitudinal Case Series

Shari Williams, Daniel Longyhore, Shawn Depcinski

Pharmacy Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: The number of patients unable to achieve glycemic control with the use of standard concentration insulin (U-100) is increasing. As a result, the use of concentrated insulin formulations has expanded and become a more readily available treatment option. Despite widespread availability, there are limited data regarding real-life patient transitions from standard to concentrated insulin and the impact on glycemic control. The purpose of this study was to determine the efficacy of concentrated insulin in improving glycemic control in patients whose glucose remained uncontrolled on standard concentration insulin.

Methodology and Statistical Approach: This was a retrospective, multicenter, longitudinal case series analyzing medical records from an Internal Medicine and Endocrinology practice from January 2011 to January 2017. Insulin regimens before and after the transition to concentrated insulin were evaluated for glycemic control, as measured by reduction in hemoglobin A1C, change in total daily dose, and change in number of injections per day. Statistical testing was performed using the Mann-Whitney U-test, with statistical significance set at α = 0.05.
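
A sketch of the Mann-Whitney U comparison named above, using SciPy; the A1C values are illustrative placeholders, not study data.

```python
# Compare A1C values before and after the switch to concentrated insulin.
from scipy.stats import mannwhitneyu

a1c_pre = [9.1, 8.7, 10.2, 9.5, 8.9, 9.8]   # placeholder pre-transition A1C (%)
a1c_post = [7.9, 8.0, 8.8, 8.2, 7.7, 8.5]   # placeholder post-transition A1C (%)

stat, p_value = mannwhitneyu(a1c_pre, a1c_post, alternative="two-sided")
print(f"U = {stat}, P = {p_value:.3f}")
```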

Results: Among the treatment groups represented, the average reduction in A1C was statistically significant for those patients using Reg U-500 (−1.2%, P < 0.001), Glar U-300 (−0.6%, P < 0.001), or combination Lis U-200/Glar U-300 (−1.1%, P = 0.002). Regarding the total number of units required per day, there was a statistically significant difference in patients using Glar U-300 (+9.4 units, P = 0.009) or combination Lis U-200/Glar U-300 (+30.9 units, P = 0.018). Finally, the number of injections per day remained consistent within each treatment group, except for patients using Reg U-500 (−1 injection/day, P < 0.001).

Discussion and Conclusion: The transition to concentrated insulin offers advantages for improved glucose control. Regimens of Reg U-500, Glar U-300, and the combination of Lis U-200 with Glar U-300 yielded lower A1C values. However, only Reg U-500 demonstrated this reduction without a significant increase in insulin dose, and with a significant reduction in the number of injections administered per day.


Poster Presentation Abstract Number 1


Screening for Nephropathy in Diabetes Mellitus: Just What the Doctor Should Order

Samih Barcham, Helaine Levine, Maria Reichel

Family Medicine Residency, Warren Hospital Program, St. Luke's University Health Network, Phillipsburg, NJ, USA

Introduction/Background: Diabetic nephropathy occurs in 20%–40% of diabetics, and annual screening by urine albumin/creatinine ratio is recommended. Low screening rates are commonly attributed to patient factors, including lack of follow-through with ordered testing or missed appointments. We hypothesized the following: (1) low screening rates result from physician factors, namely failure to order the testing, and (2) focused education can raise diabetic nephropathy screening rates by 20% in the short term, with continuing educational support potentially leading to further increases.

Methodology and Statistical Approach: The study design followed quality improvement methodology with successive plan-do-check-act (PDCA) cycles over 20 months. Billed ICD diabetic codes identified diabetic visits at Coventry Family Practice (CFP) for the SLW Family Medicine Residency. Every diabetic patient visiting CFP was included in the baseline 14-month period. Intervention cohorts were assessed over 1–2 month periods. Faculty participation remained constant, but different residents participated in different cycles. Urine albumin orders and results were collected retrospectively from charts of serial cohorts, with percentages of total available urine results and orders compared between the different patient groups. The presence or absence of orders was used to differentiate between physician factors (no order) and patient factors (noncompliance); physician educational interventions were modified before each new cycle based on previous cycle results.

Results: An increase of 20% in available results over baseline was maintained over all intervention cycles. The absence of a physician order decreased 22% initially, but further gains between interventions 1 and 2 were achieved through increased patient compliance. Measurement of orders on previously missed patients was added after education session 3 and revealed a 127% increase between cycles 2 and 3. A summary of key study results is provided in [Table 1] and [Figure 1].



Discussion and Conclusion: Diabetic nephropathy is the leading cause of end-stage renal disease. Modification of treatment in proteinuric patients can slow progression. Therefore, annual urine albumin screening is both a payer outcomes quality measure and an American Diabetes Association recommendation. Our baseline results confirmed our hypothesis that low screening rates at CFP result from physician failure to order the testing rather than from patient noncompliance. The effectiveness of the focused educational intervention was demonstrated by maintenance of a 20% increase in available results over baseline in different patient cohorts, with a corresponding increase in physician orders after the first and third interventions. The PDCA cycle approach is especially helpful in an educational setting with a constantly changing physician pool.


Poster Presentation Abstract Number 2


Focused Improvement in Abdominal Aortic Aneurysm Screening Practices in a Resident-driven Clinic Setting

Steven Cardio, Jason Bacon, Cara Ruggeri

St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Population-based studies in adults older than 50 years have found that the prevalence of abdominal aortic aneurysm (AAA) ranges from 3.9% to 7.2% in men. The incidence of AAA increases steadily with advancing age, and there is a 3- to 5-fold increase in prevalence in individuals who have ever smoked. It is important to consider potential screening strategies for AAA because most aneurysms remain asymptomatic for many years until they rupture. The overall mortality rate from a ruptured AAA can be as high as 75%–90%. Numerous studies provide adequate evidence that one-time ultrasonography is a safe and accurate screening test for AAA, with high sensitivity and specificity of 94%–100% and 98%–100%, respectively. Therefore, as a grade B recommendation, the US Preventive Services Task Force (USPSTF) advises one-time screening for AAA in men aged 65–75 years who have ever smoked (“ever smoked” is defined as having smoked at least 100 cigarettes during a lifetime). Although this recommendation has been in place since 2005, it is rarely followed, resulting in large gaps between guideline recommendations and actual practice standards.

Our quality improvement project aimed to assess the current adherence of our general internal medicine clinic (Southside Medical Center) to the USPSTF's AAA screening guidelines, as well as to determine the impact of simple, low-cost interventions on practice behavior by comparing screening rates before and after these interventions.

Methodology and Statistical Approach: To establish our baseline data, patient records were initially obtained from the clinic's Electronic Medical Record (EMR) system, Allscripts, with the assistance of the network's information technology support staff. Filters were applied to capture the inclusion criteria: men aged 65–75 years with a current or former smoking history, with and without diagnosis codes associated with “screening for AAA” as of December 2016. Filtered data were further refined by manually reviewing each individual chart in Allscripts to ensure that baseline data did indeed reflect the inclusion criteria and to determine whether these patients were actively being seen at our clinic. Data from Allscripts were cross-referenced with data from our inpatient EMR system, Epic, as well as our former system, Portal, to determine whether these patients may have been screened incidentally by ultrasound or alternative means (e.g., computed tomography scan) during the recommended screening age interval. The overall study process and its chronology are outlined in [Figure 1].



Results: As depicted in [Figure 1]a and [Figure 1]b, our percentage of purposefully screened patients increased from 13% to 26%, while the percentage of those screened inadvertently through alternative means remained relatively stable, increasing from 25% to 28% [Table 1]. As a result, our total percentage of patients screened for AAA increased from 38% to 54% at the conclusion of this study.



Discussion and Conclusion: This study showed that implementation of low-cost, multifaceted interventions aimed at provider education/reminders increased the rate of AAA screening at our clinic. We believe that this was a successful focused intervention, since our ultrasound screening rate doubled while our inadvertent screening rate stayed relatively stable. Further research regarding the sustainability of this intervention is warranted.


Poster Presentation Abstract Number 3


Assessing Knowledge Gaps Regarding End-of-life Issues in Patients Admitted to the Hospital

Marissa Cohen, Mayank Mehrotra, Luis Vera, Vamsi Balakrishnan, Rebecca Jeanmonod

Emergency Medicine Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Healthcare expenditures continue to rise in the United States. Many patients incur their highest health-care utilization at the end of life, and 85% of patients die in hospitals, although most studies demonstrate a preference for death at home. We sought to assess patients' knowledge gaps regarding end-of-life issues, including do not resuscitate (DNR) and do not intubate (DNI) orders, physician orders for life-sustaining treatment (POLST), knowledge of and outcomes after cardiopulmonary resuscitation (CPR), and preferences regarding end-of-life care. In addition, we surveyed patients regarding prior hospitalizations and discussions with healthcare providers about end-of-life care.

Methodology and Statistical Approach: This was a cross-sectional, survey-based study of adult patients ages 55 and older who were admitted to the hospital. A convenience sample of patients admitted to the medical floors was identified by research assistants and approached for survey completion. Only English-speaking patients were enrolled. Written consent was waived as the survey was anonymous. Patients were queried regarding demographic information, prior discussions about end-of-life care, prior hospitalization history, knowledge of DNR/DNI, POLST, CPR (including complications and outcomes), intubation, and priorities in end-of-life scenarios. In addition, patients were asked to complete an open-ended question regarding anything they would like to know about CPR, DNR, DNI, comfort care, and living wills.

Results: A total of 105 patients elected to complete the survey (59% females; 43% ages 55–70 [with the remaining 57% above age 70]; 87% Caucasian; >50% reporting annual income under $30,000; and 54% reporting either attendance or completion of high school as the highest level of education). Most patients (66%) had public insurance, while 29% had private insurance, and 3% were self-pay. Almost all patients (98%) reported having a primary care physician (PCP) whom they see once a year, but only 26% recalled having had conversations with their PCP regarding decisions about end-of-life care (CPR or mechanical ventilation). Furthermore, 65% of patients had been admitted to the hospital within the past year, and despite the requirement of documented code status associated with every admission, only 41% of patients recalled having spoken with a hospital provider about end-of-life decisions. Just over half of patients (52%) answered that they had discussed end-of-life decisions with family, and the same percentage answered that they had signed paperwork indicating end-of-life goals of care. When queried about survival rates following CPR in and out of the hospital, 60% and 36% expected better outcomes than current statistics regarding in-hospital and out-of-hospital CPR, respectively [Figure 1] and [Figure 2]. When asked to rank factors that signify a “good death,” patients thought that it was most important to have a painless death and/or to have family present, followed by being at home, then having a long life. Patients thought that it was least important to be in the hospital or have a physician present [Figure 3]. Anecdotally, compared to similarly conducted studies devised by our emergency department, patients were much less willing to answer questions about end-of-life issues than they were about other topics.



Discussion and Conclusion: Our study addressed multiple components relating to end-of-life care. Overall, we observed a notable lack of communication and/or understanding between patients and their physicians when discussing CPR/ventilators. Patients overestimated the positive outcomes in cases of CPR for cardiac arrest. Although most medical care is devoted to patients at the end of life, and the vast majority of patients end their lives in hospitals, patients tend to envision ideal end-of-life conditions in the comfort of their homes and surrounded by family.


Poster Presentation Abstract Number 4


The Prevalence of Movement-related Conditions in the Primary Care Setting

Jenna Cornell, Stephen Kareha

Physical Therapy Residency, St. Luke's University Health Network, Easton, PA, USA

Introduction/Background: Movement of the human body is essential to personal and societal function. It has been reported that 58% of Americans suffer from musculoskeletal conditions, with an associated cost of $213 billion/year for care. The human movement system encompasses multiple interrelated systems, and therefore, it is hypothesized that the prevalence of movement-related conditions is even higher. The purpose of this study was to determine the prevalence of movement-related conditions in patients seeking treatment in a primary care setting.

Methodology and Statistical Approach: Data were obtained from an ongoing quality improvement project. A questionnaire was administered to patients arriving in the primary care setting to identify movement-related conditions. If a patient reported pain or difficulty with movement, they were encouraged to select the body region of complaint from a prespecified list along with the movement with which they had pain or difficulty.

Results: The questionnaire was administered to 379 consecutive patients seeking care in a primary care setting. The prevalence of self-reported movement-related conditions was 39%. The most common body regions associated with movement-related conditions were the spine (60%) followed by the lower extremity (56%). Pain or difficulty with lifting/carrying (52%), bending (51%), and sleeping (38%) were the most frequent functional complaints among patients with movement-related conditions.

Discussion and Conclusion: While the human movement system extends beyond the musculoskeletal system, the percentage of patients reporting pain or difficulty with movement in this study was 39.1%, compared to the previously reported 58% prevalence of musculoskeletal conditions. There was a subgroup of patients (5.8%) who answered “no” to the initial question but indicated areas or movements with which they had difficulty. This dissociation between patients' perception of a disorder and the existence of movement-related conditions is problematic. Since delay in addressing movement system dysfunction results in increased long-term disability and added cost to the health care system, it is essential to improve patient awareness and implement systems to discover these problems in the primary care setting.


Poster Presentation Abstract Number 5


Fecal Microbial Transplant for Clostridium difficile Infection Refractory to Conventional Treatment

Rodrigo Duarte-Chavez, Thomas Wojda, Stanislaw Stawicki, Gloria Fioravanti, Berhanu Geme

St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Clostridium difficile (CD) has replaced methicillin-resistant Staphylococcus aureus as the most common health care-associated infection. The yearly cost attributed to CD infection (CDI) is $6.3 billion. Metronidazole is the treatment for mild CDI, while oral vancomycin is superior for recurrent or severe CDI. From 10% to 30% of patients will have recurrence after initial resolution, 40%–65% after a second episode, and 65%–80% after a third event.

Fecal microbiota transplant (FMT) prevents CDI recurrence by competing with CD for the available nutrients, regulating the immune response, and producing antimicrobial peptides. Overall, the cure rate is 87%–94%, but this decreases with severe CDI. We sought to describe the characteristics of patients receiving FMT at our institution.

Methodology and Statistical Approach: This was a retrospective, single-center study of FMT using colonoscopy for the treatment of CDI refractory to conventional therapy from July 2015 to February 2017. We used descriptive statistics to report side effects and features associated with both successful and failed FMT.

Results: Thirty-five patients with a mean of 2.7 recurrences underwent FMT. The mean age ± standard deviation was 58.6 ± 18.28 years, with 71% females and 29% males, and 29% having severe disease. During the initial infection, 40% of patients were using opioids, and 40% were taking a proton-pump inhibitor (PPI), while 26% had risk factors for immunosuppression and 23% had a previous cholecystectomy. During and after FMT, patients were using opioids and PPIs at rates of 31% and 43%, respectively.

Overall, FMT was successful in 91% of patients, with primary cure achieved in 86% [Figure 1]. Among the 29% with severe disease before FMT, the cure rate was 70%, with primary cure achieved in 60%. FMT initially failed to cure CDI in 14% of patients; compared to patients with primary cure, these patients were older, had a higher incidence of severe disease, and were more often using opioids during the initial infection and FMT. Use of PPIs during the initial infection and FMT was similar between groups, as were rates of immunosuppression and previous cholecystectomy [Table 1]. The most common adverse effects were loose stools (34%) and abdominal pain (11%).



Discussion and Conclusion: FMT is safe and effective for the treatment of refractory CDI. Opioid use was highly prevalent in patients who had initial failed response to FMT. The potential role of opioids in CDI requires further study.


Poster Presentation Abstract Number 6


Prealbumin Levels in Critically Ill Patients Correlate with Computed Tomography-Derived Psoas Muscle Characteristics

Nicholas Ferguson, Stanislaw Stawicki, Jamie Thomas

General Surgery Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Physiologic changes associated with acute stress may render traditional markers of nutritional status unreliable in the Intensive Care Unit (ICU), creating the need for more objective alternatives. One such alternative to traditional serum laboratory testing is the use of data from computed tomography (CT), including the psoas muscle (PM) area and density. This study examined the associations between prealbumin and CT characteristics (e.g., density-corrected psoas area [DCPA]) in a cohort of ICU patients. We hypothesized that PM area, density, and DCPA would correlate significantly with prealbumin in this population.

Methodology and Statistical Approach: In this pilot study, a convenience sample of ICU patients from January 2010 to July 2015 was reviewed retrospectively. Data collected included demographics (age, gender, body mass index [BMI]); labs (prealbumin, albumin, total protein, lymphocyte counts); and abdominal CT measurements of PM density (Hounsfield units [HU]) and area (mm²). Psoas data were acquired using axial CT images at the superior aspect of the L4 vertebral body. Using advanced image processing software (GE Healthcare, Chicago, Illinois, USA), the trace tool was used to outline PM borders, and software-generated cross-sectional area/HU values were recorded. Bilateral PM data were averaged for cross-sectional area and density. The primary study variable was DCPA (average PM area/average PM density), which was further categorized into “low” (≤28) and “high” (>28) based on the mean dataset value. The permitted time between the CT and nutritional labs was 72 h (based on the 3-day half-life of prealbumin). Clinical data were contrasted between the “high” and “low” DCPA groups. Univariate comparisons included the Mann–Whitney U-test, Student's t-test, and Fisher's exact test, as appropriate.
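
The DCPA calculation defined above reduces to a small computation; the sketch below uses illustrative measurements (not study data), with the ≤28 cutoff taken from the abstract.

```python
# Density-corrected psoas area (DCPA) = mean bilateral PM area / mean density,
# dichotomized at the dataset mean of 28. Measurement values are illustrative.
def dcpa(area_l_mm2, area_r_mm2, density_l_hu, density_r_hu):
    mean_area = (area_l_mm2 + area_r_mm2) / 2
    mean_density = (density_l_hu + density_r_hu) / 2
    return mean_area / mean_density

value = dcpa(1450.0, 1380.0, 48.0, 52.0)
print(f"DCPA = {value:.1f} ({'low' if value <= 28 else 'high'})")
```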

Results: A total of 86 measurement pairs were analyzed. DCPA was associated with patient weight and prealbumin levels, but not with BMI, lymphocyte count, albumin, or total protein determinations. Although neither of its constituent variables (psoas area or density) correlated meaningfully with prealbumin on its own, DCPA ≤28 was associated with lower prealbumin levels, identifying DCPA as a potential marker of suboptimal nutritional status in ICU patients. Study results are summarized in [Table 1].



Discussion and Conclusion: Although neither psoas density nor area correlated with prealbumin in this pilot study, we found that DCPA ≤28 was associated with lower prealbumin levels. While this finding identifies DCPA as a potential marker of suboptimal nutritional status, the clinical implications require independent confirmation through further investigation with larger sample sizes and greater data granularity.


Poster Presentation Abstract Number 7


Are Computed Tomography Scans Over-utilized in the Workup of Vertebral Compression Fractures?

Shane McGowan, David Ramski, Brittany Homcha, Gbolabo Sokunbi

Orthopedic Surgery Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Compression fractures are an increasingly common diagnosis in the United States, coinciding with the rapid increase of osteoporosis and osteopenia in the general population. Computed tomography (CT) does not reliably aid in determination of fracture chronicity and contributes to higher cost of care as well as unnecessary radiation exposure. An examination of extraneous testing and development of a guided treatment algorithm would help inform providers in choosing appropriate studies during the workup of patients with compression fractures.

Methodology and Statistical Approach: A retrospective chart review was performed of all patients who underwent kyphoplasty or vertebroplasty from 2009 to 2016. Inclusion criteria were a diagnosis of spinal compression or burst fracture, age 18–90 years, and kyphoplasty or vertebroplasty performed between 2009 and 2016; exclusion criteria were the absence of imaging studies in the hospital PACS system and age <18 or >90 years. The primary end-point of the study was the amount of extraneous imaging ordered before definitive treatment. The secondary outcome was the increased radiation exposure and cost resulting from unnecessary studies.

Results: Between 2009 and 2016, 254 patients underwent kyphoplasty or vertebroplasty, 228 of whom had images available in PACS. A total of 258 unique imaging workups were included, consisting of the following studies: 203 plain radiographs, 87 CT scans, 156 magnetic resonance imaging (MRI) studies, and 44 bone/single-photon emission computed tomography scans. There were 104 instances (40.3%) in which patients underwent only MRI or bone scan after radiographs and 27 instances (10.5%) in which patients underwent only radiographs with a comparison study. There were 76 instances (29.5%) in which patients underwent extraneous CT scans and 13 instances (5%) in which patients underwent both MRI and bone scan before the procedure [Figure 1], resulting in increased charges of at least $18,500 and $5350, respectively. Radiation dose reports were available for 62 patients who underwent CT scans, revealing an average of 979.4 mGy·cm of additional radiation exposure [Figure 2].



Discussion and Conclusion: Efficient diagnosis of compression fractures, while reducing costs and unnecessary radiation exposure, should be a primary goal for providers. We recommend a unifying workup algorithm that favors either radiographs in the presence of a comparison study or acquiring an MRI or bone scan to aid in determining injury acuity [Figure 3]. If these studies are available, a CT scan is unnecessary for treatment. It is therefore imperative to establish acuity early in the treatment course to streamline and deliver care in a safe and cost-effective manner.




Poster Presentation Abstract Number 8


The Impact of a Standardized Checklist on Length of Time to Complete Sign Out During Emergency Department Physician Change of Shift

Alyssa Milano, Holly Stankewicz, Philip Salen, Jill Stoltzfus

Emergency Medicine Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Transitions of patient care during physicians' change of shift introduce the potential for critical information to be missed or distorted, resulting in possible morbidity. Since 2009, the Joint Commission has encouraged improving transitions of care as a national safety goal. Our study sought to determine if utilization of a sign-out checklist during emergency medicine (EM) resident transition of care changed the length of time to complete sign out.

Methodology and Statistical Approach: This prospective study assessed EM residents' transition of care during departmental group sign out. Residents of varying postgraduate years transferred their patients' care to the incoming physician team. For 2 months, residents gave their typical sign-out. For the next 2 months, residents utilized a standardized sign-out checklist. Times were recorded from first to last patient sign out. Continuous data were reported as medians, with separate Wilcoxon signed-rank tests conducted as appropriate.

Results: Assessment of transition of care was performed for 77 days (38 days of status quo, 39 days utilizing a checklist). There were 548 assessments in the prechecklist cohort (PCL) and 697 in the postchecklist cohort (CL). Median length of time for sign out was 13 min in the PCL cohort compared to 9 min in the CL cohort, a statistically significant difference (P = 0.03).

Discussion and Conclusions: A standardized checklist appears to decrease the length of time needed to complete sign out.


Poster Presentation Abstract Number 9


A Systematic Review to Assess Optimal Management of Laparoscopic Cholecystectomy in Patients with Left Ventricular Assist Devices

Ronnie Mubang, Samuel Schadt, Halward Blegen, Mark Schadt

General Surgery Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Since their introduction 20 years ago, left ventricular assist devices (LVADs) have been used increasingly in heart failure patients, whether as a bridge to heart transplantation, as temporary treatment, or as destination therapy. It is estimated that 25% of these patients will require noncardiac surgical interventions. These devices add significant technical limitations to abdominal surgeries due to the location of the power supply and driveline, which cross the abdomen and may be damaged during the operation. We performed a systematic literature review to provide optimal management guidelines for performing laparoscopic cholecystectomy for acute cholecystitis in LVAD patients.

Methodology and Statistical Approach: An exhaustive review of the literature was performed using EBSCO, PubMed, Google Scholar, and Bioline with the keywords “laparoscopy,” “cholecystectomy,” “left ventricular assist device,” and “complications” to assess the frequency of laparoscopic cholecystectomies with similar assist devices. We found a total of eight cases describing the above procedure, including one case completed at our hospital. We did not exclude cases based on lack of information regarding preoperative/perioperative planning. Our primary end-point was successful completion of laparoscopic cholecystectomy. Our secondary end-points were mortality and morbidity within the immediate 30-day perioperative period.

Results: Laparoscopic cholecystectomy was performed without significant intraoperative or perioperative hemodynamic compromise in all eight cases, with no mortality. One of the eight patients had postoperative bleeding at a trocar site requiring laparoscopy. A team approach and detailed briefing among the various team members, including anesthesiologists, cardiac surgeons, perfusionists, cardiologists, and surgeons, were essential to the success of the operation. Imaging such as intraoperative fluoroscopy was necessary to help mark positions for port placement, avoid device damage, and minimize preload disturbances; postoperative anticoagulation was planned through subsequent team discussions.

Discussion and Conclusion: Our review reinforces the small body of evidence indicating that laparoscopic cholecystectomy can be performed safely in patients with LVADs. Complications may be avoided by holding detailed briefings among the various departments involved in patient care. Because each laparoscopic procedure performed in these patients presents unique challenges, the use of intraoperative monitoring and fluoroscopy for port placement is ideal, and preoperative planning is vital to the success of the operation.


Poster Presentation Abstract Number 10


Reducing Hospital Readmissions in Short-term Rehab Patients through Implementation of Clinical Pathways

Stephanie Rabenold, Omolara Bamgbelu, Alaa-Eldin Mira, Amaravani Mandalapu

Geriatrics Fellowship, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Hospital readmissions within 30 days of discharge are associated with increased patient morbidity and mortality, increased overall health care costs, decreased patient satisfaction, and reduced payment from Medicare. Medicare defines readmission as an unplanned return to an acute care hospital within 30 days of hospitalization for one of the following conditions: myocardial infarction (MI), congestive heart failure (CHF), pneumonia, chronic obstructive pulmonary disease (COPD), and elective total hip arthroplasty or total knee arthroplasty. Nationwide, all-cause readmissions from a skilled nursing facility (SNF) range from 15.1% to 28.1% of the Medicare population during this 30-day window. Patients at high risk for readmission are often referred for acute rehabilitation before returning to their previous living environment. Through the implementation of clinical pathways in a short-term rehabilitation SNF, we sought to reduce hospital readmissions for those conditions at high risk for recurrent exacerbation.

Methodology and Statistical Approach: In this quality improvement project, clinical pathways were developed to standardize the care of patients in our short-term rehabilitation facility. The pathways were evidence-based and clinically driven, developed using a multidisciplinary team approach, and focused on diabetes mellitus, COPD, CHF, lower extremity major joint replacement, MI, pneumonia, and sepsis. Nursing staff and physicians received education during provider meetings. Patients with qualifying diagnoses were placed on the appropriate pathways on admission. Data were collected from January 2015 to July 2016 and focused on the 30-day readmission rate as well as the length of acute rehabilitation stay.

Results: Hospital readmission rates from 2013 to 2014 were up to 24%. Following implementation of the clinical pathways, we saw a steady decline in readmissions, from 14.7% during the second quarter of 2015 to zero in July 2016. In addition, the average short-term rehabilitation length of stay decreased from 32 days in February 2015 to 19 days in July 2016.

Discussion and Conclusion: Implementing clinical pathways for high-risk patients with multiple comorbidities, in addition to those with Medicare readmission-targeted diseases, improves patient care through reductions in hospital readmissions and acute rehabilitation length of stay. These improvements likely increase patient satisfaction, improve patient outcomes and health care system flow, and reduce overall costs.


Poster Presentation Abstract Number 11


Correlation of Diabetic Education with Hemoglobin A1c Levels

Urja Shah, Nargiza Mahmudova, Rebekah Cherian

Podiatric Medicine and Surgical Residency, St. Luke's University Health Network, Allentown, PA, USA

Introduction/Background: In the podiatric medicine profession, an overwhelming majority of the patient population is diabetic. Diabetes mellitus is an extremely debilitating disease that affects blood flow and sensation in the lower extremities. Due to their location and weight-bearing function, the feet are extremely prone to complications such as ulcerations and infections, which can ultimately lead to limb loss.

In diabetics, hemoglobin A1c (HbA1c) refers to glycated hemoglobin, which reflects average plasma glucose concentration over the preceding 3 months. Podiatrists tend to see high-risk patients every 3 months to more effectively monitor their diabetic health through education on blood sugar/HbA1c management and examination of the feet. This study was conducted to educate diabetic patients about the importance of proper diabetic management to prevent such complications.

Methodology and Statistical Approach: Inclusion criteria for study patients were a diagnosis of diabetes by their primary physician, receipt of an educational packet from the American Diabetes Association website about which foods are healthy for a person with diabetes, and willingness to participate in the study. On initial examination, patients were presented with a questionnaire asking about foods they thought were healthy for a diabetic diet. This questionnaire was used to establish a baseline assessment of the patients' views on healthy foods. HbA1c was assessed and used to educate patients about proper food and lifestyle habits, with presentation of the educational packet. Ideal patient follow-up was 3 months, with remeasurement of HbA1c after this interval. A Wilcoxon signed-rank test was used to compare pre- and post-intervention HbA1c values.
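
A sketch of the paired Wilcoxon signed-rank comparison named above, using SciPy; the HbA1c pairs are illustrative placeholders, not study data.

```python
# Paired pre/post HbA1c comparison for the same patients.
from scipy.stats import wilcoxon

hba1c_pre = [7.7, 8.2, 6.9, 9.4, 7.1, 8.8, 7.5]   # placeholder baseline values
hba1c_post = [7.6, 8.4, 7.0, 9.0, 7.3, 8.6, 7.8]  # placeholder follow-up values

stat, p_value = wilcoxon(hba1c_pre, hba1c_post)
print(f"W = {stat}, P = {p_value:.3f}")
```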

Results: For the majority of patients, HbA1c either stayed the same or increased, and 3/30 patients had no follow-up HbA1c. Although HbA1c values were ideally to be remeasured at 3 months, the actual interval ranged from 3 months to 1 year. The difference between pre- and post-intervention HbA1c levels was not statistically significant (P = 0.48). Median preintervention HbA1c was 7.7 (range = 6.0–13.3), while median postintervention HbA1c was 7.6 (range = 5.6–11.1). However, the value range did decrease slightly postintervention.

Discussion and Conclusion: This study revealed no significant difference in pre- and post-intervention HbA1c values. However, these results are limited by poor patient compliance with follow-up, patients' economic status, and limited patient resources. In the future, it would be beneficial to conduct a similar study in a private office setting to better assess the association of factors such as socioeconomic status and personal resources with patient compliance.


Poster Presentation Abstract Number 12


Making Comfort Count: A Hospice Prescribing Process Quality Improvement Project

Anna Thomas, Diane Hummel-Spruill, Ric Baxter

Hospice and Palliative Care Fellowship, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: In the hours of a dying patient's greatest need, hospice nurses are trained to administer medications to relieve suffering, and having a comfort pack (CP) of essential medications in the home is a crucial part of hospice care. Three plan-do-check-act (PDCA) cycles were undertaken at St. Luke's Hospice with the aim of improving time to receipt of the CP in the patient's home by March 1, 2017. The primary outcome goal was CP arrival, on average, less than 48 h from start of care (SOC); secondary goals were (1) 80% of CPs profiled by the nurse within one calendar day of SOC and (2) 80% of prescriptions returned by the physician to the pharmacy within one calendar day.

Methodology and Statistical Approach: All home hospice patients who had a CP shipped through the hospice pharmacy from January 1, 2016, to March 1, 2017, were evaluated for inclusion via a retrospective chart review. January 1, 2016, to August 15, 2016, was considered the preperiod; August 16, 2016, to March 1, 2017, during which the three PDCA cycles occurred, was the postperiod. A two-tailed Student's t-test with unequal variance was used to analyze the primary outcome.
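
The unequal-variance (Welch's) two-tailed t-test named above is available directly in SciPy; a minimal sketch with placeholder delivery times follows.

```python
# Two-tailed t-test with unequal variance (Welch's) on CP delivery times (h).
from scipy.stats import ttest_ind

hours_pre = [70.1, 66.5, 59.8, 72.4, 64.0, 68.2]    # placeholder preperiod times
hours_post = [44.2, 48.9, 41.7, 50.3, 45.5, 46.0]   # placeholder postperiod times

stat, p_value = ttest_ind(hours_pre, hours_post, equal_var=False)
print(f"t = {stat:.2f}, P = {p_value:.4f}")
```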

Results: Of the 950 charts reviewed for eligibility, 449 met the study inclusion criteria. Average time to arrival of the CP improved from 66.7 h in the preperiod to 46.1 h in the postperiod, reaching statistical significance (P < 0.0001) and meeting the study objective. The secondary objectives were also achieved during the postperiod, with 84% (goal 80%) of CPs profiled within one calendar day of SOC and 93% (vs. 77%) of prescriptions returned by the physician to the pharmacy within one calendar day of profiling.

Discussion and Conclusion: At the end of three simple PDCA cycles, significant improvements were seen in time to delivery of CPs in this hospice-oriented quality improvement project. Continued efforts and improvements should be considered to further maintain the success of this intervention.


Poster Presentation Abstract Number 13


End-tidal Carbon Dioxide as an Early Marker for Transfusion Requirement in Trauma Patients

John Tran, Jason Black, Rebecca Jeanmonod, Dhanalakshmi Thiyagarajan

Emergency Medicine Residency, St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: Within the human body, carbon dioxide (CO2) and sodium bicarbonate comprise the primary buffer system protecting against acidosis and acidemia. Blood measurement of base deficit, lactate, and pH is often utilized to detect and monitor acidosis. End-tidal CO2 (ETCO2) measurement requires no blood draw and should reflect overall acidosis, such as occurs during hemorrhage. It has the benefit of real-time measurement and can be obtained in any spontaneously breathing patient. The purpose of this study was to evaluate ETCO2 in nonintubated trauma patients as a general marker of a hypoperfused state, as well as its correlation with transfusion requirement in the first 24 h after trauma.

Methodology and Statistical Approach: This single-center prospective cohort study was conducted at a Level I trauma center. Consenting patients ages 18 and older who were not intubated, but for whom trauma activation occurred, were enrolled in the study. Nasal cannula ETCO2 detectors were placed upon arrival, and levels were recorded every 3 min for at least 6 min by a research team member. Patients were managed at the trauma team's discretion. Patients' records were subsequently reviewed to determine any transfusion requirements, length of hospital stay, operative interventions to control bleeding, and hemoglobin level. A Mann–Whitney rank sum test was conducted to analyze the data, with additional calculation of sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV).
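
The accuracy metrics named above follow directly from a 2×2 table at the chosen cutoff. In the sketch below, the counts are inferred from the reported results (6 of 41 patients transfused, sensitivity 66.7%, specificity 68.6%, PPV 26.7%, NPV 92.3%) rather than taken from the raw study data.

```python
# Sensitivity/specificity/PPV/NPV, treating ETCO2 < 30 as a "positive" screen
# for transfusion need. Counts inferred from the reported percentages.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

print(diagnostic_metrics(tp=4, fp=11, fn=2, tn=24))
```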

Results: A total of 41 patients were enrolled, with a median age of 52 years (interquartile range 27.0–66.5); 6 (14.6%) required transfusion. There was a statistical trend toward lower median ETCO2 levels in patients requiring transfusion (25.9 vs. 35, P = 0.077). Using a cutoff value of 30, the sensitivity and specificity of ETCO2 as a predictor of the need for transfusion were 66.7% and 68.6%, respectively, with a PPV of 26.7% and an NPV of 92.3%.

Discussion and Conclusion: ETCO2 may be useful in identifying sicker trauma patients, but more data are required to best determine how this technology may be applied in such a population.


Poster Presentation Abstract Number 14


Cost Implications with Utilizing a 5% versus 10% Dose Rounding Policy for Biologic Antineoplastic Agents

Jessah Villamor, Brian Waldron, Tricia Papademetrious

St. Luke's University Health Network, Bethlehem, PA, USA

Introduction/Background: As cancer-related treatment costs continue to rise, cost-minimization strategies are becoming more essential. Despite limited clinical data, it is common practice to round to within 5% of the prescribed dose, with minimal risk of producing a substantial difference in clinical effect. Although studies have found significant savings with 10% rounding in cases of noncurative intent, there are limited data comparing 5% and 10% rounding strategies in such situations. The purpose of this study was to evaluate the cost implications of a 5% versus 10% chemotherapy and biologic therapy pharmacy dose rounding policy and to assess adherence to St. Luke's University Health Network's current 5% policy.

Methodology and Statistical Approach: We conducted a retrospective chart review of all patients with an order for bevacizumab, rituximab, bortezomib, pembrolizumab, or ipilimumab from July 2014 to July 2016. Demographic information and order details were collected. Cost outcomes were reported based on average wholesale pricing. The primary outcome was cost difference when utilizing a 5% versus 10% dose rounding policy. Secondary outcomes included adherence to the current 5% policy, costs with versus without the policy, and costs due to nonadherence. Descriptive statistics were used to summarize findings.
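
A sketch of the dose-rounding logic being compared: round to the nearest whole vial only when the resulting change stays within the policy threshold. The vial size and dose are hypothetical, and rounding to whole-vial multiples is an illustrative assumption (policies may round to other increments).

```python
# Round a prescribed dose to the nearest vial multiple if the change is
# within the policy threshold (0.05 or 0.10); otherwise keep the dose.
def round_dose(prescribed_mg, vial_mg, threshold):
    rounded = round(prescribed_mg / vial_mg) * vial_mg
    if abs(rounded - prescribed_mg) / prescribed_mg <= threshold:
        return rounded
    return prescribed_mg

print(round_dose(430.0, 100.0, 0.05))  # 430.0: a 7% change exceeds the 5% policy
print(round_dose(430.0, 100.0, 0.10))  # 400.0: within the 10% policy
```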

Results: A total of 2600 orders were included; 1262 orders were eligible for 5% dose rounding, and an additional 435 orders would have been eligible for 10% rounding. Over the 2-year period, the projected cost savings with 10% versus 5% rounding was $315,092. The current 5% policy allowed for projected savings of $760,817. With the current adherence rate of 22.1%, our projected cost savings are approximately $288,918. Bethlehem and Allentown were identified as the highest contributors to the low adherence rate, at 18.8% and 4.8%, respectively.

Discussion and Conclusion: The projected cost savings identified with 10% rounding were similar to those reported in existing shorter-duration studies. Although our findings show potentially substantial savings, there are still no clinical outcome data to support the higher threshold, suggesting the need for a risk-benefit determination. In addition, adherence to the policy was likely affected by the study period beginning immediately after policy implementation. Nevertheless, the findings from this study indicate a need for closer evaluation and reeducation of pharmacists.




 
