ORIGINAL ARTICLE
Year : 2017  |  Volume : 3  |  Issue : 1  |  Page : 10-15

The quality reporting reality at a large Academic Medical Center: Reporting 1600 unique measures to 49 different sources


1 College of Medicine, The Ohio State University, Columbus, OH, USA
2 Division of Health Services Management and Policy, College of Public Health, The Ohio State University, Columbus, OH, USA
3 The Department of Family Medicine, The Ohio State University, Columbus, OH, USA
4 Division of Health Services Management and Policy, College of Public Health, The Ohio State University; The Department of Family Medicine, The Ohio State University, Columbus, OH, USA
5 Department of Quality and Operations Improvement, The Ohio State University Wexner Medical Center, Columbus, OH, USA
6 Division of Health Services Management and Policy, College of Public Health, The Ohio State University; The Department of Family Medicine, The Ohio State University; Department of Bioinformatics, The College of Medicine, The Ohio State University, Columbus, OH, USA
7 Department of Quality and Operations Improvement, The Ohio State University Wexner Medical Center; Department of Surgery, The College of Medicine, The Ohio State University, Columbus, OH, USA

Date of Web Publication7-Jul-2017

Correspondence Address:
Jennifer L Hefner
2231 North High Street, Columbus, Ohio 43210
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/2455-5568.209844

  Abstract 


Objective: This paper explores the complexity and costs associated with quality reporting at a large, tertiary Academic Medical Center (AMC).
Methods: For each quality measure reported during fiscal year (FY) 2014, we noted the associated agency or registry and any repeated use of identical measures (measure overlap). We also examined the cost of quality reporting, including personnel, registry expenses, and pay-for-performance rewards and penalties.
Results: The AMC reported over 1600 unique measures to 49 different sources; measure overlap was 9%. This effort cost $2,367,168, including the cost of 24.8 full-time equivalent employees hired specifically to conduct or support quality reporting. Pay-for-performance rewards and penalties netted a gain of $29,197, leaving a net FY 2014 cost of quality reporting of $2,337,971.
Conclusions: There are financial and personnel burdens associated with quality reporting, as well as considerable inefficiencies. As the number of metrics increases, measures need to be carefully assessed, standardized across agencies, and incorporated into electronic health records.
The following core competencies are addressed in this article: Systems-based practice, Practice-based learning and improvement.

Keywords: Academic Medical Center, administrative data, health services research, health-care administration, quality measurement


How to cite this article:
Murray KR, Hilligoss B, Hefner JL, McAlearney AS, VanBuren A, Huerta TR, Moffatt-Bruce S. The quality reporting reality at a large Academic Medical Center: Reporting 1600 unique measures to 49 different sources. Int J Acad Med 2017;3:10-5

How to cite this URL:
Murray KR, Hilligoss B, Hefner JL, McAlearney AS, VanBuren A, Huerta TR, Moffatt-Bruce S. The quality reporting reality at a large Academic Medical Center: Reporting 1600 unique measures to 49 different sources. Int J Acad Med [serial online] 2017 [cited 2017 Nov 21];3:10-5. Available from: http://www.ijam-web.org/text.asp?2017/3/1/10/209844




Introduction


There is increasing pressure on hospitals and health systems to report on a growing number of clinical quality data measures. The goal of this public reporting is to facilitate transparent analysis of health-care quality across the nation,[1] and subsequently inform quality improvement (QI) programs.[2],[3],[4] Health-care funders and policy agencies also use publicly reported clinical quality data to inform guideline development and evaluate adherence to best practices.[5] Public reporting on quality is touted as producing overall quality gains,[6] and the transparency that comes from these public data can spur competition for increased quality.[7]

Yet for all the benefits quality reporting may bring, it is not without costs. The sheer volume of reporting measures, the significant discrepancies between like measures for different reporting agencies, and the need to verify measures with administrative data create an enormous burden on clinicians, hospitals, and health systems,[8],[9],[10],[11],[12],[13] including increasing health-care providers' workloads.[3],[14] Just as many efforts to improve the value of care have focused on eliminating inefficiencies in clinical care processes,[15],[16] so efforts to improve the overall value of health care must reduce redundancies and other inefficiencies in quality reporting. Every dollar spent on reporting is one less dollar available for direct patient care. This is an argument not against quality reporting but for increasing the value produced by that reporting. The value of any endeavor is increased by either improving the outcomes achieved by it, reducing the costs incurred by it, or both.[17] Thus, an important first step in increasing the value of quality reporting is assessing associated costs.

Despite the increasing emphasis on quality measures and public reporting, few studies have explored the financial resources needed to adhere to these reporting standards.[3] A handful of studies have looked at the cost of reporting to a few agencies from the perspective of a small practice,[18],[19],[20] or at the cost of implementing a specific quality program.[21],[22],[23],[24],[25] Our study aimed to take a new perspective by exploring the complexity and costs associated with reporting on quality at a large Academic Medical Center (AMC). The results section starts with a visual presentation of the exponential growth in the Centers for Medicare and Medicaid Services (CMS) reporting metrics between 2004 and 2017. We then present details on the breadth and complexity of quality reporting within our AMC, including a calculation of the number of metrics and agencies reported to, and the human and financial resources necessary to meet these quality-reporting requirements.


Methods


Study site

The study site was a tertiary care AMC in a large metropolitan city. The AMC serves all populations and patients, with Medicaid and underserved populations accounting for approximately 25% of the patient mix. The medical campus includes six hospitals – a heart and vascular hospital, a cancer hospital, a rehabilitation hospital, an inpatient psychiatric facility, a community hospital, and a university hospital. The AMC has a total of nearly 1300 beds and an annual average of 57,000 discharges. The Quality and Operations Improvement Department coordinates initiatives such as event reporting, clinical resource utilization review, evidence-based medicine promotion, technology assessment, and quality data reporting.

Data collection

We collated and totaled all the measures reported by the AMC (all six hospitals) to various agencies and registries (further referred to as “agencies”) during fiscal year (FY) 2014, grouping the list of agencies into six categories as shown in [Table 1]. These categories were: (1) Accreditation – agencies that certify hospitals and health-care organizations (e.g., The Joint Commission), (2) CMS – as the largest payer, CMS has a number of mandated reporting requirements, (3) registry – registries are observational in nature and use clinical data to track disease or procedure outcomes, (4) managed care payer – other managed care payers (e.g., Blue Cross Blue Shield) who often offer distinctions in quality areas or service lines and use payer data to evaluate quality, (5) ranking – consumer-focused ranking organizations (e.g., US News and World Report and Leapfrog), and (6) miscellaneous – state reporting, laboratory reporting, and internal reporting.
Table 1: Reporting categories and associated agencies



Six variables were recorded for each agency. Agency category refers to the type of agency, as described above. The number of reported metrics was summed to provide a total metrics reported variable. Financial variables included subscription fees and licensing fees. Personnel costs captured the full-time equivalent (FTE) staff time involved in reporting to each agency, with salaries apportioned based on effort. Those responsible for reporting generally devoted more than 75% of their effort to reporting, which allowed us to ensure conformity and standardization in the data submission process.

We also calculated the value of quality metric incentives: the pay-for-performance rewards and penalties that CMS reports through hospital value-based purchasing (HVBP) and the hospital-acquired condition (HAC) program.

Analysis

We analyzed the variables recorded from each agency to determine the total costs of reporting across all agencies and within reporting categories. The degree to which metrics overlapped was defined as the percent of metrics that were used by more than one agency (overlapping metrics were identically defined). The total cost was the sum of subscription fees, licensing fees, and personnel costs per agency. The overall cost of reporting for the AMC was calculated as the sum of all total costs across agencies. The average cost per agency was the overall cost divided by the number of total agencies (n = 49). A breakdown analysis was performed for each agency and a subtotal of the total cost was calculated for each agency category. The percentage of overall cost for each agency and for each agency category was also calculated using overall cost.
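As an illustration, the cost roll-up and overlap calculation described above can be sketched in a few lines of code. The agency records below are hypothetical stand-ins, not the AMC's actual FY 2014 data:

```python
from collections import Counter

# Hypothetical agency records: (name, category, subscription fees,
# licensing fees, personnel cost, metrics reported). Names, categories,
# and dollar amounts are illustrative only.
agencies = [
    ("Registry A", "registry",     50_000, 10_000,  90_000, ["mortality_v1", "ssi"]),
    ("CMS",        "cms",               0,      0, 100_000, ["mortality_v2", "readmit_30d", "ssi"]),
    ("Payer B",    "managed care",      0,  5_000,  20_000, ["readmit_30d"]),
]

# Total cost per agency = subscription fees + licensing fees + personnel costs.
total_cost = {name: sub + lic + staff for name, _, sub, lic, staff, _ in agencies}

overall_cost = sum(total_cost.values())                 # sum across all agencies
average_cost_per_agency = overall_cost / len(agencies)

# Overlap: percent of unique metrics reported, with identical definitions,
# to more than one agency (mortality_v1 and mortality_v2 differ, so they
# do not overlap).
counts = Counter(m for *_, metrics in agencies for m in metrics)
overlap_pct = 100 * sum(c > 1 for c in counts.values()) / len(counts)

print(f"overall ${overall_cost:,}; average ${average_cost_per_agency:,.0f}; "
      f"overlap {overlap_pct:.0f}%")
```

Note that the two differently defined mortality measures count as distinct metrics, mirroring the paper's requirement that overlapping metrics be identically defined.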


Results


Number of reporting metrics

[Figure 1] presents a visual depiction of the growth in CMS quality measures. Required metrics grew from 10 in 2006 to a projected 154 metric categories in 2016. [Table 2] presents the breadth of CMS measures across the quality, efficiency, and meaningful use categories for 2014, a total of 541 metrics. Including CMS and all other agencies, the AMC reported 1662 metrics to 49 agencies in FY 2014. The overlap between metrics used by more than one agency was 9%.
Figure 1: Centers for Medicare and Medicaid Services quality measures: Number of measures over time

Table 2: Centers for Medicare and Medicaid Services measures related to quality, efficiency, and meaningful use for 2014



Cost of reporting

The overall cost to the medical center for quality reporting was $2,367,168. Salaries made up the majority of these costs: the 24.8 FTE staff who managed quality reporting across all hospitals cost $1,818,232.

Detailed costs by reporting category are presented in [Table 3]. As a category, registry costs represented the majority of reporting expenses. Miscellaneous costs, including administration costs, were the next largest cost category. Managed care payers and ranking agencies did not materially impact total costs.
Table 3: Costs of quality measure reporting by reporting requirement category



Pay-for-performance

CMS is currently the only agency that participates in pay-for-performance programs.

The hospital received rewards of $285,365 for HVBP and paid penalties of $256,168 for readmission reduction, for a net gain of $29,197. Because $100,000 was spent on quality reporting to CMS, $70,803 of CMS reporting costs were not reimbursed through pay-for-performance programs. In sum, after accounting for pay-for-performance gains and losses, $2,337,971 was spent on quality reporting in FY 2014. Beginning in 2015, HAC penalties will also apply.
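The net figures above follow directly from the reported amounts and can be checked with simple arithmetic:

```python
# Figures reported in the paper (FY 2014, US dollars).
hvbp_reward = 285_365            # hospital value-based purchasing reward
readmission_penalty = 256_168    # readmission reduction penalty
net_p4p_gain = hvbp_reward - readmission_penalty             # net pay-for-performance gain

cms_reporting_cost = 100_000     # spent on quality reporting to CMS
unreimbursed_cms_cost = cms_reporting_cost - net_p4p_gain    # CMS cost not offset

overall_reporting_cost = 2_367_168                           # all agencies, FY 2014
net_cost = overall_reporting_cost - net_p4p_gain             # net cost of reporting

print(f"net gain ${net_p4p_gain:,}; unreimbursed CMS cost ${unreimbursed_cms_cost:,}; "
      f"net reporting cost ${net_cost:,}")
```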


Discussion


This case study calls attention to the considerable reporting burden and financial costs associated with meeting a health system's growing quality reporting requirements. For the AMC, reporting 1662 metrics to 49 agencies cost nearly $2.4 million in FY 2014. Registries made up the largest portion of quality reporting costs, likely due to a combination of high subscription fees, the collection of a large number of measures, and the need to input data manually into paper- or web-based registry systems, which increases staff workload, as has also been reported in the literature.[26] These registries are voluntary, and no single registry is absolutely required for patient care payment. Registries often have a narrow focus, resulting in few overlapping measures and adding to the overall complexity and inefficiency of quality reporting. The benefits of these detailed data, however, can be realized when the AMC wants to track and analyze diseases or procedures to identify opportunities for QI and reduced care costs.[27] In practice, this opportunity is leveraged in other countries that have implemented governmental disease registries [28] and in comparative effectiveness research studies that attempt to determine optimal treatments and patterns of care.[29] Yet the question remains as to which stakeholders should carry the burden of paying the subscription and licensing fees for these very expensive registries.[30]

That said, AMCs increasingly feel pressure to submit data to these registries to facilitate data collection and adherence to Joint Commission and CMS regulations and expectations. A striking finding was that measure overlap (identically defined measures) across agencies was only 9%. For example, mortality is calculated in different ways: CMS, the Agency for Healthcare Research and Quality, and the American Heart Association each use different definitions for their reporting purposes.[12] This low rate of measure overlap supports calls in the literature to standardize metrics across reporting agencies,[8],[12],[31],[32] as well as calls for a universal reporting system.[33] Electronic health record (EHR) vendors could facilitate standardization by partnering with reporting agencies to design EHR enhancements with quality reporting in mind. For instance, given the high costs associated with reporting to registries, EHR vendors could consider how registry measures could be gathered more easily through the EHR system, reducing the inefficiencies of current manual processes.

The pressure to improve efficiency and reduce costs raises questions about the return on this investment in quality reporting. Because it is necessary for regulatory purposes and increasingly tied to payment, quality reporting is the cost of doing business. While investing in quality reporting may seem expensive, its value may be more appropriately assessed in the context of impact rather than return. In as much as quality reporting allows provider organizations to benchmark against peers and identifies areas for improvement, this reporting enhances value. In addition, quality metrics are used by a multitude of ranking systems, which could ultimately lead to higher payer reimbursement levels, better-negotiated contract rates, and gains in both market share and reputation.

There is currently very little literature against which to compare our costs with those of other AMCs. It is our hope that by sharing our experience, other AMCs will come forward and advocate for more collaborative or standardized quality reporting processes. Finally, the current process of submitting quality data is very labor-intensive and has not, to date, been streamlined by the electronic medical record. The push for electronic CMS measures and electronic registry submission is more important than ever, and processes to support reliable submission must be encouraged. The measures we present here were pulled from a number of freestanding databases overseen by a variety of clinical entities, few of which could have been abstracted electronically, owing to technical or reliability factors.

Limitations

Limitations of this study include the fact that these cost data did not account for organizational QI efforts or staff time spent at QI meetings. Nor did our analyses include costs associated with publicizing quality data for public relations or using rankings in marketing. Additional staff costs that were not included were the time associated with coding patient visits and the time needed to develop reporting queries from the EHR. Other nonstaff costs omitted included office space, supplies, and computer hardware and software. Given these omissions, we believe the cost figures presented here are a conservative estimate of the true costs of quality reporting.

Our study was also limited by the decisions we made about cost-related variables. Other organizations such as for-profit hospitals might need to consider additional costs that are accounted for differently in a not-for-profit entity. In addition, we recognize that the costs we reported reflect the costs of maintaining reporting systems. Costs associated with implementing a new registry or working with a new reporting agency may be higher given the need to identify proper data sources, develop new reports, and train staff about new metrics. Finally, we recognize the costs we report are specific to our market. Labor costs will vary based on location, and participation in some statewide registries is idiosyncratic to the state or region in which a facility is embedded.


Conclusions


Quality reporting is increasingly important, and it is crucial for health-care organizations to be involved in registries and report to accreditation agencies and payers. However, there are financial and personnel burdens associated with quality reporting and considerable inefficiencies. The lack of standardization across metrics and variability in data abstraction methods only further complicate the process. Health-care organizations can take advantage of opportunities to promote standardization while critically evaluating which agencies are appropriate for quality reporting goals. For example, based on this analysis, our AMC has decided to leverage the ability of our electronic medical record to automatically complete required reporting; while initially resource-intensive, this process will ultimately reduce the human resources needed to complete reporting requirements. In addition, we are only committing to clinical data registries that are aligned with our strategic missions. This sometimes requires simply saying no to nonmandatory reporting entities. Clearly, quality reporting is neither going away nor should it. However, as the cost of quality reporting becomes accepted as the cost of doing business, reporting must be assessed on the value that it brings to health care. Developing a better understanding of the costs of quality reporting is the first step in that direction.

Acknowledgment

The authors would like to thank Anne VanBuren for her assistance with this project. This project was supported by the Institute for the Design of Environments Aligned for Patient Safety (IDEA4PS) at The Ohio State University, which is sponsored by the Agency for Healthcare Research and Quality (P30HS024379). The authors' views do not necessarily represent the views of the Agency for Healthcare Research and Quality.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Pronovost PJ, Goeschel CA. Viewing health care delivery as science: Challenges, benefits, and policy implications. Health Serv Res 2010;45(5 Pt 2):1508-22.
2. Galvin RS, McGlynn EA. Using performance measurement to drive improvement: A road map for change. Med Care 2003;41(1 Suppl):I48-60.
3. Pham HH, Coughlan J, O'Malley AS. The impact of quality-reporting programs on hospital operations. Health Aff (Millwood) 2006;25:1412-22.
4. Wharam JF, Frank MB, Rosland AM, Paasche-Orlow MK, Farber NJ, Sinsky C, et al. "Pay-for-performance" as a quality improvement tool: Perceptions and policy recommendations of physicians and program leaders. Qual Manag Health Care 2011;20:234-45.
5. Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, et al. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med 2006;144:742-52.
6. Marshall MN, Shekelle PG, Leatherman S, Brook RH. The public release of performance data: What do we expect to gain? A review of the evidence. JAMA 2000;283:1866-74.
7. Longo DR, Land G, Schramm W, Fraas J, Hoskins B, Howell V. Consumer reports in health care. Do they make a difference in patient care? JAMA 1997;278:1579-84.
8. Ix M. Reducing the administrative burden of health care quality reporting. Find Brief 2008;11:1-4.
9. Hearld LR, Alexander JA, Shi Y, Casalino LP. Pay-for-performance and public reporting program participation and administrative challenges among small- and medium-sized physician practices. Med Care Res Rev 2014;71:299-312.
10. American Hospital Association. Quality Reporting and Pay-for-Performance; 2014. Available from: http://www.aha.org/content/14/ip-qualreport.pdf. [Last accessed on 2016 Feb 05].
11. Cassel CK, Conway PH, Delbanco SF, Jha AK, Saunders RS, Lee TH. Getting more performance from performance measurement. N Engl J Med 2014;371:2145-7.
12. Bazinsky D, Bailit M. The Significant Lack of Alignment Across State and Regional Health Measure Sets — Health Care Performance Measurement Activity: An Analysis of 48 State and Regional Measure Sets. Needham, Massachusetts: Bailit Health Purchasing, LLC; 2013.
13. Meyer GS, Nelson EC, Pryor DB, James B, Swensen SJ, Kaplan GS, et al. More quality measures versus measuring what matters: A call for balance and parsimony. BMJ Qual Saf 2012;21:964-8.
14. Wachter RM. Expected and unanticipated consequences of the quality and information technology revolutions. JAMA 2006;295:2780-3.
15. DelliFraine JL, Langabeer JR 2nd, Nembhard IM. Assessing the evidence of Six Sigma and Lean in the health care industry. Qual Manag Health Care 2010;19:211-25.
16. Glasgow JM, Scott-Caziewell JR, Kaboli PJ. Guiding inpatient quality improvement: A systematic review of Lean and Six Sigma. Jt Comm J Qual Patient Saf 2010;36:533-40.
17. Porter ME. What is value in health care? N Engl J Med 2010;363:2477-81.
18. Halladay JR, Stearns SC, Wroth T, Spragens L, Hofstetter S, Zimmerman S, et al. Cost to primary care practices of responding to payer requests for quality and performance data. Ann Fam Med 2009;7:495-503.
19. Reiter KL, Halladay JR, Mitchell CM, Ward K, Lee SY, Steiner B, et al. Costs and benefits of transforming primary care practices: A qualitative study of North Carolina's improving performance in practice. J Healthc Manag 2014;59:95-108.
20. West DR, Radcliff TA, Brown T, Cote MJ, Smith PC, Dickinson WP. Costs associated with data collection and reporting for diabetes quality improvement in primary care practices: A report from SNOCAP-USA. J Am Board Fam Med 2012;25:275-82.
21. Kilpatrick KE, Lohr KN, Leatherman S, Pink G, Buckel JM, Legarde C, et al. The insufficiency of evidence to establish the business case for quality. Int J Qual Health Care 2005;17:347-55.
22. Watts B, Augustine S, Lawrence RH. Teaching quality improvement in the midst of performance measurement pressures: Mixed messages? Qual Manag Health Care 2009;18:209-16.
23. Brown SE, Chin MH, Huang ES. Estimating costs of quality improvement for outpatient healthcare organisations: A practical methodology. Qual Saf Health Care 2007;16:248-51.
24. O'Beirne M, Reid R, Zwicker K, Sterling P, Sokol E, Flemons W, et al. The costs of developing, implementing, and operating a safety learning system in community practice. J Patient Saf 2013;9:211-8.
25. Severens JL. Value for money of changing healthcare services? Economic evaluation of quality improvement. Qual Saf Health Care 2003;12:366-71.
26. Ryan TJ. Large cardiac registries: The path to higher quality and lower cost in our healthcare system. Circulation 2010;121:2612-4.
27. Larsson S, Lawyer P, Garellick G, Lindahl B, Lundström M. Use of 13 disease registries in 5 countries demonstrates the potential to use outcome data to improve health care's value. Health Aff (Millwood) 2012;31:220-7.
28. National disease registries for advancing health care. Lancet 2011;378:2050.
29. Shah BR, Drozda J, Peterson ED. Leveraging observational registries to inform comparative effectiveness research. Am Heart J 2010;160:8-15.
30. Hawn MT. Surgical registries: Effective, but how to pay for them? Comment on "More recurrences after hernia mesh fixation with short-term absorbable sutures." Arch Surg 2011;146:17.
31. Hussey PS, Luft HS, McNamara P. Public reporting of provider performance at a crossroads in the United States: Summary of current barriers and recommendations on how to move forward. Med Care Res Rev 2014;71(5 Suppl):5S-16S.
32. American Hospital Association, Federation of American Hospitals, Association of American Medical Colleges. Envisioning the Roadmap for Nationwide Hospital Quality Reporting; 2006.
33. California Health Care Foundation. Creating a Statewide Hospital Quality Reporting System; 2002. Available from: http://www.chcf.org/~/media/MEDIA LIBRARY Files/PDF/C/PDF CreatingHospitalQualityReporting.pdf. [Last accessed on 2016 Feb 05].





 
