Position Statement

Comparative Effectiveness Research

This position statement was developed as an educational tool based on the opinion of the authors. It is not a product of a systematic review. Readers are encouraged to consider the information presented and reach their own conclusions.

Patients, physicians and other healthcare providers deserve access to high quality information that optimizes their capacity to identify which diagnostic, treatment, and prevention services do or do not work and under what conditions. There is strong evidence that the lack of this type of information degrades the ability of patients, physicians and other healthcare providers to choose the best treatment options.

The American Academy of Orthopaedic Surgeons (AAOS) believes that developing high quality objective information will improve informed patient choice, shared decision-making, and the clinical effectiveness of physicians’ and other healthcare providers’ treatment recommendations.

Overview and History

Comparative Effectiveness (CE), in its broadest sense, refers to the evaluation of the relative (clinical) effectiveness, safety, and cost of two or more medical services, drugs, devices, therapies, or procedures used to treat the same condition. Many organizations and institutions, including the AAOS, have a long history of both individual and combined efforts in developing and encouraging the use of this type of information. The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 mandated research on “outcomes, comparative clinical effectiveness, and appropriateness of health care.” Recent announcements of both the development of CE research and its subsequent deployment have generated considerable concern. This concern, in large part, may be due to the perception that comparative effectiveness equates to cost-effectiveness. Utilization of cost-effectiveness information raises issues related to limiting access, deployment for cost-containment purposes, and rationing of care. Unfortunately, a coordinated effort to educate the many stakeholders and clearly define the term “comparative effectiveness” has been lacking.

Components of Comparative Effectiveness Research (CER)

Clinical Effectiveness

Comparative Effectiveness Research and analysis evaluate the relative value of drugs, devices, diagnostic and surgical procedures, diagnostic tests, and medical services. In this context, value is defined as the clinical effectiveness of a treatment or service compared with the clinical effectiveness of alternative treatments or services, with appropriate consideration of costs as one of many variables. CER has the potential to promote care of higher value (better quality at the same or lower cost) in the public and private sectors.

Cost-Effectiveness

Analysts use many terms and techniques (cost-benefit analysis, cost-utility analysis, cost-identification analysis) when cost factors are a significant component in comparing the effectiveness of different medical interventions. Cost-effectiveness analysis (CEA) is the most common technique used to quantify this comparison. CEA compares the incremental economic cost per unit of health outcome gained among different interventions. In other words, it attempts to standardize the unit used to measure the health benefit of a dollar spent across different fields of medicine. CEA produces a single ratio: the difference in cost between two given interventions divided by the difference in their health effectiveness or clinical outcomes. This yields an Incremental Cost-Effectiveness Ratio (ICER) that is the basis for cost-effectiveness comparisons.

ICER = (Cost of Intervention A - Cost of Intervention B) / (Health Outcome of Intervention A - Health Outcome of Intervention B)

Quantifying health outcomes remains a complex and difficult task. Because the ICER is a ratio, it increases when the incremental improvement in health outcomes (clinical effectiveness) is small or when the difference in cost is large. A larger ICER therefore indicates a more expensive intervention that provides relatively little added health benefit.
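
For illustration, the following minimal sketch in Python shows how an ICER is computed. The costs, outcomes, and function name are hypothetical and are not drawn from this statement; the quality-adjusted life year (QALY) appears only as one common example of a health outcome unit.

    # Minimal sketch: hypothetical inputs, not data from this statement.
    def incremental_cost_effectiveness_ratio(cost_a, cost_b, outcome_a, outcome_b):
        """Return the ICER: incremental cost per unit of health outcome gained."""
        delta_cost = cost_a - cost_b            # difference in cost between interventions
        delta_outcome = outcome_a - outcome_b   # difference in health outcome (e.g., QALYs)
        if delta_outcome == 0:
            raise ValueError("Identical outcomes: the ICER is undefined.")
        return delta_cost / delta_outcome

    # Hypothetical example: Intervention A costs $12,000 and yields 1.5 QALYs;
    # Intervention B costs $8,000 and yields 1.3 QALYs.
    icer = incremental_cost_effectiveness_ratio(12_000, 8_000, 1.5, 1.3)
    print(f"ICER: ${icer:,.0f} per QALY gained")   # prints: ICER: $20,000 per QALY gained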

Safety

Many in the general medical and scientific community do not explicitly consider safety a component of Comparative Effectiveness Research. However, high quality CER must include and highlight this component. Safety considerations should be at the leading edge of any effort to deploy the findings of scientifically rigorous CER.

The Framework of Comparative Effectiveness Research

The Entity

The AAOS believes that an independent public-private entity should be established to conduct, prioritize, and coordinate Comparative Effectiveness Research. This entity should be built on the following principles so that it provides high quality information to inform patients, physicians and other healthcare providers in their choice among alternative treatments and services.

  • Single entity coordinating CER initiatives to avoid redundant efforts
  • Stand-alone governance with federal and political independence
  • Stable dedicated financing (public and private)
  • Transparent, trustworthy methods
  • Produce objective, timely research
  • Legitimate governance and organizational structure with broad stakeholder representation
  • Broad public input including public meetings and comment periods
  • Centralized database of all CER activities accessible to all stakeholders
  • Wide dissemination of information on a regular and recurring basis
  • Provide a forum for addressing conflicts
  • The entity should not have a role in payment or coverage decisions

The AAOS believes it is important and useful for information on all CER activities to be shared. We believe it is necessary for a CER entity to act in a transparent, consistent manner. Transparency facilitates coordination and participation among stakeholders. Sharing the activities and priorities for CER allows public and private entities to coordinate their initiatives and reduce overlapping efforts. Conversely, a lack of transparency and dissemination of information creates a distrustful environment and, in turn, inaccurate assumptions. The AAOS believes that CER stakeholder groups will benefit from sharing information and collaborating on a shared initiative.

The Scope

Ideally, any CER entity should have broad authority to set priorities for the research conducted. Primary and secondary research priorities should be established with input from all relevant stakeholders. Research methodology may include systematic reviews, analysis of claims records, medical registries, randomized controlled trials, and recently described pragmatic or practical clinical trials.

Establishing the scope of potential CER is complicated by issues such as the concern that outcomes with medical devices, unlike those with pharmaceuticals, depend not just on the device but also on the skill of the operator and procedural technique. Initial outcomes and total costs associated with new therapies might not be favorable, but over time, with greater operator expertise and a fuller understanding of how best to use the device, a therapy may become more valuable and clinically effective.

The AAOS believes that, in establishing the scope of Comparative Effectiveness Research, careful consideration should be given to aligning research priorities with real-world questions faced by patients, physicians and other healthcare providers in their medical decision-making.

The AAOS has renewed its efforts to provide high quality, evidence-based information to its members through both evidence-based Clinical Practice Guidelines and Technology Overviews. Indeed, these efforts meet the highest standards of evidence-based methodological rigor, and the objectivity of our guidelines has been noted in the Journal of the American Medical Association.1 Further, we provide extensive documentation for each of these two types of publications on the AAOS website so that end-users can verify that they are objective. The links for Overviews and Guidelines are at http://www.aaos.org/research/research.asp

Our efforts to implement evidence-based medicine give us concern that if CER efforts follow existing models of clinical research and evidence-based medicine, their impact on clinical practice will be blunted or delayed. Additional pertinent issues include:

  • There is little, if any, published, peer-reviewed data showing that previously published, evidence-based systematic reviews (including CER) have been successfully translated into practice, and the evidence suggesting that clinical practice guidelines can influence practice also suggests that substantial direct-to-physician efforts are required to make this happen;2,3
  • A lack of integration between public/private efforts and specialty society clinical practice guidelines could create competitive (if not conflicting) efforts; and
  • There is no method to coordinate dissemination of CER and the evidence-based efforts of specialty societies. In short, we are concerned that efforts made to produce and disseminate CER in the absence of significant specialty society involvement will lead to research that is not effectively translated into practice.

We strongly suggest that CER efforts be closely coordinated with specialty societies. This coordination should take place at both the development and dissemination phases of CER. It should take place at the development stage because clinical trials do not always study the patients who are of interest to clinicians.4 Systematic reviews that rely on such trials could, therefore, have an uncertain impact on patients. Further, existing systematic reviews use different methodologies and different rules for determining which studies to examine.5 These differences affect their conclusions.6 Clinician input on which studies are most relevant to their practice seems vital, as is consensus on the methodology that will be used in any given project.

Coordination should take place prior to the dissemination phase because there is limited (if any) evidence that publicly and privately prepared evidence-based reviews have had a significant impact on clinical practice. There is some evidence, however, that evidence-based guidelines do affect practice, at least when they are appropriately disseminated (see references above). We suggest that CER be performed in coordination with, or integrated into, specialty societies’ evidence-based clinical practice guideline efforts (especially those of specialty societies that employ highly rigorous evidence-based processes and methodological standards of guideline development) given that:

  1. Specialty societies make their practice recommendations through clinical practice guidelines;
  2. These societies have existing means of information dissemination; and
  3. These means are likely more effective than those available to either public or private entities.

The AAOS believes, consistent with published peer-reviewed evidence, that CER developed and disseminated according to current models will have limited impact on clinical practice, unless it is closely coordinated with specialty societies.

Issues in Deploying Comparative Effectiveness Research Results

Potential Adverse Effects on Quality

Recent federal projections of the cost savings from deploying CER results show that net savings in the cost of health care will be possible only if there is a resultant change in payment rules or cost-sharing requirements. Without these changes, the cost of conducting CER would equal or exceed the projected savings. This fact could drive efforts to use CER to set payment and inform coverage decisions. The AAOS is concerned that if a CER entity focuses on the cost of care, we risk the counterintuitive result of decreasing quality and access to needed care.

Potential Adverse Effects on Innovation

New technologies and innovations, when first introduced, frequently do not demonstrate favorable comparative effectiveness ratios. Learning-curve issues and the lack of refined clinical indications frequently affect early measurements of comparative effectiveness. We believe comparative effectiveness should not stifle innovation and the development of new technologies. We also believe it is important to consider special populations when developing CER. Patients are quite diverse, and a one-size-fits-all approach will not improve the quality of care for our patients.

The AAOS is committed to working with a broad range of public and private entities, including governmental agencies, medical professionals, patients, and others to improve the quality and accessibility of information available to patients, physicians, other healthcare providers, payers, and policymakers regarding the quality and value of different health care interventions. This will require a concerted multi-stakeholder effort to develop and report patient-specific, risk-adjusted measures of health outcomes, as opposed to relying exclusively on cost of care as the basis for comparing treatment interventions.

References:

    1. Voelker R: Guideline provides evidence-based advice for treating osteoarthritis of the knee. JAMA. 2009;301(5):475-476.

    2. Weingarten S: Translating practice guidelines into patient care. Chest. 2000;118:4S-7S.

    3. Cloutier MM, Hall CB, Wakefield DB, Bailit H: Use of asthma guidelines by primary care providers to reduce hospitalizations and emergency department visits in poor, minority, urban children. J Pediatr. 2005;146:591-597.

    4. Thorpe KE, Zwarenstein M, Oxman AD, et al: A pragmatic-explanatory continuum indicator summary (PRECIS): A tool to help trial designers. J Clin Epidemiol. 2009;62:464-475.

    5. Peinemann F, McGauran N, Sauerland S, Lange S: Disagreement in primary study selection between systematic reviews on negative pressure wound therapy. BMC Med Res Methodol. 2008;8:41.

    6. Jadad AR, Cook DJ, Browman GP: A guide to interpreting discordant systematic reviews. Can Med Assoc J. 1997;156:1411-1416.

Resources:

Kirschner N: The American College of Physicians, A Policy Paper; Improved Availability of Comparative Effectiveness Information: An Essential Feature for a High-Quality and Efficient United States Health Care System, January 2008.

American College of Physicians. Information on cost-effectiveness: An essential product of a national comparative effectiveness program. Ann Intern Med, June 17, 2008; 148(12): 956 - 961.

Wilensky GR: Cost-effectiveness information: Yes, it's important, but keep it separate, please! Ann Intern Med, June 17, 2008; 148(12): 967 - 968.

Buto K, Juhn P: Can a center for comparative effectiveness information succeed? Perspectives from a health care company. Health Aff, November 1, 2006; 25(6): w586 - w588.

Rowe JW, Cortese DA, McGinnis JM: The emerging context for advances in comparative effectiveness assessment. Health Aff, November 1, 2006; 25(6): w593 - w595.

The Congress of the United States Congressional Budget Office. Research on the Comparative Effectiveness of Medical Treatments: Issues and Options for an Expanded Federal Role, December 2007.

New England Healthcare Institute. A White Paper; Balancing Act: Comparative Effectiveness Research and Innovation in U.S. Health Care, April 2009.

Deloitte Center for Health Solutions. Comparative Effectiveness: Perspectives for Consideration, 2009, Deloitte Development LLC.

MedPAC. Report to Congress: Reforming the Delivery System, Chapter 5: Producing comparative-effectiveness information, June 2008.

Wilson NA, Schneller ES, Montgomery K, Bozic K: Hip and knee implants: Current trends and policy considerations. Health Aff, November 1, 2008; 27(6): 1587 - 1598.

Firth BG, Cooper LM, Fearn S: The appropriate role of cost-effectiveness in determining device coverage: A case study of drug-eluting stents. Health Aff, November/December 2008; 27(6): 1577 - 1586.

Emanuel EJ, Fuchs VR, Garber AM: Essential elements of a technology and outcomes assessment initiative. JAMA, September 19, 2007; 298(11): 1323 - 1325.

Institute of Medicine of the National Academies. Report Brief, Knowing What Works in Health Care: A Roadmap for the Nation, January 2008.

The Dartmouth Institute for Health Policy & Clinical Practice. A Dartmouth Atlas White Paper: An Agenda for Change, Improving Quality and Curbing Health Care Spending: Opportunities for the Congress and the Obama Administration, December 2008.

© June 2009 American Academy of Orthopaedic Surgeons.

This material may not be modified without the express written permission of the American Academy of Orthopaedic Surgeons.

Position Statement 1178

For additional information, contact the Public Relations Department at 847-384-4036.

AAOS Headquarters

6300 North River Road
Rosemont, IL 60018-4262
Phone: 847.823.7186
Fax: 847.823.8125


Washington Office

AAOS Washington, DC Office

317 Massachusetts Ave NE
1st Floor
Washington DC 20002
Phone: 202.546.4430
Fax: 202.546.5051