The AAOS Board of Directors approved the Academy’s first appropriate use criteria (AUC)—on the treatment of distal radius fractures—at their meeting on March 18, 2013. AUC, explained Kevin J. Bozic, MD, MBA, chair of the Council on Research and Quality, “are another step in ensuring that orthopaedic patients receive high quality, cost-effective, and appropriate musculoskeletal care.”
According to William C. Watters III, MD, chair of the AUC section of the Committee on Evidence-Based Quality and Value, AUC are guideline derivatives, “a product based on the evidence generated by a high quality clinical practice guideline (CPG) to which has been added an extreme dose of clinical expertise by people who treat the condition and perform the procedures the guidelines address.” AUC cover a range of treatment options and are designed to assist physicians and other medical personnel in clinical decision making.
The Treatment of Distal Radius Fractures (DRF) AUC, for example, cover 10 treatments and include 216 patient scenarios. Out of more than 2,000 different patient/treatment combinations, 36 percent were rated “appropriate,” 44 percent were rated “may be appropriate,” and 20 percent were rated “rarely appropriate” (Fig. 1).
AUC provide clinicians and patients with a measure of when, how, and for whom medical and surgical procedures are used. They differ from CPGs, however, in how they are developed and used.
“CPGs,” said Dr. Watters, “tell clinicians what they should or shouldn’t do, based on the evidence. AUC tell them what’s appropriate in most people’s eyes and what’s inappropriate in most people’s eyes. Clinicians usually have options in treating an injury. Often, several of those options may be acceptable. But sometimes only one or two options would be appropriate for a particular patient at a particular time.”
Because randomized clinical trials—the “gold standard” for evidence-based medicine—are often unavailable or do not apply to the wide range of patients seen in everyday practice, AUC can fill the gaps by drawing on the expertise and experience of clinicians who are specialists in the topic area.
The AAOS uses the RAND/UCLA Appropriateness Method to develop AUC. The process uses three separate panels, as follows:
The writing panel—clinicians (6 to 10 in number) who are experts in the topic under study. They create a list of patient indications, assumptions, and treatments based on an evidence-based systematic review of the literature by AAOS staff research analysts. The DRF AUC writing panel included representatives from the AAOS, the Orthopaedic Trauma Association (OTA), the American Society for Surgery of the Hand (ASSH) and the American Association for Hand Surgery (AAHS).
The review panel—a larger group of clinicians (10 to 30 in number) who review the writing panel’s materials and provide any suggestions for improvement. The DRF AUC review panel included representatives from the AAOS, AAHS, ASSH, OTA, American Association of Hip and Knee Surgeons (AAHKS), American Society of Plastic Surgeons, and the American College of Occupational and Environmental Medicine.
The voting panel—a multidisciplinary group that uses a review of the most current and relevant literature along with their expert clinical judgment to rate the appropriateness of each treatment for each of the various patient scenarios. The DRF AUC voting panel included representatives from the AAOS, AAHKS, AAHS, ASSH, American Academy of Family Physicians, and Pediatric Orthopaedic Society of North America.
Before developing the patient scenarios, the DRF AUC writing panel first adopted several standardized criteria (assumptions), such as “The patient is healthy enough to undergo surgery if indicated” and “A low-energy open fracture is assumed to be a Grade I or II open fracture.” The full range of assumptions is included in the AUC document.
They also carefully defined the terms used, including fracture type, mechanism of injury, associated injuries, patient’s functional demands, and American Society of Anesthesiologists (ASA) comorbidities status. The 10 treatments considered in the DRF AUC are shown in Table 1. All patient scenarios and treatments were determined after an evidence-based systematic review of the literature.
The review panel independently reviewed the materials to ensure that they were representative of patients and scenarios clinicians would be likely to encounter. Finally, the multidisciplinary voting panel rated the appropriateness of each treatment for each scenario, using a 9-point scale (Table 2). Two rounds of voting were held, with conference calls between the rounds to discuss disagreements among the ratings. The median rating is provided in the AUC, and areas of disagreement are noted.
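The median-based classification described above can be sketched in a few lines of code. This is a minimal illustration, assuming the standard RAND/UCLA convention in which a median of 7–9 maps to “appropriate,” 4–6 to “may be appropriate,” and 1–3 to “rarely appropriate”; the example ratings are invented, and the AUC document’s specific rules for flagging panel disagreement are not reproduced here.

```python
from statistics import median

def classify(ratings):
    """Classify one patient-scenario/treatment pair from its 9-point panel ratings.

    Thresholds follow the common RAND/UCLA convention (an assumption here,
    not a quotation from the DRF AUC document itself).
    """
    m = median(ratings)
    if m >= 7:
        return "appropriate"
    if m >= 4:
        return "may be appropriate"
    return "rarely appropriate"

# Invented example ratings from a hypothetical nine-member voting panel:
print(classify([8, 9, 7, 8, 9, 7, 8, 8, 9]))  # -> appropriate
print(classify([4, 5, 3, 6, 5, 4, 5, 6, 4]))  # -> may be appropriate
print(classify([2, 1, 3, 2, 2, 1, 2, 3, 1]))  # -> rarely appropriate
```

In practice the panel’s disagreement statistics matter as much as the median, which is why the published AUC note areas of disagreement alongside each rating.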
“Virtually everyone involved—including Jayson Murray, the AAOS staff manager of the AUC unit—was just superb. The three groups were all very different and extremely interested in the process,” said Dr. Watters. “The voting panel started by vigorously discussing each item, but as they went through the scenarios, it got easier. For example, regardless of fracture type, an elderly patient with high seizure risk would not be an appropriate surgical candidate.”
Mobile app available
“The full AUC can be found on the AAOS website (www.aaos.org/auc),” said Dr. Watters, “but we think most clinicians, especially the younger fellows, will find the web-based app (http://aaos.webauthor.com/go/auc) very useful. It’s optimized to work on a wide range of devices, from smartphones to desktop computers.”
The app (Fig. 2) presents an indication profile so that the clinician can select the patient and fracture characteristics. When the clinician selects the “submit” button, the list of treatment recommendations is shown, with green circled checkmarks (appropriate), yellow caution symbols (may be appropriate), and red circled Xs (rarely appropriate). The app also includes a demonstration “tour,” definitions, fracture illustrations, background information, the literature review, and the list of contributors.
“This is also the kind of quality and value information that government, payers, and administrators need,” said Dr. Watters. “Although AUC are not strictly comparative research, they are based on a very structured, well-supported process. Physicians need this information to bring best practices to bear on their patient care decisions.”