AAOS Now

Published 10/1/2010 | Craig S. Roberts, MD, MBA; J. Lawrence Marsh, MD; Laura Hruska, MEd

Surgeons test knowledge and come out on top

Practicing surgeons outperform residents on the 2009 OITE

The AAOS Orthopaedic In-Training Examination (OITE) for orthopaedic surgery residents has been around since 1963. During that time, AAOS volunteers have developed thousands of questions for the exam, and hundreds of thousands of residents have taken the test.

The process of creating the OITE extends over a period of 18 months, meaning that committee members are already working on the next examination before the current year’s exam is administered. Each year’s OITE is a unique exam with 275 new questions. Residents may keep their examinations to help direct their course of study.

In November 2009, approximately 4,400 residents from 215 residency training programs in 13 different countries took the examination. The year 2009 was notable for two firsts: the OITE was produced exclusively on a DVD-ROM, and test scoring was brought into the Academy. (See “An Electronic Orthopaedic In-Training Examination” in the October 2010 Journal of the AAOS.)

Less recognized is the fact that the OITE is also available to practicing surgeons as a continuing medical education (CME) program worth 15 AMA Category 1 CME credits. This program runs from mid-November through April, and each practitioner receives a score report with his or her results. In 2009, 809 practitioners ordered the examination and 396 returned their answer files for scoring.

How well did these orthopaedic surgeons do? An analysis of the demographics and overall performance of the practitioners who took the OITE shows some interesting results in comparison to the performance of the residents in training.

Demographics
Most practitioners (58 percent) were in private practice (Fig. 1). Just over half (52 percent) had been out of residency training for more than 16 years (Fig. 2).

About half of the surgeon test-takers reported that some portion of their practice was in general orthopaedic surgery. Respondents could also select multiple practice areas to describe their work (Table 1).

Overall results
The OITE has 12 content domains, giving practitioners the opportunity to assess their knowledge in areas outside of their normal practice. Each question is supported by recommended readings in current literature, which facilitates the surgeons’ continued education in all areas.

The distribution of practitioners’ scores is fairly tight around the median (169.5) and slightly skewed to the right. All scores in the third quartile fell within 15 points of the median, whereas scores in the fourth quartile spanned 79 points. Below the median, scores in the second quartile spanned 15 points, and scores in the first quartile spanned 98 points. In other words, the middle 50 percent of participating practitioners (the second and third quartiles) had a score range of just 30 points.

Sorting the practitioners’ results by quartiles allows interesting comparisons to the orthopaedic residents’ results by year in training. Practitioner scores in the first and second quartiles were similar to the scores of first- and second-year residents. Practitioners who scored above the median, however, tended to outscore residents. Practitioners in the fourth quartile did better on the OITE, on average, than fourth- and fifth-year residents (Table 2).

These results are interesting and somewhat surprising. We would have expected that senior residents would score better than most practicing orthopaedic surgeons. PGY-4 and 5 residents should be nearing the pinnacle of their general orthopaedic knowledge across basic and clinical specialties. On the other hand, practitioners could be expected to have a narrower focus, based more on their practice specialty, and not be as up-to-date on basic science concepts as residents.

Instead, the top half of practitioners who took the OITE performed as well as or better than senior residents. Although this is impressive, at least two explanations are likely. First, practitioners who took the test and returned their answers for scoring may be more interested in education and general orthopaedic knowledge than orthopaedic practitioners as a whole. Second, residents and practitioners took the exam under different circumstances: residents in a controlled setting and practitioners in their homes or offices.

Questions on the OITE are divided into three taxonomy levels, as shown in Table 3, which also lists the results for both residents and practitioners. Overall, practitioners (N = 396) scored better than residents (N = 4,400) at all three taxonomy levels. Scores were closest for questions that relied most heavily on memory recall (taxonomy I). The spread between practitioner and resident scores increased as the questions focused more on clinical thinking and problem-solving skills.

These results are consistent with the way clinical thinking skills develop. The practitioner treats more patients and is exposed to a greater variety of clinical issues; residents, particularly junior residents, are still developing these skills.

Why practitioners took the OITE
Practitioners reported two main reasons for taking the OITE after many years in practice: to prepare for recertification examinations and to assess the currency of their orthopaedic knowledge.

They found the examination especially helpful in specialty areas outside of their practice focus. Practitioners replied that the questions were an impetus to look up subjects that might not arise in their day-to-day practice. Some practitioners thought taking the test was a good way to learn what is being taught in orthopaedic residency programs, and those who interact with residents thought it would improve their ability to instruct them.

The change from paper to an OITE on DVD allowed videos and multi-slice 3D imaging to be included with exam questions, giving the OITE the potential to better represent what happens in actual practice. Comments from practicing orthopaedic surgeons are welcome because the goal of the new format is to develop questions that require the clinical problem-solving skills needed to treat patients; to do so, the questions must simulate the clinical situations that trigger this kind of thinking.

In summary, the 396 orthopaedic practitioners who took the OITE and returned their answers for scoring did very well; many matched or surpassed the performance of senior orthopaedic residents. They had diverse practice backgrounds and provided valuable feedback on the exam and on the new DVD-based format. These results demonstrate that general orthopaedic knowledge can be retained at a high level well past residency training. Orthopaedic surgeons are, indeed, life-long learners.

Craig S. Roberts, MD, MBA, is the current chair of the AAOS Evaluation Committee; J. Lawrence Marsh, MD, chaired the committee at the time of the 2009 OITE; Laura Hruska, MEd, is the staff liaison to the Evaluation Committee.

More information and tables (PDF)