New policies will help measure whether courses meet their educational goals
In 2006, the AAOS Council on Education undertook a year-long study on the educational effectiveness of the Academy’s continuing medical education (CME) programs. We wanted to know how effectively our courses and products were meeting their educational goals. Council project teams and staff investigated three areas of educational theory: needs assessment, educational methodology, and program evaluation.
As a result of this study and the subsequent discussions, the CME Courses Committee will be implementing changes in the courses it plans and conducts; print and electronic products will be changing as well.
New approaches to program evaluation
The council project team made several recommendations on new approaches to participant evaluation for CME courses, including a change in the style and substance of questions on the evaluation form. In the past, questions focused on the faculty presentations—the quality of the speaker and the overall value of the presentation—not on what participants learned or whether they would implement the new knowledge in their orthopaedic practice.
In a parallel development, the Accreditation Council for Continuing Medical Education (ACCME), which accredits CME providers like the AAOS, was promulgating new requirements, eight of which targeted program evaluation issues. The ACCME’s most significant new evaluation requirement states, “The provider analyzes changes in learners (competence, performance, or patient outcomes) achieved as a result of the overall program’s activities/educational interventions.”
The CME Courses Committee, under the leadership of Edward Akelman, MD, has developed new methods for evaluating Academy CME programs, including pre- and post-testing and self-rating using performance checklists for skills evaluation. To continue to improve programming and maintain ACCME accreditation while reducing the length of written evaluations, the committee recommended a mixture of audience response system questions and shortened surveys. They also developed new evaluation questions related to the use of knowledge acquired in the program.
Pre- and post-testing
Voluntary pre- and post-testing debuted at the AAOS Maintenance of Certification Preparation and Review Courses in 2006. This pilot program had substantial participation; about 80 percent of participants completed both pre- and post-tests at each course. Score data indicated that participants had a statistically significant increase in knowledge. In 2007, this evaluation technique will be used in 14 courses.
Self-rating performance checklists
Performance checklists, which enable participants to evaluate their surgical skills both before and after completing a skills lab procedure, were tested in three courses in 2006. Although studies report that physicians are not effective self-raters, the committee believes the use of checklists will prove valuable to participants in AAOS surgical skills courses.
A checklist breaks down a procedure into its smallest measurable unit of activity and is enhanced with knowledge issues and failure points to help the participant complete the self-rating. Course faculty find the checklists are valuable teaching aids during lab sessions.
New evaluation questions
The committee also developed a new approach to evaluating a CME course. Course directors will work with staff to develop a course-specific evaluation plan that could include use of audience response systems as well as paper forms. Daily questionnaires—with a focus on evaluating what the participant learned and how the new knowledge will be implemented in practice—will be used, rather than an aggregated questionnaire at the end of a course.
Previous evaluation forms asked participants to rate each faculty member’s talk and their overall satisfaction with the course. Now the form (sidebar p. 51) will ask about a participant’s goals for attending a course—to gain new knowledge/skills or to confirm existing knowledge/skills. Specific questions related to course content will focus on the extent to which participants added to their knowledge. Open-ended questions provide the opportunity to reflect on implementation by asking, “How will the knowledge gained influence your daily practice?”
In CME programs sponsored by other medical specialty societies, effectiveness is intended to be measured by a change in practice by the participating physician. Although surgeons may change their practices as a result of CME, they more often attend a course to ensure that their standard practices are still within the acceptable window of current practice. Validation—rather than change—is the goal. The new approach to evaluation takes this into account.
Striving for consistency
In examining the current approaches to evaluating the Academy’s CME programs, the council realized an inconsistency related to CME credits and program evaluation. Although members who participated in a CME course were not required to complete an evaluation form to gain CME credits, those who took home study programs (examinations and electronic media programs) had to complete an evaluation form to receive CME credits.
To eliminate this inconsistency, the council approved a proposal requiring CME course participants to complete an evaluation form to receive their CME credits. This new requirement affects all CME courses presented after July 1, 2007. Course evaluations are a significant part of the planning process for future years, and the larger the number of returned evaluations, the better that planning can be.
Other AAOS CME programs
The shift to focusing on improving the learning experience applies to all forms of CME education. The Annual Meeting, Publications, and Evaluation Committees will also be developing new approaches to CME program evaluation. Our goal is to create consistency in evaluation questions across all CME program formats.
As these changes in the methods and questions for program evaluation are implemented in Academy CME courses, home study programs, and the Annual Meeting, the Council on Education and its committees look forward to your feedback and evaluations of these programs.
Alan M. Levine, MD, is chair of the AAOS Council on Education. He can be reached at email@example.com
New Approaches to CME Course Evaluation
Here is an example of how CME program evaluation is changing. Instead of general course evaluation questions, such as those shown, current and upcoming courses will include questions with greater specificity, linking the content presented to implementation in practice. The new questions are designed to provide course participants the opportunity to reflect on what was learned and how it may be put into practice.
Old Evaluation Items: AAOS Occupational Orthopaedics Course, October 2006 (Entire course)
- Please rate the value of the following course sessions.
- How would you rate the quality of each faculty member’s presentation?
- Did you perceive any bias in any faculty member’s presentation?
- Did you perceive any commercialism in any faculty member’s presentations?
- How would you rate this course?
- As a result of completing this CME program, do you feel you gained knowledge that can be applied to your practice?
New Evaluation Items: AAOS Boomeritis Course, April 2007 (1 session)
- In relation to musculoskeletal care of the mature athlete, to what extent did you add to your knowledge about care and treatment of the knee including tissue engineering, meniscal tears/transplantation, and synthetic meniscus?
- To what extent will you implement knowledge gained into your practice within the next year?
- How well did this session meet your current learning needs in this content area?
- To what extent were faculty presentations evidence-based?
- To what extent did faculty presentations include a tone of commercialism?
- How will this knowledge influence your daily practice?