Survey on the Clinical Skills of Osteopathic Medical Students
    From the National Board of Osteopathic Medical Examiners, Clinical Skills Testing Center, Conshohocken, Pa (Gimpel, Weidner); and the Foundation for Advancement of International Medical Education and Research, Philadelphia, Pa (Boulet).

    As part of the standard-setting methods used by the National Board of Osteopathic Medical Examiners for its Comprehensive Osteopathic Medical Licensing Examination clinical skills performance evaluation (COMLEX-USA Level 2-PE), a self-administered survey was distributed electronically and by mail to deans of colleges of osteopathic medicine, directors of graduate medical education programs, osteopathic medical students, and experts chosen demographically to represent osteopathic physicians in the United States. The groups were asked to rate the clinical skills of fourth-year osteopathic medical students and interns and to indicate acceptable and expected pass rates on the COMLEX-USA Level 2-PE. The surveys were not used systematically to compute the passing standards but to provide additional support for their validity. The viewpoints of the deans differed from those of the students, osteopathic graduate medical education program directors, and experts regarding clinical skills proficiencies and acceptable pass rates. However, on average, all of the groups judged that some students and interns do not have adequate clinical skills. These results provide additional support for requiring acceptable performance on a comprehensive clinical skills examination before admission to osteopathic graduate medical education programs.

    In 2004, the National Board of Osteopathic Medical Examiners (NBOME) implemented a clinical skills examination, the Comprehensive Osteopathic Medical Licensing Examination Level 2 Performance Evaluation (COMLEX-USA Level 2-PE).1,2 As part of the standard-setting process for this examination, a national survey of the clinical skills of fourth-year osteopathic medical students and interns (Figure) was distributed to stakeholder groups to obtain their opinions. In addition to giving the NBOME an idea of how each group perceived the clinical skills of fourth-year medical students and interns,3 the survey results supplied information about the minimum and maximum passing standards that would be acceptable to stakeholders. These data were considered by the NBOME executive committee in the final approval process of the standards for the Level 2-PE.

    Methods

    The groups surveyed included deans from colleges of osteopathic medicine, directors of osteopathic graduate medical education (OGME) programs, fourth-year osteopathic medical students, and an expert panel of osteopathic physicians demographically representative of the field of osteopathic medicine in the United States. Experts were selected based on nominations from deans as well as NBOME board members. This group comprised osteopathic physicians from state medical licensing boards, private clinical practice, academic clinical practice, teaching, and administration, and was further representative of the profession with respect to age, sex, geography, and ethnicity or race.4–6 Primary care physicians (ie, family medicine, osteopathic manipulative medicine, internal medicine, and pediatrics) predominated, which is both representative of the osteopathic medical profession and preferred for clinical skills examinations for general practice (including osteopathic manipulative treatment [OMT]).2 Also of note is that academic physicians who had substantial experience in full-time private practice prior to taking teaching positions at academic institutions were favored in selection.7
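
    For readers who want a concrete picture of this kind of panel assembly, the sketch below shows one generic way to stratify a nominee pool so that primary care predominates. The pool, attribute values, and quotas are all invented; the article does not specify the NBOME's actual selection procedure in this form.

```python
"""One generic way to assemble a demographically balanced expert panel
from a nominee pool: stratify on an attribute and fill per-stratum
quotas. Pool, attributes, and quotas are invented for illustration."""
import random
from collections import defaultdict

random.seed(7)

# Invented nominee pool with a practice-type attribute.
nominees = [(f"nominee_{i}", random.choice(["primary care", "specialty"]))
            for i in range(200)]

# Primary care predominates on the panel, mirroring the text.
quotas = {"primary care": 25, "specialty": 10}

# Group nominees by practice type, then sample each stratum to its quota.
strata = defaultdict(list)
for name, practice in nominees:
    strata[practice].append(name)

panel = {practice: random.sample(strata[practice], quota)
         for practice, quota in quotas.items()}

print({practice: len(members) for practice, members in panel.items()})
```

    In practice, selection would stratify on several attributes at once (age, sex, geography, ethnicity or race), but the quota-filling logic is the same.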

    Survey Instrument

    The survey included five questions (Figure). The paper survey was identical to the electronic survey except that the electronic survey sent to OGME directors requested information on the specialty of the program and replaced "fourth-year students" with "interns."

    Data Collection

    In mid-August 2004, the NBOME survey was mailed to deans at each of the 20 osteopathic medical schools and a random sample of approximately 700 fourth-year students from the class of 2005. Approximately 345 surveys were sent electronically to OGME directors in October 2004 and were re-sent in December 2004 to those who did not respond. Data were collected from these groups through January 31, 2005. An identical survey was sent to the 35 experts in January 2005.

    The expert panel meetings provided the data required to determine the relationship between the adequacy of students' performance and their resultant scores. The surveys were part of the triangulation process used to reach the final cut score decisions for the COMLEX-USA Level 2-PE. The surveys were not used systematically to compute the passing standard but rather to provide additional support for the passing standard currently in use. Triangulation, in which multiple sources of data along with judgments from expert panels are used to determine testing standards, is common in medical testing organizations and is highly recommended in high-stakes testing in general.7,8 The use of data from external sources can enhance the credibility of the standard-setting process.7–14
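
    To make the triangulation idea concrete, the following minimal sketch checks a hypothetical panel-derived cut score against survey-derived acceptable pass-rate bounds. Every number in it (the cut score, the bounds, the simulated score distribution) is invented for illustration and does not reflect the NBOME's actual data or procedure.

```python
"""Hypothetical triangulation check: does a panel-derived cut score imply
a first-time pass rate inside the bounds stakeholders found acceptable?
All values are invented; this is a sketch of the general idea only."""
import random

random.seed(42)

panel_cut_score = 68.0     # hypothetical panel recommendation, 0-100 scale
acceptable = (0.80, 0.98)  # hypothetical survey-derived pass-rate bounds

# Simulated first-time examinee scores standing in for real exam data.
scores = [random.gauss(75, 8) for _ in range(1000)]

# Pass rate implied by applying the panel's cut score to the scores.
implied = sum(s >= panel_cut_score for s in scores) / len(scores)

low, high = acceptable
verdict = "within" if low <= implied <= high else "outside"
print(f"cut {panel_cut_score} -> pass rate {implied:.1%}, {verdict} "
      f"[{low:.0%}, {high:.0%}]")
```

    If the implied pass rate fell outside the acceptable range, the external survey data would flag the panel's standard for reexamination; that cross-check, rather than any computation of the cut score itself, is the role the surveys played here.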

    Results

    One hundred eighty-nine student surveys (27%) were undeliverable because the students had provided temporary addresses. Of the undeliverable surveys, 113 were re-sent after the post office provided a current address. Data are based on responses from 17 deans, 35 experts, 70 OGME directors, and 220 students. Thirty-one percent (22) of the OGME directors who completed the survey were from family medicine programs, 12% (8) were from internal medicine programs, 10% (7) were from obstetrics and gynecology programs, and 5% (4) were from pediatrics programs. Ten percent (7) of the OGME respondents were directors of medical education (DMEs) at their institutions, 14% (10) directed other programs (specified on the survey as cardiology, sports medicine, dermatology, orthopedics, surgery, neuromusculoskeletal medicine, or simply as "other"), and 13% (9) directed more than one program. The selected experts had an average of 10.4 years of private clinical practice experience. Most experts had direct experience with the population of students for whom clinical skills standards would be set, having worked directly with fourth-year osteopathic medical students for an average of 9.2 years.
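
    The response rates implicit in these figures can be recomputed from the counts reported here and in the Data Collection section; the short sketch below redoes that arithmetic. The sent counts are described as approximate in the text, so the computed rates are approximate as well.

```python
"""Recomputing the survey response rates from the counts reported in the
text. Sent counts marked "approximately" in the article are approximate
here as well, so the printed rates are rough."""

# group: (responses received, surveys sent)
surveys = {
    "deans": (17, 20),
    "experts": (35, 35),
    "OGME directors": (70, 345),   # ~20%, the low rate discussed later
    "students": (220, 700),        # ignores the 189 undeliverable surveys
}

for group, (received, sent) in surveys.items():
    print(f"{group:15s} {received:4d}/{sent:<4d} = {received / sent:6.1%}")

# Undeliverable student surveys: 189 of ~700, the reported 27%.
print(f"undeliverable student surveys: {189 / 700:.0%}")
```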

    The students' judgments of their peers were closer to those of the OGME directors than to those of the deans. Unlike the deans, students identified no skill area in which, on average, they felt that more than 90% of fellow students had adequate clinical skills. The students were most confident in the areas of history taking and use of the SOAP (subjective, objective, assessment, and plan) note form, where they felt that 89% and 88% of fellow students, respectively, had adequate skills. Students suggested that approximately 85% of fellow students had adequate skills in physical examination and that 84% had sufficient physician-patient communication skills. Students felt that only two thirds of fourth-year students would be prepared to deliver OMT in osteopathic graduate medical training.

    In comparison with the deans and the students, the OGME directors were the most stringent group. The participating OGME directors felt that approximately 80% of interns had adequate skills in physical examination, the SOAP note form, and physician-patient communication. Among the skills listed, OGME directors were most confident in interns' ability to take patient histories. Like the students, OGME directors were uncertain of the ability of interns to provide OMT to their patients; according to this group, only 58% of interns have adequate mastery of this skill area.

    The experts' judgments were most in agreement with those of the OGME directors. Experts felt that approximately 80% of the students had adequate skills in history taking, physical examination, and the SOAP note form and that only 74% of students had sufficient skills in physician-patient communication. Like the other groups, experts were least confident in students' OMT skills; 64% of students had adequate OMT skills, according to the expert group.

    Ratings varied widely among the four groups, especially between the deans and the other three groups. Many experts, OGME directors, and students listed relatively small percentages in some skill categories (eg, minimum ratings in some areas were as low as 5%). In contrast, the minimum rating given by the deans was 60%, in physician-patient communication, and exceeded 70% in all other categories. The maximum percentages given in Table 1 demonstrate that in every group, there were individuals who felt that 95% to 100% of the students had adequate skills. Some respondents felt that 100% of students had adequate clinical skills in all five skill areas: 35% of the deans recorded 100% for all five skill categories, compared with 6% of the OGME directors, 4% of the students, and none of the 35 experts.
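
    The min/mean/max summary structure of Table 1 is straightforward to reproduce; the sketch below aggregates per-respondent ratings in that way. The group and skill labels come from the text, but the rating values themselves are invented placeholders, not the study's data.

```python
"""Reproducing the min/mean/max summary structure of Table 1 from
per-respondent adequacy ratings (percent of students judged to have
adequate skills). The rating values below are invented placeholders."""
from statistics import mean

# group -> skill -> per-respondent ratings, in percent (invented data)
ratings = {
    "deans": {
        "history taking": [95, 100, 90],
        "OMT": [80, 100, 90],
    },
    "OGME directors": {
        "history taking": [85, 60, 90],
        "OMT": [50, 5, 70],   # minimums as low as 5%, as reported
    },
}

for group, skills in ratings.items():
    for skill, values in skills.items():
        print(f"{group:15s} {skill:15s} min={min(values):3d} "
              f"mean={mean(values):5.1f} max={max(values):3d}")
```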

    The level of agreement between groups was higher for the expected and acceptable pass rates (Table 2) than for the clinical skills ratings. Across the groups, mean minimum expected first-time pass rates ranged from 79% to 88%, and mean maximum expected pass rates ranged from 94% to 98%. As with the clinical skills ratings, the mean minimum expected pass rates reported by OGME directors and experts were lower than those reported by deans and students.

    Experts, OGME directors, and students considered lower minimum pass rates acceptable (82%, 79%, and 80%, respectively). Deans, however, felt that a minimum pass rate of 90% was acceptable. The groups agreed more closely on the mean maximum acceptable pass rates (95%–98%).

    Comment

    These data demonstrate that, of the groups surveyed, deans of the colleges of osteopathic medicine (COMs) are the most confident in students' clinical skills. Across all five areas (history taking, physical examination, OMT, SOAP note form, and physician-patient communication), the mean ratings given by COM deans were substantially higher than those given by OGME directors, students, or experts. Although these surveys were sent directly to the dean at each of the 20 COMs, responses were returned by the deans themselves in only some instances and by associate or assistant deans in others.

    The results of our study clearly indicate that students do not feel that all of their peers have the clinical skills needed to practice medicine safely and effectively in OGME programs. The OGME directors rated the items more similarly to the students than to the COM deans, even though they were describing the performance level of interns. The experts' responses were comparable with those of the OGME directors. The experts may have been the most representative of the four groups in the present study. The fact that seven (20%) of the 35 experts were also OGME directors may have increased the correlation between the judgments of the OGME director group and the experts. However, 14% of the experts were assistant or associate deans.

    The survey results also demonstrate that the perceived adequacy of students' skill levels varies according to clinical skill areas. All groups appeared most concerned about students having inadequate OMT skills and were most confident in their history-taking skills. This is consistent with the preliminary report of Osteopathic Medical Education in the United States: Improving the Future of Medicine5 commissioned by the American Osteopathic Association and the American Association of Colleges of Osteopathic Medicine in July 2005, which reported that only 69% of graduating fourth-year osteopathic medical students perceived themselves as competent in integrating osteopathic principles and practice in the diagnosis and treatment of patients.

    A significant limitation of this study is the relatively low response rate among OGME directors (20%). The low response rate could reflect incorrect e-mail addresses, outdated lists of OGME directors, or OGME directors not opening e-mail from unfamiliar senders. Their response rate may also have been influenced by the fact that the surveys were sent in October, a time in the academic year when OGME directors typically are busy with applicant interviews.

    The low response rate may also reflect that OGME directors were less familiar with the new Level 2-PE than were students and COM deans. Selection bias may have been a factor and should be taken into consideration when interpreting the results of this study. For example, it is plausible that many OGME directors who received the survey did not respond because they felt that their interns possessed the necessary skills, while those who did respond were dissatisfied with their interns' clinical skills and welcomed the opportunity to relay this information to researchers. In addition, only OGME directors from residency programs approved by the American Osteopathic Association were surveyed. Because more than 50% of osteopathic medical school graduates now complete residency training in programs accredited by the Accreditation Council for Graduate Medical Education, interpretation of these results is further limited.5 Further study in this area is required.

    Nonacademic factors may have influenced the deans' responses. For example, it would be difficult for deans and associate or assistant deans to indicate that a large number of graduating students lack the appropriate skills. Deans also face pressure from other groups (eg, students, alumni who are parents of students) that had initially opposed the addition of the clinical skills examination for various reasons, including cost. Some of the deans may not have had recent exposure to fourth-year students, especially in the clinical setting, because most students spend much of their third- and fourth-year clinical clerkships in hospital and clinical settings that are geographically removed from the medical school campus. Student respondents, of course, had significant exposure to the cohort in question, but their responses may have been influenced by the ramifications and costs associated with failing and having to retake the examination.

    Despite these limitations, the results of the survey have proved to be a valuable piece of the standard-setting process. Additional study of the perceived as well as actual clinical skills of COM graduates is warranted. Because the Level 2-PE has been in use since 2004, it would be informative to administer a follow-up survey of OGME directors to see whether they perceive any increase in the clinical skills of their interns.15 The fact that no group, on average, judged the clinical skills of all students or interns to be adequate provides additional support for the inclusion of a high-stakes clinical skills examination in COMLEX-USA so that students with inadequate clinical skills can be identified before they enter OGME training programs.

    Acknowledgment

    We thank Lisa Brown, PhD, for her assistance in the initial literature review as well as coordination of the initial surveys. At the time, Dr Brown was a member of the research staff for the NBOME at the National Center for Clinical Skills Testing in Conshohocken, Pa.

    References

    3. Hambleton RK, Jaeger RM, Plake BS, Mills C. Setting performance standards on complex educational assessments. Appl Psychol Meas. 2000;24:355–366.

    4. Plake BS, Impara JC, Potenza MT. Content specificity in expert judgments in a standard-setting study. J Educ Meas. 1994;31:339–347.

    7. Norcini JJ, Shea JA. The credibility and comparability of standards. Paper presented at: Annual Meeting of the American Educational Research Association; April 18–22, 1995; San Francisco, Calif.

    8. Norcini JJ, Blackmore DE, Spike N, Swanson DB. Standard setting. Paper presented at: 11th Ottawa Conference on Medical Education; July 6–8, 2004; Barcelona, Spain.

    9. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. The Standards for Educational and Psychological Testing. Washington, DC: AERA; 1999:49–60.

    10. Boulet JR, De Champlain AF, McKinley DW. Setting defensible performance standards on OSCEs and standardized patient examinations. Med Teach. 2003;25:245–249.

    11. Norcini JJ, Boulet JR. Methodological issues in the use of standardized patients for assessment. Teach Learn Med. 2003;15:293–297.

    13. Dauphinee WD, Blackmore DE, Smee S, Rothman AI, Reznick R. Using judgments of physician examiners in setting the standards for a national multicenter high stakes OSCE. Adv Health Sci Educ Theory Pract. 1997;2:201–211.

    14. McKinley DW, Boulet JR, Hambleton RK. A work-centered approach for setting passing scores on performance-based assessments. Eval Health Prof. 2005;28:349–369.

    15. Boulet JR, McKinley DW, Whelan GP, Van Zanten M, Hambleton RK. Clinical skills deficiencies among first-year residents: utility of the ECFMG clinical skills assessment. Acad Med. 2002;77(suppl 10):S33–S35.