Assessment of Performance of Auscultation
John P. Finley MD CM
Halifax, Nova Scotia
Assessment is an important stimulus for learning. It must be valid, reliable, appropriate in content, transparent to student and assessor, feasible and of suitable scope. Both formative and summative testing are useful for learning a skill such as auscultation, and the mastery approach to skill learning is increasingly emphasized in medical education. Auscultation teaching and assessment should be repeated at least once, preferably more often, during medical education, as competence decreases with lack of practice. Effective auscultation teaching and assessment have been described in classroom settings with small and large groups, on clinical rotations, in OSCE format and, more recently, on websites. Recordings of synthesized heart sounds are used by some programs, but live recorded sounds have the advantages of realism and the inclusion of breath sounds. Whether to integrate auscultation assessment with diagnostic formulation and other cognitive skill assessment will need to be decided for each program. The significant challenge faced by students in learning to recognize heart sounds argues for a focussed assessment of this skill. The objectives of the teaching program must be clear and matched by the assessment objectives.
General aspects of skill assessment
It is widely accepted that assessment drives teaching1. Certainly, without assessment of the outcomes of teaching there is no assurance that it has been successful. Much has been written on the subject, and the reader is referred to expert reviews for in-depth discussion2-5. Some general principles are outlined below as an introduction to the specific details of heart auscultation assessment.
The functions of assessment, summarized by Jackson et al2, include: 1. supporting teaching and learning (are the teaching objectives met?); 2. accountability to the public; 3. certification. The design of assessment of a particular skill or knowledge set may vary depending on the main function of the assessment.
Good assessment has several attributes2,3: 1. Validity (are the learning objectives represented? are the test instruments appropriate for the skill?); 2. Reliability (does the testing give consistent results regardless of examiner, the example tested, and over time?); 3. Appropriateness of content (does it match the desired competencies?); 4. Transparency (are the objectives and methods clear to both student and assessor?); 5. Feasibility (will it work in the local setting regarding time, staffing and cost?); 6. Scope (does the test represent the full range of the skills, knowledge and attitudes that are the teaching objectives?)
Finally, the timing of assessment in relation to the learning may be formative (“learning tests” for feedback) or summative (assessing competence at the end of teaching). Test design may differ for these two assessment types. The minimum competence (“mastery”) level will also need to be determined, with provision for repeated testing if appropriate.
Specific aspects of heart auscultation assessment
Designing appropriate assessment of student performance of auscultation requires consideration of a number of features of the assessment process.
Aspects of heart auscultation assessment
- Timing in curriculum
- Type of sounds
- Objectives and test contents
- Relation to level of training
For some of these features there is published experience to guide the design. For others, individual preference based on experience, or institutional constraints, is the only guide; these will in any case inform whatever design is chosen. For each feature, published information is reviewed below along with some personal observations. The comments relate to the teaching of undergraduate medical students except where noted.
Timing in curriculum
Heart auscultation teaching may occur early in the curriculum, with later sessions after a year or two and even in the final year. Earlier assessment will be more formative in nature, and later assessment more indicative of competence at graduation. Whatever schedule is chosen should be informed by the main objective of the assessment, which may be to assist the student in learning or to assess competence/performance. Ideally, testing would occur both early/middle and late in the curriculum to achieve both objectives.
Several reports6-11 have assessed the performance of medical students, usually mid-curriculum. Reasons for the choice of year of testing were either not stated or related to access to students. Vukanovic-Criley et al11 assessed students in all 4 years of the curriculum in a cross-sectional study and reported improved scores with increasing year. Stern et al8, in a report of 151 third year students, retested a cohort 9-12 months later and found good skill retention, while Kuzma et al12 reported significant loss of skills in this time frame. In Barrett’s institution, mastery testing is repeated in each of the first 3 years of the curriculum9,10.
Finally, the interval between the teaching and the assessment must be determined based on the objectives of assessment. Again, early testing will be more formative while later testing will be summative8. Web-based testing can allow individual flexibility in the timing of testing based on each student’s grasp of the skill10. Our current protocol is to assess formatively in Year 2, 3-4 weeks after classroom teaching and self-study with a CD-ROM, although we would prefer to add a second assessment before graduation.
Various settings have been described for heart auscultation teaching as well as student assessment: in classrooms, on a clinical rotation, via dedicated websites, or in a scheduled OSCE session.
In a classroom setting, both small and large group assessments can be successful given appropriate technology7,8,9.
The value of individual teaching on a cardiology rotation has been questioned by Mahnke13, studying pediatric residents, but such rotations do occur for medical students and may allow for individual structured assessments14. Monitoring of the auscultation findings of clerks on rotation has been described using personal digital devices15. The authors cite the importance to students of learning to record their findings properly in a standard manner; the records are then reviewed by teachers. The outcome on learning has not been reported.
Web-based or digital recordings (CD-ROM, MP3, etc.) of heart sounds offer much flexibility for both teaching and assessment, as reviewed in the chapters by Barrett and Pieretti and Mackie. Both formative and later summative (skill competence) assessment can be performed online from locations and at times convenient for students. This is most valuable in the clerkship, when many students are located remote from teaching hospitals. We have had successful experience with live audioconference-based teaching linking several sites: the heart sounds are accessed from a website while the students and teacher communicate via telephone conference for discussion.
Another consideration is who should perform the assessment. Formative peer assessment has the advantage of providing a learning experience for the assessor as well as for the student being tested. Properly executed, peer assessment can be less intimidating for the student, and feedback may be better provided and received when it comes from another student who appreciates the challenge from the learner’s level of experience. Self-assessment, whether online or in a classroom setting, would likely be appropriate for practitioners, but also has a role in formative assessments for students.
OSCE-style assessment of auscultation is apparently included in very few Canadian medical curricula, according to a recent personal survey. It is preferred in our pediatric curriculum at Dalhousie University, using live-recorded sounds, as it allows for integration of history and reasoning skills with the assessment of murmur recognition in an easily scheduled, standardized format. Two murmurs are incorporated into one of the 11 stations in the OSCE at the end of the clerkship pediatric rotation.
Type of sounds
Sounds used in assessment may come from patients in person, from live recordings, or from synthesis. Historically, patients have been employed for teaching heart sound recognition and for the later assessment of student performance. Despite the current availability of recorded or simulated sounds, patients do offer the advantages of the realism of a clinical setting and of sound fidelity. Other aspects of clinical examination besides auscultation can be part of the assessment, and the effects of postural changes and stethoscope position can be assessed, as in a real clinical setting11,16,17. Patient availability and cost are obvious limitations18. A variation on the use of patients might be a simulated patient for the history and some aspects of the physical exam, along with a recording of abnormal human heart sounds attributed to the patient for test purposes.
With the availability of high quality recordings of heart sounds and murmurs, teaching programs have been able to incorporate many more examples of abnormalities than could otherwise be experienced in medical training, and to demonstrate them to many students simultaneously. Assessment of student performance of auscultation using human sound recordings has been reported by several groups7,11,13,19,20. None of these studies compared performance on live patients with performance on recorded sounds from patients. Collections of recorded heart sounds are available commercially, and electronic recording stethoscopes make possible the acquisition of archived sounds for teaching and assessment purposes. Inevitably, the fidelity of the sounds on playback will depend on the quality of the recording and listening systems used. Stethophones mimic the quality of sounds from an acoustic stethoscope and transmit sound more effectively to the ear than stereo earphones, but there will always be differences from truly live auscultation (See Chapter 8). Interestingly, the type of stethoscope made no difference in auscultatory performance with patients in one study of 72 resident doctors21.
Synthesized heart sounds, available for over 40 years, have been used in studies of student performance6,8,9,10,22. The authors cite the clarity of these sounds and absence of background noise. Barrett et al.10 compared the performance of 42 second year medical students in recognizing simulated heart sounds versus recorded human heart sounds and found they were comparable (approx. 89% vs approx. 81%).
Some medical school programs combine recorded heart sounds (either synthetic or human sounds) with the use of a mannequin to lend realism to the auscultation22. Other physical signs can be programmed into some mannequins to allow assessment of skills in addition to, but related to, auscultation. The high cost of these mannequins has limited their use.
Our preference for assessment of students, residents and practitioners is for live recorded heart sounds heard through stethophones. The audio quality mimics a stethoscope, and breath sounds are clearly heard. Our archive of sounds can readily be augmented with new recordings from patients, providing an expanded base of examples for assessment.
Accurate assessment of auscultation performance depends on high quality recording and reproduction of heart sounds if patients are not involved. Whether the sounds are synthesized or actual human sounds, the recordings should be as free as possible of noise and artefact which could confuse the student. Background noises such as breath sounds and movement may or may not be admissible depending on the age of the patients and the realism of the setting desired.
As noted in Chapter 8 high quality recordings are routinely feasible at low cost with current equipment. Delivery of these high quality sounds to the student’s ears is more problematic. Listening with individual earphones is essential as speakers and most stereo headphones reproduce heart sounds poorly. The choice between miniature earpieces or stethophones will depend on whether a stethoscope-like sound is desired. Cost and durability may be factors in this choice.
Auscultation performance assessment may be limited to listening skills alone (psychomotor skills) or integrated with assessment of the history and other physical findings (cognitive skills). Integration of cognitive or reasoning skills into testing may give a more operational, applied assessment than a restricted test of auscultation, i.e., “auscultation in practice”. Vukanovic-Criley et al11, in a study of 860 medical students, residents and practicing physicians, assessed cardiac knowledge, auscultation, pulse visualization and integration of physical findings using a videotaped mannequin. They reasoned that such an assessment of competency was more indicative of performance than auscultation assessment alone, and it is similar to that used in the American Board of Internal Medicine recertification examination. Most published auscultation performance studies, however, are at variance with this view. In a personal review of twenty years of studies of medical students, residents and practicing physicians, only two were identified which specifically integrated aspects beyond heart sounds and murmurs11,18. Six studies employed patients, and may have included other physical signs in the process16,18,20,22,23. We contend that auscultation is the skill presenting the greatest challenge for learners, and thus merits focussed attention in both the teaching and the assessment of students. The history, other physical findings and clinical reasoning can be assessed separately, in relation to the auscultation findings.
Objectives and test contents
As with the assessment of any other skill, what the student is required to do in the testing should reflect what we expect them to be able to do after learning the skill. Further, the assessment will greatly influence what students learn1. In designing a learning and assessment program for auscultation, educators must be very clear about their objectives, which should be written explicitly for both students and teachers. Is the student expected to describe what is heard, to distinguish normal from abnormal sounds, or to make a diagnosis? The answer will depend on the stage of learning and will differ for students, residents and practicing physicians. For example, in children with murmurs, a major decision for general physicians and pediatricians is whether the murmur is innocent or pathological. This distinction is very difficult for many students, pediatric residents and practitioners, and has led to large numbers of normal children being referred unnecessarily to pediatric cardiologists25,26. Our group of pediatric cardiologists feels the major objective of teaching auscultation to medical students must be to help them achieve competence in distinguishing innocent from pathological murmurs and in identifying abnormal S2 splitting, and we are evaluating new teaching and assessment protocols to this end (See Chapters 5 and 6).
Testing could conceivably include aspects of S1, S2, other heart sounds and murmurs (timing, quality, loudness), but should include only what fits the objectives for the learners. The assessment should include a checklist of what is heard and, if appropriate, what action is indicated, such as referral, review or reassurance.
As noted above, the minimum competence level must be decided in advance, and repeated testing may be allowed, after additional teaching measures, until mastery of the skill is demonstrated.
Various formats of assessment questions have been reported, with most of the auscultation studies referred to above using open-ended questions (“describe what you hear”, “describe the abnormality”, “what is your diagnosis?”). Several reports describe the use of multiple choice questions6,7,13,27, which allow for ease of marking or computer scoring. Finley et al7 used a combination of multiple choice and open-ended questions and, interestingly, found different scores for the two formats. This study compared the outcomes of teaching with sounds presented in a classroom for discussion versus self-learning with sounds from a CD-ROM of cases involving murmurs. The groups had similar scores on multiple choice questions, but the group having classroom discussions fared better on open-ended questions. We continue to use a combination of both types of questions and combined classroom and self-learning. We feel that open-ended questions mimic real life, while multiple choice questions allow for more breadth of questioning. Whatever the choice of format, the more questions asked, the more robust the assessment of the student, as is well described for OSCEs in general3.
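The relation between test length and robustness can be illustrated quantitatively with the classical Spearman-Brown prophecy formula from psychometrics, which predicts the reliability of a test lengthened with comparable items. The short sketch below is purely illustrative; the formula is standard, but none of the studies cited here reports using it, and the numbers are hypothetical.

```python
def predicted_reliability(current_reliability: float, length_factor: float) -> float:
    """Spearman-Brown prophecy formula: predicted reliability of a test
    whose length is multiplied by `length_factor` with comparable items."""
    r, k = current_reliability, length_factor
    return k * r / (1 + (k - 1) * r)

# e.g., doubling a short auscultation quiz with a hypothetical reliability of 0.60:
print(round(predicted_reliability(0.60, 2), 2))  # 0.75
```

On these assumed figures, doubling the number of comparable sound examples raises a reliability of 0.60 to 0.75, which is why longer tests with more questions or OSCE stations yield more robust scores.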
Relation to level of training
Assessment should be appropriate for the skill and experience of the learner. Medical and nursing students generally would have more limited objectives than specialty residents and their testing might be focussed on distinguishing normal from abnormal with limited emphasis on pathologic diagnosis. Physicians in practice might need limited or more advanced objectives depending on their experience.
Examples of assessment protocols:
- Formative – combines MCQs and open ended questions
- Summative – distinguish normal from abnormal murmurs
1. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine 1990;65:S63-S67
2. Jackson N, Jamieson A, Khan A. Eds. Assessment in medical education and training. A practical guide. Radcliffe Publishing, Oxford 2007. pp. 1-26
3. Amin Z, Chong YS, Khoo HE. Practical guide to medical student assessment. World Scientific Publishing; 2006, pp.3-14
4. James R, McInnis C, Devlin M. Assessing learning in Australian universities. Ideas, strategies and resources for quality in student assessment. 2003. www.cshe.unimelb.edu.au/assessinglearning
5. Shumway JM, Harden RM. AMEE education guide No 25: the assessment of learning outcomes for the competent and reflective physician. Medical Teacher 2003;25(6):569-584
6. Mangione S, Neiman LZ, Greenspon LW, et al. A comparison of computer-assisted instruction and small group teaching of cardiac auscultation to medical students. Med Educ 1991;25:385-395
7. Finley JP, Sharratt GP, Nanton MA, Chen RP, Roy DL, Paterson G. Auscultation of the heart: a trial of classroom teaching versus computer-based independent learning. Med Educ 1998;32:357-361
8. Stern DT, Mangrulkar RS, Gruppen LD, Lang AL, Grum CM, Judge RD. Using a multimedia tool to improve cardiac auscultation knowledge and skills. J Gen Intern Med 2001;16:763-769
9. Barrett MJ, Lacey CS, Sekara AE, Linden EA, Lacey EJ. Mastering cardiac murmurs: the power of repetition. Chest 2004;126:470-475
10. Barrett MJ, Kuzma MA, Seto TC, Richards P, Mason D, Garrett DM, Gracely EJ. The power of repetition in mastering cardiac auscultation. Am J Med 2006;119:73-75
11. Vukanovic-Criley JM, Criley S, Warde CM, Boker JR, Guevara-Mathews L, Churchill WH, Nelson WP, Criley JM. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty. Arch Intern Med 2006;166:610-616
12. Kuzma MA, Barrett M, Cohen D, Seto T. Traditional versus acoustic training in cardiac auscultation. Teaching and Learning in Medicine 2006;18(2):172-173
13. Mahnke CB, Norwalk A, Hofkosh D, Zuberbuhler JR, Law YM. Comparison of two educational interventions on pediatric resident auscultation skills. Pediatrics 2004;113(5):1331-1335
14. Mattioli LF, Belmont JM, McGrath Davis A. Effectiveness of teaching cardiac auscultation to residents during an elective pediatric cardiology rotation. Pediatr Cardiol 2008; 29:1095-1100.
15. Torre DM, Sebastian DL, Simpson DE. A PDA-based instructional tool to monitor students’ cardiac auscultation during a medicine clerkship. Medical Teacher 2005;27:559-566
16. Favrat B, Pecoud A, Jaussi A. Teaching cardiac auscultation to residents in internal medicine and family practice: does it work? BMC Medical Education 2004;4:5
17. Dhuper S, Vashist S, Shah N, Sokal M. Improvement of cardiac auscultation skills in pediatric residents with training. Clinical Pediatrics 2007;46:236-240
18. Haney H, Ipp M, Feldman B, McCrindle BW. Accuracy of assessment of heart murmur by office based (general practice) pediatricians. Arch Dis Child 1999;81:409-414.
19. Roy DL, Sargent J, Gray J, Hoyt B, Allen M, Fleming M. Helping family physicians improve their cardiac auscultatory skills with an interactive CD-ROM. J Contin Educ Health Prof 2002;22:152-159
20. March SK, Bedynek JL, Chizner MA. Teaching cardiac auscultation: effectiveness of a patient-centered teaching conference on improving cardiac auscultatory skills. Mayo Clin Proc 2005;80:1443-1448
21. Iversen K, Sogaard Teisner A, Dalsgaard M, Grelbe R, Timm HB, Skovgaard LT, Hrobjartsson A. Effect of teaching and type of stethoscope on cardiac auscultatory performance. Am Heart J 2006;152:85.e1-7
22. Gaskin PR, Owens SE, Talner NS, Sanders SP, Li JS. Clinical auscultation skills in pediatric residents. Pediatrics 2000 Jun;105(6):1184-7
23. Farrer KMF, Rennie JM. Neonatal murmurs: are senior house officers good enough? Arch Dis Child Fetal Neonatal Ed 2003;88:F147-F151
24. Mackie AS, Jutras LC, Dancea AB, Rohlicek CV, Platt R, Beland MJ. Can cardiologists distinguish innocent from pathologic murmurs in neonates? J Pediatr 2009; 154:50-54
25. Wong KK, Barker AP, Warren AE. Pediatricians’ validation of learning objectives in pediatric cardiology. Paediatr Child Health 2005;10:95-99
26. Murugan SJ, Thomson J, Parsons JM, Dickinson DF, Blackburn MEC, Gibbs JL. New outpatient referral to a tertiary paediatric cardiac centre: evidence of increasing workload and evolving patterns of referral. Cardiol Young 2005; 15:43-46
27. Lam MZC, Lee TJ, Boey PY, Ng WF, Hey HW, Ho KY, Cheong PY. Factors influencing cardiac auscultation proficiency in physician trainees. Singapore Med J 2005;46:11-14