Review Article
Assessment of Professionalism in Undergraduate Medical Education: What Oncologists Need to Know
Authors: Ikram Burney, Nausheen Yaqoob, Nisar Ahmed, Firdous Jahan, Nadia Alwardy
DOI: https://doi.org/10.37184/lnjcc.2789-0112.6.7
Year: 2024
Volume: 6
Received: Nov 04, 2024
Revised: Mar 13, 2025
Accepted: Jun 14, 2025
Corresponding Author: Nausheen Yaqoob (nausheen_yaqoob@hotmail.com)
All articles are published under the Creative Commons Attribution License
Abstract
Background: Cancer is the 3rd most common non-communicable disease. Oncology is generally considered a super-subspecialty, and in several centers, postgraduate students rotate through the specialty. However, undergraduates also rotate through oncology wards and clinics, deal with cancer patients, and frequently interact with the treating oncologists. Since oncologists are part of a teaching unit, they need to be aware of assessment tools in undergraduate medical education.
Objective: The objective of this study was to provide oncologists with an overview of the current methods of assessing professionalism in undergraduate medical education.
Methods: A systematic review of peer-reviewed literature using the SCOPUS database was carried out to identify methods of assessment of professionalism in undergraduate medical education. The titles and abstracts of the retrieved documents were screened to include only relevant articles, defined as articles describing methods or tools of assessment of professionalism in undergraduate medical curricula.
Results: Over the period 1973 to 2020, 125 relevant articles were identified. Fifty-four articles described a method or tool of assessment, either as continuous assessment or as formal end-of-year/semester assessment. Most articles described assessing professionalism as part of continuous assessment, such as peer-assessment, self-assessment, a combination of the two, the conscientiousness index, and portfolios.
Conclusion: Assessment of professionalism in undergraduate medical education is complex, and several types of methods have been employed. While formal assessments play a role, the integration of continuous assessment methods, such as peer- and self-assessments and portfolios, suggests a shift towards formative evaluation strategies. Curriculum developers and examination committees should select method(s) appropriate to their program, contextually and culturally.
Keywords: Professionalism, assessment, medical education, curriculum, systematic review, Oman.
INTRODUCTION
Medical education is in a state of constant evolution, and the importance of producing a 'competent' doctor has emerged as a central theme. According to the Accreditation Council for Graduate Medical Education (ACGME), the core competencies of a medical graduate include patient care, medical knowledge, interpersonal and communication skills, practice-based learning and improvement, systems-based practice, and professionalism. Professionalism is central to the practice of medicine because it has an impact on the physician-patient relationship, patient satisfaction, and health care outcomes [1, 2]. However, professionalism is a complex and multi-dimensional construct, and its core components include the organization of health care delivery, maintaining patient and public trust in the profession, and an individual's continuous professional development [3-5]. Like any other competency, professionalism can be taught and hence needs to be assessed [6]. However, as the concept of teaching medical professionalism has a diverse understanding, different strategies to assess professionalism have been described, and the optimum strategy, or mix of methods, continues to be explored [7].
Cancer is the 3rd most common non-communicable disease. According to GLOBOCAN, 20 million new cases were diagnosed in 2022, and almost 10 million patients died of the disease [8]. In Pakistan, an estimated 200,000 patients are diagnosed with cancer every year [9]. Oncology is generally considered a super-subspecialty; however, undergraduate medical students need to rotate through medical and surgical oncology wards and outpatient departments as part of the curriculum. Oncologists therefore need to be aware of assessment tools in undergraduate medical education, especially those for professionalism. Professionalism is an important competence for medical practice, yet its assessment is conceptually complex. Assessment of professionalism has also gained momentum as a result of several factors, including generational change, accreditation demands, the consequences of not assessing professionalism on future practice, issues related to patient safety, and the need to develop explicit professional attributes [10]. With the current perception of healthcare as a profit-making profession, and the resultant erosion of public trust in the noble profession, it is all the more important that professionalism is assessed at various levels of education and practice, from undergraduate medical school, through postgraduate programs, and during medical practice [11]. Diverse methods and instruments have been employed to assess professionalism in undergraduate medical students [12-17], including informal observation; formal observation by peers, patients, and tutors; longitudinal assessments by faculty; formative multisource assessments; components of summative assessments; and, sometimes, even the absence of unprofessional behavior. Some of these methods may or may not apply to different curricula and cultures [18].
We systematically examined the published literature to identify the commonly used instruments and methods to assess professionalism in undergraduate medical education. The results of this study might help curriculum planners and developers, and examination committees, to identify actionable and applicable method(s) most suitable for their curriculum.
METHODOLOGY
A systematic review of peer-reviewed literature was conducted using the PRISMA framework to study the methods used for assessing professionalism in undergraduate medical education. The SCOPUS database was used to identify relevant articles. The initial search, conducted on April 7, 2021, used the terms “professionalism” AND “medical education” OR “medical college” OR “medical school” in the title, abstract, or keywords. To focus specifically on the undergraduate level, the keywords “undergrad” or “medical student” were included. In the next phase, additional terms such as “teaching,” “learning,” “assessment,” and “curriculum” were added. This search resulted in a total of 1,457 articles published between 1973 and December 31, 2020.
To narrow the search, articles written in languages other than English, as well as conference papers and proceedings, were excluded, resulting in 1,387 articles. Two independent researchers manually reviewed the titles and abstracts of the remaining documents to identify relevant articles, defined as those describing methods for assessing professionalism within undergraduate medical curricula. These included original articles, reviews, meta-analyses, and commentaries, describing qualitative, quantitative, or mixed-method study designs. For the final analysis, only original articles were chosen. It is possible that some articles were not identified, as only the SCOPUS database was searched; hence, PubMed was also searched briefly, and since no further articles were identified, that search was not completed. In addition to the SCOPUS database search, a secondary search was conducted by screening the references and citations of the included full texts and of previously published reviews.
Articles focusing on professionalism but not related to assessment methods were excluded, as were those addressing professionalism in medical practice, continuing medical education, and postgraduate residency programs. Articles related to teaching professionalism without assessment were also excluded, as were articles related to the assessment of professionalism in the education of Pharmacy, Podiatry, Physiotherapy, Dentistry, and Veterinary Medicine. The selection process for relevant articles is illustrated in Fig. (1).
RESULTS
Over the study period of almost 48 years, a total of 125 published articles describing the assessment of professionalism were identified. The annual number of publications over the last two decades is shown in Fig. (2). All 125 articles were analyzed for the type of publication, and after reviewing the abstracts and the full manuscripts, a total of 54 articles were selected for more detailed analysis of the methods/tools of assessment (Table 1). One meta-analysis, 34 review articles, and 36 other articles (commentaries and studies exploring the beliefs, perceptions, and attitudes of medical students and tutors about professionalism, ethics, and moral values) were not included in the final analysis. The vast majority of the 54 'relevant' articles were published in six journals (Medical Teacher, Academic Medicine, Teaching and Learning in Medicine, BMC Medical Education, Advances in Health Sciences Education, and Medical Education).
Table 1: Tools of assessment.
Reference | Country | Description of Method | Type of Study | Title |
Curran, Fairbridge [32] | Canada | Peer assessment | Longitudinal prospective study | Peer assessment of professionalism in undergraduate medical education |
Sahota and Taggar [23] | UK | Situational judgement tests (SJTs) | Cross-sectional | The association between Situational Judgement Test (SJT) scores and professionalism concerns in undergraduate medical education |
Noguera, Arantzamendi [33] | Spain | Self-assessment | Correlation | Student's inventory of professionalism (SIP): A tool to assess attitudes towards professional development based on palliative care undergraduate education |
Cheng and Chen [34] | China | A scale consisting of eight factors with 51 items | Correlation | An exploration of medical education in central and southern China: Measuring the professional competence of clinical undergraduates |
Tavakol and Pinner [35] | UK | Many-Facet Rasch Model (MFRM) | Cross-sectional | Using the Many-Facet Rasch Model to analyze and evaluate the quality of objective structured clinical examination: A non-experimental cross-sectional design |
Kassab, Du [36] | Qatar | Professional competency assessment scores in PBL | Observational | Measuring medical students' professional competencies in a problem-based curriculum: A reliability study |
Harendza, Soll [37] | Germany | Group Assessment Performance (GAP)-Test | Case-control | Assessing core competences of medical students with a test for flight school applicants |
Epstein, Dannefer [77] | USA | 2-week comprehensive assessment (CA) | Longitudinal prospective | Comprehensive assessment of professional competence: The Rochester experiment |
Davis [78] | UK | OSCE (Dundee experience) | Cross-sectional | OSCE: The Dundee experience |
UK: United Kingdom, USA: United States, PBL: Problem-based Learning, EPA: Entrustable Professional Activities, MSF: Multi-source Feedback, OSCE: Objective Structured Clinical Examination
The 54 articles either described a tool for assessment or the effectiveness, feasibility, or correlation of one type of tool with one or more other types of tools. The year of publication and the type of tool used to assess professionalism are shown in Table 1. Assessment methods were classified as either part of continuous assessment or part of the final formal assessment. In some instances, students were presumed to be professionally competent unless they committed an act requiring disciplinary measures, such as warning letters. Table 2 shows the themes and the frequency of use of different assessment tools. Peer-assessment, with or without self-assessment, or as part of multi-source feedback, was the most frequently used method as part of the continuous assessment. Many other methods were reported once or twice, including longitudinal assessments using direct observation, review of portfolios, documentation in electronic medical records and written records, or even patient narratives. The use of simulated patients, either as part of a clinical performance examination (CPX) or an objective structured clinical examination (OSCE), was the most frequently employed method as part of the formal assessment. Alternatively, situational judgement tests were employed.
Table 2: Themes and the frequency of use of different assessment tools.
Part of Continuous assessment | Count (n) |
Peer-assessment | 7 |
Peer and Self-assessment | 5 |
Multisource feedback (e.g., PRIMES) | 4 |
Conscientiousness Index / Clinical Conscientiousness Index | 2 |
Portfolio | 2 |
Rating scale | 1 |
Faculty's direct observation during rounds | 1 |
PROMOBES (Web-based instrument for professional behavior assessment) | 1 |
Meaningful use of EMR | 1 |
Patient narratives | 1 |
2-week comprehensive assessment | 1 |
Professional competency assessment in problem-based learning | 1 |
Use of workplace assessment form | 1 |
Student's progress review | 1 |
Part of final / formal Assessment | |
MCAT, CPX, OSCE | 5 |
Simulated patient | 3 |
Situational judgement tests | 2 |
Reflective writing | 1 |
ACGME and USU | 1 |
Videotape | 1 |
Dichotomous assessment | |
Critical incident form | 2 |
Number of warning letters | 1 |
Observable indicators (excessive absences and negative peer assessments) | 1 |
DISCUSSION
This systematic review collates the methods of assessment of professionalism in undergraduate medical education. Over almost 50 years, a total of 125 articles describing the assessment of professionalism in undergraduate medical education were identified, of which 54 original articles described a method or tool of assessment. Most articles described assessing professionalism as part of continuous assessment, such as peer-assessment, self-assessment, a combination of the two, the conscientiousness index, or a portfolio. Other articles described formal end-of-year/semester assessments, including OSCE-like examinations, assessment by simulated patients, and situational judgement tests. Occasionally, the number of warning letters issued to a student was used as an assessment tool.
Professionalism is one of the critical attributes of medical graduates. To ensure that professionalism has been taught and learnt, it must be assessed. Over the past few years, interest in assessing professionalism has gained momentum; however, its measurement remains challenging. Part of the challenge arises from differing perceptions of the definition of professionalism and the difficulty in delineating its concept and boundaries, and part from the complexity involved in assessing an 'attribute', rather than knowledge or a skill. The peer-reviewed literature on the assessment of professionalism of undergraduate medical students over the past half-century was reviewed to study the temporal trends of publication and to identify the methods and tools of assessment.
Peer-assessment and self-assessment were identified as the most frequently employed methods of assessment. However, although medical students have unique information about their peers' professionalism, they are generally reluctant to share it with faculty. Although most students agree that there should be peer-assessment of professionalism and ethical initiatives, they have preferences for how the assessment should take place. For example, some prefer reporting unprofessional behavior to an impartial counselor through a completely anonymous process [19, 20], while others prefer being asked for their views on specific incidents [19]. Different tools were used for peer-assessment, including a multi-rater process in which medical students rated their peers on professionalism behaviours and attributes using a structured scale covering ethical decision-making, interpersonal skills, reliability, communication, and respect for others; the methodology relied on observed behaviour. In other studies, peer-assessment was carried out through survey responses, in which medical students were asked to reflect on the assessment of their peers based on their perceptions of professionalism, ethical behaviour, etc.
The PRIMES (Professionalism, Reporting, Interpreting, Managing, Educating, and Procedural Skills) self-assessment method was used to compare students' self-assessment ratings with those of faculty during mid-clerkship. Based on this comparison, individualized learning objectives for the second half of the clerkship were then established. The degree of agreement between students and preceptors in the various domains of professionalism was found to be as high as 70% [21].
Situational judgement tests (SJTs) are commonly used in selecting medical students and doctors to test an individual's ability to respond to role-relevant scenarios. SJTs were also employed to assess attributes of medical students, such as empathy, integrity, and resilience, along with judgment and decision-making in work-related contexts. Research has shown that SJTs positively influence students' learning, and students generally accept them as a valid assessment tool [22]. Sahota and Taggar conducted a cross-sectional study correlating SJT scores of second-year medical students with occurrences of professionalism concerns. For each one-point increase in SJT score, students were 10% less likely to have multiple professionalism issues; conversely, lower SJT scores were strongly linked to a higher risk of professionalism concerns [23].
The Conscientiousness Index (CI) of medical students' performance is determined by various objective measures of conscientiousness, including punctuality, assignments submitted on time, completion of rotation evaluations, and meeting deadlines for submitting other required information, such as immunization status. A significant correlation was demonstrated between Clinical Conscientiousness Index (CCI) scores and OSCE scores, as well as with the construct of professionalism based on staff perceptions of individual students' professional behavior, making it a relatively simple and straightforward method for assessing professionalism [24, 25].
A mobile web-based platform for professionalism assessment in a situated clinical setting, Professional Mobile Monitoring of Behaviors (PROMOBES), is a recently described tool. Six domains of professional behavior (reliability, adaptability, peer relationships, upholding principles, team relationships, and scholarship) were documented on mobile devices. In contrast with the primary objective of capturing lapses, most reports were commendations. PROMOBES attained high acceptance ratings, especially because of its near real-time reporting capability, multiple observer inputs, and positive feedback [26]. Whether a web-based approach to the assessment of professional behavior has a potential advantage over the classic, paper-based method was also studied. The quantity and quality of comments provided by students and their peers, along with the feasibility, acceptability, and usefulness of the two approaches, were compared. While the web-based group provided significantly more feedback than the paper-based group, there was no significant difference in the quality of feedback between the two groups [27].
Another common method of continuous assessment was 'longitudinal assessment', which could take multiple forms. For example, a rating scale or a workplace assessment form can be used for the faculty's direct observation during rounds. Alternatively, a comprehensive assessment is done over a short span. In other instances, portfolios or students' written records are reviewed and an assessment is made. However, these methods are subject to variability and a lower degree of reliability.
Professionalism is a complex construct, and so is its assessment. Several other constructs, such as teamwork, life-long learning, communication skills, interpersonal skills, and empathy, have similar limitations. Since there is no universal definition of professionalism, there is no 'gold standard' for its assessment. Several tools, such as the OSCE, mini-CEX, and CPX, have been shown to be useful instruments for assessing professionalism because they offer multiple perspectives from different assessors; however, they do not assess professionalism comprehensively [16]. On the other hand, tools such as peer-assessment, self-assessment, and faculty observations during ward rounds may be valid, yet not reliable. Multi-source feedback (MSF) may be more comprehensive, but does not demonstrate the "shows how" level of Miller's pyramid [28].
The study has several limitations. Firstly, SCOPUS was the only database searched to identify relevant articles, as SCOPUS is widely used for citation-based systematic literature reviews [29, 30]. There are several benefits to using SCOPUS over other databases, such as Web of Science and ProQuest, as SCOPUS includes the widest range of articles with complete reference sets, with consistency and reliability [31]. PubMed was also searched using the same keywords, but since no additional publications were identified, we restricted our search to SCOPUS. Another limitation is that, since professionalism in medical practice is viewed from various perspectives, certain aspects of professionalism may not have been emphasized. We identified more than 1,000 articles related to the teaching, learning, and assessment of professionalism in undergraduate medical education and reviewed their titles and abstracts; this number of publications gives an idea of the scope of the subject.
The large number of tools for the assessment of professionalism in undergraduate medical education identified in this review underscores the importance of, and interest in, the subject. At the same time, the diversity and variability of the structure of the tools suggest that the quest for a standard tool is ongoing. The question arises whether more tools need to be developed, or whether it is time to select a few tools that are contextually relevant to the type of clinical rotation and culturally relevant to the setting where they are going to be implemented. In our opinion, future studies should explore improving the performance of existing tools and assessing professionalism longitudinally.
CONCLUSION
In conclusion, while the significance of medical professionalism and its assessment is widely acknowledged, there is currently no contextually related and culturally adapted tool for assessing medical professionalism. While most assessment tools are used for continuous assessment, a considerable number of familiar tools are also applied in formal assessments at the end of the year or semester. Formative and summative assessments are not opposing approaches; rather, they are complementary, and both formative and summative aspects of professional behavior can be integrated within a unified assessment framework. Curriculum developers and examination committees should choose methods that are contextually and culturally appropriate for their program.
FUNDING
None.
CONFLICT OF INTEREST
The authors declare no conflict of interest.
ACKNOWLEDGEMENTS
Declared none.
AUTHORS' CONTRIBUTION
IB conceived the design and drafted and reviewed the manuscript. NY, FJ and NAW drafted and reviewed the manuscript. NA did the literature search and analysis.
REFERENCES
1. Abadel FT, Hattab N. Patients' assessment of professionalism and communication skills of medical graduates. BMC Med Educ 2014; 14(1): 28.
2. Lesser CS, Lucey CR, Egener B, Braddock III CH, Linas SL, Levinson W. A behavioral and systems view of professionalism. J Am Med Assoc 2010; 304(24): 2732-7.
3. Swick HM. Toward a normative definition of medical professionalism. Acad Med 2000; 75(6): 612-6.
4. Wilkinson TJ, Wade WB, Knock LD. A blueprint to assess professionalism: Results of a systematic review. Acad Med 2009; 84(5): 551-8.
5. Hafferty F, Papadakis M, Sullivan W, Wynia M. The American Board of Medical Specialties Ethics and Professionalism Committee Definition of Professionalism. Chicago, Ill: American Board of Medical Specialties; 2012.
6. Wynia MK, Papadakis MA, Sullivan WM, Hafferty FW. More than a list of values and desired behaviors: A foundational understanding of medical professionalism. Acad Med 2014; 89(5): 712-4.
7. Hsieh JG, Kuo LC, Wang YW. Learning medical professionalism – the application of appreciative inquiry and social media. Med Educ Online 2019; 24(1).
8. Bray F, Laversanne M, Sung H, Ferlay J, Siegel RL, Soerjomataram I, et al. Global cancer statistics 2022: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin 2024; 74(3): 229-63.
9. Pervez S, Jabbar AA, Haider G, Qureshi MA, Ashraf S, Lateef F, et al. Karachi Cancer Registry (KCR): Consolidated data of 5-years 2017–2021. Esophagus 2023; 1: 01.
10. Van Mook WNKA, van Luijk SJ, O'Sullivan H, Wass V, Schuwirth LW, van der Vleuten CPM. General considerations regarding assessment of professional behaviour. Eur J Intern Med 2009; 20(4): e90-5.
11. Chin JJ. Ethical sensitivity and the goals of medicine: Resisting the tides of medical deprofessionalisation. Singapore Med J 2001; 42(12): 582-5.
12. Chin JJ. Ethical sensitivity and the goals of medicine: Resisting the tides of medical deprofessionalisation. Singapore Med J 2001; 42(12): 582-5.
13. Arnold L, Shue CK, Kritt B, Ginsburg S, Stern. Medical students' views on peer assessment of professionalism. J Gen Intern Med 2005; 20(9): 819-24.
14. Lynch DC, Surdyk PM, Eiser. Assessing professionalism: A review of the literature. Med Teach 2004; 26(4): 366-73.
15. Veloski JJ, Fields SK, Boex JR, Blank. Measuring professionalism: A review of studies with instruments reported in the literature between 1982 and 2002. Acad Med 2005; 80(4): 366-70.
16. Hodges BD, Ginsburg S, Cruess R, Cruess S, Delport R, Hafferty F, et al. Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Med Teach 2011; 33(5): 354-63.
17. Li H, Ding N, Zhang Y, Liu Y, Wen D. Assessing medical professionalism: A systematic review of instruments and their measurement properties. PLoS ONE 2017; 12(5).
18. Fong W, Kwan YH, Yoon S, Phang JK, Thumboo J, Leung YY, et al. Assessment of medical professionalism using the Professionalism Mini Evaluation Exercise (P-MEX) in a multi-ethnic society: A Delphi study. BMC Med Educ 2020; 20(1).
19. Tay KT, Ng S, Hee JM, Chia EWY, Vythilingam D, Ong YT, et al. Assessing professionalism in medicine – a scoping review of assessment tools from 1990 to 2018. J Med Educ Curric Dev 2020; 7: 2382120520955159.
20. Arnold EL, Blank LL, Race KEH, Cipparrone N. Can professionalism be measured? The development of a scale for use in medical education. Acad Med 1998; 73(10): 1119-21.
21. Roberts LW, Hammond KAG, Geppert CMA, Warner TD. The positive role of professionalism and ethics training in medical education: A comparison of medical student and resident perspectives. Acad Psychiatry 2004; 28(3): 170-82.
22. Hochberg M, Berman R, Ogilvie J, Yingling S, Lee S, Pusic M, et al. Midclerkship feedback in the surgical clerkship: the “Professionalism, Reporting, Interpreting, Managing, Educating, and Procedural Skills” application utilizing learner self-assessment. Am J Surg 2017; 213(2): 212-6.
23. Goss BD, Ryan AT, Waring J, Judd T, Chiavaroli NG, O'Brien RC, et al. Beyond selection: The use of situational judgement tests in the teaching and assessment of professionalism. Acad Med 2017; 92(6): 780-4.
24. Sahota GS, Taggar JS. The association between Situational Judgement Test (SJT) scores and professionalism concerns in undergraduate medical education. Med Teach 2020; 42(8): 937-43.
25. McLachlan JC, Finn G, Macnaughton J. The conscientiousness index: A novel tool to explore students' professionalism. Acad Med 2009; 84(5): 559-65.
26. Kelly M, O'Flynn S, McLachlan J, Sawdon MA. The clinical conscientiousness index: A valid tool for exploring professionalism in the clinical undergraduate setting. Acad Med 2012; 87(9): 1218-24.
27. Cendán JC, Castiglioni A, Johnson TR, Eakins M, Verduin ML, Asmar A, et al. Quantitative and qualitative analysis of the impact of adoption of a mobile application for the assessment of professionalism in medical trainees. Acad Med 2017; 92(Suppl): S33-42.
28. Van Mook WNKA, Muijtjens AMM, Gorter SL, Zwaveling JH, Schuwirth LW, van der Vleuten CPM. Web-assisted assessment of professional behaviour in problem-based learning: More feedback, yet no qualitative improvement? Adv Health Sci Educ 2012; 17(1): 81-93.
29. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(9): S63-7.
30. Ahmad N, Naveed A, Ahmad S, Butt I. Banking sector performance, profitability, and efficiency: A citation-based systematic literature review. J Econ Surv 2020; 34(1): 185-218.
31. Ahmad N, Aghdam RF, Butt I, Naveed. Citation-based systematic literature review of energy-growth nexus: An overview of the field and content analysis of the top 50 influential papers. Energy Econ 2020; 86.
32. Corbet S, Dowling M, Gao X, Huang S, Lucey B, Vigne SA. An analysis of the intellectual structure of research on the financial economics of precious metals. Resour Policy 2019; 63: 101416.
33. Curran VR, Fairbridge NA, Deacon D. Peer assessment of professionalism in undergraduate medical education. BMC Med Educ 2020; 20(1).
34. Noguera A, Arantzamendi M, López-Fidalgo J, Gea A, Acitores A, Arbea L, et al. Student's inventory of professionalism (SIP): A tool to assess attitudes towards professional development based on palliative care undergraduate education. Int J Environ Res Public Health 2019; 16(24).
35. Cheng X, Chen. An exploration of medical education in central and southern China: Measuring the professional competence of clinical undergraduates. Int J Environ Res Public Health 2019; 16(21): 4119.
36. Tavakol M, Pinner G. Using the Many-Facet Rasch Model to analyse and evaluate the quality of objective structured clinical examination: A non-experimental cross-sectional design. BMJ Open 2019; 9(9).
37. Kassab SE, Du X, Toft E, Cyprian F, Al-Moslih A, Schmidt H, et al. Measuring medical students' professional competencies in a problem-based curriculum: A reliability study. BMC Med Educ 2019; 19(1): 155.
38. Harendza S, Soll H, Prediger S, Kadmon M, Berberat PO, Oubaid. Assessing core competences of medical students with a test for flight school applicants. BMC Med Educ 2019; 19(1): 9.
39. Mullikin TC, Shahi V, Grbic D, Pawlina W, Hafferty. First year medical student peer nominations of professionalism: A methodological detective story about making sense of non-sense. Anat Sci Educ 2019; 12(1): 20-31.
40. Dolan BM, O'Brien CL, Cameron KA, Green MM. A qualitative analysis of narrative preclerkship assessment data to evaluate teamwork skills. Teach Learn Med 2018; 30(4): 395-403.
41. Curran VR, Deacon D, Schulz H, Stringer K, Stone CN, Duggan N, et al. Evaluation of a workplace assessment form to assess entrustable professional activities (EPAs) in an undergraduate surgery core clerkship. J Surg Educ 2018; 75(5): 1211-22.
42. Emke AR, Cheng S, Chen L, Tian D, Dufault. A novel approach to assessing professionalism in preclinical medical students using multisource feedback through paired self- and peer evaluations. Teach Learn Med 2017; 29(4): 402-10.
43. Sattler AL, Merrell SB, Lin SY, Schillinger E. Actual and standardized patient evaluations of medical students' skills. Fam Med 2017; 49(7): 548-52.
44. Hoffman LA, Shew RL, Vu TR, Brokaw JJ, Frankel RM. The association between peer and self-assessments and professionalism lapses among medical students. Eval Health Prof 2017; 40(2): 219-43.
45. Roberts C, Jorm C, Gentilcore S, Crossley. Peer assessment of professional behaviours in problem-based learning groups. Med Educ 2017; 51(4): 390-400.
46. Butler KL, Hirsh DA, Petrusa ER, Yeh DD, Stearns D, Sloane DE, et al. Surgery clerkship evaluations are insufficient for clinical skills appraisal: The value of a medical student surgical objective structured clinical examination. J Surg Educ 2017; 74(2): 286-94.
47. Burns CA, Lambros MA, Atkinson HH, Russell G, Fitch MT. Preclinical medical student observations associated with later professionalism concerns. Med Teach 2017; 39(1): 38-43.
48. Boysen-Osborn M, Yanuck J, Mattson J, Toohey S, Wray A, Wiechmann W, et al. Who to interview? Low adherence by U.S. medical schools to medical student performance evaluation format makes resident selection difficult. West J Emerg Med 2017; 18(1): 50-5.
49. Al Rumayyan AR, Al Zahrani AA, Hameed. High school versus graduate entry in a Saudi medical school – Is there any difference in academic performance and professionalism lapses? BMC Med Educ 2016; 16(1).
50. Lee M, Wimmers PF. Validation of a performance assessment instrument in problem-based learning tutorials using two cohorts of medical students. Adv Health Sci Educ 2016; 21(2): 341-57.
51. Lee KL, Tsai SL, Chiu YT, Ho. Can student self-ratings be compared with peer ratings? A study of measurement invariance of multisource feedback. Adv Health Sci Educ 2016; 21(2): 401-13.
52. Dong T, Durning SJ, Gilliland WR, Swygert KA, Artino AR Jr. Development and initial validation of a program director’s evaluation form for medical school graduates. Mil Med 2015; 180(4): 97-103.
53. Spandorfer J, Puklus T, Rose V, Vahedi M, Collins L, Giordano C, et al. Peer assessment among first year medical students in anatomy. Anat Sci Educ 2014; 7(2): 144-52.
54. Ramakrishna J, Valani R, Sriharan A, Scolnik D. Design and pilot implementation of an evaluation tool assessing professionalism, communication and collaboration during a unique global health elective. Med Confl Surviv 2014; 30(1): 56-65.
55. Braun UK, Gill AC, Teal CR, Morrison LJ. The utility of reflective writing after a palliative care experience: Can we assess medical students’ professionalism? J Palliat Med 2013; 16(11): 1342-9.
56. Pierce JR Jr, Noronha L, Collins NP, Fancovic E. Brief structured observation of medical student hospital visits. Educ Health 2013; 26(3): 188-91.
57. Ferenchick GS, Solomon D, Mohmand A, Towfiq B, Kavanaugh K, Warbasse L, et al. Are students ready for meaningful use? Med Educ Online 2013; 18.
58. Kalish R, Dawiskiba M, Sung YC, Blanco M. Raising medical student awareness of compassionate care through reflection of annotated videotapes of clinical encounters. Educ Health 2011; 24(3): 1-14.
59. Zanetti M, Keller L, Mazor K, Carlin M, Alper E, Hatem D, et al. Using standardized patients to assess professionalism: A generalizability study. Teach Learn Med 2010; 22(4): 274-9.
60. Berk RA. Using the 360° multisource feedback model to evaluate teaching and professionalism. Med Teach 2009; 31(12): 1073-80.
61. Elliott DD, May W, Schaff PB, Nyquist JG, Trial J, Reilly JM, et al. Shaping professionalism in pre-clinical medical students: Professionalism and the practice of medicine. Med Teach 2009; 31(7): e295-e302.
62. Teherani A, O'Sullivan PS, Lovett M, Hauer KE. Categorization of unprofessional behaviours identified during administration of and remediation after a comprehensive clinical performance examination using a validated professionalism framework. Med Teach 2009; 31(11): 1007-12.
63. Hodges D, McLachlan JC, Finn GM. Exploring reflective critical incident documentation of professionalism lapses in a medical undergraduate setting. BMC Med Educ 2009; 9(1).
64. Kovach RA, Resch DS, Verhulst SJ. Peer assessment of professionalism: A five-year experience in medical clerkship. J Gen Intern Med 2009; 24(6): 742-6.
65. Chang A, Boscardin C, Chou CL, Loeser H, Hauer KE. Predicting failing performance on a standardized patient clinical performance examination: The importance of communication and professionalism skills deficits. Acad Med 2009; 84(Suppl 10): S101-S4.
66. Du Preez RR, Pickworth GE, Van Rooyen M. Teaching professionalism: A South African perspective. Med Teach 2007; 29(9-10): e284-e91.
67. Schönrock-Adema J, Heijne-Penninga M, Van Duijn MAJ, Geertsma J, Cohen-Schotanus J. Assessment of professional behaviour in undergraduate medical education: Peer assessment enhances performance. Med Educ 2007; 41(9): 836-42.
68. Liu GC, Harris MA, Keyton SA, Frankel RM. Use of unstructured parent narratives to evaluate medical student competencies in communication and professionalism. Ambul Pediatr 2007; 7(3): 207-13.
69. Dannefer EF, Henson LC. The portfolio approach to competency-based assessment at the Cleveland Clinic Lerner College of Medicine. Acad Med 2007; 82(5): 493-502.
70. Lurie SJ, Lambert DR, Nofziger AC, Epstein RM, Grady-Weliky T. Relationship between peer assessment during medical school, dean's letter rankings, and ratings by internship directors. J Gen Intern Med 2007; 22(1): 13-6.
71. Arnold L, Shue CK, Kalishman S, Prislin M, Pohl C, Pohl H, et al. Can there be a single system for peer assessment of professionalism among medical students? A multi-institutional study. Acad Med 2007; 82(6): 578-86.
72. Mazor K, Clauser BE, Holtman M, Margolis MJ. Evaluation of missing data in an assessment of professional behaviors. Acad Med 2007; 82(10 Suppl): S44-S7.
73. Stark P, Roberts C, Newble D, Bax N. Discovering professionalism through guided reflection. Med Teach 2006; 28(1): e25-e31.
74. Korszun A, Winterburn PJ, Sweetland H, Tapper-Jones L, Houston P. Assessment of professional attitude and conduct in medical undergraduates. Med Teach 2005; 27(8): 704-8.
75. Bryan RE, Krych AJ, Carmichael SW, Viggiano TR, Pawlina W. Assessing professionalism in early medical education: Experience with peer evaluation and self-evaluation in the gross anatomy course. Ann Acad Med Singap 2005; 34(8): 486-91.
76. Van Zanten M, Boulet JR, Norcini JJ, McKinley D. Using a standardised patient assessment to measure professional attributes. Med Educ 2005; 39(1): 20-9.
77. Epstein RM, Dannefer EF, Nofziger AC, Hansen JT, Schultz SH, Jospe N, et al. Comprehensive assessment of professional competence: The Rochester experience. Teach Learn Med 2004; 16(2): 186-96.
78. Davis MH. OSCE: The Dundee experience. Med Teach 2003; 25(3): 255-61.
