A Delphi Study to Validate the Patient-Centered Doctor’s Competency Framework in Korea

Article information

Korean Med Educ Rev. 2024;26(Suppl 1):S64-S83
Publication date (electronic) : 2024 January 31
doi: https://doi.org/10.17496/kmer.23.045
1Department of Medical Education, Pusan National University School of Medicine, Busan, Korea
2Department of Medical Education, Wonkwang University School of Medicine, Iksan, Korea
3Department of Emergency Medicine, Chung-Ang University College of Medicine, Seoul, Korea
4Department of Medical Education, Chung-Ang University College of Medicine, Seoul, Korea
5Department of Family Medicine, Daegu Catholic University School of Medicine, Daegu, Korea
6Department of Psychiatry, Chungnam National University College of Medicine, Daejeon, Korea
7Department of Medical Education, Yonsei University College of Medicine, Seoul, Korea
Corresponding author: So Jung Yune Department of Medical Education, Pusan National University School of Medicine, 49 Busandaehak-ro, Mulgeum-eup, Yangsan 50612, Korea Tel: +82-51-510-8025 Fax: +82-51-510-8125 E-mail: cc139@pusan.ac.kr
Received 2023 December 28; Accepted 2023 December 29.


Defining a competent doctor is important for educating and training doctors. However, competency frameworks have rarely been validated during the process of their development in Korea. The purpose of this study was to validate the patient-centered doctor’s competency framework, which had been developed by our expert working group (EWG). Two rounds of Delphi questionnaire surveys were conducted among a panel of experts on medicine and medical education. The panel members were provided with six core competencies, 17 sub-competencies, and 53 enabling competencies, and were asked to rate the importance of these competencies on a 5-point Likert scale. Between April and July 2021, a total of 28 experts completed both rounds. The data of the Delphi study were analyzed for the mean, standard deviation, median, inter-rater agreement (IRA), and content validity ratio (CVR). A CVR >0.36 and IRA ≥0.75 were deemed to indicate validity and agreement. This study found that five enabling competencies were not valid, and agreement was not reached for three sub-competencies and two enabling competencies. In consideration of CVR and the individual opinions of panel members at each session, the final competencies were extracted through consensus meetings of the EWG. The competencies were modified into six core competencies, 16 sub-competencies, and 47 enabling competencies. This study is meaningful in that it proposes patient-centered doctor’s competencies enabling the development of residents’ milestone competencies, an assessment system, and educational programs.


Medical doctors’ competencies are important because they represent key milestones in medical education, and the implementation of these competencies in educational systems varies internationally. In the United States, the Accreditation Council for Graduate Medical Education has established six core competencies with corresponding milestones, sparking a lively debate, particularly in the context of residency training [1]. Canada developed competencies in the form of the “CanMEDS” framework, which defines the roles of a doctor; this effort was led by the Royal College of Physicians and Surgeons of Canada with the participation of various specialists and educators. Canadian residency training programs have been incorporating these competencies since the early 1990s, predating similar initiatives in the United States [2]. The United Kingdom has introduced competencies encapsulated by the “Good Medical Practice” guidelines and initiated outcome-based residency training in the mid-1990s [3]. A common feature of these programs is their integration of undergraduate medical education with continuing professional development, a concept that took root in specialty training during the 1990s.

In South Korea, however, competency-based medical education (CBME) has taken a different form. Initially, CBME was introduced at the undergraduate level, with each university creating and implementing its own set of competencies, rather than adopting a unified approach through a representative federation. When it comes to major-specific competencies, the Korean Institute of Medical Education and Evaluation released a report in 2010 on the development of a standardized curriculum for Korean residency programs, known as “RESPECT 100.” Subsequently, in 2013, the Korean Academy of Medical Sciences recommended a set of desirable educational competencies for residency training as part of a study aimed at reorganizing the curriculum for each specialty to enhance the efficiency of residency training [4,5]. In 2019, the Ministry of Health and Welfare announced an outcome-based approach to residency training [6]. However, research on the actual training programs and the identification of competencies within residency training has been sporadic [7,8]. Recognizing the need to investigate the competencies of Korean doctors beyond undergraduate education and specialty training, the “Korean Doctor’s Role” was established in 2014 through policy research [9]. Despite this development, it has not been actively integrated with the competencies established at the undergraduate level or those for specialties. Consequently, CBME in Korea faces challenges due to the insufficient connection between the competencies of doctors, specialty competencies, and undergraduate competencies.

Although the practice of CBME varies between South Korea and other countries, a shared characteristic is that each nation has developed its own framework outlining the desired competencies for doctors. This framework aims to create a continuum between undergraduate and graduate medical education. Within this framework, the competencies expected at the undergraduate level, during residency, and at the specialist level are delineated, with the term “milestone” used to describe the competency levels at each stage [10,11]. Consequently, defining the competencies required of residents necessitates the establishment of a foundational framework for doctors’ competencies.

The “Korean Doctor’s Role,” which has been used as a framework for Korean doctors’ competencies, is important because it was the first formal presentation of Korean doctors’ competencies. However, it has several limitations, particularly in the context of residency training and undergraduate education. These include the challenge of evaluating intangible aspects such as doctors’ attitudes and qualities; variability in the specificity of skill descriptions, with some milestones articulated as broad concepts rather than concrete competencies; ambiguous categorization of competencies due to overlapping content; and unclear descriptions that encompass multiple competencies or use undefined terms. Amidst the coronavirus disease 2019 (COVID-19) pandemic, there has been a shift towards incorporating social considerations and a patient-centered approach in healthcare. Historically, competencies for doctors, both domestically and internationally, have been predominantly devised by experts, without adequately reflecting the perspectives of patients and the broader medical community. In response to these challenges, the National Evidence-based Healthcare Collaborating Agency initiated a study entitled “How to Improve the Educational System for Residency Training to Improve Patient-Centered Outcomes.” This study necessitated a reevaluation and restructuring of the Korean Doctor’s Competency Framework, a high-level conceptual model, to define the competencies required of residents more accurately.

Therefore, this study was conducted as part of an initiative to establish a competency framework for physicians, serving as a foundation for both undergraduate and graduate medical education [12-17]. The development of the competency framework proceeded in stages: In Stage 1, the research team formed an expert working group and devised a plan for constructing the framework. In Stage 2, the researchers compared and analyzed both domestic and international competencies for physicians, reviewed relevant literature, social networking services (SNS), and newspaper articles, and finalized the competency framework through discussion and consensus. Stage 2 of the study primarily focused on identifying a competency framework that consists of six core competencies: (1) expertise in diseases and health, (2) communication with patients, (3) collaboration with colleagues, (4) guardianship of societal health, (5) professionalism in self-conduct, and (6) scholarship in academia. The framework is further detailed with 17 sub-competencies that are organically connected to the core competencies, and 53 enabling competencies, which are the functional units that must be demonstrable in practice. The third stage of the study involved conducting a Delphi study to validate the competency frameworks identified by the researchers. The Delphi method is the most widely used approach for decision-making by synthesizing expert opinions to reach consensus in uncertain situations. It has been employed in medical education for various purposes, including the development of curricula or assessment tools, the creation of educational resources, and the definition of competencies [18,19].

The purpose of this study was to validate the competencies created by the researchers under the theme of “patient-centered doctor’s competencies in Korea” from the perspective of an expert panel. To reflect the views of patients and the medical community, a large-scale survey and a big data analysis of newspaper articles and SNS were also conducted after the Delphi survey.


1. Participants

To validate the framework of patient-centered doctor’s competencies in Korea, a Delphi survey was conducted with an expert panel comprising clinical specialists and medical education experts, each with at least 10 years of experience in their respective fields. The panel consisted of 28 members who were carefully selected to ensure a balanced representation in terms of age, gender, geographic location, institutional affiliation, current department (specialty), and work experience (Table 1). There were more men (60.7%) than women (39.3%), and half of the participants (50.0%) had between 16 and 25 years of work experience. The majority (85.7%) were affiliated with university hospitals; the remainder included doctors from primary and secondary healthcare organizations as well as government employees. In terms of educational background, most panelists (71.4%) specialized in clinical medicine, with the remainder coming from fields such as basic medical science, education, and health administration. The clinical specialties represented on the panel were diverse, including internal medicine, surgery, obstetrics and gynecology, pediatrics, family medicine, emergency medicine, and psychiatry. Geographically, while the largest proportion (32.1%) of the panelists were based in Seoul, the rest were distributed evenly across the country (Table 1).

Participants’ characteristics (N=28)

2. Instruments

In the first round of the Delphi survey, participants were prompted to freely list three patient-centered competencies that healthcare providers should currently possess, as well as three additional patient-centered competencies that healthcare providers should aim to acquire in the future. This was done to determine whether there were any further competencies to be considered beyond those identified by the research team. Additionally, respondents were asked to rate the significance of each patient-centered competency for doctors in Korea, as established by a prior study, on a scale from 1 to 5. Participants were also given the opportunity to suggest any competencies that they felt should be removed, altered, or included. The second Delphi questionnaire was updated and expanded based on the feedback from the first round. Respondents were once again asked to assess the importance of each competency on a 1 to 5 scale and were invited to provide their own commentary. The Delphi survey form is available in Appendix 1.

After selecting experts for the Delphi survey and developing the survey materials, we conducted two rounds of the Delphi survey in April and July 2021, collecting all questionnaires at each instance. This study received approval from the Institutional Review Board of Chungnam National University Hospital (CNUH 2021-02-025) and was carried out after obtaining informed consent from all participants.

3. Analysis

The first and second rounds of Delphi data were analyzed using the mean, standard deviation, median, inter-rater agreement, and content validity ratio (CVR), with modifications made based on free comments regarding the deletion, modification, and addition of competencies. The CVR, a measure of content validity, quantifies the extent of agreement among expert panelists on the validity of each statement, specifically the proportion of panelists rating it as valid (a score of 4 or 5 on the 5-point scale). In a Delphi survey with 28 participants, content validity was deemed satisfactory if the CVR exceeded 0.36 [20]. If the CVR was 0.36 or lower, or if more than one expert questioned an item’s validity in the free-form comments, the item was deleted, merged into a single competency, or revised to better reflect the intended content. Expert panel agreement measures the degree to which panelists’ opinions concur; the threshold for agreement varies among researchers, ranging from 51% to 80%. In this study, consensus was considered achieved at a level of 0.75 or higher, a commonly accepted standard [18,19]. We prioritized the validity criterion and revised items with reference to the agreement index.
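As a rough illustration, the two indices can be computed from a panel’s Likert ratings as follows. This is a minimal sketch: the CVR follows Lawshe’s formula, while the agreement index shown, 1 − (Q3 − Q1)/median, is one quartile-based formulation commonly used in Delphi studies and is an assumption here, since the paper does not spell out its formula; the example ratings are hypothetical.

```python
from statistics import median, quantiles

def cvr(ratings, valid_score=4):
    """Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2),
    where n_e is the number of panelists rating the item 4 or 5."""
    n = len(ratings)
    n_e = sum(1 for r in ratings if r >= valid_score)
    return (n_e - n / 2) / (n / 2)

def agreement(ratings):
    """Quartile-based Delphi agreement index: 1 - (Q3 - Q1) / median
    (an assumed formulation; the paper does not state its formula)."""
    q1, _, q3 = quantiles(ratings, n=4, method="inclusive")
    return 1 - (q3 - q1) / median(ratings)

# Hypothetical ratings from a 28-member panel on a 5-point Likert scale
ratings = [5] * 14 + [4] * 10 + [3] * 4

print(round(cvr(ratings), 2))        # 0.71 -> exceeds 0.36, content-valid
print(round(agreement(ratings), 2))  # 0.78 -> meets the 0.75 threshold
```

Under these thresholds, an item with 24 of 28 panelists rating it 4 or 5 would be judged valid, whereas a CVR of 0.36 corresponds to roughly 19 of 28 panelists.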


1. First round of the Delphi survey

In our initial Delphi analysis, we discovered that when participants were asked an open-ended question to identify the top three patient-centered competencies for healthcare workers, as well as the top three patient-centered competencies essential for the future of healthcare workers, their responses encompassed all of the competencies we had identified.

In the validity analysis, the CVR exceeded 0.36 for all core competencies and sub-competencies. However, five enabling competencies had CVRs of 0.36 or below, indicating low content validity. These were “coordinating various health care services needed for patient care” (Competency 24) within the core competency of “collaborator”; “protecting the health of the population through public health activities in the community” (Competency 32) under the core competency of “health guardian”; “improving the efficiency of health care organizations through cost-effective management systems” (Competency 33); “contributing to reducing health disparities through fair distribution and equitable utilization of health care resources” (Competency 35); and “conducting research to generate knowledge and contribute to the dissemination of results” (Competency 52), associated with the core competency of “scholar in academia” (Tables 2–4). Free comments raised by two or more expert panelists are presented in Appendix 2. The main comments included: (1) the view that doctors’ participation in society, as described under “health guardian,” is more a societal role than a competency; (2) the question of whether the research and teaching competencies under “scholar in academia” are essential for all doctors; and (3) the recommendation to eliminate ambiguous expressions and revise them for greater clarity. The competencies were revised in consultation with the researchers, focusing on those with the lowest validity and incorporating feedback from the expert panel. The core competency “health guardian” was renamed “healthcare leader,” and “scholar in academia” was updated to “contributor to the advancement of medicine.” Competency 24, under “collaborator,” which had a CVR of 0.36 or lower, was eliminated and its elements were integrated into Competency 29.
Competency 32, under “health advocate,” was merged with Competency 30, “participating in the establishment of laws, institutions, and policies to protect and promote patients’ health,” resulting in a revised competency: “applying medical expertise to the establishment of policies, laws, and institutions that promote and protect patient health.” The phrasing for Competency 33 was refined to “ensuring effective and efficient management and operation of community healthcare organizations,” and Competency 35 was updated to “contributing to the elimination of disparities by equitably utilizing healthcare resources.” Competencies 47 and 48, as well as 49 and 50, were each consolidated into a single competency under the academic role, and Competency 52 was removed to ensure relevance for doctors in primary and secondary institutions, in addition to those in university hospitals. Following the first Delphi round, five items were either combined or deleted, resulting in a refined framework of six core competencies, 17 sub-competencies, and 48 enabling competencies (Tables 2–4, Appendix 3).

Delphi findings on core competencies

Delphi findings on sub-competencies

Delphi findings on enabling competencies

Items with an agreement score below 0.75 included “improving community health” and “fulfilling social responsibilities,” which were sub-competencies of the “health guardian” competency. Also falling below the threshold were “working with communities to identify and respond to their health determinants and needs” (Competency 31) and “contributing to the elimination of health disparities through fair distribution and equitable utilization of health resources” (Competency 35) (Tables 2–4). After a review of the competencies and consideration of the expert panel’s free comments, “improving community health” was merged with “social engagement for health promotion” to form “social activities for health promotion.” Competency 31 was rephrased as “participating in community-appropriate public health care activities in response to the health and health-related needs of residents.” Competency 35 was revised because it also failed to meet the validity criterion. However, “fulfilling social responsibility” was maintained as a separate category: although its agreement score was below the threshold, it was retained after discussion among the researchers because its validity score exceeded the standard.

2. Second round of the Delphi survey

The second round of the Delphi survey showed that Competency 33, “ensuring that community health care organizations are effectively and efficiently managed and operated,” had a CVR of 0.29; all other competencies met the content validity requirement. In the free-form comments, as in the first survey, disagreement persisted about doctors’ social participation as “healthcare leaders,” and many comments called for further clarification (Tables 2–4, Appendix 2).

After consultation among the researchers, we removed Competency 33, which had failed to meet the validity criterion. Additionally, we refined the sub-competency of “healthcare leader” into “social activities for health promotion.” This revision integrates “social engagement to improve patients’ health” with “providing healthcare services for the community,” enhancing the clarity of the competency’s description. Following the second Delphi round, the competencies were categorized into six core competencies, 16 sub-competencies, and 47 enabling competencies (Appendix 3).


The purpose of this study was to evaluate the validity of researcher-derived, patient-centered physician competencies using the Delphi method. The findings indicated that the majority of competencies were deemed valid. However, the competencies related to societal and academic roles included items that fell below the thresholds for validity and agreement, and these items drew a substantial number of comments from the expert panel.

First, the competencies associated with doctors’ role in society as “health advocates” or “healthcare leaders” had low validity, with numerous dissenting opinions. Specifically, the competencies deemed to have low validity included “protecting the health of the population through public health activities in the community (Competency 32),” “improving the efficiency of the healthcare organization through a cost-effective management system (Competency 33),” and “contributing to the elimination of healthcare disparities through fair distribution and equitable utilization of healthcare resources (Competency 35).” Furthermore, the competency “being able to coordinate various healthcare services needed for patient care (Competency 24),” initially categorized under the role of “collaborators” for fellow healthcare workers, could also be considered relevant to societal roles, indicating an overall trend of low-validity items in this category. The primary reason for this is that the competencies associated with societal roles are not solely the responsibility of individual doctors; they require collaboration with other professionals such as public servants and social workers. Moreover, not all doctors are expected to engage in these activities.

Social competencies are not explicitly described in the General Medical Council’s competencies in the United Kingdom, which operates a public healthcare system [3]. However, in the United States, the Accreditation Council for Graduate Medical Education (ACGME) competencies include “systems-based practice,” which focuses on understanding healthcare delivery systems and providing cost-effective care within the healthcare system [1]. Similarly, the Canadian competencies include roles such as “health advocate” and “leader,” which compel physicians to address the health needs of patients or communities, manage health resources effectively, and strive for improvements in the healthcare system [2]. However, similar to the results of this study, doctors viewed social competencies such as ACGME’s “systems-based practice” and CanMEDS “health advocate” as less important than other competencies [21-23]. A proposed explanation for this is that doctors who enter medical school are often not socially disadvantaged and have fewer opportunities to experience socially disadvantaged populations [24]. To counteract this, efforts are being made to raise awareness of social competencies by offering experiences that highlight healthcare disparities and by maintaining a website dedicated to connecting clinical medicine with social determinants of health [24].

In contrast, the growing interest in the social competence of doctors during the COVID-19 pandemic seems to have contributed positively to the validation of the “healthcare leader” competency [25-27]. Throughout the pandemic, doctors and medical education professionals have recognized the importance of understanding illness not merely as an individual patient issue, but within the broader context of a society’s healthcare system. They have also acknowledged the significance of offering medical support to vulnerable populations during treatment and vaccination efforts [28,29].

The researchers concluded that doctors bear a collective responsibility to society as “leaders” who play a pivotal role in healthcare organizations, both in preventing illness and in protecting the health of patients and communities. However, the term “social accountability” as described in the “Korean Doctor’s Role” (2014) is typically associated with providing limited support for vulnerable populations and does not fully capture the broader concept of improving patient and community health or delivering healthcare services. Therefore, this study adopted the more encompassing and general term “healthcare leader.” Additionally, given the low validity and agreement indices from the expert panel, the competencies were articulated in terms of “exercising expertise” and “contributing,” reflecting the level of responsibility that an individual doctor can and should assume in society.

Second, there was disagreement about doctors’ competencies in academia as “scholars” or “contributors to the advancement of medicine.” While there was a consensus on the importance of “continuing professional development,” many argued that engaging in “education” and “research” should be expected of doctors in tertiary hospitals, but not necessarily of all doctors. This sentiment was echoed in a survey examining perceptions of CanMEDS competencies, which showed that the scholarly competency associated with research was not only deemed the fifth least important out of seven competencies but also exhibited the largest discrepancy in perceived importance between generalists and specialists [22]. The present study concluded that doctors in primary or secondary care settings should also foster learning among peers and possess the ability to pose academic questions and seek scientific solutions within the medical field, and the competency descriptions were generalized to reflect this. However, terms such as “academic,” “scholar,” or “research” were replaced with “a person who contributes to the advancement of medicine” to avoid placing an undue burden on general practitioners. Additionally, the phrase “contributes to the conduct of research to generate knowledge and the dissemination of its results” was removed because it suggested a direct involvement in research, which was not the intent of Competency 52.

This study has several strengths. First, we are confident that the Delphi method was effectively employed to validate the competencies of doctors. The expert panel participating in the Delphi study was diverse, including doctors from both primary and secondary hospitals, as well as university hospitals and government officials. The panel was not limited to clinicians; it also comprised experts in basic medicine, education, and administration. Furthermore, the panel members were geographically representative of the hospitals where they practiced. Second, in addition to the Delphi study, our research on competencies is comprehensive, involving the analysis of domestic and international literature, social media, and newspaper articles. We are also conducting large-scale surveys targeting citizens, nurses, medical students, residents, and specialists, and organizing public hearings to gather a wide range of perspectives.

However, there are limitations to this study. As mentioned earlier, disagreements regarding the competencies of “healthcare leader” and “contributor to the advancement of medicine” are likely to persist even if statistical validity and consensus levels are satisfactory. Although we have removed ambiguity and used language that is easier to understand, there may be difficulties with using this competency in the education of students or specialists, and the meaning may change during implementation.

The “patient-centered doctor’s competencies” developed in this study are significant for several reasons. First, the concept of competence pertains to the abilities a doctor possesses at the culmination of training. Therefore, it is anticipated that milestones—incremental competencies symbolizing the educational objectives for medical students and residents—will be established and serve as a foundation for their training. Second, the development of a competence evaluation system would enable the monitoring and individual assessment of medical students’ and residents’ competencies. Third, such an assessment system could pinpoint areas of deficiency and guide the creation of educational programs designed to address these shortcomings.

In a follow-up study, it will be necessary to actively review the validity not only through a Delphi study, which is a content validation process, but also through the statistical validation of construct validity or criterion validity. In addition, efforts should be made to develop patient-centered competencies through comparative studies of experts’ views on doctors’ competencies and patients’ and society’s views. Above all, it is necessary to develop competencies with specificity, clarity, transparency, and applicability in mind, and to re-measure their validity while applying them to the education of students and specialists.

This study focused on developing and validating patient-centered doctors’ competencies, and it is hoped that these competencies will be used in the future for stepwise competency development, competency assessment systems, and educational program development.


Conflict of interest

Woo Taek Jeon, Hanna Jung, and Youngjon Kim are Editorial Board members of KMER, but were not involved in the peer reviewer selection, evaluation, or decision process of this article. No other potential conflict of interest relevant to this article was reported.


This research was funded by the Ministry of Health and Welfare for the project titled “Patient-Centered Clinical Research Coordinating Center” (project number: HC20C0138).

Authors’ contribution

Study design: SI, SJY, YK, CK, GHL, SWL, WTJ, HJ; data analysis: SI, SJY, YK, CK, GHL, SWL, WTJ, HJ; manuscript writing: SI, SJY; and final approval of the version to be published: all authors.


We would like to thank the expert panelists who participated in the Delphi study and the more than 70 co-researchers who participated in the study, “How to Improve Medical Education to Improve Patient-Centered Outcomes.”


1. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007;29(7):648–54. https://doi.org/10.1080/01421590701392903.
2. Frank JR, Snell L, Sherbino J. CanMEDS 2015 physician competency framework. Ottawa (ON): The Royal College of Physicians and Surgeons of Canada; 2015.
3. General Medical Council. Good medical practice. London: General Medical Council; 2020.
4. Lee MS, Ahn DS, Kim MK, Kim YR, Bae JY. Development of generic curriculum for graduate medical education. Seoul: Research Institute for Healthcare Policy, Korea Medical Association; 2010.
5. Kim JJ, Whang KC, Kang WK, Kwon SH, Kim JT, Lee SK, et al. A study on the reorganization of the training curriculum for efficient training of the residents. Sejong: Ministry of Health and Welfare; 2013.
6. Ministry of Health and Welfare. Announcement of partial revision of annual training curriculum for residents [Internet]. Sejong: Ministry of Health and Welfare; 2019. [cited 2022 Aug 29]. Available from: http://www.mohw.go.kr/react/modules/download.jsp?BOARD_ID=5900&CONT_SEQ=347949&FILE_SEQ=259258.
7. Kim HJ, Kyeon YG, Choi JH, Oh HS, Lee SM, Jung SW, et al. A recognition survey by psychiatry residents and psychiatrists regarding the quality of residency training and clinical competence in Korea. J Korean Neuropsychiatr Assoc 2020;59(2):148–58. https://doi.org/10.4306/jknpa.2020.59.2.148.
8. Kim SG. New start of surgical residents training: the first survey of program directors in Korea. BMC Med Educ 2019;19(1):208. https://doi.org/10.1186/s12909-019-1646-3.
9. Ahn DS, Han JJ, Lee MJ, Huh YJ, Kwon BK, Kim MK, et al. A study on role of Korea’s doctors. Sejong: Ministry of Health and Welfare; 2013.
10. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system: rationale and benefits. N Engl J Med 2012;366(11):1051–6. https://doi.org/10.1056/NEJMsr1200117.
11. Frank JR, Snell LS, Sherbino J. Draft CanMEDS 2015 milestones guide. Ottawa (ON): The Royal College of Physicians and Surgeons of Canada; 2014.
12. Jeon WT. Development of a new framework for doctor’s competencies in Korea. Korean Med Educ Rev 2022;24(2):77–8. https://doi.org/10.17496/kmer.2022.24.2.77.
13. Jeon WT, Jung H, Kim WJ, Kim C, Yune S, Lee GH, et al. Patient-centered doctor’s competency framework in Korea. Korean Med Educ Rev 2022;24(2):79–92. https://doi.org/10.17496/kmer.2022.24.2.79.
14. Kim YJ, Lee JW, Yune SJ. Research trends on doctor’s job competencies in Korea using text network analysis. Korean Med Educ Rev 2022;24(2):93–102. https://doi.org/10.17496/kmer.2022.24.2.93.
15. Jung H, Lee JW, Lee GH. Analysis of social needs for doctors and medicine through a keyword analysis of newspaper articles (2016-2020). Korean Med Educ Rev 2022;24(2):103–12. https://doi.org/10.17496/kmer.2022.24.2.103.
16. Kim C, Lee JW, Im S. Social accountability in medical education. Proceedings of the Korean Medical Education Conference 2022 of The Korean Society of Medical Education; 2022 May 19-20; Suwon, Korea. Seoul: Korean Society of Medical Education; 2022.
17. Yune S. A study on the patient-centered doctor’s competency in Korea: Korean doctor’s competency as perceived by citizens and medical professionals. Proceedings of the Korean Medical Education Conference 2022 of The Korean Society of Medical Education; 2022 May 19-20; Suwon, Korea. Seoul: Korean Society of Medical Education; 2022.
18. Humphrey-Murto S, Varpio L, Gonsalves C, Wood TJ. Using consensus group methods such as Delphi and Nominal Group in medical education research. Med Teach 2017;39(1):14–9. https://doi.org/10.1080/0142159X.2017.1245856.
19. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs 2000;32(4):1008–15. https://doi.org/10.1046/j.1365-2648.2000.t01-1-01567.x.
20. Lawshe CH. A quantitative approach to content validity. Pers Psychol 1975;28(4):563–75. https://doi.org/10.1111/j.1744-6570.1975.tb01393.x.
21. Yaszay B, Kubiak E, Agel J, Hanel DP. ACGME core competencies: where are we? Orthopedics 2009;32(3):171.
22. Stutsky BJ, Singer M, Renaud R. Determining the weighting and relative importance of CanMEDS roles and competencies. BMC Res Notes 2012;5:354. https://doi.org/10.1186/1756-0500-5-354.
23. Rademakers JJ, de Rooy N, Ten Cate OT. Senior medical students’ appraisal of CanMEDS competencies. Med Educ 2007;41(10):990–4. https://doi.org/10.1111/j.1365-2923.2007.02842.x.
24. Boroumand S, Stein MJ, Jay M, Shen JW, Hirsh M, Dharamsi S. Addressing the health advocate role in medical education. BMC Med Educ 2020;20(1):28. https://doi.org/10.1186/s12909-020-1938-7.
25. Daniel M, Gordon M, Patricio M, Hider A, Pawlik C, Bhagdev R, et al. An update on developments in medical education in response to the COVID-19 pandemic: a BEME scoping review: BEME guide no. 64. Med Teach 2021;43(3):253–71. https://doi.org/10.1080/0142159X.2020.1864310.
26. Papapanou M, Routsi E, Tsamakis K, Fotis L, Marinos G, Lidoriki I, et al. Medical education challenges and innovations during COVID-19 pandemic. Postgrad Med J 2022;98(1159):321–7. https://doi.org/10.1136/postgradmedj-2021-140032.
27. Kaul V, Gallo de Moraes A, Khateeb D, Greenstein Y, Winter G, Chae J, et al. Medical education during the COVID-19 pandemic. Chest 2021;159(5):1949–60. https://doi.org/10.1016/j.chest.2020.12.026.
28. Cutter CM, Nelson C, Abir M. Accountability to population health in the COVID-19 pandemic: designing health care delivery within a social responsibility framework. Popul Health Manag 2021;24(1):3–5. https://doi.org/10.1089/pop.2020.0096.
29. Nelson E, Waiswa P, Coelho VS, Sarriot E. Social accountability and health systems’ change, beyond the shock of COVID-19: drawing on histories of technical and activist approaches to rethink a shared code of practice. Int J Equity Health 2022;21(Suppl 1):41. https://doi.org/10.1186/s12939-022-01645-0.


Appendix 1. Delphi survey form


Appendix 2. Delphi survey key findings


Appendix 3. Comparing patient-centered doctor's competencies in Korea before and after a Delphi survey



Table 1.

Participants’ characteristics (N=28)

Characteristic Category No. of participants
Gender Male 17
Female 11
Years of related experience (yr) 11–15 4
16–20 7
21–25 7
26–30 6
>30 4
Affiliation University hospital 24
Primary/secondary hospital 3
Government organization 1
Major Basic medical science 4
Clinical medicine 20
Education 3
Administration 1
Residence Seoul/Gyeonggi 9
Pusan/Gyeongsangnam 6
Daegu/Gyeongsangbuk 4
Jeonnam/Jeonbuk 3
Choongnam/Choongbuk 3
Gangwon/Jeju 2
International (USA) 1

Table 2.

Delphi findings on core competencies

Core competency 1st Round 2nd Round
Mean±SD Median Agreement CVR Mean±SD Median Agreement CVR
An expert on disease/health 4.82±0.39 5.00 1.00 1.00 4.86±0.36 5.00 1.00 1.00
A communicator with patients 4.75±0.44 5.00 0.95 1.00 4.89±0.31 5.00 1.00 1.00
A collaborator with healthcare colleagues 4.32±0.72 4.00 0.75 0.71 4.50±0.51 4.50 0.78 1.00
A health advocate for society 4.21±0.74 4.00 0.75 0.50 4.46±0.58 4.50 0.78 0.93
A professional for oneself 4.75±0.52 5.00 1.00 1.00 4.82±0.39 5.00 1.00 1.00
A scholar for the advancement of medicine 3.89±0.69 4.00 0.75 0.43 3.93±0.60 4.00 1.00 0.57

SD, standard deviation; CVR, content validity ratio.
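The CVR column in the tables follows Lawshe's content validity ratio (reference 20): CVR = (nE − N/2)/(N/2), where N is the panel size and nE is the number of panelists rating an item as important. A minimal Python sketch of the reported statistics follows; treating ratings of 4 or 5 on the 5-point scale as "important" and the example ratings themselves are assumptions for illustration, and inter-rater agreement is omitted because its exact computation is not specified in this section.

```python
import statistics

def cvr(ratings, threshold=4):
    """Lawshe's content validity ratio: (n_e - N/2) / (N/2),
    where n_e counts panelists rating at or above the threshold
    (4 on the 5-point scale -- an assumed cutoff)."""
    n = len(ratings)
    n_e = sum(1 for r in ratings if r >= threshold)
    return (n_e - n / 2) / (n / 2)

def summarize(ratings):
    """Mean, standard deviation, and median, as reported per item."""
    return (statistics.mean(ratings),
            statistics.stdev(ratings),
            statistics.median(ratings))

# Hypothetical panel of 28: 23 panelists rate the item 5, five rate it 4.
ratings = [5] * 23 + [4] * 5
print(round(cvr(ratings), 2))   # -> 1.0 (every rating is >= 4)
mean, sd, median = summarize(ratings)
print(round(mean, 2), round(sd, 2), median)
```

With N=28, a CVR above the 0.36 validity cutoff requires at least 20 of the 28 panelists to rate the item 4 or 5; for example, 24 such ratings give CVR = (24 − 14)/14 ≈ 0.71, a value that appears in several rows above.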

Table 3.

Delphi findings on sub-competencies

Competencies 1st Round 2nd Round
Mean±SD Median Agreement CVR Mean±SD Median Agreement CVR
An expert on disease/health
 Competent practice 4.64±0.49 5.00 0.80 0.43 4.86±0.36 5.00 1.00 1.00
 Patient-centered reasoning and decision-making 4.68±0.48 5.00 0.80 1.00 4.79±0.42 5.00 1.00 1.00
 Promoting patient safety and quality of life 4.64±0.49 5.00 0.80 1.00 4.79±0.42 5.00 1.00 1.00
A communicator with patients
 Patient-physician partnership 4.64±0.56 5.00 0.80 1.00 4.89±0.31 5.00 1.00 1.00
 Empathic communication 4.50±0.58 5.00 0.80 0.93 4.61±0.57 5.00 0.80 0.93
 Informed consent 4.75±0.44 5.00 0.95 1.00 4.82±0.39 5.00 1.00 1.00
A collaborator with healthcare colleagues
 Effective consultation and transfer 4.46±0.58 4.50 0.78 0.93 4.50±0.51 4.50 0.78 1.00
 Teamwork and continuous quality improvement 4.29±0.66 4.00 0.75 0.79 4.39±0.63 4.00 0.75 0.86
A health advocate for society
 Participating in social activities for health promotion 4.36±0.78 5.00 0.80 0.64 4.36±0.62 4.00 0.75 0.86
 Improving public health 4.04±0.74 4.00 0.69 0.50 4.11±0.74 4.00 0.75 0.57
 Fulfilling social accountability 4.21±0.83 4.00 0.69 0.50 4.18±0.67 4.00 0.75 0.71
A professional for oneself
 Adhering to ethical standards in patient care 4.79±0.42 5.00 1.00 1.00 4.96±0.19 5.00 1.00 1.00
 Participating in doctor-led self-regulation 4.36±0.68 4.00 0.75 0.79 4.61±0.50 5.00 0.80 1.00
 Managing physicians’ health and well-being 4.21±0.74 4.00 0.75 0.64 4.43±0.57 4.00 0.75 0.93
A scholar for the advancement of medicine
 Continuing professional development 4.61±0.57 5.00 0.80 0.93 4.57±0.57 5.00 0.80 0.93
 Facilitating professional learning 3.96±0.64 4.00 1.00 0.57 4.25±0.75 4.00 0.75 0.64
 Contributing to research 3.82±0.61 4.00 0.75 0.43 3.86±0.59 4.00 0.94 0.50

SD, standard deviation; CVR, content validity ratio.

Table 4.

Delphi findings on enabling competencies

Competencies Competencies no.a) 1st Round 2nd Round
Mean±SD Median Agreement CVR Mean±SD Median Agreement CVR
An expert on disease/health
 Competent practice (1) 4.75±0.44 5.00 0.95 1.00 4.86±0.36 5.00 1.00 1.00
(2) 4.64±0.56 5.00 0.80 0.93 4.79±0.42 5.00 1.00 1.00
(3) 4.14±0.76 4.00 0.75 0.71 4.18±0.77 4.00 0.75 0.71
(4) 4.29±0.66 4.00 0.75 0.79 4.50±0.58 5.00 0.80 0.93
 Patient-centered reasoning and decision-making (5) 4.82±0.39 5.00 1.00 1.00 4.93±0.26 5.00 1.00 1.00
(6) 4.64±0.68 5.00 0.95 0.79 4.57±0.63 5.00 0.80 0.86
(7) 4.75±0.44 5.00 0.95 1.00 4.71±0.46 5.00 0.80 1.00
 Promoting patient safety and quality of life (8) 4.64±0.49 5.00 0.80 1.00 4.61±0.50 5.00 0.80 1.00
(9) 4.64±0.49 5.00 0.80 1.00 4.68±0.48 5.00 0.80 1.00
(10) 4.50±0.64 5.00 0.80 0.86 4.57±0.63 5.00 0.80 0.86
(11) 4.46±0.64 5.00 0.80 0.86 4.61±0.57 5.00 0.80 0.93
(12) 4.54±0.64 5.00 0.80 0.86 4.68±0.48 5.00 0.80 1.00
A communicator with patients
 Patient-physician partnership (13) 4.71±0.46 5.00 0.80 1.00 4.82±0.39 5.00 1.00 1.00
(14) 4.32±0.72 4.00 0.75 0.86 4.39±0.63 4.00 0.75 0.86
(15) 4.21±0.69 4.00 0.75 0.71 4.43±0.50 4.00 0.75 1.00
(16) 4.46±0.74 5.00 0.80 0.71 4.50±0.58 5.00 0.80 0.93
 Empathic communication (17) 4.39±0.63 4.00 0.75 0.86 4.68±0.48 5.00 0.80 1.00
(18) 4.54±0.58 5.00 0.80 0.93 4.64±0.49 5.00 0.80 1.00
 Informed consent (19) 4.86±0.36 5.00 1.00 1.00 4.86±0.36 5.00 1.00 1.00
(20) 4.50±0.58 5.00 0.80 0.93 4.61±0.50 5.00 0.80 1.00
(21) 4.46±0.58 4.50 0.78 0.93 4.79±0.42 5.00 1.00 1.00
A collaborator with healthcare colleagues
 Effective consultation and transfer (22) 4.71±0.46 5.00 0.80 1.00 4.82±0.39 5.00 1.00 1.00
(23) 4.61±0.57 5.00 0.80 0.93 4.71±0.53 5.00 1.00 0.93
(24) 4.04±0.84 4.00 0.50 0.36 Combined with no. 29
 Teamwork and continuous quality improvement (25) 4.25±0.70 4.00 0.75 0.86 4.39±0.63 4.00 0.75 0.86
(26) 4.54±0.51 5.00 0.80 1.00 4.54±0.58 5.00 0.80 0.93
(27) 4.14±0.76 4.00 0.75 0.71 4.36±0.68 4.00 0.75 0.93
(28) 4.32±0.67 4.00 0.75 0.79 4.57±0.50 5.00 0.80 1.00
A health advocate for society
 Participating in social activities for health promotion (29) 4.11±0.79 4.00 0.75 0.64 4.29±0.66 4.00 0.75 0.79
(30) 4.00±0.72 4.00 0.88 0.43 4.25±0.65 4.00 0.75 0.79
 Improving public health (31) 3.96±0.74 4.00 0.69 0.43 4.04±0.74 4.00 0.69 0.50
(32) 3.75±0.80 4.00 0.75 0.36 Combined with no. 30
(33) 3.61±0.88 4.00 0.75 0.14 3.71±0.71 4.00 0.75 0.29 (Deleted)
 Fulfilling social accountability (34) 3.75±0.75 4.00 0.75 0.43 3.89±0.74 4.00 0.94 0.50
(35) 3.96±0.84 4.00 0.50 0.36 4.07±0.66 4.00 0.94 0.64
(36) 4.18±0.77 4.00 0.75 0.71 4.39±0.57 4.00 0.75 0.93
A professional for oneself
 Adhering to ethical standards in patient care (37) 4.79±0.42 5.00 1.00 1.00 4.93±0.26 5.00 1.00 1.00
(38) 4.68±0.48 5.00 0.80 1.00 4.71±0.46 5.00 0.80 1.00
(39) 4.43±0.57 4.00 0.75 0.93 4.54±0.51 5.00 0.80 1.00
(40) 4.54±0.64 5.00 0.80 0.86 4.71±0.46 5.00 0.80 1.00
(41) 4.79±0.50 5.00 1.00 0.93 4.86±0.36 5.00 1.00 1.00
 Participating in doctor-led self-regulation (42) 3.86±0.93 4.00 0.50 0.43 4.00±0.72 4.00 0.88 0.50
(43) 4.37±0.69 4.00 0.75 0.79 4.57±0.57 5.00 0.80 0.93
(44) 4.29±0.60 4.00 0.75 0.86 4.50±0.64 5.00 0.80 0.86
 Managing physicians’ health and well-being (45) 4.45±0.58 5.00 0.80 0.93 4.57±0.50 5.00 0.80 1.00
(46) 4.36±0.62 4.00 0.75 0.86 4.43±0.50 4.00 0.75 1.00
A scholar for the advancement of medicine
 Continuing professional development (47) 4.61±0.57 5.00 0.80 0.93 4.64±0.56 5.00 0.80 0.93
(48) 4.57±0.50 5.00 0.80 0.93 Combined with no. 47
 Facilitating professional learning (49) 4.07±0.60 4.00 1.00 0.71 4.25±0.65 4.00 0.75 0.79
(50) 4.00±0.61 4.00 1.00 0.64 Combined with no. 49
 Contributing to research (51) 3.96±0.64 4.00 1.00 0.57 4.14±0.59 4.00 0.94 0.79
(52) 3.68±0.61 4.00 0.75 0.21 Deleted
(53) 4.32±0.72 4.00 0.75 0.71 4.71±0.46 5.00 0.80 1.00

SD, standard deviation; CVR, content validity ratio.

a)For competencies (1) to (53) see Appendix 1.