Consensus-based statements for assessing clinical competency in podiatry-related work integrated learning
Journal of Foot and Ankle Research volume 16, Article number: 43 (2023)
The training of undergraduate and graduate-entry podiatry students in Australia and New Zealand includes practical sessions in a simulated and real-life clinical setting and Work Integrated Learning (WIL) comprising professional clinical placements. Student performance during WIL is evaluated by their Clinical Educators using clinical competency tools. Having a standardised and validated clinical assessment tool for WIL in podiatry would facilitate consistency in assessment, promote standardisation between programs, and ensure that all podiatry students are assessed against a set of criteria over the course of their clinical programs to the point of threshold clinical competency. Therefore, the aim of this study was to develop a series of consensus-based statements via Delphi technique as the first step towards developing guidelines to direct the assessment of podiatry students during WIL.
This study used a three-round modified Delphi consensus method. A panel of 25 stakeholders was sought, specifically comprising representatives from each of the universities in Australia and New Zealand that provide entry-level programs, Clinical Educators, podiatry student representatives, new podiatry graduates and consumers (podiatrists hiring new graduates). The survey for Round 1 aimed for consensus and consisted of five open-ended questions. Questions one to three asked respondents to nominate what they considered were the important elements that needed to be assessed for podiatry students undertaking WIL for: clinical performance/skills, communication, and professional behaviour. Question 4 asked respondents to identify further/other elements of importance, whilst Question 5 asked a) how these elements should be evaluated and b) how overall competency and ability to progress within the program should be determined. Rounds 2 and 3 aimed to gather agreement, with questions based on the responses from previous rounds.
Twenty-five participants agreed to participate, 17 females (68%) and eight males (32%). The panel consisted of 10 podiatry educators (40%), nine Clinical Educators (36%), two student representatives (8%), two new podiatry graduates (8%) and two consumers (8%). Of the 25 recruited participants, 21 responded to Round one, 18 to Round two and 17 to Round three. At the conclusion of the Delphi survey, 55 statements had reached consensus or agreement.
This Delphi study is the first of its kind for the podiatry profession to develop consensus-based statements regarding the assessment of WIL. Fifty-five statements pertinent to the assessment of WIL were identified. This is an important first step toward the development of a consistent WIL assessment tool which may be applied across entry-level podiatry programs across Australia and New Zealand.
In Australia, the practice of podiatry is governed by regulatory documents and legislation, namely the National Registration Act, Professional Capabilities for Podiatrists, and the Accreditation Standards: Entry-level podiatry programs. Entry-level podiatrists need to demonstrate that they possess the professional capabilities to practice podiatry safely and competently within these legislative bounds. The training of undergraduate and graduate-entry podiatry students includes theoretical lessons in a classroom and online setting, practical sessions in a simulated and real-life clinical setting, and Work Integrated Learning (WIL) comprising professional clinical placements. WIL is an important part of learning in the health sciences, as it provides the truest form of contextual learning, whereby learners make meanings by contextualising the content within the learning environment in the workplace, as well as incorporating authentic assessment which ensures graduates meet professional competencies and are ‘work-ready’.
Student knowledge and skills are assessed via written assessments, written and practical exams, and Objective Structured Clinical Examinations (OSCEs). Student performance during WIL is evaluated by their Clinical Educators using clinical competency tools.
Clinical competency tools should be able to ensure that students demonstrate professional and ethical behaviour, are good communicators and collaborators, and are competent in practising safely, in accordance with their level of progression in a program. The clinical competency tools can be used in a formative and summative manner. In addition, these tools can be used to report poor performance and concerning behaviour appropriately, and to track student progress across the course.
At the time of study design and implementation there were nine universities providing podiatry education in Australia and one in New Zealand. Podiatry students’ performance in WIL is assessed based on bespoke clinical competency tools developed in-house by the respective universities. This raises the potential of non-standardised approaches that may not offer consistency in overarching conceptual basis, scaling, reliability, and validation processes. By comparison, standardised assessment of WIL has been developed and widely adopted for other allied health professions, including physiotherapy (Assessment of Physiotherapy Practice), speech pathology (Competency Assessment in Speech Pathology) and occupational therapy (Student Practice Evaluation Form – Revised Edition).
Having a standardised and validated clinical assessment tool for WIL in podiatry would facilitate consistency in assessment, promote standardisation between programs, and ensure that all podiatry students are assessed against a set of criteria over the course of their clinical programs to the point of threshold clinical competency. Therefore, the aim of this study was to develop a series of consensus-based statements as the first step towards developing guidelines to direct the assessment of podiatry students during WIL. To ensure all voices could be heard equally, with anonymity, a Delphi consensus survey method was employed, seeking broad consultation with stakeholders, including providers, facilitators and end-users (students and consumers) of WIL.
This study used a three-round modified Delphi consensus method, where key stakeholders and experts in the field were invited to participate in a series of surveys seeking their views of key conceptual elements that underpin competency in clinical practice. As a common method of determining consensus in the absence of guidelines, the Delphi technique allows for a flexible approach to gain large amounts of data, with the ability to be conducted online in its entirety. The development and reporting of this study follows the Recommendations for the Conducting and Reporting of Delphi Studies (CREDES). This study was approved by the University of South Australia’s Human Research Ethics Committee (Protocol number 203578).
A purpose-built survey was developed by the authorship group for Round one in keeping with the novel aims of the study. Round one questions were purposefully open-ended to identify respondents’ individual thoughts and suggestions related to WIL assessment. The questions were initially developed by three of the authors (RC, HB, MH) following review of WIL assessment tools provided by several Australian and New Zealand providers of entry-level podiatry programs who responded to our request (i.e., Auckland University of Technology, Central Queensland University, Charles Sturt University, La Trobe University, Southern Cross University, University of Newcastle, University of South Australia, Western Sydney University). The full authorship group reviewed the questions before implementation of the survey, with wording modified based on their feedback.
The final survey for Round one consisted of five open-ended questions (Appendix 1). Questions one to three asked respondents to nominate what they considered were the important elements that needed to be assessed for podiatry students undertaking WIL for:

- Clinical performance/skills
- Communication
- Professional behaviour
Question 4 asked respondents to identify further/other elements of importance, whilst Question 5 asked a) how these elements should be evaluated (e.g., pass/fail, Likert scale, graded), and b) based on this evaluation approach, how should overall competency and ability to progress within the program be determined.
Rounds two and three of the survey were developed based on comments and responses received in the previous rounds.
A panel of 25 stakeholders was sought. The aim of recruitment was to seek a panel that had expertise in delivering WIL (e.g., providers and facilitators), and those with varied experiences of WIL (facilitators and end-users). Specifically, we sought expertise in WIL via academic providers, seeking representation from each of the universities in Australia and New Zealand who provide entry-level podiatry programs (n = 10). For facilitators with expertise and experience of WIL we sought Clinical Educators (n = 9) who have been engaged in supervising and assessing students in WIL. End-users with experience of WIL included podiatry student representatives (n = 2), new podiatry graduates (n = 2) and consumers (which for the purpose of this study were podiatrists who had employed two or more new graduates within the previous five years) (n = 2). Except for student representatives, new graduates and consumers, respondents were required to have a minimum of three years’ experience supervising and assessing students clinically.
As podiatry is a relatively small health profession, and podiatry academia a very small subset, the authorship team took steps to reduce the potential for introduced bias. Recruitment for this study was conducted via email. Emails were disseminated to the Program Leads in the 10 universities in Australia and New Zealand with a podiatry program and Program Leads were asked to nominate potential candidates who they believed met the criteria outlined above. A research assistant (SD) then contacted each nominee directly via email with an information sheet attached, and instructions to respond with a confirmation if they were willing to participate. To minimise location bias, the a priori decision was to recruit from a mixture of geographical locations if respondent interest exceeded requirements. This was managed by identifying state of practice of potential respondents and ensuring a representation of states were included (e.g., if our consumers came from Victoria and Queensland, then we first approached the nominated new graduates from Western Australia and New South Wales). To improve the robustness of outcomes, all potential respondents were asked to commit and respond to all three rounds at enrolment, maintain anonymity throughout the study period, keep individual responses confidential and agree to be contacted by email as a method of alerting and reminding the respondents of survey rounds requiring attention. No enticements or compensation were provided, and participants could withdraw their consent of participation at any time during the study period.
Participants who met the inclusion criteria and were included in the study received individual link invitations to each survey round via participant-provided email. Implementation of the Delphi process was undertaken by a research assistant (SD) to minimise the risk of potential conflicts of interest from the authors with participants. All data were collected using the online survey platform SurveyMonkey© (Momentive Inc., California, USA). Respondents confirmed consent at the start of the online survey for Round one, with skip logic engaged to exclude respondents who did not consent. All rounds were open for four calendar weeks and participants were reminded by email one week before the closing date of the survey. Those failing to respond were contacted by email after the closing date and offered a further extension if required. Participants who did not respond to the survey or follow-up emails within two weeks after the closing date were considered non-responders. Participants were provided with a copy of their individual responses each round and, on request, with a summary of results at the completion of the study.
Participants were able to make comments in Rounds one and two only. Statements for Rounds two and three were developed from individual comments made by respondents in the respective preceding rounds. Comments were themed via inductive qualitative content analysis, which allows individual comments to be considered, and statements developed on the overarching theme. Further comments were then considered and either deemed consistent with an existing statement or a new statement was developed accordingly. All comments were initially themed by three authors independently (RC, SD and HB), with inconsistencies discussed until agreement. Acknowledging the bias that may occur due to the collegial nature of the authors involved in the analysis (i.e., all three are employed at the same institution), a fourth author (MH) re-coded comments independently, with disagreements resolved by discussion.
Statements were accepted if they reached ≥ 70% consensus or agreement. This required 70% or more of the respondents to identify the same themed statement in Round one (consensus) or indicate that they agreed or strongly agreed (on a five-point Likert scale) with a themed statement in Round two or three (agreement). All themed statements from Round one were returned to participants in Round two. Round three included themed statements where 50 to 69% of participants had agreed or where there were additional comments from Round two, to ensure adequate consideration. If less than 50% of participants agreed to a statement it was excluded from future rounds. This percentage of consensus and agreement is consistent with existing and recent literature on the modified Delphi technique [14, 15].
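These thresholds amount to a simple round-to-round classification rule. As an illustrative sketch only (the function name and example tallies are ours, not drawn from the study), the decision logic can be expressed as:

```python
def classify_statement(n_agree: int, n_respondents: int) -> str:
    """Classify a themed statement by the study's thresholds:
    >= 70% agreement -> accepted; 50-69% -> returned for
    reconsideration in the next round; < 50% -> excluded."""
    pct = 100 * n_agree / n_respondents
    if pct >= 70:
        return "accepted"
    elif pct >= 50:
        return "carry forward"
    else:
        return "excluded"

# Hypothetical tallies for a round with 18 respondents:
classify_statement(13, 18)  # 72.2% -> "accepted"
classify_statement(10, 18)  # 55.6% -> "carry forward"
classify_statement(8, 18)   # 44.4% -> "excluded"
```

Note that in Round one "consensus" was reached when 70% or more of respondents independently nominated the same themed element, whereas in Rounds two and three the same 70% cut-off applied to Likert-scale agreement; the percentage rule itself is identical across rounds.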
A total of 45 nominations were received for potential participants. Twenty-five participants agreed to participate, 17 females (68%) and eight males (32%). The panel consisted of 10 academic providers (40%), nine Clinical Educators (36%), two student representatives (8%), two new podiatry graduates (8%) and two consumers (8%). While recruitment met our expertise and experience aims, there was a shortfall in geographical representation. Overall, our panel included three participants from Queensland (12%), nine from New South Wales (36%), three from Victoria (12%), four from South Australia (16%), one from Western Australia (4%) and five from New Zealand (20%). There was no representation from the Northern Territory, Australian Capital Territory or Tasmania.
From the 25 recruited participants, 21 responded to Round one, 18 to Round two and 17 in Round three. One participant withdrew from the study shortly after Round two had been sent out, the other seven were non-responders. Figure 1 outlines the flow of participants and survey characteristics through the three rounds.
From Round one, 341 comments were received from 21 respondents. Following analysis, one statement met consensus, “Demonstrates safe and effective scalpel skills” (Table 1). Sixty-four further statements were developed based on the comments provided; these statements were returned to respondents to seek agreement in Round two.
During Round two, 18 respondents indicated their level of agreement with the returned statements and made 18 further comments. Following analysis, 44 statements met the pre-determined level of ≥ 70% agreement (Table 1) and 11 statements required review in Round three (i.e., had obtained 50 to 69% agreement). Nine new statements were developed based on the comments provided. A total of 20 statements were returned to respondents in Round three.
Round three had 17 respondents, with 10 statements meeting the pre-determined level of ≥ 70% agreement (Fig. 1).
At the conclusion of the Delphi survey, 50 statements relating to assessable elements of importance (Table 1) and five statements relating to grading or evaluation (Table 2) had reached consensus or agreement. Excluded statements that did not meet the minimum 50% agreement required, or were out of the scope of this study, are available in Appendix 2 (Table 3).
This study obtained consensus-based statements on essential elements when assessing podiatry students’ competency during WIL, as informed by podiatry academics, Clinical Educators, students and end-users. It represents the necessary first step in the development of a valid WIL assessment tool specific to podiatry students, which will ultimately assist in consistency of clinical assessment across providers of entry-level podiatry programs in Australia and New Zealand.
Based on our findings, the essential elements identified by the Delphi technique share consistency with existing documentation. The primary elements from this study focus on competent clinical skills, communication and professional behaviour. These are consistent with several elements of the Professional Capabilities for Podiatrists document (2022), developed by the Podiatry Accreditation Committee of the Podiatry Board of Australia. The Professional Capabilities document covers five domains of expected competence for registered podiatrists: knowledge and skills; communication and collaboration; professional and ethical practice; lifelong learning; and quality and risk management. Arguably, the only domain not essential to the WIL experience of students is that of ‘lifelong learner’ due to its focus on continued learning and mentorship of peers/other health professionals, which is outside the need or ‘capabilities’ as they relate to students. Encouragingly, even though our findings do not specify a ‘quality and risk’ component, elements relevant to student expectations are covered in ‘Professional behaviour’ (such as, demonstrates and acts in accordance with relevant legislation, professional standards and guidelines, including consent, infection control, confidentiality, workplace health, safety and welfare). Similarly, many of our essential elements reflect those used within the COMPASS tool for speech pathology students, the Assessment of Physiotherapy Practice (APP), and the Student Practice Evaluation Form – Revised (Second Edition) (SPEF-R2) for occupational therapists.
As one example, our findings indicate students should have the “Ability to communicate appropriately with people involved in client care”, whereas the COMPASS requires students to ‘Communicate effectively with work teams’, the APP requires students to “Communicate effectively and appropriately – verbal/non-verbal”, and the SPEF-R2 has “communicates effectively with service users and significant others” as a core objective.
Another notable finding was that there is evidence to support the main essential elements accepted by our panel. Reynolds and McLean, when investigating Clinical Educator perceptions of podiatry students’ placement practice, identified that deficiencies in practical clinical skills and communication abilities contributed to a lack of preparedness. It is potentially this perceived importance of professional and communication skills in clinical performance, which are not mutually exclusive, that led to several essential elements being identified across categories. For example, ‘Demonstrates clear and appropriate history taking’ in the Performance/Clinical Skills section is similar to ‘Demonstrates note taking abilities’ identified in the Communication section. This speaks to the integrated nature of clinical practice, where it is acknowledged that no singular skill or task in isolation makes a good practitioner.
Of interest, many of the statements accepted by respondents relating to clinical skills were highly specific. For example, the single consensus statement relates to ‘safe and effective scalpel skills’, whilst statements focused on biomechanical assessment, orthotic manufacture, wound and nail care were also accepted. While these skills are irrefutably important, there were notably some podiatry-related tasks that were not identified or accepted as essential elements for the assessment of WIL activities (e.g., assessment and management of paediatric clients, serial casting for musculoskeletal concerns). Ideally, a universal assessment tool needs to be adaptable to a broad range of WIL experiences, able to be applied across cohorts with different levels of experience, adaptable to different levels of competence, and responsive to changing technology and practice scope (such as evolving methods of orthoses manufacture) to maintain relevance. This was identified and incorporated into the key elements by the respondents, with statements requiring students to maintain knowledge and identify client-focused, evidence-based, appropriately informed strategies for management of conditions that demonstrate clear clinical reasoning reaching 100% agreement. These elements are essential to ensure the tool remains relevant, ‘future-proof’ and able to be nuanced to individual institutions.
With regards to grading scales, this study found that the preference was for a clear pass grade to determine baseline competency. This can be determined by giving a pass/fail grade, or a mark over the midpoint of a Likert scale. This is similar to the APP, which uses a 5-point Likert scale to grade students’ competencies, with the midline being the base requirement for success. It must be noted that WIL activities occur at different points depending on the university program structure. Clear guidelines need to be developed to assist Clinical Educators to rate students’ competency according to their progress within the program.
The consensus statements developed in this study represent the initial step to inform the development of a standardised WIL assessment tool. However, the statements may require amalgamation or refinement with the aim of improving brevity and clarity. Further work is required to develop clear assessment criteria, with explanatory notes and examples. However, once developed, this tool may offer entry-level program providers and students greater validity and consistency in assessment of WIL, provide Clinical Educators with more guidance on what is expected of students, and allow accrediting and registration bodies greater confidence that graduating students from different programs have been assessed against the same criteria. Ultimately, this has the potential to help ensure consistency in the clinical capabilities of graduates entering the workforce resulting in improved patient experiences. Any subsequently developed tool may also prove to have international implications where podiatrists train in similar structures to Australia and New Zealand.
There are limitations of this study that need to be considered. All statements required consensus or agreement from the respondents but, in the context of evidence-based practice, represent low-level evidence and expert opinion only. When selecting a manageable number of participants for the study, there was a particular focus on expertise and experience within the podiatry profession (particularly within the Australian and New Zealand context); however, this in itself was a limitation, and the panel may have benefited from experience external to the profession. Further to this, when choosing ‘consumer representation’, we chose to interpret the consumer as the ‘employers’ who take on graduates when they complete their programs and enter the workforce. It could be argued that the panel may have still benefited from the input of people who receive podiatry care for a particular complaint. Despite transparently supplying respondents with a copy of their comments prior to each round to ensure they were satisfied with our management of them, there is potential that the authorship team could have introduced bias during the theming of statements. The act of theming statements may also, inadvertently, remove detail or nuance from respondents' initial comments. While it is intended that further investigations of the usability of a WIL tool may assist to define or develop statements as needed, it is important to acknowledge that ambiguity may exist in the data as provided within this study. Furthermore, we acknowledge that Round one questions, as created by the authorship group, may have introduced bias given our clear understanding of the current Australian ‘Professional capabilities for podiatrists’ and experience in the assessment of students undertaking WIL. Finally, the strengths of a Delphi technique are enhanced by the anonymity of participants and maintaining confidentiality of responses/respondents.
Whilst respondents were asked to maintain anonymity throughout the process, podiatry is a small profession and the possibility of intentional or non-intentional collusion among responders cannot be excluded.
This Delphi study is the first of its kind for the podiatry profession to develop consensus-based statements regarding the assessment of WIL. Through broad representation from providers and facilitators (academics and Clinical Educators), learners (students and new graduates) and stakeholders (employers), 55 statements pertinent to the assessment of WIL were identified. This is an important first step toward the development of a consistent WIL assessment tool which may be applied across entry-level podiatry programs.
Health Practitioner Regulation National Law Act 2009. https://www.legislation.qld.gov.au/view/pdf/inforce/current/act-2009-045.
Podiatry Board of Australia. Professional Capabilities for Podiatrists. Melbourne, Victoria: Australian Health Practitioner Regulation Agency (AHPRA); 2022. https://www.podiatryboard.gov.au/Registration-Endorsement/Podiatry-professional-capabilities.aspx.
Podiatry Board of Australia. Accreditation Standards: Entry-Level Programs. Melbourne, Victoria: Australian Health Practitioner Agency (AHPRA); 2021. https://www.podiatryboard.gov.au/Accreditation/Accreditationpublications-and-resources.aspx.
Delahaye B, Choy S. Using Work Integrated Learning for Management Development: Some Key Elements for Success. In Chapman, R (Ed.) Managing Our Intellectual and Social Capital: Proceedings of the 21st ANZAM 2007 Conference. Promaco Conventions Pty Ltd, CD Rom, 2007. pp. 1–16.
Spencer R. Your place or mine? Evaluating the perspectives of the practical legal training work experience placement through the eyes of the supervisors and the students. Int Electron J. 2007;8(2):365–76.
McNamara J. The challenge of assessing professional competence in work integrated learning. Assess Eval High Educ. 2013;38(2):183–97.
Dalton M, Keating J, Davidson M. Development of the Assessment of Physiotherapy Practice (APP): a standardised and valid approach to assessment of clinical competence in physiotherapy. In: Australian Learning and Teaching Council (ALTC) Final report. 2009. p. 6–28.
McAllister S, Lincoln M, Ferguson A, McAllister L. COMPASS®: Competency assessment in speech pathology assessment resource manual: excerpt professional competencies. 2nd ed. Melbourne: Speech Pathology Australia; 2013.
Caine AM, Copley J, Turpin M, Fleming J, Herd C. Development of the student practice evaluation form—revised (SPEF-R2): the first action research cycle. Aust Occup Ther J. 2021;68(1):21–31.
Jones J, Hunter D. Consensus methods for medical and health services research. BMJ. 1995;311(7001):376.
Vernon W. The Delphi technique: a review. Int J Ther Rehabil. 2009;16(2):69–76.
Jünger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: Recommendations based on a methodological systematic review. Palliat Med. 2017;31(8):684–706. https://doi.org/10.1177/0269216317690685.
Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
Williams CM, James A, Dars S, Banwell H. Development and use of the PodEssential and Paeds-PodEssential triage tools to define “essential” podiatry services. A Delphi survey, scoping review, and face validity testing study. J Foot Ankle Res. 2022;15(1):1–15.
Dars S, Uden H, Kumar S, Banwell HA. When, why and how foot orthoses (FOs) should be prescribed for children with flexible pes planus: a Delphi survey of podiatrists. PeerJ. 2018;6:e4667.
Reynolds K, McLean M. Clinical supervisors’ perceptions of podiatry students’ preparedness for clinical placement and graduates’ preparedness for podiatry practice in Australia: an exploratory study. Focus Health Prof Educ: Multi-Discip J. 2021;22(2):1–22.
The authors would like to formally thank all participants for their contribution to this work.
Funding for this project was provided by the Australian Podiatry Education and Research Foundation (APERF). APERF and its trustees had no input into the study design, data collection, analysis, or in writing of the manuscript.
Ethics approval and consent to participate
This study was approved by the University of South Australia’s Human Research Ethics Committee (Protocol number 203578).
Competing interests
The authors declare that they have no competing interests.
Cite this article
Causby, R.S., Dars, S., Ho, M. et al. Consensus-based statements for assessing clinical competency in podiatry-related work integrated learning. J Foot Ankle Res 16, 43 (2023). https://doi.org/10.1186/s13047-023-00639-7
- Clinical competency
- Work-integrated learning
- Clinical placements
- Delphi survey