SQE Quality Assurance Report 2023-24
11 March 2025
Introduction
Here we explain what we have done to quality assure the Solicitors Qualifying Examination (the SQE) between October 2023 and July 2024. This is the third report since the introduction of the SQE in autumn 2021.
The previous SQE Quality Assurance report was published in April 2024.
Kaplan is the sole assessment provider for the SQE. It publishes a statistical report for each assessment delivery which provides information on the assessment, including the pass marks and pass rates. These reports are:
Kaplan has also published an annual report which provides a cumulative picture of the outcomes from SQE assessments delivered between July 2023 and July 2024.
The SQE is also the end point assessment (EPA) for solicitor apprentices. We are appointed by the Institute for Apprenticeships and Technical Education (IfATE) as the external quality assurance organisation for the EPA. Our quality assurance applies IfATE's principles for assuring the EPA.
Our priority, when quality assuring the SQE, is that standards for entry into the profession are at the right level and consistently applied, and that the SQE is up-to-date and fit for purpose. This provides confidence in the SQE as a professional assessment and in the competence of those who have passed.
We have a quality assurance framework to make sure that effective controls are in place to manage the risks to the quality and standard of the SQE. This provides the evidence for the assurances which are necessary to maintain confidence in the SQE as a fair and robust assessment of day one solicitor competence. We do this through:
- regular meetings with Kaplan
- Kaplan and us fulfilling our contractual obligations
- systematic monitoring, evaluation and analysis of assessment data
- obtaining evidence of compliance with agreed policies and procedures
- observations of live assessment deliveries and assessor and marker standardisation meetings
- ensuring compliance with the SQE Assessment Regulations.
We have three 'Subject Matter Experts' (SMEs), one of whom was appointed in September 2024 for a three-year term. The other two SMEs have been in post since 2021. Their term will end in August 2025.
The SMEs provide expert, objective and independent assurance about the quality of the assessments and contribute more widely to the quality assurance of the SQE. The SMEs all have experience in practice as a solicitor and they bring a breadth and depth of knowledge and experience in the areas of functioning legal knowledge covered by the SQE and in professional assessment.
In the past year, the SMEs have reviewed a sample of questions for both SQE1 and SQE2 assessments, observed live assessment deliveries and attended and observed markers' meetings and the standardisation and calibration of assessors.
The SQE is subject to the oversight of an Independent Reviewer. A new Independent Reviewer was appointed in January 2024. Since then, the Independent Reviewer has conducted the following activities:
- observation of live assessment deliveries
- observation of assessment standardisation and markers' calibration meetings
- observation of Assessment Board meetings
- observation of Mitigating Circumstances Panels
- interviews with members of the Kaplan management team
- meetings with us
- observation of meetings between the SRA and SMEs
- observation of an Angoff Panel
- consideration of reports and information produced by Kaplan and the SRA.
The Independent Reviewer's latest report covers the performance of the SQE processes and outcomes between January 2024 and October 2024.
They state that: 'The SRA and Kaplan teams work together to ensure openness and accountability, collaborating when issues arise to ensure optimal outcomes. Candidates, stakeholders and the public should have confidence that the SQE outcomes delivered in 2023/24 were fair and defensible.'
An Independent Psychometrician continues to provide expert guidance on the psychometric analysis conducted on each SQE sitting. This includes regular meetings with Kaplan's psychometricians, conducting checks for bias, question analysis and the identification of trends over time and checking that the interpretation and reporting of these analyses are appropriate.
The Independent Psychometrician has attended Assessment Board meetings as a member and observed markers' calibration and standardisation meetings. They have also held regular meetings with us and provided guidance and assurance regarding how standards are set for both SQE1 and SQE2. We liaise with Kaplan's academic heads to ensure that any recommendations made in their reports to us are followed up and actioned.
Candidates are asked to complete a feedback questionnaire after each delivery. This is administered by Kaplan and the findings are shared with us. Candidates' overall satisfaction levels for SQE1 ranged between 51% and 60%. This is a significant improvement on the overall satisfaction levels from 2022-23 (44% in January 2023 and 43% in July 2023).
For SQE2, candidates' overall satisfaction levels ranged between 43% and 59%. This is comparable with the overall satisfaction from 2022-23 (39% – 62%).
Satisfaction levels among candidates with reasonable adjustments continue to be below those of the overall cohort, indicating that the candidate journey for this group requires improvement.
The main areas for improvement arising from feedback related to:
- the booking process. This was transformed at the beginning of 2024
- waiting time for oral assessments
- guidance on the SQE website
- the booking process for reasonable adjustments candidates.
Actions taken to address these issues are summarised below.
For SQE2 in July 2024, candidate feedback had also improved significantly compared with the previous three assessment windows.
Candidates are also invited to attend focus groups after each SQE1 delivery and after the April and October SQE2 assessments. The focus groups are run by Kaplan and are usually observed by the SRA and/or the Independent Reviewer. They are divided into two sessions: one for candidates who have reasonable adjustments and another for those who do not. Apprentices are also encouraged to engage.
The focus groups allow for more qualitative data to be collected. Similar questions are asked in each focus group. Where applicable, candidates are also given the opportunity to provide feedback on reasonable adjustments. During the reporting period, some of the main themes raised at the groups were the booking experience and the wish for more candidate-facing information on the exam content.
The SQE Assessment Regulations set out the obligations of the SQE Assessment Board. These include meeting after each delivery to approve the pass mark and make other checks on the reliability and fairness of the assessment. It is chaired by an SRA executive director (or their nominee) and its members include senior personnel from the SRA and Kaplan and the Independent Psychometrician. The Independent Reviewer is an observer.
In reaching its decisions, the Assessment Board receives:
- a report on delivery and any adverse events
- a report on any allegations of malpractice and improper conduct
- minutes of the meeting and recommendations of the Mitigating Circumstances Panel
- a statistical and qualitative report containing information on test quality, the profile of the cohort, assessment performance (validity and standard), potential pass rates and demographic group performance.
We are confident the evidence we obtained through all the above quality assurance mechanisms confirms the following:
- the assessments are valid: they test the competences expected of a newly qualified, day one solicitor to the correct standard and they are set in realistic contexts
- each assessment has been constructed according to the assessment blueprint and reflects the SQE assessment specification
- the assessments are reliable: they measure candidates' performance consistently
- appropriate methods for setting the pass mark for a high stakes professional exam have been applied
- the assessments are fair and free from bias
- the assessments are secure
- risk is appropriately identified and managed
- there is a commitment to continuous improvement and mechanisms are in place to learn from any assessment related issues, including delivery failures, and reduce or eliminate the risk that they are repeated.
We have listed the evidence which confirms this in Annex 1.
In our previous two reports, we identified some areas of delivery which required improvement. The actions taken in those priority areas were in relation to:
- SQE assessment disruptions
- the booking process
- information for candidates and training providers
- increased number of SQE sittings
- provision of a spell check function for SQE2 written assessments.
Action in these priority areas has continued in 2023/24:
- IT failure - There have been instances where a candidate or a group of candidates have been unable to complete the assessment because of an IT failure. Most notably, this occurred on the first day of the July 2024 SQE2 written assessment at Chiswick ITTS, where a corrupt download of the exam affected 49 candidates. A provision for an Assessment Delivery Failure has been added to the mitigating circumstances policy, which made it possible to act swiftly to limit the impact on candidates. Most of those affected then sat the assessments later that day, or at a rescheduled date later in the assessment window.
- Reasonable adjustments - The policy was updated in January 2024 (and first applied to candidates who sat SQE2 in April 2024). It was updated to provide clarity on accommodations for individuals whose conditions affect their ability to undertake the SQE, even if they are not considered disabled under the Equality Act 2010. Further guidance and information were also provided to candidates and communicated via a news item on the SQE website.
- Venues - An additional oral assessment venue in London was introduced to increase overall capacity for those wishing to sit SQE2 and to better cater for candidates with reasonable adjustments.
- Booking process - Changes were made following feedback from candidates. The process now allows candidates to state their preferred booking options, and a seat is booked that best aligns with those preferences. Candidates no longer need to join a virtual queue to book.
- SQE1 sample questions - Kaplan has published 40 new sample SQE1 questions, taking the total number of sample questions up to 170. The new questions are split evenly between FLK1 and FLK2. The questions were published in response to candidates' feedback that more assessment materials should be made available to enable sufficient preparation for the SQE1 and to reflect the quality and content of the actual assessment. The newly published questions, like those released in 2023, had previously been used in SQE1 assessments.
- SQE2 sample questions - Further materials have been published: video recordings of a sample advocacy assessment for Dispute Resolution and two sample interviewing assessments for Wills and Intestacy, Probate Administration and Practice, and Property Practice.
- Multiple sitting dates - To accommodate increasing numbers of SQE candidates and to offer a wider choice of dates per assessment window, while maintaining the integrity of the assessment, multiple papers have been used for SQE1 from the January 2024 window onwards. The results were presented as scaled scores instead of percentages, as this allows easy comparison of candidates' results even when candidates have taken different papers (an illustrative sketch of scaled scoring follows this list).
- Spell check – Kaplan and Pearson VUE continue to explore introducing this function for SQE2 written assessments.
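The scaling method itself is not set out in this report. The sketch below shows one common approach to scaled scoring, in which each paper's own pass mark is anchored to a fixed point on a shared reporting scale so that scores from papers of differing difficulty can be compared directly. The scale range, anchor value and linear mapping used here are assumptions for illustration only, not Kaplan's published procedure.

```python
# Illustrative sketch only: the scale range (0-500), the anchor value (300)
# and the linear mapping are assumptions, not Kaplan's published procedure.

def to_scaled_score(raw_score: float, pass_mark: float, max_raw: float,
                    scale_pass: float = 300.0, scale_max: float = 500.0) -> float:
    """Map a raw mark onto a common reporting scale, anchoring each paper's own
    pass mark to the same scaled value so that papers of differing difficulty
    report comparable scores."""
    if raw_score >= pass_mark:
        # Stretch the region from the pass mark to full marks onto 300-500.
        return scale_pass + (raw_score - pass_mark) / (max_raw - pass_mark) * (scale_max - scale_pass)
    # Stretch the region from zero to the pass mark onto 0-300.
    return raw_score / pass_mark * scale_pass

# The same raw mark on an easier paper (higher pass mark) earns a lower scaled score:
print(round(to_scaled_score(raw_score=70, pass_mark=56, max_raw=90)))  # harder paper
print(round(to_scaled_score(raw_score=70, pass_mark=60, max_raw=90)))  # easier paper
```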
An error occurred in the results issued to candidates who took the January 2024 sitting of SQE1. The error arose because scores were not rounded at the point in the results process set out in the SQE Marking and Standard Setting Policy. The results, initially issued in March 2024, were reissued in April 2024. The main impact of the error was on 175 candidates, who were originally told that they had failed FLK1 and/or FLK2 (the two parts of SQE1) when they had, in fact, passed.
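As a purely hypothetical illustration of why the point at which rounding is applied can change the outcome for borderline candidates (the figures below are invented and the comparison is deliberately simplified; they are not the actual SQE marks or marking process):

```python
# Hypothetical figures only - not actual SQE marks, pass marks or processes.
pass_mark = 60.0
candidate_score = 59.5

# Rounding the score before the pass/fail comparison:
rounded_first = round(candidate_score) >= pass_mark   # True: 60 >= 60, a pass

# Comparing the unrounded score gives the opposite outcome:
unrounded = candidate_score >= pass_mark              # False: 59.5 < 60, a fail

print(rounded_first, unrounded)
```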
Kaplan commissioned an independent expert to check the recalculation of the re-issued results, to determine how the error had occurred and to advise on how mistakes of this type could be prevented in the future.
The review confirmed that all reissued results were accurate.
Kaplan created an action plan to address the identified issues systematically. The actions centre on quality assurance, data analysis, the Assessment Board and results-related processes, and ownership of policies and processes. The implications for SQE2 and other Kaplan operations are also considered in the action plan. We have monitored, and will continue to monitor, progress against the plan.
Our analysis of candidate performance in the SQE continues to indicate a correlation between success and prior educational achievement and socio-economic factors.
Prior to the launch of the SQE, we commissioned the University of Exeter to look at the potential causes of differential outcomes in professional assessments. The legal assessments considered were the Legal Practice Course and the Graduate Diploma in Law.
The final research report was published in June 2024. We will act collaboratively with others, including Kaplan, to address the identified causes.
Assurance on valid assessments
The assessments are valid.
Evidence:
- sample of SQE1 questions reviewed by SMEs and SRA
- sample of SQE2 assessments reviewed by SMEs and SRA
- observation by SMEs and SRA at SQE2 oral assessments
- composition of assessment checked by SRA
- report of Independent Reviewer.
Assurance on weightings for blueprint and assessment specifications
Each assessment has been constructed to reflect the assessment specifications, and SQE1 has been constructed according to the weightings within the assessment blueprint for SQE1.
Evidence:
- sample of SQE1 questions reviewed by SMEs and SRA
- sample of SQE2 assessments reviewed by SMEs and SRA
- report from Kaplan's Head of Quality Assurance on each assessment, confirming that all processes relating to the question writing and assessment build have been followed
- composition of assessments checked by SRA.
Assurance on reliability
The assessments are reliable.
Evidence:
- Cronbach's alpha has been greater than 0.8 in all SQE2 assessments and greater than 0.9 in all SQE1 assessments in this period. Cronbach's alpha is a measure of test reliability; the gold standard for high stakes assessments is 0.8 (see the illustrative calculation after this list)
- Independent Psychometrician checks.
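For reference, Cronbach's alpha is calculated from candidates' item-level scores. The sketch below applies the standard formula to simulated data; the dataset, item counts and the use of Python here are illustrative only and do not reflect Kaplan's own analyses.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: a candidates x items matrix of item-level scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Simulated data: 200 candidates, 40 dichotomous items driven by a single ability.
rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))
items = (ability + rng.normal(size=(200, 40)) > 0).astype(float)
print(round(cronbach_alpha(items), 2))   # high for a coherent test, roughly 0.9 or above
```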
Assurance on fairness
The assessments are fair and free from bias, decisions about candidate performance are fair, and the agreed methods for setting the pass mark have been applied.
Evidence:
- question writing methodology
- assessor recruitment and training
- reasonable adjustments policy – reported against at monthly contract meetings
- SME review of a sample of the questions for each assessment
- recognised appropriate standard setting methods for high stakes professional assessments applied
- SME, SRA, Independent Psychometrician and Independent Reviewer observations of live delivery of SQE2 oral assessments
- SME and SRA attendance at assessor standardisation and markers' meetings
- SME, SRA, Independent Psychometrician and Independent Reviewer attendance at Angoff Panel training for SQE1 standard setting
- analyses and evaluation of psychometric data reviewed by Independent Psychometrician and presented to the Assessment Board
- SRA and Independent Psychometrician attendance at mitigating circumstances panel meetings
- Independent Reviewer report
- Independent Psychometrician checks and requests for further analyses where appropriate.
Assurance on assessment security
The assessments are secure.
Evidence:
- confirmation from Kaplan's Head of Quality Assurance prior to signing off each assessment that all processes related to training, writing the individual assessments and the assessment build have been followed
- processes are in place to ensure the security of assessment materials during delivery of the live assessments
- confidentiality obligations imposed on all assessors
- conflict of interest policy and process (reported on in monthly contract meeting).
Assurance on risk
Risk is appropriately and effectively identified and managed.
Evidence:
- monthly meetings with Kaplan to check against service levels, including those relating to progressing applications for reasonable adjustments, managing complaints and website accessibility
- review of joint risk log at monthly contract meetings
- checking Kaplan's internal audit plans
- monitoring Kaplan's lessons learned log and action plan
- reviewing and monitoring Kaplan's business continuity planning
- Independent Reviewer report.
Assurance on commitment to continuous improvement
Continuous improvement is made to the SQE and its delivery, and action is taken where necessary as a result of lessons learnt.
Evidence:
- lessons learnt log and actions taken are available to us
- annual review of all processes
- regular stakeholder engagement through meetings and focus groups
- qualitative feedback obtained from candidates through focus groups
- evidence of actions taken in response to issues.