The National Audit Office (NAO) has today published its investigation into the Home Office’s response to widespread cheating by international students in English language tests. Widespread cheating clearly did take place, but some people may have been wrongly accused and, in some cases, unfairly removed from the UK.
Evidence shows there was extensive fraud in the student visa system prior to 2014. In February 2014 BBC Panorama uncovered evidence of organised cheating in two English language test centres run on behalf of the Educational Testing Service (ETS). This included providing fluent English speakers to sit speaking tests in place of the real candidates, and staff reading out the answers to multiple-choice tests. The Home Office responded vigorously, investigating colleges, test centres and students.
After the Panorama programme the Home Office began cancelling the visas of those it considered to have cheated in the Test of English for International Communication (TOEIC), using evidence from ETS on who had cheated. It suspended all ETS testing and initiated criminal investigations into test centres. It also prevented colleges associated with cheating from sponsoring international students’ visas.
It remains difficult to estimate the exact scale of cheating and how many people may have been affected, because of the quality of the evidence used to determine who cheated and the limitations of the data the Home Office kept on action taken against individuals.
ETS used new voice recognition technology to uncover who had cheated by having someone else sit their test. After review by human listeners and other checks, ETS identified 97% of all UK tests as “suspicious”. It classified 58% of 58,459 UK tests as “invalid” and 39% as “questionable”. The Home Office did not have the expertise to validate the results, nor did it, at this stage, obtain an expert opinion on the quality of the voice recognition evidence. Individuals with “questionable” results were allowed to re-sit the tests, but the Home Office began cancelling the visas of those whose results were classified as “invalid”.
There have been competing views of the validity of the technology. In 2015, the National Union of Students (NUS) commissioned an expert who said the software could have made mistakes in up to 20% of cases and human listeners in up to 30% of cases.
In 2016, the Home Office sought an independent expert who said that the error rate would be significantly less than 1%. The expert had more information but still needed to make a series of assumptions about the performance of the technology and the people checking the results. The expert’s evidence backs up ETS’s overall assessment of widespread cheating, but neither proves definitively that an individual’s test was invalid.
Most tests classified as invalid by the voice recognition checks had very high marks compared with those of people who were cleared: 49% of invalid tests were taken by highly fluent English speakers. However, some scores are not easily explained by the methods of cheating Panorama identified, and these have not been investigated by the Home Office. For example, thousands of people suspected of cheating had low scores in multiple-choice tests, indicating they were not provided with the answers. And 100 people with invalid speaking tests (0.3% of invalid results) scored below the level required for study in the UK, meaning any supposed proxies were themselves people with limited English language ability.
It was not possible for the Home Office to directly check ETS’s assessments of cheating. Some appeals challenged the handling of data by ETS and the test centres, particularly because some centres were run by criminals. Individuals have been able to obtain the ETS voice recordings used in the voice recognition checks, but not the original audio recordings of their tests, although this evidence has stood up to challenge in criminal trials.
Thousands of people accused of cheating have nonetheless won the right to stay in the UK. 4,157 people with invalid results have been granted leave to remain, including 477 who are now British citizens. Around 12,500 people appealed immigration decisions, with 3,600 winning their cases. The Home Office has not tracked the reasons why people have been allowed to stay: some have disproved allegations of cheating, while others have remained on human rights grounds.
Home Office data indicates that, by the end of March 2019, 11,000 people who had taken TOEIC tests had left the country after the discovery of extensive cheating. Approximately 7,200 left voluntarily after April 2014, around 2,500 were forcibly removed, and almost 400 were refused re-entry to the UK. These numbers may be an underestimate.
Widespread action to close colleges also affected students who did not sit the TOEIC exams, as they had to find other courses. Some have struggled to secure new places, affecting their visas and their ability to remain in the country. The Home Office offered help to 4,795 students, of whom 837 took up the support.
“When the Home Office acted vigorously to exclude individuals and shut down colleges involved in the English language test cheating scandal, we think they should have taken an equally vigorous approach to protecting those who did not cheat but who were still caught up in the process, however small a proportion they might be. This did not happen.”
Amyas Morse, the head of the NAO
Read the full report
Investigation into the response to cheating in English language tests
Notes for editors
- Press notices and reports are available from the date of publication on the NAO website. Hard copies can be obtained by using the relevant links on our website.
- The National Audit Office scrutinises public spending for Parliament and is independent of government. The Comptroller and Auditor General (C&AG), Sir Amyas Morse KCB, is an Officer of the House of Commons and leads the NAO, which employs some 785 people. The C&AG certifies the accounts of all government departments and many other public sector bodies. He has statutory authority to examine and report to Parliament on whether departments and the bodies they fund have used their resources efficiently, effectively, and with economy. Our studies evaluate the value for money of public spending, nationally and locally. Our recommendations and reports on good practice help government improve public services. Our work led to audited savings of £741 million in 2017.