I have a confession to make. When many in the EAL world were trying to persuade the DfE to introduce the national EAL assessment system we now know as the A-E proficiency scales, I was making the opposite argument.
In January 2016 someone at the DfE rang me and asked what I thought of the idea of a new national EAL assessment system. At that time the DfE had only just abolished the national assessment system we had had for the previous 25 years, and with it the national EAL assessment system (A Language in Common). I had to ask why EAL pupils should be singled out for a national assessment system when the government thought pupils in general did not need such a system and schools were being encouraged to innovate and invent their own assessment systems. Could it be that the DfE needed such a system to allocate money to particular areas on the basis of a judgement about how well developed the English skills of EAL pupils in those areas were? Certainly not, came the reply. The data were needed to inform policy development. The DfE were aware that EAL was a school census category that included everyone from Slovak Roma pupils absolutely new to the country with barely a word of English to second-generation Chinese students likely to end up at Oxford. They needed to be able to break EAL down into stages of English development, which sounds very sensible, if a bit of an imposition on those schools that already had effective EAL assessment systems in place. When I asked what the budget was for introducing the new system to teachers and what the arrangements would be for developing consistency in judgements across the country, I drew a complete blank. There would just be the half a side of A4 in the School Census guide with, for example, a 48-word definition of New to English[1] that makes no mention of how long a pupil might have been learning English.
So I suggested that the DfE would be likely to get a more reliable and less resource-consuming estimate of how many pupils were new to English by using the data they already have and the accumulated knowledge of experienced EAL practitioners. The School Census gives the DfE the following data: a pupil's date of birth, ethnicity, home language, UPN (Unique Pupil Number) and a flag that says EAL/not EAL. Combining these data would probably give a more accurate estimate of the number of pupils New to English than analysing proficiency code returns. The ninth and tenth digits of the UPN tell you the year in which it was issued[2], and the year of issue generally marks entry to the English education system. So a twelve-year-old EAL-flagged Slovak speaker whose UPN was issued in the year that s/he was eleven or twelve is probably New to English, whether ethnicity is recorded as East European, White Other or Roma.
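To make the arithmetic concrete, here is a minimal sketch of that heuristic in Python. Only the use of the ninth and tenth UPN characters as the year of issue comes from the DfE guide cited below; the field names, the "recently issued" window, the age threshold and the example UPN are all illustrative assumptions of mine, not anything the DfE actually does.

```python
from datetime import date

def probably_new_to_english(upn: str, dob: date, eal_flag: bool,
                            census_year: int) -> bool:
    """Rough heuristic from the text: an EAL-flagged pupil whose UPN was
    issued only recently, and well after reception age, has probably just
    entered the English education system and is likely New to English.
    Thresholds here are illustrative, not DfE definitions."""
    if not eal_flag:
        return False

    # Characters 9 and 10 of the 13-character UPN give the two-digit year
    # of issue; assume for simplicity that all UPNs here are post-2000.
    year_of_issue = 2000 + int(upn[8:10])

    age_at_issue = year_of_issue - dob.year           # approximate age
    recently_issued = census_year - year_of_issue <= 1

    # A UPN issued at normal school entry (around age 4-5) tells us
    # nothing; one first issued at, say, 11 or 12 marks a late arrival.
    late_arrival = age_at_issue >= 6

    return recently_issued and late_arrival


# Example: the twelve-year-old EAL-flagged Slovak speaker from the text,
# with a made-up UPN issued in 2017, checked against the 2018 census.
print(probably_new_to_english("H801200217023", date(2006, 3, 15), True, 2018))
```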
The DfE were also interested in how development in English affects outcomes in national tests and exams. I politely suggested that there were plenty of rigorous studies by people like Feyisa Demie and Steve Strand that already confirm what common sense predicts: the more English you have, the better you do in tests conducted in English.
The DfE have now scrapped the proficiency scales and there is no longer a requirement to record them. I presume the DfE have realised that they have collected a lot of useful data from schools that made well-informed judgements about EAL proficiency, and a lot of not very useful data from schools that made less well-informed judgements; the problem is that the DfE does not know which is which. That may explain the demise of the proficiency scales, though the legally dubious nature of the other new, and also now abandoned, census questions on nationality and country of birth (introduced with the proficiency scales) may be a factor too.
Here ends my confession, almost. Yes, I tried to persuade the DfE not to introduce proficiency scales. My logic was sound and I strongly suspect they have not learnt much from the exercise. However, I do have a real sense of regret at the passing of the proficiency scales. There were unintended consequences that seem to me to be extremely positive. Significant documents such as The Bell Foundation's EAL Assessment Framework for Schools and NASSEA's reworking of its assessment framework might not exist, and would certainly not be so well known, without the proficiency scales. As EAL professionals we have had both more, and more useful, conversations about assessment than we have had for years. In many of the schools The EAL Academy works with we have also seen an increased interest among school leaders and subject and class teachers in talking about the rates of progress of EAL pupils and how we can assess that progress. The challenge now is to maintain the very welcome momentum that has been created around EAL assessment without the need to submit proficiency scale data to the DfE.
[1] DfE, School census 2017 to 2018 Guide, version 1.6 (page 65): “May use first language for learning and other purposes. May remain completely silent in the classroom. May be copying / repeating some words or phrases. May understand some everyday expressions in English but may have minimal or no literacy in English. Needs a considerable amount of EAL support.”
[2] DfE, Unique pupil numbers (UPNs): A guide for schools and local authorities (page 17)
This article also appears in the Summer 2018 issue of EAL Journal.