Testing times: How much is enough when it comes to literacy and numeracy?
Literacy and numeracy tests seem to be flavour of the month in education policy circles. But how much testing is enough?
Currently, school students undergo literacy and numeracy tests in years 3, 5, 7 and 9 through NAPLAN – the National Assessment Program – Literacy and Numeracy. There’s a great deal of debate surrounding these tests and their usefulness.
Governments like them, believing they “measure” whether students have improved over a period of time. Schools that do well in them use them as marketing tools, while those that don’t spend time explaining why there’s been no improvement. Schools may also treat them as a snapshot of where students are in reading, writing and number work – hardly a “measure” of literacy and numeracy.
Letters and numbers
And here’s where life becomes complicated: What are literacy and numeracy?
Any form of literacy is the use of disciplinary knowledge in specific contexts. Literacy should be the use of English knowledge and skills in a variety of contexts; numeracy, the use of mathematical knowledge and skills in a variety of contexts – and the list goes on through scientific literacy, financial literacy and so on.
In 2018 research conducted in New South Wales, surveys of principals indicated that the majority (56 per cent) use the terms “English” and “literacy” interchangeably, and likewise “mathematics” and “numeracy”. The important difference is that English is the discipline studied in school that develops the knowledge and skills of that discipline – and similarly for mathematics, or any other discipline area in the curriculum.
A matter of context
One of the issues in assessing literacy and numeracy – or any form of literacy – is deciding which contexts to use when assessing the use of disciplinary knowledge and skills. For example, a numeracy question given to school-leavers recently was: “A teacher works three days a week – express the workload as a decimal.” The context here requires knowing that a full-time workload for a teacher is five days a week – something students rarely understand, as they don’t experience full-time work, and many of their own teachers don’t work five days a week. So, while the required response is 0.6, reaching that answer relies on awareness of a full-time workload, as well as converting the fraction 3/5 to the decimal 0.6.
The current cry from industry (the Australian Industry Group) that school-leavers lack literacy skills, particularly around OHS issues, speaks more to familiarity with the context than to the knowledge and skills of students.
OHS contexts are both generic and specific, and it’s important that every workplace educates its workforce. People who work in a context are familiar with it; those new to it are not. It’s also important to indicate what sort of literacy is needed – working as an apprentice builder is a different context from working in a retail or professional setting.
The proposed changes to the General Achievement Test (GAT) for year 11 and 12 students in Victoria are shifting the purpose of this test. As can be seen from the current VCAA website, the GAT is a test of general knowledge and skills in the broad areas of written communication; mathematics, science and technology; and humanities, the arts and social sciences.
It’s not a literacy or numeracy test. Its purpose is fourfold:
Help check that school-based and external assessments have been accurately assessed
Contribute to statistical moderation of school-based assessment results
Help calculate derived examination scores
Help determine scores for school-based and external assessments where a derived examination score is required due to illness, accident or trauma.
Given its purpose, it’s only undertaken by those students who study a Unit 3 and 4 sequence, not by VCAL students or students undertaking Units 1 and 2 (often taken in Year 11).
It appears there’s some difference between what is knowledge and skills associated with learning disciplines, those elements that comprise the majority of the school curriculum, and the use of this knowledge in particular contexts.
Whatever context is chosen will disadvantage some sectors of the community. For example, using water usage data as a context for numeracy would be foreign to many country students, whose water comes from rainwater captured in tanks.
General capabilities in the curriculum such as critical and creative thinking, and personal and social capabilities rely on reasoning in varying ways such as through analysing and responding to data or emotions.
It’s reasonable to assess students in areas in which the curriculum defines what is to be learnt. What isn’t reasonable is to assess students in areas outside the curriculum.
Perhaps literacy and numeracy should be the focus of the curriculum, as students should be able to use what they learn. If this is the case, the focus needs to be on the curriculum rather than judging students (and teachers) on what they do or don’t know.
Given the focus on the individual in our education system, it’s a constant wonder why we consistently rely on trying to assess students in ways that assume they’re all the same. Students undertaking VCE and VCAL learn in very different ways. How will a one-off, system-level literacy and numeracy test give consideration to these differences?
You may republish this article online or in print under our Creative Commons licence. You may not edit or shorten the text, you must attribute the article to Monash Education Futures, and you must include the author’s name in your republication.
If you have any questions, please email email@example.com