Why Question Level Analysis (QLA) is data worth bothering with

In this age of accountability, of proving you are doing all you say you are, data appears to be god. If we look at what we get bombarded with and what we are asked to produce, much of it is not useful for everyday teaching. This can lead us to run screaming when we see a table of data or an overly formatted chart. Whilst I am one who advocates knowing your students, focusing on them as individuals and taking data with a pinch of salt, I have recently come across a set of data that can be easily created and is really useful in classroom practice. I am one of those people who find data rather frustrating, so for me to tell you to create more must be a big deal!

The data in question is Question Level Analysis, or QLA. If you have not come across this before, it is as simple as setting a threshold mark for each answer on an exam paper. If students get over that mark they have done well; if they are below it, then it is an area where improvement is needed. Popping the data into a spreadsheet with a few formulas and a bit of conditional formatting, it is clear to see where both individuals and the whole class need to improve. You can then focus on the topics around those questions during lesson time and intervention sessions. For instance, if a question on data collection came up on a mock test and was worth six marks, you might expect students to do well, so you could set a threshold of four marks. If the majority of students got less than four when you marked the test, you would know that was an area for a revision session. If all bar two got four or over, you could target those two students individually.
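To make the idea concrete, here is a minimal sketch of that threshold check in Python. The student names, marks and thresholds are made up for illustration; in practice the spreadsheet formulas do exactly the same job.

```python
# A minimal sketch of the QLA idea, using invented student marks.
# Each question has a threshold; anything below it is flagged as an
# area needing improvement.

marks = {
    # student: marks per question (Q1 is the six-mark data-collection question)
    "Student A": {"Q1": 5, "Q2": 2, "Q3": 1},
    "Student B": {"Q1": 3, "Q2": 4, "Q3": 2},
    "Student C": {"Q1": 2, "Q2": 5, "Q3": 3},
}
thresholds = {"Q1": 4, "Q2": 3, "Q3": 2}  # "doing well" cut-off per question

# Per-student view: which questions fell below the threshold
for student, scores in marks.items():
    weak = [q for q, m in scores.items() if m < thresholds[q]]
    print(student, "needs work on:", weak or "nothing")

# Whole-class view: how many students missed the threshold on each question
for q, t in thresholds.items():
    below = sum(1 for scores in marks.values() if scores[q] < t)
    if below > len(marks) / 2:
        print(q, "- majority below threshold: plan a whole-class revision session")
    elif below > 0:
        print(q, f"- {below} student(s) below threshold: target them individually")
```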

On that basis, QLA is a great tool for focusing revision in and out of lessons. However, in subjects such as ICT, Business and Computing we do not always get the same time as some core subjects, which means we may need to wait weeks for mock exams. A Maths teacher at my school, however, has a different approach which I have used to focus revision. Rather than waiting for mock exams, she sets different questions as a starter activity, based on questions that are likely to come up in the exam. She can then do the QLA on these to find misconceptions and move revision forward.

Recently I tried this technique out and it has worked wonders for understanding where students are. Some areas I thought the students were strong on, they struggled with. In home learning and in-class tasks they have all the support, so they can write good answers; take the support away and the holes in their knowledge can be seen. For the starter activity QLA, I gave students twenty minutes to answer eight questions. This was for the Cambridge Nationals in ICT at Level 2, so I asked questions based around the case study we have been given for their examination, on areas I expect to come up. Once complete, I took the questions in and gave students marks out of two for each question…

  • Zero for questions not attempted or showing no clear understanding
  • One for limited ideas or bullet points on the topic
  • Two for answers that had a clear understanding of the topic
Students were given questions for the starter activity, then these were used with QLA to work out revision topics.

From here I created a simple spreadsheet that used conditional formatting to show the zeros in red, ones in yellow and twos in green. I added every student's score to the spreadsheet and can now see at a glance what each student understands. I then added up all the scores for each question and conditionally formatted the totals as well: less than ten was red and a concern, ten to twenty-five was yellow and needs recapping, and over twenty-five was green, showing the students understand the concept so we do not need to touch on it as a whole class.

What is concerning is that I have no topics in the green overall; this may be because we have moved on to a different unit and so need to go back and recap knowledge from the examination unit. What was interesting was that it was not only the two areas I expected to be misunderstood, acceptable use policies and archiving, that came in low. We have covered databases to death, yet students are still not able to pick up the concepts behind them. It is clear that whilst my whole-class revision sessions need to focus on these topics, I once again need to look at how I deliver databases. On a positive note, students were unclear on the Global Positioning System (GPS) the previous time we took the exam but now seem to have a good understanding.
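If you want to play with the same logic outside of a spreadsheet, here is a rough sketch in Python of the grid and the totals described above. The student names and scores are invented for illustration; only the colour bands match the ones I actually used, and those bands assume a full class rather than the three example rows shown here.

```python
# A rough sketch of the QLA grid described above, with invented 0/1/2 scores.
# Cell colours: 0 = red, 1 = yellow, 2 = green.
# Question-total bands assume a full class: under ten red (concern),
# ten to twenty-five yellow (recap), over twenty-five green (secure).

scores = {
    "Student A": [2, 1, 0, 1, 2, 0, 1, 1],
    "Student B": [1, 1, 0, 2, 1, 0, 2, 1],
    "Student C": [2, 0, 1, 1, 1, 1, 2, 0],
    # ...a real class would have many more rows
}
questions = [f"Q{i}" for i in range(1, 9)]

def cell_colour(mark):
    """Colour a single mark, mirroring the per-cell conditional formatting."""
    return {0: "red", 1: "yellow", 2: "green"}[mark]

def total_colour(total):
    """Band a question's class total into red / yellow / green."""
    if total < 10:
        return "red (concern)"
    if total <= 25:
        return "yellow (recap)"
    return "green (secure)"

# Per-student view: what each student understands, question by question
for student, marks in scores.items():
    cells = " ".join(f"{q}:{cell_colour(m)}" for q, m in zip(questions, marks))
    print(student, cells)

# Whole-class view: total per question and the band it falls into
for i, q in enumerate(questions):
    total = sum(marks[i] for marks in scores.values())
    print(q, total, total_colour(total))
```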

What the QLA has done is allow me to look closely at an area I thought students understood and spend less time on an area I presumed they did not have good knowledge of. In this way QLA has allowed me to focus on students' needs. It is not data that I will be picked up on and asked about, and it may not be used to prove that I have done a specific intervention, but it has changed the way I teach so the students get the best possible support and, quite frankly, that is what data should be used for.
