
Lies, Damned Lies and GCSE English Results

The exam results period is never short of controversy; each year there seems to be a new issue with marking, supposed ‘soft’ subjects, or the BTEC v. GCSE debate. However, this year is different, and not in a good way.

You would have had to have been in a cave for the past week – or, in my case, in Egypt – to avoid all the news stories about the drop in GCSE English results. There have been other articles written about this, an excellent one by @RealGeoffBarton for example, many focusing on the AQA qualification. I am writing this post partly to get my own head around the situation from the perspective of someone teaching the OCR qualification, but also to cut through some of the media misunderstanding and – let’s be brutally honest here – to prepare myself for the oncoming storm from SMT and parents.

So, there has been a drop in the number of students achieving a grade C or higher in GCSE English – so much we know. A quick trawl of the Internet will show that this drop varies from school to school, from a few percent to shocking figures of 16-20%. Following Gove’s repeated grandstanding (minus hard evidence, I hasten to add) about ‘falling standards’, and claims from some parts of the press about how easy the qualifications were to pass (anyone heard the myth that if you write your name correctly you get a grade?), it was not surprising that there was going to be some fallout and that grades were likely to take a hit. However, as Orwell’s ‘Animal Farm’ suggests, all students and exams are equal, but some are more equal than others.

The press have reported with varying levels of accuracy and froth; the Daily Mail, for example, reported “claims that pupils who took the exam in January found it easier to gain C grades than those who sat it in the summer”. Others reported exam boards explaining that the difference was due to the new syllabus. I hope to state the case as I see it and explain, in layman’s terms, why this deflation of grades is unfair.

The New Syllabus

This summer marked the first cohort going through the new GCSE syllabus. The syllabus was introduced in September 2010 and included several changes to the previous exams – the introduction of ‘controlled assessment’, a type of coursework completed under exam conditions, being the most notable. Yes, you would expect a few teething problems as students, examiners and teachers get to grips with the changes, but these should be fairly minor as the core of English remains the same – reading, writing, speaking and listening.

The mark schemes for the new ‘controlled assessments’ also changed: at the insistence of the now-defunct QCDA (Qualifications and Curriculum Development Agency), descriptions of students’ skills are matched to bands rather than grades. This was a more complex change (something many of us are used to at A-Level), but the description of C grade skills, for teachers experienced in marking C grade work, together with guidance via exemplars from the board, meant that, although the boundaries were a little fuzzy in places, the skills and quality needed for a C grade were ultimately much the same as under the old syllabus.

Obviously, it would be unfair if, because of an accident of birth, you needed a much higher range of skills to get the same C grade as in previous years, wouldn’t it? Surely that is the point of grading, and of exams of this type: if you get a C you have X range of skills, so colleges and employers can compare applicants fairly. If this is not the case, why do we bother with the exams at all?

The Harder Summer Exam

Much of the press reporting focuses on the assertion that the summer exam was much harder than the January one. This may well be the case. I would not be surprised if the mark schemes were more stringent and that, where a student was previously given the benefit of the doubt, this was no longer the case. This does need to be investigated and, if it turns out to be the case that the exam was much harder than previous sessions, adjustments should be made to ensure that there is consistency and fairness.

As each exam series has a new paper, you expect a little movement in the marks needed to achieve particular grades – we as teachers expect this; a really tricky paper will generally have a lower grade threshold than an easier one to ensure parity. We expect the grade boundary for a paper to shift by a few marks – that is fair. What seems unfair is a sudden shift of 10 marks or more, making it harder to achieve a C grade than before – as I said above, if a summer 2012 C grade is not the same as a January 2012, summer 2011 or 2008 grade, then the whole point of assessing students via GCSE exams is flawed and unfair.

The Controlled Assessment

This is the element marked in school, with samples sent to the exam board for moderation. Typically, 60% of the GCSE is made up of internal assessment and 40% by the exam. The exam boards set the tasks, we teach the skills and content, and the students complete the task under exam conditions. We can’t mark drafts or give feedback on a piece until it has been completed. The teacher then marks the assessment piece using the mark scheme provided by the exam board; this is split into bands and marks, not grades. We send these marks to the exam board, along with an estimated grade (based on our professional judgement and previous experience). So far, so good.

I teach the OCR course for GCSE English Language. I have been to the board training sessions, we have moderated the work as a department, we have sent off our sample and (post-results) received confirmation that there has been “no adjustment” needed – that is, we have applied the exam board mark scheme accurately, matching work to bands and the relevant marks. So no problem there…well, yes! The controlled assessment is, I think, a huge and key part of the unfairness of this summer’s exam results.

This is where it gets a bit technical. Each exam series, the boards produce a list of grade boundaries for the marks awarded for each module; these ‘raw’ marks are then converted to UMS (Uniform Mark Scale), which allows for adjustments to the boundaries, for example the differences between exam papers I mentioned above. While the written exam boundaries may change from one session to the next, the controlled assessment boundaries should not (although they may change slightly from year to year), as the tasks are the same, can be completed at any point over the two years and are marked using the same mark scheme. The only difference with the controlled assessment is that the marks could be submitted to the exam board in either January or May (depending on which elements of the course were being counted towards the 40% terminal rule).
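
For anyone who has not met UMS before, here is a minimal sketch of the idea. The numbers are invented purely for illustration (the real conversion tables are published by the boards each series), but the principle is that the raw boundary for a grade can move between series, while the UMS value of that grade stays fixed, with raw marks scaled roughly linearly between adjacent boundaries:

```python
# Illustrative only: these are NOT OCR's real boundary figures.
def raw_to_ums(raw, raw_lo, raw_hi, ums_lo, ums_hi):
    """Linearly interpolate a raw mark between two grade boundaries onto UMS."""
    return ums_lo + (raw - raw_lo) * (ums_hi - ums_lo) / (raw_hi - raw_lo)

# Suppose the C boundary is 26 raw (36 UMS) and the B boundary is 30 raw (42 UMS):
print(raw_to_ums(28, raw_lo=26, raw_hi=30, ums_lo=36, ums_hi=42))  # 39.0
```

The crucial point is that if the raw boundary for a grade rises while its UMS value stays put, the same raw mark converts to fewer UMS points.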

Following me so far? Good.

The controlled assessment tasks and mark schemes have not changed over the two-year course, so there is no variation in content that we might see in the external exam. Our moderator reports (and, I expect, many other schools’) state that there is ‘No Adjustment’ to either CA unit, so they agree our marks and our application of the mark scheme. As the qualification is criterion referenced, the grade equivalent for the mark awarded for the CA units (certainly within a single cohort) should not change – if they do, it suggests that C grades from different years and sessions are not actually the same which is obviously unfair and makes the whole exam system a farce.

Ah, I hear some of you say, the boundaries have been changed to avoid ‘dumbing down’, to increase ‘challenge’, to make the exams ‘harder’. Ok, so, if that is the case then surely we will see a similar increase in the boundaries for all grades?

No! The changes to the marks are not equitable; they hit the C/D and lower grades rather than B-A*. Across the two English Language CA units (A651 and A652), the differences in marks (for OCR) needed to achieve a particular grade are as follows:

  • A* – 1 mark less than January
  • A – the same as January
  • B – 4 marks more
  • C – 8 marks more
  • D – 8 marks more
  • E – 9 marks more
  • F – 9 marks more
  • G – 9 marks more

I suggest that this is a political move: if it were about rigour and ensuring challenge, then surely all grades should have been affected? It seems that those in selective or high-achieving schools (hmm, the children of many of our politicians perhaps?) are less likely to be affected. Perhaps the powers that be don’t wish to upset their privileged friends? Those students who are most in need of the C grade for college, apprenticeships or jobs, who need a good education to improve their chances in life, seem to be the target of this change. It smacks of social engineering at its worst. This is unfair.

The second issue is the change within the same exam series depending on when the CA marks were submitted. The same pieces of work, by the same students, marked by us and given the same raw mark (a mark agreed by the exam board), were worth up to 9 marks less because we submitted in the summer rather than in January. This unfairly penalises our students: in other schools, those with the same or slightly lower raw scores were awarded a higher grade because they were entered in January. The scaling to UMS makes the gap even bigger, so some students are 10 or 11 UMS marks worse off.
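
To make the effect concrete, here is a small sketch. The boundary figures are invented for illustration; only the size of the shifts is taken from the list above:

```python
def grade(raw_mark, boundaries):
    """Return the best grade whose raw boundary the mark meets."""
    for g in ["A", "B", "C", "D", "E"]:        # check the highest grade first
        if raw_mark >= boundaries[g]:
            return g
    return "U"

# Invented January boundaries, then summer boundaries shifted by the
# differences listed above (A the same, B +4, C +8, D +8, E +9).
january = {"A": 40, "B": 35, "C": 30, "D": 25, "E": 20}
summer  = {"A": 40, "B": 39, "C": 38, "D": 33, "E": 29}

same_piece_of_work = 34    # identical raw mark, agreed by the moderator
print(grade(same_piece_of_work, january))   # C
print(grade(same_piece_of_work, summer))    # D
```

Nothing about the work or the marking changes between those two results; only the submission date does.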

What Should Be Done?

Firstly, all of the summer English exam papers should be reviewed to check that those who sat them in the summer were not unfairly penalised due to political pressure. Basically, would a response in the summer exam have achieved a higher grade in January? If so, the grades should be amended.

Secondly, the January grade boundaries for the controlled assessment should be applied to the summer entry – ensuring that all students have been treated equally.

Finally, there should be an urgent review of the whole situation, with clear recommendations made in plenty of time to avoid a repeat next year. I am not advocating ‘giving’ students grades they don’t deserve, but be fair. If the exams are too easy, make them harder for all. Tell us what each grade is and give us examples to illustrate it – that will make it clear for everyone. Otherwise we are in the bizarre situation, to use an Olympic analogy, of a high jump final where no one knows how high you need to jump to win or even qualify.

I will be following Ofqual’s investigation and the outcome very closely.

—————————————————————————————————————-

Ofqual’s less than fab report here.

Why Data Use By Teachers is Key

I am a data geek; I make no apologies for the fact that I love a good spreadsheet and tinkering with charts. However, it seems that this is not the case for the majority of teachers. Yes, data can be time-consuming, it can be confusing and, in some schools, it is jealously guarded by members of SLT who pass out morsels to the waiting staff.

School Use of Data

The use of data in schools can be a contentious issue. Data is generally collected for two main purposes: internal tracking, monitoring and diagnosis, and external reporting. Over the past 20 years, data for external consumption has become a focus – not the sole focus, but it has certainly moved into the realm of non-teaching professionals, a shift that makes many teachers uncomfortable. Kelly et al. (2010:4) found that: “staff think it is collected for external accountability purposes, but that it should be collected for internal improvement purposes.”

Parental Choice

One of the key arguments for schools publishing their data is that it allows parents to make informed decisions about their choice of school. This view has its problems. Firstly, ‘choice’ only truly exists in some areas, largely cities where there is a selection of schools with available places. Secondly, as shown by Allen and Burgess (2012), information about how a school has performed in the past is not an accurate indicator of how it will perform in six years’ time, so there is still a large amount of guesswork.

DfE School Performance Tables

The increased level of detail and data in the DfE school performance tables means that, more than ever, staff and managers need to be aware of the data we have in school and how it will be presented in public. This increased focus does not necessarily mean that we need to do things differently as English teachers, but being aware of the additional focus on English and Maths for pupils of all attainment levels is key.

Although I have issues with league tables and with reporting data that might not be fully understood by the consumer, I feel that, in an imperfect world, this measure helps to focus our resources not solely on those who are on the C/D borderline and, if necessary, justifies the inclusion of pupils who would not normally be targeted. Not that such justification should be needed, but in these high-stakes days of A*-C including English and Maths the focus has been skewed – one of Goldstein and Leckie’s (2008:69) “perverse measures”. The continued tracking of these groups means we can check that all pupils are being offered support and that we are doing the best for all groups.

Data, Data Everywhere

One of the difficulties faced by teachers is that the information needed by the department and the individual comes from a variety of different sources, in particular school systems like SIMS, SISRA and FFT, as well as school, department and individual spreadsheets. It is not surprising that for many this is overwhelming. This disparate range of sources, and the fact that department spreadsheets often need to be created from scratch, is not uncommon in schools. Perhaps this is one of the reasons that data use in schools is not as effective as it might be: Heads of Subject rarely have sufficient time to source and collate the information, and it takes a well-trained administrator, with sufficient time, to keep on top of it. As van Barneveld (2008:2) states: “large-scale assessment data were neither current enough nor aligned adequately with daily instruction”.
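
As a purely hypothetical illustration of what that collation involves (the file and column names below are invented, and a spreadsheet would do the same job), the pattern is always the same: pull the exports from each system into one view keyed on the pupil.

```python
import pandas as pd

# Hypothetical CSV exports from three different systems; the names are invented.
sims  = pd.read_csv("sims_export.csv")      # e.g. UPN, Name, Class, PP, SEN
fft   = pd.read_csv("fft_estimates.csv")    # e.g. UPN, FFT_D_Estimate
marks = pd.read_csv("dept_markbook.csv")    # e.g. UPN, CA_Unit1, CA_Unit2, Mock

# One merged view per pupil, keyed on the pupil identifier (UPN).
tracking = sims.merge(fft, on="UPN").merge(marks, on="UPN")
tracking.to_csv("english_tracking.csv", index=False)
```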

It is this gap between the data produced and staff need that makes many teachers reluctant to use or rely upon the available data. “Use of pupil attainment and progress data is widespread across the profession, but least so among classroom teachers” (Kelly, et al. 2010:3).

Why data is important to you

It is easy, as a subject leader, to see the importance of data, in particular lag data, when pulling together the information needed for SLT meetings and SEFs; it is harder to see if you are a classroom teacher. But I want to convince you that knowing how to use data effectively is vital for all teachers.

Firstly, let’s talk pragmatically: what’s in it for you? Staff must feel confident in using data, as the new DfE Standards for Teachers, in force from September 2012, state that teachers must:

2 – Promote good progress and outcomes by pupils…be accountable for pupils’ attainment, progress and outcomes.

6 – Make accurate and productive use of assessment…use relevant data to monitor progress, set targets, and plan subsequent lessons.

So, based on this, it is the job of all teachers to use and understand data. It will also be very helpful to keep track of the various groups of students you teach. Being able to go into meetings knowing, for example, who your low attainers are, what progress they are making and what you are doing to ensure their progression will make you feel more confident. I am not saying that you need to memorise all of this; that is where having a clear data storage system – be it a mark book or an electronic system – is key.
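
As a sketch of what that might look like (again, the column names and thresholds are invented; a paper mark book with a highlighter achieves the same thing), the point is simply being able to pull out a group and its progress on demand:

```python
import pandas as pd

GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

tracking = pd.read_csv("english_tracking.csv")   # the merged sheet from earlier

# Gap between current working grade and target grade (negative = behind target).
tracking["Gap"] = (tracking["Current"].map(GRADE_POINTS)
                   - tracking["Target"].map(GRADE_POINTS))

# Hypothetical definition of 'low attainers': KS2 level 3 or below on entry.
low_attainers = tracking[tracking["KS2_Level"] <= 3]
print(low_attainers[["Name", "Target", "Current", "Gap"]].sort_values("Gap"))
```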

But, let’s be honest here: yes, the standards are important to us as teachers – they form part of our assessment and ultimately guide the PM process – but being hit with the big stick of Ofsted is rarely what motivates teachers, and it is not the main reason we should use data. The real reason data is so important to us as teachers is as a tool to diagnose what students need in order to progress. How can a student know where to go with their learning if we can’t give them some specific guidance? There are tools out there to help – although APP isn’t statutory and can be a little unwieldy, it does provide a framework for assessing a student’s current position and guiding them towards improvement. Knowing specifically what your students need to do to improve means that meaningful feedback can be given.

Hattie (2012:16) states that:

The act of teaching requires deliberate interventions to ensure that there is cognitive change in the student; thus the key ingredients are being aware of the learning intentions, knowing when a student is successful in attaining those intentions, having sufficient understanding of the student’s prior understanding as he or she comes to the task and knowing enough about the content to provide meaningful and challenging experiences so that there is some sort of progressive development.

At the core of Hattie’s statement lies the importance of teachers using a range of data effectively. Teachers need a full understanding of the available data in order to plan, teach and assess effectively. Good teachers know their students.

Overall, what has become clear through my research is that it is essential to remember that:

data in itself is insufficient; that it is the interpretation and subsequent use of data that can impact positively on teaching and learning, rather than the data itself (Kirkup, et al. 2005:102).

References

Allen, R. & Burgess, S. (2012) ‘Why The New School League Tables Are Much Better…But Could Be Better Still’ CMPO Viewpoint. http://cmpo.wordpress.com/2012/01/25/why-the-new-school-league-tables-are-much-better-but-could-be-better-still/ [accessed 30/01/12]

Goldstein, H. & Leckie, G. (2008) ‘School League Tables: What Can They Really Tell Us?’ Significance. June 2008 pp. 67-69

Hattie, J. (2012) Visible Learning for Teachers: Maximizing Impact on Learning. Abingdon: Routledge

Kelly, A., Downey, C. & Rietdijk, W. (2010) ‘Data dictatorship and data democracy: understanding professional attitudes to the use of pupil performance data in English secondary schools’, CFBT. http://www.cfbt.com/evidenceforeducation/pdf/5417_DataDictatorship_web.pdf [accessed 27/01/2012]

Kirkup, C., Sizmur, J., Sturman, L. & Lewis, K. (2005) ‘Schools’ Use of Data in Teaching and Learning’ NFER, http://www.nfer.ac.uk/nfer/publications/SUD01/SUD01_home.cfm?publicationID=161&title=Schools%27%20use%20of%20data%20in%20teaching%20and%20learning [accessed 27/01/2012]

Smith, A. (2011) High Performers: The Secrets of Successful Schools. Carmarthen: Crown House Publishing

Van Barneveld, C. (2008) ‘Using Data to Improve Student Achievement’ What Works? Research into Practice. Research Monograph 15. http://www.edu.gov.on.ca/eng/literacynumeracy/inspire/research/whatWorks.html [accessed 27/01/2012]

Fishy Revision

Revision. Argh! Love it or hate it, this seems to be mostly what we end up doing at this time of year (in between filling in reams of exam paperwork). The challenge is to make it effective and interesting – a challenge if ever there was one. The internet has been invaluable for trawling for great ideas, but I have also been digging through my old resources to see if there are any gems.

Today was revision for Of Mice and Men for OCR A663 next week. The group know the text well but planning is a bit of an issue, especially in the tight time frame (45 minutes). The exam requires the students to analyse language and techniques as well as making links to context. I wanted to create a task that developed planning but also encouraged the group to hit the assessment objectives in the exam.

I started off by borrowing the excellent Nominative Determination task from Miss Ryan’s GCSE English Blog. This was a really effective opening task as it got the group thinking about the characters and analysing the language, and they really enjoyed it. As they thought through the significance of the names and their connotations I could hear mental lightbulbs going on around the room – love it!

In our mock exam, quite a few students failed to write about the context of the text or link it to the question. To combat this I came up with the mnemonic CRAFTI (using the helpful anagram solver on the Internet Anagram Server).

A Crafti Mnemonic

I tried to make this memorable while also covering each key point.

The next step was to think about planning: how could I make sure that the planning was quick and easy, but also encouraged relational thinking?

My collection of random USB pens came to the rescue. Every so often, since I started teaching, I have saved all the useful resources on my school user space onto a USB. Some of them stay there forever, but I have a peek every now and then to see if there is something worthwhile. Last night I found it.

As I have been experimenting with SOLO HOT maps, I wanted something visual and simple that could encourage deeper thinking. My solution was a fish-bone analysis, or at least my variation on one. I decided that the horizontal line should contain the Idea – i.e. the key point in the passage and key words from the question. This would encourage the group to focus on the question throughout their planning. Each pair of ‘bones’ would include brief points on Context, References, Audience, Feelings and Techniques. I used a series of PowerPoint slides to show the process, using the example from the mock (Lennie and the ketchup in chapter 1).

Fish-Bone Planning

The final task, and one I have advised them to do for revision, was to choose a section of the text at random, or to invent a non-extract-based question, and to produce their own Fish-bone plan:

Fish-Bone Planning Task

The class really seemed to get to grips with this as a planning method, and I liked the fact that it could be loosely linked back to the text (‘flopping like a fish’). Overall, I was really pleased with this, having tried it with my Y10s during their lesson. It was also used by another teacher in an after-school revision session, and it reportedly worked well. So the next step is to try it with one of my more challenging groups.

The Exam Season – A Plea

This is a bizarre time of the year. The majority of the controlled assessments are done; there is the usual scramble to get finished folders for those who arrived part way through Y11 or refused to complete work. Students are demanding revision sessions where they expect their teacher to impart pearls of wisdom while they sit passively, or they don’t show up at all. Rivalries between departments reach breaking point as the key marginal groups are pulled in multiple directions at once. The whole thing feels like sliding down a massive helter-skelter with nothing to stop you.

This is also the time of year that teachers become wild around the eyes with the pressure of too many tasks in a finite amount of time. The only thing keeping us going is the thought of a little gained time to tweak and improve for the next year.

However, this is also the time of year that two very different groups seem to go out of the way to make things as difficult as possible.

Exam Boards and Estimated Grades

At this busiest of times – and I know that those with a negative view of teachers will no doubt scoff – we have marking and annotation of coursework samples, and preparation of in-class and after-school revision sessions. I just don’t understand why the requirements of the exam boards are quite so onerous.

I deal with KS4 English, currently made up of 3 different qualifications being taken by 240 students. I have to enter coursework marks, estimated grades for those marks, estimated exam grades for each module and for the qualification as a whole – this amounts to almost 3000 separate entries, either numbers entered onto a website or little boxes on an OMR sheet being coloured in.

Why? How much of this is actually necessary? If the students are taking the exams, it is that grade that will count, not a ‘best guess’ from a teacher; and why ask for an estimated grade for coursework when I have already given you the actual mark? Surely my time as a teacher is best spent in the classroom or preparing excellent lessons?

The Press

However, the group I feel is most destructive at this time is the press; each year as the exam season looms, we see multiple stories about how easy the exams are, how they are dumbed down, how it is all teaching to the test (occasionally spiced up with an ‘aren’t teachers awful’ piece).

Now, I am not going to focus on the bracketed point – there are enough blog posts that have dealt with that issue, and I am sure there will be many more – my real concern here is the message we are presenting to those taking the exams. Those who rarely have a voice in the face of all of this criticism.

For the brighter, keen students, there is the pre-exam slap in the face: all your work is pointless, anyone can pass these exams as they are so easy, talk bandied about of ‘easy’ or ‘soft’ subjects…it is pretty demoralising to hear. Every year we lose one or two to ‘why bother then, if they are so undervalued’, or to believing the hype and doing little work.

Yet the most destructive impact is on those at the other end of the scale: the students who don’t find school easy, whether because of home or social issues, low literacy levels or SEN. How much more destructive is it if you have worked your way to an E or a D grade, if you have tried your hardest and revised, and then hear sneering news reports that say anyone can get a C grade or above? Or that vocational subjects are pointless? How hard is it to get those students motivated in the first place? To get them into school on a regular basis (any trawl through school data will show that lower ability groups have worse attendance, on average, than those above)? To build their confidence that taking the exam is worthwhile, that there is a chance they will achieve that magical C grade? How much more damaging are these stories and comments to our most vulnerable students?

A Plea

So my message to the press, and to politicians looking for a quick story or a memorable sound bite: please think of the impact your words have; exaggerating the negative and twisting the positive does not help the students you claim to be most concerned about. When we look at how other successful countries (and we are a successful country) organise their schools and exam systems, we should also look at the press and Government messages in those countries. Do they run down their own exam system, fill their papers with stories of how bad the teachers are and how easy the exams have become?

Teacher bashing has always been a popular media topic, and I am sure it will continue to be so. However, we chose this career; many of us choose to stay despite working in challenging schools and coming across soul-destroying situations and choices – ultimately, we chose it because we want to make a difference.

However, the students you denigrate with these stories have no choice. They have only one chance at being a Y11; they can’t control whether they attend a privileged private school, an outstanding school or an inner-city school. This is their opportunity to do well, and having large parts of the population criticise and downplay the massive effort that most of our young people put in does not help. Unfortunately, many of our most vulnerable listen to that message and think, what is the point?

HOT Maps – A Real Eureka Moment

Having had several successful lessons using the SOLO structure and hexagons, I decided that it was time to branch out a bit and to try a wider range of SOLO techniques. Again, I decided to try these with a range of classes.

Compare/Contrast Map

The first HOT map I looked at was the Compare/Contrast map. I used this initially with my Y12 Film Studies class to explore the similarities and differences between their comparative study films. They had generally been fairly good at identifying key features of the films separately, but were struggling to make direct links between them. I used Word It Out to create a word cloud based on a synopsis of each film from IMDb, showed the group some examples and got them to work in pairs. I linked this to group planning of an essay, using Triptico to sort the class into groups – they produced bullet points for each paragraph. I sorted them again and they had to add or delete bullet points. I sorted them one last time to write the paragraph. This worked well for those students who had studied the films carefully, less well for those who had not revised carefully (perhaps a bit of a warning for them). It did help to highlight the links between the films, but at a fairly simple level – the next step will be a whole/part analysis to extend their understanding of the roles and development of the points they identified.

I used the same HOT map with my Y13 Film students to develop their ability to make and analyse specific textual references (AO2) to back up the more generalised comments made in their essays (AO1). For the students to achieve the highest grades both areas must be covered in detail.

This time, I adapted the Compare/Contrast map by including a series of screenshots from one of their films; we also focused on a specific exam question to fully explore the level of detail needed in each paragraph of their essay.

Compare Contrast Map

The focal point was mise-en-scène, and I had chosen four screenshots from ‘The Story of the Apartment’ in ‘City of God’. We discussed the significance of the shots and the students annotated the images. I then asked which specific shots we could use for comparison from ‘La Haine’ – the class identified the scene in Hubert’s boxing gym, the housing in the banlieue, the apartment or the art gallery in Paris, and the view of the Eiffel Tower from the top of the tower block. I then asked the group to explore the similarities and differences in the mise-en-scène and to start making links to why this was the case. This worked well, and as it was focused on developing a very specific skill, I felt it was successful in making the group fully aware of the interaction between the two assessment objectives as well as the two films.

Whole/Part Analysis Map

My next experiment was to try the whole/part analysis HOT map. I decided that I would do this with two very different classes to assess the impact – a top set Year 10 and a bottom set Year 11. Both groups are in the process of final revision for English Literature GCSE exams.

The Year 11 group were working on ‘An Inspector Calls’ for OCR A662 and the focus was to develop their understanding of the text so they could answer in more detail and move towards the C grade. I used a whole/part analysis map with 3 parts.

Whole/Part Analysis

They filled in the ‘whole’ segment with their overview of the play with little prompting, and often suggested relevant details to each other. As a class we explored the role that setting/context played, and then the group used the second box to explore character, using copies of the text to look up relevant details and quotations. This brought us to the end of the first lesson. I was pleased with the progress made by the group and the fact that they had recalled some very useful points; however, I was not quite sure how ‘considering the impact of a part being missing’ would work, nor what its impact would be.

The Year 10 first lesson was similar to the Y11 one; this time the focus was ‘Of Mice and Men’ for A663 and, as they are a top set, I extended the parts to five, which we discussed and labelled as a group. They then completed the key elements for each part.

Multistructural Stage

The second lesson provided the ‘Eureka’ moment. I introduced the question: ‘What would be the impact if this part were missing?’ We went through an example as a group, using the character of Slim. They came up with lots of ideas about what would happen if Slim were not in the novel, from fairly straightforward points about there being no one to stop Lennie and George being fired in section 3, to more complex ones about George having no one to confide in or to present the arguments for killing Lennie.

Relational and Extended Abstract

They were already starting to move on to the next question – ‘Therefore, can you evaluate the role of this part?’ – and continued to do so when working in pairs. This is something I will scaffold a little more with the lower group. The level of discussion amongst the group was amazing; they moved from these points onto detailed consideration of why Steinbeck had used the character or the setting, linking it to his purpose. There were some real cognitive leaps, like the group who discussed the theme of religion, saying that the natural setting at the beginning and end could represent God in nature; that Slim’s empathy and understanding, combined with the religious connotations of his description, made him almost like a religious leader; and that the men are not presented as religious because churchgoers are part of a community and the men are outsiders. Totally A* personal analysis and interpretation. I was blown away. This is when I ‘got’ the missing part question. Try it!

Experiments With Hexagons

Twitter is an intriguing place. For all the general chat and points of view that most people associate with the format, there are also some real gems.

The Inspiration

A #pedagoofriday post from @LearningSpy about using hexagons for work for ‘An Inspector Calls’ piqued my interest and, via a quick tweet, led me to his blog post on Hexagonal Learning as well as a range of references to follow up on SOLO taxonomy from @Totallywired77 amongst others.

This seemed just what I needed to bridge the gap with my lower Y11 group between knowing things about ‘Animal Farm’ and being able to make the type of links they needed to achieve higher grades. So I thought I would give it a go.

The Lesson

I decided to focus on the character of Boxer and used the SMART board (shapes and infinite cloning) to give them a brief demonstration. At the start of the lesson I had given the group a sheet with the SOLO levels on it and discussed them briefly. The class were allowed to choose their own groups and were given a selection of pre-cut hexagons. We started by identifying a range of points and quotations about Boxer (multistructural), and then I asked what they needed to do to move up a level – make links. This is a class who can be challenging, with grades ranging from F to D. They all worked brilliantly, discussing the points and making links (relational) with only a minimal amount of input from me.

Hexagon Lesson Exploring the Question ‘Is Boxer a Hero?’

Part way through the lesson, a member of SMT popped in – they left and returned with another member of staff to show them what I was doing! Now that has never happened before.

Result

That was just the start. The use of hexagons and SOLO has spread throughout the English department in a matter of weeks, and was commented on positively during a mock Ofsted inspection. This has definitely become part of my teaching repertoire; I have rarely found anything that works with pupils of all abilities and levels like this.

More Examples

Further Reading

Tait Coles’ Blog

Learning Spy’s Blog

Lisa Ashes’s Blog