5 Minutes For Marking
Regardless of how motivated teachers are, the amount of marking they can handle is ultimately driven by how much time they have at their disposal and how many students they support. In the state system, the likelihood is that most students will get less than 5 minutes of marking time each week.
The DfE Teachers' Workload Diary Survey (2013) indicated that 9.4 hours were available for marking each week (out of 56 total hours worked).
Less than 5 minutes of marking per student per week!
Although precise outcomes will depend on actual teaching loads, if a teacher supports 150 students (e.g. 6 classes of 25 students) there are just 3 minutes 46 seconds of marking time each week for each student (9.4 hours divided by 150).
Less assessed work and cursory marking
Even for highly skilled teachers, the net result is that the amount of assessed work will fall and marking will be increasingly limited to a summative outcome and some brief feedback. For a student load of 150, this means that over a typical 39-week school year a student will benefit from roughly 2 hours 27 minutes of marking. Although the level of teaching experience will make a difference, it does not seem likely that this will be enough for marking to alter the outcomes students are expected to achieve.
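The figures above can be checked with a quick back-of-envelope calculation. This sketch assumes the survey's 9.4 hours of weekly marking time, a load of 150 students, and a 39-week teaching year (the year length is an assumption, not stated in the survey):

```python
# Back-of-envelope check of the per-student marking-time figures.
MARKING_HOURS_PER_WEEK = 9.4   # DfE Teachers' Workload Diary Survey (2013)
STUDENTS = 150                  # e.g. 6 classes of 25
WEEKS_PER_YEAR = 39             # assumed length of the teaching year

# Weekly marking time per student, in minutes.
per_student_minutes = MARKING_HOURS_PER_WEEK * 60 / STUDENTS
mins, secs = divmod(round(per_student_minutes * 60), 60)
print(f"Weekly marking per student: {mins} min {secs} s")   # 3 min 46 s

# Annual marking time per student.
annual_minutes = per_student_minutes * WEEKS_PER_YEAR
hrs, rem = divmod(round(annual_minutes), 60)
print(f"Annual marking per student: {hrs} h {rem} min")     # 2 h 27 min
```

Even small changes in class size move these figures noticeably: the same 9.4 hours spread across 120 students would yield 4 minutes 42 seconds per student per week.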
In a Progress 8 world where exceeding expected grades is so important, just how is a school supposed to find more time for marking? There are two possible solutions:
1. Fewer or smaller classes
2. Automate marking and feedback provision
In an austere environment, and no matter what 'job threat' suspicions the profession might hold, intelligent automation is the only realistic solution. However, experience so far has arguably been patchy. Until now, automated assessment has focussed on "right or wrong" summative assessment with limited and quite general feedback. This is some way short of what a student or parent might expect is necessary to improve grade attainment.
In technical subjects this is critical. Get the first question wrong in a structured maths assessment without immediate specific feedback and the exercise becomes largely pointless from a learning progression perspective. This doesn’t mean that automation does not work, it just means that propositions need to aspire to achieve more. Schools need opportunities to work with software solutions that cover all aspects of the learning process (instruction, identification of learning gaps and unique feedback to correct gaps).
Every time an error occurs, students need to be able to review where they went wrong and correct it. This is the only way to improve answers to subsequent questions, and it can only be achieved by providing a worked answer that demonstrates correct technique, giving students a realistic chance of correcting technique errors and improving.
Working with several solutions
Using one supplier for videos (often little more than lacklustre screen recordings unless green-screen technology is used) and another for assessments, while relying on teacher interventions to plug learning gaps, is unlikely to succeed:
1. Teacher feedback is unlikely to be effective, as it will be provided verbally, possibly directed to the entire class, and will often arrive too late.
2. Unless the videos and assessments come from a single source and are structured in a methodical way, it is unlikely that the assessments will effectively build on the support provided by the instruction video.
3. Activity records spread across several independent systems will be ineffective and will be exploited by the very students who require tight management to raise activity and grade achievement.
Increase feedback by automating it
Automation of feedback is potentially a huge step forward as it will exceed the level of feedback that could ever be provided manually. In a 60-minute lesson in which students might view an instruction video and complete two attempts, an automated feedback approach means that students get as much feedback as they need and teachers can target their attention on those students that need it most. Take away automated feedback and how much support will individual students actually receive?
Support to enable larger class sizes
If high quality automated feedback is provided it really should be possible for more able and motivated students to progress with minimal teacher support. Dare I say it, this means it is also highly likely that teachers could then handle larger student numbers or engage non-specialist teachers to manage classes. This way schools can give themselves the chance of rising to the austerity challenge of ‘delivering more with less money’. This really should be possible, but only if software solutions evolve to help schools to achieve this.
Although the significance is often poorly appreciated, using a single comprehensive solution can have a dramatic impact for schools. It means teachers are supported by a single set of data records covering all of a student's activity and providing a full, rather than partial, syllabus knowledge audit. Only then will teachers be empowered to effortlessly manage increased student activity and personalise learning. If organised effectively, the records can also be viewed at class level and form the basis of informed and highly effective management interventions where problem classes are identified.
Software vs software solutions
Of course, the aim of school learning management systems is to provide software for teachers to create their own learning programmes. However, the main inhibitor is that teachers do not have the time to create the resources and organise them into a coherent, high-quality programme. This means the future lies in carefully designed software SOLUTIONS whose scale gives them the incentive to invest the time to create and maintain engaging, comprehensive and teacher-centric solutions for large numbers of schools.