Wednesday, September 19, 2012


The next round of NAPLAN data is being released, and the usual issues with summarised statistics arise.  In low socio-economic schools this data is damaging and will close schools - not because of poor teaching but because of cohort changes.

Let's take a sample school.

General assumption:
NAPLAN scores have dropped over four years, so obviously something is wrong with the teaching staff.

Examining raw data:
Increase in students with little or no schooling (refugee intake)
Opening of new school nearby attracting higher performing students
The half cohort was generally a weak group, as many students (and their siblings) moved to private schools in yr 7 (early entry to secondary schooling was a significant factor for parents choosing schools in yr 7, coupled with aggressive marketing by private schools to maintain their student numbers)
High turnover in experienced staff
Decrease in general school attendance (and some students not attending at all) - increase in overseas holidays in yr 8, truancy and mental health issues
Issues with changing curriculum and yr 7 content not being taught to the level required by NAPLAN in public primary schools
Inability to move on students with little or no interest in schooling
Strong increase in the performance of high-school-ready students (what high school teachers are trained to do) and low levels of improvement among students who are still at primary levels during yr 7/8 (an area of improvement for the school).

These issues make it hard to compete with local private schools.

None of these factors is captured by a one-number summary, nor does such a summary account for the lead-in required to cater for the new circumstances the school is experiencing (in this case a much higher number of low ability students).  Even if the school diagnosed the problem, reacted and implemented cohort-specific solutions (including structural changes to better cater for low ability students), it takes lead time and strong leadership to identify and implement actions that have a significant impact on NAPLAN statistics and student learning.  Yet in many cases a lower NAPLAN score will be seen as a teacher issue, with comments driven by the misuse of statistics.
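The cohort effect is easy to show with arithmetic.  The numbers below are invented for illustration (they are not real NAPLAN results), but they sketch how a school's mean score can fall even when every subgroup of students improves, purely because the intake now contains more low ability students:

```python
# Hypothetical illustration (invented numbers): a cohort shift can drag
# down a school's mean score even when every subgroup improves.

def cohort_mean(groups):
    """Mean score across (student_count, mean_score) subgroups."""
    total = sum(n for n, _ in groups)
    return sum(n * s for n, s in groups) / total

# Year 1 intake: 80 high-school-ready students averaging 560,
# 20 students at primary levels averaging 420.
year1 = [(80, 560), (20, 420)]

# Four years later: BOTH subgroups have improved (560 -> 570, 420 -> 440),
# but the intake now has far more students at primary levels
# (refugee intake, a new nearby school attracting stronger students).
year4 = [(50, 570), (50, 440)]

print(cohort_mean(year1))  # 532.0
print(cohort_mean(year4))  # 505.0 - the one-number summary drops anyway
```

A reader given only the two summary numbers (532 then 505) would conclude the school is failing, when in this sketch teaching outcomes improved for every group taught.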

Furthermore, little analysis is done to see where systems are working and where changes in the pipeline have produced significant positive changes in student results.

Lastly, by releasing this data to parents (rather than aggressively seeking out the problems and rectifying them within schools) a downward spiral commences.  A school with a low NAPLAN score does not attract good students, so the score continues to drop each year and student numbers fall.  Senior school offerings shrink as student numbers become insufficient to sustain courses.

Tuesday, September 11, 2012

Professional Development in Schools

At the moment the department is caught between a rock and a hard place with professional development.  The new Australian Curriculum requires a level of professional development to be successful, but the department lacks the resources to implement it.

To do it properly requires a slow implementation over many years, with a commitment to each year being implemented with a focus on the contextual differences between schools.  A drip-feed approach, working hand in hand with schools, would work, but it requires a range of strategies, ICT and monitoring that the department is not geared towards, nor does it have a track record of being able to deliver.

What can be done with relatively few people has been shown by the oft-maligned Curriculum Council (now known as SCSA) during the Mathematics NCOS rollout.  Rom, Malachi and crew did a good job of defining the curriculum succinctly and then supporting teachers in understanding curriculum points.  The moderation process (albeit unwieldy and requiring personal statistical attention to maintain its integrity) has worked to greater and lesser degrees.  Understanding the scope of assessment has not been an ongoing problem.

No such names can readily be attached to the Australian Curriculum.  Teachers have no confidence in the process at this time.  The assessment model and levels of assessment are still a big black hole.

I'm not saying curriculum support branch aren't trying to help.  They are.  I think they need a little more practical and visible leadership, and release from some of the hamstrings of the past.  Rather than being apologetic about what they can't be, they need to focus clearly on what needs to be done.  If they let go of the fringe materials (such as First Steps) and focus on the key requirements for each learning area (new content, changes to scope and sequence, what needs to be delivered and when), they may be more successful and useful.  Without commitment to a process, at best they will be ill-focused, at worst ineffective.

I would start by redeveloping the communication model.  The portals used are ineffective because they require teachers to log on to view them.  Start with principals (where a solid communication network exists) and then work down.  Focus on learning area objectives to reach Australian Curriculum guidelines and disseminate information to HODs and HOLAs.  Develop an online approach.  Get some money to do it properly and quickly - no two-year processes, 10 weeks max per project, using subject experts (I know "expert" is a bad word, but only because the experts of the past had a barrow to push and were academics or failed teachers - get the old crusty maths teachers who have taught effectively in the classroom; the statistics exist to identify who they are).  Couple them with some of the new teachers who use ICT effectively and know better ways to distribute information.  Produce useful resources and teachers will be hooked.  TDCs were effective in this respect because they produced usable resources - this time more care needs to be taken to ensure those resources are good.