Assessing Students' Media Work

By Chris M. Worsnop

Two teachers undertake to mark the same assignment independently. One gives the assignment a failing grade on account of its technical flaws, while the other gives it a healthy pass because of its originality. If you were one of the teachers, what would you do to make sure that next time the two teachers arrived at the same or very close to the same assessment? If you were the student, whose class would you rather be in?

This article is about an assessment instrument that might have the potential to help the teachers in the above scenario, and to render the student's question irrelevant. The assessment instrument may also make it more likely for students to be encouraged to use media-other-than-writing instead of that old favourite, the essay.

It has always seemed strange to me that students in schools are so frequently asked to report back to their teachers about their learning in formats and technologies that belong to the eighteenth and nineteenth centuries. The essay seems still to be the dominant medium for such reporting, even when the student is reporting about some twentieth- or twenty-first-century phenomenon such as computer technology, popular music, or nuclear physics.

In my 1994 book, Screening Images: Ideas for Media Education, I listed 186 different formats that are available to students for reporting about their learning, of which the essay was only one. My point is not to denigrate the essay – I am, after all, writing one now – but to promote alternatives. Why shouldn't more students report to their Phys. Ed. teacher in video, or to their drama teacher in audio? Why shouldn't more of them make an independent study piece in history in the format of a radio documentary? Why shouldn't the teacher of mathematics be happy to have more students hand in work in comic book format? Why do we not see more interviews, surveys, photographs, and cartoons in the work students submit to their teachers across the whole curriculum?

I suspect that the answer to all these "why not" questions is that many teachers in all areas except media education feel uncomfortable assessing work that comes to them in any format other than writing. And the reason for this discomfort is not hard to find: unfamiliarity. Very few teachers have been trained in assessing a piece of video or an audio presentation. Everyone knows how to spot spelling errors and split infinitives, though, and so writing remains the dominant medium for reporting about learning in our schools. (This is sometimes the case even in media education classrooms, especially in advanced, or academic, classes. It sometimes happens that students rarely, if ever, get to work in the media that they are studying, and are expected to report about their media work always in writing.) I don't wish to suggest that writing should lose its dominance; just that some of the other formats and media should be allowed to get more of a look-in as well.

The rubric for media work that I am offering as a partial solution to these issues is based on my experience working with writing rubrics over the past four years. I have worked on and helped develop rubrics, both holistic and analytical, for student writing at all levels from grade 3 to grade 12. That work, modified, has provided the basis for the media rubric. I have learned to prefer analytical over holistic rubrics, because I have found them more detailed, more reliable, and more closely linked to instruction.

This is how the media assessment rubric works. All modes of expression, from writing to singing, have certain traits in common; the rubric is built on five of them.

I have taken the model of six levels of performance from the Ontario Common Curriculum and the Provincial Standards. This is essentially a standard five-level scale in which the zero level is called 1. People who are more comfortable using a five-level scale can easily adapt this one by dropping level 1.

So, we have five traits and six levels of performance. It's not hard to make a chart or grid, six by five, and see that the thirty intersections could each be described in language that sets one intersection apart from the next. That is to say, there would be six statements describing performance in organization, ranging from the outstanding organization that would be expected for level six performance, to the absence of organization in level one performance. But the grid is more complex than that. Instead of having just one description for the trait of ideas and content at level five, for instance, there are three, since ideas and content consists of more than just one thing: it has, in fact, three parts, or indicators.

Each one of these indicators is described at each of the six levels of performance. The first one, Controlling Idea, looks like this:

6  The controlling idea is perceptive and insightful
5  The controlling idea is thoughtful
4  The controlling idea is clear but may be conventional
3  The controlling idea is apparent but may be simple or derivative
2  The controlling idea is discernible, but sketchy (or plagiarized)
1  The controlling idea is absent or must be inferred

When an indicator has been written this way, it is called a strand. Therefore, the rubric for the trait of ideas and content has three separate indicators, each stranded into six levels of performance. Since there are five traits, there are five rubrics within the overall set, each one made up of several indicators stranded into six levels.

When teachers at the 1995 AML Summer Institute used this rubric to assess a student-made video, they quickly came to a consensus over the correct level to select, and commented on how the rubric helped focus assessment on the important criteria as well as on the appropriate level.

Comments, questions and suggestions from fellow AML members are welcome. Please contact me at:

WRIGHT COMMUNICATIONS
2400 Dundas Street West, Unit 6, Suite 107
Mississauga, Ontario L5K 1H9
Tel: (905) 823-0875
Email: worsnop@pathway1.pathcom.com