Monday, March 31, 2008

Assessment issues

Hi everyone,

I just read through the assessment articles (remember, they're on NCTE's webpage, www.ncte.org, and you click on "classroom assessment" under Quick Links to access them). Especially in light of the theme I picked up about assessment involving multiple voices, I want to be sure that I'm not just writing a post that reinforces the "teacher asks, students respond" mentality. Instead, I want to share a few of my thoughts after reading and hear what you think about the same issues.

Odd as it seems, what I most gravitated to was the NCTE position statement about assessment. It's clear, articulate, supported by research-- and pretty much ignored all over the country in favor of standardized tests speaking for a student's entire educational career. I know some of you are going to complain that a few of the articles are oriented toward elementary teachers, but reading them alongside the NCTE position statement made me wonder if one of the barriers to authentic assessment at the secondary level is a belief that things like newsletters and detailed parent-teacher conferences stay below 7th grade. Quality assessment, the kind outlined in the NCTE position statement, takes a huge amount of time and effort not only from the teacher but also from students and parents. It involves teachers spreading the word about what's happening with their students' abilities. Standardized tests are quick and easy-- certainly not painless, but quick. We're a numbers-driven society, so getting out of the numbers mindset, as described in the article about the Tennessee teachers who wanted grades to reassure them during the summer institute, is an uphill battle. I think it's worth the battle, though, if we want to teach kids real-life learning.

So here's my question, which you may or may not want to chime in on: How do teachers enact the NCTE position statement on assessment, especially under the pressures of the WesTest? How might we buck the trend in diplomatic, reasoned ways? I think one way would be to implement portfolios to document student writing development. You've all done action research, you've all gathered data, you've all written about that data. What happens when that data is broadened into a systematic study of students' portfolio work? It's one thing to say, "My method works," and another to say, "My method works, and here's a whole year's worth of my students' writing products to back up my claim." Thoughts? Or, if you prefer, what most struck you from the readings about assessment?