Sunday, April 7, 2013

Writing Driving CCSS Assessment

I spent Friday and Saturday in a workshop about the assessments educators will find emerging with the oncoming Common Core State Standards (CCSS). Along with 40+ educators from across Pennsylvania, I was led by the Institute for Learning (IFL) through several assessment models designed by the Partnership for Assessment of Readiness for College and Careers (PARCC).

The Department of Education has contracted PARCC, which has in turn contracted IFL to assist with the creation and piloting of the upcoming CCSS assessments.

I came to understand that many assumptions have arisen about the CCSS because few firm answers or samples have been offered. The weekend workshop was a chance for those involved on the inside to meet with educators and share what the current models of assessment look like. My invitation to participate came through my active involvement with the National Writing Project. I believe six local Writing Project sites from the State of Pennsylvania were represented at the workshop. While not much has been officially released for public consumption, I found the weekend extremely helpful and will share the core of what I took from it.

The first message communicated was that the CCSS call for a 50-50 split of informational text to fiction at the middle school level is intended to represent a ratio across all subject areas. Some assumptions about the CCSS place the onus only on an English teacher's shoulders. Additionally, many English teachers have lamented the perception that the strong call for nonfiction means an impending erasure of literature.

As I understood the message, educators do not need to feel as though literature is being pitched by the CCSS. The co-director of the IFL, Anthony Petrosky, noted that the appendices of the CCSS explicitly state that teachers should fill in the gaps of the CCSS--assuring us that this is a safety net to protect the presence of literature in a curriculum. While I had not read or heard that statement prior to this weekend, it did reassure me to some degree to at least hear it from someone involved on the inside.

The core intention of the CCSS is to see educators engaged in regular practice with complex texts from outside the textbook model, as well as a focus on academic language across all subject areas.

Often, math, science, and social studies courses do not offer opportunities for students to read and write texts beyond the textbook-driven lessons and questions. The observation is that many of these courses work through curriculums built like checklists of terms, events, and concepts without much opportunity for conversation, problem-solving, and deeper and more meaningful interactions with the content. The CCSS wants to encourage the building of knowledge in these subject areas through content-rich nonfiction. 

To help promote that end, assessments are being designed to measure a student's ability to read, write, and speak to the evidence found in a text--not to pare a sliver of information from the many facts digested over a year. The PARCC's current design calls for three types of performance-based assessment:
  1. Evidence-based selected response
  2. Technology-enhanced constructed response
  3. Prose constructed response
All of the assessment models that we experienced at the workshop advocated multiple reads of multiple texts within one unit or lesson. For example, one of our model units (Forensic Anthropology and the Science of Solving Crimes) was built upon two overarching questions:
  1. What roles does a forensic anthropologist play in the science of solving crimes?
  2. What methods do writers of informative texts use to convey complex ideas and information?

The experience placed a value on two details that cannot be overlooked by educators:

a. Learning is social
b. We must be the models of the lessons we teach.

In other words, the CCSS impels educators to be readers and writers of their content areas. The assessments somewhat divorce state testing from the current multiple-choice model. Every assessment we saw and used asked us to write and discuss. Again and again. This is a very Writing Project-friendly approach.

Some assumptions bandied about across the state suggest that writing is not a core component of the CCSS. That would be a grave mistake for districts to assume.

Writing is the mode on which the assessments are being built.

The CCSS's impetus to integrate reading and writing across all subject areas drives the dismissal of classes only using textbook-driven instruction...and their mirrored counterparts in current state testing. Textbook instruction often presents one type of question and one mode of learning--the CCSS assessments want to encourage a social component to learning that is integrated with ample opportunities for students to read and write.

It seems to me that professional development should start with helping teachers across all content areas learn how to write in their content area and find content-rich nonfiction to use in their classes. Additionally, all teachers need help in understanding how to encourage, direct, and read student writing. Research shows us that teachers learn best about teaching by talking about student work samples...and students learn best by having the opportunity to write and read their own thoughts in small groups--before opening it up to larger group conversations.

This is not a short-term process. Clearly, the CCSS is positioning educators to make a long-term commitment to altering the way in which we teach and how we see time best spent in our classrooms.

Much more information will emerge as the months pass--it has to--as we inch closer to CCSS adoption across the states, and this is not a quick (or easy) transition. However, if you lasted this long through my post, hold onto this: it indeed appears that writing, and the ability to extract and synthesize evidence from multiple texts, is the essential component of the upcoming CCSS assessments.


  1. I teach at the high school level in Ohio, and the CCSS shift to an emphasis on 70% non-fiction at the HS level. I am thrilled that CCSS requires a shared responsibility for literacy instruction. Helping all content area teachers gain confidence in teaching literacy skills is an opportunity for collaboration. And I am thrilled that the standards move all students to higher levels of thinking. I cringe when teachers ask, "How will this appear on the test?" But I am going to ask, how will non-fiction appear on the PARCC assessment? Will non-fiction/disciplinary literacy tasks appear on the ELA assessment? As states use value-added data for teacher evaluations, will ELA teachers carry the data "burden" for non-fiction across the disciplines? In Ohio, value-added will be 50% of a teacher's evaluation, and while there has been discussion about shared attribution at the elementary/MS levels, I have not heard much discussion about how data will be shared by teachers across disciplines at the high school level. I think the perception is that data from the ELA assessment belongs to the ELA teachers. I know it is not the intention of the standards or the assessment to push literature to the side, but this may be the unintended consequence if ELA teachers' evaluations are tied to an ELA assessment that is 70% non-fiction. I love non-fiction. I love using non-fiction as mentor texts with my students; however, I am afraid that great fiction, and the YA lit that motivates readers, will be sacrificed.

  2. In Oregon, we will be using the Smarter Balanced assessment, which takes a similar approach to ELA. I am a math teacher and have been writing tasks for the upcoming assessment. I think both consortiums agree that analyzing text (whether in ELA or Math) is very important and students need to be able to describe and construct writing that shows their thinking. I am moving to more writing opportunities in math next year as a way to help students look at different ideas. I'll be interested to hear more of your thoughts.

    The Other Side of the Equation