“The Scientific Management of Writing and the Residue of Reform”
This article is the winner of the 2008 CCCC James Berlin Memorial Outstanding Dissertation Award.
Author and Audience
Eric D. Turley identifies his research as lying at the intersection of composition, writing assessment, and school reform. Since this is a dissertation, Turley produced this scholarship for his thesis committee, presumably with the intention of reframing or continuing the piece as a publication for a larger academic audience interested in those three fields. Turley completed his dissertation at the University of Nebraska, Lincoln, under the supervision of Professor Chris W. Gallagher. Gallagher works in the areas of writing pedagogy, assessment and accountability, literacy studies, and rhetorical theory. Interestingly, Gallagher’s 1998 dissertation, “Reflexive Inquiry: Rethinking Pedagogy and Literacy,” also won the CCCC James Berlin Memorial Outstanding Dissertation Award.
In addition to this dissertation, Turley and Gallagher co-authored “On the Uses of Rubrics: Reframing the Great Rubric Debate,” which is linked to USF’s MyReviewers website. Turley is currently an Assistant Professor at the University of Missouri, St. Louis.
Turley identifies the following research questions in his Introduction:
- What are the origins of standardized writing tests within the fields of English Education and Composition? Where did they come from? Who created them? What were their purposes? And what problems were they trying to solve by implementing them?
- Why do politicians, policymakers, administrators, and teachers turn to standardized writing tests to solve their writing problems?
- How do standardized writing tests impact student writing, teachers’ pedagogies and theories of writing, and school culture? (6-7)
Turley argues that current problems with assessment and accountability, specifically in the form of writing scales and standardized tests, are the result of educational reform that began in the Progressive Era as an adaptation of Frederick Taylor’s model of industrial efficiency. Turley discusses the move in the early 1900s to develop a scientific, objective tool to measure student writing and from there provides a general history of developments in assessment in relation to Taylor’s efficiency model. Turley identifies this adaptation of Taylor’s model as ineffective, as it focuses on efficiency of process rather than on quality of process or product.
“Through the creation of standardized tools to measure writing and teachers the residue of Taylor’s ideology and practices are embedded within contemporary theories and practices of writing assessment today” (10).
This suggests Turley’s view of standardized assessment tools as a way to hold teachers accountable for student success, rather than as a means of ensuring student growth and development. Turley also discusses the political nature of educational reform as an administrative response to a perceived need—a need that may not be recognized by those who inhabit the institution:
“The residue of reform efforts of the past are still with schools today, often unnoticed or unquestioned by those who inhabit schools. Like the graded school, credit unit, bell systems, and urbanization of schooling, standardized writing assessment has become part of the grammar of schooling” (11).
In addition to exploring the connection between assessment issues and Taylor’s ideology, Turley also grounds his work in Foucault’s theories on power relations and subjugated knowledges. Turley uses Foucault as the theoretical underpinning to support his inclusion of counter-narratives in the qualitative portion of his study that examines writing assessment within a specific school district.
Epistemology and Rhetorical Stance
This piece is primarily hypothesis-testing, in that the qualitative portion of the research emerged from the author’s work with Taylor’s model of industrial efficiency in combination with Foucault’s theories of power, displacement, and subjugated knowledge. Turley employs a theoretical perspective, which he supports with narrative descriptions of his qualitative research.
This dissertation has a distinctive style. While the traditional elements of the dissertation are present, the introduction and each of the four chapters are preceded by vignettes that correspond to the chapter topic and describe situations the researcher experienced during the process. This allows the reader to hear the author’s voice and his perceptions of events in the research process without taking focus away from the participants in the study.
Turley also chooses to include both narratives and counter-narratives concerning the writing test that is required for graduation in the district he is researching. The inclusion of the counter-narratives highlights both dissension and agreement between teachers and administrators.
Methods of Data Collection and Analysis
In order to address his first research question concerning the origins of standardized writing assessment in Composition, Turley performs an archival study of the first 15 years of English Journal as well as early issues of Teachers College Record, Pedagogical Seminary, and Educational Administration and Supervision. Within this archival study, Turley wants to expose the “institutional habits” and “cultural beliefs” of the Progressive Era that led to the creation of the first scientific writing assessment tools. Through this archival examination, Turley identifies standardization as an administrative response to a perceived problem and identifies Taylor’s ideology of efficiency as the model for this reform movement. This archival examination, in combination with detailed literature reviews, makes up the introduction and first chapter of the dissertation.
In order to address his remaining research questions, Turley also performs a yearlong qualitative study (2006–2007 school year) within a school district that enforces a standardized writing test (WGE) as a graduation requirement. Chapter 2 provides an overview of the study, in which Turley collects data from 10 teachers in two high schools (five from each school), two principals, and two district administrators. Turley conducts two rounds of interviews with each participant, each lasting from 45 to 90 minutes. In addition, Turley observed 11 courses and collected classroom and district documents relating to writing instruction and/or the WGE. Of the two schools, Turley notes that one is the oldest in the district, with a traditional structure and a diverse population that tends to pass the WGE at a slower rate than students at the other school in the study, which is a new school with a non-traditional structure and a mainstream population.
Turley also notes that he engaged in an ongoing process of data analysis throughout the collection process, in order to allow the data to inform his work and to follow interesting threads that became evident through analysis. When analyzing the first round of interviews, Turley used these questions to guide him:
• What impact does the WGE have on Butler Public School District?
• What impact does the WGE have on Wilson and Marshall High Schools?
• What impact does the WGE have on teachers and their classroom practice?
• What impact does the WGE have on students and their writing?
Turley then coded the interview transcripts using Kathy Charmaz’s principles of constructivist grounded theory, specifically the “constant comparative method” of data analysis. However, he never seems to explain the theory, the method, or any conclusions drawn from this coding process.
Turley also strives for polyvocality within his text in order to represent his attempt to “share and create knowledge with the participants” (27). He does this by including narratives and counter-narratives within the qualitative study portion of the research. In addition, he encourages the teachers who participated in the study to “read, respond to and critique” the study: one teacher response is included, and Turley writes that more responses will be included in forthcoming versions of the work.
After his general overview of the research project, Turley devotes Chapters 3 and 4 to interesting moments within the process. In Chapter 3, Turley explores the issues of validity and reliability by examining three specific moments during grade-norming sessions in which dissension occurred over scoring. In this chapter, Turley notes that the WGE is based on AP examinations, alluding to issues of bias but never expanding on these thoughts. Instead, Turley uses his examination of these three situations to derive a five-question framework designed to serve as a heuristic for building writing assessment:
1) What is the purpose of the assessment?
2) What is the definition of writing? What should students do / be able to do?
3) What kinds of assessment tools would achieve said purpose and produce data that corresponds to the previously stated definition of writing?
4) What is the relationship between assessment and pedagogy? Does the data from the assessment tool prove useful to schools, teachers, students, and parents?
5) What are the social and educational impacts of the assessment on students?
Turley moves on to examine the ways in which the pressure of the WGE affects writing instruction, resulting in a loss of agency for instructors who feel the need to “teach for the test.” He then examines how instructors attempt to regain this lost agency by accessing and integrating their subjugated knowledge of writing and pedagogy into the classroom. Thus, the instructors both obey and resist the instructional methods prescribed by the WGE.
One methodological critique of the study is Turley’s failure to explain the importance of, or to analyze the data resulting from, his coding of the initial interview transcripts. He describes the coding of the interview transcripts as part of his methodology but never does anything with the results.
In terms of content, I am surprised that the author addresses issues of bias and discrimination resulting from standardized tests, particularly disadvantages for minority or poverty-stricken students, only through one participant teacher’s concerns about multicultural issues with the test. This conversation is framed in terms of the instructor’s subjugated knowledge and is treated anecdotally but never explored in detail. Turley highlights the economic and social differences in the populations of the two high schools participating in his study, yet never goes anywhere with this information. He provides a detailed breakdown of student demographics at each school, but never discusses how these demographics relate to the issues of reliability and validity discussed in Chapter 3.
I also find the lack of student voices in the text to be problematic. While I appreciate the focus on how the standardized test affects the ways instructors teach, I continually wondered how this all related to the students. In addition to the voices of teachers and administrators, I wanted to hear students discussing the perceived value, or lack thereof, of the WGE.
Implications for Further Research
In the Epilogue, Turley identifies some areas for future research as a result of this study:
- The need to internally document and research discontinuities between teachers and administrators in terms of theories and competing forms of knowledge
- Once these discontinuities have been identified, the need for researchers to “reimagine” pedagogies that address them
- The need for research on including the subjugated knowledge of teachers in school reform efforts
This study also opens up other areas for research. For example, I think the lack of student voices within this text could be addressed by further research aimed at understanding students’ perceptions of, successes with, and problems with standardized writing assessments. Another possible area for research arising from this study concerns the actual effectiveness and accuracy of the test: are students who pass the test able to demonstrate an ability to write in college or the workplace? Of course, that would require a new tool for assessment…