My Principal Wants to Improve Test Scores... Is He Right?

  • 08 December, 2018
  • 21 Comments

Teacher question:

I hope and pray that you write about or repost regarding state reading assessments. I just received a call from a frantic academic coach stating that her principal has told her teachers to look at our state test’s achievement level descriptors and create test-based questions aligned to those levels to ask when immersing students in literature and informational texts. Is this a good use of their time? Isn’t it really all about the text as well as students’ knowledge of the subject matter, vocabulary, and sentence complexity? Please help!

Shanahan response:

You’re right. It’s been a while since I’ve gotten up on this particular soapbox.

Many consider this “the season to be jolly,” but for schools the kickoff for heavy test prep is soon to begin. Bah, humbug.

That principal has probably been told to “use your data” or to create “data-driven classrooms,” the idea being that this will help students shine on the annual accountability tests.

While I appreciate the hopefulness behind this practice, I have one small concern… it doesn’t actually work.

These so-called test score improvement experts who promulgate these ideas don’t seem to mind that their recommendations contradict both the research and successful educational policy and practice.

Their “theory”—and it is just a theory—is that one can raise reading scores through targeted teaching of particular comprehension skills. Teachers are to use the results of their state accountability tests to look for fine-grained weaknesses in reading achievement—or to try to identify which educational standards the kids aren’t meeting.

This idea makes sense, perhaps, in mathematics. If kids perform well on the addition and subtraction problems but screw up on the multiplication ones, then focusing more heavily on multiplication MIGHT make sense.

But reading comprehension questions are a horse of a different color. There is no reason to think that practicing answering particular types of comprehension questions would improve test performance.

Question types are not skills (e.g., main idea, supporting details, drawing conclusions, inferencing). In math, 3x9 is going to be 27 every doggone time. But the main idea of a short story? That is going to depend upon the content of the story and how the author constructed the tale. In other words, the answer is going to be different with each text.

Practicing skills is fine, but if what you are practicing is not repeatable, then it is not a skill. 

The test makers know this. Look at any of the major tests (e.g., SBAC, PARCC, AIR, SAT, ACT). They will tell you that their test is based upon the educational standards or that their questions are consistent with those standards. But when they report student performance, they provide an overall reading comprehension score, with no sub-scores based on the various question types.

Why do they do it that way?

Because it is impossible to come up with a valid and reliable score for any of these question types. ACT studied it closely and found that question types didn’t determine reading performance. Texts mattered but question types didn’t. In fact, they concluded that if the questions were complex and the texts were simple, readers could answer any kind of question successfully; but if the questions were simple and the texts were hard, readers couldn’t answer any kind of question.

Reading comprehension tests measure how well students can read a collection of texts—not the types of questions they can answer.

If this principal really wants to see better test performance, there is a trick that I’m ready to reveal here.

The path to better reading scores? Teach kids to read.

It works like magic.

Devote substantial time to teaching phonemic awareness (preK-1), phonics (preK-2), oral reading fluency, vocabulary, reading comprehension, and writing. Make sure that, in grades 2 and up, kids are being taught to read grade-level texts—not just texts at their supposed “reading levels.”

Comments

See what others have to say about this topic.

Harriett Janetos
Dec 09, 2018 03:06 AM

And I would add one more thing based on training I had this summer through SBAC and teaching a third grade class once a week this year: Find high interest articles (Time for Kids, Scholastic, etc.) on a related topic (such as animal rescue) and after doing close reading activities, have students use information from those articles to write informational and argumentative pieces. If test prep consists of spending time on reading activities involving "analysis" and "synthesis", then it is time well spent.

Pat Stone
Dec 09, 2018 08:48 AM

Great blog, thanks.
Why do you not give an age group for these: “oral reading fluency, vocabulary, reading comprehension, and writing” as you do for phonemic awareness and phonics? They should be followed by a bracket such as (preK - every year). Giving those age groups for phonemic awareness and phonics makes people think they should only teach those to beginner readers, in a similar way to the problem expressed by your questioner. While you claim to advocate well-rounded proper teaching of reading in all its aspects, your specifics tend to endorse and encourage the teaching of phonics in isolation.

Tony
Dec 09, 2018 01:39 PM

You appeared to miss a couple of important points in response to this teacher's inquiry.

First, you don't reference seeing or conducting a first-hand analysis of the assessment in your response. It may be important not to pass judgment before you analyze the assessment. Also, isn't the feedback you provide broad enough to apply to most standardized tests like the PSAT, SAT, GRE, etc.?

Second, how does the teacher reflect on his/her practice to improve reading outcomes for all students? While we may disagree on the type of assessment construct, many well-intentioned and hard-working teachers deflect responsibility/accountability by making claims that reading instruction and achievement are not measurable. The comparison to math is also unfair. Math is more about process, and there are many ways of arriving at the "correct" answer. In the process, some kids, mostly underrepresented, underserved, and students with disabilities, are left behind.

I am not arguing for more testing and/or for more test grades. In fact, I argue for more authentic assessment. So here is the Gordian knot of reading instruction. Time and opportunity are our greatest resources in this life. When in doubt, measure it so you don't go too far before you can correct the path that lies ahead.

Bernadette
Dec 09, 2018 03:20 PM

Thank you, Tim, for your expertise. You mention that main ideas and inference, for example, are question types, not skills. I’ve been presenting them as skills to my graduate students. Now I’m wondering what language I should use when we discuss supporting learners in making inferences and picking out the main ideas. If I omit the word skill, what’s a more accurate term? Thanks so much for your guidance. I learn a lot from you.

Leslie
Dec 09, 2018 03:29 PM

I respectfully disagree with you on this one. As a former ELA teacher, my breakthrough in helping students reach a deeper, more effective level of text comprehension started with my ability to target the areas (standards) where they struggled. Without data-driven targeted instruction, I was taking a shot in the dark regarding my students' struggles. I 100% agree that isolated instruction doesn't get to the heart of a student's ability or struggle, but it definitely helps to inform practice and allows teachers to more effectively support students with their deficits. It also allows teachers to encourage students with specific areas of strength. I want to encourage teachers to use all resources to improve instruction. Identifying and asking specific targeted questions helps, and it gives struggling students a chance to find and leverage their areas of strength to make improvements. A coach would never tell a player to focus on speed or agility alone, but using targeted skills and our strengths certainly gives leverage to improve the intangibles. I think targeted practice is a long overdue approach to support reading instruction.

Lisa Butler
Dec 09, 2018 05:13 PM

The more I work with students with reading deficits, the more suspicious I am that background knowledge is the key ingredient that is missing from a student's ability to comprehend. I begin to suspect that when a student, especially, but not limited to, a low SES student does not understand subtleties in text, it is because they cannot see beyond the surface level of the text. Students that have no idea what a common idiom or simple historical reference means will be impaired in their ability to comprehend grade level material. Not even understanding what they don't know, they read through it and are surprised to find out they don't "get it." Unfortunately, I believe this phenomenon can be traced to not reading for pleasure. Whether the reason is a lack of skill, interest, or time spent on screens of all kinds, when you find that you are explaining the simplest concepts, it takes away the ability to work on the finer points of comprehension. Unfortunately, this is not something that is easily remedied or taught as it requires a more comprehensive education, beyond the teaching of how to discern the main idea.

Nancy
Dec 09, 2018 08:19 PM

Bernadette - Maybe teach these as "thought processes" instead of "skills". Have your students closely read by analyzing the writing style of the author to see how the text is organized and how ideas are presented. If an idea can be inferred from a paragraph, have students go back and highlight which sentences in the text helped them to create the inference. An author of informational/expository text should write in a reader-friendly manner by putting a main idea statement at the beginning of a paragraph and then filling the rest of the paragraph with supporting details, for example. Conversely, an author might present the details first and then conclude a paragraph with the main idea. It is up to the teacher to carefully preview the reading material, perhaps create various question types, then engage the class in attempting to locate answers within the text while looking for signal words, using graphic organizers, and finding the underlying text organization.

Timothy Shanahan
Dec 09, 2018 11:32 PM

Pat--

The reason I didn't give an age group on those other aspects of instruction is because research hasn't identified particular ages for those things. Reading comprehension strategies, for instance, have been found to be effective in grades 1-12 (and if you accept listening comprehension work to be close enough, then K-12). Didn't seem necessary. PA and phonics have a short shelf life with regard to developmental teaching. They have clear and significant payoffs for learning early on, but not later.

tim

Timothy Shanahan
Dec 09, 2018 11:37 PM

Tony-

What I described is true for all standardized measures of reading comprehension. It has to do with both the nature of reading comprehension and how tests are constructed. Reading comprehension tests have repeatedly been found to measure only one thing (not a plethora of skills), and question variation has been found to explain no variation in performance on comprehension tests (the passages make a difference, the questions not so much). Students do not perform differently on different kinds of questions.

You seem to think this is a criticism of reading comprehension tests. It is not (which is why the test makers agree with me on what I'm saying). These tests do a fine job of measuring reading comprehension. They do not, however, measure how well students answer particular kinds of questions, because there are no psychological differences among those questions. The comparison with math is absolutely appropriate, given that the arithmetic operations in the example are clearly learned in a particular sequence.

tim

Timothy Shanahan
Dec 09, 2018 11:39 PM

Leslie--

The tests don't work that way. Your kids didn't do better because you taught targeted skills. You might have gotten them to read more, you might have been tougher in getting them to pay attention to the content of the texts, but research is very clear that kids do not perform differently on those different kinds of items-- the differences that you are seeing are just noise.

good luck.

tim

Donna
Dec 10, 2018 09:04 AM

Mr. Shanahan,
I work at a private high school for students with significant LD. A few of my students are reading at K-2 reading levels. Their skills in the foundational areas are very weak. Do you feel that it is too late for their reading to improve much if I give them intensive instruction in phonological awareness, most specifically phonemic awareness?

Tim Shanahan
Dec 10, 2018 04:38 PM

Donna- no, I don’t think it is too late, but usually in these situations progress can be very slow and will require a lot of work (and motivation).

Tim

Evette
Dec 11, 2018 11:50 PM

I agree with you wholeheartedly about teaching comprehension. My only question is that I feel just as much reading comprehension takes place in the questions. Reading becomes strategic at the secondary level. Yes, I need to teach comprehension, but I can’t ignore the questions as part of the text. If they don’t understand the question, they will most likely answer it wrong. Reading comprehension is in the questions too. I’ve been a secondary reading specialist for the past 16 years. I have found great benefit in teaching students to read, treating the questions as part of the reading, and giving them a strategy to help answer the questions correctly. Thank you, and I love reading your articles.

Brian Spivey
Dec 12, 2018 12:06 AM

We, in the world of education, are still looking for a magic bullet. There is nothing wrong with test prep, as long as the preparation includes skill building, focusing on the many skills that students need to practice in order to be proficient in the grade-level standards.

Paul de Maat
Dec 12, 2018 08:51 AM

Thank you for this really interesting reading. You write: 'ACT studied it closely and found that question types didn’t determine reading performance.' I am very much interested in this study. Where can I find it?

Tom meskel
Dec 13, 2018 10:24 AM

Ditto on the fetish to describe all of math instruction as "3 times 9." (Once the student knows "3 times 9" - does he/she know when this knowledge is needed?) Having the students construct "good test items" themselves can be a very effective device for "test preparation" (if that is indeed your cup of tea). It also helps them think a bit about what they know well - or "kinda know."

Laura
Dec 19, 2018 05:34 PM

I totally agree with you. I am struggling with a related issue. I recently took a position as a literacy coach at an elementary school. At this school teachers have professional learning community meetings each week. At these meetings they are required to choose a standard that a lot of students struggled with on an assessment given the previous week and come up with instruction and activities for the "red", "yellow", and "green" students to do in the 30 minutes of remediation/enrichment time that will occur each day the following week. Each teacher "takes a color" and works with the students "in that color" for the week. They rotate math and ELA standards from week to week. It is not surprising that even though the groups are set up to be flexible, many times students remain "in the same color" from week to week (though not always). As a literacy coach, I am struggling to support this instruction, particularly when RL and RI standards have been chosen. I feel like addressing weaknesses in this manner is somewhat fragmented. It seems to me that this time would be better spent teaching students strategies (such as comprehension monitoring, summarizing, or visualization) or morphology, that sort of thing, which can't usually be accomplished in five days. Students with considerable delays also get a 15-20 minute intervention time 3-4 days/week, and all of the teachers also do whole group instruction and guided reading groups each day. I am wondering what your thoughts are on this way of doing things. Citing research studies, if possible, to support your position would be greatly appreciated.

Lindsey Lush
Jan 10, 2019 10:55 AM

This issue is very pressing for me right now. I am being instructed by admin (we're talking weekly "data meetings", created "data binders", county-level trainings... data ad nauseam) to use diagnostic results to inform my reading and math instruction.

You say, "Question types are not skills (e.g., main idea, supporting details, drawing conclusions, inferencing)" and that the key is authentic reading instruction - which I enthusiastically agree with and am fighting daily for the time and space to provide in my own classroom - but based on your research and the research you've read, is it then unreliable to use testing data to support individualized skill focus? Our benchmark standardized tests provide a very detailed list of skills the assessment claims to have determined each student has mastered and is ready to learn.

Should this shape our reading instruction? What I mean to say is, should this shape our authentic reading instruction, such as choosing texts and strategies that support skills the test has supposedly determined students are ready to learn? I am interested in your take on this practice as I am trying to decide my own stance. Thanks!

Bernard
Feb 12, 2019 02:37 PM

Awesome Blog Entry! I wholeheartedly agree!

SAM BOMMARITO
Feb 13, 2019 12:44 PM

I SOOOOOOOO agree with you on this one. My experience is that once you've done enough practice to familiarize the students with the nuances of a particular test, it is best to focus the rest of the time on just what you suggest. Teach them to read. Normally the familiarizing part took a week or two, give or take. It dealt with such things as whether or not to guess if you don't know (some tests penalize for that, some don't) and how multiple choice tests often work (correct answer plus three foils). On ones where you don't know the answer, start asking which choice is wrong. Then guess (if appropriate) from what's left. Getting rid of the ones you think are wrong improves the odds your guess might be right. I even had them make some of their own questions using the right answer plus three foils and share them with each other. I reminded them that first impressions are usually right, so go with it. Only use "boil down," my name for the technique I described, if you are sure you don't know. My logic about spending so little time on test taking practice was simple. Once they were familiar enough with the nuances of the test (which I thought might give a little score boost), there would be a very long plateau of no improvement. The test was accurately reporting that they had gained no new knowledge of reading. I've seen places where the instruction for the entire school year was effectively taken up with test practice. Sad. And as you already guessed, very low test scores. Overall I think you got this one totally right.

Vicki Taylor
Feb 14, 2019 07:17 PM

The way I suggest teachers look at reading comprehension (in upper elementary) is with the analogy of a carpenter. The carpenter has tools he/she uses to craft beautiful works of art. In reading, the tools are the skills the standards give you to unpack a text. The goal is not to master the standards, but rather to be able to understand content (informational) or appreciate fine literature at the deepest level. This is done WITH the standards or our tools. Some people think the goal is to master the standards because that is what we are "hired by the state to do." I don't agree with this. We are hired to teach children to read for the purpose of learning information or appreciating the aesthetics of literature, which is our "piece of beautiful furniture." We can't get sidetracked or fixate on the standards solely because then we often use substandard text (test prep materials) to teach the skill, which is often not engaging and turns children off to reading. We need to have engaging text to begin with and then show students how to use the right tools to unpack it--rich text is key!
