’Tis the Season of Test Prep: Bah Humbug

  • 14 December, 2024
  • 18 Comments

Blast from the Past:  Usually, “Blasts from the Past” are re-postings of earlier blogs, with minimal revision. This one is a bit different. This time I’ve combined and revised two earlier postings (December 8, 2018; January 19, 2019). Each year, I receive numerous requests from teachers seeking ways to prepare their students to excel on the accountability tests or to resist their school district’s pressure to do a lot of test prep. Although it is only December, those letters have already started to come in. I think this is the earliest ever!

Teacher question:  

We’ve been given a directive to provide plans/resources to “support schools in preparing students for these high-stakes tests.” Our ELA team is discouraged by this ask. We joked that we’d create a document that just said, “stay the course, and teach word recognition to those who are not yet reading at grade level.” The district is looking at data to determine areas of need. This leads us down a slippery slope that often ends with schools forming main idea reteaching groups, inference groups, etc. What advice do you have for us?

Shanahan response:

It’s been a while since I’ve gotten up on this soapbox. 

Many consider this “the season to be jolly,” but for schools the kickoff for heavy test prep is soon to begin.

Your district wants to “use the data” or to create “data-driven classrooms.” They have been told, without evidence, that this approach will allow them to shine on the annual accountability tests.

I appreciate the hopefulness behind this practice, but I have one small concern…. The fact that it doesn’t work….

These so-called test score improvement experts who promulgate these ideas don’t seem to mind that their recommendations contradict both the research (e.g., Langer, 2001) and successful educational policy and practice.

Their “theory”—and it is just a theory—is that one can raise reading scores through targeted teaching of specific comprehension skills. Teachers are to use the results of their state accountability tests to look for fine-grained weaknesses in reading achievement—or to identify which educational standards the kids aren’t meeting.

This idea makes sense in math, perhaps. If kids perform well on the addition and subtraction problems but screw up on the multiplication ones, then focusing more heavily on multiplication can work.

But reading comprehension questions are a horse of a different color. There is no reason to think that practicing answering particular types of comprehension questions would improve test performance.

Question types are not skills (e.g., main idea, supporting details, drawing conclusions, inferencing). In math, 3x9 is going to be 27 every doggone time. But the main idea of a short story? That is going to depend upon the content of the story and how the author constructed the tale. In other words, the answer is going to be different with each text.

Practicing skills is fine, but if what you are practicing is not repeatable, then it is not a skill. 

The test makers know this. Look at any of the major tests (e.g., SBAC, PARCC, AIR, SAT, ACT). They will tell you that their test is based upon the educational standards or that their questions are consistent with those standards. But when they report student performance, they provide an overall reading comprehension score, with no sub-scores based on the various question types.

Why do they do it that way?

Because it is impossible to come up with a valid and reliable score for any of these question types. ACT (2006) studied it closely and found that question types didn’t determine reading performance. Texts mattered, but question types didn’t. In fact, they concluded that if the questions were complex and the texts were simple, readers could answer any kind of question successfully; but if the questions were simple and the texts were hard, readers couldn’t answer questions of any type.

Reading comprehension tests measure how well students can read a collection of texts—not how well they can answer different types of questions. 

If your principal really wants to see better test performance, there is a trick that I’m ready to reveal here.

The path to better reading scores? Teach kids to read. 

It works like magic.

Kids don’t do well on the tests because we don’t spend enough time on the things that actually make them proficient readers. Most American elementary schools these days pride themselves on their 90-minute reading blocks… but much of that time is devoted to activities that do little to promote reading ability. Kids are supposedly reading independently or doing shut-up-sheets while the teacher works with other kids.

I’d love it if instead of a 90-minute block, we’d commit to providing 90 minutes of teaching and guided practice to each child each day. That might take more than 90 minutes to deliver, but it would sure give kids a better chance to become proficient.

In my schools, I required 120-180 minutes per day of reading and writing instruction. I know that’s a lot, but it is achievable in most schools if they ditch the test prep and the reading activities that don’t contribute much.

This instructional time should be devoted to explicit teaching and guided practice aimed at developing knowledge of words (including phonemic awareness, phonics, letter names, spelling, morphology, vocabulary); oral reading fluency (accuracy, automaticity, prosody); reading comprehension (written language, strategies, knowledge); and writing (transcription, composition). And, for English learners (and perhaps poverty kids too)—explicit oral language teaching.

Too many teachers think kids would be better off reading on their own than working with a teacher because “reading is learned by reading.” Kids do need to read, but that practice is best included within reading lessons rather than set apart from them. I encourage devoting at least half of the instructional time to having students actually read and write.

In a reading comprehension lesson, there will be teacher-led demonstrations and explanations and guided discussion, and so on—but the students should also be reading text. The same is true for decoding; during a big chunk of that instruction kids should be decoding and encoding words.

In grade 2 and up, students spend too much time working with books they can already read reasonably well. There is no such thing as an “instructional level” in reading, at least beyond first grade. Teaching kids at their supposed “reading levels” hasn’t been found to facilitate learning, but it does lower the sophistication and complexity of the content and language they get to work with.

We do too little to develop students’ reading stamina. Oh, I know that some are proud that they use books instead of short stories to teach reading, or that many assign extended silent reading. But those tend to be sink-or-swim propositions. Kids would be better prepared for tests (and many real reading situations) if there was an intentional regimen of stretching how long they can persist in making sense of texts. For many, having to read an extended fourth-grade selection silently to answer questions doesn’t go so well since they’ve never done anything that demanding before.

Lack of a knowledge-focused curriculum is an important culprit, too. Science and social studies aren’t given enough time in elementary school (and the value of the literature may be suspect, as well). Kids should get daily work in those subjects, and those lessons should include the reading of content text.

Nothing very exciting here, right?

If you want higher test scores, it takes a lot of dedicated teaching of the key things that matter in learning. Nothing sexy about it. Yet too few kids get those things and test prep is not a replacement. Focus like a laser on what works, and your kids will do better. 

References

ACT. (2006). Reading between the lines. Iowa City, IA: ACT, Inc.

Langer, J. A. (2001). Beating the odds: Teaching middle and high school students to read and write well. American Educational Research Journal, 38(4), 837-880.

Shanahan, T. (2014). How and how not to prepare students for the new tests. The Reading Teacher, 68, 184-188.

Shanahan, T. (2015). Let’s get higher scores on these new assessments. The Reading Teacher, 68, 459-463.



Comments


Carol McKinley Dec 14, 2024 04:02 PM

Our school is piloting thinkSRSD, a writing program that explicitly teaches, models, and practices responding to high-level prompts for texts that have been previously read from the ELA curriculum. Our third grade teacher (who has been doing thinkSRSD from the beginning of the year) now wants to switch to preparing for the state tests. She plans on continuing with the same weekly thinkSRSD format, but using text sets specifically written for test prep by Wonders. However, our curriculum director wants her to continue using the CKLA texts to respond to high-level prompts. My questions: "Does it matter which texts she uses? Is it better to continue reading grade-level complex texts and practicing how to plan, organize, and write effective responses to high-level prompts? Or is it better to use prepared text sets, presumably grade-level and complex, and practice how to plan, organize, and write effective responses to high-level prompts?"

Timothy Shanahan Dec 14, 2024 04:19 PM

Carol--
I don't think we know the answer to that (I know of no data on it). Although I'm not a fan of most prep, teaching students to be responsive to a writing prompt has value beyond testing (even professional writers are given writing assignments to which they must respond appropriately). In this case, I would probably make my choice on the basis of (1) which texts are the highest quality in terms of content, writing, and level -- yes, I favor grade-level text in terms of length, linguistic complexity, and appropriateness of content -- and (2) which scheme seems most likely to guide students to become independent in their ability to respond to a prompt -- which one will end up with the kids gaining enough metacognitive awareness that they will be able to negotiate a prompt on their own without external support.

tim

Dr. Bill Conrad Dec 14, 2024 04:27 PM

Test prep combined with bubble-kid identification is the worst form of educational malpractice. But malpractice never deters educators who focus on self rather than service to students and families!

Timothy Shanahan Dec 14, 2024 04:52 PM

Billy--
I wish I could disagree with you on that. I wonder how many teachers and principals recognize that those approaches are doing just that -- putting the educators' needs/desires over the kids' needs?

tim

Lauren Dec 14, 2024 06:12 PM

I work with elementary students. Students are given the CAASPP test in grades 3-5. I just keep thinking that these tests are not developmentally appropriate for elementary students. They are really difficult. Isn't it in some ways a fool's errand to keep banging our heads against the wall with these tests year after year? The students are reading on grade level, but they keep failing these tests. Something is off here. I heard a quote that I liked stating that America is not a great nation because we are great test takers. We are a great nation because we are great innovators and creative thinkers. Long ago, state tests were pretty short, and they provided a snapshot into how schools were performing. Now these tests go on for weeks and they expect nine-year-old students to write college dissertations... It sounds like someone who does not know much about how young students learn came up with the content of these assessments. I think the priority should be interesting and engaging lesson content for students. Focus on these tests is hurting students.

Cindy Wilson Dec 14, 2024 08:05 PM

Thank you for saying ALL of this! Schools in Arkansas are currently throwing everything at the wall to see what sticks in the name of raising test scores. While some things we did with balanced literacy instruction may not have been the most sound (lack of phonics instruction, for example), I feel like some components were valuable in helping kids become better readers, as well as lovers of reading: books, authors, genres, etc. I don't see enough books in kids' hands, nor are they given time to read the ones they are able to access.

Harriett Janetos Dec 14, 2024 10:00 PM

"Although I'm not a fan of most prep, teaching students to be responsive to a writing prompt has value beyond testing (even professional writers are given writing assignments that they need to respond to which they must respond appropriately)."

I agree! I felt that "preparing" my third graders for the CAASPP was a good use of time because it simply meant that they were reading grade-level articles on a single topic and synthesizing a response to a prompt using evidence from the articles. "Writing to read" (as Steve Graham reminds us) has enormous value since it improves both reading and writing.

"Writing to Read is a new Carnegie Corporation report published by the Alliance for Excellent Education which finds that while reading and writing are closely connected, writing is an often-overlooked tool for improving reading skills and content learning."

https://www.carnegie.org/publications/writing-to-read-evidence-for-how-writing-can-improve-reading/

Lynne Ord-Oraniuk Dec 14, 2024 11:53 PM

As a school leader, I take your advice to heart and implement the practices suggested. In Australia we have four categories in our national test: Exceeding and Strong are the benchmarks we are measured on, and the other categories are Developing and Needs Attention. In my previous school, 76% of the students were EAL, and it was a socially and economically disadvantaged school with most students not attending preschool. For the past two years, the writing program has been developed to build background knowledge in content areas (science and social studies) through reading literature as well as quality narratives. The students are taught sentence-level work as part of the writing process, which also helps them as readers. This year 82% of the students in Year 3 were in the Exceeding and Strong categories, which outperformed the state of Victoria by 4%. This is a small sample size, but all we did was teach, apart from an on-demand practice. Like you said, quality teaching will take care of the tests, and it also encourages kids to write about things that matter to them, not us. Thank you for your weekly blog; it is simply the best!

Heather Baker-Sullivan Dec 15, 2024 03:04 AM

I teach 8th grade. I developed a sequenced reading and writing activity for informational text where students copy the text features down word for word (the title and any subheadings) and describe any photos/images; infer the main idea in RACE format (with 2 supporting details from the text features); read and annotate the text using signposts; summarize each section, using the subheadings to support each sentence of an overall summary paragraph; and wrap up with a main idea inference supported by text details. Does this seem a useful approach to you?

Timothy Shanahan Dec 15, 2024 01:57 PM

Heather--
Yes, that can be useful. It has much in common with techniques like SQ3R, THIEVES, and SOAPSTone. They provide students with a scheme for guiding their own efforts at comprehension. These techniques tend to focus mainly on the structure of the text -- which is great for texts the students can read reasonably well -- it increases the chances that they'll stay focused and remember information. However, none of these helps kids deal with the linguistic and knowledge barriers that texts often present -- what if I'm having trouble with decoding fluently, with vocabulary, syntax, cohesion, memory, etc.? Useful, but limited.

tim

Timothy Shanahan Dec 15, 2024 01:59 PM

Lynne--

If the best way to ensure test improvement is to teach reading well, then it is important to remember that teaching reading well includes lots of writing and writing instruction.

tim

Bri Dec 15, 2024 04:32 PM

How do you suggest we assess whether students have mastered the content standards (middle school level)? Our school follows the PLC process, which has a heavy emphasis on CFAs (common formative assessments). I previously taught math and have plenty of experience developing CFAs and using them to intervene and track mastery with success. I am struggling with how to apply this practice to secondary ELA. At the tier 1 level, we are currently tracking mastery of standards through a spreadsheet. We are noticing that certain students are struggling with central idea while others are struggling with inferencing (some struggle with both). Some of these students are receiving intervention in a variety of areas (decoding, vocab, fluency, etc.), while others are just struggling in specific areas. We use test-like passages and questions to assess and then work in small groups. I'm unsure how else to help when there are specific standards that students are struggling with. Would love to hear your suggestions, as we are clearly going about this wrong!

Gale Morrison Dec 16, 2024 11:47 AM

Brilliant! Well done

Gretchen G. Dec 17, 2024 02:35 PM

I have been a teacher in Mississippi for 25 years, and I am proud to say that I am part of the "Mississippi Miracle"....however, it really isn't a miracle that our reading scores have moved like they have. For 10+ years now, our state has moved toward explicit instruction in reading. And it has worked. There is still way too much test prep and bubble kid stuff for my liking, but we are moving in the right direction. I am so glad you are a voice of reason for what actually works in reading!

Billy Conrad Dec 17, 2024 09:24 PM

Billy?

Chris Vander Ark Dec 16, 2024 01:35 PM

Tim--thanks for your thoughts on all of this. I'd be interested in your ideas on what a structured reading instructional time looks like in high school, where many of our students read far below grade level. Your description of a focused reading lesson seems to describe an elementary classroom.

With the ACT coming up, we do plan to spend some time once a week for several weeks on a reading passage, with students reading the passage and answering the questions. For us it's good focused practice for them--the ACT is an excuse to do the focused reading. Again, grateful for your take on our planned approach.

Timothy Shanahan Dec 18, 2024 07:48 PM

Bri--
Reading isn't like math. In fact, when it comes to reading comprehension, for the most part, you CANNOT test the standards individually (they don't work like that). Instead of trying to test individual standards, it makes more sense to assign grade-level texts to students and to have them answer questions about those texts or write a summary of what the text said. It is okay to use the standards to frame those questions, but if a student gets such a question wrong, it tells you nothing about that skill. Students do not respond to such question types in a reliable manner, and there are many reasons a student might get any question wrong (including decoding problems, vocabulary limitations, lack of fluency, weak grammar/syntax skills, etc.).

tim

Gaynor Dec 19, 2024 10:20 PM

Being the same age as Timothy, I have observed over many decades the new fads that, in my view, destroyed all education. These include dropping spelling, not checking every elementary child's oral reading every day, no grammar exercises, and no rote learning of tables and basic arithmetic algorithms -- all 'old school' stuff. Formal 'testing' was not done at elementary levels; instead, revision, consolidation, and reinforcement were used daily. For example, the silent 'e' rule was taught and then revised intermittently for weeks afterwards. Cognitive science now supports this.

