This week the Thomas B. Fordham Institute released a provocative new report, Social Studies Instruction and Reading Comprehension. It says U.S. schools spend “excessive amounts of time” teaching English language arts (ELA) and not enough time on social studies. In fact, the authors claim that this imbalance is lowering reading achievement: kids will read better if they get less reading instruction and more social studies teaching. I’ve long argued for what they call out here as excessive. To reach the reading levels we aspire to, children need lots of reading and writing instruction (and, at least for English learners, lots of time for oral English work as well). This report is out of step with what I believe to be the preponderance of evidence on the matter.
Admittedly, parts of this report rankle… particularly when causal effects are attributed to correlational data (for those of you not fluent in statistics – this simply means they make claims not justified by the data they have analyzed). The real surprise for me – given the tenor and reasoning of the report – is that I fully agree with their conclusions.
These days people are so ideological that they go blind when something supports their view and can see nothing but the devils in the details of any evidence to the contrary. My goal is not to undermine the report so much as to point out the flaws in its reasoning and the inadequacy of its evidence. This is important because these kinds of “hooray for our side” claims plague pedagogy. Such arguments do as much to warn people off research as the flat-earth science deniers who champion any data that support their view and dismiss any equivalent evidence that goes the other way.
I also hope to look at these data in some ways that might be more persuasive and useful for those who make these timing decisions.
My first problem with the report: Never send correlational data to do a causal study’s job (and certainly don’t claim that the use of multiple correlations makes it a causal study… oh, Doctor).
Despite the causal claims put forth in this report, such as “Increased instructional time in social studies—but not in ELA—is associated with improved reading ability”, the evidence just doesn’t match the assertions.
What does that claim make it sound like they studied?
It sounds to me like they tested students’ reading and then had teachers either increase their reading instruction or their social studies teaching, or just keep things as they were, and then, after a while, tested the kids again to find out how much reading improvement could be attributed to each of these three approaches. It would be even better if they had tested these students’ social studies knowledge to be sure the increases in reading were attributable to the learning that resulted from the extra social studies teaching. Such a study, if well implemented, would offer strong evidence for increasing social studies to make better readers.
But nothing of the sort was done here. Instead, teachers were surveyed about how much instruction their schools offered each year and in fifth grade the students were tested in reading. The students who scored best in reading tended to be enrolled in schools where teachers had reported more social studies instruction. These teachers did not even necessarily know how much instruction was provided in subjects that they didn’t teach, but they were encouraged to answer anyway. And, of course, no one measured the impact of increasing social studies teaching since no teacher increased the amount of social studies teaching.
The study neither observed any of this instruction, nor evaluated the learning that came from it. This investigation aimed to support the theories of E.D. Hirsch (and others) on the importance of cultural knowledge in reading comprehension. I’ve observed a lot of elementary social studies instruction in my time and I wouldn’t necessarily expect it to lead to big gains in “knowledge” (especially in the primary grades). But my skepticism aside, for this theory to be supported you’d need to show that the social studies teaching increased student knowledge about social studies, that the students simultaneously improved in reading, and that this social studies knowledge was instrumental in the reading improvements.
This research didn’t even review the passages used to measure reading comprehension to see how aligned the measure was with the social studies knowledge that was supposedly leading to these higher reading scores. If the reading test passages were about social studies content, then I wouldn’t be surprised if social studies instruction had an impact. But what if these were fictional passages or passages about science? Then what happens to these claims about the potency of social studies instruction?
A peculiar finding in this study was that there was no connection between the amounts of ELA, math, science, arts, or physical education instruction and student reading levels. The authors wave that problem off with the claim that cultural knowledge comes only from social studies and science classes, and then claim that science vocabulary might be too technical to translate into improved reading. That interpretation not only requires some very uncomfortable gyrations (just reading it made my back hurt), but it certainly turns Hirsch’s theory to thin gruel (knowledge is important, but only knowledge about social studies). A simpler conclusion would be to recognize the greater likelihood of finding one meaningless significant correlation among seven independent comparisons.
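To put a number on that simpler conclusion: with seven comparisons each tested at the conventional .05 significance level (that threshold is my assumption; the report doesn’t state one), the chance of at least one spurious “significant” result is about 30%. A quick sketch:

```python
# Chance of at least one false positive among k independent tests,
# each run at significance level alpha. The alpha = 0.05 threshold is
# an assumption (the conventional level), not taken from the report.
alpha = 0.05
k = 7  # ELA, math, science, arts, PE, social studies, non-core
p_at_least_one = 1 - (1 - alpha) ** k
print(round(p_at_least_one, 3))  # 0.302 -- roughly a 30% chance
```

In other words, even if subject-matter time had no real effect on reading, a design like this would hand researchers one “significant” subject about a third of the time.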
Another possible explanation for these odd results would be that whatever constellation of conditions led some schools to offer more social studies teaching was also what led to higher reading achievement. For instance, they noted that private schools had higher achievement and more social studies teaching. The researchers wisely corrected for some of these differences statistically, but there’s a reason why such correlational certainty often evaporates within the context of real policy implementation.
This is the kind of study that should encourage researchers to test out its recommendations; to experiment to see if the promised benefits result. Only then should policymakers take it as a call to action.
Some of the data here don’t add up either. According to the U.S. Department of Education, the typical elementary school day – minus lunch and recess – is 6.5 hours, or 390 minutes. The Fordham report separates the data by percentile of the amount of teaching reportedly provided in each subject. For example, it identifies the top 10% of schools that claimed to devote the greatest amount of time to each subject. A school was in that 90th percentile column if it offered about 2.5 hours per day of ELA. That same school may or may not have been included in the 90th percentile for social studies.
I found it interesting to imagine a highly academic school that managed to crack the top 10% for all the subjects. You know, a school with a lot of reading and writing, science, math, social studies, music, and so on. You’d think that such a school would be a rare beast. It seems like it may even be an impossibility, since all the subjects are competing for the same pool of time. More ELA time must, as these authors conclude, lead to less social studies time. Schools with lots of art would probably end up with little math or science.
Except that isn’t what you find.
If all schools delivered daily instructional amounts sufficient to place them at the 90th percentile for each and every subject, we’d still have about 30 minutes of unaccounted-for time each school day (shifting those minutes to social studies alone would be enough to move a school from the bottom 10% to the 90th percentile in amount of social studies teaching).
Even more interesting is to look at the medians, that is, the typical school’s time allotments. A school that offered median amounts of ELA, math, science, social studies, and non-core subjects would still leave about an hour and a half of instructional time unaccounted for each day.
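The bookkeeping can be sketched as follows. The 390-minute day comes from the Department of Education figure above; the per-subject median minutes below are hypothetical placeholders chosen only to illustrate the arithmetic, not the report’s actual figures.

```python
# The 390-minute day is from the U.S. Department of Education figure
# cited above; the per-subject medians are HYPOTHETICAL placeholders,
# not the Fordham report's actual numbers.
school_day = 390  # 6.5 hours minus lunch and recess, in minutes
median_minutes = {
    "ELA": 120,
    "math": 60,
    "science": 25,
    "social studies": 25,
    "non-core": 70,
}
allocated = sum(median_minutes.values())
unaccounted = school_day - allocated
print(allocated, unaccounted)  # 300 90 -- about 90 "lost" minutes a day
```

Whatever the exact per-subject figures, the point stands: the medians don’t sum to the school day, leaving a sizable block of daily time that belongs to no subject.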
Instead of trying to steal time from reading and writing instruction for social studies, why not use a small amount of that lost time?
Think of it this way: Let’s imagine a family with a $40,000 annual income. They currently spend $10,000 on housing, $10,000 on food, and $5,000 each on healthcare and transportation. Food prices rise dramatically, and they need another $3,000 a year to cover the additional cost. Would you recommend that they stop paying rent or going to the doctor to make up for the shortfall? Or, would you wonder why it couldn’t come out of the $10,000 not being used for the family’s survival?
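The analogy’s arithmetic, using the figures above:

```python
# Family budget from the analogy above.
income = 40_000
expenses = {"housing": 10_000, "food": 10_000,
            "healthcare": 5_000, "transportation": 5_000}
committed = sum(expenses.values())     # 30,000 committed to survival
unallocated = income - committed       # 10,000 left over each year
extra_food_cost = 3_000                # the price increase to absorb
print(unallocated >= extra_food_cost)  # True -- no need to cut rent
```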
Arguing for less time for language instruction to accommodate adequate time for social studies is kind of like that.
But isn’t it peculiar that the amount of ELA time didn’t correlate with reading achievement?
Indeed. And yet there are some possible reasons for that. One problem is the relative amount of variance associated with the different subject matters. In math and reading, relative variance was low (.19 and .18, respectively), probably because schools are explicit about how much time to spend on those subjects. Social studies and science get less attention, so there is much more variation in teacher practice, which increases the possibility of finding a positive correlation with achievement (relative variances for social studies, science, and non-core subjects were .39, .40, and .44, respectively).
Even with that, however, there was certainly enough variation in the amount of ELA teaching to correlate with reading achievement. But here we have the same problem with reading that I mentioned earlier with social studies. We don’t know what this time was used for. In many schools, students’ independent reading time is counted as reading instruction (despite the poor results associated with that practice). The same happens with teacher book sharing, when teachers read chapter books to students. Such practices, although perhaps enjoyable, do little to improve reading achievement, though they do divert a substantial amount of instructional time. The same can be said for all of the worksheets and other activities used to keep kids busy while the teacher works with small groups. None of that independent busywork has ever been found to do much for reading, and yet those activities often take up one-third to two-thirds of the instructional time.
There are literally thousands of studies showing the impact of increasing the amount of instruction on student learning (e.g., preschool, full-day kindergarten, use of time during the school day, afterschool programs, summer programs, homework, days without substitute teachers, years with minimal “Act of God” days, mathemagenic processing, models of school achievement, academic press, and so on). There definitely are exceptions to this overwhelming pattern of results, but they are exceptions that prove the rule (e.g., unmonitored afterschool programs don’t seem to improve achievement, but afterschool programs in which we know teaching is taking place do).
Amount of instruction matters, but what is being taught, how it is being taught, and how the learning is to be measured, all matter in this equation as well (though you can’t tell that from this report).
Frankly, as a profession, we’ve been careless in our safeguarding of children’s instructional time (grabbing at those exceptional cases when amount of teaching doesn’t matter, and at folkloric theories of motivation, as excuses for not maximizing teaching and academic experience). Encouraging the kind of scramble between subjects that this study promotes is not the way to fix that.
Instead, I argue for making every minute count; for providing substantial academic experience with the various arts, sciences, and humanities; and for teaching reading/language with rich literary and informational texts worth reading and remembering. I have no problem with 45 minutes per day of social studies. In fact, I like the idea. Fordham’s notion of how to get there is problematic, however.
Note: I want to thank Michael Petrilli for catching a couple of factual errors in the original blog posting. I have corrected those errors as of October 1, 2020 (omitting one erroneous sentence and replacing one relative standard deviation). Neither change required any revision of the point of view expressed in this blog, and, yet, accuracy matters. The responsibility for the original errors is mine alone.
Copyright © 2020 Shanahan on Literacy. All rights reserved.