Charleston County School District’s literacy academies for its worst readers showed progress again last school year, but the way those results were reported raised a number of questions.
The figures released by school leaders indicate that the extra help being offered to struggling readers is having a positive effect, and officials say they’ve made changes to further improve these programs.
Still, the numbers reflecting students’ performance don’t match other test scores, leaving some to question the data’s accuracy and the programs’ value.
“I don’t know if they were intentionally deceitful or incompetent, but I know the data they reported to the board is bad and useless,” said Jon Butzon, who chairs an education advocacy group, the Charleston Education Network. “And we’re talking about smart people who published this data.”
What does the district’s annual literacy report cover?
After a series of Post and Courier stories showed nearly 20 percent of the county’s ninth-graders read on a fourth-grade level or worse, the county school board set a policy in 2010 making literacy the district’s No. 1 priority. That policy required the district to report annually the number and percentage of students reading below grade level in grades three through eight.
It also required the superintendent to develop and implement a district-wide literacy intervention model to identify and address students with reading problems.
Superintendent Nancy McGinley created literacy academies to work with the district’s worst readers in first, third and sixth grades.
The results in the literacy report released this fall showed how those students fared at the beginning of the 2011-12 school year compared with the end, as well as the percentage of students reading below grade level in grades three through eight.
What did the report say?
In every grade from third through eighth, the district’s figures showed no more than 11.5 percent of students reading below grade level; the district-wide average was 10.8 percent.
First Grade Academies operated in 45 elementary schools and served 677 students. Weak readers were pulled out of their regular classes to receive extra help; some kindergartners and second-graders also were served.
Students in first-grade academies showed the most progress. About 22 percent of participants scored in the 10th percentile or below at the beginning of the year; by the end of the school year, that figure had improved to 13 percent.
Third Grade Academies operated in 11 low-income schools and organized separate classes of students. Among their 226 participants, 28.3 percent scored in the 10th percentile or below at the beginning of the school year, and that improved to 20.4 percent by the end of the year.
Sixth Grade Academies existed in eight middle schools serving 219 students, and they also had separate classes of students. In those schools, the percentage of students scoring in the 10th percentile or below improved by eight percentage points, to 38.8 percent.
Why are some questioning the figures?
Some of the figures presented don’t match other test scores. For example, according to the district’s report, no more than 11.5 percent of students were reading below grade level in third through eighth grades.
But that seems to conflict with state Palmetto Assessment of State Standards scores, which showed at least 18 percent of students at each of those grade levels scored “not met” in English/language arts this year. And it’s a far cry from the 13.1 percent of the district’s freshmen who were reading at a fourth-grade level or worse earlier this fall; the percentage of freshmen who were below the ninth-grade level would have been even higher.
Laura Donnelly, the district’s director of assessment and evaluation, said “below grade level” in this literacy report is defined as students who scored in the 16th percentile or lower.
There’s a range for what’s considered below grade level, and the 16th percentile is the cutoff mark for being significantly below that, she said.
Other figures released by the district don’t clearly show the percentage of students who ended the year reading on grade level. The district showed the percentage of students scoring in certain ranges at the beginning of the year compared with the end, but it doesn’t show how many are on grade level at the end.
“I can’t make any sense of this,” Butzon said. “I can’t argue that progress is being made, but we just don’t know how much. This is smoke and mirrors and spinmanship. I have no idea whether we’re getting what we’re supposed to get out of it.”
Donnelly said the report is a reflection of the way in which educators asked that students’ progress be reported. The report has three tiers, or ranges of students scoring in certain areas, and that’s used in educators’ decisions on who is served by the literacy academies. Students in different percentile ranges receive different help.
What changes could the district make to gauge students’ reading skills?
Officials said the literacy report results don’t match other tests because they measure different skills. For example, the state PASS test measures whether students have mastered certain skills, while Lexile levels are a separate measure of students’ reading ability.
Butzon said the district should be using Lexile levels to describe students’ grade-level reading ability; students either can or can’t read on grade level, and using a percentile isn’t the way to describe that, he said.
“It’s the most precise way to do it,” he said of Lexile levels. “I’m telling you what they chose was even less precise.”
Donnelly said Lexile levels, which were converted into grade-level equivalents and used to determine the percentage of freshmen reading on a fourth-grade level, aren’t precise as a measure for students’ grade levels.
“We’re not comfortable (with that) and we’re forced to do it,” she said. “It’s a rough measure.”
MetaMetrics is the educational measurement organization that developed Lexile levels. It says there is no direct correspondence between a specific Lexile measure and a specific grade level, and any classroom or grade will have a range of readers with a range of Lexile levels, according to its website.
And MetaMetrics doesn’t report students’ abilities as grade-level equivalents because “they are a deceptively simple way to characterize a student’s test score.”
That said, MetaMetrics studied the issue and has published a chart equating Lexile measures with students’ grade levels.
What’s next, and what’s at stake?
Reading in the early grades is key for students’ later success in school and in life.
McGinley said the extra help for sixth-graders was affected by school-level changes, and schools didn’t follow the model for the program.
“We were not happy with these results,” McGinley told the board earlier this fall. “When we compared the model in sixth grade to first and third where we saw bigger gains, we totally revamped the Sixth Grade Academy to parallel the lower grades with a more intense and consistent approach school to school.”
This year, the first-, third- and sixth-grade academies also have been expanded into a Primary Grades Academy and a Middle Grades Academy. That means needy kindergartners through third-graders in 47 schools, and sixth- through eighth-graders in 19 middle schools, are receiving extra help.
The district also has created a new Literacy Based Learning Division with its existing staff so it could ensure a focus on the academies, as well as improve reading instruction in regular classrooms.
Reach Diette Courrégé Casey at 937-5546.