A child’s academic progress is every parent’s concern; tracking that progress is a fundamental responsibility of our schools. Good educational data and metrics, paired with evidence-based instruction, can change the outcome of a child’s life.
For a child like our son, who has Down syndrome and does not take Texas’s suite of standardized tests, that data is largely missing. If it exists at all, it’s defined one student at a time, in a special education student’s Individualized Education Program, or IEP. Objective data on the academic progress of groups of special education students who don’t take standardized tests doesn’t seem to exist at all. Without it, evidence-based choices about instruction, services and materials can’t be made.
Until recently we thought the lack of data was due to a lack of funding. Now, we’re pretty sure it’s not.
In November 2015, we began meeting with the superintendent of our local school system to discuss how our proposed donation of $500,000 might be used to collect this data and improve the outcomes for kids in special education. We weren’t naïve enough to expect instant acceptance and implementation of the proposal, but we did hope the district would move quickly to embrace the idea and the money. It didn’t work out that way.
Curiously, districts across Texas seem to avoid objective IEP goals, opting instead for squishy goals like, “With three prompts, Mary will correctly answer questions about a passage with 70 percent accuracy three out of five times.” These appear objective until one thinks critically. Who will assess Mary? When and how? Are the answers coming from Mary or the person prompting her? The resulting information is less meaningful than data collected under a goal like “ABC standardized reading test will be administered to Mary every 10 weeks. Mary will independently and correctly answer 80 percent of reading comprehension questions on 2nd-grade-level passages.”
Consider a reading class with ten students using two methods for improving comprehension. With good, data-driven IEPs, parents would get solid data about their individual child’s academic progress. And with information about the entire cohort (i.e., ten kids, two methods and the measured academic progress for each child), administrators would gain valuable insight into the efficacy of the instruction and methods. If they shared that information, parents would know, too. Maybe no one made progress; maybe one method produced better outcomes than the other. A big plus to this approach: All IEPs would require objective, measurable goals.
We had worked with the superintendent to identify a worthy project that had never been done; we were passionate and knowledgeable. Yet a year into what had seemed to be productive negotiations, the district returned to a prior position we had repeatedly rejected, asking us to donate the money with no continuing input to make sure it stayed true to its purpose. After a great deal of thought, we withdrew our offer. The district Board of Trustees never put the possible donation on a meeting agenda or discussed it publicly. The district seemed intent on avoiding the donation; the question was “why.”
There’s no state mandate to collect group academic data for kids who don’t take standardized tests. Data is a double-edged sword: the same data that shows areas for improvement also illuminates failures. There are legal remedies for failure; parents can take the district through “due process” and ultimately to court to ensure their child’s access to an appropriate education is preserved.
This puts “do nothing” in the win/win position — a district incurs no cost and there is no district data available to parents that could validate the need for better instruction. Lawyers rarely suggest clients collect and report information that highlights shortcomings when there are defined remedies for failures. And as noted by the Houston Chronicle, districts in Texas expend substantial effort and legal fees denying children access to special education services they’re entitled to.
One measures what one values. Anything measured improves. If we never measure the efficacy of special education, we are not valuing it — nor can we improve it.
After data — collected only at our insistence — proved our son hadn’t made progress in reading comprehension for five years, he was finally given a new reading instruction method. He advanced four grade levels in one semester. This proved transformational: He was just accepted by a college program for people with intellectual disabilities.
Data changed his outcome.
It can change the outcome for lots of kids with special needs if we allow it to be collected and made transparent. In at least one district in Texas, it’s been made clear that this isn’t a matter of money. The money can be found, but the will to collect and report this data is clearly missing.