A few weeks ago public education advocate Lori Kirkpatrick wrote about the STAAR results Dallas ISD recently published. The piece noted how testing expert Dr. Walter Stroup has explained that the expanded STAAR scale allows for the appearance of significant gains. The District has been promoting big gains for DISD, but we wondered how they were coming to this conclusion.
After some digging, we found a huge disconnect between the facts and what the District and the media presented! The District actually does not know whether students have grown, nor does it know whether the quality of instruction has improved. The only thing it has is scores with no context.
We reached out to DISD for answers to the two questions posed. Below in black is the email sent and in blue is the reply received from the District's Evaluation and Assessment Department.
To Whom It May Concern,
At the May 9, 2019 BOT Briefing, DISD Administration presented a table of STAAR growth for 5th grade reading and math from 2017-2019. I've attached it for your reference. I wonder if your department has done any further analysis on that data, particularly the following:
In terms of number of items correct, what does this growth difference represent?
It is not appropriate to use items correct for determining growth for students. What the state uses is the scale score. The point of the scaling is to allow for differential % correct and still be equivalent. One would need to look at the average scale score from year to year for each test and subject which collapses the performance standards. These outcomes would then be compared to the state. We have started looking at that but do not have any report to provide at this time.
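The District's point about scaling can be sketched with a toy example. The numbers below are hypothetical, purely for illustration (they are not actual STAAR data): two test forms of different difficulty can yield different items-correct counts yet equivalent equated scale scores, which is why raw items correct cannot be compared across years.

```python
# Hypothetical numbers for illustration only -- not actual STAAR data.
# Year-to-year test forms differ in difficulty, so raw items correct are
# not comparable across years; equated scale scores are designed to be.

# Suppose the later test form was harder than the earlier one:
raw_correct = {"year1": 30, "year2": 27}       # items correct out of 40
scale_score = {"year1": 1540, "year2": 1540}   # equated scale scores

# Raw scores suggest a decline...
raw_change = raw_correct["year2"] - raw_correct["year1"]

# ...but the equated scale scores show equivalent performance.
scale_change = scale_score["year2"] - scale_score["year1"]

print(raw_change, scale_change)  # -3 0
```

This is exactly why the District says one must compare average scale scores year to year, and then compare those trends against the state, before claiming growth.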
What fraction of the variance is sensitive to differences in instruction?
We continue to research the relationship between quality of instruction and student outcomes; however, we are unable to provide a specific number that relates the variance to teaching/differences in instruction.
Thank you in advance for your consideration.
So there it is: the District has not yet done the analysis that would confirm or deny progress. Didn't you get the impression from the charts that DISD is making progress? Maybe we are, but the charts presented do not tell us that.
This is a huge problem! Why are the District, Trustees, and local media presenting these charts as if they're meaningful?
Please note that in the reply the District states that in order to "[determine] growth for students...one would need to look at the average scale score from year to year for each test and subject...These outcomes would then be compared to the state." You mean, like the analysis done here with the ACE program, which showed NO SIGNIFICANT improvement in outcomes?
Education analysis is complex and should be thoroughly evaluated in the proper context before being presented to the community.