“But so far, said Louis M. Fabrizio, the director of the division of accountability services in the state department of education, ‘it's not helping much at all.’
‘I think many felt that this was going to be the magic bullet to make this whole thing better, and it doesn't,’ he said of using a growth model, ‘or at least it doesn't from what we have seen so far.’”
In Tennessee, the home of value-added models, the picture is not much different, according to Olson's report:
“In Tennessee, only eight schools' achievement of AYP was attributable to the growth model, said Connie J. Smith, the director of accountability for the state education department. Tennessee uses individual student data to project whether students will be proficient three years into the future.
‘I was not surprised,’ Ms. Smith said. ‘It's a stringent application of the projection model.’
Despite the few schools affected, she said, ‘it's always worth doing and using a growth model, even if it helps one school.’”
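For readers wondering what a projection model actually does, here is a deliberately simplified sketch: fit a straight line to a student's past scale scores, extrapolate three years ahead, and check whether the projected score clears a proficiency cut. Tennessee's actual TVAAS projections are considerably more sophisticated, and the function name, cut score, and sample scores below are assumptions made purely for illustration.

```python
# Rough illustration only: linearly extrapolate a student's scale scores
# three years ahead and compare the projection to a hypothetical
# proficiency cut score. This is NOT the TVAAS methodology, which pools
# information across tests and cohorts; all values below are assumed.
import numpy as np

def projected_proficient(scores, years_ahead=3, cut_score=520.0):
    """Fit a line to past scale scores and check whether the
    extrapolated score reaches the proficiency cut."""
    years = np.arange(len(scores))
    slope, intercept = np.polyfit(years, scores, deg=1)
    projected = intercept + slope * (years[-1] + years_ahead)
    return projected >= cut_score, projected

# One student with three years of observed scores
on_track, projection = projected_proficient([480.0, 495.0, 505.0])
print(f"projected score: {projection:.1f}, on track: {on_track}")
```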
Understanding complicated mathematical or measurement models is a long row to hoe. General confusion is likely to be compounded by the great volume of research, reviews, and press coverage discussing gain scores, growth models, vertical scales, value-added models, and so on. Hence, PEM has been trying to clear up some of the misunderstandings surrounding these models through our own research and publications. Check out a recent addition to our website, An Empirical Investigation of Growth Models, for a simple empirical comparison of some common models.
“This paper empirically compared five growth models in an attempt to inform practitioners about relative strengths and weaknesses of the models. Using simulated data where the true growth is assumed known a priori, the research question was to investigate whether the various growth models were able to recover the true ranking of schools.”
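As a rough sketch of the kind of simulation the abstract describes, the snippet below generates data in which each school's true growth is known, scores schools with two simple growth measures, and checks how well each recovers the true ranking. The paper compares five specific models; the two used here (a mean gain score and a value-added-style regression residual) are stand-ins chosen for brevity, and every parameter value is an arbitrary assumption rather than anything taken from the paper.

```python
# Sketch of a rank-recovery simulation: simulate schools with known true
# growth, estimate growth two simple ways, and see how well each estimate
# recovers the true school ranking. Models and parameters are assumed.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_schools, n_students = 100, 60

true_growth = rng.normal(10.0, 3.0, n_schools)          # known "true" school growth
pretest = rng.normal(500.0, 40.0, (n_schools, n_students))
posttest = pretest + true_growth[:, None] + rng.normal(0.0, 15.0, (n_schools, n_students))

# Measure 1: simple mean gain score per school
gain_score = (posttest - pretest).mean(axis=1)

# Measure 2: value-added-style residual -- regress posttest on pretest
# across all students, then average each school's residuals
x, y = pretest.ravel(), posttest.ravel()
slope, intercept = np.polyfit(x, y, deg=1)
residuals = (y - (intercept + slope * x)).reshape(n_schools, n_students)
value_added = residuals.mean(axis=1)

# How well does each measure recover the true ranking of schools?
for name, est in [("mean gain", gain_score), ("value-added residual", value_added)]:
    rho, _ = spearmanr(true_growth, est)
    print(f"{name:>22}: Spearman rank correlation with true growth = {rho:.3f}")
```

The Spearman correlation with the known true growth is simply one way to operationalize the rank-recovery criterion the abstract mentions; the paper itself should be consulted for the actual models and evaluation details.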