The New York Times couldn’t hold back: “Big Study Links Good Teachers to Lasting Gain.”
And for two years now, the so-called Chetty study, which claims that teacher quality directly causes higher lifetime earnings for students, has been re-released by Chetty and colleagues, and rehashed by the free-pass mainstream media, with a fervor that seems at the very least over-the-top.
When the Chetty numbers are put into perspective (has anyone noticed that the lifetime earnings figures appear to change?), all the air should go out of that misleading balloon: $50,000 gained over a lifetime (40 years) comes to only about $100 a month, roughly 1.5-2 tanks of gas. But inflate that balloon to $1.4 million for a class of 28, and now you have something … (Hint: An overinflated balloon, possibly filled with poop.)
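For readers who want to check the balloon math themselves, here is a minimal sketch of the arithmetic behind those figures; the per-tank gas price is my own rough assumption, not a number from the study or the coverage:

```python
# Sanity check of the figures quoted above.
lifetime_gain = 50_000   # claimed lifetime earnings gain per student, in dollars
years = 40               # assumed working lifetime
class_size = 28          # class size used in the press coverage

# Spread over a working lifetime, the per-student gain is modest.
per_month = lifetime_gain / (years * 12)
print(round(per_month, 2))   # about 104.17 dollars a month

# At an assumed ~$50-70 per fill-up, that is roughly 1.5-2 tanks of gas.

# Aggregated across a whole class, the same number becomes the headline figure.
class_total = lifetime_gain * class_size
print(class_total)           # 1400000, i.e., the "$1.4 million" balloon
```

The point of the sketch: the eye-popping $1.4 million is the same $50,000-per-student figure multiplied by 28, not a new finding.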
Initial scholarly responses to the Chetty balloon were cautious and critical (see notably Bruce Baker and Matthew Di Carlo here), but the free-pass mainstream press kept on keeping on.
Thus, since I have made a case for our needing a critical free press (in other words, a “free press,” not the “free pass” found among press-release journalists), we are at a key moment with the release of a thorough review of the Chetty study, one that discredits its claims: Review of Measuring the Impacts of Teachers by Moshe Adler:
Can the quality of teachers be measured the way that a person’s weight or height is measured? Some economists have tried, but the “value-added” they have attempted to measure has proven elusive. The results have not been consistent over tests or over time. Nevertheless, a two-part report by Raj Chetty and his colleagues claims that higher value-added scores for teachers lead to greater economic success for their students later in life. This review of the methods of Chetty et al. focuses on their most important result: that teacher value-added affects income in adulthood. Five key problems with the research emerge. First, their own results show that the calculation of teacher value-added is unreliable. Second, their own research also generated a result that contradicts their main claim—but the report pushed that inconvenient result aside. Third, the trumpeted result is based on an erroneous calculation. Fourth, the report incorrectly assumes that the (miscalculated) result holds across students’ lifetimes despite the authors’ own research indicating otherwise. Fifth, the report cites studies as support for the authors’ methodology, even though they don’t provide that support. [emphasis added] Despite widespread references to this study in policy circles, the shortcomings and shaky extrapolations make this report misleading and unreliable for determining educational policy.
So my question now is: Will the free-pass mainstream media clean up their Chetty mess?
I suspect we will not see a scorching NYT headline, or even a NYT article; we probably will not hear NPR interviews with Adler; and I am skeptical about Education Week’s coverage (beyond some bloggers).
The fair and balanced mainstream media, alas (those journalists who cannot judge the credibility of the research they cover), will not fall all over themselves to cover, and then repeat for two years to come, the popping of the Chetty balloon, because that would mean admitting their own incompetence. Such is the sweet allure of fairness that leaves us all misinformed.