How Flawed Social Science Leads Us Astray
By Michael M. Rosen

https://www.nationalreview.com/magazine/2021/07/01/how-flawed-social-science-leads-us-astray/

There are two classes of people in the world, the old saw goes: those who divide humanity into two classes, and those who do not.

The charms of easy categorization and broad generalization have tempted social scientists for centuries. But the trend has picked up speed, especially in psychology, over the last several decades, as Jesse Singal capably demonstrates in The Quick Fix, his engaging and persuasive examination of the growing, unfortunate popularity of faddish theories and their prosperous purveyors.

A heterodox liberal who wrote for various center-left outlets before succumbing to the Substack revolution and starting a popular podcast with fellow nonconformist Katie Herzog, Singal characterizes his book as “an attempt to explain the allure of fad psychology, why that allure is so strong, and how both individuals and institutions can do a better job of resisting it.” The product of rigorous research and thoughtful interviews, The Quick Fix surveys a broad expanse of social-science findings and methodically assesses their flaws.

Singal bemoans the all-too-typical “jump from claims that are empirically defensible but complex and context-dependent to claims that are scientifically questionable but sexy and exciting — and simple.” Instead, he argues in favor of slow, incremental progress in social-science research that, while less flashy than the regnant style of inquiry, would not be as susceptible to debunking.

The Quick Fix takes aim at a wide range of pop-psychological trends, including those originating on the left (such as the self-esteem movement, inaugurated in Northern California) as well as those coming from the right (such as the “superpredator” theory of juvenile crime in the mid-1990s, which conservatives championed and which even Hillary Clinton and Joe Biden came to embrace). These swiftly adopted theories all fell woefully short in practice, however, frequently confusing correlation with causation, falling prey to attribution error (unduly emphasizing personality-based explanations over situational reasons for a given phenomenon), and confounding scholarly and popular work.

Particular forms of statistical manipulation have come to infect many of the studies purporting to document key psychological trends. One such method is “p-hacking”: massaging data so that they fit snugly under a widely accepted statistical threshold designed to help exclude results that could have been the product of pure randomness. “Hidden flexibility” is another: suppressing test results uncongenial to the desired hypothesis and publishing only supportive data. The “file drawer” problem is yet another: ignoring unpublished studies whose results ran contrary to the published ones. Then there is “range restriction”: clustering data samples in a particular segment of the distribution, yielding a misleading result.
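To make the first of these concrete, here is a minimal sketch (my own illustration, not drawn from the book) of how p-hacking manufactures findings from noise: an analyst keeps testing arbitrary slices of the same data until one slips under the conventional 0.05 cutoff, which pure chance will oblige roughly once in every twenty tries.

# Illustration of p-hacking: test many arbitrary subgroups of pure noise
# until one happens to clear the conventional p < 0.05 threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treatment = rng.normal(size=200)   # no real effect exists:
control = rng.normal(size=200)     # both "groups" are the same noise

for subgroup in range(20):                              # twenty arbitrary slices
    idx = rng.choice(200, size=50, replace=False)       # e.g., an age band or a region
    _, p = stats.ttest_ind(treatment[idx], control[idx])
    if p < 0.05:
        print(f"subgroup {subgroup}: p = {p:.3f} -- 'significant' by luck alone")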

But apart from massaging the numbers, pathologies in pop social science also stem from the relentless mathematization of inherently fuzzy, inchoate concepts. Take “positive psychology,” for instance, a movement pioneered three decades ago by Martin Seligman with his 1991 book Learned Optimism: How to Change Your Mind and Your Life. This school posits that much of our psychological health depends on our outlook on life, which, critically, lies within our control. But how to quantify this? After a fellow researcher argued in a 2005 article that up to 40 percent of the variance in human happiness derives from the choices we make, Seligman devised a “happiness formula,” H = S + C + V, where S represents the genetic “set point,” the baseline happiness fixed immutably by our genes; C reflects life circumstances beyond our power; and V signifies elements within our voluntary control.
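Written out as code, the formula’s crispness is exactly what invites suspicion: the arithmetic is trivial, while the inputs remain unmeasurable. The numbers below are placeholders of my own, not Seligman’s.

# Seligman's "happiness formula," H = S + C + V, taken literally.
# Nothing in the formula says how to measure any of its three inputs.
def happiness(set_point: float, circumstances: float, voluntary: float) -> float:
    """H = S + C + V: genetic baseline + external circumstances + voluntary choices."""
    return set_point + circumstances + voluntary

print(happiness(set_point=50.0, circumstances=10.0, voluntary=40.0))  # prints 100.0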

Seligman has parlayed his academic success at the University of Pennsylvania into the Positive Psychology Center, which in turn has influenced police departments and schools. The real-world results, however, have disappointed. A 2015 meta-analysis of the results of the Penn Resilience Program (PRP) — a simplified, shallower version of the celebrated cognitive-behavioral-therapy approach to preventing depression, which Seligman designed to be scalable to group treatment — cast doubt on whether “PRP is effective when delivered under real-world conditions.”

Most prominently, positive psychology gained traction in the military, where Seligman was invited to inaugurate the Comprehensive Soldier Fitness (CSF) program, whose centerpiece was modeled after PRP and aimed to reduce the incidence of post-traumatic stress disorder among combat troops. Ultimately, the Pentagon, all too happy to invest in a self-actualizing, resilience-building project, created the largest psychology experiment in human history, pumping more than a quarter of a billion dollars into CSF, but wound up with surprisingly little to show for it. A meta-study by the National Academy of Sciences concluded that CSF had produced “no clinically meaningful differences in pre- and posttest scores” and “no difference in diagnosis among those receiving the [CSF] intervention” and those who hadn’t. (Singal also applies a microscope to Angela Duckworth’s celebrated “grit” research, discerning little evidence of the supposedly outsize impact of perseverance on success.)

The implicit-association test (IAT) is another flawed and misunderstood psychological metric. Created by psychologists Mahzarin Banaji and Anthony Greenwald in 1998, the IAT measures how rapidly participants associate positive words with particular racial, ethnic, or religious groups, thereby purporting to reveal their unspoken, subconscious bias. The test has drawn praise across disciplines and in both popular and academic settings. The only problem, Singal finds, is that “there doesn’t appear to be any published evidence that the race IAT has a test-retest reliability that would be seen as acceptable for real-world use in most contexts.”
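Test-retest reliability is simply the correlation between the scores the same people get on two sittings of a test; psychometric convention treats roughly 0.7 and above as acceptable for judging individuals. A minimal sketch of the computation, with made-up scores of my own, looks like this:

# Test-retest reliability: correlate the same participants' scores across two sessions.
import numpy as np

# Hypothetical IAT scores for ten participants, tested twice (made-up data).
session_1 = np.array([0.61, 0.15, 0.42, -0.10, 0.33, 0.80, 0.05, 0.27, 0.55, 0.12])
session_2 = np.array([0.30, 0.45, 0.20, 0.18, 0.05, 0.62, 0.41, 0.09, 0.52, 0.28])

reliability = np.corrcoef(session_1, session_2)[0, 1]
print(f"test-retest r = {reliability:.2f}")   # a moderate correlation here, short of the ~0.7 bar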

It’s worth noting that Singal doesn’t believe that the IAT is worthless, but he questions whether implicit bias plays a significant role in “generating racial discrepancies.” The limited effectiveness of measures such as the IAT, attempts to inhibit microaggressions, and the types of intrusive diversity training favored by the anti-racism industry doesn’t inherently disqualify them, but it should dampen our expectations. As Singal puts it, they reflect “a style of education and training that seems much more geared at some sort of internal spiritual cleansing or awakening than at spurring actions geared at improving the world out there.”

Perhaps most depressingly, Singal chronicles the alarming “replication crisis” that has afflicted the social sciences in recent years. Meta-analyses published in Science by Brian Nosek of the University of Virginia revealed that only one-third to one-half of studies his team examined could be replicated, i.e., return statistically significant results affirming the original finding. Worse yet, only 25 percent of social-psychology experiments were replicable.

What, then, can be done to repair this state of affairs? Researchers have proposed several mechanisms, including “preregistration,” or the publication of hypotheses and methodologies before testing; data-sharing among scientists; peer review at the design stage; raising statistical thresholds for publication; and applying probabilistic analysis to data. More generally, Singal urges social scientists to exhibit “a level of humility and realism about the complexity of their work.” He advises researchers to “focus on earning the credibility that legitimates a seat at the policy table.”
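One of those proposals, raising the statistical threshold for publication, is easy to illustrate with a simulation of my own (not from the book): run thousands of studies of a nonexistent effect and count how many clear each bar.

# Effect of raising the publication threshold from p < 0.05 to p < 0.005.
# Simulate many studies of a nonexistent effect and count the false positives.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies = 10_000
false_at_05 = 0
false_at_005 = 0

for _ in range(n_studies):
    a = rng.normal(size=30)    # two samples from the same population,
    b = rng.normal(size=30)    # so any detected "effect" is spurious
    _, p = stats.ttest_ind(a, b)
    false_at_05 += p < 0.05
    false_at_005 += p < 0.005

print(f"false positives at p < 0.05:  {false_at_05 / n_studies:.1%}")   # about 5%
print(f"false positives at p < 0.005: {false_at_005 / n_studies:.1%}")  # about 0.5%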

In places, such as his analysis of Cass Sunstein and Richard Thaler’s “nudge” approach, Singal devotes too much time to describing particular pop-psychological trends and too little to methodically debunking them. And in others, he indulges in the kind of lengthy denunciation of systemic racism that, on his podcast, he castigates others for and that does little to further his analysis. A more detailed and realistic strategy for overcoming the malign forces besetting social-science research would have been most welcome.

But in the end, Singal, who dislikes the term “heterodox” as a descriptor of his work, more than lives up to his nonconformist billing. In The Quick Fix, he wisely counsels us to resist the appeal of the monocausal explanation, the oversimplified narrative, the “this one personality trait determines everything” study in favor of careful, meticulous, nuanced research that sizzles less but enlightens more.
