A smarter way of thinking about intelligence

Why is one particular intellectual capacity valued over so many other worthy qualities, like compassion, honesty, courage, and common sense?

At some point during the past decade, Harvard professor Michael Sandel started to notice the increasingly frequent invocation of a particular word: “smart.” The term was being applied to all manner of products and devices: smart phones, smart cars, smart thermostats, even smart toasters. He also heard the word creeping into the language of politics, employed to justify and promote governmental initiatives. “The way the word was being used bothered me,” Sandel says. “It seemed to pair a narrow kind of technocratic expertise with an attitude of smug superiority.”

Political philosopher that he is, Sandel decided to conduct an analysis of presidential rhetoric. Before the 1980s, he found, American presidents rarely used the word “smart” in their public speeches. Ronald Reagan and George H.W. Bush employed the term relatively sparingly. But the use of the word in presidential remarks “exploded” during the administrations of Bill Clinton and George W. Bush, Sandel reported, with each man uttering the word “smart” at least 450 times. Barack Obama spoke it more than 900 times, and Hillary Clinton often invoked the term both as Secretary of State and as a candidate running for the highest office. This “rhetorical tic,” Sandel came to recognize, was representative of a much more sweeping cultural change, one he addresses with concern in his new book, “The Tyranny of Merit.” Over the past 40 years, he observes, America’s ruling class has exalted one quality, one virtue, one human attribute above all others: smartness.

What’s wrong with that? Surely smart is better than dumb. But, as Sandel points out, the elevation of smart over dumb sidelines many other valid measures: right vs. wrong, fair vs. unfair, free vs. unfree, equal vs. unequal. What’s more, this framing effectively excludes many citizens from the public conversation about important issues. “Talk of what is ‘smart’ versus what is ‘dumb’ — as natural and unobjectionable as it might seem to the elites who speak that language — lands very differently on the ears of others who hear it,” he says.

Sandel is gesturing to those who find themselves on the losing side of the brutal meritocratic competition that has come to dominate our national life. Over the last few decades, power, wealth, and prestige have increasingly flowed to individuals who manifest a particular kind of intelligence, characterized by the ability to quickly recognize patterns in abstract information and to solve well-defined problems of a verbal or quantitative nature. These are the people who can excel on tests like the SAT and the ACT, who can earn top grades in high school, and who can gain admission to selective colleges and universities. As the graduates of these institutions increasingly came to dominate the top ranks of many professions, their distinguishing characteristic — their type of smartness — became ever more culturally valued and materially rewarded.

In recent years, the stakes have grown even higher. In an increasingly precarious economy, it has come to seem that only those possessed of a particular kind of intelligence have a chance at a stable, secure life, or at the satisfying sense that one’s work is honored and valued by society at large.

This relentless winnowing of opportunity has had harmful effects on all involved. Among those able to compete in the smartness-selection game, the process contorts behavior: of parents and students, who fight ferociously for places at coveted colleges; of young professionals, who secure high salaries with exhaustingly long workweeks; and of those who hold power in both the public and the private sphere and rig the game to favor their own interests. Those shut out of the competition are likely to feel humiliated or hopeless, and these feelings lead them to support politicians who promise to punish the resented “elites.” On neither side of this divide do people devote themselves to what Sandel calls “the common good.”

The overvaluing of a narrow kind of intelligence has had other deleterious consequences. It has warped our institutions of higher learning, which now function as giant social sorting machines. And it has turned our public and private leadership class into an assemblage of uninspired technocrats, uninterested in and perhaps incapable of grappling with the enormous problems we face. For all our obsession with smartness, we’ve created a system that is deeply unintelligent.

And so the question looms: Have we arrived at a moment when we might reconsider the premium we’ve placed on conventional intelligence? Are we ready to rethink what it means to be smart?

“Peak head”

Michael Sandel is not the only observer who believes the answer is yes. Essayist and academic Fredrik deBoer is the author of another new book, “The Cult of Smart,” which maintains that many of us have become devout acolytes in an unforgiving religion. The cult of smart is “the great American obsession with appearing intelligent above and beyond all things, the one value that is thought to define us and our worth,” deBoer writes. This proposition has a harsh corollary, he notes: “If being smart is the only thing that matters, then you must be a failed human being if you aren’t.”

In an interview, deBoer — a self-described Marxist who writes for left-wing publications — cited his most vehement objection to the cult of smart: In our society, the fundamentals of a good life are assured only for those who do well in school, when they should be available to all as a matter of right. “Americans are uniquely obsessed with academics because the stakes of not doing well are so high in this country,” he said. “In the US, if you fail at school, you’re going to struggle to find stability and security in adulthood.” Yet not everyone can excel within the narrow parameters of formal education. “The system,” he noted, “has created many more ways to be a loser than to be a winner.”

“Head, Hand, Heart” is yet another book published this fall that takes up similar themes. It’s written by David Goodhart, a British journalist and political analyst who founded the magazine Prospect and who now works for Policy Exchange, a center-right think tank. Echoing the others, he argues that “one form of human aptitude — cognitive analytical ability, or the talent that helps people pass exams and then handle information efficiently in their professional lives — has become the gold standard of human esteem.” Goodhart is most distressed by how little respect and recognition are granted to those who do work that involves physical labor (the “hand” of his title) or caring for others (the “heart”). Over the past few decades, Goodhart writes, “it sometimes feels as if an enormous social vacuum cleaner has sucked up status” from manual occupations and the caring professions, redirecting all the cultural cachet to people who use their heads to process abstract symbols.

The trends these authors describe have become ever more insistent in recent years: Growing abundance for those at the top, growing scarcity for those at the bottom. Heightening intensity around college admissions, leading affluent parents to seek out any edge, even an illegal one (such as in the Varsity Blues scandal, referenced by all three authors). Deepening despondency among the less fortunate, visible in rising rates of opioid addiction, workforce nonparticipation, and even “deaths of despair.” We were already approaching what Goodhart calls “peak head” — an extreme state of imbalance tilted in favor of the cognitive. Then all of it got a hard shove from COVID-19.

The coronavirus pandemic revealed in a new way the fissures and fault lines of our system. We saw how heavily our lifestyles depend on essential workers, many of whom lack a college degree and many of whom struggle to get by. We came to appreciate the value of care work that happens in the home — work that is typically overlooked in an achievement culture impressed by shiny resumes. As stories about eviction courts and food banks filled the news, we were reminded of how many people are left down below in a society where the rhetoric (as Michael Sandel has documented) is always about “rising.”

If the pandemic called out deficits in our current arrangements, it also hinted at the possibility of real change — change that was strenuously resisted before. An example: For years, progressives have advocated for more federal assistance for Americans trying to make ends meet. For years, conservatives have resisted such measures, deriding them as “handouts.” Then COVID-19 hit, and within weeks Republicans and Democrats crafted a bill delivering stimulus checks and generous supplements to unemployment benefits. In the spring of 2020, research has indicated, the poverty rate in the United States actually declined as a result of these payments.

Another example: For years, education activists have called on universities to drop the SAT as an admission requirement, arguing that the test disadvantages poor students and members of minority groups. For years, most universities kept the SAT hurdle firmly in place. Then COVID-19 hit, and one school after another — Harvard, Yale, Columbia, Stanford, and many others — announced that SAT scores would not be necessary.

These developments may be fleeting — Congress and the White House have not been able to agree on a second stimulus measure, and universities may reinstate the SAT requirement once the coronavirus has receded — but the pandemic has demonstrated that the way things have long been doesn’t have to be the way things will be. The value we place on smartness might be one more assumption that feels a little less certain after the events of 2020.

To be sure, the pandemic has also highlighted the critical importance of scientific and medical expertise: The world anxiously awaits a vaccine created by well-trained researchers. But the failure of our nation to rise to a crisis that was widely predicted — as well as its failure to address and solve the myriad other crises we face — has raised larger questions about the qualities we tend to prize in our leaders.

Beyond IQ

Robert Sternberg, a professor of human development at Cornell who studies the assessment of intelligence, has long been critical of our reliance on “IQ-like” tests such as the SAT to distribute opportunity. “IQ tests do predict a lot of individual, short-term outcomes that society thinks are important,” Sternberg says. “But they are bad at predicting collective, long-term outcomes.” We select our leaders for their ability to engage in quick, superficial, conventional thinking, says Sternberg, rather than for the traits our world really needs: wisdom and creativity. “Look around us. It’s not working!” he exclaimed in a recent editorial.

Sternberg’s sense of urgency is fueled by our demonstrably poor performance on what he calls an “existential IQ test”: dealing with climate change. We hardly deserve to call ourselves intelligent if we preside over our own extinction, he points out. In his own forthcoming book, “Adaptive Intelligence,” Sternberg goes full-on apocalyptic: “If we do not change our notion of intelligence, and if we do not give up on our false ‘meritocracy,’ in a small number of generations, we humans will have nothing to worry about,” he writes. “It will be up to the cockroaches, bacteria, and other hardy species that survive human-made devastation to determine the future course the world will take.”

If we were to do as Robert Sternberg advises — if we were to “change our notion of intelligence” — what might the outcome look like? Although the thinkers mentioned above start out from different ideological perspectives, they converge on a surprisingly similar vision. They imagine a world in which respect and prestige are granted to people who work with their hands and who care for others — not just those who sit in offices and stare at screens. They picture an economy in which doing well in school doesn’t determine one’s life chances, in which training for a trade is as fruitful a path as going off to college. They envisage a civil society in which individuals meet one another on equal footing — where, in Michael Sandel’s words, they have an “equality of condition” in the public square regardless of whether they are equal in economic class. Above all, they contemplate a day when we value human qualities other than analytical intelligence — qualities like compassion, honesty, courage, and common sense.

The authors chart different routes to this better place. Sandel recommends revisions to our nation’s tax policy that would reward labor and penalize unproductive financial speculation; he also suggests a lottery to select (from a pool of qualified applicants) those students who are offered places at elite universities, as a way of puncturing the meritocratic obsession with being “the best.” Fredrik deBoer advocates the adoption of universal health care and a universal basic income as ways of lifting the pressure students feel to achieve academic success. And David Goodhart maintains that real change must be cultural in nature, manifested “in the way we talk to one another.”

But is such profound cultural change really possible? It’s instructive to hear from a man who has already tried, with some success, to alter the way we think about intelligence. Back in 1983, Harvard education professor Howard Gardner introduced the theory of multiple intelligences. In addition to the logical-mathematical and linguistic capacities that constitute the conventional notion of what makes a person smart, he proposed five additional dimensions: musical intelligence, spatial intelligence, bodily-kinesthetic intelligence, interpersonal intelligence, and intrapersonal intelligence.

In his just-published intellectual autobiography, “A Synthesizing Mind,” Gardner describes the idea’s reception. While it was embraced by the public, and by educators in particular, experts in the intelligence assessment field “were quite critical, and sometimes viciously so,” Gardner recounts. He sums up their indignant reaction: “This guy Gardner threatened to knock down the edifice of intelligence, the house of IQ, that had been carefully constructed over the decades.” Trying to change psychologists’ views of intelligence, he concludes, “is like trying to move gravestones in a graveyard.”

But the rest of us are not so rigid. From his longtime perch between the worlds of education and academia, Gardner has observed a waxing and waning of the notion of IQ as “the gold standard, the single measure of human aptitude.” The theory of multiple intelligences emerged at “a moment of openness, of experimentation,” he said in a recent interview, describing how a number of schools around the world drew on its insights to broaden the instruction they offered to students. That moment was followed by a retrenchment, however: years of single-minded focus on standardized testing and a “core” curriculum.

Now, says Gardner, “I think attitudes are beginning to change again. We’ve gone down the road of a one-size-fits-all approach to human ability. We’ve gone down that road too far, and it’s time to turn back.”

Annie Murphy Paul’s next book, “The Extended Mind: The Power of Thinking Outside the Brain,” will be published in June.